WO2020090010A1 - Abnormal behavior detection device, abnormal behavior detection program, and abnormal behavior detection method - Google Patents
Abnormal behavior detection device, abnormal behavior detection program, and abnormal behavior detection method
- Publication number
- WO2020090010A1 (PCT/JP2018/040347)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- motion
- unit
- occurrence probability
- frame
- abnormality
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
Definitions
- the present invention relates to an abnormal behavior detection device, an abnormal behavior detection program, and an abnormal behavior detection method for detecting abnormal behavior from the movement of a crowd captured in an image.
- conventionally, there is a technique for detecting abnormal behavior from the movement of a crowd (for example, Non-Patent Document 1).
- the technique of Non-Patent Document 1 detects motion from temporal changes of the foreground and creates an activity map by adding together the foreground images of several consecutive frames.
- abnormal behavior is detected when the Shannon entropy of the occurrence frequencies of pixel values calculated from the activity-map histogram is high and the difference between activity maps created at consecutive times is large.
- however, with the technique of Non-Patent Document 1, a situation in which the crowd continues to move is always determined to be abnormal, so in a place where a crowd constantly moves back and forth, such as a station, there is a possibility that the normal state is erroneously detected as an abnormal state. Further, most conventional techniques are specialized in detecting a specific abnormal behavior, and there is no technique for detecting a plurality of abnormal behaviors and discriminating the type of each abnormal behavior.
- the present invention aims to provide a device for accurately detecting abnormal behavior without misdetecting normal behavior of a crowd as abnormal behavior.
- the abnormality detection device of the present invention includes: an image acquisition unit that acquires a plurality of images; a motion extraction unit that extracts, from a plurality of moving objects included in an image of one frame, a plurality of motion vectors determined by a magnitude of motion and a direction of motion; a weighting unit that calculates an occurrence probability for each motion vector of the plurality of motion vectors and weights at least one of the occurrence probabilities; a scattering degree calculation unit that calculates a scattering degree of the one frame based on the weighted occurrence probabilities; and a detection unit that compares the scattering degree with a threshold value and detects an abnormality based on the comparison result.
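As a reading aid, the following is a minimal sketch, in Python, of how the five units listed above could be composed into one pipeline. The function names, the dictionary keys, and the use of Shannon entropy for the scattering degree are assumptions drawn from the description below, not definitions taken from the patent.

```python
import math

def detect_abnormality(frames, extract_motion, compute_probabilities,
                       apply_weights, threshold):
    """Run the claimed pipeline over a sequence of frames.

    frames                -> images from the image acquisition unit
    extract_motion        -> motion extraction unit: (prev, curr) -> [(r, theta), ...]
    compute_probabilities -> weighting unit, step 1: vectors -> {key: p_t}
    apply_weights         -> weighting unit, step 2: probabilities -> {key: w}
    threshold             -> TH used by the detection unit
    """
    results = []
    previous = None
    for frame in frames:
        if previous is not None:
            vectors = extract_motion(previous, frame)
            probabilities = compute_probabilities(vectors)
            weights = apply_weights(probabilities)
            # scattering degree calculation unit: weighted Shannon entropy E_t
            e_t = 0.0
            for key, p in probabilities.items():
                if p > 0.0:
                    e_t -= weights.get(key, 1.0) * p * math.log(p)
            # detection unit: compare E_t with the threshold TH
            results.append(e_t >= threshold)
        previous = frame
    return results
```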
- FIG. 1 is a diagram of the first embodiment and shows the hardware configuration of the abnormality detection device 100.
- FIG. 2 is a diagram of the first embodiment and shows the flow of the abnormality detection processing performed by the abnormality detection device 100.
- FIG. 3 is a diagram of the first embodiment and is a flowchart of the operation of the abnormality detection device 100.
- FIG. 4 is a diagram of the first embodiment showing a plurality of persons 61 moving irregularly.
- FIG. 5 is a diagram of the first embodiment and shows a frame 72, which is one frame.
- FIG. 6 is a diagram of the first embodiment for explaining the motion vector vt(r, θ) and the occurrence probability pt(r, θ).
- FIG. 7 is a diagram of the first embodiment and shows the relationship between the scattering degree Et calculated by the scattering degree calculation unit 14 and the occurrence probability pt(r, θ).
- FIG. 8 is a diagram of the first embodiment and shows the detection of an abnormality by the detection unit 15.
- FIG. 9 is a diagram of the first embodiment and shows a frame 71.
- FIG. 10 is a diagram of the first embodiment and schematically shows the occurrence probability pt(r, θ) obtained from FIG. 9.
- FIG. 11 is a diagram of the first embodiment and shows a frame 73 for explaining convection.
- FIG. 12 is a diagram of the first embodiment and schematically shows the occurrence probability pt(r, θ) obtained from FIG. 11.
- FIG. 13 is a diagram of the first embodiment and shows a video area 81.
- FIG. 14 is a diagram of the first embodiment and schematically shows the occurrence probability pt(r, θ) obtained from FIG. 13.
- FIG. 15 is a diagram of the first embodiment and shows a frame 75 for explaining reverse running.
- FIG. 16 is a diagram of the first embodiment and schematically shows the occurrence probability pt(r, θ) obtained from FIG. 15.
- FIG. 17 is a diagram of the first embodiment and shows a first example in which the detection unit 15 detects the type of abnormality.
- FIG. 18 is a diagram of the first embodiment and shows a second example in which the detection unit 15 detects the type of abnormality.
- FIG. 19 is a diagram of the first embodiment and shows the configuration of an abnormality detection device 100 according to a second modification of the abnormality detection device 100.
- the abnormality detection device 100 is a device that calculates the degree of scattering of each frame of a plurality of frames that form an image and detects an abnormality occurring in the image from changes in the degree of scattering.
- FIG. 1 shows a hardware configuration of the abnormality detection device 100.
- the abnormality detection device 100 is a computer.
- the abnormality detection device 100 includes the processor 10 and other hardware such as the main storage device 20, the auxiliary storage device 30, the input interface 40, and the output interface 50.
- the processor 10 is connected to the other hardware via the signal line 101 and controls the other hardware.
- the abnormality detection device 100 includes, as functional elements, an image acquisition unit 11, a motion extraction unit 12, a weighting unit 13, a scattering degree calculation unit 14, and a detection unit 15.
- the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15 are realized by an abnormality detection program that is software.
- the processor 10 is a device that executes an abnormality detection program.
- the abnormality detection program is a program that realizes the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15.
- the processor 10 is an IC (Integrated Circuit) that performs arithmetic processing. Specific examples of the processor 10 are a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
- the main storage device 20 is a storage device that temporarily stores data. Specific examples of the main storage device 20 are SRAM (Static Random Access Memory) and DRAM (Dynamic Random Access Memory). The main storage device 20 holds the calculation result of the processor 10.
- the auxiliary storage device 30 is a storage device that stores data in a nonvolatile manner.
- a specific example of the auxiliary storage device 30 is a HDD (Hard Disk Drive).
- the auxiliary storage device 30 may also be a portable recording medium such as an SD (registered trademark) (Secure Digital) memory card, a CF (CompactFlash), a NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD (Digital Versatile Disk).
- the input interface 40 is a port to which various devices are connected and data of various devices is input.
- the camera 200 is connected to the input interface 40, and the image of the camera 200 is input.
- the output interface 50 is a port to which various devices are connected and data is output by the processor 10 to the various devices.
- the processing result such as the detection result by the detection unit 15 is output to various devices via the output interface 50.
- the abnormality detection program is stored in the main storage device 20.
- the abnormality detection program is read from the main storage device 20 into the processor 10 and executed by the processor 10.
- the main storage device 20 stores not only the abnormality detection program but also an OS (Operating System).
- the processor 10 executes the abnormality detection program while executing the OS.
- the abnormality detection program and the OS may be stored in the auxiliary storage device 30.
- the abnormality detection program and the OS stored in the auxiliary storage device 30 are loaded into the main storage device 20 and executed by the processor 10. Note that part or all of the abnormality detection program may be incorporated in the OS.
- the abnormality detection device 100 may include a plurality of processors that replace the processor 10.
- the abnormality detection device 100 may separately include an image processor. These plural processors share the execution of the abnormality detection program.
- Each processor like the processor 10, is a device that executes an abnormality detection program.
- the data, information, signal values, and variable values used, processed, or output by the abnormality detection program are stored in the main storage device 20, the auxiliary storage device 30, or a register or cache memory in the processor 10.
- the abnormality detection program is a program that causes a computer to execute each process, each procedure, or each step obtained by replacing the "unit" of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15 with "process", "procedure", or "step".
- the abnormality detection method is a method performed by the abnormality detection device 100, which is a computer, executing an abnormality detection program.
- the abnormality detection program may be stored in a computer-readable recording medium and provided, or may be provided as a program product.
- the operation of the abnormality detection device 100 corresponds to an abnormality detection method.
- the procedure of the abnormality detection method corresponds to the procedure of the abnormality detection program.
- FIG. 2 shows a flow of abnormality detection processing performed by the abnormality detection device 100.
- FIG. 3 is a flowchart of the operation of the abnormality detection device 100.
- in FIG. 2, the step numbers of FIG. 3 are attached to the processing corresponding to FIG. 3.
- An outline of the operation of the abnormality detection device 100 will be described with reference to FIGS. 2 and 3.
- in step S101, the image acquisition unit 11 acquires a plurality of images. Specifically, as shown in FIG. 1, the image acquisition unit 11 acquires a plurality of images from the camera 200 via the input interface 40.
- in step S102, the motion extraction unit 12 extracts a plurality of motion vectors determined by the magnitude of the motion and the direction of the motion from the plurality of moving objects included in the image of one frame.
- the motion vector is expressed as the motion vector vt(r, θ) or simply vt(r, θ).
- the motion vector vt(r, θ) will be described later.
- the subscript t means that the quantity relates to one frame.
- a person is taken as an example of the moving body from which the motion vector vt(r, θ) is extracted.
- the moving body from which the motion vector vt(r, θ) is extracted is not limited to a person.
- the moving body can be any moving body, such as an insect, a vehicle, or an object floating in a liquid.
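The patent does not prescribe how the motion vectors vt(r, θ) are obtained from the moving bodies; dense optical flow is one common choice. The sketch below assumes OpenCV's Farneback optical flow on grayscale frames; the grid step and the minimum magnitude are illustrative parameters.

```python
import cv2
import numpy as np

def extract_motion_vectors(prev_gray, curr_gray, step=8, min_magnitude=0.5):
    """Return a list of (r, theta) motion vectors sampled on a grid.

    r is the magnitude of the motion and theta its direction in [0, 2*pi),
    measured counterclockwise from the X axis, matching the convention
    described for FIG. 6.
    """
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    vectors = []
    for y in range(0, flow.shape[0], step):
        for x in range(0, flow.shape[1], step):
            dx, dy = flow[y, x]
            r = float(np.hypot(dx, dy))
            if r < min_magnitude:          # ignore near-static pixels
                continue
            # image y axis points down, hence -dy for a counterclockwise angle
            theta = float(np.arctan2(-dy, dx)) % (2 * np.pi)
            vectors.append((r, theta))
    return vectors
```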
- in step S103, the weighting unit 13 calculates the occurrence probability pt(r, θ) for each motion vector vt(r, θ) of the plurality of motion vectors vt(r, θ).
- the weighting unit 13 weights at least one of the occurrence probabilities pt(r, θ).
- the occurrence probability is expressed as the occurrence probability pt(r, θ) or simply pt(r, θ).
- the occurrence probability pt(r, θ) will be described later.
- the subscript t means that the quantity relates to one frame.
- the occurrence probability pt(r, θ) has a one-to-one correspondence with the motion vector vt(r, θ).
- the weighting by the weighting unit 13 will be described as types 1 to 4.
- the weighting of types 1 to 4 will be described later.
- in step S104, the scattering degree calculation unit 14 calculates the scattering degree Et of the one frame based on the weighted occurrence probabilities pt(r, θ).
- the degree of scattering is also called entropy. Details of the scattering degree Et will be described later.
- in step S105, the detection unit 15 compares the scattering degree Et with the threshold value TH and detects an abnormality based on the comparison result.
- the threshold TH is set in the detection unit 15.
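A minimal sketch of steps S104 and S105, assuming the scattering degree is the (optionally weighted) Shannon entropy of the occurrence probabilities described for FIG. 7 below, and that the probabilities are given as a dictionary keyed by quantized (r, θ) bins (see the histogram sketch after the FIG. 6 discussion). The threshold TH is an application-specific value.

```python
import math

def scattering_degree(probabilities, weights=None):
    """Scattering degree E_t of one frame: (weighted) Shannon entropy."""
    e_t = 0.0
    for key, p in probabilities.items():
        if p <= 0.0:
            continue
        w = 1.0 if weights is None else weights.get(key, 1.0)
        e_t -= w * p * math.log(p)
    return e_t

def is_abnormal(probabilities, threshold, weights=None):
    """Step S105: compare E_t with the threshold TH."""
    return scattering_degree(probabilities, weights) >= threshold
```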
- FIG. 4 shows a frame 71 which is one frame of an image.
- Frame 71 shows a randomly moving crowd.
- the scattering degree Et is large in a randomly moving crowd.
- in FIG. 4, a plurality of persons 61 shown as circles move irregularly. Each circle represents a person.
- the direction of the arrow 82 attached to the circle of a person 61 indicates the direction in which the person moves, and the size of the arrow indicates the speed at which the person moves. To distinguish it from the motion vector vt(r, θ), the arrow 82 is called the person vector 82.
- FIG. 5 shows a frame 72, which is one frame of an image. Frame 72 shows a crowd that moves regularly. The scattering degree Et is small in a regularly moving crowd.
- FIG. 6 is a diagram for explaining the motion vector vt(r, θ) and the occurrence probability pt(r, θ). The description assumes the frame 71 of FIG. 4. The uppermost part of FIG. 6 is a diagram showing the distribution of the direction θ of the person vectors 82. The horizontal axis represents the direction θ of the person vector 82, and the vertical axis represents the number Nθ of directions θ.
- the circle on the right side shows how the direction θ is measured.
- the direction θ is measured counterclockwise with respect to the X axis.
- the XY coordinates are shown in FIGS. 4, 5, 9, 11, 13, and 15.
- the second row of FIG. 6 is a diagram showing the distribution of the size r of the person vector 82.
- the horizontal axis represents the size r of the person vector 82, and the vertical axis represents the number Nr of the sizes r.
- the third row of FIG. 6 schematically shows the distribution of the motion vectors vt(r, θ).
- the motion vectors vt(r, θ) are extracted from the frame 71 by the motion extraction unit 12.
- the motion vector vt(r, θ) is obtained by referring to another frame different from the frame 71. Since the motion vector vt(r, θ) is determined by the size r and the direction θ, when a plurality of person vectors 82 are the same, the plurality of identical person vectors 82 are aggregated into one motion vector vt(r, θ).
- the fourth row of FIG. 6 schematically shows the distribution of the occurrence probabilities pt(r, θ) of the motion vectors vt(r, θ).
- the weighting unit 13 calculates the occurrence probability pt(r, θ) for each motion vector vt(r, θ).
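The patent aggregates identical motion vectors and assigns each aggregated vector an occurrence probability; a practical realization is to quantize (r, θ) into bins and normalize the counts. The bin counts and the maximum size r below are illustrative assumptions.

```python
import math

def occurrence_probabilities(vectors, r_bins=8, theta_bins=16, r_max=20.0):
    """Quantize (r, theta) vectors and return p_t(r, theta) per occupied bin.

    Returns a dict mapping (r_bin, theta_bin) -> probability, so identical
    (quantized) person vectors are aggregated into one motion vector, as
    described for the third row of FIG. 6.
    """
    counts = {}
    for r, theta in vectors:
        r_idx = min(int(r / r_max * r_bins), r_bins - 1)
        theta_idx = int(theta / (2 * math.pi) * theta_bins) % theta_bins
        key = (r_idx, theta_idx)
        counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    return {key: count / total for key, count in counts.items()} if total else {}
```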
- FIG. 7 shows the relationship between the scattering degree Et calculated by the scattering degree calculation unit 14, the occurrence probability pt(r, θ), and the weight.
- in the expression for the scattering degree Et, log[pt(r, θ)] is sometimes called the information amount.
- Expression 1 shown in FIG. 7 is the expression for the scattering degree Et when there is no weight.
- Expression 2 shown in FIG. 7 is the expression for the scattering degree Et when the weight wr,θ is reflected. The details of the weight wr,θ will be described later.
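FIG. 7 itself is not reproduced in this text. Based on the surrounding description (a Shannon entropy of the occurrence probabilities, with the weight multiplying the term pt(r, θ)·log[pt(r, θ)], as stated in the type 1 discussion below), the two expressions presumably take the following form; this reconstruction is an assumption, not a quotation of the figure:

Expression 1 (no weight): Et = −Σ(r,θ) pt(r, θ) · log pt(r, θ)
Expression 2 (with weight): Et = −Σ(r,θ) wr,θ · pt(r, θ) · log pt(r, θ)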
- FIG. 8 is a graph showing the detection of abnormality by the detection unit 15.
- the scattering degree E t is calculated for each t, that is, for each frame, and this corresponds to the time on the horizontal axis of FIG. 8.
- the vertical axis of FIG. 8 represents the magnitude of the scattering degree E t .
- the detection unit 15 determines that the acquired image is abnormal when the scattering degree Et is equal to or greater than the threshold value TH. In FIG. 8, the detection unit 15 detects the abnormal time zone as abnormal.
- the graph of FIG. 8 is generated by the detection unit 15 for each of the types 1 to 4 shown in step S103 of FIG. 2.
- the threshold value TH is set for each of type 1 to type 4. For example, each threshold TH is set in the abnormality detection program that realizes the detection unit 15. A different threshold is set for each type.
- the threshold TH of type 1 is a threshold indicating confusion,
- the threshold TH of type 2 is a threshold indicating convection,
- the threshold TH of type 3 is a threshold indicating confluence, and
- the threshold TH of type 4 is a threshold indicating reverse running. Therefore, the detection unit 15 can detect, as the abnormality, both the occurrence of an abnormality and the type of the abnormality that has occurred. Specifically, the detection unit 15 can detect the occurrence of an abnormality when the scattering degree Et exceeds the threshold TH, and can detect the type of the abnormality that has occurred from the type of the threshold TH that was exceeded.
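A sketch of how the detection unit 15 could map an exceeded threshold to an abnormality type, assuming one scattering degree is computed per weighting type. The type names and threshold values are placeholders, not values from the patent.

```python
# Placeholder thresholds; real values would be tuned per scene and per type.
THRESHOLDS = {
    "type1_confusion": 3.0,
    "type2_convection": 2.5,
    "type3_confluence": 2.0,
    "type4_reverse_running": 1.5,
}

def detect_types(scattering_degrees):
    """scattering_degrees: dict mapping type name -> E_t for that weighting.

    Returns the list of abnormality types whose threshold TH is exceeded.
    """
    return [name for name, e_t in scattering_degrees.items()
            if e_t >= THRESHOLDS[name]]
```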
- in step S103, the abnormality detection device 100 performs the weighting processing of types 1 to 4 in parallel on the monitoring image acquired in step S101. It should be noted that the present invention is not limited to performing the weighting processing of types 1 to 4 at the same time; any one, any two, or any three of types 1 to 4 may be selected.
- the weighting unit 13 performs weighting processing according to the selection command.
- Type 1: the type 1 weighting process performed by the weighting unit 13 will be described with reference to FIGS. 9 and 10.
- in type 1, the weighting unit 13 weights the occurrence probability pt(r, θ) according to the size r of the motion of the motion vector vt(r, θ) for which the occurrence probability pt(r, θ) is calculated.
- type 1 can detect a confusion state as an abnormal state. In other words, the state detected as abnormal by type 1 is the confusion state.
- FIG. 9 is the same frame 71 as FIG. 4.
- FIG. 10 schematically shows the occurrence probability pt(r, θ) obtained from FIG. 9.
- the occurrence probability pt(r, θ) is determined by the size r and the direction θ.
- in type 1, the larger the size r, the larger the weight wr,θ.
- the weight wr,θ may be proportional to the size r.
- the weight wr,θ is not reflected in the occurrence probability pt(r, θ) inside the information amount log[pt(r, θ)].
- the weight wr,θ is reflected in the occurrence probability pt(r, θ) that is multiplied by the information amount log[pt(r, θ)]. This is the same for types 1 to 4.
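A minimal sketch of the type 1 weighting under the proportional option mentioned above (weight proportional to the size r). The mapping from bin index to a representative r value mirrors the earlier histogram sketch and is an assumption.

```python
def type1_weights(probabilities, r_bins=8, r_max=20.0):
    """Weight w_{r,theta} proportional to the size r (here: the bin center)."""
    weights = {}
    for (r_idx, theta_idx) in probabilities:
        r_center = (r_idx + 0.5) * r_max / r_bins
        weights[(r_idx, theta_idx)] = r_center   # larger motion -> larger weight
    return weights
```

The returned weights would then be used in Expression 2, for example by passing them to the scattering_degree sketch shown earlier.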
- Type 2: the type 2 weighting process performed by the weighting unit 13 will be described with reference to FIGS. 11 and 12.
- in type 2, the weighting unit 13 weights the occurrence probability pt(r, θ) according to the direction of motion θ of the motion vector vt(r, θ) for which the occurrence probability pt(r, θ) is calculated. More specifically, in type 2, one frame is divided into left and right, and different weighting is performed on the left side of the screen and the right side of the screen.
- type 2 can detect a convection state as an abnormal state. In other words, the state detected as abnormal by type 2 is the convection state.
- FIG. 11 shows a frame 73 for explaining convection.
- FIG. 12 schematically shows the occurrence probability pt(r, θ) obtained from FIG. 11.
- the occurrence probability pt(r, θ) is determined by the size r and the direction θ, but in type 2, weighting is performed according to the direction θ.
- type 2 is characterized in that the left side of the screen and the right side of the screen have different weighted directions. On the left side of the screen, the top region (3π/2 ≤ θ ≤ 2π) and the bottom region (0 ≤ θ ≤ π/2) of FIG. 12 are weighted.
- the weighting unit 13 may attach the weight wr,θ to the occurrence probability pt(r, θ) of the region to be weighted according to the value of the corresponding size r.
- the weight wr,θ may be proportional to the size r.
- alternatively, the weighting unit 13 may give a fixed value as the weight wr,θ.
- in this way, rightward motion is emphasized on the left side of the screen and leftward motion is emphasized on the right side of the screen, so that convection can be easily detected.
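A sketch of the type 2 idea, assuming the occurrence probabilities are accumulated separately for the left and right halves of the frame so that each probability carries the half it came from. The fixed emphasis factor is an illustrative choice; as noted above, the weight may instead be proportional to the size r.

```python
import math

def type2_weight(theta, side, emphasis=2.0):
    """Type 2 (convection) weight for a direction theta.

    side is "left" or "right", i.e. the half of the frame in which the
    occurrence probability was accumulated. Rightward directions
    (0 <= theta <= pi/2 or 3*pi/2 <= theta <= 2*pi) are emphasized on the
    left half, leftward directions on the right half.
    """
    rightward = theta <= math.pi / 2 or theta >= 3 * math.pi / 2
    if side == "left":
        return emphasis if rightward else 1.0
    return emphasis if not rightward else 1.0
```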
- the type 3 of the weighting process performed by the weighting unit 13 will be described with reference to FIGS. 13 and 14.
- the weighting unit 13 performs weighting according to the video area 81 in one frame.
- the frame 74 in FIG. 13 is one frame.
- FIG. 13 is a diagram showing the video area 81.
- the video area 81 in FIG. 13 shows merging from the confluence area 83 into the main line 84.
- FIG. 14 schematically shows the occurrence probability pt(r, θ) obtained from FIG. 13.
- the motion extraction unit 12 extracts a video area 81, which is a partial area of one frame, from the frame 74, and extracts a plurality of motion vectors from the video area 81.
- the motion extraction unit 12 extracts a plurality of motion vectors from the exclusion area of the frame where the video area 81 is removed.
- the exclusion area is an area of the frame 74 from which the video area 81 is excluded.
- the weighting unit 13 calculates the occurrence probability for each motion vector of the plurality of motion vectors extracted from the video area 81. Further, the weighting unit 13 calculates the occurrence probability for each motion vector of the plurality of motion vectors extracted from the exclusion area.
- the weighting unit 13 weights the plurality of occurrence probabilities calculated based on the video area 81 differently from the plurality of occurrence probabilities calculated based on the exclusion area. Specifically, it is as follows.
- the motion vector vt(r, θ) extracted in the video area 81 is expressed as vin,t(r, θ), and the motion vector vt(r, θ) extracted in the exclusion area is expressed as vout,t(r, θ).
- the occurrence probability of the motion vector vin,t(r, θ) is expressed as pin,t(r, θ), and the occurrence probability of the motion vector vout,t(r, θ) is expressed as pout,t(r, θ).
- the scattering degree calculation unit 14 distinguishes between pin,t(r, θ) and pout,t(r, θ) in Expression 2 of FIG. 7.
- the motion extraction unit 12 extracts the video area 81 from the frame 74 and extracts the plurality of motion vectors vin,t(r, θ) from the video area 81.
- the motion extraction unit 12 extracts a plurality of motion vectors vout,t(r, θ) from the exclusion area.
- the weighting unit 13 calculates the occurrence probability pin,t(r, θ) for each motion vector of the plurality of motion vectors vin,t(r, θ) extracted from the video area 81.
- the weighting unit 13 also calculates the occurrence probability pout,t(r, θ) for each motion vector of the plurality of motion vectors vout,t(r, θ) extracted from the exclusion area.
- the range indicated by the broken line in FIG. 14 indicates the range of the occurrence probabilities pin,t(r, θ) corresponding to the motion vectors vin,t(r, θ) extracted from the video area 81 by the motion extraction unit 12.
- the nine occurrence probabilities pin,t(r, θ) are indicated by dots in the range indicated by the broken line in FIG. 14.
- the weighting unit 13 weights the plurality of occurrence probabilities pin,t(r, θ) calculated based on the video area 81 differently from the plurality of occurrence probabilities pout,t(r, θ) calculated based on the exclusion area. For example, the weighting unit 13 sets, for the plurality of occurrence probabilities pin,t(r, θ), a weight wr,θ that is inversely proportional to the size r corresponding to the occurrence probability pin,t(r, θ).
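A sketch of the type 3 weighting under the same histogram assumptions: occurrence probabilities are computed separately for the video area 81 and for the exclusion area, and the two sets receive different weights. The inverse-proportional weighting for the video area follows the example in the text; the uniform weight for the exclusion area is an assumption.

```python
def type3_weights(p_in, p_out, r_bins=8, r_max=20.0, outside_weight=1.0):
    """Different weights for probabilities from the video area (p_in) and
    from the exclusion area (p_out).

    For p_in the weight is inversely proportional to the size r (bin center),
    as in the example given for type 3; p_out gets a uniform weight.
    """
    weights_in = {}
    for (r_idx, theta_idx) in p_in:
        r_center = (r_idx + 0.5) * r_max / r_bins
        weights_in[(r_idx, theta_idx)] = 1.0 / max(r_center, 1e-6)
    weights_out = {key: outside_weight for key in p_out}
    return weights_in, weights_out
```

The scattering degree would then be computed from both sets, since the text states that pin,t(r, θ) and pout,t(r, θ) are distinguished in Expression 2.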
- the type 4 of the weighting process performed by the weighting unit 13 will be described with reference to FIGS. 15 and 16.
- type 4 is similar to type 2, but unlike type 2, type 4 does not divide the screen into left and right; the weighting unit 13 weights the occurrence probability pt(r, θ) over the entire one frame according to the direction θ.
- type 4 can detect a reverse running state as an abnormal state. In other words, the state detected as abnormal by type 4 is the reverse running state.
- FIG. 15 shows a frame 75 for explaining reverse running. In FIG. 15, the person vectors 82 of the plurality of persons 61 are aligned in the right direction. Only the person vector 82 of the person 62 indicated by the black circle points in the left direction.
- in type 4, the weight wr,θ is given to the occurrence probability pt(r, θ) according to the direction θ.
- the weight wr,θ is zero in the uppermost region (3π/2 ≤ θ ≤ 2π) and the lowermost region (0 ≤ θ ≤ π/2) of FIG. 16.
- for the other directions, the weighting unit 13 gives a weight wr,θ proportional to the size r to the occurrence probability pt(r, θ). By this weighting, the reverse running of the person 62 in FIG. 15 can be detected.
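A sketch of the type 4 weighting as described above: zero weight for the directions 3π/2 ≤ θ ≤ 2π and 0 ≤ θ ≤ π/2, and a weight proportional to r for the remaining directions. The bin-to-value mapping again mirrors the histogram sketch and is an assumption.

```python
import math

def type4_weights(probabilities, r_bins=8, theta_bins=16, r_max=20.0):
    """Type 4 (reverse running): zero weight for the main-flow directions,
    weight proportional to r for the opposite directions."""
    weights = {}
    for (r_idx, theta_idx) in probabilities:
        theta = (theta_idx + 0.5) * 2 * math.pi / theta_bins
        r_center = (r_idx + 0.5) * r_max / r_bins
        suppressed = theta <= math.pi / 2 or theta >= 3 * math.pi / 2
        weights[(r_idx, theta_idx)] = 0.0 if suppressed else r_center
    return weights
```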
- the weighting unit 13 performs the weighting of types 1 to 4.
- the scattering degree calculation unit 14 calculates the scattering degree Et using the weights wr,θ calculated by the weighting unit 13.
- the detection unit 15 determines that an abnormality has occurred when the scattering degree Et is equal to or greater than the threshold value TH.
- in addition, the detection unit 15 uses the area of the foreground of the frame to identify the abnormality.
- that is, the detection unit 15 detects the abnormality based on the result of the comparison between the threshold value TH and the scattering degree Et and on the foreground area in at least one frame.
- the type of abnormality can be specified from the shape of the foreground of a specific frame or the shape of the foreground of the frame in which the abnormality is detected.
- the type of abnormality can also be specified from the change in the foreground area over a plurality of frames.
- the detection unit 15 computes the change in the foreground area over a plurality of frames, namely the frame t, the frame t+1, the frame t+2, and so on. The detection unit 15 can detect the occurrence of an abnormality and a concrete type of abnormality by using the change in the foreground area together with the change in the scattering degree Et shown in FIG. 8.
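The patent does not specify how the foreground is obtained; background subtraction is one common choice. The sketch below assumes OpenCV's MOG2 background subtractor and simply counts foreground pixels per frame so that the change over frames t, t+1, t+2, ... can be examined alongside the change in Et.

```python
import cv2

def foreground_areas(frames):
    """Return the foreground area (number of foreground pixels) per frame."""
    subtractor = cv2.createBackgroundSubtractorMOG2()
    areas = []
    for frame in frames:
        mask = subtractor.apply(frame)
        areas.append(int((mask > 0).sum()))
    return areas

def area_change(areas, window=5):
    """Change in the foreground area over the last `window` frames."""
    if len(areas) < window:
        return 0
    return areas[-1] - areas[-window]
```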
- FIG. 17 shows a first example in which the detection unit 15 uses the change in the foreground area to detect an abnormality.
- the graph on the upper side of FIG. 17 is the same as the graph shown in FIG.
- the lower graph of FIG. 17 is a graph of the change in the foreground area over time.
- the area of the foreground increases from before the abnormal time zone to during the abnormal time zone.
- the area of the foreground maintains a maximum value from the abnormal time zone to after the abnormal time zone. That is, since the scattering degree Et is small after the abnormal time zone while the area of the foreground maintains the maximum value, the detection unit 15 determines that the type of abnormality is a gathering.
- the state is “(1) normal” before the abnormal time zone.
- during the abnormal time zone, the detection unit 15 detects "(2) abnormality". "(2) abnormality" is actually confusion or a gathering that involves movement.
- after the abnormal time zone, the detection unit 15 detects that the type of "(3) abnormality" is a gathering, based on the result of the comparison between the threshold TH and the scattering degree Et and on the change in the foreground area over a plurality of frames.
- in this way, the detection unit 15 can also detect an abnormality that involves no movement, such as a gathering.
- FIG. 18 shows a second example in which the detecting unit 15 detects an abnormality using the change in the foreground area.
- the graph on the upper side of FIG. 18 is the same as the graph shown in FIG.
- the lower graph in FIG. 18 is a graph of the change in the foreground area over time.
- the area of the foreground decreases from before the abnormal time zone to during the abnormal time zone.
- the area of the foreground maintains a minimum value from the abnormal time zone to after the abnormal time zone. That is, since the scattering degree Et is small after the abnormal time period and the area of the foreground keeps the minimum value, the detection unit 15 determines that the type of abnormality is evacuation.
- the state is “(1) normal” before the abnormal time zone.
- during the abnormal time zone, the detection unit 15 detects "(2) abnormality". "(2) abnormality" is actually confusion or evacuation. After the abnormal time zone, the detection unit 15 detects that the type of "(3) abnormality" is evacuation, based on the result of the comparison between the threshold TH and the scattering degree Et and on the change in the foreground area over a plurality of frames. In this way, by using the change in the foreground area, the detection unit 15 can detect the type of abnormality that occurred during the abnormal time zone.
- the detection unit 15 can detect the type of abnormality from the change in the scattering degree Et and the change in the foreground area. Specifically, correspondence information that associates a change in the scattering degree Et and a change in the foreground area with a type of abnormality is set in the detection unit 15 for each type of abnormality. The detection unit 15 can detect the type of abnormality by referring to the correspondence information. Note that the abnormality detection device 100 is not limited to the examples of FIGS. 17 and 18 and can detect various types of abnormality.
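A sketch of the correspondence information described above, i.e. rules that associate a pattern of the scattering degree Et and the foreground area with a type of abnormality. The two rules mirror the examples of FIGS. 17 and 18; the concrete conditions and labels are assumptions.

```python
def classify_after_abnormality(scattering_is_low, area_trend):
    """Look up the abnormality type from the pattern of E_t and foreground area.

    scattering_is_low: True if E_t has fallen back below TH after the abnormal
    time zone. area_trend: "max_kept" if the foreground area stays near its
    maximum, "min_kept" if it stays near its minimum.
    """
    correspondence = {
        (True, "max_kept"): "gathering",   # FIG. 17 example
        (True, "min_kept"): "evacuation",  # FIG. 18 example
    }
    return correspondence.get((scattering_is_low, area_trend), "unknown")
```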
- the coping method such as “guard dispatch” or “evacuation guidance” differs depending on the type of abnormal behavior. Therefore, it is effective to identify the type of abnormal behavior.
- since the detection unit 15 of the abnormality detection device 100 detects the type of abnormality from the change in the scattering degree Et and the change in the foreground area, it is possible to select a coping method according to the type of abnormal behavior.
- FIG. 19 shows a configuration of the abnormality detection device 100 according to the second modification of the abnormality detection device 100.
- the electronic circuit 90 of FIG. 19 is a dedicated electronic circuit that realizes the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, the detection unit 15, the main storage device 20, the auxiliary storage device 30, the input interface 40, and the output interface 50.
- the electronic circuit 90 is connected to the signal line 91.
- the electronic circuit 90 is specifically a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, a logic IC, a GA, an ASIC, or an FPGA.
- GA is an abbreviation for Gate Array.
- ASIC is an abbreviation for Application Specific Integrated Circuit.
- FPGA is an abbreviation for Field-Programmable Gate Array.
- the functions of the constituent elements of the abnormality detection device 100 may be realized by one electronic circuit, or may be realized by being distributed to a plurality of electronic circuits. As another modification, some functions of the components of the abnormality detection device 100 may be realized by an electronic circuit, and the remaining functions may be realized by software.
- Each of the processor 10 and the electronic circuit 90 is also called a processing circuit.
- the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15 may be realized by the processing circuitry.
- the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, the detection unit 15, the main storage device 20, the auxiliary storage device 30, the input interface 40, and the output interface 50 may likewise be realized by the processing circuitry.
- the types of weighting by the weighting unit 13 are described as types 1 to 4.
- the weighting by the weighting unit 13 is not limited to Type 1 to Type 4.
- although the first embodiment of the present invention has been described above, the first embodiment may be partially implemented, or two or more portions of the first embodiment may be combined and implemented. The present invention is not limited to the first embodiment, and various modifications can be made as necessary.
- 10 processor, 11 image acquisition unit, 12 motion extraction unit, 13 weighting unit, 14 scattering degree calculation unit, 15 detection unit, 20 main storage device, 30 auxiliary storage device, 40 input interface, 50 output interface, 61, 62 person, 71, 72, 73, 74, 75 frame, 81 video area, 82 person vector, 83 confluence area, 84 main line, 90 electronic circuit, 91 signal line, 100 abnormality detection device, 101 signal line, 200 camera.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Alarm Systems (AREA)
- Image Analysis (AREA)
Abstract
An abnormality detection device (100) comprises an image acquisition unit (11), a motion extraction unit (12), a weighting unit (13), a scattering degree calculation unit (14), and a detection unit (15). The image acquisition unit (11) acquires a plurality of images from a camera (200). The motion extraction unit (12) extracts, from a plurality of moving bodies included in the image in one frame, a plurality of motion vectors determined by the magnitude of a motion and the direction of the motion. The weighting unit (13) calculates an occurrence probability for each motion vector of the plurality of motion vectors, and weights the occurrence probabilities. The scattering degree calculation unit (14) calculates the degree of scattering in the one frame on the basis of the weighted occurrence probabilities. The detection unit (15) compares the degree of scattering to a threshold value, and detects abnormalities on the basis of the comparison results.
Description
The present invention relates to an abnormal behavior detection device, an abnormal behavior detection program, and an abnormal behavior detection method for detecting abnormal behavior from the movement of a crowd captured in an image.
Conventionally, there is a technique for detecting abnormal behavior from the movement of a crowd (for example, Non-Patent Document 1). The technique of Non-Patent Document 1 detects motion from temporal changes of the foreground and creates an activity map by adding together the foreground images of several consecutive frames. Abnormal behavior is detected when the Shannon entropy of the occurrence frequencies of pixel values calculated from the activity-map histogram is high and the difference between activity maps created at consecutive times is large.
However, with the technique of Non-Patent Document 1, a situation in which the crowd continues to move is always determined to be abnormal, so in a place where a crowd constantly moves back and forth, such as a station, there is a possibility that the normal state is erroneously detected as an abnormal state. Further, most conventional techniques are specialized in detecting a specific abnormal behavior, and there is no technique for detecting a plurality of abnormal behaviors and discriminating the type of each abnormal behavior.
The present invention aims to provide a device that accurately detects abnormal behavior without misdetecting normal behavior of a crowd as abnormal behavior.
The abnormality detection device of the present invention includes:
an image acquisition unit that acquires a plurality of images;
a motion extraction unit that extracts, from a plurality of moving objects included in an image of one frame, a plurality of motion vectors determined by a magnitude of motion and a direction of motion;
a weighting unit that calculates an occurrence probability for each motion vector of the plurality of motion vectors and weights at least one of the occurrence probabilities;
a scattering degree calculation unit that calculates a scattering degree of the one frame based on the weighted occurrence probabilities; and
a detection unit that compares the scattering degree with a threshold value and detects an abnormality based on the comparison result.
According to the present invention, it is possible to provide a device that accurately detects abnormal behavior without misdetecting normal behavior of a crowd as abnormal behavior.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the drawings, the same or corresponding parts are designated by the same reference numerals. In the description of the embodiments, description of the same or corresponding parts is omitted or simplified as appropriate.
Embodiment 1.
The abnormality detection device 100 according to the first embodiment will be described with reference to FIGS. 1 to 19. The abnormality detection device 100 is a device that calculates the degree of scattering of each of the plurality of frames that form an image and detects an abnormality occurring in the image from changes in the degree of scattering.
*** Description of structure ***
FIG. 1 shows the hardware configuration of the abnormality detection device 100. The abnormality detection device 100 is a computer. The abnormality detection device 100 includes the processor 10 and other hardware such as the main storage device 20, the auxiliary storage device 30, the input interface 40, and the output interface 50. The processor 10 is connected to the other hardware via the signal line 101 and controls the other hardware.
The abnormality detection device 100 includes, as functional elements, an image acquisition unit 11, a motion extraction unit 12, a weighting unit 13, a scattering degree calculation unit 14, and a detection unit 15. The functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15 are realized by an abnormality detection program, which is software.
The processor 10 is a device that executes the abnormality detection program. The abnormality detection program is a program that realizes the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15. The processor 10 is an IC (Integrated Circuit) that performs arithmetic processing. Specific examples of the processor 10 are a CPU (Central Processing Unit), a DSP (Digital Signal Processor), and a GPU (Graphics Processing Unit).
The main storage device 20 is a storage device that temporarily stores data. Specific examples of the main storage device 20 are SRAM (Static Random Access Memory) and DRAM (Dynamic Random Access Memory). The main storage device 20 holds the calculation results of the processor 10.
The auxiliary storage device 30 is a storage device that stores data in a nonvolatile manner. A specific example of the auxiliary storage device 30 is an HDD (Hard Disk Drive). The auxiliary storage device 30 may also be a portable recording medium such as an SD (registered trademark) (Secure Digital) memory card, a CF (CompactFlash), a NAND flash, a flexible disk, an optical disk, a compact disk, a Blu-ray (registered trademark) disk, or a DVD (Digital Versatile Disk).
The input interface 40 is a port to which various devices are connected and through which data from those devices is input. In FIG. 1, the camera 200 is connected to the input interface 40, and the images of the camera 200 are input.
The output interface 50 is a port to which various devices are connected and to which data is output by the processor 10. A processing result, such as a detection result by the detection unit 15, is output to various devices via the output interface 50.
The abnormality detection program is stored in the main storage device 20. The abnormality detection program is read from the main storage device 20 into the processor 10 and executed by the processor 10. The main storage device 20 stores not only the abnormality detection program but also an OS (Operating System).
The processor 10 executes the abnormality detection program while executing the OS.
The abnormality detection program and the OS may be stored in the auxiliary storage device 30. The abnormality detection program and the OS stored in the auxiliary storage device 30 are loaded into the main storage device 20 and executed by the processor 10. Note that part or all of the abnormality detection program may be incorporated in the OS.
The abnormality detection device 100 may include a plurality of processors that replace the processor 10. For example, the abnormality detection device 100 may separately include an image processor. These processors share the execution of the abnormality detection program. Each processor, like the processor 10, is a device that executes the abnormality detection program.
The data, information, signal values, and variable values used, processed, or output by the abnormality detection program are stored in the main storage device 20, the auxiliary storage device 30, or a register or cache memory in the processor 10.
The abnormality detection program is a program that causes a computer to execute each process, each procedure, or each step obtained by replacing the "unit" of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15 with "process", "procedure", or "step".
The abnormality detection method is a method performed by the abnormality detection device 100, which is a computer, executing the abnormality detection program. The abnormality detection program may be stored in a computer-readable recording medium and provided, or may be provided as a program product.
*** Description of operation ***
The operation of the abnormality detection device 100 will be described below. The operation of the abnormality detection device 100 corresponds to the abnormality detection method. The procedure of the abnormality detection method corresponds to the procedure of the abnormality detection program.
FIG. 2 shows the flow of the abnormality detection processing performed by the abnormality detection device 100.
FIG. 3 is a flowchart of the operation of the abnormality detection device 100. In FIG. 2, the step numbers of FIG. 3 are attached to the processing corresponding to FIG. 3. An outline of the operation of the abnormality detection device 100 will be described with reference to FIGS. 2 and 3.
In step S101, the image acquisition unit 11 acquires a plurality of images. Specifically, as shown in FIG. 1, the image acquisition unit 11 acquires a plurality of images from the camera 200 via the input interface 40.
In step S102, the motion extraction unit 12 extracts a plurality of motion vectors determined by the magnitude of the motion and the direction of the motion from the plurality of moving objects included in the image of one frame. The motion vector is expressed as the motion vector vt(r, θ) or simply vt(r, θ). The motion vector vt(r, θ) will be described later. The subscript t means that the quantity relates to one frame.
In the first embodiment, a person is taken as an example of the moving body from which the motion vector vt(r, θ) is extracted. However, the moving body from which the motion vector vt(r, θ) is extracted is not limited to a person. The moving body can be any moving body, such as an insect, a vehicle, or an object floating in a liquid.
In step S103, the weighting unit 13 calculates the occurrence probability pt(r, θ) for each motion vector vt(r, θ) of the plurality of motion vectors vt(r, θ). The weighting unit 13 weights at least one of the occurrence probabilities pt(r, θ). The occurrence probability is expressed as the occurrence probability pt(r, θ) or simply pt(r, θ). The occurrence probability pt(r, θ) will be described later.
The subscript t means that the quantity relates to one frame. The occurrence probability pt(r, θ) has a one-to-one correspondence with the motion vector vt(r, θ). In the first embodiment, the weighting by the weighting unit 13 is described as types 1 to 4. The weighting of types 1 to 4 will be described later.
In step S104, the scattering degree calculation unit 14 calculates the scattering degree Et of the one frame based on the weighted occurrence probabilities pt(r, θ). The degree of scattering is also called entropy. Details of the scattering degree Et will be described later.
In step S105, the detection unit 15 compares the scattering degree Et with the threshold value TH and detects an abnormality based on the comparison result. The threshold TH is set in the detection unit 15.
The motion vector vt(r, θ) will be described with reference to FIGS. 4 to 6. The motion vector vt(r, θ) is described for one frame of the image.
FIG. 4 shows a frame 71, which is one frame of an image. Frame 71 shows a randomly moving crowd. The scattering degree Et is large in a randomly moving crowd. In FIG. 4, a plurality of persons 61 shown as circles move irregularly. Each circle represents a person. The direction of the arrow 82 attached to the circle of a person 61 indicates the direction in which the person moves, and the size of the arrow indicates the speed at which the person moves. To distinguish it from the motion vector vt(r, θ), the arrow 82 is called the person vector 82.
FIG. 5 shows a frame 72, which is one frame of an image. Frame 72 shows a crowd that moves regularly. The scattering degree Et is small in a regularly moving crowd.
FIG. 6 is a diagram for explaining the motion vector vt(r, θ) and the occurrence probability pt(r, θ). The description assumes the frame 71 of FIG. 4. The uppermost part of FIG. 6 shows the distribution of the direction θ of the person vectors 82. The horizontal axis represents the direction θ of the person vector 82, and the vertical axis represents the number Nθ of directions θ. The circle on the right side shows how the direction θ is measured. The direction θ is measured counterclockwise with respect to the X axis. The X-Y coordinates are shown in FIGS. 4, 5, 9, 11, 13, and 15. The second row of FIG. 6 shows the distribution of the size r of the person vectors 82. The horizontal axis represents the size r of the person vector 82, and the vertical axis represents the number Nr of sizes r. The third row of FIG. 6 schematically shows the distribution of the motion vectors vt(r, θ). The motion vectors vt(r, θ) are extracted from the frame 71 by the motion extraction unit 12. The motion vector vt(r, θ) is obtained by referring to another frame different from the frame 71. Since the motion vector vt(r, θ) is determined by the size r and the direction θ, when a plurality of person vectors 82 are the same, the plurality of identical person vectors 82 are aggregated into one motion vector vt(r, θ). The fourth row of FIG. 6 schematically shows the distribution of the occurrence probabilities pt(r, θ) of the motion vectors vt(r, θ). The weighting unit 13 calculates the occurrence probability pt(r, θ) for each motion vector vt(r, θ).
FIG. 7 shows the relationship among the scattering degree E_t calculated by the scattering degree calculation unit 14, the occurrence probability p_t(r, θ), and the weight. In the formula of the scattering degree E_t, log[p_t(r, θ)] is sometimes called the information amount. Expression 1 in FIG. 7 is the expression of the scattering degree E_t when no weight is used. Expression 2 in FIG. 7 is the expression of the scattering degree E_t when the weight w_r,θ is reflected. The details of the weight w_r,θ are described later.
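Reading Expression 1 and Expression 2 together with the note in the type 1 description below, which says the weight multiplies only the occurrence probability outside the logarithm, the scattering degree can be sketched as follows. The exact expressions of FIG. 7 are not reproduced in this text, so this entropy-style form is an assumption consistent with the description, not a quotation of the figure.

```python
import numpy as np

def scattering_degree(p, w=None, eps=1e-12):
    """Scattering degree E_t.

    Expression 1 (no weight):      E_t = -sum( p * log(p) )
    Expression 2 (with weight w):  E_t = -sum( (w * p) * log(p) )
    The weight w_r,theta is applied only to the probability that multiplies the
    information amount log p_t(r, theta), not to the probability inside the log.
    """
    p = np.asarray(p, dtype=float)
    w = np.ones_like(p) if w is None else np.asarray(w, dtype=float)
    info = np.log(np.clip(p, eps, None))       # information amount log p_t(r, theta)
    return float(-(w * p * info).sum())

# A distribution spread over many motion vectors (irregular crowd) gives a large E_t,
# a distribution concentrated on one motion vector (regular crowd) gives a small E_t.
uniform = np.full(64, 1 / 64)
peaked = np.zeros(64); peaked[0] = 1.0
print(scattering_degree(uniform), scattering_degree(peaked))
```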
FIG. 8 is a graph showing the detection of an abnormality by the detection unit 15. The scattering degree E_t is calculated for each t, that is, for each frame, and this corresponds to the time on the horizontal axis of FIG. 8. The vertical axis of FIG. 8 is the magnitude of the scattering degree E_t. The detection unit 15 determines that there is an abnormality in the acquired images when the scattering degree E_t is equal to or greater than the threshold value TH. In FIG. 8, the detection unit 15 detects the abnormal time zone as abnormal.
The graph of FIG. 8 is generated by the detection unit 15 for each of types 1 to 4 shown in step S103 of FIG. 2.
The threshold value TH is set for each of types 1 to 4. For example, each threshold TH is set in the abnormality detection program that implements the detection unit 15. A different kind of threshold TH is set for each type. In the case of FIG. 2, the threshold TH of type 1 is a threshold indicating confusion, the threshold TH of type 2 is a threshold indicating convection, the threshold TH of type 3 is a threshold indicating merging, and the threshold TH of type 4 is a threshold indicating reverse running. Therefore, the detection unit 15 can detect, as the abnormality, both the occurrence of an abnormality and the type of the abnormality that has occurred.
Specifically, it is as follows. The detection unit 15 can detect the occurrence of an abnormality when the scattering degree E_t exceeds the threshold TH. In addition, when the scattering degree E_t exceeds the threshold TH, the detection unit 15 can identify the type of abnormality that has occurred from the kind of threshold TH that was exceeded.
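A minimal sketch of this per-type comparison; the threshold values and type labels used here are placeholders, since the specification does not disclose concrete values.

```python
# Hypothetical per-type thresholds; the actual values are not given in the specification.
THRESHOLDS = {"type1_confusion": 3.5, "type2_convection": 2.0,
              "type3_merging": 1.5, "type4_reverse_running": 1.0}

def detect(scattering_by_type):
    """Return the abnormality types whose weighted scattering degree E_t is
    greater than or equal to the corresponding threshold TH."""
    return [name for name, e_t in scattering_by_type.items()
            if e_t >= THRESHOLDS[name]]

print(detect({"type1_confusion": 4.2, "type2_convection": 0.3,
              "type3_merging": 0.1, "type4_reverse_running": 0.2}))
# ['type1_confusion'] -> an abnormality of the confusion type is detected
```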
Returning to FIG. 2: in step S103, the abnormality detection device 100 performs the type 1 to type 4 weighting processes in parallel on the monitoring images acquired in step S101. The processing is not limited to performing the type 1 to type 4 weighting processes simultaneously; a configuration in which any one, any two, or any three or more of types 1 to 4 can be selected is also possible. The weighting unit 13 performs the weighting process according to the selection command.
<Type 1>
Type 1 of the weighting process performed by the weighting unit 13 will be described with reference to FIGS. 9 and 10. In type 1, the weighting unit 13 weights the occurrence probability p_t(r, θ) according to the magnitude r of the motion of the motion vector v_t(r, θ) for which the occurrence probability p_t(r, θ) is calculated.
Type 1 can detect a confusion state as the abnormal state. In other words, a state detected as abnormal by type 1 is a confusion state.
FIG. 9 shows the same frame 71 as FIG. 4.
FIG. 10 schematically shows the occurrence probabilities p_t(r, θ) obtained from FIG. 9. The occurrence probability p_t(r, θ) is determined by the magnitude r and the direction θ. In type 1, the larger the magnitude r, the larger the weight w_r,θ; w_r,θ may be made proportional to the magnitude r. Note that the weight w_r,θ is not applied to the occurrence probability p_t(r, θ) inside the information amount log[p_t(r, θ)]. The weight w_r,θ is applied to the occurrence probability p_t(r, θ) that is multiplied by the information amount log[p_t(r, θ)]. This is the same for types 1 to 4.
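Under the same binning assumptions as the earlier sketches, a type 1 weight that grows with the magnitude r could look as follows; making it simply proportional to the bin-center magnitude is only one of the options described above.

```python
import numpy as np

def type1_weights(r_bins=8, theta_bins=8, r_max=20.0):
    """Type 1: weight w_r,theta grows with the magnitude r of the motion vector
    (here proportional to the bin-center magnitude) and does not depend on theta."""
    r_centers = (np.arange(r_bins) + 0.5) * (r_max / r_bins)
    return np.tile(r_centers[:, None], (1, theta_bins))  # shape (r_bins, theta_bins)

# Combined with the earlier sketches: E_t = scattering_degree(p_t, type1_weights())
```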
<Type 2>
Type 2 of the weighting process performed by the weighting unit 13 will be described with reference to FIGS. 11 and 12. In type 2, the weighting unit 13 weights the occurrence probability p_t(r, θ) according to the direction θ of the motion of the motion vector v_t(r, θ) for which the occurrence probability p_t(r, θ) is calculated. More specifically, in type 2 one frame is divided into left and right halves, and different weighting is applied on the left side and the right side of the screen. Type 2 can detect a convection state as the abnormal state. In other words, a state detected as abnormal by type 2 is a convection state.
FIG. 11 shows a frame 73 for explaining convection. In FIG. 11, the person vectors 82 on the left side of the screen all point to the right, and the person vectors 82 on the right side of the screen all point to the left.
FIG. 12 schematically shows the occurrence probabilities p_t(r, θ) obtained from FIG. 11. The occurrence probability p_t(r, θ) is determined by the magnitude r and the direction θ, and in type 2 the weighting is performed according to the direction θ. The characteristic of type 2 is that the weighted directions differ between the left side and the right side of the screen. For the left side of the screen, the top region of FIG. 12 (3π/2 ≤ θ ≤ 2π) and the bottom region (0 ≤ θ ≤ π/2) are weighted. For the right side of the screen, the central region of FIG. 12 (π/2 ≤ θ ≤ 3π/2) is weighted. The weighting unit 13 may give the occurrence probabilities p_t(r, θ) in the weighted regions a weight w_r,θ that depends on the corresponding magnitude r; for example, the weight w_r,θ may be made proportional to the magnitude r. Alternatively, the weighting unit 13 may give a fixed value as the weight w_r,θ. As can be seen from FIG. 11, the rightward direction is emphasized on the left side of the screen and the leftward direction is emphasized on the right side of the screen, so convection is easy to detect.
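A simplified sketch of the type 2 idea, assuming the frame is split at its horizontal midpoint, that 3π/2 ≤ θ ≤ 2π and 0 ≤ θ ≤ π/2 count as rightward and π/2 ≤ θ ≤ 3π/2 as leftward as in FIG. 12, and that a bin is emphasized whenever an emphasized person vector falls into it; the fixed weight value is a placeholder.

```python
import numpy as np

def type2_probabilities_and_weights(x, r, theta, frame_width,
                                    r_bins=8, theta_bins=8, r_max=20.0, weight=1.0):
    """Type 2 (convection): emphasize rightward motion on the left half of the
    frame and leftward motion on the right half, using a fixed weight value."""
    x, r, theta = map(np.asarray, (x, r, theta))
    theta = np.mod(theta, 2 * np.pi)
    left = x < frame_width / 2.0
    rightward = (theta <= np.pi / 2) | (theta >= 3 * np.pi / 2)
    emphasized = (left & rightward) | (~left & ~rightward)

    # One shared histogram of motion vectors, plus a weight per bin: a bin gets the
    # weight if any emphasized person vector falls into it (a simplification).
    edges = [np.linspace(0, r_max, r_bins + 1), np.linspace(0, 2 * np.pi, theta_bins + 1)]
    hist, _, _ = np.histogram2d(r, theta, bins=edges)
    hist_emph, _, _ = np.histogram2d(r[emphasized], theta[emphasized], bins=edges)
    p_t = hist / hist.sum() if hist.sum() else hist
    w = np.where(hist_emph > 0, weight, 0.0)
    return p_t, w
```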
<Type 3>
Type 3 of the weighting process performed by the weighting unit 13 will be described with reference to FIGS. 13 and 14. In type 3, the weighting unit 13 performs weighting according to a video area 81 within one frame. The frame 74 in FIG. 13 is one frame.
FIG. 13 is a diagram showing the video area 81. The video area 81 in FIG. 13 shows merging from a merging area 83 into a main line 84.
FIG. 14 schematically shows the occurrence probabilities p_t(r, θ) obtained from FIG. 13.
The motion extraction unit 12 extracts the video area 81, which is a partial area of one frame, from the frame 74, and extracts a plurality of motion vectors from the video area 81. The motion extraction unit 12 also extracts a plurality of motion vectors from the exclusion area, which is the part of the frame from which the video area 81 is removed. In FIG. 13, the exclusion area is the area of the frame 74 excluding the video area 81.
The weighting unit 13 calculates an occurrence probability for each motion vector of the plurality of motion vectors extracted from the video area 81. The weighting unit 13 also calculates an occurrence probability for each motion vector of the plurality of motion vectors extracted from the exclusion area.
The weighting unit 13 weights the plurality of occurrence probabilities calculated based on the video area 81 differently from the plurality of occurrence probabilities calculated based on the exclusion area.
Specifically, it is as follows. The motion vector v_t(r, θ) extracted in the video area 81 is written as v_in,t(r, θ), and the motion vector v_t(r, θ) extracted in the exclusion area is written as v_out,t(r, θ). Similarly, the occurrence probability of a motion vector v_in,t(r, θ) is written as p_in,t(r, θ), and the occurrence probability of a motion vector v_out,t(r, θ) is written as p_out,t(r, θ). The scattering degree calculation unit 14 distinguishes p_in,t(r, θ) from p_out,t(r, θ) in (Expression 2) of FIG. 7. That is, in (Expression 2) of FIG. 7, p_in,t(r, θ) and p_out,t(r, θ) are treated as different terms.
The motion extraction unit 12 extracts the video area 81 from the frame 74 and extracts a plurality of motion vectors v_in,t(r, θ) from the video area 81. The motion extraction unit 12 extracts a plurality of motion vectors v_out,t(r, θ) from the exclusion area.
The weighting unit 13 calculates an occurrence probability p_in,t(r, θ) for each motion vector of the plurality of motion vectors v_in,t(r, θ) extracted from the video area 81. The weighting unit 13 also calculates an occurrence probability p_out,t(r, θ) for each motion vector of the plurality of motion vectors v_out,t(r, θ) extracted from the exclusion area.
The range indicated by the broken line in FIG. 14 is the range of the occurrence probabilities p_in,t(r, θ) corresponding to the motion vectors v_in,t(r, θ) extracted from the video area 81 by the motion extraction unit 12. In the range indicated by the broken line in FIG. 14, nine occurrence probabilities p_in,t(r, θ) are shown as dots. The weighting unit 13 weights the plurality of occurrence probabilities p_in,t(r, θ) calculated based on the video area 81 differently from the plurality of occurrence probabilities p_out,t(r, θ) calculated based on the exclusion area. For example, the weighting unit 13 multiplies each occurrence probability p_in,t(r, θ) by a weight w_r,θ that is inversely proportional to the magnitude r of that occurrence probability p_in,t(r, θ), and multiplies each occurrence probability p_out,t(r, θ) by zero as the weight w_r,θ. This has the effect of emphasizing the state of the video area 81 in the detection.
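A sketch of the type 3 bookkeeping under the same assumptions as before; the rectangular shape of the video area 81 is an assumption, and the weight inversely proportional to r inside the area with zero weight outside follows the example given above.

```python
import numpy as np

def type3_probabilities_and_weights(x, y, r, theta, area,
                                    r_bins=8, theta_bins=8, r_max=20.0):
    """Type 3 (merging): split person vectors into those inside the video area 81
    (p_in) and those in the exclusion area (p_out), and weight them differently:
    w inversely proportional to r inside the area, zero in the exclusion area."""
    x, y, r, theta = map(lambda a: np.asarray(a, dtype=float), (x, y, r, theta))
    theta = np.mod(theta, 2 * np.pi)
    x0, y0, x1, y1 = area                      # rectangular video area (assumption)
    inside = (x >= x0) & (x <= x1) & (y >= y0) & (y <= y1)

    edges = [np.linspace(0, r_max, r_bins + 1), np.linspace(0, 2 * np.pi, theta_bins + 1)]
    h_in, _, _ = np.histogram2d(r[inside], theta[inside], bins=edges)
    h_out, _, _ = np.histogram2d(r[~inside], theta[~inside], bins=edges)
    n = h_in.sum() + h_out.sum()
    p_in, p_out = (h_in / n, h_out / n) if n else (h_in, h_out)

    r_centers = (np.arange(r_bins) + 0.5) * (r_max / r_bins)
    w_in = np.tile((1.0 / r_centers)[:, None], (1, theta_bins))  # inversely proportional to r
    w_out = np.zeros_like(w_in)                                  # exclusion area gets weight zero
    return (p_in, w_in), (p_out, w_out)

# The in-area and out-of-area terms are then treated separately in Expression 2, e.g.
#   E_t = scattering_degree(p_in, w_in) + scattering_degree(p_out, w_out)
```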
<Type 4>
Type 4 of the weighting process performed by the weighting unit 13 will be described with reference to FIGS. 15 and 16. Type 4 is similar to type 2, but unlike type 2 it does not divide the screen into left and right halves; over the whole frame, the weighting unit 13 performs weighting according to the direction θ of the occurrence probability p_t(r, θ). Type 4 can detect a reverse running state as the abnormal state. In other words, a state detected as abnormal by type 4 is a reverse running state.
FIG. 15 shows a frame 75 for explaining reverse running. In FIG. 15, the person vectors 82 of the plurality of persons 61 all point to the right; only the person vector 82 of the person 62, shown as a black circle, points to the left.
FIG. 16 schematically shows the occurrence probabilities p_t(r, θ) obtained from FIG. 15. In type 4, the weight w_r,θ is given to the occurrence probability p_t(r, θ) according to the direction θ. For FIG. 15, the weight w_r,θ is zero in the top region of FIG. 16 (3π/2 ≤ θ ≤ 2π) and in the bottom region (0 ≤ θ ≤ π/2). In the central region (π/2 ≤ θ ≤ 3π/2), the weighting unit 13 gives the occurrence probability p_t(r, θ) a weight w_r,θ proportional to the magnitude r. With this weighting, the reverse running of the person 62 in FIG. 15 can be detected.
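A sketch of the type 4 weighting under the same binning assumptions, with zero weight for 0 ≤ θ ≤ π/2 and 3π/2 ≤ θ ≤ 2π and a weight proportional to r for π/2 ≤ θ ≤ 3π/2, as in the example of FIG. 16.

```python
import numpy as np

def type4_weights(r_bins=8, theta_bins=8, r_max=20.0):
    """Type 4 (reverse running): zero weight for rightward directions
    (0 <= theta <= pi/2 and 3*pi/2 <= theta <= 2*pi), weight proportional to r
    for leftward directions (pi/2 <= theta <= 3*pi/2)."""
    r_centers = (np.arange(r_bins) + 0.5) * (r_max / r_bins)
    theta_centers = (np.arange(theta_bins) + 0.5) * (2 * np.pi / theta_bins)
    leftward = (theta_centers >= np.pi / 2) & (theta_centers <= 3 * np.pi / 2)
    return r_centers[:, None] * leftward[None, :].astype(float)

# Applied over the whole frame (no left/right split):
#   E_t = scattering_degree(p_t, type4_weights())
```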
As described above, the weighting unit 13 performs the type 1 to type 4 weighting. The scattering degree calculation unit 14 calculates the scattering degree E_t using the weights w_r,θ determined by the weighting unit 13. As described with reference to FIG. 8, the detection unit 15 determines that an abnormality has occurred when the scattering degree E_t is equal to or greater than the threshold TH. With the above processing, abnormalities such as confusion, convection, merging, and reverse running can be detected with high sensitivity. Furthermore, the weighting is not limited to these types; by applying arbitrary weights, arbitrary abnormalities can be detected.
<Modification 1>
As Modification 1 of the abnormality detection device 100, a method in which the detection unit 15 uses the area of the foreground of a frame to identify the abnormality will be described. The detection unit 15 detects an abnormality based on the result of comparing the threshold TH with the scattering degree E_t and on the foreground area in at least one frame.
As a specific example of using the foreground area, the type of abnormality can be specified from the shape of the foreground of a specific frame or the shape of the foreground of the frame in which the abnormality is detected.
Alternatively, the type of abnormality can be identified from the change in the foreground area over a plurality of frames. In that case, the detection unit 15 calculates the change in the foreground area over a plurality of frames, frame t, frame t+1, frame t+2, and so on. Using the change in the foreground area together with the change in the scattering degree E_t shown in FIG. 8, the detection unit 15 can detect the specific type of abnormality as the abnormality.
FIG. 17 shows a first example in which the detection unit 15 detects an abnormality using the change in the foreground area. The upper graph of FIG. 17 is the same as the graph shown in FIG. 8. The lower graph of FIG. 17 shows the change in the foreground area over time. In the lower graph, the foreground area increases from before the abnormal time zone into the abnormal time zone, and then stays near its maximum value from the abnormal time zone onward. That is, because the scattering degree E_t becomes small after the abnormal time zone while the foreground area remains near its maximum, the detection unit 15 determines that the type of abnormality is crowding. In FIG. 17, the state before the abnormal time zone is "(1) normal". In the abnormal time zone, the detection unit 15 detects "(2) abnormal"; this is actually confusion or gathering accompanied by movement. After the abnormal time zone, the detection unit 15 detects the type of "(3) abnormal" as crowding, based on the result of comparing the threshold TH with the scattering degree E_t and on the change in the foreground area over a plurality of frames. In this way, by using the change in the foreground area, the detection unit 15 can also detect an abnormality that involves no movement, such as crowding.
FIG. 18 shows a second example in which the detection unit 15 detects an abnormality using the change in the foreground area. The upper graph of FIG. 18 is the same as the graph shown in FIG. 8. The lower graph of FIG. 18 shows the change in the foreground area over time. In the lower graph, the foreground area decreases from before the abnormal time zone into the abnormal time zone, and then stays near its minimum value from the abnormal time zone onward. That is, because the scattering degree E_t becomes small after the abnormal time zone while the foreground area remains near its minimum, the detection unit 15 determines that the type of abnormality is evacuation. In FIG. 18, the state before the abnormal time zone is "(1) normal". In the abnormal time zone, the detection unit 15 detects "(2) abnormal"; this is actually confusion or evacuation. After the abnormal time zone, the detection unit 15 detects the type of "(3) abnormal" as evacuation, based on the result of comparing the threshold TH with the scattering degree E_t and on the change in the foreground area over a plurality of frames. In this way, by using the change in the foreground area, the detection unit 15 can detect the type of abnormality that was occurring in the abnormal time zone.
In FIGS. 17 and 18, the detection unit 15 can detect the type of abnormality from the change in the scattering degree E_t and the change in the foreground area. Specifically, correspondence information in which a type of abnormality is associated with a change in the scattering degree E_t and a change in the foreground area is set in the detection unit 15 for each type of abnormality. The detection unit 15 can detect the type of abnormality by referring to this correspondence information. The abnormality detection device 100 is not limited to the examples of FIGS. 17 and 18 and can detect various types of abnormality.
*** Effect of Embodiment 1 ***
(1) Since the abnormality detection device 100 weights the occurrence probabilities p_t(r, θ) with the weighting unit 13, it can accurately detect a variety of abnormal behaviors. Specifically, preventing crowd accidents in environments where many people gather, such as stations, is one of the important elements of public safety. One cause of crowd accidents is the occurrence of a panic state triggered by a suspicious object or a suddenly ill person. Signs of a panic state include "confusion behavior", in which the crowd moves around in disorder, and "crowding behavior", in which the crowd gathers in one place. Early detection of these signs is therefore effective in preventing crowd accidents. Because the abnormality detection device 100 has the weighting unit 13, it can detect these signs early.
(2) In addition, the countermeasure, such as "dispatching guards" or "guiding evacuation", differs depending on the type of abnormal behavior, so identifying the type of abnormal behavior is useful. Since the detection unit 15 of the abnormality detection device 100 detects the type of abnormality from the change in the scattering degree E_t and the change in the foreground area, a countermeasure suited to the type of abnormal behavior can be selected.
<Modification 2>
In the abnormality detection device 100 of FIG. 1, the functions of the abnormality detection device 100 are realized by software; as Modification 2, the functions of the abnormality detection device 100 may instead be realized by hardware.
FIG. 19 shows the configuration of the abnormality detection device 100 according to Modification 2. The electronic circuit 90 of FIG. 19 is a dedicated electronic circuit that realizes the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, the detection unit 15, the main storage device 20, the auxiliary storage device 30, the input interface 40, and the output interface 50. The electronic circuit 90 is connected to the signal line 91. Specifically, the electronic circuit 90 is a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, a logic IC, a GA, an ASIC, or an FPGA. GA is an abbreviation for Gate Array. ASIC is an abbreviation for Application Specific Integrated Circuit. FPGA is an abbreviation for Field-Programmable Gate Array. The functions of the components of the abnormality detection device 100 may be realized by one electronic circuit or distributed over a plurality of electronic circuits. As another modification, some functions of the components of the abnormality detection device 100 may be realized by an electronic circuit and the remaining functions by software.
Each of the processor 10 and the electronic circuit 90 is also called processing circuitry.
In the abnormality detection device 100, the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, and the detection unit 15 may be realized by processing circuitry. Alternatively, the functions of the image acquisition unit 11, the motion extraction unit 12, the weighting unit 13, the scattering degree calculation unit 14, the detection unit 15, the main storage device 20, the auxiliary storage device 30, the input interface 40, and the output interface 50 may all be realized by processing circuitry.
Note that in Embodiment 1, types 1 to 4 were described as the types of weighting performed by the weighting unit 13; however, the weighting by the weighting unit 13 is not limited to types 1 to 4.
Although Embodiment 1 of the present invention has been described above, Embodiment 1 may be implemented partially, or two or more of its parts may be combined and implemented. The present invention is not limited to Embodiment 1, and various modifications can be made as necessary.
10 processor, 11 image acquisition unit, 12 motion extraction unit, 13 weighting unit, 14 scattering degree calculation unit, 15 detection unit, 20 main storage device, 30 auxiliary storage device, 40 input interface, 50 output interface, 61, 62 person, 71, 72, 73, 74, 75 frame, 81 video area, 82 person vector, 83 merging area, 84 main line, 90 electronic circuit, 91 signal line, 100 abnormality detection device, 101 signal line, 200 camera.
Claims (7)
- An abnormality detection device comprising:
an image acquisition unit that acquires a plurality of images;
a motion extraction unit that extracts, from a plurality of moving objects included in an image of one frame, a plurality of motion vectors each determined by a magnitude of motion and a direction of motion;
a weighting unit that calculates an occurrence probability for each motion vector of the plurality of motion vectors and weights at least one of the occurrence probabilities;
a scattering degree calculation unit that calculates a scattering degree of the one frame based on the weighted occurrence probabilities; and
a detection unit that compares the scattering degree with a threshold value and detects an abnormality based on the comparison result.
- The abnormality detection device according to claim 1, wherein the weighting unit weights the occurrence probability according to the magnitude of the motion of the motion vector for which the occurrence probability is calculated.
- The abnormality detection device according to claim 1 or 2, wherein the weighting unit weights the occurrence probability according to the direction of the motion of the motion vector for which the occurrence probability is calculated.
- The abnormality detection device according to any one of claims 1 to 3, wherein
the motion extraction unit extracts, from the one frame, a partial area that is a part of the one frame, extracts a plurality of motion vectors from the partial area, and extracts a plurality of motion vectors from an exclusion area of the one frame from which the partial area is removed, and
the weighting unit calculates an occurrence probability for each motion vector of the plurality of motion vectors extracted from the partial area, calculates an occurrence probability for each motion vector of the plurality of motion vectors extracted from the exclusion area, and weights the occurrence probabilities calculated based on the partial area differently from the occurrence probabilities calculated based on the exclusion area.
- The abnormality detection device according to any one of claims 1 to 4, wherein the detection unit detects an abnormality based on the comparison result and a foreground area in at least one frame.
- An abnormality detection program that causes a computer to execute:
a process of acquiring a plurality of images;
a process of extracting, from a plurality of moving objects included in an image of one frame, a plurality of motion vectors each determined by a magnitude of motion and a direction of motion;
a process of calculating an occurrence probability for each motion vector of the plurality of motion vectors and weighting at least one of the occurrence probabilities;
a process of calculating a scattering degree of the one frame based on the weighted occurrence probabilities; and
a process of comparing the scattering degree with a threshold value and detecting an abnormality based on the comparison result.
- An abnormality detection method in which a computer:
acquires a plurality of images;
extracts, from a plurality of moving objects included in an image of one frame, a plurality of motion vectors each determined by a magnitude of motion and a direction of motion;
calculates an occurrence probability for each motion vector of the plurality of motion vectors and weights at least one of the occurrence probabilities;
calculates a scattering degree of the one frame based on the weighted occurrence probabilities; and
compares the scattering degree with a threshold value and detects an abnormality based on the comparison result.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/040347 WO2020090010A1 (en) | 2018-10-30 | 2018-10-30 | Abnormal behavior detection device, abnormal behavior detection program, and abnormal behavior detection method |
JP2020554649A JP6818965B2 (en) | 2018-10-30 | 2018-10-30 | Anomaly detection device, anomaly detection program and anomaly detection method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/040347 WO2020090010A1 (en) | 2018-10-30 | 2018-10-30 | Abnormal behavior detection device, abnormal behavior detection program, and abnormal behavior detection method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020090010A1 true WO2020090010A1 (en) | 2020-05-07 |
Family
ID=70463651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/040347 WO2020090010A1 (en) | 2018-10-30 | 2018-10-30 | Abnormal behavior detection device, abnormal behavior detection program, and abnormal behavior detection method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6818965B2 (en) |
WO (1) | WO2020090010A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210390326A1 (en) * | 2020-04-28 | 2021-12-16 | Pfu Limited | Information processing system, area determination method, and medium |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012022370A (en) * | 2010-07-12 | 2012-02-02 | Hitachi Kokusai Electric Inc | Monitoring system and monitoring method |
JP2013127716A (en) * | 2011-12-19 | 2013-06-27 | Nippon Signal Co Ltd:The | Abnormal state detection system for congestion |
JP2017151875A (en) * | 2016-02-26 | 2017-08-31 | 三菱電機株式会社 | Residence determination device and residence determination program |
JP2018117331A (en) * | 2017-01-19 | 2018-07-26 | 株式会社イデアクエスト | Bed watching device |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210390326A1 (en) * | 2020-04-28 | 2021-12-16 | Pfu Limited | Information processing system, area determination method, and medium |
US11960967B2 (en) * | 2020-04-28 | 2024-04-16 | Pfu Limited | Information processing system, area determination method, and medium |
Also Published As
Publication number | Publication date |
---|---|
JPWO2020090010A1 (en) | 2021-03-18 |
JP6818965B2 (en) | 2021-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6616521B2 (en) | Image processing device | |
CN105354563B (en) | Face datection prior-warning device and implementation method are blocked in conjunction with depth and color image | |
KR101708547B1 (en) | Event detection apparatus and event detection method | |
JP2022506905A (en) | Systems and methods for assessing perceptual systems | |
JP6764481B2 (en) | Monitoring device | |
Kumar et al. | A novel method of edge detection using cellular automata | |
US20170330315A1 (en) | Information processing apparatus, method for processing information, discriminator generating apparatus, method for generating discriminator, and program | |
US10691956B2 (en) | Information processing apparatus, information processing system, information processing method, and storage medium having determination areas corresponding to waiting line | |
US20150146006A1 (en) | Display control apparatus and display control method | |
WO2019119659A1 (en) | Method and equipment for monitoring vortex-induced vibration for wind turbine generator set | |
US20090324016A1 (en) | Moving target detecting apparatus, moving target detecting method, and computer readable storage medium having stored therein a program causing a computer to function as the moving target detecting apparatus | |
US20160259990A1 (en) | Region-of-interest detection apparatus, region-of-interest detection method, and recording medium | |
US20230044673A1 (en) | Detection device and control method of the same | |
TWI493510B (en) | Falling down detection method | |
WO2020090010A1 (en) | Abnormal behavior detection device, abnormal behavior detection program, and abnormal behavior detection method | |
US20220262121A1 (en) | System and method for mitigating crowd panic detection | |
US20200013172A1 (en) | Object tracking device and object tracking method | |
WO2016092783A1 (en) | Information processing apparatus, method for processing information, discriminator generating apparatus, method for generating discriminator, and program | |
JP6113018B2 (en) | Object detection device | |
US10372750B2 (en) | Information processing apparatus, method, program and storage medium | |
KR20160073490A (en) | System for assessment of safety level at construction site based on computer vision | |
CN114549456A (en) | Image anomaly detection method and device and electronic equipment | |
JP7485449B2 (en) | Monitoring device, monitoring method, and program | |
JP2008165705A (en) | Image processor and image processing method | |
JP7351571B2 (en) | Image tracking device, image tracking method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18938847 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020554649 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18938847 Country of ref document: EP Kind code of ref document: A1 |