WO2015166612A1 - Video analysis system, video analysis method, and video analysis program - Google Patents
Video analysis system, video analysis method, and video analysis program
- Publication number
- WO2015166612A1 (PCT/JP2015/000528)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- monitoring target
- identification information
- shooting
- information
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
- G06V40/173—Classification, e.g. identification face re-identification, e.g. recognising unknown faces across different face tracks
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/04—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
Definitions
- the present invention relates to a video analysis system, a video analysis method, and a video analysis program for analyzing a monitoring target in a video.
- Surveillance cameras are installed at stations and other specific facilities, and various judgments are made by analyzing the video they capture. One such judgment is to identify a person or object that remains unnaturally long within the monitored range as a suspicious person or suspicious object.
- a behavior analysis method that tracks a specific person and analyzes that person's behavior is known.
- in this behavior analysis method, one camera or a plurality of cameras with overlapping monitoring areas are used to recognize where a specific person is, and by tracking the change of the person's position over time, it is possible to identify where and for how long the person has stayed.
- Patent Document 1 describes a face image recognition device that speeds up face image recognition processing and simplifies registration work.
- the face image recognition apparatus described in Patent Document 1 registers a front-facing face image of a person to be recognized and average face images in directions other than the front, extracts the face area from the video, and recognizes the face image in the video by comparing its features with the registered face images.
- Patent Document 2 describes a suspicious person detection device that automatically detects a suspicious person from a camera image.
- the apparatus described in Patent Document 2 periodically acquires images with a camera whose shooting range covers all directions around a vehicle, calculates a behavior feature amount of each extracted person every predetermined time, and makes a judgment based on the frequency distribution of the behavior feature amounts.
- Patent Document 3 describes an image processing apparatus that associates a plurality of objects in an image.
- an evaluation assigning unit gives an evaluation to each object detected by an object detecting unit.
- a related-evaluation assigning unit gives, to other objects related by an associating unit to an object that has received an evaluation, an evaluation based on that evaluation.
- the range in which a person's behavior can be analyzed depends on the shooting range of the cameras. For example, when one camera is used, the analyzable range is limited to the range that camera can shoot. When trying to cover a large range, a large number of cameras is needed so that there is no gap in the monitored range. Furthermore, even when a person is captured in a crowded scene, it is often difficult to follow a specific person because the person is frequently hidden behind others.
- the suspicious person detection device described in Patent Document 2 presupposes temporally continuous images within a shooting range covered by the camera. This is because the behavior feature amounts it uses, such as movement speed, variation in movement direction, distance from the vehicle, and variation in head height, are information that can only be obtained from continuous images. Therefore, in a situation where the range to be photographed cannot be fully covered, such as when the range to be monitored is wide, it is difficult to detect a suspicious person using this device.
- an object of the present invention is to provide a video analysis system, a video analysis method, and a video analysis program that can detect suspicious movement from captured video even when the monitoring range is wide or the monitoring target to be discovered is not registered in advance.
- the video analysis system according to the present invention includes: monitoring target shooting information generating means for extracting, from each video, monitoring target identification information (identification information of a monitoring target, used for estimating its identity) and generating monitoring target shooting information including the extracted monitoring target identification information and the shooting time at which the monitoring target was shot; appearance history generating means for generating, from the plurality of generated pieces of monitoring target shooting information, an appearance history of the monitoring targets estimated to be the same; and specifying means for specifying a monitoring target whose appearance history conforms to a prescribed rule.
- the video analysis method according to the present invention extracts, from each video, monitoring target identification information used for estimating the identity of a monitoring target, generates monitoring target shooting information including the extracted monitoring target identification information and the shooting time at which the monitoring target was shot, generates an appearance history of the monitoring targets estimated to be the same from the plurality of generated pieces of monitoring target shooting information, and specifies a monitoring target whose appearance history conforms to a prescribed rule.
- the video analysis program according to the present invention causes a computer to execute: a monitoring target shooting information generation process of extracting, from a video, monitoring target identification information used for estimating the identity of a monitoring target, and generating monitoring target shooting information including the monitoring target identification information and the shooting time at which the monitoring target was shot; an appearance history generation process of generating an appearance history of the monitoring targets estimated to be the same from the plurality of generated pieces of monitoring target shooting information; and a specifying process of specifying a monitoring target whose appearance history conforms to a prescribed rule.
- according to the present invention, even when the monitoring range is wide or the monitoring target to be discovered is not registered in advance, it is possible to detect a monitoring target performing suspicious movement from the captured video.
- the monitoring target is not limited to a person, and may be an object such as an automobile.
- FIG. 1 is a block diagram showing a configuration example of a first embodiment of a video analysis system according to the present invention.
- the video analysis system of the present embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information matching unit 15, an appearance history management unit 21, an appearance history storage unit 22, a rule management unit 23, a rule storage unit 24, a behavior detection unit 25, and a result output unit 17.
- the video analysis system may be connected to an output result storage unit 20 that stores output results.
- the video acquisition unit 11 acquires video in a predetermined monitoring range.
- the video acquisition unit 11 also acquires the time at which the video was acquired (hereinafter referred to as the shooting time).
- the video acquisition unit 11 inputs the video in the monitoring range and the time when the video was shot (that is, the shooting time) to the person identification information analysis unit 12.
- the video acquisition unit 11 is realized by a photographing device such as a camera, for example.
- although the video analysis system illustrated in FIG. 1 includes only one video acquisition unit 11, the number of video acquisition units 11 is not limited to one and may be two or more. The ranges photographed by the respective video acquisition units may overlap, but do not all need to overlap.
- when the video analysis system includes a plurality of video acquisition units 11, information for identifying each video acquisition unit 11 (hereinafter also referred to as shooting device identification information) is input to the person identification information analysis unit 12 together with the acquired video.
- the video acquisition unit 11 is installed so as to be able to photograph a particularly watched area in the facility to be monitored.
- the video acquisition unit 11 does not have to be installed so as to capture the entire range in the facility.
- the video shot in the present embodiment is not limited to video that is temporally continuous. That is, the video used for analysis in the present embodiment may be a set of temporally discontinuous images (video).
- temporally discontinuous video includes not only video obtained by thinning out frames from normally shot video, but also video shot in completely different time zones and places. That is, in the present embodiment, as long as the time and place at which an image was taken can be grasped, neither temporal continuity of the video nor complete coverage of the place is required.
- the video analysis system may include a video storage unit (not shown) that stores the video captured by the video acquisition unit 11 for a certain period.
- the video storage unit is realized by a magnetic disk or the like. By providing such a storage unit, it is possible to use images shot in the past.
- the person identification information analysis unit 12 analyzes the monitoring targets shown in each video captured by each video acquisition unit 11. Specifically, when the person identification information analysis unit 12 detects a monitoring target in a video, it generates information (hereinafter referred to as monitoring target shooting information) including the identification information of the monitoring target (hereinafter referred to as monitoring target identification information) and the shooting time at which the monitoring target was shot.
- the monitoring target identification information is used when estimating the identity of the monitoring target.
- the person identification information analysis unit 12 may generate monitoring target shooting information including a shooting position where the monitoring target is shot. This shooting position may be the position of the camera itself, camera identification information (shooting apparatus identification information), or the camera shooting range.
- the content of the monitoring target identification information extracted by the person identification information analysis unit 12 is predetermined according to the content of the monitoring target. For example, when the monitoring target is a person, the person identification information analysis unit 12 may extract the face image of the person as the monitoring target identification information when detecting the person from the video. For example, when the monitoring target is an automobile, the person identification information analysis unit 12 may extract an image of a license plate of the automobile as monitoring target identification information when detecting the automobile from the video.
- the information extracted by the person identification information analysis unit 12 is not limited to a face image or a license plate image.
- the information to be extracted is arbitrary as long as it can identify the monitoring target.
- the monitoring target identification information used for identifying a person can be referred to as person identification information. Since a method for extracting specific identification information from a target image is widely known, detailed description thereof is omitted here.
- when a video is input from each video acquisition unit 11, the person identification information analysis unit 12 may be provided with the shooting device identification information of the video acquisition unit 11 that captured the video.
- the appearance history management unit 21 generates the appearance history of the monitoring targets estimated to be the same from the monitoring target shooting information extracted by the person identification information analysis unit 12. Specifically, the appearance history management unit 21 generates, as an appearance history, the history of the monitoring targets estimated to be the same among the targets captured by the video acquisition units 11, and stores it in the appearance history storage unit 22. The process of estimating the identity of monitoring targets is performed by the person identification information matching unit 15 described later.
- the appearance history storage unit 22 stores the appearance history of the monitoring target. Specifically, the appearance history storage unit 22 stores an appearance history in which each piece of information including the shooting time is associated with each monitoring target. The shooting time can also be referred to as the appearance time of the monitoring target.
- FIG. 2 is an explanatory diagram illustrating an example of information stored in the appearance history storage unit 22 of the present embodiment.
- the appearance history storage unit 22 stores, for each person ID identifying a monitoring target, a person image ID that identifies a piece of monitoring target shooting information, person identification information indicating the monitoring target identification information, the shooting time, and the name of the camera that captured the image (for example, shooting device identification information) in association with each other.
- the appearance history storage unit 22 may store an appearance history including a shooting position.
- since the camera name and shooting device identification information set as the shooting position indicate the location where the monitoring target appeared, they can also be referred to as the appearance location.
- the appearance history storage unit 22 may also store, in association with the person ID, the likelihood (identification accuracy) of the monitoring target estimated from the acquired video, and a representative image that is representatively used for comparison among the person identification information of the monitoring targets estimated to be the same. It is preferable to select, as the representative image, the one with the highest accuracy among the person identification information indicating the same monitoring target.
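The stored records described above could be modeled as follows. This is a minimal illustrative sketch in Python; the class and field names (`ShootingRecord`, `AppearanceHistory`, `accuracy`, and so on) are assumptions for illustration, not names taken from the embodiment.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ShootingRecord:
    """One row of monitoring target shooting information (cf. FIG. 2)."""
    person_image_id: str        # identifier of this shooting record
    identification_info: bytes  # e.g. an extracted face image
    shot_at: datetime           # shooting time (appearance time)
    camera_name: str            # shooting device identification info
    accuracy: float = 1.0       # identification accuracy (likelihood)

@dataclass
class AppearanceHistory:
    """All records estimated to belong to one monitoring target."""
    person_id: str
    records: List[ShootingRecord] = field(default_factory=list)

    @property
    def representative(self) -> Optional[ShootingRecord]:
        # prefer the record with the highest identification accuracy,
        # as suggested for the representative image above
        return max(self.records, key=lambda r: r.accuracy, default=None)
```

One design point this sketch reflects: keeping the accuracy on each record makes the representative image a derived value rather than separately maintained state.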
- the person identification information collation unit 15 collates the monitoring target identification information and identifies the monitoring target estimated to be the same. For example, when the monitoring target face image is extracted as the monitoring target identification information, the person identification information collating unit 15 may collate the face image and specify the person estimated to be the same.
- the person identification information collation unit 15 collates the monitoring target identification information extracted by the person identification information analysis unit 12 with the monitoring target identification information stored in the appearance history storage unit 22, and performs the same monitoring. It is determined whether the target has already been registered in the appearance history storage unit 22.
- when it is determined that the same monitoring target is already registered, the appearance history management unit 21 adds the monitoring target identification information extracted by the person identification information analysis unit 12 to the appearance history of that monitoring target. On the other hand, when it is determined that the same monitoring target is not registered in the appearance history storage unit 22, the appearance history management unit 21 creates an appearance history for a new monitoring target.
- the person identification information analysis unit 12 extracts a face image as the monitoring target identification information (person identification information).
- the person identification information matching unit 15 collates the generated person identification information with the person identification information stored in the appearance history storage unit 22, and determines whether the same person already exists.
- the person identification information matching unit 15 compares the face images included in the generated monitoring target imaging information and estimates whether the monitoring targets are the same.
- when the same person exists, the appearance history management unit 21 adds the appearance history using the person ID of that person.
- otherwise, the appearance history management unit 21 generates a new person ID and adds the appearance history.
- the person identification information matching unit 15 may calculate the probability (identification accuracy) of the monitoring target estimated to be the same when the monitoring target identification information is verified. Then, the appearance history management unit 21 may register the appearance history including the certainty in the appearance history storage unit 22.
- when collating, the person identification information matching unit 15 may collate only against the representative image illustrated in FIG. 2 among the monitoring target identification information included in the appearance history of each person. By determining such a representative image, the number of collations can be reduced and processing can be sped up while improving the collation accuracy with new monitoring target shooting information.
- the rule storage unit 24 stores a rule that defines a behavior pattern of a monitoring target to be extracted. When it is desired to extract a monitoring target that is performing suspicious movement as in the present embodiment, the rule storage unit 24 stores a suspicious movement pattern specified from the appearance history as a rule.
- FIG. 3 is an explanatory diagram illustrating an example of the rules stored in the rule storage unit 24.
- the rule storage unit 24 stores a rule representing a behavior pattern of the monitoring target to be extracted, in association with a rule name for identifying the rule.
- a monitoring target that moves in a manner deviating from that of a normal user of the facility can be determined to be a monitoring target moving suspiciously. Therefore, rules such as those exemplified below can be defined.
- the first rule is that the target "appears in a specified number or more of different areas within a certain time". For example, assume a stadium having four areas: north, south, east, and west spectator seats. A normal spectator usually appears in the one area where his or her seat is. A spectator may occasionally appear in two areas by moving to an adjacent area, but movement that appears in three areas is suspicious.
- the rule storage unit 24 therefore defines a rule such as "has appearance histories at a specified number or more of different shooting positions within a certain time", so that a monitoring target conforming to the first rule can be specified.
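The first rule can be sketched as a predicate over an appearance history. This is an illustrative implementation, assuming the history is a list of (shooting time, shooting position) pairs; the function name and default parameters are not from the embodiment.

```python
from datetime import datetime, timedelta
from typing import List, Tuple

def matches_rule1(history: List[Tuple[datetime, str]],
                  min_areas: int = 3,
                  window: timedelta = timedelta(hours=1)) -> bool:
    """True if the target appears at `min_areas` or more distinct
    shooting positions within some window of length `window`."""
    events = sorted(history)
    for i, (start, _) in enumerate(events):
        # distinct positions seen from this event until the window closes
        areas = {pos for t, pos in events[i:] if t - start <= window}
        if len(areas) >= min_areas:
            return True
    return False
```

In the stadium example, a spectator seen in the north, south, and east seating areas within the window would conform to the rule, while one seen in at most two areas would not.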
- the second rule is that the target "appears a specified number of times or more at two or more specific places within a certain time, with different places in between". For example, assume there are a square A and a square B located some distance apart. A person whose appearance history is observed as A, A can be judged to be simply walking around square A. A person whose appearance history is observed as A, A, B, B can be judged to have moved from square A to square B. On the other hand, a person whose appearance history is observed as A, B, A, B, A, B can be judged to be wandering unnaturally between square A and square B, which is suspicious movement.
- the rule storage unit 24 specifies a monitoring target conforming to the second rule by defining, for example, a rule such as "has an appearance history in which two or more different shooting positions alternate within a certain time".
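The second rule can likewise be sketched as a predicate. For brevity this illustration ignores the "within a certain time" condition and counts only position switches in a time-ordered list of shooting positions; the name `matches_rule2` and the switch threshold are assumptions.

```python
from typing import List

def matches_rule2(positions: List[str], min_switches: int = 4) -> bool:
    """True if the shooting position switches back and forth between
    exactly two places at least `min_switches` times."""
    switches = sum(1 for a, b in zip(positions, positions[1:]) if a != b)
    return len(set(positions)) == 2 and switches >= min_switches
```

With this sketch, the history A, B, A, B, A, B from the example conforms (five switches between two places), while A, A, B, B (a single move from A to B) does not.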
- the third rule is that "cases of appearing at a designated place but not appearing at a corresponding other place within a certain time are observed a specified number of times or more". For example, after passing through the entrance of a store, a person usually appears at the cash register, the place necessary for achieving the original purpose, within about one hour. Similarly, after passing through the entrance of a bank, a person usually appears at a teller window within about one hour. Some people enter a facility without using it, but repeatedly entering a facility, for example in the morning and at night or on different days, without ever appearing at the place necessary for achieving its original purpose, is suspicious movement.
- the rule storage unit 24 specifies a monitoring target conforming to the third rule by defining, for example, a rule such as "has no appearance history indicating, within a certain period after an appearance history indicating a predetermined shooting position, the other shooting position expected to appear based on that shooting position".
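A sketch of the third rule follows, using the store example: a trigger place (the entrance) and an expected follow-up place (the cash register). The function name, the one-hour window, and the miss count are illustrative assumptions.

```python
from datetime import datetime, timedelta
from typing import List, Tuple

def matches_rule3(history: List[Tuple[datetime, str]],
                  trigger: str, expected: str,
                  window: timedelta = timedelta(hours=1),
                  min_count: int = 2) -> bool:
    """True if the target was seen at `trigger` (e.g. a store entrance)
    at least `min_count` times without appearing at `expected`
    (e.g. the cash register) within `window` afterwards."""
    events = sorted(history)
    misses = 0
    for t, pos in events:
        if pos != trigger:
            continue
        followed = any(pos2 == expected and t < t2 <= t + window
                       for t2, pos2 in events)
        if not followed:
            misses += 1
    return misses >= min_count
```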
- the fourth rule is that the target "appears repeatedly in a time zone in which a normal person does not appear repeatedly". For example, at a station meeting spot where many people gather, a person who appears many times can be assumed to be simply looking for a companion or killing time, so such movement is hard to call suspicious. However, movement that appears repeatedly during commuting hours, early morning, or late at night can be considered suspicious.
- the rule storage unit 24 specifies a monitoring target conforming to the fourth rule by defining a rule based on the time zones of the appearance history.
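The fourth rule can be sketched by counting appearances whose shooting time falls in hours when ordinary visitors are not expected. The default hour set (midnight to 4 a.m.) and the count threshold are assumptions chosen for illustration.

```python
from datetime import datetime
from typing import Iterable

def matches_rule4(appearance_times: Iterable[datetime],
                  unusual_hours=frozenset({0, 1, 2, 3, 4}),
                  min_count: int = 3) -> bool:
    """True if the target appears `min_count` or more times during
    hours in `unusual_hours` (late night / early morning here)."""
    hits = sum(1 for t in appearance_times if t.hour in unusual_hours)
    return hits >= min_count
```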
- as described above, the rule storage unit 24 stores rules defined by the relationship between two or more appearance histories including a shooting position and a shooting time, and the behavior detection unit 25 specifies a monitoring target whose generated appearance history conforms to such a rule.
- the defined rules are not limited to the four types of rules described above.
- the rule management unit 23 manages the rules stored in the rule storage unit 24. For example, in response to an access request to the rule storage unit 24, the rule management unit 23 returns the contents of the rule storage unit 24, adds a rule, and updates the rule.
- the behavior detection unit 25 identifies a monitoring target whose appearance history pattern conforms to a prescribed rule. Specifically, the behavior detection unit 25 compares the patterns specified from the appearance histories stored in the appearance history storage unit 22 with the rules stored in the rule storage unit 24, and identifies a monitoring target having an appearance history whose pattern conforms to a rule.
- the behavior detection unit 25 may operate when a new appearance history is registered in the appearance history storage unit 22, or may operate periodically like a batch process.
- the result output unit 17 outputs the monitoring target specified by the behavior detection unit 25.
- the result output unit 17 may store the output result in the output result storage unit 20.
- FIG. 4 is an explanatory diagram illustrating an example of information stored in the output result storage unit 20. As illustrated in FIG. 4, the result output unit 17 may output a person image ID that identifies a person image, a shooting time when the person is shot, and a camera name. The result output unit 17 may output only the person image ID.
- the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the rule management unit 23, the behavior detection unit 25, and the result output unit 17 are realized, for example, by a CPU of a computer operating according to a program (video analysis program).
- the program is stored in a storage device (not shown) included in the video analysis system; the CPU reads the program and, according to the program, operates as the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the rule management unit 23, the behavior detection unit 25, and the result output unit 17.
- the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the rule management unit 23, the behavior detection unit 25, and the result output unit 17 may each be realized by dedicated hardware. The appearance history storage unit 22 and the rule storage unit 24 are realized by, for example, a magnetic disk.
- FIG. 5 is an explanatory diagram illustrating an operation example of the video analysis system of the present embodiment.
- FIG. 5 shows how the persons u1 to u10 to be monitored are identified along the time axis from the videos captured by the two cameras A and B corresponding to the video acquisition units 11.
- a photographed person usually appears in a plurality of continuous frames S. Therefore, appearances of a person identified continuously from frames within a predetermined period are counted as one appearance.
- the person u2, the person u6, the person u7, and the person u9 are estimated to be the same person. If this person is denoted P1, four appearance histories of the person P1 are created. Each appearance history is represented by a symbol combining the symbol of the camera that captured the person and the time; in the example shown in FIG. 5, the appearance history of the person P1 is A02, B03, B08, A10.
- it is also estimated that the person u2 and the person u6 are the same person. If this person is P2, two appearance histories of the person P2 (A03, A05) are created. In addition, it is estimated that the person u5 and the person u8 are the same person; if this person is P3, two appearance histories of the person P3 (B05, A06) are also created.
- the behavior detection unit 25 identifies the person P1 as a monitoring target that conforms to the rule, and the result output unit 17 outputs various types of information necessary to identify the monitoring target.
- FIG. 6 is a flowchart showing an operation example of the video analysis system of the present embodiment.
- the video acquisition unit 11 acquires a video in the shooting range (step S11), and inputs the shot video to the person identification information analysis unit 12.
- the person identification information analysis unit 12 detects the monitoring target from the input video
- the person identification information analysis unit 12 extracts the identification information of the monitoring target (step S12) and also acquires the shooting time.
- the person identification information analysis unit 12 generates monitoring target shooting information including the monitoring target identification information and the shooting time (step S13).
- the person identification information collation unit 15 collates the extracted monitoring target identification information with the monitoring target identification information stored in the appearance history storage unit 22 (step S14).
- when the same monitoring target is already registered in the appearance history storage unit 22, the appearance history management unit 21 adds to the appearance history of that monitoring target (step S16).
- otherwise, the appearance history management unit 21 registers a new monitoring target and adds the appearance history to it (step S17).
- the behavior detection unit 25 identifies a monitoring target whose appearance history matches the prescribed rule (step S18).
- the result output unit 17 outputs the specified monitoring target (step S19).
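The flow of steps S13 through S18 can be condensed into a small in-memory sketch. Note the simplifying assumption: collation (step S14) is reduced here to exact match on a precomputed identification key, whereas the real system compares face features against stored histories with a similarity threshold; the function names are illustrative.

```python
from typing import Callable, Dict, List, Tuple

# person_id -> appearance history as a list of (time, camera) pairs
histories: Dict[str, List[Tuple[int, str]]] = {}

def process_detection(identification_key: str, shot_at: int,
                      camera: str) -> None:
    """Steps S13-S17: collate the detection against known targets and
    update the appearance history, creating a new target on no match."""
    histories.setdefault(identification_key, []).append((shot_at, camera))

def detect(rule: Callable[[List[Tuple[int, str]]], bool]) -> List[str]:
    """Step S18: return the targets whose history conforms to `rule`,
    where `rule` is any predicate over an appearance history."""
    return [pid for pid, hist in histories.items() if rule(hist)]
```

Passing one of the rule predicates described earlier as `rule` yields the monitoring targets that the result output unit would then output (step S19).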
- as described above, in the present embodiment, the person identification information analysis unit 12 generates, from each video shot by the video acquisition units 11, monitoring target shooting information including the monitoring target identification information of the monitoring target and the shooting time at which the monitoring target was shot, and the appearance history management unit 21 generates the appearance history of the monitoring targets estimated to be the same from the plurality of generated pieces of monitoring target shooting information. The behavior detection unit 25 then identifies a monitoring target whose appearance history conforms to a prescribed rule. Therefore, even when the monitoring range is wide or the monitoring target to be discovered is not registered in advance, a monitoring target performing suspicious movement can be detected from the captured video.
- a feature of the present invention is that identification results are expressed as a history based on time and place, and suspiciousness can be determined from the history pattern alone.
- Embodiment 2. Next, a second embodiment of the video analysis system according to the present invention will be described.
- in the first embodiment, the appearance history management unit 21 generates the appearance history of the monitoring targets estimated to be the same each time monitoring target shooting information is generated. The video analysis system of the present embodiment instead generates appearance histories collectively.
- FIG. 7 is a block diagram showing a configuration example of the second embodiment of the video analysis system according to the present invention.
- components that are the same as those in FIG. 1 are denoted by the same reference symbols, and descriptions thereof are omitted.
- the video analysis system of the present embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information management unit 13, a person identification information matching unit 15, a person identification information storage unit 18, and an appearance history management unit 21, together with the other components of the first embodiment.
- the video analysis system of the present embodiment further includes a person identification information management unit 13 and a person identification information storage unit 18 as compared with the configuration of the video analysis system of the first embodiment.
- the configuration other than the person identification information management unit 13 and the person identification information storage unit 18 is the same as that of the first embodiment.
- the person identification information management unit 13 stores the monitoring target identification information extracted by the person identification information analysis unit 12 in the person identification information storage unit 18. Further, the person identification information management unit 13 extracts and returns necessary information from the person identification information storage unit 18 in response to a request from the appearance history management unit 21 and the person identification information matching unit 15.
- the person identification information storage unit 18 stores the monitoring target photographing information extracted by the person identification information analysis unit 12. Specifically, the person identification information storage unit 18 stores an identifier for identifying individual monitoring target shooting information, monitoring target identification information included in the monitoring target shooting information, and the shooting time in association with each other.
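As an illustration, one record of the stored association might be represented as follows in Python; the field names are assumptions for this sketch, not terms from the specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ShootingRecord:
    person_image_id: str               # identifier of this piece of monitoring target shooting information
    identification_info: bytes         # monitoring target identification information, e.g. a face-image feature
    shooting_time: str                 # shooting time of the source video
    camera_name: Optional[str] = None  # optional shooting device identification information
```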
- FIG. 8 is an explanatory diagram illustrating an example of information stored in the person identification information storage unit 18.
- the example shown in FIG. 8 is an example of the contents stored in the person identification information storage unit 18 when the monitoring target is a person.
- in this example, the person image ID is used as the identifier for identifying the monitoring target shooting information, and the person identification information is used as the monitoring target identification information. In association with these, the person identification information storage unit 18 holds the shooting time at which the video from which the person identification information was extracted was shot.
- the person identification information storage unit 18 may store the name of the camera that acquired the video (for example, photographing apparatus identification information).
- the appearance history management unit 21 generates the appearance history of the monitoring target estimated to be the same from the monitoring target shooting information stored in the person identification information storage unit 18. Further, the process of estimating the identity of the monitoring target is performed by the person identification information matching unit 15. The method for estimating the identity of the monitoring target and the contents of the process for generating the appearance history are the same as those in the first embodiment.
- the person identification information analysis unit 12 creates monitoring target identification information and stores it in the person identification information storage unit 18. Then, the appearance history management unit 21 generates an appearance history as necessary. That is, in the video analysis system according to the present embodiment, the generation timing of the monitoring target identification information and the generation timing of the appearance history can be shifted. Therefore, a rule for specifying the monitoring target can be added later.
- the person identification information analysis unit 12, the person identification information management unit 13, the person identification information matching unit 15, the appearance history management unit 21, the rule management unit 23, the behavior detection unit 25, and the result output unit 17 are realized by, for example, a CPU of a computer that operates according to a program (video analysis program).
- the person identification information analysis unit 12, the person identification information management unit 13, the person identification information matching unit 15, the appearance history management unit 21, the rule management unit 23, the behavior detection unit 25, and the result output unit 17 may be realized by dedicated hardware.
- the person identification information storage unit 18, the appearance history storage unit 22, and the rule storage unit 24 are realized by, for example, a magnetic disk.
- FIG. 9 is a flowchart showing an operation example of the video analysis system of the present embodiment.
- the processing from step S11 to step S13 until the monitoring target identification information is generated from the video is the same as the processing illustrated in FIG.
- the person identification information management unit 13 stores the monitoring target identification information extracted by the person identification information analysis unit 12 in the person identification information storage unit 18 (step S21). Then, at a predetermined timing, the person identification information matching unit 15 collates the monitoring target identification information stored in the person identification information storage unit 18 with the monitoring target identification information stored in the appearance history storage unit 22 (step S14). When the appearance history storage unit 22 holds only person image IDs rather than the monitoring target identification information itself, the person identification information matching unit 15 may identify the corresponding monitoring target identification information from the person identification information storage unit 18.
- the processing from step S15 to step S19, from the determination of identical monitoring targets through the generation of the appearance history to the extraction of monitoring targets conforming to the rule, is the same as the processing illustrated in FIG.
- as described above, in the present embodiment, the person identification information management unit 13 temporarily stores the monitoring target identification information extracted by the person identification information analysis unit 12 in the person identification information storage unit 18. Since the generation timing of the monitoring target identification information and the generation timing of the appearance history can therefore be shifted, a rule for specifying the monitoring target can be added later.
- Embodiment 3. Next, a third embodiment of the video analysis system according to the present invention will be described.
- the video analysis system of the present embodiment uses an event occurrence time and occurrence location in a rule.
- the rule defined in the present embodiment is a rule for extracting monitoring targets having a predetermined number or more of appearance histories whose shooting time is close to the event occurrence date and time and whose shooting position is close to the event occurrence location.
- here, a shooting time close to the event occurrence date and time means that the difference between the event occurrence time and the shooting time is equal to or less than a predetermined threshold, and a shooting position close to the event occurrence location means that the distance between the event occurrence location and the shooting position is equal to or less than a predetermined threshold.
- the shooting position includes a camera installation location and a camera shooting range.
- instead of the physical distance, the time actually required to move between the respective shooting positions may be used as the distance. For example, even if two shooting positions are physically close to each other, the distance between them may be determined to be long if a detour is required to move from one to the other, making the travel time long. Conversely, even if two shooting positions are physically distant, the distance between them may be determined to be short if means other than walking (for example, bicycles or so-called moving walkways) are available for moving between them, making the travel time short.
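The two closeness tests above, a time threshold and a distance measured as travel time between positions, might be sketched as follows. The threshold values and the travel-time table are illustrative assumptions, not values from the specification.

```python
def is_near_event(shooting_time, shooting_pos, event_time, event_pos,
                  travel_minutes, time_threshold=30, travel_threshold=10):
    """Judge whether an appearance is 'close' to an event.

    Time closeness: |shooting time - event time| <= time_threshold (minutes).
    Position closeness: travel time between the positions <= travel_threshold,
    using a travel-time table in place of physical distance.
    """
    if abs(shooting_time - event_time) > time_threshold:
        return False
    if shooting_pos == event_pos:
        return True
    key = tuple(sorted((shooting_pos, event_pos)))
    # unknown pairs are treated as unreachable, i.e. far apart
    return travel_minutes.get(key, float("inf")) <= travel_threshold
```

Because the table stores travel time rather than straight-line distance, a physically close pair of positions that requires a detour can still be judged far apart, as described above.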
- FIG. 10 is a block diagram showing a configuration example of the third embodiment of the video analysis system according to the present invention.
- components identical to those in FIG. 1 are given the same reference numerals, and their descriptions are omitted.
- the video analysis system of the present embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information matching unit 15, an appearance history management unit 21, an appearance history storage unit 22, an event management unit 28, an event storage unit 29, a behavior detection unit 25, and a result output unit 17.
- that is, compared with the configuration of the video analysis system of the first embodiment, the video analysis system of the present embodiment includes an event management unit 28 instead of the rule management unit 23 and an event storage unit 29 instead of the rule storage unit 24.
- the event storage unit 29 stores event information used by the behavior detection unit 25 to specify a monitoring target.
- the event information includes the event occurrence time and location.
- FIG. 11 is an example of information stored in the event storage unit 29. In the example illustrated in FIG. 11, the event storage unit 29 stores, as event information, the date and time when an event occurred and a location in association with each other.
- the event information may include information for identifying the content of the event.
- event information is preset in the event storage unit 29. In the event storage unit 29, at least two pieces of event information are set for each type of event to be analyzed.
- in the present embodiment, the video analysis system (specifically, the behavior detection unit 25) extracts monitoring targets having a predetermined number or more of appearance histories whose shooting time is close to the event occurrence date and time and whose shooting position is close to the event occurrence location. Therefore, the contents of the events stored in the event storage unit 29 can be said to represent the contents of the rule stored in the rule storage unit 24. That is, the rule storage unit 24 of the first embodiment can be regarded as holding a list of events as a kind of rule.
- for each identical or similar event, the behavior detection unit 25 may count the appearance histories having a shooting time close to the event occurrence date and time and a shooting position close to the event occurrence location.
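A hedged sketch of this extraction: for each monitoring target, count the events to which some appearance in its history is close, and keep the targets whose count reaches a predetermined number. The `is_near` parameter stands in for the closeness test on time and position; all names are illustrative assumptions.

```python
def targets_near_events(histories, events, is_near, min_matches=2):
    """Extract monitoring targets whose appearance history is close to a
    predetermined number (min_matches) or more of the stored events.

    histories: {target_id: [(shooting_time, shooting_pos), ...]}
    events: [(event_time, event_pos), ...]
    """
    extracted = []
    for target_id, appearances in histories.items():
        hits = sum(
            1 for event_time, event_pos in events
            if any(is_near(t, p, event_time, event_pos) for t, p in appearances)
        )
        if hits >= min_matches:
            extracted.append(target_id)
    return extracted
```

Setting `min_matches` to at least 2 mirrors the specification's requirement that at least two pieces of event information be set for each type of event.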
- the event management unit 28 extracts necessary information from the event storage unit 29 and returns it in response to a request from the behavior detection unit 25.
- the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the event management unit 28, the behavior detection unit 25, and the result output unit 17 are realized by, for example, a CPU of a computer that operates according to a program (video analysis program).
- the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the event management unit 28, the behavior detection unit 25, and the result output unit 17 may each be realized by dedicated hardware. Further, the appearance history storage unit 22 and the event storage unit 29 are realized by, for example, a magnetic disk.
- as described above, in the present embodiment, the behavior detection unit 25 specifies monitoring targets that appear near an event that has occurred.
- specifically, the behavior detection unit 25 identifies monitoring targets having a predetermined number or more of appearance histories whose shooting time is close to the event occurrence date and time and whose shooting position is close to the event occurrence location.
- therefore, a monitoring target can be specified in accordance with the incident that has occurred.
- Embodiment 4. Next, a fourth embodiment of the video analysis system according to the present invention will be described. Similarly to the third embodiment, the video analysis system according to the present embodiment also uses the event occurrence time and occurrence location in the rule. However, whereas in the third embodiment event information is set in the event storage unit 29 in advance, in the present embodiment event information is stored in the event storage unit 29 as appropriate.
- FIG. 12 is a block diagram showing a configuration example of the fourth embodiment of the video analysis system according to the present invention.
- components identical to those in FIG. 10 are given the same reference numerals, and their descriptions are omitted.
- the video analysis system of the present embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information matching unit 15, an appearance history management unit 21, an appearance history storage unit 22, an event management unit 28, an event storage unit 29, a behavior detection unit 25, a result output unit 17, a video storage unit 30, and an event detection unit 31.
- the video analysis system of the present embodiment further includes a video storage unit 30 and an event detection unit 31 as compared with the configuration of the video analysis system of the third embodiment.
- the video storage unit 30 stores the video taken by the video acquisition unit 11.
- the event detection unit 31 detects an event used by the behavior detection unit 25 to specify a monitoring target.
- the type of event detected by the event detection unit 31 is arbitrary as long as it is an event that can be used to identify a monitoring target that is performing a suspicious movement.
- the event detection unit 31 itself may detect the occurrence of the event, or may detect the occurrence of the event by receiving a notification that the event has occurred from another device (not shown).
- the event detection unit 31 may include a sensor (not shown) and detect the occurrence of an event from what the sensor detects.
- the event detection unit 31 may receive the location and time when the suspicious object leaving detection is detected as event information from an apparatus (not shown) that has analyzed the video. Then, the event detection unit 31 may instruct the event management unit 28 to register the received event information in the event storage unit 29.
- the event detection unit 31 may instruct the event management unit 28 to register event information indicating the location and time at which the abnormal sound was detected in the event storage unit 29.
- the event detection unit 31 may receive, as event information, a notification from a smoke detector that smoke has been detected. Then, the event detection unit 31 may instruct the event management unit 28 to register the time at which the event information was received and the installation location of the smoke detector in the event storage unit 29 as event information indicating smoking.
- the event detection unit 31 may also detect the occurrence of an event from information posted on the Internet, in addition to using a sensor that detects events. For example, the event detection unit 31 may analyze sentences posted on a website, specify event information such as the occurrence time and location of the event and its content, and instruct the event management unit 28 to register the specified event information in the event storage unit 29. In this case, in order to increase the accuracy of the detected event, the event detection unit 31 may register the event information only when a plurality of pieces of event information having the same content can be specified.
- the event that the event detection unit 31 instructs the event management unit 28 to register is not limited to the event that occurred at that time.
- the event detection unit 31 may receive manually input event information after the event has occurred (after the fact) and instruct the event management unit 28 to register the event information in the event storage unit 29.
- for example, the event detection unit 31 may receive manually input occurrence times and locations of such incidents and instruct the event management unit 28 to register the input event information in the event storage unit 29. If such event information is registered, a suspicious person who appears repeatedly can be found by analyzing surveillance camera video shot in the past.
- similarly, the event detection unit 31 may receive, before an event occurs (in advance), event information on an event that is expected to occur, and instruct the event management unit 28 to register the event information in the event storage unit 29.
- the event detection unit 31 may input the predicted information as event information and instruct the event management unit 28 to register the input event information in the event storage unit 29. If such event information is registered, a person who appears multiple times in an expected crime-prone district and time zone can be found.
- in the present embodiment, the person identification information analysis unit 12 analyzes video captured during a certain period before and after the event occurrence time detected by the event detection unit 31, and extracts monitoring target shooting information (monitoring target identification information).
- specifically, the person identification information analysis unit 12 extracts from the video storage unit 30 the video taken during a certain period before the time at which the event was detected, analyzes that video, and extracts monitoring target shooting information (monitoring target identification information). Furthermore, the person identification information analysis unit 12 receives from the video acquisition unit 11 the video taken during a certain period after the time at which the event was detected, analyzes that video, and extracts monitoring target shooting information (monitoring target identification information).
- alternatively, the person identification information analysis unit 12 may extract from the video storage unit 30 the videos taken during a certain period before and after the time at which the event occurred, analyze them, and extract monitoring target shooting information (monitoring target identification information).
- likewise, the person identification information analysis unit 12 may receive the video taken during a certain period after the time at which the event is expected to occur, analyze it, and extract monitoring target shooting information (monitoring target identification information).
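Limiting analysis to a fixed window around the detected event time might be sketched as follows; the window lengths are illustrative assumptions.

```python
def frames_to_analyze(stored_frames, event_time, before=300, after=300):
    """Select only the frames shot within a fixed period (in seconds)
    before and after the event time, limiting how much video is analyzed."""
    return [f for f in stored_frames
            if event_time - before <= f["time"] <= event_time + after]
```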
- the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the event management unit 28, the behavior detection unit 25, the result output unit 17, and the event detection unit 31 are realized by, for example, a CPU of a computer that operates according to a program (video analysis program).
- the person identification information analysis unit 12, the person identification information matching unit 15, the appearance history management unit 21, the event management unit 28, the behavior detection unit 25, the result output unit 17, and the event detection unit 31 may each be realized by dedicated hardware.
- the appearance history storage unit 22, the event storage unit 29, and the video storage unit 30 are realized by, for example, a magnetic disk.
- FIG. 13 is a flowchart showing an operation example of the video analysis system of the present embodiment.
- the video acquisition unit 11 acquires a video in the shooting range (step S11), and stores the shot video in the video storage unit 30 (step S31).
- when the event detection unit 31 does not detect an event (No in step S32), the processes from step S11 onward are repeated.
- when the event detection unit 31 detects an event, the person identification information analysis unit 12 extracts the identification information of the monitoring target from the video taken during a certain period before and after the time at which the event was detected (step S12).
- the subsequent processing is the same as the processing from step S13 to step S19 illustrated in FIG.
- as described above, in the present embodiment, the person identification information analysis unit 12 extracts the monitoring target identification information using only the video from a certain period before and after the time at which the event was detected. Therefore, the amount of video analysis processing can be suppressed.
- in addition, the video analysis system can detect events using a sensor or the like even in places where it is difficult to capture video by installing a camera, and the monitoring target can then be specified using the captured video.
- Embodiment 5. Next, a fifth embodiment of the video analysis system according to the present invention will be described.
- in the present embodiment, not only the monitoring target conforming to the rule but also monitoring targets assumed to be related to that monitoring target are specified.
- FIG. 14 is a block diagram showing a configuration example of the fifth embodiment of the video analysis system according to the present invention.
- components identical to those in FIG. 1 are given the same reference numerals, and their descriptions are omitted.
- the video analysis system of the present embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information matching unit 15, an appearance history management unit 21, an appearance history storage unit 22, a rule management unit 23, a rule storage unit 24, a behavior detection unit 25, a result output unit 17, an appearance history comparison unit 32, a person relation management unit 33, and a person relation storage unit 34.
- the video analysis system may be connected to an output result storage unit 20 that stores output results.
- that is, compared with the configuration of the video analysis system of the first embodiment, the video analysis system of the present embodiment further includes an appearance history comparison unit 32, a person relation management unit 33, and a person relation storage unit 34.
- the appearance history comparison unit 32 identifies related monitoring targets based on the similarity of the appearance histories of different monitoring targets. Specifically, the appearance history comparison unit 32 compares the appearance histories of different monitoring targets and counts, between the monitoring targets, the number of appearance histories whose appearance location (shooting position and shooting device identification information) and appearance time (shooting time) are close to each other.
- appearance locations being close means that the positions at which the monitoring targets appear are close, for example because they were shot by the same camera or by nearby cameras.
- when determining whether positions are close, the travel time described in the first embodiment may be taken into consideration.
- appearance times being close means that the shooting times are close; for example, a period of several tens of seconds is set as the period within which times are determined to be close.
- when the counted number of close appearance histories is equal to or greater than a threshold, the appearance history comparison unit 32 determines that there is a relationship between the monitoring targets. This threshold is set to at least 2, for example.
- the appearance history comparison unit 32 executes the above-described process at an arbitrary timing before the behavior detection unit 25 compares the appearance history with the rule.
- a situation in which a plurality of monitoring targets can be determined to be related is, for example, a situation in which a group is roaming together. Even if several people appear almost simultaneously at one place in a public facility, they can be judged to be merely passing by. If several people appear almost simultaneously at two separate places, they may be companions, but it may still be coincidence. However, if several people appear almost simultaneously at three or more separate places, they are likely to be companions, and it can be determined that they are related.
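The reasoning above, that targets appearing almost simultaneously at several distinct places are likely related, can be sketched as follows. The time window is an illustrative assumption; the place threshold follows the specification's minimum of 2.

```python
from itertools import combinations

def related_targets(histories, time_window=60, min_shared_places=2):
    """Judge two monitoring targets as related when they appear at nearly
    the same time (within time_window seconds) at min_shared_places or more
    distinct places.

    histories: {target_id: [(shooting_time, shooting_pos), ...]}
    """
    relations = []
    for (id_a, hist_a), (id_b, hist_b) in combinations(histories.items(), 2):
        shared_places = {
            pos_a
            for t_a, pos_a in hist_a
            for t_b, pos_b in hist_b
            if pos_a == pos_b and abs(t_a - t_b) <= time_window
        }
        if len(shared_places) >= min_shared_places:
            relations.append((id_a, id_b))
    return relations
```

Raising `min_shared_places` to 3 matches the stricter judgment described above, where three or more shared places make companionship likely rather than coincidental.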
- the person relation management unit 33 registers a list of monitoring targets determined to be related by the appearance history comparison unit 32 in the person relation storage unit 34.
- the person-related storage unit 34 stores a list of related monitoring targets.
- FIG. 15 is an explanatory diagram illustrating an example of information stored in the person-related storage unit 34.
- the person-related storage unit 34 stores one or more person IDs identified as related persons for each person ID in association with each other.
- the behavior detection unit 25 of the present embodiment identifies a monitoring target that conforms to a rule in which a pattern of appearance histories is defined, and also identifies the monitoring targets related to that monitoring target. Specifically, when a monitoring target that matches the rule is specified, the behavior detection unit 25 specifies the monitoring targets related to the specified monitoring target from the person relation storage unit 34.
- the result output unit 17 outputs the related monitoring target as well as the monitoring target conforming to the rule.
- the behavior detection unit 25 may also identify a monitoring target that appears near an event that has occurred and output the monitoring targets related to it. That is, in the present embodiment, the appearance history comparison unit 32 specifies related monitoring targets based on the similarity of the appearance histories of different monitoring targets. Therefore, not only the monitoring target that conforms to the rule but also the monitoring targets estimated to be related to it can be specified.
- the person-related management unit 33 is realized by a CPU of a computer that operates according to a program (video analysis program), for example.
- each of these units may also be realized by dedicated hardware.
- the appearance history storage unit 22, the rule storage unit 24, and the person-related storage unit 34 are realized by, for example, a magnetic disk.
- the behavior detection unit 25 specifies a monitoring target that conforms to the rule, and also specifies a monitoring target related to the monitoring target.
- the behavior detection unit 25 identifies a related monitoring target from among a plurality of monitoring targets that match the rule.
- the behavior detection unit 25 specifies a plurality of monitoring targets that conform to the rule.
- the behavior detection unit 25 determines whether or not a related monitoring target is included in the plurality of specified monitoring targets from the person related storage unit 34.
- the behavior detection unit 25 identifies the related monitoring targets. As described above, for example, the following situation can be estimated by determining whether or not a monitoring target conforming to the rule is related.
- for example, the plurality of people may be a pickpocket group. A single person exhibiting suspicious behavior may be doing so by coincidence. However, if two or more related companions are exhibiting suspicious behavior, those companions may be a suspicious group.
- the behavior detection unit 25 may specify the monitoring target using information on the rule (event) used when specifying the monitoring target. In this way, for example, the following situation can be estimated by using the rule (event) used when specifying the monitoring target.
- for example, the behavior detection unit 25 may identify, among the monitoring targets conforming to one rule, a related monitoring target that also conforms to a different rule.
- as described above, in the present embodiment, the behavior detection unit 25 identifies related monitoring targets, based on the similarity of their appearance histories, from among the plurality of monitoring targets that match the rule, so that the accuracy of specifying the monitoring target can be further increased.
- FIG. 16 is a block diagram showing an outline of a video analysis system according to the present invention.
- the video analysis system according to the present invention includes: a monitoring target shooting information generation unit 81 that extracts, from video (for example, each video shot by the video acquisition unit 11), monitoring target identification information (for example, a face image or a license plate image) used for estimating the identity of a monitoring target (for example, a person or a car), and generates monitoring target shooting information including the extracted monitoring target identification information and the shooting time at which the monitoring target was shot; an appearance history generation unit 82 (for example, the person identification information matching unit 15 and the appearance history management unit 21) that generates the appearance history of monitoring targets estimated to be identical; and a specifying unit 83 (for example, the behavior detection unit 25) that specifies a monitoring target whose appearance history conforms to a prescribed rule.
- The monitoring target shooting information generation unit 81 may generate monitoring target shooting information that includes the shooting position (for example, the camera installation position, the shooting device identification information, or the camera shooting range) at which the monitoring target was shot, and the appearance history generation unit 82 may generate an appearance history that includes the shooting position. The specifying unit 83 may then specify a monitoring target whose generated appearance history conforms to a rule (for example, the first to fourth rules described above) defined by the relationship between two or more appearance histories that include shooting positions and shooting times. With such a configuration, the monitoring target can be specified based on the appearance information (position and time) included in the appearance history.
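The three units above (generation unit 81, appearance history generation unit 82, specifying unit 83) form a small pipeline, which can be sketched as follows. This is an illustrative Python sketch under stated assumptions, not the patented implementation: the identity estimation is reduced to a precomputed `target_id`, and the example rule (appearing in three or more areas) mirrors the stadium example given in the description.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class ShootingInfo:
    target_id: str  # stands in for the extracted identification information
    position: str   # shooting position (e.g. camera installation position)
    time: int       # shooting time

def build_appearance_histories(infos):
    """Appearance-history generation: group shooting information judged to
    describe the same target (here, by target_id) and order it by time."""
    histories = defaultdict(list)
    for info in infos:
        histories[info.target_id].append(info)
    return {tid: sorted(h, key=lambda i: i.time) for tid, h in histories.items()}

def specify(histories, rule):
    """Specifying unit: return the targets whose appearance history conforms
    to the given rule (a predicate over one history)."""
    return [tid for tid, h in histories.items() if rule(h)]

def appears_in_many_areas(history, limit=3):
    """Example rule: the target appears in `limit` or more distinct areas."""
    return len({i.position for i in history}) >= limit
```

A deployment would substitute real identity collation (e.g. face matching) for the `target_id` grouping key and store the histories persistently rather than in memory.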
- The specifying unit 83 may specify monitoring targets having a predetermined number or more of appearance-history entries whose shooting times are close to the occurrence date and time of an event and whose shooting positions are close to the occurrence location of the event.
- The video analysis system may include a detection unit (for example, the event detection unit 31) that detects an event that has occurred. The monitoring target shooting information generation unit 81 may then generate monitoring target shooting information only for video from a certain period before and after the detected event. With such a configuration, the amount of video to be analyzed can be reduced.
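The event-window restriction described above can be sketched as a simple time filter. The ten-minute window and the use of plain timestamps are illustrative assumptions; the specification only requires "a certain period before and after" the detected event.

```python
def within_event_window(record_time, event_time, window=600):
    """True if the record falls within +/- `window` seconds of the event."""
    return abs(record_time - event_time) <= window

def filter_records(records, event_times, window=600):
    """Keep only timestamped records near at least one detected event, so
    that downstream shooting-information generation sees less video."""
    return [r for r in records
            if any(within_event_window(r, e, window) for e in event_times)]
```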
- The video analysis system may include a relevance specifying unit (for example, the appearance history comparison unit 32) that specifies related monitoring targets based on the similarity of the appearance histories of different monitoring targets. With such a configuration, it is possible to specify not only the monitoring targets that conform to the rule but also monitoring targets estimated to be related to them.
- The relevance specifying means may specify related monitoring targets, based on the similarity of their appearance histories, from among a plurality of monitoring targets that match the rule. With such a configuration, the accuracy with which the monitoring target is specified can be further increased.
- The monitoring target shooting information generation unit 81 may extract a face image as the monitoring target identification information. The appearance history generation unit 82 may then estimate whether monitoring targets are identical by comparing the face images included in the generated monitoring target shooting information.
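The face-image comparison above would in practice use a dedicated face matcher; as a hedged sketch, each face can be stood in for by a feature vector, with two shooting records judged to show the same target when the cosine similarity of their vectors exceeds a threshold. The vectors and the 0.9 threshold are purely illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def same_target(face_a, face_b, threshold=0.9):
    """Estimate whether two face feature vectors depict the same person."""
    return cosine(face_a, face_b) >= threshold
```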
- A rule may be defined specifying that, after an appearance history indicating a predetermined shooting position, there is no appearance history within a certain period indicating another shooting position at which the target would be expected to appear based on that shooting position. The rule storage unit 24 may store this rule, and the specifying unit 83 may specify a monitoring target that conforms to it.
- The present invention is preferably applied to, for example, a video analysis system that automatically detects, from video captured by surveillance cameras, a person who moves according to a specific rule within a specific area.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Emergency Management (AREA)
- Business, Economics & Management (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Alarm Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Collating Specific Patterns (AREA)
Abstract
Description
FIG. 1 is a block diagram showing a configuration example of a first embodiment of a video analysis system according to the present invention. The video analysis system of this embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information collation unit 15, an appearance history management unit 21, an appearance history storage unit 22, a rule management unit 23, a rule storage unit 24, a behavior detection unit 25, and a result output unit 17. The video analysis system may be connected to an output result storage unit 20 that stores output results.
For example, suppose there is a stadium with four spectator areas: the north, south, east, and west stands. An ordinary spectator normally appears in the one area containing his or her own seat. Some spectators may appear in two areas by occasionally moving to an adjacent area, but movement that appears in three areas can be regarded as suspicious.
For example, suppose there are two plazas A and B located some distance apart. A person whose appearance history is observed as A, A can be judged to be simply lingering in plaza A, and a person whose appearance history is observed as A, A, B, B can be judged to have moved from plaza A to plaza B. On the other hand, a person whose appearance history is observed as A, B, A, B, A, B can be judged to be wandering unnaturally between plazas A and B, so this movement can be regarded as suspicious.
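The plaza judgement above (A, A is lingering; A, A, B, B is moving; A, B, A, B, A, B is unnatural wandering) can be sketched by compressing runs of identical positions and counting the switches. The threshold of three switches is an assumption chosen so that only the alternating pattern triggers.

```python
def compress(positions):
    """Collapse runs of the same position: A, A, B, B, A -> A, B, A."""
    out = []
    for p in positions:
        if not out or out[-1] != p:
            out.append(p)
    return out

def is_wandering(positions, min_switches=3):
    """True when the target switches location at least `min_switches` times,
    i.e. the compressed history reads like A, B, A, B, ..."""
    return len(compress(positions)) - 1 >= min_switches
```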
For example, a person who passes through the entrance of a store normally also appears, within about an hour, at the checkout counter, the place necessary to achieve the store's intended purpose. Likewise, a person who passes through the entrance of a bank normally also appears, within about an hour, at a teller window. Some people enter a facility without using it, but movement such as repeatedly entering a facility in the morning and evening, or on different days, without ever appearing at the place necessary for the facility's intended purpose can be regarded as suspicious.
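The entrance-without-checkout judgement above corresponds to the rule that no expected follow-up shooting position appears within a certain period. A minimal sketch, with position names taken from the store example and the one-hour interval from the text; the function name and data shape are assumptions.

```python
def missing_followup(history, entry, followup, interval=3600):
    """history: list of (position, time) pairs for one target.
    True if some appearance at `entry` is never followed by an appearance
    at `followup` within `interval` seconds, i.e. the target entered but
    never reached the place the facility is actually for."""
    followup_times = [t for p, t in history if p == followup]
    for p, t in history:
        if p == entry and not any(t <= ft <= t + interval for ft in followup_times):
            return True
    return False
```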
For example, at a station during hours when many people are waiting to meet someone, a person who appears repeatedly can be assumed to be simply looking for a companion or killing time, so repeated appearances can hardly be called suspicious. However, movement that appears repeatedly during commuting hours, early in the morning, or late at night can be regarded as suspicious.
Next, a second embodiment of the video analysis system according to the present invention will be described. In the first embodiment, each time the person identification information analysis unit 12 extracted monitoring target shooting information, the appearance history management unit 21 generated the appearance history of monitoring targets estimated to be identical. In contrast, the video analysis system of this embodiment generates the appearance histories in a batch.
Next, a third embodiment of the video analysis system according to the present invention will be described. The video analysis system of this embodiment uses the occurrence time and occurrence location of events in its rule. Specifically, the rule defined in this embodiment extracts monitoring targets having a predetermined number or more of appearance-history entries whose shooting times are close to the occurrence date and time of an event and whose shooting positions are close to the occurrence location of the event.
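The third-embodiment rule above can be sketched by counting, for each target, how many of its appearances are close in time and place to a recorded event. Events and appearances are (position, time) pairs here; the ten-minute closeness window, the same-position test, and the minimum of two hits are illustrative assumptions standing in for "close" and "a predetermined number".

```python
def near(appearance, event, max_dt=600):
    """True if an appearance is close in time and place to an event."""
    (pos_a, t_a), (pos_e, t_e) = appearance, event
    return abs(t_a - t_e) <= max_dt and pos_a == pos_e

def suspects(histories, events, min_hits=2):
    """Targets whose histories contain at least `min_hits` appearances
    near event occurrences (e.g. someone present near several incidents)."""
    out = []
    for tid, history in histories.items():
        hits = sum(1 for a in history if any(near(a, e) for e in events))
        if hits >= min_hits:
            out.append(tid)
    return out
```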
Next, a fourth embodiment of the video analysis system according to the present invention will be described. Like the third embodiment, the video analysis system of this embodiment uses the occurrence time and occurrence location of events in its rule. However, whereas in the third embodiment the event information was set in the event storage unit 29 in advance, in this embodiment event information is stored in the event storage unit 29 as events occur.
Next, a fifth embodiment of the video analysis system according to the present invention will be described. In this embodiment, not only the monitoring targets that conform to the rule but also the monitoring targets assumed to be related to them are identified.
12 Person identification information analysis unit
13 Person identification information management unit
15 Person identification information collation unit
17 Result output unit
18 Person identification information storage unit
20 Output result storage unit
21 Appearance history management unit
22 Appearance history storage unit
23 Rule management unit
24 Rule storage unit
25 Behavior detection unit
28 Event management unit
29 Event storage unit
31 Event detection unit
32 Appearance history comparison unit
33 Person relation management unit
34 Person relation storage unit
u1 to u10 Persons
Claims (12)
- A video analysis system comprising: monitoring target shooting information generation means for extracting, from each video, monitoring target identification information, which is identification information of a monitoring target used to estimate the identity of the monitoring target, and for generating monitoring target shooting information including the extracted monitoring target identification information and a shooting time at which the monitoring target was shot; appearance history generation means for generating, from a plurality of pieces of generated monitoring target shooting information, an appearance history of monitoring targets estimated to be identical; and specifying means for specifying a monitoring target whose appearance history conforms to a defined rule.
- The video analysis system according to claim 1, wherein the monitoring target shooting information generation means generates monitoring target shooting information including a shooting position at which the monitoring target was shot, the appearance history generation means generates an appearance history including the shooting position, and the specifying means specifies a monitoring target whose generated appearance history conforms to a rule defined by a relationship between two or more appearance histories including the shooting position and the shooting time.
- The video analysis system according to claim 2, wherein the specifying means specifies a monitoring target having a predetermined number or more of appearance histories with shooting times close to an occurrence date and time of an event and shooting positions close to an occurrence location of the event.
- The video analysis system according to claim 3, further comprising detection means for detecting an event that occurs, wherein the monitoring target shooting information generation means generates the monitoring target shooting information from video of a certain period before and after the event that has occurred.
- The video analysis system according to any one of claims 1 to 4, further comprising relevance specifying means for specifying related monitoring targets based on a similarity of appearance histories of different monitoring targets.
- The video analysis system according to claim 5, wherein the relevance specifying means specifies related monitoring targets, based on the similarity of their appearance histories, from among a plurality of monitoring targets that conform to the rule.
- The video analysis system according to any one of claims 1 to 6, wherein the monitoring target shooting information generation means extracts a face image as the monitoring target identification information, and the appearance history generation means estimates whether monitoring targets are identical by comparing face images included in the generated monitoring target shooting information.
- The video analysis system according to claim 2, wherein a rule is defined specifying that, after an appearance history indicating a predetermined shooting position, there is no appearance history within a certain period indicating another shooting position at which appearance is expected based on the shooting position, and the specifying means specifies a monitoring target that conforms to the rule.
- A video analysis method comprising: extracting, from each video, monitoring target identification information, which is identification information of a monitoring target used to estimate the identity of the monitoring target; generating monitoring target shooting information including the extracted monitoring target identification information and a shooting time at which the monitoring target was shot; generating, from a plurality of pieces of generated monitoring target shooting information, an appearance history of monitoring targets estimated to be identical; and specifying a monitoring target whose appearance history conforms to a defined rule.
- The video analysis method according to claim 9, comprising: generating monitoring target shooting information including a shooting position at which the monitoring target was shot; generating an appearance history including the shooting position; and specifying a monitoring target whose generated appearance history conforms to a rule defined by a relationship between two or more appearance histories including the shooting position and the shooting time.
- A video analysis program for causing a computer to execute: a monitoring target shooting information generation process of extracting, from each video, monitoring target identification information, which is identification information of a monitoring target used to estimate the identity of the monitoring target, and generating monitoring target shooting information including the extracted monitoring target identification information and a shooting time at which the monitoring target was shot; an appearance history generation process of generating, from a plurality of pieces of generated monitoring target shooting information, an appearance history of monitoring targets estimated to be identical; and a specifying process of specifying a monitoring target whose appearance history conforms to a defined rule.
- The video analysis program according to claim 11, causing the computer to generate, in the monitoring target shooting information generation process, monitoring target shooting information including a shooting position at which the monitoring target was shot; to generate, in the appearance history generation process, an appearance history including the shooting position; and to specify, in the specifying process, a monitoring target whose generated appearance history conforms to a rule defined by a relationship between two or more appearance histories including the shooting position and the shooting time.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/307,600 US10552713B2 (en) | 2014-04-28 | 2015-02-05 | Image analysis system, image analysis method, and storage medium |
JP2016515843A JPWO2015166612A1 (ja) | 2014-04-28 | 2015-02-05 | 映像解析システム、映像解析方法および映像解析プログラム |
US16/523,619 US11157778B2 (en) | 2014-04-28 | 2019-07-26 | Image analysis system, image analysis method, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-092401 | 2014-04-28 | ||
JP2014092401 | 2014-04-28 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/307,600 A-371-Of-International US10552713B2 (en) | 2014-04-28 | 2015-02-05 | Image analysis system, image analysis method, and storage medium |
US16/523,619 Continuation US11157778B2 (en) | 2014-04-28 | 2019-07-26 | Image analysis system, image analysis method, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015166612A1 true WO2015166612A1 (ja) | 2015-11-05 |
Family
ID=54358359
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/000528 WO2015166612A1 (ja) | 2014-04-28 | 2015-02-05 | 映像解析システム、映像解析方法および映像解析プログラム |
Country Status (3)
Country | Link |
---|---|
US (2) | US10552713B2 (ja) |
JP (1) | JPWO2015166612A1 (ja) |
WO (1) | WO2015166612A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018057523A (ja) * | 2016-10-04 | 2018-04-12 | 株式会社三洋物産 | 遊技場用システム |
CN109743541A (zh) * | 2018-12-15 | 2019-05-10 | 深圳壹账通智能科技有限公司 | 智能监控方法、装置、计算机设备及存储介质 |
JP2019087842A (ja) * | 2017-11-06 | 2019-06-06 | 京セラドキュメントソリューションズ株式会社 | 監視システム |
JP2019153296A (ja) * | 2016-03-30 | 2019-09-12 | 日本電気株式会社 | 解析装置、解析方法及びプログラム |
JP2019213116A (ja) * | 2018-06-07 | 2019-12-12 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JPWO2018116488A1 (ja) * | 2016-12-22 | 2019-12-12 | 日本電気株式会社 | 解析サーバ、監視システム、監視方法及びプログラム |
JP2021013188A (ja) * | 2018-09-06 | 2021-02-04 | 日本電気株式会社 | 方法、装置およびプログラム |
CN112507760A (zh) * | 2019-09-16 | 2021-03-16 | 杭州海康威视数字技术股份有限公司 | 暴力分拣行为的检测方法、装置及设备 |
WO2021140966A1 (en) * | 2020-01-07 | 2021-07-15 | Nec Corporation | Method, apparatus and non-transitory computer readable medium |
JP7380812B2 (ja) | 2018-09-06 | 2023-11-15 | 日本電気株式会社 | 識別方法、識別装置、識別システム及び識別プログラム |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016157825A1 (ja) * | 2015-03-30 | 2016-10-06 | 日本電気株式会社 | 監視システム、監視対象装置、制御方法、及び記録媒体 |
US10654422B2 (en) | 2016-08-29 | 2020-05-19 | Razmik Karabed | View friendly monitor systems |
CN107480246B (zh) * | 2017-08-10 | 2021-03-12 | 北京中航安通科技有限公司 | 一种关联人员的识别方法及装置 |
US10186124B1 (en) | 2017-10-26 | 2019-01-22 | Scott Charles Mullins | Behavioral intrusion detection system |
CN107820010B (zh) * | 2017-11-17 | 2020-11-06 | 英业达科技有限公司 | 摄影计数装置 |
US11615623B2 (en) | 2018-02-19 | 2023-03-28 | Nortek Security & Control Llc | Object detection in edge devices for barrier operation and parcel delivery |
US11295139B2 (en) | 2018-02-19 | 2022-04-05 | Intellivision Technologies Corp. | Human presence detection in edge devices |
CN109348120B (zh) * | 2018-09-30 | 2021-07-20 | 烽火通信科技股份有限公司 | 一种拍摄方法、图像的显示方法、系统及设备 |
JP2022527661A (ja) * | 2019-04-10 | 2022-06-02 | スコット チャールズ マリンズ、 | 監視システム |
CN110188594B (zh) * | 2019-04-12 | 2021-04-06 | 南昌嘉研科技有限公司 | 一种基于计算机视觉的目标识别与定位方法 |
US11388354B2 (en) | 2019-12-06 | 2022-07-12 | Razmik Karabed | Backup-camera-system-based, on-demand video player |
JP2022122498A (ja) * | 2021-02-10 | 2022-08-23 | 富士通株式会社 | 移動履歴変更方法及び移動履歴変更プログラム |
CN113392800A (zh) * | 2021-06-30 | 2021-09-14 | 浙江商汤科技开发有限公司 | 一种行为检测方法、装置、计算机设备和存储介质 |
JP2023073535A (ja) * | 2021-11-16 | 2023-05-26 | 富士通株式会社 | 表示プログラム及び表示方法 |
JP2023086471A (ja) * | 2021-12-10 | 2023-06-22 | トヨタ自動車株式会社 | 防犯システム、防犯方法、及び防犯プログラム |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003143593A (ja) * | 2001-11-05 | 2003-05-16 | Hitachi Eng Co Ltd | 画像処理による監視方法およびシステム |
JP2007219948A (ja) * | 2006-02-17 | 2007-08-30 | Advanced Telecommunication Research Institute International | ユーザ異常検出装置、及びユーザ異常検出方法 |
WO2008111459A1 (ja) * | 2007-03-06 | 2008-09-18 | Kabushiki Kaisha Toshiba | 不審行動検知システム及び方法 |
JP2010238184A (ja) * | 2009-03-31 | 2010-10-21 | Sogo Keibi Hosho Co Ltd | 警備装置 |
JP2010272948A (ja) * | 2009-05-19 | 2010-12-02 | Mitsubishi Electric Corp | 遠隔映像取得システム |
JP2013008298A (ja) * | 2011-06-27 | 2013-01-10 | Secom Co Ltd | 警備システム |
JP2013131153A (ja) * | 2011-12-22 | 2013-07-04 | Welsoc Co Ltd | 自律型防犯警戒システム及び自律型防犯警戒方法 |
WO2014045843A1 (ja) * | 2012-09-19 | 2014-03-27 | 日本電気株式会社 | 画像処理システム、画像処理方法及びプログラム |
Family Cites Families (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050162515A1 (en) * | 2000-10-24 | 2005-07-28 | Objectvideo, Inc. | Video surveillance system |
US20050146605A1 (en) * | 2000-10-24 | 2005-07-07 | Lipton Alan J. | Video surveillance system employing video primitives |
US9892606B2 (en) * | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US7868912B2 (en) * | 2000-10-24 | 2011-01-11 | Objectvideo, Inc. | Video surveillance system employing video primitives |
WO2003028376A1 (en) * | 2001-09-14 | 2003-04-03 | Vislog Technology Pte Ltd | Customer service counter/checkpoint registration system with video/image capturing, indexing, retrieving and black list matching function |
GB0218982D0 (en) * | 2002-08-15 | 2002-09-25 | Roke Manor Research | Video motion anomaly detector |
WO2007018523A2 (en) * | 2004-07-28 | 2007-02-15 | Sarnoff Corporation | Method and apparatus for stereo, multi-camera tracking and rf and video track fusion |
US20060072010A1 (en) * | 2004-09-24 | 2006-04-06 | Objectvideo, Inc. | Target property maps for surveillance systems |
US7623676B2 (en) * | 2004-12-21 | 2009-11-24 | Sarnoff Corporation | Method and apparatus for tracking objects over a wide area using a network of stereo sensors |
US7825954B2 (en) * | 2005-05-31 | 2010-11-02 | Objectvideo, Inc. | Multi-state target tracking |
US20090041297A1 (en) * | 2005-05-31 | 2009-02-12 | Objectvideo, Inc. | Human detection and tracking for security applications |
US7796780B2 (en) * | 2005-06-24 | 2010-09-14 | Objectvideo, Inc. | Target detection and tracking from overhead video streams |
US7801330B2 (en) * | 2005-06-24 | 2010-09-21 | Objectvideo, Inc. | Target detection and tracking from video streams |
US7579965B2 (en) * | 2006-03-03 | 2009-08-25 | Andrew Bucholz | Vehicle data collection and processing system |
US10078693B2 (en) * | 2006-06-16 | 2018-09-18 | International Business Machines Corporation | People searches by multisensor event correlation |
TW200822751A (en) * | 2006-07-14 | 2008-05-16 | Objectvideo Inc | Video analytics for retail business process monitoring |
US20080074496A1 (en) * | 2006-09-22 | 2008-03-27 | Object Video, Inc. | Video analytics for banking business process monitoring |
JP5040258B2 (ja) * | 2006-10-23 | 2012-10-03 | 株式会社日立製作所 | 映像監視装置、映像監視システムおよび画像処理方法 |
JP5284599B2 (ja) * | 2007-03-30 | 2013-09-11 | 株式会社日立国際電気 | 画像処理装置 |
US20090002489A1 (en) * | 2007-06-29 | 2009-01-01 | Fuji Xerox Co., Ltd. | Efficient tracking multiple objects through occlusion |
EP2174310A4 (en) * | 2007-07-16 | 2013-08-21 | Cernium Corp | DEVICE AND METHOD FOR VERIFYING VIDEO ALARMS |
JP5141317B2 (ja) * | 2008-03-14 | 2013-02-13 | オムロン株式会社 | 対象画像検出デバイス、制御プログラム、および該プログラムを記録した記録媒体、ならびに対象画像検出デバイスを備えた電子機器 |
US8320613B2 (en) * | 2008-06-04 | 2012-11-27 | Lockheed Martin Corporation | Detecting and tracking targets in images based on estimated target geometry |
JP2010113692A (ja) * | 2008-11-10 | 2010-05-20 | Nec Corp | 顧客行動記録装置及び顧客行動記録方法並びにプログラム |
US20100299021A1 (en) * | 2009-05-21 | 2010-11-25 | Reza Jalili | System and Method for Recording Data Associated with Vehicle Activity and Operation |
US8460220B2 (en) * | 2009-12-18 | 2013-06-11 | General Electric Company | System and method for monitoring the gait characteristics of a group of individuals |
WO2011102416A1 (ja) * | 2010-02-19 | 2011-08-25 | 株式会社 東芝 | 移動物体追跡システムおよび移動物体追跡方法 |
US9143843B2 (en) * | 2010-12-09 | 2015-09-22 | Sealed Air Corporation | Automated monitoring and control of safety in a production area |
US20120213404A1 (en) * | 2011-02-18 | 2012-08-23 | Google Inc. | Automatic event recognition and cross-user photo clustering |
JP5702663B2 (ja) | 2011-05-10 | 2015-04-15 | 日本放送協会 | 顔画像認識装置及び顔画像認識プログラム |
US8792684B2 (en) * | 2011-08-11 | 2014-07-29 | At&T Intellectual Property I, L.P. | Method and apparatus for automated analysis and identification of a person in image and video content |
JP2013088870A (ja) | 2011-10-13 | 2013-05-13 | Toyota Infotechnology Center Co Ltd | 不審者検知装置、不審者検知方法およびプログラム |
US8682036B2 (en) * | 2012-04-06 | 2014-03-25 | Xerox Corporation | System and method for street-parking-vehicle identification through license plate capturing |
JP6080409B2 (ja) * | 2012-07-09 | 2017-02-15 | キヤノン株式会社 | 情報処理装置、情報処理方法、およびプログラム |
US8781293B2 (en) * | 2012-08-20 | 2014-07-15 | Gorilla Technology Inc. | Correction method for object linking across video sequences in a multiple camera video surveillance system |
US20140078304A1 (en) * | 2012-09-20 | 2014-03-20 | Cloudcar, Inc. | Collection and use of captured vehicle data |
SG11201508085QA (en) * | 2013-03-29 | 2015-11-27 | Nec Corp | Target object identifying device, target object identifying method and target object identifying program |
JP6098318B2 (ja) * | 2013-04-15 | 2017-03-22 | オムロン株式会社 | 画像処理装置、画像処理方法、画像処理プログラムおよび記録媒体 |
US9904852B2 (en) * | 2013-05-23 | 2018-02-27 | Sri International | Real-time object detection, tracking and occlusion reasoning |
JP5438861B1 (ja) * | 2013-07-11 | 2014-03-12 | パナソニック株式会社 | 追跡支援装置、追跡支援システムおよび追跡支援方法 |
US10410278B2 (en) * | 2013-07-25 | 2019-09-10 | The Crawford Group, Inc. | Method and apparatus for integrated image capture for vehicles to track damage |
US9111143B2 (en) * | 2013-09-27 | 2015-08-18 | At&T Mobility Ii Llc | Method and apparatus for image collection and analysis |
KR102126868B1 (ko) * | 2013-11-15 | 2020-06-25 | 한화테크윈 주식회사 | 영상 처리 장치 및 방법 |
JP6323465B2 (ja) * | 2014-01-15 | 2018-05-16 | 富士通株式会社 | アルバム作成プログラム、アルバム作成方法およびアルバム作成装置 |
US9760792B2 (en) * | 2015-03-20 | 2017-09-12 | Netra, Inc. | Object detection and classification |
- 2015-02-05 US US15/307,600 patent/US10552713B2/en active Active
- 2015-02-05 WO PCT/JP2015/000528 patent/WO2015166612A1/ja active Application Filing
- 2015-02-05 JP JP2016515843A patent/JPWO2015166612A1/ja active Pending
- 2019-07-26 US US16/523,619 patent/US11157778B2/en active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003143593A (ja) * | 2001-11-05 | 2003-05-16 | Hitachi Eng Co Ltd | 画像処理による監視方法およびシステム |
JP2007219948A (ja) * | 2006-02-17 | 2007-08-30 | Advanced Telecommunication Research Institute International | ユーザ異常検出装置、及びユーザ異常検出方法 |
WO2008111459A1 (ja) * | 2007-03-06 | 2008-09-18 | Kabushiki Kaisha Toshiba | 不審行動検知システム及び方法 |
JP2010238184A (ja) * | 2009-03-31 | 2010-10-21 | Sogo Keibi Hosho Co Ltd | 警備装置 |
JP2010272948A (ja) * | 2009-05-19 | 2010-12-02 | Mitsubishi Electric Corp | 遠隔映像取得システム |
JP2013008298A (ja) * | 2011-06-27 | 2013-01-10 | Secom Co Ltd | 警備システム |
JP2013131153A (ja) * | 2011-12-22 | 2013-07-04 | Welsoc Co Ltd | 自律型防犯警戒システム及び自律型防犯警戒方法 |
WO2014045843A1 (ja) * | 2012-09-19 | 2014-03-27 | 日本電気株式会社 | 画像処理システム、画像処理方法及びプログラム |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11176698B2 (en) | 2016-03-30 | 2021-11-16 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
US11094076B2 (en) | 2016-03-30 | 2021-08-17 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
JP2019153296A (ja) * | 2016-03-30 | 2019-09-12 | 日本電気株式会社 | 解析装置、解析方法及びプログラム |
JP2020107354A (ja) * | 2016-03-30 | 2020-07-09 | 日本電気株式会社 | 解析装置、解析方法及びプログラム |
JP2018057523A (ja) * | 2016-10-04 | 2018-04-12 | 株式会社三洋物産 | 遊技場用システム |
JPWO2018116488A1 (ja) * | 2016-12-22 | 2019-12-12 | 日本電気株式会社 | 解析サーバ、監視システム、監視方法及びプログラム |
JP7040463B2 (ja) | 2016-12-22 | 2022-03-23 | 日本電気株式会社 | 解析サーバ、監視システム、監視方法及びプログラム |
JP2019087842A (ja) * | 2017-11-06 | 2019-06-06 | 京セラドキュメントソリューションズ株式会社 | 監視システム |
JP2019213116A (ja) * | 2018-06-07 | 2019-12-12 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP7210163B2 (ja) | 2018-06-07 | 2023-01-23 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP7380812B2 (ja) | 2018-09-06 | 2023-11-15 | 日本電気株式会社 | 識別方法、識別装置、識別システム及び識別プログラム |
JP2021013188A (ja) * | 2018-09-06 | 2021-02-04 | 日本電気株式会社 | 方法、装置およびプログラム |
US11250251B2 (en) | 2018-09-06 | 2022-02-15 | Nec Corporation | Method for identifying potential associates of at least one target person, and an identification device |
JP7302566B2 (ja) | 2018-09-06 | 2023-07-04 | 日本電気株式会社 | 方法、装置およびプログラム |
CN109743541A (zh) * | 2018-12-15 | 2019-05-10 | 深圳壹账通智能科技有限公司 | 智能监控方法、装置、计算机设备及存储介质 |
CN112507760A (zh) * | 2019-09-16 | 2021-03-16 | 杭州海康威视数字技术股份有限公司 | 暴力分拣行为的检测方法、装置及设备 |
CN112507760B (zh) * | 2019-09-16 | 2024-05-31 | 杭州海康威视数字技术股份有限公司 | 暴力分拣行为的检测方法、装置及设备 |
WO2021140966A1 (en) * | 2020-01-07 | 2021-07-15 | Nec Corporation | Method, apparatus and non-transitory computer readable medium |
JP7380889B2 (ja) | 2020-01-07 | 2023-11-15 | 日本電気株式会社 | 方法、装置、及び、プログラム |
JP2023509059A (ja) * | 2020-01-07 | 2023-03-06 | 日本電気株式会社 | 方法、装置、及び、プログラム |
Also Published As
Publication number | Publication date |
---|---|
US20190347528A1 (en) | 2019-11-14 |
US20170053191A1 (en) | 2017-02-23 |
US10552713B2 (en) | 2020-02-04 |
US11157778B2 (en) | 2021-10-26 |
JPWO2015166612A1 (ja) | 2017-04-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015166612A1 (ja) | 映像解析システム、映像解析方法および映像解析プログラム | |
JP6406246B2 (ja) | 対象物特定装置、対象物特定方法および対象物特定プログラム | |
WO2018180588A1 (ja) | 顔画像照合システムおよび顔画像検索システム | |
JP7405200B2 (ja) | 人物検出システム | |
JP6508041B2 (ja) | 対象物監視システム、対象物監視方法および監視対象抽出プログラム | |
US9704264B2 (en) | Method for tracking a target in an image sequence, taking the dynamics of the target into consideration | |
JP6233624B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
US20100318566A1 (en) | Behavior history retrieval apparatus and behavior history retrieval method | |
KR101541272B1 (ko) | 사람들의 움직임 불규칙성을 이용한 폭력 행위 검출 장치 및 방법 | |
CN111814510B (zh) | 一种遗留物主体检测方法及装置 | |
WO2018179202A1 (ja) | 情報処理装置、制御方法、及びプログラム | |
CN110717357B (zh) | 预警方法、装置、电子设备及存储介质 | |
JP5758165B2 (ja) | 物品検出装置および静止人物検出装置 | |
JP6253950B2 (ja) | 画像監視システム | |
KR20190047218A (ko) | 교통정보 제공 방법, 장치 및 그러한 방법을 실행하기 위하여 매체에 저장된 컴퓨터 프로그램 | |
KR101848367B1 (ko) | 모션벡터와 dct 계수를 이용한 메타데이터 기반 의심영상 구분식별 방식의 영상 관제 방법 | |
US9959460B2 (en) | Re-wandering alarm system and method | |
CN113744443B (zh) | 一种闸机通道防欺骗控制方法、装置、设备及存储介质 | |
JP2019159377A (ja) | 監視システム、サーバ装置、監視方法、及び監視プログラム | |
WO2012074366A2 (en) | A system and a method for detecting a loitering event | |
WO2012074352A1 (en) | System and method to detect loitering event in a region | |
JP2019046053A (ja) | 監視システム、監視方法 | |
CN115546737A (zh) | 一种机房监控方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15786699 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016515843 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15307600 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15786699 Country of ref document: EP Kind code of ref document: A1 |