WO2014155922A1 - Object identification device, object identification method, and object identification program - Google Patents
Object identification device, object identification method, and object identification program
- Publication number
- WO2014155922A1 (PCT/JP2014/000523)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time
- target
- person
- monitoring target
- shooting
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/53—Recognition of crowd images, e.g. recognition of crowd congestion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
- G06V40/173—Classification, e.g. identification face re-identification, e.g. recognising unknown faces across different face tracks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
Definitions
- The present invention relates to an object identification device, an object identification method, and an object identification program for identifying a desired object from among monitoring targets.
- Surveillance cameras are installed at stations and specific facilities, and various judgments are made by analyzing the video they capture. As one such judgment, a person or object that remains unnaturally long within the monitored range is identified as a suspicious person or suspicious object.
- A behavior analysis method that tracks a specific person and analyzes that person's behavior is known.
- In this behavior analysis method, one camera or a plurality of cameras with overlapping monitoring areas is used to recognize where a specific person is, and by tracking how the person's position changes over time, the method identifies where and for how long the person has stayed.
- Patent Document 1 describes a face image recognition device that speeds up face image recognition processing and simplifies registration work.
- The face image recognition device described in Patent Document 1 registers a frontal face image of the person to be recognized together with average face images in directions other than the front.
- It then recognizes a face in the video by extracting the face area from the video and comparing its features with the registered face images.
- The range in which a person's behavior can be analyzed depends on the shooting range of the camera. For example, when one camera is used, the range in which a person's behavior can be analyzed is limited to the range that camera can shoot. When trying to cover a large area, a large number of cameras is needed so that there is no gap in the monitored range. Furthermore, even when an attempt is made to shoot a person in a crowded situation, it is often difficult to see a specific person because the person is hidden behind other people.
- The present invention aims to provide an object identification device, an object identification method, and an object identification program that can identify an object that remains unnaturally long within the monitored range, even when that range is wide or crowded.
- The object identification device includes identification means for collating monitoring targets shown in video captured by one or more imaging devices and identifying monitoring targets estimated to be the same, and object identification means for identifying a desired object from among the identified monitoring targets (specific monitoring targets) using their shooting times.
- The object identification method collates monitoring targets shown in video captured by one or more imaging devices, identifies monitoring targets estimated to be the same, and identifies a desired object from among the identified monitoring targets using their shooting times.
- The object identification program causes a computer to execute an identification process that collates monitoring targets shown in video captured by one or more imaging devices and identifies monitoring targets estimated to be the same, and an object identification process that identifies a desired object from among the identified monitoring targets using their shooting times.
- According to the present invention, even when the range to be monitored is wide or crowded, an object that remains unnaturally long within that range can be identified from among the monitoring targets.
- The monitoring target is not limited to a person.
- The monitoring target may also be an object such as an automobile.
- In that case, an automobile that has been stopped for a long time within the monitored range is identified as the object.
- Objects thus include not only things but also people.
- FIG. 1 is a block diagram illustrating a configuration example of an embodiment of an object identification device according to the present invention.
- The object identification device of this embodiment includes a video acquisition unit 11, a person identification information analysis unit 12, a person identification information management unit 13, a long-time stayer estimation unit 14, a person identification information matching unit 15, a person information management unit 16, a result output unit 17, a person identification information storage unit 18, and a person information storage unit 19.
- The video acquisition unit 11 acquires video of a predetermined monitoring range.
- The video acquisition unit 11 also records the time at which the video was acquired (hereinafter referred to as the shooting time).
- The video acquisition unit 11 inputs the video of the monitoring range and its shooting time to the person identification information analysis unit 12.
- The video acquisition unit 11 is realized by a photographing device such as a camera, for example.
- When inputting the acquired video to the person identification information analysis unit 12, the video acquisition unit 11 may attach identification information of the photographing device (hereinafter referred to as photographing device identification information).
- The video acquisition unit 11 is installed so that it can photograph areas of particular interest in the facility to be monitored.
- The video acquisition unit 11 does not have to be installed so as to capture the entire facility.
- The person identification information analysis unit 12 analyzes the monitoring targets shown in the video captured by each video acquisition unit 11. Specifically, when the person identification information analysis unit 12 detects a monitoring target in the video, it extracts the identification information of the monitoring target (hereinafter referred to as monitoring target identification information) together with the shooting time at which the monitoring target was shot (the pair is hereinafter referred to as monitoring target shooting information).
- The content of the monitoring target identification information extracted by the person identification information analysis unit 12 is predetermined according to the kind of monitoring target. For example, when the monitoring target is a person, the person identification information analysis unit 12 may extract the person's face image as the monitoring target identification information upon detecting the person in the video. When the monitoring target is an automobile, the person identification information analysis unit 12 may extract an image of the automobile's license plate as the monitoring target identification information.
- The information extracted by the person identification information analysis unit 12 is not limited to a face image or a license plate image.
- Any information that can identify the monitoring target may be extracted.
- Monitoring target identification information used to identify a person can be referred to as person identification information. Since methods for extracting such identification information from a target image are widely known, a detailed description is omitted here.
- When video is input from each video acquisition unit 11, the person identification information analysis unit 12 may attach the photographing device identification information of the video acquisition unit 11 that captured the video to the extracted information.
- The person identification information management unit 13 stores the information analyzed by the person identification information analysis unit 12 in the person identification information storage unit 18. In response to a request from the person information management unit 16, it also extracts the necessary information from the person identification information storage unit 18 and notifies the person information management unit 16 of it.
- The person identification information storage unit 18 stores the information analyzed by the person identification information analysis unit 12. Specifically, it stores an identifier identifying each piece of monitoring target shooting information, the monitoring target identification information included in that shooting information, and the shooting time, in association with one another.
- FIG. 2 is an explanatory diagram illustrating an example of the information stored in the person identification information storage unit 18.
- The example illustrated in FIG. 2 shows the contents stored in the person identification information storage unit 18 when the monitoring target is a person.
- Here, a person image ID is used as the identifier of each piece of monitoring target shooting information, and person identification information is used as the monitoring target identification information.
- The person identification information storage unit 18 may also store the name of the camera that acquired the video (for example, the photographing device identification information) and the person image itself detected by the person identification information analysis unit 12.
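As a concrete illustration, the records of FIG. 2 could be modelled as follows. This is a minimal Python sketch; all names (MonitoringRecord, person_image_id, and so on) are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MonitoringRecord:
    """One piece of monitoring target shooting information (cf. FIG. 2)."""
    person_image_id: int           # identifier of the shooting information
    identification_feature: bytes  # monitoring target identification info (e.g. a face feature)
    shooting_time: float           # shooting time as seconds from some origin
    camera_name: Optional[str] = None  # optional photographing device identification info

# The person identification information storage unit 18 can then be a simple list of records.
storage_18 = [
    MonitoringRecord(1, b"face-feature-a", 0.0, "camera 1"),
    MonitoringRecord(2, b"face-feature-a", 30.0, "camera 2"),
]
```

Here two records carry the same identification feature, the situation in which the matching unit 15 would later judge them to be the same person.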
- The person information management unit 16 stores the information collated by the person identification information matching unit 15 in the person information storage unit 19. In response to a request from the long-time stayer estimation unit 14, it also extracts the necessary information from the person information storage unit 19 and notifies the long-time stayer estimation unit 14 of it.
- The person information storage unit 19 stores each identified monitoring target and its shooting times.
- FIG. 3 is an explanatory diagram illustrating an example of information stored in the person information storage unit 19.
- The example shown in FIG. 3 shows the contents stored in the person information storage unit 19 when the monitoring target is a person.
- In this example, the person information storage unit 19 stores, for each person, the identifiers of the monitoring target shooting information from which the person was extracted (person image IDs), the number of times the person has been shot, the earliest time at which the person was shot (hereinafter referred to as the earliest shooting time), and the latest such time (hereinafter referred to as the latest shooting time), in association with one another.
- FIG. 4 is an explanatory diagram illustrating another example of information stored in the person information storage unit 19.
- The example shown in FIG. 4 is also an example of the contents stored in the person information storage unit 19 when the monitoring target is a person.
- In this example, the person information storage unit 19 stores the identifier of the monitoring target shooting information from which the person was extracted (person image ID) in association with the shooting time. Further, as illustrated in FIG. 4, the person information storage unit 19 may store, in association with each person image ID, the name of the camera that acquired the video (for example, the photographing device identification information) and the likelihood of the monitoring target estimated from the acquired video.
- The person identification information matching unit 15 collates monitoring target identification information and identifies monitoring targets estimated to be the same. For example, when a face image of the monitoring target is extracted as the monitoring target identification information, the person identification information matching unit 15 may collate face images and identify persons estimated to be the same.
- Specifically, the person identification information matching unit 15 collates the monitoring target identification information stored in the person information storage unit 19 against the monitoring target identification information included in the monitoring target shooting information stored in the person identification information storage unit 18, and determines whether the same person is already stored in the person information storage unit 19.
- First, consider the case where the person information storage unit 19 stores information in the form illustrated in FIG. 3.
- When the collation finds that the same person is already stored, the person identification information matching unit 15 compares the shooting time included in the new monitoring target shooting information with the earliest and latest shooting times stored for that person.
- If the shooting time is earlier than the earliest shooting time, the person identification information matching unit 15 requests the person information management unit 16 to update the person's earliest shooting time with the shooting time. If the shooting time is later than the latest shooting time, the person identification information matching unit 15 requests the person information management unit 16 to update the person's latest shooting time with the shooting time. If the shooting time lies between the earliest and latest shooting times (inclusive), no update of the times is performed. In every case, the person identification information matching unit 15 then requests the person information management unit 16 to increase the number of times the person has been shot by one.
- If the person is not yet stored, the person identification information matching unit 15 requests the person information management unit 16 to add the person to the person information storage unit 19, to set both the earliest and the latest shooting time to the shooting time, and to set the number of times the person has been shot to 1.
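The update logic above can be sketched as follows. This is a simplified illustration that folds the matching unit's requests to the person information management unit 16 into direct dictionary updates; the function and field names are assumptions for the sketch.

```python
def update_person_info(person_info, person_id, shooting_time):
    """Update earliest/latest shooting time and shot count for a matched person.

    person_info maps a person identifier to a dict with the keys
    'earliest', 'latest' and 'count' (illustrative names, cf. FIG. 3).
    """
    entry = person_info.get(person_id)
    if entry is None:
        # First sighting: both earliest and latest are this shooting time, count is 1.
        person_info[person_id] = {"earliest": shooting_time,
                                  "latest": shooting_time,
                                  "count": 1}
        return
    if shooting_time < entry["earliest"]:
        entry["earliest"] = shooting_time      # earlier than the earliest: update it
    elif shooting_time > entry["latest"]:
        entry["latest"] = shooting_time        # later than the latest: update it
    # Otherwise the time falls inside [earliest, latest] and no time is updated.
    entry["count"] += 1                        # the shot count always increases by one

info = {}
update_person_info(info, "u1", 100.0)
update_person_info(info, "u1", 50.0)
update_person_info(info, "u1", 120.0)
# info["u1"] now records earliest=50.0, latest=120.0, count=3
```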
- Next, consider the case where the person information storage unit 19 stores information in the form illustrated in FIG. 4.
- When the collation finds that the same person is already stored, the person identification information matching unit 15 requests the person information management unit 16 to store, in the person information storage unit 19, the person image ID and the shooting time in association with the person ID of the person determined to be the same.
- If the person is not yet stored, the person identification information matching unit 15 allocates a new person ID and requests the person information management unit 16 to store, in the person information storage unit 19, the person image ID and the shooting time in association with that person ID.
- In either case, the person identification information matching unit 15 may also request the person information management unit 16 to store in the person information storage unit 19, together with the shooting time, the name of the camera that captured the video (for example, the photographing device identification information).
- The person identification information matching unit 15 may also calculate the likelihood of the monitoring targets being the same when collating monitoring target identification information. For example, collating a certain face image might yield a probability of 0.8 that it shows Mr. A, 0.3 that it shows Mr. B, and 0.2 that it shows Mr. C. In this case, the person identification information matching unit 15 may request the person information management unit 16 to store, for each person ID for which a likelihood was determined, the person image ID, the shooting time, and the calculated probability in association with one another in the person information storage unit 19.
- The long-time stayer estimation unit 14 identifies a desired object from among the shot monitoring targets using the shooting times included in the monitoring target shooting information of targets estimated to be the same. Specifically, the long-time stayer estimation unit 14 identifies as an object a monitoring target for which either the frequency of monitoring target shooting information whose shooting time falls within a predetermined period is equal to or greater than a predetermined threshold, or the time width of the shooting times within that period is equal to or longer than a predetermined length. This predetermined period is also described below as the analysis time width.
- Here, the time width of the shooting times means the difference between two shooting times arbitrarily selected within the predetermined period.
- In other words, the time width of the shooting times can be referred to as the width between arbitrary shooting times among those at which the monitoring target was shot within the predetermined period.
- Similarly, the frequency of monitoring target shooting information whose shooting time falls within a predetermined period can be said to be the frequency at which the monitoring target was shot within that period.
- A monitoring target that meets these conditions is identified as the object.
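A minimal sketch of the two identification criteria just described (shooting frequency within the analysis time width, or time width of the shooting times) might look like this; the function name and the concrete thresholds are illustrative assumptions.

```python
def is_unnatural_stayer(shooting_times, period_start, period_end,
                        freq_threshold, stay_threshold):
    """Return True if a monitoring target qualifies as an object.

    shooting_times: shooting times (seconds) recorded for one target.
    A target qualifies when, within [period_start, period_end] (the
    analysis time width), either it was shot at least freq_threshold
    times, or the width between its earliest and latest shooting time
    is at least stay_threshold (the long-term stay determination time).
    """
    in_period = [t for t in shooting_times if period_start <= t <= period_end]
    if len(in_period) >= freq_threshold:
        return True
    if in_period and (max(in_period) - min(in_period)) >= stay_threshold:
        return True
    return False

# Example: a target shot three times over 40 minutes inside a 3-hour window
# exceeds a 30-minute long-term stay determination time.
times = [0, 1200, 2400]
print(is_unnatural_stayer(times, 0, 10800, freq_threshold=10, stay_threshold=1800))  # True
```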
- As an example, the case of identifying a person who stays unnaturally long inside a station will be described.
- The place for identifying unnatural long-term stayers is not limited to station premises.
- Any range where a long stay would be judged unnatural, such as a commercial facility or a predetermined area of a building, is conceivable.
- First, what makes a long stay unnatural is defined. For example, near a meeting place or a bench, staying in one spot for a long time is not considered unnatural. However, a person who keeps moving around in such a place is considered to be behaving unnaturally. Thus a person shot by a plurality of cameras with a frequency exceeding a predetermined value can also be said to be an unnatural long-term stayer.
- The long-time stayer estimation unit 14 estimates unnatural long-term stayers defined in this way and extracts persons who appear suspicious as a greylist.
- An unnatural long-term stayer is not limited to monitoring targets matching the above definition. For example, if cameras are installed at the entrances and exits, a person who is seen by only one of those cameras is also considered to be behaving unnaturally, so such a person may likewise be defined as an unnatural long-term stayer.
- Conversely, the long-time stayer estimation unit 14 may exclude from the objects a monitoring target whose time width of shooting times is too long.
- An appropriate analysis time width may be determined in advance according to the nature of the monitoring target.
- The long-time stayer estimation unit 14 may identify not only persons but also objects, such as an automobile stopped unnaturally long. In this case, an object that stays unnaturally long can be said to be an unnatural long-term staying object.
- Next, methods by which the long-time stayer estimation unit 14 identifies an unnatural long-term stayer or an unnatural long-term staying object as the desired object from among the monitoring targets will be described.
- First, the long-time stayer estimation unit 14 may identify an object based on the time width of shooting times within a predetermined period.
- Specifically, the long-time stayer estimation unit 14 may identify as an object a monitoring target for which the time width between the earliest and the latest shooting time within the predetermined period is equal to or greater than a predetermined threshold defined by a determination condition.
- Below, this predetermined threshold may be referred to as the long-term stay determination time.
- FIG. 5 is an explanatory diagram showing an example of an operation for determining the stay time of the monitoring target.
- In FIG. 5, the persons u1 to u10 to be monitored, identified from the video captured by the two video acquisition units 11a and 11b, are plotted along the time axis.
- The period between the two dotted lines is the analysis time width, and the period between the long broken lines is the long-term stay determination time.
- For the analysis time width, a range over which objects are identified, such as 3 hours or 1 day, is specified; for the long-term stay determination time, a duration used to judge an unnatural stay, such as 30 minutes, is set.
- For example, the time width of the shooting times of person u3 (likewise person u4) is shorter than the long-term stay determination time, so the long-time stayer estimation unit 14 does not determine person u3 (person u4) to be an unnatural long-term stayer.
- Person u5 (likewise person u8) was captured by different video acquisition units, but the width of the shooting times is still shorter than the long-term stay determination time, so the long-time stayer estimation unit 14 does not determine person u5 (person u8) to be an unnatural long-term stayer either.
- On the other hand, the long-time stayer estimation unit 14 determines person u2 (likewise persons u6, u7, and u9) to be an unnatural long-term stayer, because the width of the shooting times reaches the long-term stay determination time.
- One determination condition for the long-term stay determination time may be defined for the whole system regardless of the number of video acquisition units 11.
- In this case, the determination condition determines the time a monitoring target stays in the entire facility where the video acquisition units 11 are installed.
- Alternatively, a determination condition may be defined per video acquisition unit 11; it then determines the stay time of a monitoring target within the shooting range of that video acquisition unit 11.
- A determination condition may also be defined between video acquisition units 11; it then determines the stay time between shooting ranges when a monitoring target moves between the shooting ranges of a plurality of video acquisition units 11.
- FIG. 6 is an explanatory diagram showing an example of a determination condition that defines the time width.
- The example illustrated in FIG. 6 shows time-width thresholds determined for each video acquisition unit 11 and time-width thresholds determined between video acquisition units 11.
- It shows, for example, that the stay time allowed within the range captured by camera 1 is 10 seconds, and that the stay time allowed between the ranges captured by camera 1 and camera 2 is 50 seconds.
- The long-time stayer estimation unit 14 may also identify as an object a monitoring target for which, among the time widths of shooting times within the predetermined period, the time width between consecutive shooting times is equal to or greater than a predetermined threshold defined by the determination condition. This is because a person for whom the time between being shot by different video acquisition units 11 is too long is considered an unnatural long-term stayer.
- One determination condition defining the time width between consecutive shooting times may be defined for the whole system regardless of the number of video acquisition units 11. In this case, the determination condition determines the time a monitoring target takes to move within the facility where the video acquisition units 11 are installed.
- A determination condition defined per video acquisition unit 11 determines the stay time allowed according to the nature of that unit's shooting range.
- A determination condition defined between video acquisition units 11 determines the time a monitoring target takes to move between the shooting ranges of a plurality of video acquisition units 11. This is because, for example, a person whose time width between shots by video acquisition units 11 installed close to each other is too long is considered an unnatural long-term stayer.
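The per-pair determination conditions of FIG. 6 can be sketched as a lookup table plus a check over consecutive sightings. The concrete threshold values and all names below are illustrative assumptions.

```python
# Time-width thresholds (seconds) per camera and per camera pair,
# modelled loosely on FIG. 6; the concrete values are illustrative.
PAIR_THRESHOLDS = {
    ("camera 1", "camera 1"): 10,   # stay allowed within camera 1's range
    ("camera 1", "camera 2"): 50,   # movement allowed between camera 1 and camera 2
    ("camera 2", "camera 3"): 40,
}

def exceeds_pair_threshold(cam_a, time_a, cam_b, time_b):
    """Check whether two consecutive sightings violate the stay condition."""
    key = (cam_a, cam_b) if (cam_a, cam_b) in PAIR_THRESHOLDS else (cam_b, cam_a)
    threshold = PAIR_THRESHOLDS.get(key)
    if threshold is None:
        return False  # no determination condition is defined for this pair
    return abs(time_b - time_a) >= threshold

# FIG. 7 walk-through: person image ID1 on camera 1 at 0:00:00,
# person image ID2 on camera 2 at 0:00:30, ID3 on camera 3 at 0:00:45.
print(exceeds_pair_threshold("camera 1", 0, "camera 2", 30))   # False: 30 s < 50 s
print(exceeds_pair_threshold("camera 2", 30, "camera 3", 45))  # False: 15 s < 40 s
```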
- FIG. 7 is an explanatory diagram illustrating a specific example of information stored in the person information storage unit 19.
- In this example, the three video acquisition units 11 are referred to as camera 1, camera 2, and camera 3.
- The conditions illustrated in FIG. 6 are assumed to be set as the determination conditions defining the time widths.
- Suppose camera 1 captured the video identified by person image ID1 at 0:00:00, and camera 2 captured the video identified by person image ID2 at 0:00:30.
- In the example shown in FIG. 6, the time width condition between camera 1 and camera 2 is 50 seconds. Since the difference between the two shooting times is 30 seconds, the long-time stayer estimation unit 14 does not identify this monitoring target as an object.
- Suppose camera 3 then captured the video identified by person image ID3 at 0:00:45.
- In the example shown in FIG. 6, the time width condition between camera 2 and camera 3 is 40 seconds. Since the difference between the two shooting times is 15 seconds, the long-time stayer estimation unit 14 again does not identify this monitoring target as an object.
- Conversely, when the difference between consecutive shooting times meets or exceeds the time width set as the condition, the long-time stayer estimation unit 14 identifies the monitoring target as an object.
- The determination condition may also be generated dynamically based on the shooting times of monitoring targets that have not been identified as objects.
- For example, the person information management unit 16 may calculate the long-term stay determination time by taking statistics of the stay times of monitoring targets whose stay times are shorter than the current long-term stay determination time.
- Specifically, the person information management unit 16 may calculate, from the monitoring target information taken as the statistical population, the average stay time (for example, based on the maximum interval between shooting times, over the whole system or per video acquisition unit 11) and use a constant multiple of it (for example, twice the average) as the long-term stay determination time.
- Alternatively, the person information management unit 16 may calculate the average and variance of the stay times over the statistical population, over the whole system or per video acquisition unit 11, and use as the long-term stay determination time the time-width threshold at which a certain percentage of people (for example, 5%) would be determined to be long-term stayers.
- The method of calculating the long-term stay determination time is not limited to these.
- For example, the moving speed of a monitoring target in the video captured by one video acquisition unit 11 may be measured, and the long-term stay determination time may be calculated from that moving speed and the distance between video acquisition units 11.
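The constant-multiple-of-average scheme above can be sketched in a few lines. The multiplier of 2 and all names are illustrative assumptions; the percentile-based variant would substitute a statistic derived from the mean and variance.

```python
def long_stay_threshold_from_stats(stay_times, multiplier=2.0):
    """Derive the long-term stay determination time from observed stays.

    Takes the stay times (seconds) of monitoring targets that were NOT
    identified as objects and returns a constant multiple of their
    average, one of the dynamic schemes described in the text.
    """
    if not stay_times:
        raise ValueError("no statistics available")
    return multiplier * (sum(stay_times) / len(stay_times))

# Ordinary visitors observed staying 5, 10 and 15 minutes:
# the dynamic threshold becomes twice the 10-minute average.
observed = [300, 600, 900]
print(long_stay_threshold_from_stats(observed))  # 1200.0
```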
- Next, the long-time stayer estimation unit 14 may identify an object based on the frequency with which a monitoring target is shot within a predetermined period.
- Specifically, the long-time stayer estimation unit 14 may identify as an object a monitoring target for which the number of pieces of monitoring target shooting information whose shooting time falls within the predetermined period is equal to or greater than a predetermined threshold. This is because a monitoring target shot too frequently is considered an unnatural long-term stayer.
- FIG. 8 is an explanatory diagram illustrating an example of an operation in which a plurality of video acquisition units 11 capture a monitoring target.
- In FIG. 8, the video acquisition units 11a, 11b, 11c, and 11d shoot areas A, B, C, and D, respectively.
- Person identification information for each person is extracted and stored in the person identification information storage unit 18.
- Person u11 appears in the video of each of areas A, B, C, and D, so four pieces of person identification information for person u11 are registered in the person identification information storage unit 18. The long-time stayer estimation unit 14 therefore identifies person u11 as an object from among the monitored persons based on this registration frequency.
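The frequency-based identification just described amounts to counting registrations per person across the cameras. A minimal sketch, with an illustrative frequency threshold of 4 and assumed data shapes, follows.

```python
from collections import Counter

def shooting_frequency(registrations):
    """Count how often each person appears in the stored identification info.

    registrations: list of (person_id, camera_name) pairs, one per piece
    of person identification information registered in storage unit 18.
    """
    return Counter(pid for pid, _cam in registrations)

# Person u11 is captured in all four areas A, B, C and D (cf. FIG. 8),
# while persons u12 and u13 each appear only once.
regs = [("u11", "A"), ("u11", "B"), ("u11", "C"), ("u11", "D"),
        ("u12", "A"), ("u13", "C")]
freq = shooting_frequency(regs)
suspects = [pid for pid, n in freq.items() if n >= 4]
print(suspects)  # ['u11']
```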
- The identified object is output by the result output unit 17, for example to the output result storage unit 20.
- the determination condition that defines the frequency condition may be determined as a whole regardless of the number of video acquisition units 11. In this case, it can be said that this determination condition is a condition for determining the frequency with which the monitoring target is imaged for the entire facility where the video acquisition unit 11 is installed.
- the determination condition that defines the frequency condition may be determined according to each video acquisition unit 11. In this case, it can be said that this determination condition is a condition for determining the imaging frequency of the monitoring target in the imaging range of each video acquisition unit 11.
- the determination condition that defines the frequency condition may be determined according to a group in which arbitrary video acquisition units 11 are grouped. In this case, it can be said that this determination condition is a condition for determining the imaging frequency of the monitoring target in the imaging range of the video acquisition unit 11 belonging to the group.
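The per-group variant above could be sketched as follows, assuming a hypothetical grouping of cameras and per-group thresholds (none of these names or values come from the patent):

```python
from collections import defaultdict

# Hypothetical configuration: which group each camera belongs to,
# and the shooting-frequency threshold applied to each group.
CAMERA_GROUPS = {"11a": "platform", "11b": "platform",
                 "11c": "gate", "11d": "gate"}
GROUP_THRESHOLDS = {"platform": 5, "gate": 3}

def identify_by_group_frequency(records):
    """records: (target_id, camera_id, shooting_time) tuples.
    Flags a target when its count within any single camera group
    reaches that group's threshold."""
    counts = defaultdict(int)
    for target_id, camera_id, _t in records:
        counts[(target_id, CAMERA_GROUPS[camera_id])] += 1
    return {tid for (tid, group), n in counts.items()
            if n >= GROUP_THRESHOLDS[group]}
```

Setting a single threshold for all cameras recovers the facility-wide condition; one entry per camera recovers the per-device condition.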
- the long-term resident estimation unit 14 uses the frequency calculated based on the likelihood.
- the long-time stay estimator 14 may perform the estimation process only on monitoring target shooting information whose likelihood is equal to or greater than a predetermined value. The same applies when the long-time resident estimation unit 14 specifies a target based on the time width of the shooting times within a predetermined period: for a given monitoring target, the long-time stay estimator 14 may calculate the time width only over the shooting times included in monitoring target shooting information whose likelihood is equal to or greater than the predetermined value.
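One way to read this likelihood filter (a sketch; the record format and names are assumptions) is to discard low-confidence matches before measuring the shooting-time width of a monitoring target:

```python
def time_width_with_likelihood(records, min_likelihood):
    """records: (shooting_time, likelihood) pairs for one monitoring
    target. Only matches at or above `min_likelihood` contribute to
    the earliest-to-latest time width; returns 0.0 if none qualify."""
    times = [t for t, p in records if p >= min_likelihood]
    return max(times) - min(times) if times else 0.0

# The uncertain sighting at t=100 (likelihood 0.4) is ignored:
width = time_width_with_likelihood([(10, 0.9), (100, 0.4), (60, 0.8)], 0.5)
# width == 50
```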
- the long-time stay estimator 14 lists the images of the identified targets, the photographing device identification information, the photographing times, and the like, and notifies the result output unit 17 of the list.
- the result output unit 17 outputs the identified object by an arbitrary method.
- the result output unit 17 may store the output result in the output result storage unit 20.
- the result output unit 17 may extract, from the information stored in the output result storage unit 20, monitoring targets that appear in common across different time zones or different days. Since such a target warrants particular attention, the result output unit 17 may output such a person as, for example, a habitual unnatural long-term stayer.
- the person identification information analysis unit 12, the person identification information management unit 13, the long-time resident estimation unit 14, the person identification information matching unit 15, the person information management unit 16, and the result output unit 17 are realized by, for example, a CPU of a computer operating according to a program (the object specifying program).
- the program may be stored in a storage unit (not shown) of the object identification device, and the CPU may read the program and, according to it, operate as the person identification information analysis unit 12, the person identification information management unit 13, the long-time resident estimation unit 14, the person identification information matching unit 15, the person information management unit 16, and the result output unit 17.
- the person identification information analysis unit 12, the person identification information management unit 13, the long-time resident estimation unit 14, the person identification information matching unit 15, the person information management unit 16, and the result output unit 17 may each be realized by dedicated hardware.
- the person identification information storage unit 18 and the person information storage unit 19 are realized by, for example, a magnetic disk.
- FIG. 9 is a flowchart showing an operation example of the object specifying device of the present embodiment.
- the video acquisition unit 11 acquires a video in the shooting range (step S11), and inputs the shot video to the person identification information analysis unit 12.
- the person identification information analyzing unit 12 detects the monitoring target from the input video.
- the person identification information analyzing unit 12 extracts the identification information of the monitoring target (step S12), and stores the extracted monitoring target identification information in the person identification information storage unit 18 (step S13).
- the person identification information collation unit 15 collates the monitoring target identification information stored in the person identification information storage unit 18 with the monitoring target identification information stored in the person information storage unit 19 (step S14).
- when a matching monitoring target is found, the person information management unit 16 updates the shooting time of that monitoring target (step S16).
- otherwise, the person information management unit 16 stores the monitoring target as a new monitoring target in the person information storage unit 19 together with the photographing time (step S17).
- the long-time stay estimator 14 identifies a desired target from among the monitoring targets by using the shooting times included in the monitoring target shooting information of the monitoring targets identified as the same person (step S18).
- the long-time stay estimator 14 identifies the object based on the time width and frequency of the shooting time.
- the result output unit 17 outputs the identified target object (step S19).
- the person identification information analysis unit 12 analyzes the monitoring targets shown in each video and extracts monitoring target shooting information including the monitoring target identification information and the shooting time, and the person identification information collation unit 15 collates the monitoring target identification information to identify monitoring targets estimated to be the same. Then, the long-time resident estimation unit 14 specifies a desired target from among the monitoring targets using the shooting times included in the monitoring target shooting information of the identified monitoring targets.
- the long-time resident estimation unit 14 identifies, as a target, a monitoring target for which the frequency of monitoring target shooting information whose shooting time falls within a predetermined period is equal to or higher than a predetermined threshold, or for which the time width of the shooting times within the predetermined period is equal to or longer than a predetermined period. Therefore, even when the range to be monitored is wide or congested, an object that stays unnaturally long within that range can be identified from among the monitoring targets.
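The combined rule can be sketched as a single predicate (an illustrative reading with assumed names, not the patent's implementation): a target is flagged when either the shooting frequency within the analysis window or the earliest-to-latest time width reaches its threshold.

```python
def is_long_stayer(times, analysis_window, freq_threshold, stay_threshold):
    """times: shooting times of one monitoring target.
    Flags the target if, within `analysis_window`, it was photographed
    at least `freq_threshold` times OR the span from its earliest to
    its latest shooting time is at least `stay_threshold`."""
    start, end = analysis_window
    in_window = [t for t in times if start <= t <= end]
    if len(in_window) >= freq_threshold:
        return True
    return bool(in_window) and max(in_window) - min(in_window) >= stay_threshold

# Seen only twice, but an hour apart: the time-width branch flags it.
flagged = is_long_stayer([0, 3600], (0, 7200),
                         freq_threshold=10, stay_threshold=1800)  # True
```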
- the person identification information analysis unit 12 obtains, from videos captured by one or more video acquisition units 11 installed in a station, information that can identify a face or a person.
- the person identification information collating unit 15 calculates the time width and frequency with which the same person was photographed, by collating person-identifying information obtained at different times and from a plurality of cameras. Therefore, even when a suspicious-person list is not provided in advance, a list of shooting times can be generated for each monitoring target.
- the long-time resident estimation unit 14 uses this time width and frequency to find a long-term stayer whose presence in the facility is unnatural, unlike a general user. Therefore, cameras need not be installed so as to cover the entire monitored area, and even if part of the monitoring targets cannot be photographed in a crowded situation, a target can still be identified from the other shooting times.
- FIG. 10 is a block diagram showing an outline of the object specifying device according to the present invention.
- the object identification device according to the present invention includes a monitoring target collating means 71 (for example, the person identification information matching unit 15) that collates monitoring targets appearing in videos captured by one or more imaging devices (for example, the video acquisition unit 11) and identifies monitoring targets estimated to be the same, and an object specifying means 72 (for example, the long-time resident estimation unit 14) that specifies a desired target from among the identified monitoring targets using their shooting times.
- the object specifying unit 72 may specify, as a target, an identified monitoring target whose shooting times occur at or above a predetermined frequency within a predetermined period (for example, the analysis time width), or whose shooting-time width within the predetermined period is equal to or longer than a predetermined period (for example, the long-term stay determination time).
- the object specifying means 72 may specify, as a target, an identified monitoring target for which, among the shooting times within a predetermined period, the time width between the earliest shooting time and the latest shooting time is equal to or greater than a predetermined threshold defined by the determination condition (for example, the long-term stay determination time).
- the object specifying unit 72 may specify the target based on a determination condition (for example, the determination condition illustrated in FIG. 6) that defines, for each imaging device, the condition on the time width between the earliest shooting time and the latest shooting time.
- the object specifying device may include a determination condition generating means (for example, the person information management unit 16) that generates, based on the shooting times of monitoring targets that were not specified as targets, a determination condition defining the condition on the time width between the earliest shooting time and the latest shooting time.
- the object specifying unit 72 may specify, as a target, an identified monitoring target for which, among the shooting times within a predetermined period, the time width between successive shooting times is equal to or greater than a predetermined threshold defined by the determination condition.
- the object specifying means 72 may specify, as a target, an identified monitoring target whose time width between successive shooting times satisfies the determination condition between the imaging devices.
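One reading of the successive-shooting-time condition (the translation is ambiguous, so this is only a hedged sketch with assumed names) compares each gap between consecutive shooting times against the threshold given by the determination condition:

```python
def exceeds_successive_width(times, width_threshold):
    """times: shooting times of one identified monitoring target.
    Returns True if any time width between successive shooting
    times is at least `width_threshold`."""
    ts = sorted(times)
    return any(b - a >= width_threshold for a, b in zip(ts, ts[1:]))
```

Depending on the determination condition, the same helper could be applied per pair of imaging devices, matching the per-device variant described in the text.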
- the object specifying means 72 may specify the target based on a determination condition that defines, for each imaging device, the frequency with which the shooting times of the identified monitoring target fall within a predetermined period.
- the object specifying unit 72 may specify the target based on a determination condition that defines that frequency for each predetermined group of imaging devices.
- the monitoring target collating unit 71 may calculate the likelihood of monitoring targets estimated to be the same, and the target specifying unit 72 may specify the target using a frequency calculated based on the likelihood.
- the monitoring target collating unit 71 may collate the face image extracted from the video captured by the photographing apparatus and specify the monitoring target estimated to be the same person.
- the monitoring target collating unit 71 may extract information (monitoring target identification information) for identifying each monitoring target before collating the monitoring target shown in the video. Specifically, the monitoring target collating unit 71 may execute processing performed by the person identification information analyzing unit 12 in the first embodiment. However, when the monitoring target can be verified without using the monitoring target identification information, the explicit extraction process of the monitoring target identification information is not necessarily required.
- the target object specifying device including the configuration for extracting the monitoring target identification information will be described.
- FIG. 11 is a block diagram showing another outline of the object specifying device according to the present invention.
- the object specifying device according to the present invention includes: a monitoring target shooting information analysis means 81 (for example, the person identification information analysis unit 12) that analyzes monitoring targets (for example, persons or cars) appearing in videos captured by one or more imaging devices (for example, the video acquisition unit 11) and extracts monitoring target shooting information including monitoring target identification information (for example, a face image or a license plate image) used to identify the monitoring target and the shooting time at which the monitoring target was shot; a monitoring target collating means 82 (for example, the person identification information matching unit 15) that collates the monitoring target identification information to identify monitoring targets estimated to be the same; and an object specifying means 83 (for example, the long-time resident estimation unit 14) that specifies a desired object (for example, an unnatural long-term stayer) from among the identified monitoring targets using the shooting times included in their monitoring target shooting information.
- the object specifying unit 83 may specify, as a target, a monitoring target for which the frequency of monitoring target shooting information whose shooting time falls within a predetermined period (for example, the analysis time width) is equal to or higher than a predetermined threshold, or for which the time width of the shooting times within the predetermined period is equal to or longer than a predetermined period (for example, the long-term stay determination time).
- the object specifying unit 83 may specify, as a target, a monitoring target for which, among the shooting times within a predetermined period, the time width between the earliest shooting time and the latest shooting time is equal to or greater than a predetermined threshold defined by the determination condition (for example, the long-term stay determination time).
- the monitoring target shooting information analyzing unit 81 may extract monitoring target shooting information including shooting device identification information used for identifying a shooting device that has shot the monitoring target.
- the object specifying unit 83 may specify the target based on a determination condition (for example, the determination condition illustrated in FIG. 6) that defines, for each imaging device, the condition on the time width between the earliest shooting time and the latest shooting time.
- the object specifying device may include a determination condition generating means (for example, the person information management unit 16) that generates, based on the shooting times of monitoring targets that were not specified as targets, a determination condition defining the condition on the time width between the earliest shooting time and the latest shooting time.
- the object specifying unit 83 may specify, as a target, a monitoring target for which, among the shooting times within a predetermined period, the time width between successive shooting times is equal to or greater than a predetermined threshold defined by the determination condition.
- the monitoring target shooting information analyzing unit 81 may extract monitoring target shooting information including shooting device identification information used for identifying a shooting device that has shot the monitoring target.
- the object specifying unit 83 may specify, as a target, a monitoring target whose time width between successive shooting times in the monitoring target shooting information satisfies the determination condition between the imaging devices.
- the monitoring target shooting information analyzing unit 81 may extract monitoring target shooting information including shooting device identification information used for identifying a shooting device that has shot the monitoring target.
- the object specifying unit 83 may specify the object based on a determination condition that defines the frequency condition of the monitoring target shooting information including the shooting time within a predetermined period for each shooting apparatus.
- the object specifying unit 83 may specify the target based on a determination condition that defines the frequency condition of the monitoring target shooting information including shooting times within a predetermined period for each predetermined group of imaging devices.
- the monitoring target collating means 82 may calculate the likelihood (probability) that monitoring targets are the same. The object specifying unit 83 may then specify the target using a frequency calculated based on the likelihood.
- the monitoring target photographing information analysis means 81 may extract a face image of the monitoring target as the monitoring target identification information.
- the monitoring target collating means 82 may collate the face images and identify monitoring targets estimated to be the same person. With such a configuration, a person who stays unnaturally long within the monitored range can be identified.
- the present invention is preferably applied to, for example, a video analysis system in which a person in a specific facility is photographed with a surveillance camera, and the photographed video is analyzed to automatically find long-term visitors.
Description
12 person identification information analysis unit
13 person identification information management unit
14 long-time stayer estimation unit
15 person identification information matching unit
16 person information management unit
17 result output unit
18 person identification information storage unit
19 person information storage unit
20 output result storage unit
u1 to u11 persons
Claims (15)
- 1. An object identification device comprising: monitoring target collating means for collating monitoring targets appearing in videos captured by one or more imaging devices to identify monitoring targets estimated to be the same; and object specifying means for specifying a desired object from among the identified monitoring targets by using shooting times of the identified monitoring targets.
- 2. The object identification device according to claim 1, wherein the object specifying means specifies, as the object, an identified monitoring target photographed at or above a predetermined frequency within a predetermined period, or whose time width of shooting times within a predetermined period is equal to or longer than a predetermined period.
- 3. The object identification device according to claim 1 or 2, wherein the object specifying means specifies, as the object, an identified monitoring target for which, among the time widths of shooting times within a predetermined period, the time width between the earliest shooting time and the latest shooting time is equal to or greater than a predetermined threshold defined by a determination condition.
- 4. The object identification device according to claim 3, wherein the object specifying means specifies the object based on a determination condition that defines, for each imaging device, a condition on the time width between the earliest shooting time and the latest shooting time.
- 5. The object identification device according to claim 3 or 4, further comprising determination condition generating means for generating, based on the shooting times of monitoring targets not specified as objects, a determination condition that defines a condition on the time width between the earliest shooting time and the latest shooting time.
- 6. The object identification device according to claim 1 or 2, wherein the object specifying means specifies, as the object, an identified monitoring target for which, among the time widths of shooting times within a predetermined period, the time width between successive shooting times is equal to or greater than a predetermined threshold defined by a determination condition.
- 7. The object identification device according to claim 6, wherein the object specifying means specifies, as the object, an identified monitoring target whose time width between successive shooting times satisfies a determination condition between imaging devices.
- 8. The object identification device according to claim 1 or 2, wherein the object specifying means specifies the object based on a determination condition that defines, for each imaging device, the frequency with which the shooting times of the identified monitoring target fall within a predetermined period.
- 9. The object identification device according to claim 8, wherein the object specifying means specifies the object based on a determination condition that defines the frequency for each predetermined group of imaging devices.
- 10. The object identification device according to claim 8 or 9, wherein the monitoring target collating means calculates a likelihood that monitoring targets are the same, and the object specifying means specifies the object using a frequency calculated based on the likelihood.
- 11. The object identification device according to any one of claims 1 to 10, wherein the monitoring target collating means collates face images extracted from videos captured by the imaging devices to identify monitoring targets estimated to be the same person.
- 12. An object identification method comprising: collating monitoring targets appearing in videos captured by one or more imaging devices to identify monitoring targets estimated to be the same; and specifying a desired object from among the identified monitoring targets by using their shooting times.
- 13. The object identification method according to claim 12, wherein an identified monitoring target photographed at or above a predetermined frequency within a predetermined period, or whose time width of shooting times within a predetermined period is equal to or longer than a predetermined period, is specified as the object.
- 14. An object identification program for causing a computer to execute: a monitoring target collating process of collating monitoring targets appearing in videos captured by one or more imaging devices to identify monitoring targets estimated to be the same; and an object specifying process of specifying a desired object from among the identified monitoring targets by using their shooting times.
- 15. The object identification program according to claim 14, wherein, in the object specifying process, an identified monitoring target photographed at or above a predetermined frequency within a predetermined period, or whose time width of shooting times within a predetermined period is equal to or longer than a predetermined period, is specified as the object.
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480016621.7A CN105075248A (zh) | 2013-03-29 | 2014-01-31 | 对象物体标识设备、对象物体标识方法和对象物体标识程序 |
EP14774192.0A EP2981075A1 (en) | 2013-03-29 | 2014-01-31 | Target object identifying device, target object identifying method and target object identifying program |
JP2015507981A JP6406246B2 (ja) | 2013-03-29 | 2014-01-31 | 対象物特定装置、対象物特定方法および対象物特定プログラム |
US14/765,621 US10430666B2 (en) | 2013-03-29 | 2014-01-31 | Target object identifying device, target object identifying method and target object identifying program |
SG11201508085QA SG11201508085QA (en) | 2013-03-29 | 2014-01-31 | Target object identifying device, target object identifying method and target object identifying program |
BR112015020989A BR112015020989A2 (pt) | 2013-03-29 | 2014-01-31 | dispositivo de identificação de objeto-alvo, método de identificação de objeto-alvo e programa de identificação de objeto-alvo |
AU2014240718A AU2014240718A1 (en) | 2013-03-29 | 2014-01-31 | Target object identifying device, target object identifying method and target object identifying program |
ZA2015/06437A ZA201506437B (en) | 2013-03-29 | 2015-09-02 | Target object identifying device, target object identifying method and target object identifying program |
US16/502,180 US11210527B2 (en) | 2013-03-29 | 2019-07-03 | Target object identifying device, target object identifying method and target object identifying program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-072178 | 2013-03-29 | ||
JP2013072178 | 2013-03-29 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/765,621 A-371-Of-International US10430666B2 (en) | 2013-03-29 | 2014-01-31 | Target object identifying device, target object identifying method and target object identifying program |
US16/502,180 Continuation US11210527B2 (en) | 2013-03-29 | 2019-07-03 | Target object identifying device, target object identifying method and target object identifying program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014155922A1 true WO2014155922A1 (ja) | 2014-10-02 |
Family
ID=51622949
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/000523 WO2014155922A1 (ja) | 2013-03-29 | 2014-01-31 | 対象物特定装置、対象物特定方法および対象物特定プログラム |
Country Status (9)
Country | Link |
---|---|
US (2) | US10430666B2 (ja) |
EP (1) | EP2981075A1 (ja) |
JP (1) | JP6406246B2 (ja) |
CN (1) | CN105075248A (ja) |
AU (1) | AU2014240718A1 (ja) |
BR (1) | BR112015020989A2 (ja) |
SG (1) | SG11201508085QA (ja) |
WO (1) | WO2014155922A1 (ja) |
ZA (1) | ZA201506437B (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016166950A1 (en) * | 2015-04-14 | 2016-10-20 | Sony Corporation | Image processing apparatus, image processing method, and image processing system |
WO2017163955A1 (ja) | 2016-03-23 | 2017-09-28 | 日本電気株式会社 | 監視システム、画像処理装置、画像処理方法およびプログラム記録媒体 |
JP2017191621A (ja) * | 2016-03-30 | 2017-10-19 | 日本電気株式会社 | 解析装置、解析方法及びプログラム |
WO2018096787A1 (ja) * | 2016-11-22 | 2018-05-31 | パナソニックIpマネジメント株式会社 | 人物行動監視装置および人物行動監視システム |
JP2019091395A (ja) * | 2017-11-15 | 2019-06-13 | キヤノン株式会社 | 情報処理装置、監視システム、方法及びプログラム |
JP2019114294A (ja) * | 2019-03-29 | 2019-07-11 | 株式会社東芝 | 改札監視システム |
JP2020074506A (ja) * | 2019-09-19 | 2020-05-14 | パナソニックIpマネジメント株式会社 | 人物行動監視装置および人物行動監視システム |
JP2021166399A (ja) * | 2020-01-15 | 2021-10-14 | 日本電気株式会社 | 画像処理システム、画像処理方法及びプログラム |
JP2022051683A (ja) * | 2020-09-22 | 2022-04-01 | グラスパー テクノロジーズ エーピーエス | 訓練データの生成と再識別に使用するための機械学習モデルの訓練とについての概念 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10430666B2 (en) * | 2013-03-29 | 2019-10-01 | Nec Corporation | Target object identifying device, target object identifying method and target object identifying program |
WO2015166612A1 (ja) | 2014-04-28 | 2015-11-05 | 日本電気株式会社 | 映像解析システム、映像解析方法および映像解析プログラム |
US20180002109A1 (en) * | 2015-01-22 | 2018-01-04 | Nec Corporation | Shelf space allocation management device and shelf space allocation management method |
JP6597643B2 (ja) * | 2015-02-05 | 2019-10-30 | 株式会社リコー | 画像処理装置、画像処理システム、画像処理方法およびプログラム |
US10043089B2 (en) * | 2015-03-11 | 2018-08-07 | Bettina Jensen | Personal identification method and apparatus for biometrical identification |
WO2017037754A1 (en) * | 2015-08-28 | 2017-03-09 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
JP2017097510A (ja) * | 2015-11-20 | 2017-06-01 | ソニー株式会社 | 画像処理装置と画像処理方法およびプログラム |
DE102016106514A1 (de) * | 2016-04-08 | 2017-10-12 | Deutsche Post Ag | Kontrolle von Aufenthaltsdauer und Aufenthaltsbereich |
TWI657378B (zh) * | 2017-09-22 | 2019-04-21 | 財團法人資訊工業策進會 | 複數非線性扭曲鏡頭下之目標追蹤方法及系統 |
EP3487151A1 (en) * | 2017-11-15 | 2019-05-22 | Canon Kabushiki Kaisha | Information processing apparatus, monitoring system, method, and non-transitory computer-readable storage medium |
WO2020213638A1 (ja) * | 2019-04-18 | 2020-10-22 | 日本電気株式会社 | 人物特定装置、人物特定方法および記録媒体 |
CN111354146A (zh) * | 2020-01-06 | 2020-06-30 | 安然 | 一种家居防护系统 |
CN111611904B (zh) * | 2020-05-15 | 2023-12-01 | 新石器慧通(北京)科技有限公司 | 基于无人车行驶过程中的动态目标识别方法 |
CN111565300B (zh) * | 2020-05-22 | 2020-12-22 | 深圳市百川安防科技有限公司 | 基于对象的视频文件处理方法、设备及系统 |
US11589010B2 (en) | 2020-06-03 | 2023-02-21 | Apple Inc. | Camera and visitor user interfaces |
US11657614B2 (en) | 2020-06-03 | 2023-05-23 | Apple Inc. | Camera and visitor user interfaces |
US11657123B2 (en) * | 2020-10-08 | 2023-05-23 | Hitachi, Ltd. | Method and apparatus for people flow analysis using similar-image search |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008152736A (ja) * | 2006-12-20 | 2008-07-03 | Sony Corp | 監視システム、監視装置及び監視方法 |
JP2010039580A (ja) * | 2008-07-31 | 2010-02-18 | Secom Co Ltd | 移動物体追跡装置 |
JP2012039531A (ja) * | 2010-08-11 | 2012-02-23 | Oki Electric Ind Co Ltd | 映像監視システム |
JP2012078950A (ja) * | 2010-09-30 | 2012-04-19 | Sogo Keibi Hosho Co Ltd | 自律移動体を用いた監視システム、監視装置、自律移動体、監視方法、及び監視プログラム |
JP2012238111A (ja) | 2011-05-10 | 2012-12-06 | Nippon Hoso Kyokai <Nhk> | 顔画像認識装置及び顔画像認識プログラム |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7167576B2 (en) * | 2001-07-02 | 2007-01-23 | Point Grey Research | Method and apparatus for measuring dwell time of objects in an environment |
WO2003067360A2 (en) * | 2002-02-06 | 2003-08-14 | Nice Systems Ltd. | System and method for video content analysis-based detection, surveillance and alarm management |
US7397929B2 (en) * | 2002-09-05 | 2008-07-08 | Cognex Technology And Investment Corporation | Method and apparatus for monitoring a passageway using 3D images |
JP3861781B2 (ja) * | 2002-09-17 | 2006-12-20 | 日産自動車株式会社 | 前方車両追跡システムおよび前方車両追跡方法 |
US7263209B2 (en) * | 2003-06-13 | 2007-08-28 | Sarnoff Corporation | Vehicular vision system |
GB2410359A (en) * | 2004-01-23 | 2005-07-27 | Sony Uk Ltd | Display |
JP4915655B2 (ja) * | 2006-10-27 | 2012-04-11 | パナソニック株式会社 | 自動追尾装置 |
US7608825B2 (en) * | 2006-12-14 | 2009-10-27 | Sumitomo Electric Industries, Ltd. | Image pickup device, vision enhancement apparatus, night-vision apparatus, navigation support apparatus, and monitoring apparatus |
US20100141806A1 (en) * | 2007-03-15 | 2010-06-10 | Kansai University | Moving Object Noise Elimination Processing Device and Moving Object Noise Elimination Processing Program |
CN100531373C (zh) * | 2007-06-05 | 2009-08-19 | 西安理工大学 | 基于双摄像头联动结构的视频运动目标特写跟踪监视方法 |
JP2010081480A (ja) * | 2008-09-29 | 2010-04-08 | Fujifilm Corp | 携帯型不審者検出装置、不審者検出方法及びプログラム |
JP2010113692A (ja) * | 2008-11-10 | 2010-05-20 | Nec Corp | 顧客行動記録装置及び顧客行動記録方法並びにプログラム |
JP5432983B2 (ja) * | 2009-03-09 | 2014-03-05 | パナソニック株式会社 | 入退検出装置、監視装置及び入退検出方法 |
CN101715111B (zh) * | 2009-11-16 | 2011-12-14 | 南京邮电大学 | 视频监控中滞留物主自动搜寻方法 |
EP2717571B1 (en) * | 2011-05-24 | 2018-09-05 | Nissan Motor Co., Ltd | Vehicle monitoring device and method of monitoring vehicle |
WO2013081985A1 (en) * | 2011-11-28 | 2013-06-06 | Magna Electronics, Inc. | Vision system for vehicle |
US9070019B2 (en) * | 2012-01-17 | 2015-06-30 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US10325489B2 (en) * | 2012-06-29 | 2019-06-18 | Here Global B.V. | Dynamic natural guidance |
US10430666B2 (en) * | 2013-03-29 | 2019-10-01 | Nec Corporation | Target object identifying device, target object identifying method and target object identifying program |
JP6941805B2 (ja) * | 2018-02-22 | 2021-09-29 | パナソニックIpマネジメント株式会社 | 滞在状況表示システムおよび滞在状況表示方法 |
-
2014
- 2014-01-31 US US14/765,621 patent/US10430666B2/en active Active
- 2014-01-31 BR BR112015020989A patent/BR112015020989A2/pt not_active IP Right Cessation
- 2014-01-31 JP JP2015507981A patent/JP6406246B2/ja active Active
- 2014-01-31 AU AU2014240718A patent/AU2014240718A1/en not_active Abandoned
- 2014-01-31 CN CN201480016621.7A patent/CN105075248A/zh active Pending
- 2014-01-31 WO PCT/JP2014/000523 patent/WO2014155922A1/ja active Application Filing
- 2014-01-31 SG SG11201508085QA patent/SG11201508085QA/en unknown
- 2014-01-31 EP EP14774192.0A patent/EP2981075A1/en not_active Withdrawn
-
2015
- 2015-09-02 ZA ZA2015/06437A patent/ZA201506437B/en unknown
-
2019
- 2019-07-03 US US16/502,180 patent/US11210527B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008152736A (ja) * | 2006-12-20 | 2008-07-03 | Sony Corp | 監視システム、監視装置及び監視方法 |
JP2010039580A (ja) * | 2008-07-31 | 2010-02-18 | Secom Co Ltd | 移動物体追跡装置 |
JP2012039531A (ja) * | 2010-08-11 | 2012-02-23 | Oki Electric Ind Co Ltd | 映像監視システム |
JP2012078950A (ja) * | 2010-09-30 | 2012-04-19 | Sogo Keibi Hosho Co Ltd | 自律移動体を用いた監視システム、監視装置、自律移動体、監視方法、及び監視プログラム |
JP2012238111A (ja) | 2011-05-10 | 2012-12-06 | Nippon Hoso Kyokai <Nhk> | 顔画像認識装置及び顔画像認識プログラム |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10319099B2 (en) | 2015-04-14 | 2019-06-11 | Sony Corporation | Image processing apparatus, image processing method, and image processing system |
WO2016166950A1 (en) * | 2015-04-14 | 2016-10-20 | Sony Corporation | Image processing apparatus, image processing method, and image processing system |
WO2017163955A1 (ja) | 2016-03-23 | 2017-09-28 | 日本電気株式会社 | 監視システム、画像処理装置、画像処理方法およびプログラム記録媒体 |
US11030464B2 (en) | 2016-03-23 | 2021-06-08 | Nec Corporation | Privacy processing based on person region depth |
JP2020107354A (ja) * | 2016-03-30 | 2020-07-09 | 日本電気株式会社 | 解析装置、解析方法及びプログラム |
JP2017191621A (ja) * | 2016-03-30 | 2017-10-19 | 日本電気株式会社 | 解析装置、解析方法及びプログラム |
US11176698B2 (en) | 2016-03-30 | 2021-11-16 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
US11094076B2 (en) | 2016-03-30 | 2021-08-17 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
US10949657B2 (en) | 2016-11-22 | 2021-03-16 | Panasonic Intellectual Property Management Co., Ltd. | Person's behavior monitoring device and person's behavior monitoring system |
JP2018085597A (ja) * | 2016-11-22 | 2018-05-31 | Panasonic Intellectual Property Management Co., Ltd. | Person behavior monitoring device and person behavior monitoring system |
WO2018096787A1 (ja) * | 2016-11-22 | 2018-05-31 | Panasonic Intellectual Property Management Co., Ltd. | Person behavior monitoring device and person behavior monitoring system |
JP2019091395A (ja) * | 2017-11-15 | 2019-06-13 | Canon Inc. | Information processing device, monitoring system, method, and program |
JP7097721B2 (ja) | 2017-11-15 | 2022-07-08 | Canon Inc. | Information processing device, method, and program |
JP2019114294A (ja) * | 2019-03-29 | 2019-07-11 | Toshiba Corporation | Ticket gate monitoring system |
JP2020074506A (ja) * | 2019-09-19 | 2020-05-14 | Panasonic Intellectual Property Management Co., Ltd. | Person behavior monitoring device and person behavior monitoring system |
JP2021166399A (ja) * | 2020-01-15 | 2021-10-14 | NEC Corporation | Image processing system, image processing method, and program |
JP2022003529A (ja) * | 2020-01-15 | 2022-01-11 | NEC Corporation | Information processing device, information processing method, and program |
JP7487888B2 (ja) | 2020-01-15 | 2024-05-21 | NEC Corporation | Image processing system, image processing method, and program |
JP2022051683A (ja) * | 2020-09-22 | 2022-04-01 | Grazper Technologies ApS | Concepts for generating training data and for training a machine learning model for use in re-identification |
JP7186269B2 (ja) | 2020-09-22 | 2022-12-08 | Grazper Technologies ApS | Concepts for generating training data and for training a machine learning model for use in determining object identity |
Also Published As
Publication number | Publication date |
---|---|
AU2014240718A1 (en) | 2015-08-27 |
US20150371403A1 (en) | 2015-12-24 |
BR112015020989A2 (pt) | 2017-07-18 |
CN105075248A (zh) | 2015-11-18 |
US20190325229A1 (en) | 2019-10-24 |
ZA201506437B (en) | 2019-07-31 |
US10430666B2 (en) | 2019-10-01 |
EP2981075A1 (en) | 2016-02-03 |
US11210527B2 (en) | 2021-12-28 |
SG11201508085QA (en) | 2015-11-27 |
JPWO2014155922A1 (ja) | 2017-02-16 |
JP6406246B2 (ja) | 2018-10-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6406246B2 (ja) | Object identification device, object identification method, and object identification program | |
WO2015166612A1 (ja) | Video analysis system, video analysis method, and video analysis program | |
CN107958258B (zh) | Method and system for tracking an object in a defined area | |
JP2022082561A (ja) | Analysis server, monitoring system, monitoring method, and program | |
AU2015224526B2 (en) | An image management system | |
WO2014155958A1 (ja) | Object monitoring system, object monitoring method, and monitoring target extraction program | |
JP2008108243A (ja) | Person recognition device and person recognition method | |
CN110852148A (zh) | Visitor destination verification method and system based on target tracking | |
WO2020115890A1 (ja) | Information processing system, information processing device, information processing method, and program | |
JP2017151832A (ja) | Waiting time calculation system | |
CN110782228A (zh) | Working-hours acquisition method and device, electronic apparatus, and storage medium | |
CN113487784A (zh) | Gate passage system and method | |
JP2017220151A (ja) | Information processing device, information processing method, and program | |
KR20220147566A (ko) | Method and apparatus for providing traffic information, and computer program stored on a medium for executing the method | |
JP2008158678A (ja) | Person authentication device, person authentication method, and entrance/exit management system | |
CN110674775A (zh) | Gate control method, device, and system, and storage medium | |
JP2007257296A (ja) | Entry/exit management system and entry/exit management method | |
KR102002287B1 (ko) | Access control system capable of managing suspicious behavior | |
CN113744443B (zh) | Anti-spoofing control method, apparatus, and device for a gate passage, and storage medium | |
JP6558178B2 (ja) | Nuisance actor estimation system, and control method and control program for the nuisance actor estimation system | |
JP2020064531A5 (ja) | ||
WO2017048148A1 (en) | Monitoring a flow of objects by a sim card detector | |
WO2022029860A1 (ja) | Mobile object tracking system, mobile object tracking device, program, and mobile object tracking method | |
JP6879336B2 (ja) | Nuisance actor estimation system, and control method and control program for the nuisance actor estimation system | |
JP2015014858A (ja) | Information processing system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 201480016621.7; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14774192; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 14765621; Country of ref document: US |
| | REEP | Request for entry into the european phase | Ref document number: 2014774192; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2014774192; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2015507981; Country of ref document: JP; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 2014240718; Country of ref document: AU; Date of ref document: 20140131; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2015141286; Country of ref document: RU; Kind code of ref document: A |
| | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112015020989; Country of ref document: BR |
| | ENP | Entry into the national phase | Ref document number: 112015020989; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20150831 |