WO2012131816A1 - Person tracking device, person tracking method, and non-transitory computer-readable medium storing a person tracking program - Google Patents
Person tracking device, person tracking method, and non-transitory computer-readable medium storing a person tracking program
- Publication number
- WO2012131816A1 (PCT/JP2011/005973)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- information
- feature
- tracking
- specificity
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
Definitions
- The present invention relates to a person tracking device, a person tracking method, and a non-transitory computer-readable medium storing a person tracking program, and more particularly to a person tracking device, a person tracking method, and a non-transitory computer-readable medium storing a person tracking program that track a person using video captured by a surveillance camera.
- Patent Document 1 discloses a method for tracking a person based on the color characteristics of the person as an example of a person tracking method.
- FIG. 9 shows an embodiment of the person tracking system disclosed in Patent Document 1.
- the person tracking system includes a person area extracting unit 1, a voxel generating unit 2, a person color feature extracting unit 3, and a person tracking unit 4.
- the person area extraction unit 1 extracts a person area from the monitoring video and outputs the person area extraction result to the voxel generation unit 2.
- the voxel generation means 2 generates voxel information from the person area extraction result output from the person area extraction means 1, and outputs the generated voxel information to the person color feature extraction means 3.
- the person color feature extraction unit 3 extracts a person color feature from the voxel information output from the voxel generation unit 2 and the monitoring video, and outputs the extracted person color feature to the person tracking unit 4.
- the person tracking unit 4 tracks a person using the person color feature output from the person color feature extracting unit 3 and outputs a person tracking result.
- the person area extracting means 1 extracts a person area from the monitoring video input from the camera by the background subtraction method. Then, the person area extraction unit 1 outputs the extracted person area extraction result to the voxel generation unit 2.
- the voxel generation means 2 generates a voxel based on the input person area extraction result.
- The input person region extraction results are obtained from a plurality of cameras.
- the voxel generating means 2 generates a voxel representing the position of the person in the space by projecting the input person region extraction result onto the three-dimensional space by the view volume intersection method.
- the voxel generation unit 2 outputs the generated voxel to the person color feature extraction unit 3.
- the person color feature extraction means 3 obtains the vertical distribution of the color from the person's feet to the head as the person color feature based on the generated voxel and the surveillance camera video. At this time, the person color feature extraction unit 3 calculates the average of the colors for each height of the voxel, and normalizes the height to calculate the person color feature. This color feature is basically determined by the color of the clothes being worn, but a value obtained by calculating the average of the colors in all directions at the same height is used. Thereby, the person color feature extracting means 3 realizes robust color feature extraction against changes in the appearance of clothes depending on the direction.
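- As a concrete illustration of this prior-art feature, the following is a minimal sketch, assuming the voxels and their sampled colors are already available as NumPy arrays (a hypothetical data layout; Patent Document 1 does not prescribe one):

```python
import numpy as np

def person_color_feature(voxels, colors, n_bands=16):
    """Sketch of a height-normalized color feature.

    voxels : (N, 3) array of voxel centers (x, y, z) for one person.
    colors : (N, 3) array of RGB colors sampled for each voxel.
    Returns an (n_bands, 3) array: the mean color per normalized height
    band, averaged over all horizontal directions, as in the prior art.
    """
    z = voxels[:, 2]
    z_norm = (z - z.min()) / max(z.max() - z.min(), 1e-6)  # feet=0, head=1
    bands = np.minimum((z_norm * n_bands).astype(int), n_bands - 1)
    feature = np.zeros((n_bands, 3))
    for b in range(n_bands):
        mask = bands == b
        if mask.any():
            feature[b] = colors[mask].mean(axis=0)  # mean over all directions
    return feature
```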
- the person tracking means 4 compares the obtained person color feature with the person color feature obtained in the past, and determines similarity.
- the person tracking unit 4 calculates the correspondence between the voxel calculated in the past and the most recently calculated voxel according to the determination result. As a result, the person tracking unit 4 calculates a person tracking result in which the past person extraction result and the current extraction result are associated with each other.
- The present invention has been made in view of such problems, and its primary object is to provide a person tracking device, a person tracking method, and a non-transitory computer-readable medium storing a person tracking program capable of tracking a tracking target with high accuracy even when the tracking target has few features.
- One aspect of the person tracking device includes: person area information extraction means for detecting a person area, which is an area to which a person included in the video belongs, and generating person area information describing information on the person area; companion determination means for identifying, based on the person area information and information specifying the tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information which is information describing the companion; characteristic person selection means for selecting, from among the companions specified by the companion information, a characteristic person who is a person having a distinctive feature amount using the person area information, and generating characteristic person information which is information describing the characteristic person; and person tracking means for calculating a characteristic person tracking result, which is a tracking result of the characteristic person, based on the person area information and the characteristic person information.
- One aspect of the person tracking method is as follows: detecting a person area, which is an area to which a person included in the video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying the tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information which is information describing the companion; selecting, from among the companions specified by the companion information, a characteristic person who is a person having a distinctive feature amount using the person area information, and generating characteristic person information which is information describing the characteristic person; and calculating a characteristic person tracking result, which is a result of tracking the characteristic person, based on the person area information and the characteristic person information.
- One aspect of the non-transitory computer-readable medium stores a program for causing a computer to execute processing for tracking a person included in a video. The processing detects a person area, which is an area to which a person included in the video belongs, and generates person area information describing information on the person area; identifies, based on the person area information and information specifying the tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generates companion information which is information describing the companion; selects, from among the companions specified by the companion information, a characteristic person who is a person having a distinctive feature amount using the person area information, and generates characteristic person information which is information describing the characteristic person; and calculates a characteristic person tracking result, which is a result of tracking the characteristic person, based on the person area information and the characteristic person information.
- According to the present invention, it is possible to provide a person tracking device, a person tracking method, and a non-transitory computer-readable medium storing a person tracking program capable of tracking a tracking target with high accuracy even when the tracking target has few features.
- FIG. 1 is a block diagram showing the configuration of the person tracking apparatus according to the first embodiment.
- FIG. 2 is a flowchart showing the flow of processing of the companion determination unit 102 according to the first embodiment.
- FIG. 3 is a flowchart showing the flow of processing of the companion determination unit 102 according to the first embodiment.
- FIG. 4 is a flowchart showing the flow of processing of the person tracking apparatus 100 according to the first embodiment.
- FIG. 5 is a block diagram illustrating the configuration of the feature person selection unit 103 according to the first embodiment.
- FIG. 6 is a flowchart showing the flow of processing of the characteristic person determination unit 201 according to the first embodiment.
- FIG. 7 is a block diagram illustrating the configuration of the feature person selection unit 103 according to the second embodiment.
- FIG. 8 is a block diagram illustrating the configuration of the feature person selection unit 103 according to the third embodiment.
- FIG. 9 is a block diagram showing the structure of the person tracking system disclosed in Patent Document 1.
- FIG. 1 is a block diagram showing the configuration of the person tracking apparatus according to this embodiment.
- the person tracking device 100 includes a person area information extraction unit 101, a companion determination unit 102, a feature person selection unit 103, a person tracking unit 104, and a tracking result calculation unit 105.
- the person area information extraction unit 101 receives the monitoring video and outputs the extracted person area information to the accompanying person determination unit 102, the feature person selection unit 103, and the person tracking unit 104.
- the companion determination unit 102 receives the person area information output from the person area information extraction unit 101 and the tracking target person information, and outputs the calculated companion information to the feature person selection unit 103.
- The feature person selection unit 103 receives the person area information output from the person area information extraction unit 101 and the companion information output from the companion determination unit 102, outputs the calculated characteristic person information to the person tracking unit 104, and outputs the calculated tracking target person relative position information to the tracking result calculation unit 105.
- The person tracking unit 104 receives the person area information output from the person area information extraction unit 101 and the characteristic person information output from the feature person selection unit 103, and outputs the calculated characteristic person tracking information to the tracking result calculation unit 105.
- The tracking result calculation unit 105 receives the characteristic person tracking information output from the person tracking unit 104 and the tracking target person relative position information output from the feature person selection unit 103, calculates the tracking target person tracking result, and outputs it to an arbitrary downstream processing unit.
- a monitoring video is input to the person area information extraction unit 101.
- the person area information extraction unit 101 generates a frame image from the input monitoring video.
- the person area information extraction unit 101 performs a process of extracting a person area from the frame image, and further performs a process of extracting person area information describing the person area.
- When the input monitoring video is an analog video, the person area information extraction unit 101 captures the monitoring video and generates a frame image. When the monitoring video is a digital video encoded by H.264, Motion JPEG, MPEG-2, or the like, the person area information extraction unit 101 generates a frame image by decoding it with the corresponding decoding method.
- the person area extraction processing by the person area information extraction unit 101 can use various existing methods. For example, in the extraction of a person area based on a background difference, the person area information extraction unit 101 constructs a model representing background information from a frame image input along a time series, and extracts a moving object using the model. The person region is extracted from the extracted information. Most simply, the person area information extraction unit 101 defines a background image generated by averaging information of a still area of an image between a plurality of frames as a background model, and calculates a difference between the frame image and the background image. Then, a region having a large difference is extracted as a moving object.
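- The following is a minimal sketch of such a background-difference extractor, assuming a mostly static camera and a simple running-average background model; the parameter values are illustrative, not taken from the text:

```python
import numpy as np

class RunningAverageBackground:
    """Minimal background-difference extractor for a mostly static camera.

    The background model is a running average of past frames; pixels whose
    difference from the model exceeds `threshold` are flagged as moving.
    """
    def __init__(self, alpha=0.02, threshold=30):
        self.alpha = alpha          # update rate of the background model
        self.threshold = threshold  # per-pixel difference threshold
        self.background = None

    def apply(self, frame):
        frame = frame.astype(np.float32)
        if self.background is None:
            self.background = frame.copy()
        diff = np.abs(frame - self.background).max(axis=-1)
        mask = diff > self.threshold  # candidate moving-object pixels
        # update the model only where the scene currently looks static
        self.background[~mask] += self.alpha * (frame - self.background)[~mask]
        return mask
```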
- The person area information extraction unit 101 may directly use the moving object extraction result as the person area extraction result. Alternatively, the person area information extraction unit 101 may determine whether the extracted moving object area corresponds to a person, and extract only areas that are highly likely to be a person as person areas.
- the person area information extraction unit 101 may extract a person area directly using a person model without using a background model.
- the person model used here may be a model representing the whole person or a model representing a part of the person.
- For example, the person area information extraction unit 101 may detect a face or head using a face detector or head detector that models and extracts the face or head as a part of the person, and determine the person area from the detection result. Alternatively, the person area information extraction unit 101 may extract a person area using a detector that detects a part of the person area such as the upper body or the lower body.
- the person area information extraction unit 101 extracts person area information from the person area extracted by the above-described method.
- the person area information is information representing the characteristics of the extracted person area.
- the person area information includes information indicating the position and shape of the person area on the image and information describing the characteristics of the person included in the area specified by the information.
- The former (information representing the position and shape of the person area on the image) may be silhouette information representing the shape of the person (information obtained by labeling the pixels corresponding to the person area), rectangle information representing the circumscribed rectangle of the person area, or any other information that represents the shape and position of the person area. For example, the area information can also be represented using a region descriptor defined by MPEG-7.
- the latter (information describing the characteristics of the person included in the designated area) may be various, including the image characteristics included in the area and the higher-level characteristics of the person itself.
- For example, the information includes a feature amount representing facial features, a feature amount representing hair features such as hair color and hairstyle, visual feature amounts representing the color, pattern, or shape of clothing, information representing the clothing type, information representing personal items (hats, glasses, masks, handbags, ties, mufflers, and the like worn by the person), information representing specific marks or logos on clothes, information representing skin color, and so on.
- The face feature can be calculated using a conventionally used face detector and face feature extraction method.
- the feature of the clothes is calculated by designating the clothes area from the person area and extracting information describing the area.
- Various conventional methods (for example, the methods for describing color, pattern, or shape defined in MPEG-7) can be used as feature extraction methods for colors, patterns, and shapes.
- Information describing a personal item is calculated by detecting the item using a detector that detects the corresponding object from a specific part of the head or body, and extracting information describing that region.
- Specific marks and logos written on clothes can be detected using a discriminator that has learned the patterns.
- For the specific mark or logo, information describing the feature, or the identification result itself, is likewise extracted from the detected area.
- the skin color can also be extracted by estimating the skin area from the person area and obtaining the color of that part.
- the latter information can also include higher-order characteristics.
- personal height information may be a feature.
- The height information of the person can be calculated by converting the two-dimensional position in the image acquired by the camera into a three-dimensional position in the real world using the camera calibration data.
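- As an illustration, the following sketch recovers a person's height from one calibrated view, assuming the calibration data is available as a 3x4 projection matrix `P` and that the feet lie on the ground plane Z = 0 (both assumptions for illustration):

```python
import numpy as np

def estimate_height(P, foot_px, head_px):
    """Estimate a person's real-world height from one calibrated view.

    P       : (3, 4) camera projection matrix from calibration data.
    foot_px : (u, v) image point of the feet, assumed on the ground (Z = 0).
    head_px : (u, v) image point of the head top, assumed vertically above.
    """
    u, v = foot_px
    # Solve P @ [X, Y, 0, 1]^T ~ [u, v, 1]^T for the ground position (X, Y).
    A = np.array([[P[0, 0] - u * P[2, 0], P[0, 1] - u * P[2, 1]],
                  [P[1, 0] - v * P[2, 0], P[1, 1] - v * P[2, 1]]])
    b = -np.array([P[0, 3] - u * P[2, 3], P[1, 3] - v * P[2, 3]])
    X, Y = np.linalg.solve(A, b)
    # With (X, Y) fixed, solve the u-equation of the head pixel for Z.
    u2, _ = head_px
    num = (u2 * (P[2, 0] * X + P[2, 1] * Y + P[2, 3])
           - (P[0, 0] * X + P[0, 1] * Y + P[0, 3]))
    den = P[0, 2] - u2 * P[2, 2]
    return num / den
```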
- information related to a person's physique can be extracted and used as a feature.
- Information that describes the state of the person, such as riding in a wheelchair, holding a child, or walking with a cane, can be extracted by using a discriminator that determines such a specific state, and used as a feature.
- a gait feature that is a feature of walking can be calculated and used as a feature.
- a classifier that identifies a specific state or classifies a gait feature can be constructed by learning the classifier using a learning image.
- the person area information extraction unit 101 outputs the extracted person area information to the accompanying person determination unit 102, the feature person selection unit 103, and the person tracking unit 104.
- the accompanying person determination unit 102 determines the accompanying person of the tracking target person from the input tracking target person information and the person area information output from the person area information extraction unit 101, and uses the determination result as the accompanying person information. Output.
- There are two main methods for determining a companion: a method of specifying the companion after specifying the tracking target person, and a method of specifying the tracking target person after determining a group including the tracking target person.
- In the first method, the companion determination unit 102 identifies the tracking target person by some method and then determines persons existing in its vicinity as companions. This process will be described with reference to the flowchart of FIG. 2.
- the accompanying person determination unit 102 specifies a tracking target person from the tracking target person information and the person area information (S501).
- For example, when the tracking target person information includes a facial feature amount, the companion determination unit 102 compares the facial feature amount of the tracking target person with the facial feature amounts in the person area information to identify the person to be tracked. When the tracking target person information includes position information obtained from other sensor information such as RFID, the companion determination unit 102 compares it with the person position information included in the person area information and identifies a person at substantially the same position as the tracking target.
- Note that the tracking target person identification process cannot always be executed in every frame, and is therefore executed in the frames where it is executable.
- the accompanying person determination unit 102 determines the accompanying person of the identified tracking target person (S502).
- For example, the companion determination unit 102 judges a person included in the person area information as a companion if the distance in the image between that person and the identified tracking target person stays within a certain threshold for a certain time. That is, the companion determination unit 102 tracks the movement of each person from the input person area information for several frames starting from the frame in which the tracking target person was identified, and calculates the distance between the tracking target person and each other person in each frame. The companion determination unit 102 then determines a person to be a companion when the distance stays within the predetermined threshold. Note that the companion determination unit 102 need not regard as companions only those persons who are always within the predetermined threshold; it may instead judge as a companion a person who stays within the threshold for at least a certain ratio of the time.
- Alternatively, the companion determination unit 102 may calculate the position information of each person in the real world from the person's position in the image, using the camera calibration information that associates two-dimensional coordinates in the image with three-dimensional coordinates in the real world. The companion determination unit 102 may then determine the companions of the tracking target person using this real-world position information.
- Alternatively, the companion determination unit 102 may set persons whose distance is within a certain threshold as companion candidates in the frame in which the tracking target person is identified, and may then determine whether each candidate is a companion by calculating its distance to the tracking target person over the subsequent frames.
- the companion determination unit 102 generates companion information as a result of the companion determination (S502).
- The companion information is information that specifies which pieces of the per-person information included in the person area information correspond to companions of the person to be tracked. For example, the companion information assigns to each person included in the person area information a flag indicating whether or not that person is a companion (for example, 1 for a companion and 0 otherwise). Alternatively, the companion information may be expressed with three values so as to include a state in which it is unknown whether the person is a companion.
- In this method, the companion information also includes information specifying the tracking target person. This companion information calculation is performed only when the tracking target person can be identified.
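- The distance-ratio companion test described above might look like the following sketch, assuming per-person position tracks aligned to common frame indices (a hypothetical data layout):

```python
def companion_flags(tracks, target_id, dist_thresh=80.0, ratio_thresh=0.8):
    """Sketch of the distance-ratio companion test.

    tracks    : {person_id: [(x, y), ...]} positions over the observed
                frames, all lists aligned to the same frame indices.
    target_id : id of the identified tracking target person.
    Returns   : {person_id: 1 or 0} companion flags (1 = companion),
                with the target itself excluded.
    """
    target = tracks[target_id]
    flags = {}
    for pid, pos in tracks.items():
        if pid == target_id:
            continue
        near = sum(
            1 for (tx, ty), (px, py) in zip(target, pos)
            if ((tx - px) ** 2 + (ty - py) ** 2) ** 0.5 <= dist_thresh
        )
        # companion if close to the target for a sufficient ratio of frames
        flags[pid] = 1 if near / len(target) >= ratio_thresh else 0
    return flags
```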
- In the second method, the companion determination unit 102 calculates a group of persons estimated to include the tracking target person, and determines the companions from the group. This process will be described with reference to the flowchart of FIG. 3.
- First, the companion determination unit 102 groups persons with close positions based on the position information of each person included in the person area information (S511). At this time, the companion determination unit 102 may use positions on the image, or may calculate and use each person's position in the real world from the camera calibration information as described above.
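- Step S511 could be realized, for example, by single-link clustering with a distance threshold, as in this sketch (the threshold value is illustrative):

```python
import numpy as np

def group_by_proximity(positions, dist_thresh=100.0):
    """Group persons whose positions are close (single-link clustering sketch).

    positions : (N, 2) array of per-person positions, either image
                coordinates or real-world coordinates from calibration data.
    Returns   : list of groups, each a list of person indices.
    """
    n = len(positions)
    parent = list(range(n))

    def find(i):  # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(positions[i] - positions[j]) <= dist_thresh:
                parent[find(i)] = find(j)  # merge the two clusters

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```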
- the accompanying person determination unit 102 determines the accompanying person (S512).
- When the tracking target person information includes the position information of the tracking target person obtained from other information such as sensor information, the companion determination unit 102 selects the group most likely to include the tracking target person.
- the companion determination unit 102 generates companion information from the selected group.
- the companion determination unit 102 determines the person to be tracked (S513).
- When the tracking target person information includes information that can specify the tracking target person (such as a facial feature value or a visual feature value of clothes), the companion determination unit 102 may narrow the companions down to those who are highly likely to be the tracking target person. In this case, the companion determination unit 102 also includes the information specifying the tracking target person in the companion information. This determination does not need to be performed every frame, and may be performed only when a group that is highly likely to include the tracking target person can be identified.
- The companion information (which may include information on the person to be tracked) obtained by one of the two methods described above is output to the feature person selection unit 103.
- The feature person selection unit 103 calculates the characteristic person information and the tracking target person relative position information based on the person area information output from the person area information extraction unit 101 and the companion information output from the companion determination unit 102.
- The characteristic person information is information indicating which persons are characteristic and easy to track. For example, if a group of people dressed in white includes one person wearing red clothes, the person wearing red clothes looks completely different from the others. Therefore, when tracking using clothing color, it is unlikely that the person wearing red clothes will be confused with anyone else. On the other hand, when following one of the people wearing white clothes, there is a high possibility of mistracking because many other people are also in white. In this manner, the feature person selection unit 103 evaluates the ease of tracking of each person included in the person area information, and selects persons that are easy to track as characteristic persons. Details of the configuration and operation of the feature person selection unit 103 will be described later.
- the tracking target person relative position information is information representing a relative position between the tracking target person and the characteristic person selected from the accompanying persons.
- the tracking target person relative position information is vector information obtained by subtracting the position coordinates of the characteristic person from the position coordinates of the tracking target person.
- the tracking target person relative position information may be information that roughly represents the relative positional relationship such as “the tracking target person is behind the characteristic person”.
- When there are a plurality of characteristic persons, the relative position information may be a representative value (an average, a single representative point, etc.) of the coordinates obtained from the plurality of pieces of person information. Details of the relative position information will also be described later.
- the feature person selection unit 103 outputs the calculated feature person information to the person tracking unit 104, and outputs the calculated tracking target person relative position information to the tracking result calculation unit 105.
- the number of characteristic persons is not necessarily limited to one, and a plurality of characteristic persons may exist.
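- A minimal sketch of the relative-position computation, assuming 2-D position coordinates; with several characteristic persons it also returns a representative (mean) offset, one of the representative values mentioned above:

```python
import numpy as np

def relative_position(target_pos, feature_positions):
    """Relative-position vector(s) between the target and feature person(s).

    target_pos        : (2,) position of the tracking target person.
    feature_positions : (K, 2) positions of the selected feature person(s).
    Returns the per-person offset vectors (target minus feature person) and
    a representative mean offset for when several feature persons are used.
    """
    offsets = np.asarray(target_pos) - np.asarray(feature_positions)
    return offsets, offsets.mean(axis=0)
```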
- the person tracking unit 104 calculates characteristic person tracking information obtained by tracking a characteristic person from the person area information output from the person area information extraction unit 101 and the characteristic person information output from the characteristic person selection unit 103.
- As the tracking method, any conventionally used tracking method may be used.
- the person tracking unit 104 may perform tracking using, for example, a particle filter using a feature amount of clothes.
- the person tracking unit 104 may perform tracking using a Kalman filter.
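- As one example of the Kalman-filter option, the following is a minimal constant-velocity tracker for a single person's image position; the noise parameters are illustrative assumptions:

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal constant-velocity Kalman tracker for one person's position."""

    def __init__(self, x0, y0, q=1.0, r=10.0):
        self.x = np.array([x0, y0, 0.0, 0.0])  # state: x, y, vx, vy
        self.P = np.eye(4) * 100.0             # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0      # dt = 1 frame
        self.H = np.eye(2, 4)                  # we observe position only
        self.Q = np.eye(4) * q                 # process noise
        self.R = np.eye(2) * r                 # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = np.asarray(z) - self.H @ self.x       # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```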
- When a person being tracked moves into an area outside the angle of view of the camera currently used for tracking, the person tracking unit 104 predicts which neighboring camera's angle of view the person will enter next and how long it will take to reach it. The person tracking unit 104 then notifies the camera that will perform tracking next (the control unit that controls that camera) of information about the person's features and the estimated time of arrival within that camera's angle of view. Upon receiving the information, the control unit of that camera starts searching for the characteristic person slightly before the estimated arrival time. The control unit of the camera that performs tracking next compares the features of each person newly entering the angle of view with the features of the characteristic person being tracked, and determines whether there is a person whose features match. When a matching person is found, the person tracking unit 104 switches to tracking that person within that camera. The method for tracking a person within the same camera is as described above.
- the person tracking unit 104 outputs the tracking information for the calculated characteristic person to the tracking result calculation unit 105 as characteristic person tracking information.
- the tracking result calculation unit 105 calculates the tracking target person tracking result from the characteristic person tracking information output from the person tracking unit 104 and the tracking target person relative position information output from the characteristic person selection unit 103.
- Specifically, the tracking result calculation unit 105 calculates the tracking target person tracking result by adding the tracking target person relative position information to the characteristic person tracking information.
- However, the tracking target person relative position information cannot always be calculated. Therefore, at times when the tracking target person relative position information cannot be calculated, the tracking result calculation unit 105 calculates the person tracking result using the most recent relative position information as it is, or by prediction from the previous relative position information.
- Alternatively, the tracking result calculation unit 105 may temporarily store the characteristic person tracking information in a buffer until the tracking target person relative position information is next calculated. At the time when the next tracking target person relative position information is calculated, the tracking result calculation unit 105 calculates the relative position information at each buffered time by interpolation between the new relative position information and the previous relative position information. The tracking result calculation unit 105 may then calculate the person tracking result of the tracking target person using the characteristic person tracking information and the interpolated relative position information.
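- The interpolation step could look like the following sketch, assuming relative positions are 2-D vectors and times are scalar frame timestamps:

```python
import numpy as np

def interpolate_relative_positions(t_prev, rel_prev, t_next, rel_next, times):
    """Linearly interpolate relative-position vectors between two keyframes.

    t_prev, rel_prev : time and relative position at the previous calculation.
    t_next, rel_next : time and relative position at the next calculation.
    times            : buffered frame times in between.
    """
    rel_prev, rel_next = np.asarray(rel_prev), np.asarray(rel_next)
    out = []
    for t in times:
        w = (t - t_prev) / (t_next - t_prev)  # 0 at prev, 1 at next
        out.append((1 - w) * rel_prev + w * rel_next)
    # each tracking result is then the feature person's tracked position
    # plus the interpolated relative offset for that time
    return out
```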
- FIG. 4 is a flowchart showing the operation of the person tracking apparatus 100 according to the present embodiment.
- the person area information extraction unit 101 calculates person area information from the monitoring video (S101). The details of the calculation process of the person area information are as described in the description of the person area information extraction unit 101.
- the companion determination unit 102 calculates companion information based on the person area information and the tracking target person information (S102). The details of the companion information calculation process are as described in the explanation of the companion determination unit 102.
- the characteristic person selection unit 103 calculates characteristic person information and tracking target person relative position information based on the person area information and the accompanying person information (S103). The calculation of the information is as described in the description of the feature person selection unit 103.
- the person tracking unit 104 calculates characteristic person tracking information from the person area information and the characteristic person information (S104).
- the characteristic person tracking information calculation process is as described in the description of the person tracking unit 104.
- the tracking result calculation unit 105 calculates a tracking target person tracking result from the characteristic person tracking information and the tracking target person relative position information (S105). Details of the tracking target person tracking result calculation process are as described in the description of the tracking result calculation unit 105.
- FIG. 5 is a block diagram showing a configuration of the feature person selection unit 103 according to the present embodiment.
- the feature person selection unit 103 includes a feature person determination unit 201 and a feature specificity information storage unit 202.
- the feature specificity information storage unit 202 stores feature specificity information and outputs it to the feature person determination unit 201.
- the feature person determination unit 201 receives the person region information, the accompanying person information, and the feature specificity information output from the feature specificity information storage unit 202, and calculates the feature person information and the tracking target person relative position information. To do.
- the feature specificity information storage unit 202 stores feature specificity information.
- the feature specificity information is information indicating how unique (characteristic) the value obtained as each feature amount representing the feature of a person is. For example, in the case of a clothing color feature, the specificity of a commonly seen clothing color (eg white) is low. On the other hand, if it is a color of clothes that is not often seen (for example, bright red), the specificity of the color becomes high.
- A specific specificity value is calculated by computing the appearance frequency of each feature amount value (each color value, in the case of clothes) using learning data, and applying a monotonically non-increasing function to the frequency.
- For example, the value of the self-information (−log₂ p, where p is the frequency) is calculated from the frequency, and the calculated value can be used as the specificity information.
- Alternatively, a value corresponding to the inverse document frequency used in document retrieval (for example, 1/p) may be obtained and used as the specificity information.
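- A minimal sketch of building such feature specificity information from learning data, using the self-information form; the IDF-style alternative is noted in a comment:

```python
import math
from collections import Counter

def specificity_table(observed_values):
    """Build feature specificity information from learning data.

    observed_values : iterable of discretized feature values (for example,
                      quantized clothing colors) seen in the learning data.
    Returns {value: specificity}, using self-information -log2(p);
    an IDF-style score such as 1/p would be a drop-in alternative.
    """
    counts = Counter(observed_values)
    total = sum(counts.values())
    return {v: -math.log2(c / total) for v, c in counts.items()}
```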
- The specificity information may be switched for each season or time. That is, the feature specificity information storage unit 202 may change the stored feature specificity information for each season or time. For example, black clothes are more common in winter, while white clothes are more common in summer. Similarly, since suit jackets are worn in the morning and evening, jacket colors appear frequently at those times, whereas in the daytime people often wear only shirts, so white appears frequently. When the frequency changes with the season or time in this way, the specificity information may be changed according to the season or time. Moreover, when the tendency of clothing colors changes with the place (for example, Okinawa versus Hokkaido), the specificity information may be changed according to the place.
- Likewise, when the feature is information such as a person's age, gender, or height, and, for example, many children are present in the daytime but many adults at night, small age values and small heights are not peculiar in the daytime but become more specific at night. In this way, the specificity information may be changed according to changes in the attributes of the persons present in the area observed by the monitoring camera.
- the feature specificity information is input to the feature person determination unit 201. Based on the feature specificity information, the feature person determination unit 201 calculates the specificity of the feature amount data for the person specified by the companion information among the person regions included in the person region information. The characteristic person determination unit 201 determines a person with high specificity as a characteristic person, and outputs information specifying the person to the person tracking unit 104 as characteristic person information.
- the feature person determination unit 201 calculates the feature specificity of each person from the feature amount of each person included in the person area information (S601).
- When the specificity value for a feature amount can be acquired directly, the feature person determination unit 201 uses that value as the person's specificity as it is. When it cannot, the feature person determination unit 201 finds, among the feature amount values for which specificity values are available, values similar to the feature amount included in the person area information, and estimates the specificity value from the specificity values of those similar features. For example, the feature person determination unit 201 may use the specificity value of the most similar feature amount as it is, or may find a plurality of similar feature amounts and average their specificity values.
- By preparing specificity information in this way, the feature person determination unit 201 can easily and stably select persons with high specificity in the process described later. Furthermore, the feature person determination unit 201 can select a characteristic person suited to the situation by appropriately changing the specificity information according to conditions such as time, season, and place.
- the characteristic person determination unit 201 selects a highly specific person (S602).
- the feature person determination unit 201 may select only one person having the highest specificity, or may select all persons having specificity greater than a certain threshold.
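- Step S602 could be realized as in the following sketch, assuming the per-companion specificities have already been computed:

```python
def select_feature_persons(specificities, threshold=None):
    """Pick feature persons from the companions by specificity (S602 sketch).

    specificities : {person_id: specificity} for the companions only.
    threshold     : if given, return all persons above it; otherwise
                    return only the single most specific person.
    """
    if threshold is not None:
        return [pid for pid, s in specificities.items() if s > threshold]
    return [max(specificities, key=specificities.get)]
```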
- Finally, the characteristic person determination unit 201 calculates the difference between the position of the tracking target person and the position of each person selected as a characteristic person, and outputs this difference to the tracking result calculation unit 105 as the tracking target person relative position information (S603). When a plurality of characteristic persons are selected, the tracking target person relative position information is obtained by computing the difference with respect to each of those persons.
- As described above, the person tracking device 100 identifies the companions of the person to be tracked, selects from among those companions a characteristic person, that is, a person with distinctive features, and tracks the characteristic person. Thereby, the person tracking device 100 can perform highly accurate tracking even when the tracking target person has few features. Further, by using the tracking target person relative position information, the person tracking device 100 can calculate detailed position information of the tracking target person.
- The person tracking apparatus according to the second embodiment differs from the person tracking apparatus described in the first embodiment in the configuration of the feature person selection unit 103.
- the human tracking device according to the present embodiment will be described below with respect to differences from the first embodiment.
- FIG. 7 is a block diagram showing a configuration of the feature person selection unit 103 according to the present embodiment.
- the feature person selection unit 103 includes a feature specificity determination unit 250 and a feature person determination unit 201.
- the feature specificity determination unit 250 calculates the feature specificity information using the person region information as an input, and outputs the calculated feature specificity information to the feature person determination unit 201.
- The feature person determination unit 201 receives the person area information, the companion information, and the feature specificity information output from the feature specificity determination unit 250, and outputs the characteristic person information and the tracking target person relative position information.
- the person area information is input to the feature specificity determination unit 250.
- The feature specificity determination unit 250 acquires the feature amount of each person area from the person area information and calculates the specificity of each feature amount value. For example, when the feature amount is a clothing color, the feature specificity determination unit 250 calculates the appearance frequency of each color by aggregating the clothing colors included in each piece of person area information, and calculates the specificity according to the appearance frequency.
- The appearance frequency may be calculated using only the person information of the current frame, or using the features of all persons who have appeared so far.
- the feature specificity determination unit 250 may calculate the above-described appearance frequency using only information on a person who has appeared within a certain time from the present time.
- the feature specificity determination unit 250 may add the appearance frequency to the past data with a weight that decreases as the time is further away from the present.
- the feature specificity determination unit 250 may calculate the appearance frequency using past data with different days but close times, or may calculate the appearance frequency using data only in the same season.
- the feature specificity determination unit 250 may add the appearance frequency with a weight having a small value as the season or time moves away from the time of the current frame. Furthermore, the feature specificity determination unit 250 may calculate the appearance frequency by aggregating information on the person area detected between a plurality of cameras. In this case, the feature specificity determination unit 250 may calculate the appearance frequency with a weight that increases as the physical arrangement between the cameras is closer.
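- One way to realize the time-decayed appearance frequency described above is an exponentially decaying counter, as in this sketch (the decay factor is an illustrative assumption):

```python
from collections import defaultdict

class DecayingFrequency:
    """Appearance-frequency counter with exponential time decay.

    Older observations contribute with smaller weight, so the derived
    specificity follows gradual changes in what people in the scene wear.
    """
    def __init__(self, decay=0.999):
        self.decay = decay  # per-frame decay factor
        self.weights = defaultdict(float)
        self.total = 0.0

    def observe(self, values):
        """Add the feature values seen in the current frame."""
        for v in self.weights:  # age the existing counts
            self.weights[v] *= self.decay
        self.total *= self.decay
        for v in values:
            self.weights[v] += 1.0
            self.total += 1.0

    def frequency(self, value):
        return self.weights[value] / self.total if self.total else 0.0
```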
- the feature specificity determination unit 250 calculates the specificity information of each person from the calculated appearance frequency.
- The method for calculating the specificity information from the appearance frequency is the same as the learning-data-based calculation method described for the feature specificity information storage unit 202 in FIG. 5.
- the characteristic specificity information obtained in this way is input to the characteristic person determination unit 201.
- The operation of the feature person determination unit 201 is the same as that of the feature person determination unit 201 shown in FIG. 5.
- the feature person selection unit 103 calculates the appearance frequency of each feature amount using the actually input monitoring video, and calculates the specificity of each person. Thereby, the feature person selection unit 103 can calculate the specificity most suitable for the place and time where the camera is installed, and can improve the validity of the feature person selection. By improving the validity of the feature person selection, it is possible to improve the tracking accuracy of the tracking target person. Furthermore, even if the specificity information changes with time, the person tracking device according to the present embodiment can appropriately follow the tracking target person.
- The person tracking apparatus according to the third embodiment is characterized in that the feature person selection unit 103 includes both the feature specificity information storage unit 202 and the feature specificity determination unit 250 described above.
- the human tracking device will be described below with respect to differences from the first and second embodiments.
- FIG. 8 is a block diagram showing a configuration of the feature person selection unit 103 according to the present embodiment.
- the feature person selection unit 103 includes a feature specificity determination unit 250, a feature specificity information storage unit 202, a feature specificity information integration unit 253, and a feature person determination unit 201.
- the feature specificity determination unit 250 receives the person region information as input, and outputs the first feature specificity information to the feature specificity information integration unit 253.
- the feature specificity information storage unit 202 outputs the stored feature specificity information to the feature specificity information integration unit 253 as second feature specificity information.
- the feature specificity information integration unit 253 inputs the first feature specificity information output from the feature specificity determination unit 250 and the second feature specificity information output from the feature specificity information storage unit 202. And the calculated characteristic specificity information is output to the characteristic person determination unit 201.
- the feature person determination unit 201 receives the person region information, the accompanying person information, and the feature specificity information output from the feature specificity information integration unit 253, and outputs the feature person information and the tracking target person relative position information. To do.
- The operation of the feature specificity information storage unit 202 is the same as that of the feature specificity information storage unit 202 shown in FIG. 5.
- The operation of the feature specificity determination unit 250 is the same as that of the feature specificity determination unit 250 shown in FIG. 7.
- the feature specificity information output from the feature specificity determination unit 250 is input to the feature specificity information integration unit 253 as first feature specificity information.
- the feature specificity information output from the feature specificity information storage unit 202 is input to the feature specificity information integration unit 253 as second feature specificity information.
- the feature specificity information integration unit 253 calculates feature specificity information to be supplied to the feature person determination unit 201 using the first feature specificity information and the second feature specificity information.
- Various calculation methods are conceivable.
- For example, the feature specificity information integration unit 253 uses the average of the two as the feature specificity information supplied to the feature person determination unit 201. Alternatively, the feature specificity information integration unit 253 may calculate an average after weighting one of them. For example, by increasing the weight of the first feature specificity information, the feature specificity information integration unit 253 can calculate feature specificity information that emphasizes the actually observed monitoring video.
- Alternatively, the feature specificity information integration unit 253 may supply either the first feature specificity information or the second feature specificity information to the feature person determination unit 201, switching between them according to the time or the day of the week.
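- The integration strategies described above (plain average, weighted average, and source switching) can all be expressed as one weighted combination, as in this sketch; missing entries default to zero here purely for simplicity:

```python
def integrate_specificity(first, second, w_first=0.5):
    """Integrate dynamically computed and stored specificity information.

    first   : {value: specificity} computed from the live video.
    second  : {value: specificity} from the accumulated store.
    w_first : weight of the live estimate; 1.0 or 0.0 reproduces the
              "use only one source" behavior switched by time or weekday.
    """
    keys = set(first) | set(second)
    return {
        k: w_first * first.get(k, 0.0) + (1 - w_first) * second.get(k, 0.0)
        for k in keys
    }
```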
- the feature specificity information integration unit 253 outputs the calculated feature specificity information to the feature person determination unit 201.
- The operation of the feature person determination unit 201 is the same as that of the feature person determination unit 201 shown in FIG. 5.
- As described above, in this embodiment the feature person selection unit 103 can select characteristic persons in a manner that takes advantage of both the accumulated specificity information and the specificity information determined from the monitoring video.
- Each process in the person tracking apparatus according to the first to third embodiments described above may be realized as a program that operates in an arbitrary computer.
- the program can be stored and provided to a computer using various types of non-transitory computer readable media.
- Non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, and RAM (Random Access Memory)).
- The program may also be supplied to the computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. A transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or optical fiber, or via a wireless communication path.
- (Appendix 1) A person tracking device comprising: person area information extraction means for detecting a person area, which is an area to which a person included in the video belongs, and generating person area information describing information on the person area; companion determination means for identifying, based on the person area information and information specifying the tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information which is information describing the companion; characteristic person selection means for selecting, from among the companions specified by the companion information, a characteristic person who is a person having a distinctive feature amount using the person area information, and generating characteristic person information which is information describing the characteristic person; and person tracking means for calculating a characteristic person tracking result, which is a tracking result of the characteristic person, based on the person area information and the characteristic person information.
- (Appendix 2) The companion information includes information specifying the tracking target person; the characteristic person selection means calculates tracking target person relative position information representing a relative position between the tracking target person and the characteristic person; and the person tracking device further includes tracking result calculation means for calculating a tracking result of the tracking target person from the characteristic person tracking result and the tracking target person relative position information. The person tracking device according to appendix 1.
- (Appendix 3) The characteristic person selection means includes: feature specificity information storage means for storing feature specificity information describing information on the specificity of feature amount values; and characteristic person determination means for calculating the feature amount of each of the companions specified by the companion information based on the person area information, calculating the specificity of each companion's feature amount based on the feature specificity information, and selecting the characteristic persons in order of relatively high specificity. The person tracking device according to appendix 1 or 2.
- The characteristic person selection means includes: feature specificity determination means for calculating feature specificity information, which is information on the specificity of feature amount values, based on the feature amounts of the persons described in the person area information; and characteristic person determination means for calculating the feature amount of each of the companions specified by the companion information based on the person area information, calculating the specificity of each companion's feature amount based on the feature specificity information, and selecting the characteristic persons in order of relatively high specificity. The person tracking device according to appendix 1 or 2.
- The characteristic person selection means includes: feature specificity determination means for calculating first feature specificity information, which is information on the specificity of feature amount values, based on the feature amounts of the persons described in the person area information; feature specificity information storage means for storing second feature specificity information describing information on the specificity of feature amount values; feature specificity information integration means for calculating integrated feature specificity information obtained by integrating the first feature specificity information and the second feature specificity information; and characteristic person determination means for calculating the feature amount of each of the companions specified by the companion information based on the person area information, calculating the specificity of each companion's feature amount based on the integrated feature specificity information, and selecting the characteristic persons in order of relatively high specificity.
- (Appendix 7) The person tracking apparatus according to appendix 3, wherein the feature specificity information storage means changes the stored feature specificity information according to at least one of the current place, season, and time.
- the feature specificity information integration means calculates an average value from the first feature specificity information and the second feature specificity information, and generates the integrated feature specificity information from the average value.
- The feature specificity information integration means weights at least one of the first feature specificity information and the second feature specificity information, calculates an average value from both, and generates the integrated feature specificity information from the average value.
- The companion determination means specifies the information on the tracking target person included in the person area information based on the information specifying the tracking target person, and specifies the companions based on the specified information. The person tracking device according to any one of appendices 1 to 9.
- The companion determination means groups persons with close positions based on the position information of each person included in the person area information, specifies the group to which the tracking target person belongs based on the information specifying the tracking target person, and calculates the companion information based on the specified group. The person tracking device according to any one of appendices 1 to 9.
- (Appendix 12) A person tracking method comprising: detecting a person area, which is an area to which a person included in video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
- (Appendix 13) A non-transitory computer-readable medium storing a person tracking program that causes a computer to execute processing for tracking a person included in video, the processing comprising: detecting a person area, which is an area to which a person included in the video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
- According to the present invention, a person can be tracked with a surveillance camera and the person's position at a specific time can be calculated. An arbitrary system can therefore provide the tracking target person with information corresponding to that position.
- For example, the present invention can be applied to a child-watching service that transmits a child's tracking result to a guardian.
- The present invention can also be applied to tracking a specific person in a general security system.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
The person tracking device according to the present invention includes: person area information extraction means for detecting a person area, which is an area to which a person included in video belongs, and generating person area information describing information on the person area; companion determination means for identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; feature person selection means for selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and person tracking means for calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
The person tracking method according to the present invention detects a person area, which is an area to which a person included in video belongs, and generates person area information describing information on the person area; identifies, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generates companion information, which is information describing the companion; selects, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generates feature person information, which is information describing the feature person; and calculates a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
The non-transitory computer-readable medium according to the present invention stores a program that causes a computer to execute processing for tracking a person included in video. The processing detects a person area, which is an area to which a person included in the video belongs, and generates person area information describing information on the person area; identifies, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generates companion information, which is information describing the companion; selects, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generates feature person information, which is information describing the feature person; and calculates a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing the configuration of the person tracking device according to the first embodiment. The person tracking device 100 includes a person area information extraction unit 101, a companion determination unit 102, a feature person selection unit 103, a person tracking unit 104, and a tracking result calculation unit 105.
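For illustration only, the cooperation of units 101 to 105 can be sketched as follows. This minimal Python example is a hypothetical simplification, not the disclosed implementation: the data layout, the fixed distance radius, and the rarity-based selection rule are all assumptions made for the sketch.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class PersonRegion:
    """One detected person in one frame (illustrative fields only)."""
    person_id: int
    position: tuple          # (x, y) floor-plane coordinates
    feature: str             # e.g. dominant clothing colour

def find_companions(regions, target_id, radius=2.0):
    """Unit 102 (sketch): persons within `radius` of the tracking target."""
    tx, ty = next(r.position for r in regions if r.person_id == target_id)
    return [r for r in regions
            if r.person_id != target_id
            and (r.position[0] - tx) ** 2 + (r.position[1] - ty) ** 2 <= radius ** 2]

def select_feature_person(regions, companions):
    """Unit 103 (sketch): the companion whose feature value is rarest in the scene."""
    freq = Counter(r.feature for r in regions)
    return min(companions, key=lambda r: freq[r.feature], default=None)

def track(frames, target_id):
    """Units 101-105 in sequence, one result per frame (highly simplified)."""
    for regions in frames:                        # 101: person area information
        companions = find_companions(regions, target_id)   # 102
        fp = select_feature_person(regions, companions)    # 103
        if fp is not None:
            yield fp.person_id, fp.position       # 104/105 would refine this

frame = [PersonRegion(0, (0.0, 0.0), "red"),
         PersonRegion(1, (1.0, 0.5), "black"),
         PersonRegion(2, (5.0, 5.0), "black")]
print(list(track([frame], target_id=0)))          # -> [(1, (1.0, 0.5))]
```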
The person tracking device according to the second embodiment differs from the person tracking device of the first embodiment in the configuration of the feature person selection unit 103. The differences from the first embodiment are described below.
The person tracking device according to the third embodiment is characterized in that the feature person selection unit 103 includes both the feature specificity information storage unit 202 and the feature specificity determination unit 250 described above. The differences from the first and second embodiments are described below.
(Appendix 1) A person tracking device comprising: person area information extraction means for detecting a person area, which is an area to which a person included in video belongs, and generating person area information describing information on the person area; companion determination means for identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; feature person selection means for selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and person tracking means for calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
(Appendix 2) The person tracking device according to appendix 1, wherein the companion information includes the information specifying the tracking target person, the feature person selection means calculates tracking-target-person relative position information representing the relative position between the tracking target person and the feature person, and the person tracking device further comprises tracking result calculation means for calculating a tracking result of the tracking target person from the feature person tracking result and the tracking-target-person relative position information.
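As a rough illustration of the calculation in appendix 2, the following minimal Python sketch recovers the tracking target's positions from the feature person's track and a stored relative offset. The function name and the assumption that the offset stays roughly constant while the group walks together are hypothetical, not part of the disclosure.

```python
def estimate_target_position(feature_track, relative_offset):
    """Subtract the target-to-feature-person offset from each tracked point
    of the feature person to recover the target's positions (a sketch)."""
    dx, dy = relative_offset
    return [(x - dx, y - dy) for (x, y) in feature_track]

# Offset measured when both persons were visible: the feature person walked
# 1.0 m to the right of and 0.5 m ahead of the tracking target.
track = [(1.0, 0.5), (2.0, 1.5), (3.0, 2.5)]
print(estimate_target_position(track, (1.0, 0.5)))
# -> [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
```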
(Appendix 3) The person tracking device according to appendix 1 or 2, wherein the feature person selection means includes: feature specificity information storage means for storing feature specificity information, which describes information on the specificity of feature values; and feature person determination means for calculating a feature value of each companion specified by the companion information based on the person area information, calculating the specificity of each companion's feature value based on the feature specificity information, and selecting the feature persons in descending order of specificity.
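A minimal Python sketch of the appendix-3 selection follows. The stored table, its values, and the function name are hypothetical; any pre-stored feature specificity information would serve.

```python
# Hypothetical pre-stored feature specificity information:
# higher value = rarer appearance of that feature value in this environment.
SPECIFICITY = {"black": 0.1, "grey": 0.2, "red": 0.8, "yellow": 0.9}

def select_feature_persons(companions, k=1):
    """Order companions by the stored specificity of their feature value
    and keep the top k as feature persons (descending specificity)."""
    ranked = sorted(companions,
                    key=lambda c: SPECIFICITY.get(c["feature"], 0.0),
                    reverse=True)
    return ranked[:k]

companions = [{"id": 7, "feature": "black"}, {"id": 9, "feature": "yellow"}]
print(select_feature_persons(companions))  # -> [{'id': 9, 'feature': 'yellow'}]
```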
(Appendix 4) The person tracking device according to appendix 1 or 2, wherein the feature person selection means includes: feature specificity determination means for calculating feature specificity information, which is information on the specificity of feature values, based on the feature values of the persons described in the person area information; and feature person determination means for calculating a feature value of each companion specified by the companion information based on the person area information, calculating the specificity of each companion's feature value based on the feature specificity information, and selecting the feature persons in descending order of specificity.
(Appendix 5) The person tracking device according to appendix 1 or 2, wherein the feature person selection means includes: feature specificity determination means for calculating first feature specificity information, which is information on the specificity of feature values, based on the feature value of each person described in the person area information; feature specificity information storage means for storing second feature specificity information, which describes information on the specificity of feature values; feature specificity information integration means for calculating integrated feature specificity information by integrating the first feature specificity information and the second feature specificity information; and feature person determination means for calculating a feature value of each companion specified by the companion information based on the person area information, calculating the specificity of each companion's feature value based on the integrated feature specificity information, and selecting the feature persons in descending order of specificity.
(Appendix 6) The person tracking device according to appendix 4 or 5, wherein the feature specificity determination means sets the specificity higher as the appearance frequency of the feature value decreases.
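One concrete way to satisfy appendix 6 is sketched below in Python: taking specificity as the negative log of the empirical probability. This particular formula is an assumption for illustration; any monotonically decreasing function of frequency would fit the appendix.

```python
import math
from collections import Counter

def specificity_from_frequencies(feature_values):
    """Assign each feature value a specificity that rises as its
    appearance frequency drops (here: -log of empirical probability)."""
    counts = Counter(feature_values)
    total = sum(counts.values())
    return {v: -math.log(c / total) for v, c in counts.items()}

observed = ["black"] * 8 + ["grey"] * 3 + ["red"]
spec = specificity_from_frequencies(observed)
print(max(spec, key=spec.get))  # -> 'red': the rarest value is the most specific
```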
(Appendix 7) The person tracking device according to appendix 3, wherein the feature specificity information storage means changes the feature specificity information to be stored according to at least one of the current position, the season, and the time.
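A minimal sketch of the context-dependent lookup in appendix 7, assuming hypothetical tables keyed by location and season (the season rule and all values are purely illustrative):

```python
from datetime import datetime

# Hypothetical stored tables: which clothing colours are rare depends on
# where and when the video was taken.
TABLES = {
    ("station", "winter"): {"black": 0.1, "yellow": 0.9},
    ("station", "summer"): {"black": 0.4, "yellow": 0.5},
}

def current_table(location, now=None):
    """Pick the stored feature specificity table for the current context."""
    now = now or datetime.now()
    season = "winter" if now.month in (12, 1, 2) else "summer"
    return TABLES.get((location, season), {})

print(current_table("station", datetime(2011, 1, 15)))  # -> the winter table
```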
(Appendix 8) The person tracking device according to appendix 5, wherein the feature specificity information integration means calculates an average of the first feature specificity information and the second feature specificity information and generates the integrated feature specificity information from that average.
(Appendix 9) The person tracking device according to appendix 5, wherein the feature specificity information integration means weights at least one of the first feature specificity information and the second feature specificity information, calculates an average of the two, and generates the integrated feature specificity information based on that average.
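The integration of appendices 8 and 9 can be sketched in a few lines of Python; the per-value dictionaries, the default of 0.0 for missing values, and the weight parameter are assumptions made for the sketch.

```python
def integrate_specificity(first, second, w=0.5):
    """Blend per-value specificities from the current video (`first`) and
    the stored table (`second`) by a weighted average: w=0.5 gives the plain
    average of appendix 8, other weights the weighted form of appendix 9."""
    values = set(first) | set(second)
    return {v: w * first.get(v, 0.0) + (1.0 - w) * second.get(v, 0.0)
            for v in values}

first = {"red": 2.5, "black": 0.4}                 # computed from the scene
second = {"red": 1.5, "black": 0.6, "yellow": 2.0} # accumulated beforehand
print(integrate_specificity(first, second, w=0.7))
# -> {'red': 2.2, 'black': 0.46, 'yellow': 0.6} (up to dict ordering)
```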
(Appendix 10) The person tracking device according to any one of appendices 1 to 9, wherein the companion determination means identifies the information on the tracking target person included in the person area information based on the information specifying the tracking target person, and identifies the companion based on the identified information.
(Appendix 11) The person tracking device according to any one of appendices 1 to 9, wherein the companion determination means groups persons whose positions are close to each other based on the position information of each person included in the person area information, identifies the group to which the tracking target person belongs based on the information specifying the tracking target person, and calculates the companion information based on the identified group.
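For illustration, the grouping in appendix 11 can be realized with a transitive union of nearby persons. The union-find implementation and the distance threshold below are hypothetical choices for the sketch, not the disclosed method.

```python
def group_by_proximity(positions, threshold=1.5):
    """Union persons whose mutual distance is below `threshold`,
    transitively, then report the resulting groups of indices."""
    parent = list(range(len(positions)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i, (xi, yi) in enumerate(positions):
        for j, (xj, yj) in enumerate(positions[i + 1:], start=i + 1):
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= threshold ** 2:
                parent[find(i)] = find(j)   # merge the two groups

    groups = {}
    for i in range(len(positions)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

pos = [(0, 0), (1, 0), (2, 0), (8, 8)]
print(group_by_proximity(pos))  # -> [[0, 1, 2], [3]]
```

The tracking target's group is then the one containing its index, and every other member of that group becomes a companion.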
(Appendix 12) A person tracking method comprising: detecting a person area, which is an area to which a person included in video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
(Appendix 13) A non-transitory computer-readable medium storing a person tracking program that causes a computer to execute processing for tracking a person included in video, the processing comprising: detecting a person area, which is an area to which a person included in the video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
2 voxel generation means
3 person color feature extraction means
4 person tracking means
100 person tracking device
101 person area information extraction unit
102 companion determination unit
103 feature person selection unit
104 person tracking unit
105 tracking result calculation unit
201 feature person determination unit
202 feature specificity information storage unit
250 feature specificity determination unit
253 feature specificity information integration unit
Claims (10)
- A person tracking device comprising: person area information extraction means for detecting a person area, which is an area to which a person included in video belongs, and generating person area information describing information on the person area; companion determination means for identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; feature person selection means for selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and person tracking means for calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
- The person tracking device according to claim 1, wherein the companion information includes the information specifying the tracking target person, the feature person selection means calculates tracking-target-person relative position information representing the relative position between the tracking target person and the feature person, and the person tracking device further comprises tracking result calculation means for calculating a tracking result of the tracking target person from the feature person tracking result and the tracking-target-person relative position information.
- The person tracking device according to claim 1 or 2, wherein the feature person selection means includes: feature specificity information storage means for storing feature specificity information, which describes information on the specificity of feature values; and feature person determination means for calculating a feature value of each companion specified by the companion information based on the person area information, calculating the specificity of each companion's feature value based on the feature specificity information, and selecting the feature persons in descending order of specificity.
- The person tracking device according to claim 1 or 2, wherein the feature person selection means includes: feature specificity determination means for calculating feature specificity information, which is information on the specificity of feature values, based on the feature values of the persons described in the person area information; and feature person determination means for calculating a feature value of each companion specified by the companion information based on the person area information, calculating the specificity of each companion's feature value based on the feature specificity information, and selecting the feature persons in descending order of specificity.
- The person tracking device according to claim 1 or 2, wherein the feature person selection means includes: feature specificity determination means for calculating first feature specificity information, which is information on the specificity of feature values, based on the feature value of each person described in the person area information; feature specificity information storage means for storing second feature specificity information, which describes information on the specificity of feature values; feature specificity information integration means for calculating integrated feature specificity information by integrating the first feature specificity information and the second feature specificity information; and feature person determination means for calculating a feature value of each companion specified by the companion information based on the person area information, calculating the specificity of each companion's feature value based on the integrated feature specificity information, and selecting the feature persons in descending order of specificity.
- The person tracking device according to claim 4 or 5, wherein the feature specificity determination means sets the specificity higher as the appearance frequency of the feature value decreases.
- The person tracking device according to claim 3, wherein the feature specificity information storage means changes the feature specificity information to be stored according to at least one of the current position, the season, and the time.
- The person tracking device according to claim 5, wherein the feature specificity information integration means calculates an average of the first feature specificity information and the second feature specificity information and generates the integrated feature specificity information from that average.
- A person tracking method comprising: detecting a person area, which is an area to which a person included in video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
- A non-transitory computer-readable medium storing a person tracking program that causes a computer to execute processing for tracking a person included in video, the processing comprising: detecting a person area, which is an area to which a person included in the video belongs, and generating person area information describing information on the person area; identifying, based on the person area information and information specifying a tracking target person, at least one companion accompanying the tracking target person from among the persons included in the person area information, and generating companion information, which is information describing the companion; selecting, from among the companions specified by the companion information, a feature person, which is a person having a distinctive feature value, using the person area information, and generating feature person information, which is information describing the feature person; and calculating a feature person tracking result, which is a result of tracking the feature person, based on the person area information and the feature person information.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/001,251 US9235754B2 (en) | 2011-03-28 | 2011-10-26 | Person tracking device, person tracking method, and non-transitory computer readable medium storing person tracking program |
RU2013147808/08A RU2546327C1 (ru) | 2011-03-28 | 2011-10-26 | Устройство для отслеживания человека, способ отслеживания человека и невременный машиночитаемый носитель, хранящий программу для отслеживания человека |
JP2013506855A JP5870996B2 (ja) | 2011-03-28 | 2011-10-26 | 人物追跡装置、人物追跡方法および人物追跡プログラム |
EP11862298.4A EP2693404B1 (en) | 2011-03-28 | 2011-10-26 | Person tracking device, person tracking method, and non-temporary computer-readable medium storing person tracking program |
BR112013025032A BR112013025032A2 (pt) | 2011-03-28 | 2011-10-26 | dispositivo de rastreamento de pessoa, método de rastreamento de pessoa, e meio legível por computador não transitório que armazena programa de rastreamento de pessoa |
CN201180069391.7A CN103430214B (zh) | 2011-03-28 | 2011-10-26 | 人员跟踪设备和人员跟踪方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011070114 | 2011-03-28 | ||
JP2011-070114 | 2011-03-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012131816A1 true WO2012131816A1 (ja) | 2012-10-04 |
Family
ID=46929659
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/005973 WO2012131816A1 (ja) | 2011-03-28 | 2011-10-26 | 人物追跡装置、人物追跡方法および人物追跡プログラムを格納した非一時的なコンピュータ可読媒体 |
Country Status (8)
Country | Link |
---|---|
US (1) | US9235754B2 (ja) |
EP (1) | EP2693404B1 (ja) |
JP (1) | JP5870996B2 (ja) |
CN (1) | CN103430214B (ja) |
BR (1) | BR112013025032A2 (ja) |
MY (1) | MY167470A (ja) |
RU (1) | RU2546327C1 (ja) |
WO (1) | WO2012131816A1 (ja) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10248868B2 (en) * | 2012-09-28 | 2019-04-02 | Nec Corporation | Information processing apparatus, information processing method, and information processing program |
JP6148505B2 (ja) * | 2013-03-21 | 2017-06-14 | 株式会社東芝 | 在室確率推定装置およびその方法、ならびにプログラム |
CN103984955B (zh) * | 2014-04-23 | 2017-02-22 | 浙江工商大学 | 基于显著性特征和迁移增量学习的多摄像机目标识别方法 |
JP6428144B2 (ja) * | 2014-10-17 | 2018-11-28 | オムロン株式会社 | エリア情報推定装置、エリア情報推定方法、および空気調和装置 |
US10687022B2 (en) | 2014-12-05 | 2020-06-16 | Avigilon Fortress Corporation | Systems and methods for automated visual surveillance |
US20160165191A1 (en) * | 2014-12-05 | 2016-06-09 | Avigilon Fortress Corporation | Time-of-approach rule |
US20160182814A1 (en) * | 2014-12-19 | 2016-06-23 | Microsoft Technology Licensing, Llc | Automatic camera adjustment to follow a target |
CN105718905A (zh) * | 2016-01-25 | 2016-06-29 | 大连楼兰科技股份有限公司 | 基于行人特征与车载摄像头的盲人检测与识别方法与系统 |
CN105718904A (zh) * | 2016-01-25 | 2016-06-29 | 大连楼兰科技股份有限公司 | 基于组合特征与车载摄像头的盲人检测与识别方法与系统 |
CN105718907A (zh) * | 2016-01-25 | 2016-06-29 | 大连楼兰科技股份有限公司 | 基于导盲犬特征与车载摄像头的盲人检测识别方法与系统 |
JP6776719B2 (ja) * | 2016-08-17 | 2020-10-28 | 富士通株式会社 | 移動体群検出プログラム、移動体群検出装置、及び移動体群検出方法 |
US11049260B2 (en) * | 2016-10-19 | 2021-06-29 | Nec Corporation | Image processing device, stationary object tracking system, image processing method, and recording medium |
US20180232647A1 (en) * | 2017-02-10 | 2018-08-16 | International Business Machines Corporation | Detecting convergence of entities for event prediction |
CN107316463A (zh) * | 2017-07-07 | 2017-11-03 | 深圳市诺龙技术股份有限公司 | 一种车辆监控的方法和装置 |
CN107370989A (zh) * | 2017-07-31 | 2017-11-21 | 上海与德科技有限公司 | 目标寻找方法及服务器 |
CN107862240B (zh) * | 2017-09-19 | 2021-10-08 | 中科(深圳)科技服务有限公司 | 一种多摄像头协同的人脸追踪方法 |
CN108897777B (zh) * | 2018-06-01 | 2022-06-17 | 深圳市商汤科技有限公司 | 目标对象追踪方法及装置、电子设备和存储介质 |
DE102018214635A1 (de) | 2018-08-29 | 2020-03-05 | Robert Bosch Gmbh | Verfahren zur Vorhersage zumindest eines zukünftigen Geschwindigkeitsvektors und/oder einer zukünftigen Pose eines Fußgängers |
CN110781733B (zh) * | 2019-09-17 | 2022-12-06 | 浙江大华技术股份有限公司 | 图像去重方法、存储介质、网络设备和智能监控系统 |
US20220343673A1 (en) * | 2019-09-27 | 2022-10-27 | Nec Corporation | Information processing apparatus, information processing method and storage medium |
EP3907650A1 (en) * | 2020-05-07 | 2021-11-10 | IDEMIA Identity & Security Germany AG | Method to identify affiliates in video data |
CN111739065A (zh) * | 2020-06-29 | 2020-10-02 | 上海出版印刷高等专科学校 | 基于数码印花的目标识别方法、系统、电子设备和介质 |
JP2022177392A (ja) * | 2021-05-18 | 2022-12-01 | 富士通株式会社 | 制御プログラム、制御方法、および情報処理装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005250692A (ja) | 2004-03-02 | 2005-09-15 | Softopia Japan Foundation | 物体の同定方法、移動体同定方法、物体同定プログラム、移動体同定プログラム、物体同定プログラム記録媒体、移動体同定プログラム記録媒体 |
JP2006092396A (ja) * | 2004-09-27 | 2006-04-06 | Oki Electric Ind Co Ltd | 単独行動者及びグループ行動者検知装置 |
WO2006080367A1 (ja) * | 2005-01-28 | 2006-08-03 | Olympus Corporation | 粒子群運動解析システム、粒子群運動解析方法及びプログラム |
JP2008117264A (ja) * | 2006-11-07 | 2008-05-22 | Chuo Electronics Co Ltd | 不正通過者検出装置及びこれを利用した不正通過者録画システム |
JP2009075802A (ja) * | 2007-09-20 | 2009-04-09 | Giken Torasutemu Kk | 人物行動検索装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4240957B2 (ja) * | 2002-08-30 | 2009-03-18 | 日本電気株式会社 | 物体追跡装置、物体追跡方法および物体追跡プログラム |
RU2370817C2 (ru) | 2004-07-29 | 2009-10-20 | Самсунг Электроникс Ко., Лтд. | Система и способ отслеживания объекта |
EP1805684A4 (en) * | 2004-10-12 | 2008-10-22 | Samsung Electronics Co Ltd | METHOD, MEDIUM AND DEVICE FOR PERSON-BASED PHOTOCLUSTERING IN A DIGITAL PHOTO ALBUM AND METHOD, MEDIUM AND DEVICE FOR CREATING A PERSON-BASED DIGITAL PHOTOALBUM |
WO2006048809A1 (en) * | 2004-11-04 | 2006-05-11 | Koninklijke Philips Electronics N.V. | Face recognition |
US7479299B2 (en) | 2005-01-26 | 2009-01-20 | Honeywell International Inc. | Methods of forming high strength coatings |
US20080166020A1 (en) * | 2005-01-28 | 2008-07-10 | Akio Kosaka | Particle-Group Movement Analysis System, Particle-Group Movement Analysis Method and Program |
EP1915874A2 (de) * | 2005-08-17 | 2008-04-30 | SeeReal Technologies GmbH | Verfahren und schaltungsanordnung zum erkennen und verfolgen von augen mehrerer betrachter in echtzeit |
CN101582166A (zh) * | 2008-05-12 | 2009-11-18 | 皇家飞利浦电子股份有限公司 | 目标的跟踪系统和方法 |
US8284990B2 (en) * | 2008-05-21 | 2012-10-09 | Honeywell International Inc. | Social network construction based on data association |
JP5144487B2 (ja) * | 2008-12-15 | 2013-02-13 | キヤノン株式会社 | 主顔選択装置、その制御方法、撮像装置及びプログラム |
US8320617B2 (en) * | 2009-03-27 | 2012-11-27 | Utc Fire & Security Americas Corporation, Inc. | System, method and program product for camera-based discovery of social networks |
WO2010143290A1 (ja) * | 2009-06-11 | 2010-12-16 | 富士通株式会社 | 不審者検出装置、不審者検出方法、および、不審者検出プログラム |
2011
- 2011-10-26 BR BR112013025032A patent/BR112013025032A2/pt active Search and Examination
- 2011-10-26 EP EP11862298.4A patent/EP2693404B1/en active Active
- 2011-10-26 RU RU2013147808/08A patent/RU2546327C1/ru not_active IP Right Cessation
- 2011-10-26 US US14/001,251 patent/US9235754B2/en active Active
- 2011-10-26 WO PCT/JP2011/005973 patent/WO2012131816A1/ja active Application Filing
- 2011-10-26 MY MYPI2013701464A patent/MY167470A/en unknown
- 2011-10-26 CN CN201180069391.7A patent/CN103430214B/zh active Active
- 2011-10-26 JP JP2013506855A patent/JP5870996B2/ja active Active
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014155159A (ja) * | 2013-02-13 | 2014-08-25 | Nec Corp | 情報処理システム、情報処理方法及びプログラム |
JPWO2015064292A1 (ja) * | 2013-10-30 | 2017-03-09 | 日本電気株式会社 | 画像の特徴量に関する処理システム、処理方法及びプログラム |
US10140555B2 (en) | 2013-10-30 | 2018-11-27 | Nec Corporation | Processing system, processing method, and recording medium |
WO2016132769A1 (ja) * | 2015-02-19 | 2016-08-25 | シャープ株式会社 | 撮影装置、撮影装置の制御方法、および制御プログラム |
JP2016201758A (ja) * | 2015-04-14 | 2016-12-01 | パナソニックIpマネジメント株式会社 | 施設内人物捜索支援装置、施設内人物捜索支援システムおよび施設内人物捜索支援方法 |
JP2017157127A (ja) * | 2016-03-04 | 2017-09-07 | Necソリューションイノベータ株式会社 | 捜索支援装置、捜索支援方法、及びプログラム |
JP2018084924A (ja) * | 2016-11-22 | 2018-05-31 | サン電子株式会社 | 管理装置及び管理システム |
JP7101331B2 (ja) | 2016-11-22 | 2022-07-15 | サン電子株式会社 | 管理装置及び管理システム |
WO2019131062A1 (ja) * | 2017-12-27 | 2019-07-04 | パイオニア株式会社 | 判定装置及び情報記録装置、判定方法並びに判定用プログラム |
JPWO2019131062A1 (ja) * | 2017-12-27 | 2021-01-07 | パイオニア株式会社 | 判定装置及び情報記録装置、判定方法並びに判定用プログラム |
Also Published As
Publication number | Publication date |
---|---|
CN103430214B (zh) | 2016-10-26 |
US20130329958A1 (en) | 2013-12-12 |
CN103430214A (zh) | 2013-12-04 |
RU2546327C1 (ru) | 2015-04-10 |
MY167470A (en) | 2018-08-29 |
JPWO2012131816A1 (ja) | 2014-07-24 |
BR112013025032A2 (pt) | 2017-01-10 |
EP2693404A4 (en) | 2016-02-17 |
EP2693404B1 (en) | 2019-04-24 |
RU2013147808A (ru) | 2015-05-10 |
JP5870996B2 (ja) | 2016-03-01 |
EP2693404A1 (en) | 2014-02-05 |
US9235754B2 (en) | 2016-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5870996B2 (ja) | 人物追跡装置、人物追跡方法および人物追跡プログラム | |
JP6968645B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
Gowsikhaa et al. | Automated human behavior analysis from surveillance videos: a survey | |
US20130136304A1 (en) | Apparatus and method for controlling presentation of information toward human object | |
US8432445B2 (en) | Air conditioning control based on a human body activity amount | |
Jalal et al. | Depth map-based human activity tracking and recognition using body joints features and self-organized map | |
WO2015131734A1 (zh) | 一种前视监视场景下的行人计数方法、装置和存储介质 | |
US9305217B2 (en) | Object tracking system using robot and object tracking method using a robot | |
JP5271227B2 (ja) | 群衆監視装置および方法ならびにプログラム | |
WO2018084191A1 (ja) | 混雑状況分析システム | |
JP2016057998A (ja) | 物体識別方法 | |
CN109583373A (zh) | 一种行人重识别实现方法 | |
Ramos et al. | Fast-forward video based on semantic extraction | |
Afsar et al. | Automatic human trajectory destination prediction from video | |
JP2018081654A (ja) | 検索装置、表示装置および検索方法 | |
JP2021149687A (ja) | 物体認識装置、物体認識方法及び物体認識プログラム | |
CN107665495B (zh) | 对象跟踪方法及对象跟踪装置 | |
Willems et al. | A video-based algorithm for elderly fall detection | |
Sharma et al. | NAVI: Navigation aid for the visually impaired | |
Verma et al. | Prediction of satellite images using fuzzy rule based Gaussian regression | |
KR101564760B1 (ko) | 범죄 사건의 예측을 위한 영상 처리 장치 및 방법 | |
Reljin et al. | Small moving targets detection using outlier detection algorithms | |
Kuplyakov et al. | Further improvement on an MCMC-based video tracking algorithm | |
US20230206641A1 (en) | Storage medium, information processing method, and information processing apparatus | |
Rezaei et al. | Distibuted human tracking in smart camera networks by adaptive particle filtering and data fusion |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11862298; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2013506855; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 14001251; Country of ref document: US
| WWE | Wipo information: entry into national phase | Ref document number: 1301005429; Country of ref document: TH
| NENP | Non-entry into the national phase | Ref country code: DE
| ENP | Entry into the national phase | Ref document number: 2013147808; Country of ref document: RU; Kind code of ref document: A
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112013025032
| ENP | Entry into the national phase | Ref document number: 112013025032; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20130927