WO2011102072A1 - Object tracking device, object tracking method, and object tracking program - Google Patents
Object tracking device, object tracking method, and object tracking program
- Publication number
- WO2011102072A1 (PCT/JP2011/000135)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- likelihood
- image
- particle
- tracking
- particles
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
Definitions
- the present invention relates to an object tracking device, an object tracking method, and an object tracking program for tracking the position of an object projected on an image using a particle filter.
- Conventionally, the position of an object such as a person shown in an image is tracked using a particle filter (see, for example, Patent Document 1 and Non-Patent Document 1).
- The technique described in Patent Document 1 and the technique described in Non-Patent Document 1 (hereinafter collectively referred to as "the prior art") first determine the feature amount of the image of an object in a video.
- Next, the prior art generates, from the position of the object at time t-1, a plurality of particles indicating candidates for the position of the object at the next time t, and matches the feature amount of each particle at time t against the feature amount of the object at time t-1.
- The prior art then calculates, for each particle, the likelihood that the particle is the position of the object at time t from this similarity.
- Finally, the prior art estimates the position of the particle with the highest likelihood for each object as the position of the object at time t. Thereby, the prior art can keep track of the position of the same object.
- When the feature amount is a shape feature amount indicating the contour curve of an object, the technique described in Patent Document 1 corrects the above-described likelihood using a color feature amount. Specifically, the technique described in Patent Document 1 first calculates, for each particle, a color feature amount, for example a color histogram of the region inside the contour curve. It then calculates the similarity between the color histogram of the target object and the color histogram of each particle using the histogram intersection, and corrects the above-described likelihood based on the result. As a result, even when another object with a similar contour curve is located near the target object, the technique described in Patent Document 1 can reduce the possibility that the other object is tracked by mistake.
- However, when a plurality of objects with similar appearance exist, the image feature amounts of the objects resemble one another and are difficult to distinguish. Therefore, the prior art cannot sufficiently reduce the possibility that another object is erroneously tracked (hereinafter referred to as "mistracking").
- An object of the present invention is to provide an object tracking device, an object tracking method, and an object tracking program that can further reduce the possibility of erroneous tracking.
- The object tracking device of the present invention is an object tracking device that tracks the position of an object shown in video using a particle filter, and includes: a feature amount calculation unit that generates a plurality of particles indicating candidates for the position of the object and calculates a feature amount of the image of the object and a feature amount of the image of each particle; a likelihood calculation unit that calculates, for each particle, the likelihood that the particle is the position of the object from the similarity between the feature amount of the image of that particle and the feature amount of the image of the object; a position estimation unit that estimates the position of the object based on the calculated likelihoods of the particles; and a likelihood correction unit that corrects the likelihood when a plurality of the objects exist and a plurality of positions estimated corresponding to these objects overlap.
- The object tracking method of the present invention is an object tracking method for tracking the position of an object shown in video using a particle filter, and includes the steps of: generating a plurality of particles indicating candidates for the position of the object and calculating a feature amount of the image of the object and a feature amount of the image of each particle; calculating, for each particle, the likelihood that the particle is the position of the object from the similarity between the feature amount of the image of that particle and the feature amount of the image of the object; estimating the position of the object based on the calculated likelihoods of the particles; and correcting the likelihood when a plurality of the objects exist and a plurality of positions estimated corresponding to these objects overlap.
- The object tracking program of the present invention is an object tracking program for tracking the position of an object shown in video using a particle filter, and causes a computer to execute: a process of generating a plurality of particles indicating candidates for the position of the object and calculating a feature amount of the image of the object and a feature amount of the image of each particle; a process of calculating, for each particle, the likelihood that the particle is the position of the object from the similarity between the feature amount of the image of that particle and the feature amount of the image of the object; a process of estimating the position of the object based on the calculated likelihoods of the particles; and a process of correcting the likelihood when a plurality of the objects exist and a plurality of positions estimated corresponding to these objects overlap.
- According to the present invention, the likelihood of particles can be corrected when the estimated positions of objects overlap, so the possibility of erroneous tracking can be further reduced.
- FIG. 1 is a system configuration diagram showing a configuration of an object tracking system including an object tracking device according to an embodiment of the present invention.
- FIG. 2 is a diagram showing the definition of the position of an object in the present embodiment.
- FIG. 3 is a diagram for explaining tracking using a particle filter in the present embodiment.
- FIG. 4 is a block diagram showing the configuration of the object tracking device according to the present embodiment.
- FIG. 5 is a flowchart showing the overall operation of the object tracking device according to the present embodiment.
- FIG. 1 is a system configuration diagram showing a configuration of an object tracking system including an object tracking device according to an embodiment of the present invention.
- In the following, the present invention will be described using an example in which it is applied to a system that tracks, in captured video, the movements of a plurality of workers wearing the same work clothes in a factory.
- the object tracking system 100 includes an imaging device 200, an object tracking device 300, and a display device 400.
- the imaging device 200 and the display device 400 are connected to the object tracking device 300 in a communicable manner.
- the imaging device 200 is a device having an image acquisition function, for example, a digital video camera.
- The imaging device 200 captures the scene inside the factory and outputs time-series data of the captured images (captured video) to the object tracking device 300.
- the object tracking device 300 is a device having an object tracking function, for example, a personal computer.
- The object tracking device 300 uses a particle filter to track, from the captured video input from the imaging device 200, the position of each object on the image (hereinafter simply referred to as "position") (see, for example, Patent Document 1 and Non-Patent Document 1).
- the object tracking device 300 reduces the likelihood of particles that overlap with the positions of other objects among the particles used to estimate the position of the object.
- the object tracking device 300 generates an image (hereinafter referred to as “result display image”) in which the tracking result is visually superimposed on the captured video, and outputs the generated image to the display device 400.
- the display device 400 is a device having a function of displaying an image, for example, a liquid crystal display.
- the display device 400 displays the image (result display image) input from the object tracking device 300 on the screen.
- the object tracking device 300 configured in this way can reduce mistracking even when there are objects with similar image characteristics.
- A particle filter is an approximate calculation method for the Bayes filter.
- With a particle filter, the probability distribution of the position of the object at time t can be obtained by applying position prediction, likelihood observation, and resampling to the position detected at time t-1.
- Position prediction is the prediction of the position at time t based on a state transition model.
- Likelihood observation is obtaining the likelihood of each position based on the similarity of the feature amount of the image of the object to the feature amount of a reference image.
- Resampling is picking up values that discretize the probability density distribution of the positions.
- the reference image is an image registered as a tracking target, for example, an image of an object being tracked acquired in the past.
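For illustration only, one cycle of the particle filter described above can be sketched as follows in Python (a minimal sketch, not the claimed embodiment; the array layout and the helper functions `resample`, `predict`, and `observe`, which are sketched later alongside the corresponding processing steps, are assumptions):

```python
import numpy as np

def track_one_cycle(particles, likelihoods, frame, ref_hist):
    """One cycle of the filter: resample -> predict -> observe -> estimate.

    particles:   (N, 4) array of candidate rectangles [x, y, w, h].
    likelihoods: (N,) likelihood of each particle from the previous cycle.
    """
    particles = resample(particles, likelihoods)       # pick values from the discretized density
    particles = predict(particles)                     # position prediction (state transition model)
    likelihoods = observe(particles, frame, ref_hist)  # likelihood observation against the reference image
    # The likelihood-weighted mean approximates the most probable position.
    estimate = np.average(particles, axis=0, weights=likelihoods)
    return particles, likelihoods, estimate
```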
- the object tracking device 300 periodically detects the object from the captured video by image processing.
- FIG. 2 is a diagram showing the definition of the position of the object in the present embodiment.
- the object tracking device 300 acquires information defining the position of the detected object 510 using the XY axes set on the image plane.
- Information defining the position of the object 510 is, for example, a parameter set including the upper left coordinates (x, y), the width w, and the height h of the rectangular frame 520 that circumscribes the object 510.
- the object 510 may be the whole body of the worker or may be another part such as the upper body part of the worker.
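For illustration, this parameter set could be held in a simple structure such as the following (a hypothetical sketch; the names are illustrative and not taken from the embodiment):

```python
from dataclasses import dataclass

@dataclass
class ObjectPosition:
    x: float  # X coordinate of the upper-left corner of the circumscribing rectangle
    y: float  # Y coordinate of the upper-left corner
    w: float  # width of the rectangle
    h: float  # height of the rectangle

    @property
    def area(self) -> float:
        # Used later when computing the overlap ratio Cr = B / Pa.
        return self.w * self.h
```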
- FIG. 3 is a diagram for explaining an outline of tracking using a particle filter.
- In FIG. 3, the horizontal axis conceptually represents each position.
- In each period in which the position of the object is detected, the object tracking device 300 generates particles that give a concrete, discrete representation of the probability density distribution of the position, using the position detected at time t-1 of the immediately preceding period as the detection result (S2). The average position of the probability density distribution approximated by the generated particles is a position that is highly likely to be the actual position of the object at time t.
- the object tracking device 300 moves each particle using the state transition model (S3).
- The density of the moved particles discretely represents the true probability density distribution at time t of the next period (represented by line 510). That is, the average position of the moved particles is a position that is highly likely to be the actual position of the object at time t.
- the object tracking device 300 calculates, for each particle after movement, a similarity to an image registered as a tracking target as a likelihood (S4). Then, the object tracking device 300 estimates the weighted average position of the probability distribution calculated based on the likelihood for each particle as the position of the object at time t.
- Moreover, the object tracking device 300 corrects the likelihoods of particles when a plurality of objects exist and a plurality of positions estimated corresponding to these objects overlap. Specifically, when the size of the overlap region of a particle of a second object with respect to the position of a first object is equal to or greater than a first predetermined value, the object tracking device 300 lowers the likelihood of that particle. As a result, the object tracking device 300 can reduce the possibility of mistracking caused by being dragged by the image of another object.
- Furthermore, the object tracking device 300 takes into account the case where the positions of objects are actually displayed overlapping each other; using the fact that the likelihood is not very high in such a case, it limits the likelihood correction to particles whose likelihood is equal to or greater than a second predetermined value.
- Thereby, the object tracking device 300 can avoid lowering the likelihood, and thereby losing the position estimate, when the positions of the objects are actually displayed overlapping each other.
- FIG. 4 is a block diagram showing the configuration of the object tracking device 300.
- As shown in FIG. 4, the object tracking device 300 includes an image acquisition unit 310, an image storage unit 320, a tracking instruction unit 330, a feature amount calculation unit 340, a likelihood calculation unit 350, a position estimation unit 360, a position storage unit 370, an overlap ratio calculation unit 380, and a likelihood correction unit 390.
- the image acquisition unit 310 acquires an image from the imaging device 200 and outputs it to the image storage unit 320.
- the image storage unit 320 stores the image input from the image acquisition unit 310.
- The tracking instruction unit 330 acquires an image from the image storage unit 320 and detects moving objects from the image, for example by applying a background subtraction method.
- In the present embodiment, it is assumed that the only moving objects shown in the image are the workers. Accordingly, the tracking instruction unit 330 detects the objects from the image.
- the tracking instruction unit 330 determines, for each target object, whether the target object is a newly detected target object or a target object being tracked.
- the newly detected object refers to an object that has started moving or an object that has entered the screen.
- the object being tracked refers to the object whose position at the immediately preceding time is detected. Details of this determination will be described later.
- the newly detected object is referred to as “new object”, and the object being tracked is referred to as “tracking object”.
- the tracking instruction unit 330 outputs the image and the position information of each detected object to the feature amount calculation unit 340.
- the position information includes the position of the target object and a status flag indicating whether the target object is a new target object or a tracking target object (hereinafter referred to as “tracking state”).
- the position of the object includes the upper left coordinates (x, y), the width w, and the height h of the rectangular frame 520 circumscribing the object.
- For a tracking target object, the tracking instruction unit 330 also outputs to the feature amount calculation unit 340, in association with that object, the reference histogram registered by the feature amount calculation unit 340 described later. The reference histogram will be described later.
- the feature amount calculation unit 340 acquires the position and tracking state of each target object from the input position information.
- When a new object exists, the feature amount calculation unit 340 registers the image of the new object as a tracking target. Specifically, the feature amount calculation unit 340 calculates a color histogram of the image region at the position of the new object (hereinafter referred to as the "reference histogram") and changes the state flag to indicate a tracking target object. Then, the feature amount calculation unit 340 outputs the image, together with the position information and reference histogram of the object newly registered as a tracking target, to the likelihood calculation unit 350.
- the position information and reference histogram of the object newly registered as the tracking target are the initial registration information of the tracking target object.
- When a tracking target object exists, the feature amount calculation unit 340 performs particle resampling, position prediction, and likelihood observation around the position of the tracking target object based on the particle filter method. Specifically, the feature amount calculation unit 340 first resamples a predetermined number of particles (for example, 200) around the position of the tracking target object, giving priority to particles with high likelihood (see S1 and S2 in FIG. 3).
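A minimal sketch of this resampling step, assuming likelihood-proportional sampling with replacement (the default N = 200 follows the example above; everything else is an illustrative assumption):

```python
import numpy as np

def resample(particles, likelihoods, n=200):
    """Draw n particles with replacement, in proportion to likelihood.

    High-likelihood particles are duplicated preferentially; particles
    that are never drawn are discarded.
    """
    weights = likelihoods / likelihoods.sum()
    chosen = np.random.choice(len(particles), size=n, p=weights)
    return particles[chosen].copy()
```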
- the feature amount calculation unit 340 causes the resampled particles to transition based on the state transition model (S3 in FIG. 3).
- In the present embodiment, the feature amount calculation unit 340 employs a state transition model that moves the object to a position determined by adding Gaussian noise to the amount of movement per unit time, under the assumption that the object moves in a straight line at constant velocity.
- the feature amount calculation unit 340 outputs the image, the position information of each tracking target object, the reference histogram, and the particle information to the likelihood calculation unit 350.
- the particle information is information that defines each generated particle, and is a parameter set of the position of each particle, that is, the upper left coordinates, width, and height of a rectangular frame that defines the particle (see FIG. 2).
- the likelihood calculation unit 350 outputs the input image and the reference histogram of each object to the position estimation unit 360.
- Further, the likelihood calculation unit 350 calculates, from the particle information, the likelihood in the particle filter for each particle. Specifically, the likelihood calculation unit 350 calculates a color histogram for each particle. Then, the likelihood calculation unit 350 calculates the similarity between this histogram and the reference histogram using the histogram intersection, and uses the result as the likelihood that the particle is the position of the tracking target object. Then, the likelihood calculation unit 350 outputs the position information of each object and the particle information and likelihood of each particle to the overlap ratio calculation unit 380.
- the overlap rate calculation unit 380 calculates the overlap rate with other objects for each particle of the tracked object.
- the overlapping ratio is, for example, the ratio of the area of the overlapping region to the particle area. Then, the overlap ratio calculation unit 380 outputs the position information of each object, the particle information of each particle, the likelihood, and the overlap ratio to the likelihood correction unit 390.
- The likelihood correction unit 390 corrects the likelihoods of particles when a plurality of tracking target objects exist and a plurality of positions estimated by the position estimation unit 360 described later overlap. Specifically, the likelihood correction unit 390 lowers the likelihood of particles whose overlap ratio is equal to or greater than a first predetermined value and whose likelihood is equal to or greater than a second predetermined value. Then, the likelihood correction unit 390 outputs the position information of each object and the particle information and likelihood of each particle to the position estimation unit 360.
- The position estimation unit 360 calculates, for each tracking target, the average position of the probability distribution weighted by the likelihood of each moved particle. Next, the position estimation unit 360 calculates the sum of the likelihoods of the particles (hereinafter referred to as the "likelihood total value"). Then, the position estimation unit 360 estimates the calculated position as the position of the tracking target object at time t, and outputs the estimated position, its likelihood total value, and the N pieces of particle information described later to the position storage unit 370.
- the position storage unit 370 stores the input position of each tracking target object at time t, its total likelihood value, and N pieces of particle information.
- the position of each tracking target object stored in the position storage unit 370 is referred to by the tracking instruction unit 330 and the overlap ratio calculation unit 380 described above.
- The position storage unit 370 stores the time-series data of the stored positions in association with the captured images and their times, generates a result display image in which the position of the tracking target object is visually superimposed on the captured image as the tracking result, and outputs it to the display device 400.
- The object tracking device 300 having such a configuration can perform position estimation while lowering the likelihood of those particles, among the particles used for estimating the position of an object, that largely overlap the positions of other objects.
- FIG. 5 is a flowchart showing the overall operation of the object tracking device 300.
- the image storage unit 320 of the object tracking device 300 stores a captured video sent from the imaging device 200.
- In step S1000, the tracking instruction unit 330 determines whether the end of the tracking process has been instructed by a user operation (such as the user pressing a program end button). When instructed to end the tracking process (S1000: YES), the tracking instruction unit 330 ends the process. When not instructed to end the tracking process (S1000: NO), the tracking instruction unit 330 proceeds to step S2000.
- In step S2000, the tracking instruction unit 330 detects objects from the image at time t-1 stored in the image storage unit 320 and generates position information for each object. Specifically, the tracking instruction unit 330 generates a difference image by the background subtraction method using, for example, an image acquired when the object tracking system 100 was activated as the background image. Then, the tracking instruction unit 330 detects, as the image region of an object, a region in the difference image having image characteristics such as a size or shape estimated to be an object, and defines its position. Then, the tracking instruction unit 330 determines a tracking state for each detected object and associates a state flag with it.
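For illustration, this background-subtraction detection step could look roughly as follows, assuming OpenCV is available (the threshold and minimum-area values are illustrative assumptions, not values from the embodiment):

```python
import cv2

def detect_objects(frame, background, thresh=30, min_area=500):
    """Detect moving objects by background subtraction and return the
    circumscribing rectangles (x, y, w, h) of the detected regions."""
    # Difference image between the current frame and the background image.
    gray_f = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray_f, gray_b)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # Keep only regions large enough to plausibly be a worker (OpenCV 4.x
    # findContours returns (contours, hierarchy)).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```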
- In step S3000, the tracking instruction unit 330 selects one of the objects detected from the image at time t-1.
- In step S4000, the tracking instruction unit 330 determines whether the selected object is an object for which tracking has been started (hereinafter referred to as a "tracking start object") or an object being tracked.
- Note that the tracking instruction unit 330 may calculate the moving direction of a tracking target object from a plurality of pieces of past position information. In this case, the tracking instruction unit 330 may calculate the overlap ratio with respect to the position obtained by moving the position at time t-1 by the movement amount and in the movement direction expected when the tracking target object is assumed to move in a straight line at constant velocity.
- When the selected object is neither a tracking start object nor an object being tracked, that is, when it is a new object (S4000: NO), the tracking instruction unit 330 proceeds to step S5000. When the selected object is a tracking start object or an object being tracked (S4000: YES), the tracking instruction unit 330 proceeds to step S8000.
- In step S5000, the feature amount calculation unit 340 registers the selected object as a tracking target. That is, the feature amount calculation unit 340 generates a reference histogram of the new object, includes it in the position information, and corrects the corresponding state flag to indicate a tracking start object.
- In step S7000, the feature amount calculation unit 340 determines whether the processing of step S4000 has been performed for all the objects detected from the image at time t-1.
- When all the objects have been processed (S7000: YES), the feature amount calculation unit 340 returns to step S1000.
- When unprocessed objects remain (S7000: NO), the feature amount calculation unit 340 returns to step S3000 and selects an unprocessed object.
- In step S6000, the tracking instruction unit 330 determines whether the likelihood total value in the position information of the selected object is high.
- Specifically, the tracking instruction unit 330 determines that the likelihood total value is sufficiently high to continue tracking when the selected object is a tracking target object and the above-described likelihood total value is equal to or greater than a third predetermined value.
- Conversely, the tracking instruction unit 330 determines that the likelihood total value is too low to continue tracking when the object is being tracked and the likelihood average value described later is less than the third predetermined value. When the likelihood total value is high (S6000: YES), the tracking instruction unit 330 proceeds to step S8000. When the likelihood total value is low (S6000: NO), the tracking instruction unit 330 proceeds to step S9000.
- In step S8000, the object tracking device 300 performs the position estimation process, and then proceeds to step S7000.
- the position estimation process will be described later.
- In step S9000, the tracking instruction unit 330 discards the registration of the selected object as a tracking target because continuing to track it is difficult, and proceeds to step S7000. That is, the tracking instruction unit 330 discards the position information of the selected object.
- First, the feature amount calculation unit 340 generates and places particles, which are candidates for the actual position of the tracking target object at time t, around the position of the tracking target object.
- Specifically, when the selected object is a tracking start object, the feature amount calculation unit 340 randomly arranges N particles.
- When the selected object is a tracking target object, the feature amount calculation unit 340 selects (resamples) N particles from the particles at time t-1, allowing duplicates, and discards the particles that were not selected.
- In step S8020, the feature amount calculation unit 340 moves each of the N particles based on the state transition model to generate the particles at the next time t.
- As the state transition model, for example, under the assumption of constant-velocity linear motion, the movement amount d over the fixed time interval and Gaussian noise n are added, and the i-th particle located at coordinates (x_1, y_1) is moved to (x_1 + d_x + n_{i,x}, y_1 + d_y + n_{i,y}).
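A sketch of this state transition under the constant-velocity assumption (the displacement d and noise scale are illustrative assumptions):

```python
import numpy as np

def predict(particles, d=(4.0, 0.0), sigma=2.0):
    """Move each particle by the constant-velocity displacement d plus
    Gaussian noise n: (x, y) <- (x + d_x + n_{i,x}, y + d_y + n_{i,y})."""
    particles = particles.astype(float)
    noise = np.random.normal(0.0, sigma, size=(len(particles), 2))
    particles[:, 0] += d[0] + noise[:, 0]
    particles[:, 1] += d[1] + noise[:, 1]
    return particles
```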
- The likelihood calculation unit 350 calculates, as the likelihood L of each moved particle, the histogram intersection with the corresponding reference histogram.
- The histogram intersection is the value obtained by summing, over all color values c, the smaller of the frequency a_c of color value c in the reference histogram and the frequency b_c of color value c in the particle's histogram.
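A minimal sketch of this likelihood observation (the histogram binning and normalization are illustrative assumptions; the embodiment does not specify them):

```python
import numpy as np

def color_histogram(frame, particle, bins=16):
    """Normalized color histogram of the pixels inside a particle's rectangle."""
    x, y, w, h = (int(round(v)) for v in particle)
    region = frame[max(y, 0):y + h, max(x, 0):x + w].reshape(-1, 3)
    hist, _ = np.histogramdd(region, bins=(bins,) * 3, range=((0, 256),) * 3)
    return hist / max(hist.sum(), 1.0)

def histogram_intersection(a, b):
    """Sum over all color values c of min(a_c, b_c); 1.0 for identical
    normalized histograms, decreasing as they diverge."""
    return float(np.minimum(a, b).sum())

def observe(particles, frame, ref_hist):
    """Likelihood L of each particle: intersection of its color histogram
    with the reference histogram of the tracking target."""
    return np.array([histogram_intersection(color_histogram(frame, p), ref_hist)
                     for p in particles])
```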
- In step S8040, the overlap ratio calculation unit 380 selects one of the particles of the selected tracking target object.
- In step S8050, the overlap ratio calculation unit 380 calculates the overlap ratio Cr of the selected particle with each of the other objects using, for example, the following equation (1).
- Here, Pa is the area of the selected particle, and B is the area of the overlapping portion between the selected particle and the region of the other object (the rectangular frame circumscribing the object, see FIG. 2):
Cr = B / Pa ・・・・・・(1)
- When there are a plurality of other objects, the overlap ratio calculation unit 380 adopts the largest overlap ratio as the final calculation result. Note that the overlap ratio calculation unit 380 may acquire the positions of the other objects from the position storage unit 370 or from the position information input from the likelihood calculation unit 350.
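For illustration, equation (1) together with the largest-ratio rule could be computed as follows (a sketch assuming axis-aligned rectangles as in FIG. 2):

```python
def overlap_ratio(particle, other_positions):
    """Equation (1): Cr = B / Pa, where Pa is the particle's area and B is
    the area of overlap between the particle and another object's rectangle.
    When several other objects overlap, the largest ratio is adopted."""
    px, py, pw, ph = particle
    pa = pw * ph
    cr = 0.0
    for ox, oy, ow, oh in other_positions:
        # Width and height of the intersection rectangle (0 if disjoint).
        iw = max(0.0, min(px + pw, ox + ow) - max(px, ox))
        ih = max(0.0, min(py + ph, oy + oh) - max(py, oy))
        cr = max(cr, (iw * ih) / pa)
    return cr
```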
- In step S8060, the likelihood correction unit 390 determines whether the selected particle is a likelihood correction target. The likelihood correction unit 390 determines that a particle whose overlap ratio is equal to or greater than the first predetermined value and whose likelihood is equal to or greater than the second predetermined value is a likelihood correction target, and determines that the other particles are not likelihood correction targets.
- the second predetermined value is, for example, the mode value of the likelihood histogram or the average value of the likelihoods of the N particles of the selected tracking target object.
- When the second predetermined value is the mode value, only likelihoods that have been raised by the influence of other objects tend to be reduced, and the value is hard to be influenced by outliers; this is effective, for example, when the particles are not evenly arranged around the object being tracked. When the second predetermined value is the average value, there is the advantage that no value needs to be prepared in advance and the calculation is easy.
- the likelihood correction unit 390 proceeds to step S8070 when the selected particle is a likelihood correction target (S8060: YES). If the selected particle is not a likelihood correction target (S8060: NO), the likelihood correcting unit 390 proceeds to the next step S8080.
- In step S8070, the likelihood correction unit 390 lowers the likelihood of the selected particle. Specifically, the likelihood correction unit 390 calculates the corrected likelihood value by a predetermined method; various methods can be employed for correcting the likelihood of a particle.
- For example, the likelihood correction unit 390 obtains the difference between the likelihood and the above-described second predetermined value, and adopts a correction value prepared in advance in association with the combination of the overlap ratio and this difference.
- Alternatively, the likelihood correction unit 390 may calculate the corrected likelihood La' from the likelihood La before correction and the overlap ratio Cr using, for example, the following equation (2):
La' = La × (1 - Cr) ・・・・・・(2)
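A sketch combining the correction-target test of step S8060 with equation (2) (taking the average-value variant of the second predetermined value; the first predetermined value here is an illustrative assumption):

```python
import numpy as np

def correct_likelihoods(likelihoods, overlap_ratios, first_value=0.5):
    """Apply equation (2), La' = La * (1 - Cr), to every particle whose
    overlap ratio is at or above the first predetermined value and whose
    likelihood is at or above the second predetermined value (taken here
    as the mean likelihood; the mode of the likelihood histogram is the
    other variant described above)."""
    second_value = likelihoods.mean()
    corrected = likelihoods.copy()
    target = (overlap_ratios >= first_value) & (likelihoods >= second_value)
    corrected[target] = likelihoods[target] * (1.0 - overlap_ratios[target])
    return corrected
```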
- In step S8080, the likelihood correction unit 390 determines whether the processing of step S8060 has been performed for all the particles of the selected tracking start object or tracking target object.
- When all the particles have been processed (S8080: YES), the likelihood correction unit 390 proceeds to step S8090. When unprocessed particles remain (S8080: NO), the likelihood correction unit 390 returns to step S8040 and selects an unprocessed particle. By repeating the processing of steps S8040 to S8080, the likelihood correction unit 390 can lower the likelihood of particles that have been dragged by other objects and acquired a high likelihood.
- In step S8090, the position estimation unit 360 obtains, for the N particles, the weighted average of the particle positions using the likelihood of each particle as its weight, and estimates the obtained value as the position at the next time t. That is, the position estimation unit 360 updates the position information at time t-1 stored in the position storage unit 370 with the obtained value. As a result, the position storage unit 370 causes the display device 400 to display a result display image in which the tracking result of the tracking target object is superimposed on the captured image.
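A minimal sketch of this estimation step, returning both the weighted-average position and the likelihood total value used in the continuation test of step S6000:

```python
import numpy as np

def estimate_position(particles, likelihoods):
    """Estimate the position at time t as the likelihood-weighted average
    of the particle rectangles, and return the likelihood total value used
    in the tracking-continuation test (steps S6000 and S9000)."""
    position = np.average(particles, axis=0, weights=likelihoods)
    total = float(likelihoods.sum())
    return position, total
```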
- The position estimation unit 360 also corrects the state flag to indicate a tracking target object. Then, the position estimation unit 360 stores the reference histogram of the selected tracking target object in the position storage unit 370. In addition, the position estimation unit 360 stores the likelihood total value of the N particles of the selected tracking target object in the position storage unit 370. If this likelihood total value is low, the tracking result is likely to be incorrect and should not be presented. Therefore, as described above, when the likelihood total value is less than the third predetermined value, the object tracking device 300 ends the tracking of the corresponding tracking target object (see steps S6000 and S9000 in FIG. 5).
- the object tracking device 300 tracks the position of the object using the particle filter, and can correct the likelihood of the particles when the tracking positions of a plurality of objects overlap.
- the processing that flows from step S1000 to step S5000 can be said to be an operation at the time of tracking initialization for a certain object.
- the process that flows from step S1000 to step S8000 can be said to be an operation at the start of the tracking process for a certain object and when the tracking process is continued.
- the process flowing from step S1000 to step S9000 can be said to be an operation at the end of the tracking process for a certain object.
- Note that the object tracking device 300 may perform each of the processes described above for a plurality of objects and a plurality of particles all at once.
- As described above, the object tracking device 300 tracks the position of each object using the particle filter and corrects the likelihoods of the particles when the tracked positions of a plurality of objects overlap.
- Thereby, the object tracking device 300 can reduce the possibility of mistracking caused by being dragged by the image of another object.
- Note that the object tracking device 300 may be provided with a proximity state determination unit, placed before the likelihood correction unit 390, that determines a proximity state for each particle based on the overlap ratio.
- In this case, the proximity state determination unit determines that a particle is in the proximity state when its overlap ratio is equal to or greater than a fourth predetermined value, and that it is not in the proximity state when its overlap ratio is less than the fourth predetermined value.
- The fourth predetermined value is, for example, the average value of the overlap ratios of all the particles of the object with the other objects, or the mode value of the histogram of those overlap ratios. Then, the object tracking device 300 performs the likelihood-correction-target determination only for the particles determined to be in the proximity state.
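For illustration, the average-value variant of this proximity determination could be sketched as follows (a minimal sketch; the mode-value variant would replace the mean with the histogram mode):

```python
import numpy as np

def proximity_mask(overlap_ratios):
    """Mark particles at or above the fourth predetermined value (here, the
    average overlap ratio of all the object's particles) as being in the
    proximity state; only these remain candidates for likelihood correction."""
    fourth_value = overlap_ratios.mean()
    return overlap_ratios >= fourth_value
```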
- With this configuration, particles with little overlap can be judged not to be likelihood correction targets, so the object tracking device 300 can reduce mistracking caused by lowering the likelihood excessively.
- Further, the object tracking device 300 may scale the width and height of the particles according to the movement of the object, taking into account the change in the object's size on the image caused by its movement.
- In this case, for example, the feature amount calculation unit 340 stores in advance a table in which the amount and direction of movement of the object in the Y-axis direction are associated with the ratio of change in the object's size. Then, using this table, the feature amount calculation unit 340 corrects the width and height of the particles so that the size increases when the object moves toward the front and decreases when the object moves toward the back.
- Further, the object tracking device 300 may correct the likelihoods of particles based on whether the positions of a plurality of objects estimated from the particles overlap, rather than based on the overlap ratio of individual particles with other objects.
- In this case, the object tracking device 300 estimates the position of each object without correcting the likelihoods, and lowers the probability density near an estimated position when the similarity is high even though that position overlaps the position of another object.
- Further, instead of lowering the likelihood of particles whose overlap region size is equal to or greater than the first predetermined value, the object tracking device 300 may raise the likelihood of particles whose overlap region size is less than the first predetermined value.
- the configuration in which the imaging device 200 and the object tracking device 300 are separated has been described.
- the imaging device 200 and the object tracking device 300 may be integrated.
- In the embodiment described above, the present invention is applied to a system that tracks, in captured video, the movements of a plurality of workers wearing the same work clothes in a factory; however, the present invention is not limited to this. The present invention can be applied to various other devices and systems that track an object from an image.
- As described above, the object tracking device, the object tracking method, and the object tracking program according to the present invention are useful as an object tracking device, an object tracking method, and an object tracking program that can further reduce the possibility of erroneous tracking.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Description
Cr = B / Pa ・・・・・・(1)
La' = La × (1 - Cr) ・・・・・・(2)
200 Imaging device
300 Object tracking device
310 Image acquisition unit
320 Image storage unit
330 Tracking instruction unit
340 Feature amount calculation unit
350 Likelihood calculation unit
360 Position estimation unit
370 Position storage unit
380 Overlap ratio calculation unit
390 Likelihood correction unit
400 Display device
Claims (10)
- 1. An object tracking device that tracks the position of an object shown in video using a particle filter, comprising: a feature amount calculation unit that generates a plurality of particles indicating candidates for the position of the object and calculates a feature amount of an image of the object and a feature amount of an image of each of the particles; a likelihood calculation unit that calculates, for each particle, a likelihood that the particle is the position of the object from a similarity between the feature amount of the image of that particle and the feature amount of the image of the object; a position estimation unit that estimates the position of the object based on the calculated likelihoods of the particles; and a likelihood correction unit that corrects the likelihood when a plurality of the objects exist and a plurality of positions estimated corresponding to the objects overlap.
- 2. The object tracking device according to claim 1, wherein the position of the object is defined by a region corresponding to an image region of the object in the video, the device further comprising an overlap ratio calculation unit that calculates a size of an overlap region of a particle of a second object with respect to a position of a first object, wherein the likelihood correction unit lowers the likelihood of the corresponding particle of the second object when the size of the overlap region is equal to or greater than a first predetermined value.
- 3. The object tracking device according to claim 2, wherein the position of the first object is defined by a rectangular region circumscribing the image region of the first object in the video, and the size of the overlap region is the ratio of the area of the overlap region to the area of the particle of the second object.
- 4. The object tracking device according to claim 1, wherein the similarity is a similarity between a color histogram of the image of the object and a color histogram of the image of the particle.
- 5. The object tracking device according to claim 2, wherein the likelihood correction unit corrects the likelihood of a particle when there exists a particle for which the size of the overlap region is equal to or greater than the first predetermined value and the likelihood is equal to or greater than a second predetermined value.
- 6. The object tracking device according to claim 5, wherein the second predetermined value is an average value of the likelihoods of the particles of the second object.
- 7. The object tracking device according to claim 5, wherein the second predetermined value is a mode value of a histogram of the likelihoods of the particles of the second object.
- 8. The object tracking device according to claim 5, wherein the likelihood correction unit corrects the likelihood by an amount corresponding to the size of the overlap region and the difference between the likelihood and a predetermined value.
- 9. An object tracking method for tracking the position of an object shown in video using a particle filter, comprising the steps of: generating a plurality of particles indicating candidates for the position of the object and calculating a feature amount of an image of the object and a feature amount of an image of each of the particles; calculating, for each particle, a likelihood that the particle is the position of the object from a similarity between the feature amount of the image of that particle and the feature amount of the image of the object; estimating the position of the object based on the calculated likelihoods of the particles; and correcting the likelihood when a plurality of the objects exist and a plurality of positions estimated corresponding to the objects overlap.
- 10. An object tracking program for tracking the position of an object shown in video using a particle filter, the program causing a computer to execute: a process of generating a plurality of particles indicating candidates for the position of the object and calculating a feature amount of an image of the object and a feature amount of an image of each of the particles; a process of calculating, for each particle, a likelihood that the particle is the position of the object from a similarity between the feature amount of the image of that particle and the feature amount of the image of the object; a process of estimating the position of the object based on the calculated likelihoods of the particles; and a process of correcting the likelihood when a plurality of the objects exist and a plurality of positions estimated corresponding to the objects overlap.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/263,169 US8891821B2 (en) | 2010-02-19 | 2011-01-13 | Object tracking device, object tracking method, and object tracking program |
CN201180001777.4A CN102405483B (zh) | 2010-02-19 | 2011-01-13 | 对象物追踪装置以及对象物追踪方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-034849 | 2010-02-19 | ||
JP2010034849A JP5528151B2 (ja) | 2010-02-19 | 2010-02-19 | 対象物追跡装置、対象物追跡方法、および対象物追跡プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011102072A1 true WO2011102072A1 (ja) | 2011-08-25 |
Family
ID=44482681
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000135 WO2011102072A1 (ja) | 2010-02-19 | 2011-01-13 | 対象物追跡装置、対象物追跡方法、および対象物追跡プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US8891821B2 (ja) |
JP (1) | JP5528151B2 (ja) |
CN (1) | CN102405483B (ja) |
WO (1) | WO2011102072A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102521612A (zh) * | 2011-12-16 | 2012-06-27 | 东华大学 | 一种基于协同关联粒子滤波的多视频目标主动跟踪方法 |
US20220051044A1 (en) * | 2020-08-14 | 2022-02-17 | Fujitsu Limited | Image processing apparatus and computer-readable storage medium for storing screen processing program |
Families Citing this family (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5841390B2 (ja) * | 2011-09-30 | 2016-01-13 | セコム株式会社 | 移動物体追跡装置 |
TWI479431B (zh) * | 2012-04-03 | 2015-04-01 | Univ Chung Hua | 物件追蹤方法 |
US9256781B2 (en) * | 2012-05-10 | 2016-02-09 | Pointguard Ltd. | System and method for computer vision based tracking of an object |
CN103679743B (zh) * | 2012-09-06 | 2016-09-14 | 索尼公司 | 目标跟踪装置和方法,以及照相机 |
US9152019B2 (en) | 2012-11-05 | 2015-10-06 | 360 Heros, Inc. | 360 degree camera mount and related photographic and video system |
KR101447671B1 (ko) * | 2012-11-19 | 2014-10-07 | 홍익대학교 산학협력단 | 대상의 위치를 확률적으로 예측하는 방법 |
CN103162629B (zh) * | 2013-01-31 | 2015-04-15 | 浙江大学 | 一种一维光阱微粒位移检测方法 |
JP5786879B2 (ja) * | 2013-02-21 | 2015-09-30 | カシオ計算機株式会社 | 被写体追跡装置、被写体追跡方法及びプログラム |
US20150018666A1 (en) * | 2013-07-12 | 2015-01-15 | Anant Madabhushi | Method and Apparatus for Registering Image Data Between Different Types of Image Data to Guide a Medical Procedure |
CN104424648B (zh) * | 2013-08-20 | 2018-07-24 | 株式会社理光 | 对象跟踪方法和设备 |
JP6110256B2 (ja) * | 2013-08-21 | 2017-04-05 | 株式会社日本自動車部品総合研究所 | 対象物推定装置および対象物推定方法 |
CN103489001B (zh) * | 2013-09-25 | 2017-01-11 | 杭州智诺科技股份有限公司 | 图像目标追踪方法和装置 |
JP6206804B2 (ja) | 2013-09-27 | 2017-10-04 | パナソニックIpマネジメント株式会社 | 移動体追跡装置、移動体追跡システムおよび移動体追跡方法 |
US11615460B1 (en) | 2013-11-26 | 2023-03-28 | Amazon Technologies, Inc. | User path development |
JP6277736B2 (ja) * | 2014-01-23 | 2018-02-14 | 富士通株式会社 | 状態認識方法及び状態認識装置 |
JP6295122B2 (ja) * | 2014-03-27 | 2018-03-14 | 株式会社メガチップス | 状態推定装置、プログラムおよび集積回路 |
US10169661B2 (en) * | 2014-03-28 | 2019-01-01 | International Business Machines Corporation | Filtering methods for visual object detection |
JP6415196B2 (ja) * | 2014-09-08 | 2018-10-31 | キヤノン株式会社 | 撮像装置および撮像装置の制御方法 |
JP6403509B2 (ja) * | 2014-09-08 | 2018-10-10 | キヤノン株式会社 | 画像処理装置および画像処理装置の制御方法 |
JP6399869B2 (ja) * | 2014-09-09 | 2018-10-03 | キヤノン株式会社 | 被写体追尾装置、撮像装置、被写体追尾方法及びプログラム |
JP5999394B2 (ja) | 2015-02-20 | 2016-09-28 | パナソニックIpマネジメント株式会社 | 追跡支援装置、追跡支援システムおよび追跡支援方法 |
US11205270B1 (en) | 2015-03-25 | 2021-12-21 | Amazon Technologies, Inc. | Collecting user pattern descriptors for use in tracking a movement of a user within a materials handling facility |
US10586203B1 (en) | 2015-03-25 | 2020-03-10 | Amazon Technologies, Inc. | Segmenting a user pattern into descriptor regions for tracking and re-establishing tracking of a user within a materials handling facility |
US10810539B1 (en) | 2015-03-25 | 2020-10-20 | Amazon Technologies, Inc. | Re-establishing tracking of a user within a materials handling facility |
US10679177B1 (en) | 2015-03-25 | 2020-06-09 | Amazon Technologies, Inc. | Using depth sensing cameras positioned overhead to detect and track a movement of a user within a materials handling facility |
KR101635973B1 (ko) * | 2015-04-23 | 2016-07-04 | 국방과학연구소 | Ir 영상 추적에서 파티클 필터를 이용한 기억 추적 성능 향상 방법 및 장치 |
JP6284086B2 (ja) | 2016-02-05 | 2018-02-28 | パナソニックIpマネジメント株式会社 | 追跡支援装置、追跡支援システムおよび追跡支援方法 |
JP6656987B2 (ja) * | 2016-03-30 | 2020-03-04 | 株式会社エクォス・リサーチ | 画像認識装置、移動体装置、及び画像認識プログラム |
JP6744123B2 (ja) * | 2016-04-26 | 2020-08-19 | 株式会社日立製作所 | 動体追跡装置および放射線照射システム |
JP6760767B2 (ja) * | 2016-05-31 | 2020-09-23 | 東芝テック株式会社 | 販売データ処理装置およびプログラム |
JP6715120B2 (ja) * | 2016-07-25 | 2020-07-01 | 株式会社Screenホールディングス | 基材処理装置および蛇行予測方法 |
JP2018197945A (ja) * | 2017-05-23 | 2018-12-13 | 株式会社デンソーテン | 障害物検出装置および障害物検出方法 |
WO2019088223A1 (ja) * | 2017-11-02 | 2019-05-09 | 株式会社Nttドコモ | 検出装置及び検出プログラム |
US11328513B1 (en) | 2017-11-07 | 2022-05-10 | Amazon Technologies, Inc. | Agent re-verification and resolution using imaging |
KR101982942B1 (ko) * | 2017-12-21 | 2019-05-27 | 건국대학교 산학협력단 | 객체 추적 방법 및 이를 수행하는 장치들 |
CN108460787B (zh) * | 2018-03-06 | 2020-11-27 | 北京市商汤科技开发有限公司 | 目标跟踪方法和装置、电子设备、程序、存储介质 |
JP7163372B2 (ja) | 2018-03-06 | 2022-10-31 | 北京市商▲湯▼科技▲開▼▲発▼有限公司 | 目標トラッキング方法及び装置、電子機器並びに記憶媒体 |
US11386306B1 (en) | 2018-12-13 | 2022-07-12 | Amazon Technologies, Inc. | Re-identification of agents using image analysis and machine learning |
JP7238962B2 (ja) * | 2019-03-13 | 2023-03-14 | 日本電気株式会社 | 物体追跡装置、物体追跡方法、及び、プログラム |
JP7188557B2 (ja) * | 2019-03-14 | 2022-12-13 | 日本電気株式会社 | 物体追跡システム、追跡パラメータ設定方法および設定プログラム |
JP7197000B2 (ja) | 2019-04-25 | 2022-12-27 | 日本電気株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
CN113473378A (zh) * | 2020-03-31 | 2021-10-01 | 宇龙计算机通信科技(深圳)有限公司 | 一种移动轨迹上报方法、装置、存储介质及电子设备 |
TWI786409B (zh) * | 2020-06-01 | 2022-12-11 | 聚晶半導體股份有限公司 | 影像偵測裝置以及影像偵測方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008007471A1 (fr) * | 2006-07-10 | 2008-01-17 | Kyoto University | Procédé de suivi d'un marcheur et dispositif de suivi d'un marcheur |
JP2009087090A (ja) * | 2007-09-28 | 2009-04-23 | Sony Computer Entertainment Inc | 対象物追跡装置および対象物追跡方法 |
JP2010219934A (ja) * | 2009-03-17 | 2010-09-30 | Victor Co Of Japan Ltd | 目標追尾装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7219032B2 (en) * | 2002-04-20 | 2007-05-15 | John Louis Spiesberger | Estimation algorithms and location techniques |
US7352359B2 (en) * | 2002-07-27 | 2008-04-01 | Sony Computer Entertainment America Inc. | Method and system for applying gearing effects to inertial tracking |
JP2005165688A (ja) * | 2003-12-02 | 2005-06-23 | Fuji Xerox Co Ltd | 複数対象物追跡方法及びシステム |
US8311276B2 (en) * | 2008-01-07 | 2012-11-13 | JVC Kenwood Corporation | Object tracking apparatus calculating tendency of color change in image data regions |
JP4991595B2 (ja) * | 2008-02-21 | 2012-08-01 | 株式会社東芝 | パーティクルフィルタを使用する追跡システム |
JP5213486B2 (ja) * | 2008-03-14 | 2013-06-19 | 株式会社ソニー・コンピュータエンタテインメント | 対象物追跡装置および対象物追跡方法 |
JP4730431B2 (ja) * | 2008-12-16 | 2011-07-20 | 日本ビクター株式会社 | 目標追尾装置 |
JP2010165052A (ja) * | 2009-01-13 | 2010-07-29 | Canon Inc | 画像処理装置及び画像処理方法 |
US20120020518A1 (en) * | 2009-02-24 | 2012-01-26 | Shinya Taguchi | Person tracking device and person tracking program |
JP5488076B2 (ja) * | 2010-03-15 | 2014-05-14 | オムロン株式会社 | 対象物追跡装置、対象物追跡方法、および制御プログラム |
-
2010
- 2010-02-19 JP JP2010034849A patent/JP5528151B2/ja not_active Expired - Fee Related
-
2011
- 2011-01-13 CN CN201180001777.4A patent/CN102405483B/zh not_active Expired - Fee Related
- 2011-01-13 WO PCT/JP2011/000135 patent/WO2011102072A1/ja active Application Filing
- 2011-01-13 US US13/263,169 patent/US8891821B2/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008007471A1 (fr) * | 2006-07-10 | 2008-01-17 | Kyoto University | Procédé de suivi d'un marcheur et dispositif de suivi d'un marcheur |
JP2009087090A (ja) * | 2007-09-28 | 2009-04-23 | Sony Computer Entertainment Inc | 対象物追跡装置および対象物追跡方法 |
JP2010219934A (ja) * | 2009-03-17 | 2010-09-30 | Victor Co Of Japan Ltd | 目標追尾装置 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102521612A (zh) * | 2011-12-16 | 2012-06-27 | 东华大学 | 一种基于协同关联粒子滤波的多视频目标主动跟踪方法 |
CN102521612B (zh) * | 2011-12-16 | 2013-03-27 | 东华大学 | 一种基于协同关联粒子滤波的多视频目标主动跟踪方法 |
US20220051044A1 (en) * | 2020-08-14 | 2022-02-17 | Fujitsu Limited | Image processing apparatus and computer-readable storage medium for storing screen processing program |
US11682188B2 (en) * | 2020-08-14 | 2023-06-20 | Fujitsu Limited | Image processing apparatus and computer-readable storage medium for storing screen processing program |
Also Published As
Publication number | Publication date |
---|---|
JP2011170684A (ja) | 2011-09-01 |
CN102405483B (zh) | 2014-11-12 |
CN102405483A (zh) | 2012-04-04 |
US8891821B2 (en) | 2014-11-18 |
JP5528151B2 (ja) | 2014-06-25 |
US20120093364A1 (en) | 2012-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5528151B2 (ja) | 対象物追跡装置、対象物追跡方法、および対象物追跡プログラム | |
US9542745B2 (en) | Apparatus and method for estimating orientation of camera | |
US10957068B2 (en) | Information processing apparatus and method of controlling the same | |
KR101064573B1 (ko) | 입자 여과를 이용하여 이동 물체를 추적하는 시스템 | |
US10853950B2 (en) | Moving object detection apparatus, moving object detection method and program | |
KR101071352B1 (ko) | 좌표맵을 이용한 팬틸트줌 카메라 기반의 객체 추적 장치 및 방법 | |
US9514541B2 (en) | Image processing apparatus and image processing method | |
US20150178900A1 (en) | Depth image processing apparatus and method based on camera pose conversion | |
US20130121592A1 (en) | Position and orientation measurement apparatus,position and orientation measurement method, and storage medium | |
US20120243733A1 (en) | Moving object detecting device, moving object detecting method, moving object detection program, moving object tracking device, moving object tracking method, and moving object tracking program | |
KR20180112090A (ko) | 카메라의 포즈를 판단하기 위한 장치 및 방법 | |
CN108140291A (zh) | 烟雾检测装置、方法以及图像处理设备 | |
JPWO2009091029A1 (ja) | 顔姿勢推定装置、顔姿勢推定方法、及び、顔姿勢推定プログラム | |
JP2013105285A5 (ja) | ||
JP2009510541A (ja) | オブジェクト追跡方法及びオブジェクト追跡装置 | |
JP6499047B2 (ja) | 計測装置、方法及びプログラム | |
JP2016085487A (ja) | 情報処理装置、情報処理方法及びコンピュータプログラム | |
US20150169947A1 (en) | Posture estimation device, posture estimation method, and posture estimation program | |
JP6850751B2 (ja) | 物体追跡装置、物体追跡方法、及びコンピュータプログラム | |
JP2008009849A (ja) | 人物追跡装置 | |
JP6947066B2 (ja) | 姿勢推定装置 | |
CN110378183B (zh) | 图像解析装置、图像解析方法及记录介质 | |
JP2011097217A (ja) | 動き補正装置およびその方法 | |
CA2543978A1 (en) | Object tracking within video images | |
CN111260709B (zh) | 一种面向动态环境的地面辅助的视觉里程计方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180001777.4 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11744367 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 13263169 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 11744367 Country of ref document: EP Kind code of ref document: A1 |