CN107657628A - A kind of real-time color method for tracking target - Google Patents


Info

Publication number
CN107657628A
CN107657628A
Authority
CN
China
Prior art keywords
target
color
particle
real
motion information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710854844.6A
Other languages
Chinese (zh)
Inventor
任航
宋玉龙
郭巳秋
刘博超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN201710854844.6A priority Critical patent/CN107657628A/en
Publication of CN107657628A publication Critical patent/CN107657628A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/254Analysis of motion involving subtraction of images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/277Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction

Abstract

The invention discloses a real-time color target tracking method comprising the following steps: under a particle filter framework, select a target template and initialize a particle set; acquire the current frame and obtain a new particle set through a state transition model; extract target color information and motion information, and drift each particle of the new particle set with an improved mean shift algorithm to obtain the drifted particle set; establish a new observation model based on the fusion of the target color information and motion information, and compute the particle weights of the drifted particle set; normalize the particle weights and obtain the state of the current frame as the weighted sum of the particles. The real-time color target tracking method provided by the invention compensates for the heavy computational load of particle filtering, suppresses background interference, and achieves stable, real-time tracking of the target.

Description

Real-time color target tracking method
Technical Field
The invention relates to the field of image processing, in particular to a real-time color target tracking method.
Background
Particle filtering has developed rapidly in the field of target tracking and achieves good results, but to guarantee tracking stability the number of particles must be large enough, which increases the computational load. Mean shift is a typical representative of matching-based tracking methods and has attracted wide attention for its low computational cost, but it cannot guarantee a global optimum during tracking and easily falls into local optima.
Disclosure of Invention
The invention aims to overcome the defects in the prior art, and adopts the following technical scheme:
the invention provides a real-time color target tracking method. The real-time color target tracking method comprises the following steps: under a particle filter framework, selecting a target template, and initializing a particle set; acquiring a current frame, and acquiring a new particle set through a state transition model; extracting target color information and motion information, and drifting each particle of the new particle set by adopting an improved mean shift algorithm to obtain a drifted particle set; establishing a new observation model based on the fusion of the target color information and the motion information, and calculating the particle weight of the particle set after the drift; and normalizing the weight of the particles, and obtaining the state of the current frame through the weighting of the particles.
In some embodiments, under the particle filter framework, selecting the target template and initializing the particle set specifically includes: first, a target template is manually extracted from the initial frame to obtain the initial state parameters of the target, and N particles are randomly distributed near the initial state of the target.
In some embodiments, obtaining the current frame and obtaining the particle set through the state transition model specifically includes: obtaining the particle set {s_(k-1)^(i)} (i = 1, ..., N) at time k-1, and obtaining a new particle set {s'_k^(i)} through the state transition model.
In some embodiments, the state transition model is: s'_k - s_(k-1) = s_(k-1) - s_(k-2) + r·u_(k-1), where r represents the random propagation radius of the particles and u_(k-1) is a random number in [-1, 1].
In some embodiments, the improved mean shift algorithm is: suppose the target to be tracked is initially located at C_0(x_0, y_0) and initialize the iteration count itn = 0; (1) first compute the zeroth-order moment and the first-order moments of the rectangular region containing the target to be tracked; (2) compute the centroid C_1 of the rectangular region from the zeroth-order and first-order moments; (3) if ||C_1 - C_0|| < ε or itn > itn0, stop iterating and update the target position to C_1; otherwise set C_0 = C_1 and return to step (1).
In some embodiments, extracting the target color information specifically includes: firstly, respectively constructing an H component histogram and an S component histogram of a target template by using an H component and an S component in an HSV color space; and then obtaining the H component probability distribution map and the S component probability distribution map of the current frame through respective color histogram back projection, and finally obtaining the total color probability distribution map of the current frame.
In some embodiments, extracting the target motion information specifically includes:
For each pixel in the image, compute the absolute difference between the k-th frame and the (k-1)-th frame over a local neighborhood centered on that pixel to obtain the difference image D_(k,k-1) of the two frames; compute the difference image D_(k-1,k-2) of the (k-1)-th and (k-2)-th frames; average D_(k,k-1) and D_(k-1,k-2) to obtain D_k; apply adaptive threshold segmentation to D_k with the threshold set as Th = m + k·sd, where m and sd are the mean and standard deviation of the image D_k, respectively, and k is a constant.
In some embodiments, based on the fusion of the target color information and the motion information, establishing a new observation model specifically includes: assuming that the candidate target is a rectangular area, calculating color information and motion information of the candidate target; and fusing the color information and the motion information of the candidate target to establish a new observation model.
In some embodiments, the color and motion information of the candidate object may be calculated as the zeroth-order moments
M_c = Σ_(x,y) I_c(x, y),  M_m = Σ_(x,y) I_m(x, y),
where (x, y) ranges over the coordinates of the pixels of the region in the current frame, M_c and M_m are the zeroth-order moments of the rectangular region in the color probability distribution map and the motion probability distribution map, respectively, and I_c(x, y) and I_m(x, y) are the pixel values at coordinates (x, y) in the color probability distribution map and the motion probability distribution map.
In some embodiments, the color information and motion information of the candidate object are fused as
M = β·M_m + (1 - β)·M_c,
where M represents the joint zeroth-order moment and β ∈ [0, 1] indicates the contribution of motion information to tracking.
The technical effect of the invention is as follows: in the real-time color target tracking method, the traditional target model is first updated under the particle filter framework, and a new target model fusing color information and motion information is proposed. The method not only compensates for the heavy computational load of particle filtering, but also effectively overcomes the tendency of mean shift to fall into local maxima; at the same time, the fusion of the target's color and motion information effectively suppresses background interference, finally achieving accurate, real-time tracking of the target.
Drawings
FIG. 1 is a flow chart of a real-time color target tracking method according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
Referring to fig. 1, a real-time color target tracking method according to the present invention is shown. The real-time color target tracking method comprises the following steps:
s1, selecting a target template under a particle filter framework, and initializing a particle set at the same time;
s2, acquiring a current frame, and obtaining a new particle set through a state transition model;
s3, extracting target color information and motion information, and drifting each particle of the new particle set by adopting an improved mean shift algorithm to obtain a particle set after drifting;
s4, establishing a new observation model based on the fusion of the target color information and the motion information, and calculating the particle weight of the particle set after the drift;
and S5, normalizing the weight of the particles, and obtaining the state of the current frame through the weighting of the particles.
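Steps S2, S4, and S5 above can be sketched together in NumPy. Everything below is an illustrative reconstruction rather than the patent's implementation: the mean shift drift of step S3 is omitted for brevity, the probability maps I_c and I_m are assumed to be precomputed, and the parameter values (r, beta) are placeholders.

```python
import numpy as np

def fused_map(I_c, I_m, beta=0.8):
    # Joint probability map: beta weights motion against color (beta = 0.8 per the text)
    return beta * I_m + (1.0 - beta) * I_c

def sample_map(P, xy):
    # Nearest-pixel lookup of map P at a (possibly off-grid) position (x, y)
    x = int(np.clip(round(xy[0]), 0, P.shape[1] - 1))
    y = int(np.clip(round(xy[1]), 0, P.shape[0] - 1))
    return P[y, x]

def track_frame(s_km1, s_km2, I_c, I_m, r=5.0, beta=0.8, rng=None):
    """One tracker iteration (steps S2, S4, S5); the mean shift drift of S3
    is omitted here for brevity."""
    rng = rng or np.random.default_rng(0)
    P = fused_map(I_c, I_m, beta)
    # S2: second-order state transition with random spread radius r
    u = rng.uniform(-1.0, 1.0, size=s_km1.shape)
    s_k = s_km1 + (s_km1 - s_km2) + r * u
    # S4: particle weights read off the joint probability map
    w = np.array([sample_map(P, p) for p in s_k]) + 1e-12
    # S5: normalize and form the weighted state estimate
    w /= w.sum()
    return (w[:, None] * s_k).sum(axis=0), s_k, w
```

With a probability map peaked on the target, the weighted estimate is pulled toward the peak even though the transition step spreads the particles randomly.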
In some embodiments, selecting the target template and initializing the particle set under the particle filter framework specifically includes: first, a target template is manually extracted from the initial frame to obtain the initial state parameters of the target, and N particles are randomly distributed near the initial state of the target.
In some embodiments, obtaining the current frame and the particle set through the state transition model specifically includes: obtaining the particle set {s_(k-1)^(i)} (i = 1, ..., N) at time k-1, and obtaining a new particle set {s'_k^(i)} through the state transition model, where N is the number of particles.
In some embodiments, the state transition model is:
s'_k - s_(k-1) = s_(k-1) - s_(k-2) + r·u_(k-1)
where r represents the random propagation radius of the particles and u_(k-1) is a random number in [-1, 1].
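Under the assumption that the state s holds the target's image coordinates, the transition model above can be written as a small helper; the default radius r below is illustrative only.

```python
import numpy as np

def state_transition(s_km1, s_km2, r=5.0, rng=None):
    """Second-order (constant-velocity) transition:
    s'_k = s_(k-1) + (s_(k-1) - s_(k-2)) + r * u_(k-1), u ~ U[-1, 1].
    r is the random propagation radius; its value here is an assumption."""
    rng = rng or np.random.default_rng(0)
    s1 = np.asarray(s_km1, dtype=float)
    s2 = np.asarray(s_km2, dtype=float)
    u = rng.uniform(-1.0, 1.0, size=s1.shape)  # random perturbation u_(k-1)
    return 2.0 * s1 - s2 + r * u
```

With r = 0 the model reduces to pure constant-velocity extrapolation, which makes the deterministic part easy to check.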
In traditional mean shift, a color histogram serves as the target representation model and the centroid of a weight image is solved iteratively to track the target. A good tracking effect can be obtained against a simple background, but under noise interference or partial occlusion the method easily falls into a local maximum; it also often fails on fast-moving targets and cannot recover from such failures, so the traditional mean shift algorithm has limitations. In view of these shortcomings, in some embodiments the improved mean shift algorithm is: suppose the target to be tracked is initially located at C_0(x_0, y_0) and initialize the iteration count itn = 0; (1) first compute the zeroth-order moment and the first-order moments of the rectangular region containing the target to be tracked; (2) compute the centroid of the rectangular region from the zeroth-order and first-order moments; (3) if ||C_1 - C_0|| < ε or itn > itn0, stop iterating and update the target position to C_1; otherwise set C_0 = C_1 and return to step (1). Specifically, suppose the target to be tracked is initially located at C_0(x_0, y_0) and the iteration count is initialized to itn = 0; the mean shift algorithm based on color and motion information proceeds as follows. First, compute the zeroth-order moment and the first-order moments of the rectangular region containing the target to be tracked:
M_00 = Σ_(x,y) I(x, y),  M_10 = Σ_(x,y) x·I(x, y),  M_01 = Σ_(x,y) y·I(x, y),
where M_00 is the zeroth-order moment and M_10 and M_01 are the first-order moments in x and y.
Next, compute the centroid of the rectangular region from the zeroth-order and first-order moments:
x_1 = M_10 / M_00,  y_1 = M_01 / M_00.
The rectangular region is then re-centered at C_1(x_1, y_1) and the iteration count is updated, itn = itn + 1. If ||C_1 - C_0|| < ε or itn > itn0, stop iterating and update the target position to C_1; otherwise set C_0 = C_1 and return to the first step.
The first iteration termination condition is that the mean shift displacement is smaller than a preset threshold ε (typically 2); the other condition is that the iteration count exceeds a predetermined threshold itn0, typically taken in [6, 15] and taken as 10 in the present invention.
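The moment-based iteration above can be sketched as follows on a generic probability map; the window half-size `win` is an assumption, since the exact dimensions of the target's rectangular region are not given in this text.

```python
import numpy as np

def moment_shift(P, c0, win=10, eps=2.0, itn0=10):
    """Mean shift on a probability map P via image moments: iterate the
    centroid of a (2*win+1)-pixel window until the move is below eps
    or itn0 iterations are reached (eps=2, itn0=10 as in the text)."""
    c = np.asarray(c0, dtype=float)
    H, W = P.shape
    for _ in range(itn0):
        # Clip the search window to the image bounds
        x0 = int(np.clip(round(c[0]) - win, 0, W - 1))
        x1 = int(np.clip(round(c[0]) + win + 1, 1, W))
        y0 = int(np.clip(round(c[1]) - win, 0, H - 1))
        y1 = int(np.clip(round(c[1]) + win + 1, 1, H))
        roi = P[y0:y1, x0:x1]
        xs, ys = np.meshgrid(np.arange(x0, x1), np.arange(y0, y1))
        M00 = roi.sum()                     # zeroth-order moment
        if M00 <= 0:
            break
        # First-order moments give the window centroid C_1
        c_new = np.array([(xs * roi).sum() / M00, (ys * roi).sum() / M00])
        if np.linalg.norm(c_new - c) < eps:  # ||C_1 - C_0|| < eps: converged
            c = c_new
            break
        c = c_new                            # C_0 = C_1, continue iterating
    return c
```

Started a few pixels off a peaked probability blob, the centroid updates walk onto the peak within a handful of iterations.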
In some embodiments, step S4, establishing a new observation model based on the fusion of the target color information and motion information and calculating the particle weights of the drifted particle set, also requires measuring the similarity between each particle and the target template by defining a similarity function, from which the observation function of the particle filter is obtained; the particle weights are then calculated according to the observation function.
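The similarity and observation functions themselves are not reproduced in this translated text. Purely as an assumption, a common particle filter choice is a Gaussian likelihood of some distance d_i between particle i and the template, followed by the weight normalization and weighted state estimate of step S5:

```python
import numpy as np

def particle_weights(distances, sigma=0.1):
    """Gaussian observation likelihood w_i ∝ exp(-d_i^2 / (2*sigma^2)),
    normalized to sum to 1. This specific form (and sigma) is an
    assumption; the patent's own similarity function is not given here."""
    d = np.asarray(distances, dtype=float)
    w = np.exp(-d ** 2 / (2.0 * sigma ** 2))
    return w / w.sum()

def state_estimate(particles, weights):
    # Step S5: the state of the current frame as the weighted sum of particles
    return (np.asarray(weights)[:, None] * np.asarray(particles, dtype=float)).sum(axis=0)
```

Smaller distances yield larger weights, so particles close to the template dominate the state estimate.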
Traditional particle filtering uses color information to build the target model; this information has the advantages of rotation invariance and scale invariance. The traditional color probability distribution map is obtained by counting only the hue (H) component in HSV space; here the saturation (S) value is also counted, which distinguishes the target from the background more effectively. In some embodiments, extracting the target color information specifically includes: first, an H component histogram and an S component histogram of the target template are constructed from the H and S components in HSV color space; then the H component probability distribution map I_h and the S component probability distribution map I_s of the current frame are obtained by back projection of the respective color histograms; finally, the total color probability distribution map I_c of the current frame is obtained as I_c = α·I_h + (1 - α)·I_s, where α ∈ [0.5, 1] represents the contribution of the H component to the color information. Repeated experiments verified that α = 0.7 is appropriate.
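A minimal NumPy sketch of the H/S back projection described above; the bin count and value range are assumptions, and a real implementation would typically use OpenCV's `calcHist`/`calcBackProject` instead.

```python
import numpy as np

def backproject(channel, template, bins=16, vmax=256):
    """Histogram back projection: each frame pixel is replaced by the
    (normalized) template-histogram value of its bin. bins/vmax are
    illustrative choices, not values from the patent."""
    hist, _ = np.histogram(template, bins=bins, range=(0, vmax))
    hist = hist / max(hist.max(), 1)                     # normalize to [0, 1]
    idx = np.clip(channel * bins // vmax, 0, bins - 1).astype(int)
    return hist[idx]

def color_probability(H, S, tmpl_H, tmpl_S, alpha=0.7):
    """Total color probability map I_c = alpha*I_h + (1-alpha)*I_s,
    with alpha = 0.7 as chosen in the text. H, S are the hue/saturation
    planes of the current frame; tmpl_* are the template's planes."""
    I_h = backproject(H, tmpl_H)
    I_s = backproject(S, tmpl_S)
    return alpha * I_h + (1.0 - alpha) * I_s
```

Pixels whose H and S values match the template's dominant bins come out near 1; background pixels in unused bins come out near 0.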
Traditional particle filtering constructs the target model from color information alone; here the target model is built from color information combined with motion information, and an improved frame difference method is proposed to detect the motion information of the current frame (taken as the k-th frame). In some embodiments, extracting the target motion information specifically includes: for each pixel in the image, compute the absolute difference between the k-th and (k-1)-th frames over a local neighborhood centered on that pixel to obtain the difference image D_(k,k-1) of the two frames; compute the difference image D_(k-1,k-2) of the (k-1)-th and (k-2)-th frames; average D_(k,k-1) and D_(k-1,k-2) to obtain D_k; apply adaptive threshold segmentation to D_k with the threshold Th = m + k·sd, where m and sd are the mean and standard deviation of the image D_k, respectively, and k is a constant. Specifically, the constant is typically taken in [2, 6]; in a specific embodiment of the invention it is taken as 2. Pixels greater than the threshold are treated as possible motion pixels and assigned 1, and all others are assigned 0, yielding the motion difference image of the current frame. In this motion difference image, not every pixel with value 1 is useful; only points similar in color to the target can be target points, so the motion difference image is further multiplied pixel-by-pixel with the color probability distribution map to obtain the motion probability distribution map I_m.
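The improved frame difference can be sketched as below. To keep the example short, the "local range centered on the pixel" is reduced to a per-pixel absolute difference, which is a simplifying assumption.

```python
import numpy as np

def motion_probability(f_k, f_km1, f_km2, I_c, kc=2.0):
    """Three-frame difference with adaptive threshold Th = m + k*sd
    (kc plays the role of the constant 'k', taken as 2 per the text),
    masked by the color probability map I_c."""
    d1 = np.abs(f_k.astype(float) - f_km1)     # |frame k   - frame k-1|
    d2 = np.abs(f_km1.astype(float) - f_km2)   # |frame k-1 - frame k-2|
    D = (d1 + d2) / 2.0                        # averaged difference image D_k
    th = D.mean() + kc * D.std()               # adaptive threshold Th = m + k*sd
    mask = (D > th).astype(float)              # candidate motion pixels -> 1
    return mask * I_c                          # keep only target-colored motion
```

A moving bright patch on a static background produces a nonzero motion map only around the patch, while static regions stay at zero.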
In some embodiments, the establishing a new observation model based on the fusion of the target color information and the motion information specifically includes: assuming that the candidate target is a rectangular area, (x, y) represents the coordinates of the pixels in the area in the current frame, and calculating the color information and the motion information of the candidate target; and fusing the color information and the motion information of the candidate target to establish a new observation model.
In some embodiments, the color and motion information of the candidate object may be calculated as the zeroth-order moments
M_c = Σ_(x,y) I_c(x, y),  M_m = Σ_(x,y) I_m(x, y),
where (x, y) ranges over the pixel coordinates of the region in the current frame, M_c and M_m are the zeroth-order moments of the rectangular region in the color probability distribution map and the motion probability distribution map, respectively, and I_c(x, y) and I_m(x, y) are the pixel values at (x, y) in those maps. In some embodiments, the color information and motion information of the candidate object are fused as follows:
M = β·M_m + (1 - β)·M_c, where M represents the joint zeroth-order moment and β ∈ [0, 1] indicates the contribution of motion information to tracking, usually taken as 0.8. The weighted sum of the color and motion information corresponds to the two target characteristics of color and motion, and the target model built on this basis is more stable than the traditional target model established from the single characteristic of color.
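The fused zeroth-order moment can be computed directly from the two probability maps; a sketch, with the candidate rectangle given as (x, y, w, h):

```python
import numpy as np

def joint_moment(I_c, I_m, rect, beta=0.8):
    """Joint zeroth-order moment M = beta*M_m + (1-beta)*M_c over a
    candidate rectangle rect = (x, y, w, h); beta = 0.8 as in the text."""
    x, y, w, h = rect
    M_c = I_c[y:y + h, x:x + w].sum()   # zeroth-order moment in the color map
    M_m = I_m[y:y + h, x:x + w].sum()   # zeroth-order moment in the motion map
    return beta * M_m + (1.0 - beta) * M_c
```

A candidate region scoring high in both maps yields a large joint moment, which is what the observation model rewards.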
The technical effect of the invention is as follows: in the real-time color target tracking method, the traditional target model is first updated under the particle filter framework, and a new target model fusing color information and motion information is proposed. The method compensates for the heavy computational load of particle filtering, effectively overcomes the tendency of mean shift to fall into local maxima, and, through the fusion of the target's color and motion information, effectively suppresses background interference, finally achieving accurate, real-time tracking of the target.
The above embodiments express only several embodiments of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these fall within the protection scope of the present invention.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention. Any other corresponding changes and modifications made according to the technical idea of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. A real-time color target tracking method, comprising:
under the particle filter framework, selecting a target template and initializing a particle set at the same time;
acquiring a current frame, and acquiring a new particle set through a state transition model;
extracting target color information and motion information, and drifting each particle of the new particle set by adopting an improved mean shift algorithm to obtain a drifted particle set;
establishing a new observation model based on the fusion of the target color information and the motion information, and calculating the particle weight of the particle set after the drift;
and normalizing the weight of the particles, and obtaining the state of the current frame through the weighting of the particles.
2. The method of claim 1, wherein selecting a target template and initializing a set of particles under a particle filter framework comprises: first, a target template is manually extracted from the initial frame to obtain the initial state parameters of the target, and N particles are randomly distributed near the initial state of the target.
3. The real-time color target tracking method of claim 1, wherein obtaining the current frame and the set of particles through the state transition model specifically comprises:
obtaining a set of particles at time k-1Obtaining a particle set through a state transition model
4. The real-time color target tracking method of claim 1, wherein the state transition model is:
s'_k - s_(k-1) = s_(k-1) - s_(k-2) + r·u_(k-1), where r represents the random propagation radius of the particles and u_(k-1) is a random number in [-1, 1].
5. The real-time color target tracking method according to claim 1, wherein the improved mean shift algorithm is: suppose the target to be tracked is initially located at C_0(x_0, y_0), and initialize the iteration count itn = 0;
(1) First compute the zeroth-order moment and the first-order moments of the rectangular region containing the target to be tracked;
(2) Compute the centroid of the rectangular region from the zeroth-order moment and the first-order moments;
(3) If ||C_1 - C_0|| < ε or itn > itn0, stop iterating and update the target position to C_1; otherwise set C_0 = C_1 and return to step (1).
6. The real-time color target tracking method according to claim 1, wherein extracting target color information specifically comprises:
firstly, respectively constructing an H component histogram and an S component histogram of a target template by using an H component and an S component in an HSV color space;
and then obtaining the H component probability distribution map and the S component probability distribution map of the current frame through respective color histogram back projection, and finally obtaining the total color probability distribution map of the current frame.
7. The real-time color target tracking method according to claim 1, wherein extracting target motion information specifically comprises:
(1) For each pixel in the image, compute the absolute difference between the k-th frame and the (k-1)-th frame over a local neighborhood centered on that pixel to obtain the difference image D_(k,k-1) of the two frames;
(2) Compute the difference image D_(k-1,k-2) of the (k-1)-th and (k-2)-th frames;
(3) Then average D_(k,k-1) and D_(k-1,k-2) to obtain D_k;
(4) Finally, apply adaptive threshold segmentation to D_k, with the threshold set as Th = m + k·sd, where m and sd are the mean and standard deviation of the image D_k, respectively, and k is a constant.
8. The real-time color target tracking method according to claim 1, wherein based on the fusion of the target color information and the motion information, establishing a new observation model specifically comprises:
assuming that the candidate target is a rectangular area, calculating color information and motion information of the candidate target;
and fusing the color information and the motion information of the candidate target to establish a new observation model.
9. The real-time color target tracking method of claim 8, wherein the color and motion information of the candidate target is calculated as the zeroth-order moments
M_c = Σ_(x,y) I_c(x, y),  M_m = Σ_(x,y) I_m(x, y),
where (x, y) ranges over the pixel coordinates of the region in the current frame, M_c and M_m are the zeroth-order moments of the rectangular region in the color probability distribution map and the motion probability distribution map, respectively, and I_c(x, y) and I_m(x, y) denote the pixel values at coordinates (x, y) in the color probability distribution map and the motion probability distribution map.
10. The real-time color target tracking method according to claim 8, wherein the color information and motion information of the candidate target are fused as M = β·M_m + (1 - β)·M_c, where M represents the joint zeroth-order moment and β ∈ [0, 1] indicates the contribution of motion information to tracking.
CN201710854844.6A 2017-09-20 2017-09-20 A kind of real-time color method for tracking target Pending CN107657628A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710854844.6A CN107657628A (en) 2017-09-20 2017-09-20 A kind of real-time color method for tracking target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710854844.6A CN107657628A (en) 2017-09-20 2017-09-20 A kind of real-time color method for tracking target

Publications (1)

Publication Number Publication Date
CN107657628A true CN107657628A (en) 2018-02-02

Family

ID=61130095

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710854844.6A Pending CN107657628A (en) 2017-09-20 2017-09-20 A kind of real-time color method for tracking target

Country Status (1)

Country Link
CN (1) CN107657628A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110298847A (en) * 2019-06-27 2019-10-01 浙江工业大学 A kind of background modeling method of long-time background collection
CN112348853A (en) * 2020-11-04 2021-02-09 哈尔滨工业大学(威海) Particle filter tracking method based on infrared saliency feature fusion
CN112561945A (en) * 2020-12-03 2021-03-26 南京理工大学 Dynamic background target tracking method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142085A (en) * 2011-05-11 2011-08-03 武汉大学 Robust tracking method for moving flame target in forest region monitoring video
CN103021186A (en) * 2012-12-28 2013-04-03 中国科学技术大学 Vehicle monitoring method and vehicle monitoring system
WO2017047688A1 (en) * 2015-09-17 2017-03-23 株式会社日立国際電気 Falling object detecting-and-tracking system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wu Di, "Research on Target Tracking Algorithms Based on Particle Filtering", China Master's Theses Full-text Database, Information Science and Technology *
Zhang Hu, "Research on the Application of the MeanShift Particle Filter Algorithm in Video Target Tracking", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110298847A (en) * 2019-06-27 2019-10-01 浙江工业大学 A kind of background modeling method of long-time background collection
CN110298847B (en) * 2019-06-27 2021-06-04 浙江工业大学 Background modeling method for long-time background collection
CN112348853A (en) * 2020-11-04 2021-02-09 哈尔滨工业大学(威海) Particle filter tracking method based on infrared saliency feature fusion
CN112348853B (en) * 2020-11-04 2022-09-23 哈尔滨工业大学(威海) Particle filter tracking method based on infrared saliency feature fusion
CN112561945A (en) * 2020-12-03 2021-03-26 南京理工大学 Dynamic background target tracking method
CN112561945B (en) * 2020-12-03 2022-09-13 南京理工大学 Dynamic background target tracking method

Similar Documents

Publication Publication Date Title
Sun et al. Motion removal for reliable RGB-D SLAM in dynamic environments
Hu et al. A novel object tracking algorithm by fusing color and depth information based on single valued neutrosophic cross-entropy
Li et al. Fast guided global interpolation for depth and motion
Mohamed et al. Illumination-robust optical flow using a local directional pattern
CN109685045B (en) Moving target video tracking method and system
Huhle et al. Robust non-local denoising of colored depth data
CN111079556A (en) Multi-temporal unmanned aerial vehicle video image change area detection and classification method
Lee et al. Robust stereo matching using adaptive random walk with restart algorithm
CN109961506A (en) A kind of fusion improves the local scene three-dimensional reconstruction method of Census figure
CN106991686B (en) A kind of level set contour tracing method based on super-pixel optical flow field
Lo et al. Joint trilateral filtering for depth map super-resolution
CN110619647B (en) Method for positioning fuzzy region of image based on combination of edge point frequency domain and spatial domain characteristics
CN107657628A (en) A kind of real-time color method for tracking target
Chen et al. A color-guided, region-adaptive and depth-selective unified framework for Kinect depth recovery
Chen et al. Kinect depth recovery using a color-guided, region-adaptive, and depth-selective framework
St-Charles et al. Online multimodal video registration based on shape matching
Tian et al. A novel edge-weight based fuzzy clustering method for change detection in SAR images
CN103337082B (en) Methods of video segmentation based on Statistical Shape priori
Siddiqui et al. Clustering techniques for image segmentation
Liu et al. Automatic objects segmentation with RGB-D cameras
WO2014172875A1 (en) Moving object detection
Xu et al. Features based spatial and temporal blotch detection for archive video restoration
Kim et al. Multi-view object extraction with fractional boundaries
CN107392936B (en) Target tracking method based on meanshift
CN107067411B (en) Mean-shift tracking method combined with dense features

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20180202