CN109064491A - Kernelized correlation filter tracking method with adaptive block partitioning - Google Patents

Kernelized correlation filter tracking method with adaptive block partitioning

Info

Publication number
CN109064491A
Authority
CN
China
Prior art keywords
block
target
pixel
value
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810808535.XA
Other languages
Chinese (zh)
Inventor
陈超
王玮
潘九宝
刘善磊
石善球
王圣尧
张大骞
杨锦
范雪婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PROVINCIAL GEOMATICS CENTRE OF JIANGSU
Original Assignee
PROVINCIAL GEOMATICS CENTRE OF JIANGSU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PROVINCIAL GEOMATICS CENTRE OF JIANGSU
Publication of CN109064491A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G06F 18/232 Non-hierarchical techniques
    • G06F 18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V 10/507 Summing image-intensity values; Histogram projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a kernelized correlation filter tracking method with adaptive block partitioning. HOG, gray-level and texture features are extracted in units of superpixel blocks to train a classifier, tracking response maps are computed, and the average peak-to-correlation energy is calculated from each response map. When the energy value of any block falls below a threshold, tracking is deemed lost and the partitioning scheme is re-selected according to the aspect ratio of the target region in the previous frame; otherwise the final target position and size are obtained from the relative positions of the blocks. Fixed-block methods cannot obtain accurate tracking results when the target aspect ratio changes markedly. By partitioning adaptively, the method of the invention avoids erroneous tracking results when the target aspect ratio changes; the classifier is trained on superpixel-based HOG, gray-level and texture features, and tracking loss is judged from the average peak-to-correlation energy, achieving fast, accurate and effective target tracking.

Description

Kernelized correlation filter tracking method with adaptive block partitioning
Technical field
The present invention relates to the field of digital image processing, and in particular to a kernelized correlation filter tracking method with adaptive block partitioning.
Background technique
Target tracking is a research hotspot in computer vision and is widely applied in fields such as precision guidance, intelligent transportation, video surveillance and human-computer interaction. Although great progress has been made over the past two decades, tracking still suffers from interfering factors such as background change, occlusion, illumination variation and target deformation.
Many existing target tracking algorithms show excellent performance. In recent years, correlation filters have been introduced into target tracking and have achieved good results. Bolme et al. proposed the Minimum Output Sum of Squared Error (MOSSE) filter, which introduced the idea of correlation filtering into target tracking for the first time. Henriques et al. proposed detection and tracking with a circulant structure of kernels (CSK), which simplifies the computation through dense sampling and kernel functions; the method is efficient, with an average tracking speed of several hundred frames per second. In 2015 they extended the single channel of CSK to multiple channels, replaced the original gray-level feature with the HOG feature, and proposed the kernelized correlation filter (KCF) tracker. Danelljan et al. used color features on the basis of CSK and proposed the CN tracker. However, all of the above methods track with a fixed size. To address scale adaptation, Danelljan and Yang each introduced a scale-adaptation mechanism on the basis of KCF, proposing the DSST and SAMF methods; these two are representative algorithms for the scale-variation problem. DSST and SAMF share a similar idea for handling target scale variation: the current target is enlarged and reduced according to preset scales, and the best match among the different scales is taken as the new target scale. Duan Weiwei, Xu Yulong and others used block partitioning on top of kernelized correlation filtering to achieve multi-scale tracking, but both of these methods partition the image into fixed blocks and cannot obtain accurate tracking results when the target aspect ratio changes markedly.
Summary of the invention
The purpose of the present invention is to provide a kernelized correlation filter tracking method with adaptive block partitioning that obtains accurate tracking results.
To realize the above-mentioned technical purpose, the present invention adopts the following technical scheme:
A kernelized correlation filter tracking method with adaptive block partitioning: a target region is selected in the initial frame, its aspect ratio is calculated, the image is divided into several blocks according to the aspect ratio, and the superpixel blocks of each block are computed. HOG, gray-level and texture features are extracted in units of superpixel blocks to train a classifier, tracking response maps are computed, and the average peak-to-correlation energy is calculated from each response map. When the energy value of any block falls below a threshold, tracking is deemed lost and the partitioning scheme is re-selected according to the aspect ratio of the target region in the previous frame; otherwise the final target position and size are obtained from the relative positions of the blocks.
Further, the method specifically comprises the following steps:
S100: read the video sequence, obtain the initial frame, and select the target region in the initial frame;
S200: divide the image into several blocks according to the aspect ratio of the target region, and determine the search region of each block;
S300: compute the superpixel blocks of each block separately;
S400: extract the gray-level, HOG and texture features of each superpixel block, compute the coefficient matrix, and train the nonlinear classifier used for tracking;
S500: obtain the next frame, determine the block search regions of the current frame from the block target positions of the previous frame, and compute the correlation filter response maps;
S600: compute the average peak-to-correlation energy (APCE) from the correlation filter response map, as shown below:

$$\mathrm{APCE}=\frac{\left|F_{\max}-F_{\min}\right|^{2}}{\operatorname{mean}\left(\sum_{w,h}\left(F_{w,h}-F_{\min}\right)^{2}\right)}$$

where $F_{\max}$, $F_{\min}$ and $F_{w,h}$ denote the maximum response, the minimum response, and the response at position $(w,h)$, respectively;
When the average peak-to-correlation energy of every block is greater than or equal to its historical mean, execution continues with S700; when the value of any block is less than the historical mean, the current frame keeps the target position of the previous frame and the method returns to S200;
Further, in S200, the aspect ratio of the target region is calculated and the aspect-ratio thresholds are set to T1 = 1.4 and T2 = 0.6. When the aspect ratio is greater than or equal to T1, the target is halved horizontally; when the aspect ratio is less than or equal to T2, the target is halved vertically; when the aspect ratio is greater than T2 and less than T1, the target is divided into four quarters.
Further, in S300, the superpixel blocks are computed as follows:
S310: initialize the cluster centers with a selected number of seeds and delimit the initial region of each class;
S320: compute the gradient of all pixels in the 3×3 neighborhood of each seed point and move the seed point to the position with the smallest gradient;
S330: take all pixels within a certain neighborhood of each cluster center, compute the distance from each pixel to the cluster centers, and assign each pixel to the nearest cluster center;
S340: repeat S330 until the error converges;
S350: perform connectivity processing on the image.
Converting the image into superpixels for computation greatly increases the computation speed.
Further, in S400, the texture feature is obtained with a one-dimensional Gabor filter. The one-dimensional real Gabor filter is expressed as:

$$g(x)=\exp\left(-\frac{(x-x_{0})^{2}}{2\sigma^{2}}\right)\cos\left(2\pi u_{0}(x-x_{0})\right)$$

where σ is the standard deviation of the Gaussian function, x0 is the center coordinate of the function, and u0 is the center frequency of the cosine wave. The center frequencies of the Gabor filter are set to 3/64, 3/32 and 3/16, giving three filter templates; the template width is set to 1 and the angular step to 15°, i.e. 24 directions are used.
The texture feature is obtained with the above three filters as follows:
S410: map the image gray levels into the interval [0, 1];
S420: align the reference point of each of the three filter templates with pixel s, obtain the filter responses along each direction j, j ∈ {1, 2, 3, …, 24}, and normalize them;
S430: for each direction, compute the weighted average of the three normalized filter responses and the normalized spectral value as the texture value of the pixel in that direction, giving 24 texture values per pixel;
S440: compare the 24 texture values of pixel s and take the minimum as the final angular texture feature of pixel s;
S450: select each pixel in turn and repeat S410–S440 to obtain the complete angular texture feature map.
Every point in the tracking video has a texture feature along any direction, but the texture feature along one specific direction describes the object best. The present invention therefore sets an extraction rule that selects the texture feature of a specific direction; the feature obtained in this way is called the angular texture feature.
By partitioning adaptively, the method of the invention avoids erroneous tracking results when the target aspect ratio changes. Training the classifier on superpixel-based HOG, gray-level and texture features better balances the fine-scale and coarse-scale characteristics of the image when the target has the same or a highly similar color to the surrounding background, which improves tracking accuracy; and judging tracking loss from the average peak-to-correlation energy achieves fast, accurate and effective target tracking.
Detailed description of the invention
Fig. 1 is the flow chart of the method of the present invention;
Fig. 2 is a schematic diagram of the filter template settings of the present invention.
Specific embodiment
The technical scheme of the present invention is further described below with reference to the accompanying drawings and a specific embodiment.
As shown in Fig. 1, the tracking steps of the invention are as follows:
S100: read the video sequence, obtain the initial frame, and select the target region in the initial frame;
S200: divide the image into several blocks according to the aspect ratio of the target region, and determine the search region of each block;
The partitioning scheme in this embodiment is as follows: the aspect ratio of the target region is calculated and the aspect-ratio thresholds are set to T1 = 1.4 and T2 = 0.6. When the aspect ratio is greater than or equal to T1, the target is halved horizontally; when the aspect ratio is less than or equal to T2, the target is halved vertically; when the aspect ratio is greater than T2 and less than T1, the target is divided into four quarters.
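For illustration only, a minimal Python sketch of this partitioning rule, assuming the target region is given as a pixel box (x, y, w, h) and that the aspect ratio is taken as width divided by height (the patent does not state the convention); the helper name is hypothetical:

```python
def partition_target(box, t1=1.4, t2=0.6):
    """Split a target box into sub-blocks according to its aspect ratio (S200).

    box: (x, y, w, h). The ratio convention w / h is an assumption.
    """
    x, y, w, h = box
    ratio = w / float(h)
    if ratio >= t1:   # wide target: halve horizontally (left / right blocks)
        return [(x, y, w // 2, h), (x + w // 2, y, w - w // 2, h)]
    if ratio <= t2:   # tall target: halve vertically (top / bottom blocks)
        return [(x, y, w, h // 2), (x, y + h // 2, w, h - h // 2)]
    # near-square target: divide into four quarters
    return [(x, y, w // 2, h // 2),
            (x + w // 2, y, w - w // 2, h // 2),
            (x, y + h // 2, w // 2, h - h // 2),
            (x + w // 2, y + h // 2, w - w // 2, h - h // 2)]
```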
S300: compute the superpixel blocks of each block separately;
The superpixel blocks are computed as follows:
S310: initialize the cluster centers with a selected number of seeds and delimit the initial region of each class;
S320: compute the gradient of all pixels in the 3×3 neighborhood of each seed point and move the seed point to the position with the smallest gradient;
S330: take all pixels within a certain neighborhood of each cluster center, compute the distance from each pixel to the cluster centers, and assign each pixel to the nearest cluster center;
S340: repeat S330 until the error converges;
S350: perform connectivity processing on the image.
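Steps S310–S350 follow the standard SLIC superpixel procedure, so as a sketch an off-the-shelf SLIC implementation can stand in for them; the function name and parameter values below are illustrative, not taken from the patent:

```python
from skimage.segmentation import slic
from skimage.util import img_as_float

def superpixels_of_block(block_rgb, n_segments=100, compactness=10.0):
    """Segment one image block into superpixels (SLIC-style clustering,
    corresponding to S310-S350). Returns an integer label map."""
    return slic(img_as_float(block_rgb), n_segments=n_segments,
                compactness=compactness, start_label=0)
```

The gray-level, HOG and texture features of S400 can then be pooled per superpixel label before the classifier is trained.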
S400: extract the gray-level, HOG and texture features of each superpixel block, compute the coefficient matrix, and train the nonlinear classifier used for tracking;
The angular texture feature is obtained with a one-dimensional Gabor filter. The one-dimensional real Gabor filter is expressed as:

$$g(x)=\exp\left(-\frac{(x-x_{0})^{2}}{2\sigma^{2}}\right)\cos\left(2\pi u_{0}(x-x_{0})\right)$$

where σ is the standard deviation of the Gaussian function, x0 is the center coordinate of the function, and u0 is the center frequency of the cosine wave. The center frequencies of the Gabor filter are set to 3/64, 3/32 and 3/16, giving three filter templates; the template width is set to 1 and the angular step to 15°, i.e. 24 directions are used. The setting of the filter template width and directions is shown in Fig. 2.
The texture feature is obtained with the above three filters as follows:
S410: map the image gray levels into the interval [0, 1];
S420: align the reference point of each of the three filter templates shown in Fig. 2 with pixel s, obtain the filter responses along each direction j, j ∈ {1, 2, 3, …, 24}, and normalize them;
S430: for each direction, compute the weighted average of the three normalized filter responses and the normalized spectral value as the texture value of the pixel in that direction, giving 24 texture values per pixel;
S440: compare the 24 texture values of pixel s and take the minimum as the final angular texture feature of pixel s;
S450: select each pixel in turn and repeat S410–S440 to obtain the complete angular texture feature map.
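A simplified Python sketch of S410–S450, assuming the 1-D Gabor kernels are sampled along short lines through each pixel at the 24 angles; it averages the three normalized filter responses and omits the additional spectral term, so the kernel length, σ and the weighting are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def gabor_1d(length=9, sigma=2.0, u0=3 / 16):
    """One-dimensional real Gabor kernel g(x) = exp(-(x-x0)^2/(2*sigma^2)) * cos(2*pi*u0*(x-x0))."""
    x = np.arange(length, dtype=float)
    x0 = (length - 1) / 2.0
    return np.exp(-((x - x0) ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * u0 * (x - x0))

def angular_texture(gray, freqs=(3 / 64, 3 / 32, 3 / 16), n_dirs=24, length=9, sigma=2.0):
    """Per-pixel angular texture feature: minimum over 24 directions of the
    averaged, normalized 1-D Gabor responses (simplified reading of S410-S450)."""
    gray = (gray - gray.min()) / max(np.ptp(gray), 1e-12)        # S410: map gray levels to [0, 1]
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    offsets = np.arange(length) - (length - 1) / 2.0
    kernels = [gabor_1d(length, sigma, f) for f in freqs]        # three center frequencies
    responses = np.zeros((n_dirs, h, w))
    for j in range(n_dirs):                                      # S420: 24 directions, 15 degrees apart
        theta = np.deg2rad(15 * j)
        dy, dx = np.sin(theta), np.cos(theta)
        # sample the gray image along a short line through every pixel
        line = np.stack([map_coordinates(gray, [yy + o * dy, xx + o * dx], order=1)
                         for o in offsets])                      # shape (length, h, w)
        vals = []
        for k in kernels:
            r = np.abs(np.tensordot(k, line, axes=(0, 0)))       # 1-D Gabor response
            vals.append(r / max(r.max(), 1e-12))                 # normalize each response
        responses[j] = np.mean(vals, axis=0)                     # S430: average the three responses
    return responses.min(axis=0)                                 # S440/S450: minimum over directions
```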
S500: obtain the next frame, determine the block search regions of the current frame from the block target positions of the previous frame, and compute the correlation filter response maps;
S600: compute the average peak-to-correlation energy (APCE) from the correlation filter response map, as shown below:

$$\mathrm{APCE}=\frac{\left|F_{\max}-F_{\min}\right|^{2}}{\operatorname{mean}\left(\sum_{w,h}\left(F_{w,h}-F_{\min}\right)^{2}\right)}$$

where $F_{\max}$, $F_{\min}$ and $F_{w,h}$ denote the maximum response, the minimum response, and the response at position $(w,h)$, respectively.
When the average peak-to-correlation energy of every block is greater than or equal to its historical mean, execution continues with S700; when the value of any block is less than the historical mean, the current frame keeps the target position of the previous frame and the method returns to S200.
The above formula reflects the oscillation of the response map. A sudden drop of APCE indicates that the target is occluded or lost; in that case the next frame does not use the tracking result of this frame, and the partitioning scheme is re-selected according to the aspect ratio of the target region in the previous frame before block-wise tracking continues.
When the target aspect ratio changes greatly and no longer matches the partitioning of the initial frame, continuing to use the initial partitioning increases the error of the target center position and size; likewise, if a block target is occluded or lost but this is not detected, the deviation accumulates into the next frame and affects the final tracking result. When tracking is accurate, the KCF response map is close to an ideal two-dimensional response; when occlusion, target loss or severe mismatch occurs, the response map oscillates violently. The present invention therefore judges whether the target is occluded and needs to be re-partitioned by computing the average peak-to-correlation energy (APCE).
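As a small sketch of this APCE criterion (array and variable names are illustrative):

```python
import numpy as np

def apce(response):
    """Average peak-to-correlation energy of one block's response map (S600)."""
    f_max, f_min = response.max(), response.min()
    return (f_max - f_min) ** 2 / np.mean((response - f_min) ** 2)

def all_blocks_reliable(responses, history_means):
    """True when every block's APCE reaches its historical mean (proceed to S700);
    otherwise the previous target position is kept and the target is re-partitioned (back to S200)."""
    return all(apce(r) >= m for r, m in zip(responses, history_means))
```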
S700: take the position of the maximum response as the new position of each block target;
S800: determine the target size and position from the relative positions of the blocks;
Taking a target halved horizontally into two blocks as an example: in the next frame, the distance between the centers of the two blocks is calculated. When this distance is greater than the initial distance between the two block centers, the target is considered to have grown; the midpoint of the line connecting the two block centers is taken as the target center, and the target size is determined by the maximum extent covered by the blocks. Similarly, when the distance between the block centers in the current frame is less than the initial distance between the two block centers, the target is considered to have shrunk; after the target center is obtained, the target size is determined by the minimum extent of the blocks.
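A minimal sketch of this two-block fusion for the horizontally split case; how the "maximum/minimum extent" of the blocks is turned into a size is not fully specified in the text, so the bounding-box and scaling choices below are assumptions:

```python
import numpy as np

def fuse_two_blocks(boxes, init_distance, init_size):
    """Fuse two horizontally split block boxes (x, y, w, h) into a target
    estimate (S800). init_distance / init_size come from the initial frame."""
    centers = [(x + w / 2.0, y + h / 2.0) for x, y, w, h in boxes]
    (cx1, cy1), (cx2, cy2) = centers
    target_center = ((cx1 + cx2) / 2.0, (cy1 + cy2) / 2.0)   # midpoint of the block centers
    distance = np.hypot(cx2 - cx1, cy2 - cy1)
    if distance >= init_distance:
        # blocks moved apart: target has grown, use the bounding box of both blocks
        xs = [b[0] for b in boxes]; ys = [b[1] for b in boxes]
        xe = [b[0] + b[2] for b in boxes]; ye = [b[1] + b[3] for b in boxes]
        size = (max(xe) - min(xs), max(ye) - min(ys))
    else:
        # blocks moved together: target has shrunk, scale the initial size down
        s = distance / max(init_distance, 1e-12)
        size = (init_size[0] * s, init_size[1] * s)
    return target_center, size
```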
S900: update the appearance model $\hat{x}$ and the coefficient matrix $\hat{\alpha}$ of the target model according to the following formulas:

$$\hat{x}_{t}=(1-\eta)\,\hat{x}_{t-1}+\eta\,\hat{x}',\qquad \hat{\alpha}_{t}=(1-\eta)\,\hat{\alpha}_{t-1}+\eta\,\hat{\alpha}'$$

where $\hat{x}'$ and $\hat{\alpha}'$ denote the appearance model and coefficient matrix obtained in the current frame, and η is the learning rate;
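As a sketch of this linear-interpolation model update (the learning-rate value is illustrative):

```python
def update_model(x_prev, alpha_prev, x_curr, alpha_curr, eta=0.02):
    """S900: blend the previous appearance model / coefficient matrix with the
    ones obtained in the current frame; eta is the learning rate."""
    x_hat = (1.0 - eta) * x_prev + eta * x_curr
    alpha_hat = (1.0 - eta) * alpha_prev + eta * alpha_curr
    return x_hat, alpha_hat
```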
S1000: repeat S300–S900 until the video has been fully processed.

Claims (5)

1. A kernelized correlation filter tracking method with adaptive block partitioning, characterized in that: a target region is selected in the initial frame and its aspect ratio is calculated; the image is divided into several blocks according to the aspect ratio and the superpixel blocks of each block are computed; HOG, gray-level and texture features are extracted in units of superpixel blocks to train a classifier, tracking response maps are computed, and the average peak-to-correlation energy is calculated from each response map; when the energy value of any block falls below a threshold, tracking is deemed lost and the partitioning scheme is re-selected according to the aspect ratio of the target region in the previous frame; otherwise the final target position and size are obtained from the relative positions of the blocks.
2. The method according to claim 1, characterized by comprising the following steps:
S100: read the video sequence, obtain the initial frame, and select the target region in the initial frame;
S200: divide the image into several blocks according to the aspect ratio of the target region, and determine the search region of each block;
S300: compute the superpixel blocks of each block separately;
S400: extract the gray-level, HOG and texture features of each superpixel block, compute the coefficient matrix, and train the nonlinear classifier used for tracking;
S500: obtain the next frame, determine the block search regions of the current frame from the block target positions of the previous frame, and compute the correlation filter response maps;
S600: compute the average peak-to-correlation energy from the correlation filter response map, as shown below:

$$\mathrm{APCE}=\frac{\left|F_{\max}-F_{\min}\right|^{2}}{\operatorname{mean}\left(\sum_{w,h}\left(F_{w,h}-F_{\min}\right)^{2}\right)}$$

where $F_{\max}$, $F_{\min}$ and $F_{w,h}$ denote the maximum response, the minimum response, and the response at position $(w,h)$, respectively;
when the average peak-to-correlation energy of every block is greater than or equal to its historical mean, continue with S700; when the value of any block is less than the historical mean, the current frame keeps the target position of the previous frame and the method returns to S200;
S700: take the position of the maximum response as the new position of each block target;
S800: determine the target size and position from the relative positions of the blocks;
S900: update the appearance model $\hat{x}$ and the coefficient matrix $\hat{\alpha}$ of the target model according to the following formulas:

$$\hat{x}_{t}=(1-\eta)\,\hat{x}_{t-1}+\eta\,\hat{x}',\qquad \hat{\alpha}_{t}=(1-\eta)\,\hat{\alpha}_{t-1}+\eta\,\hat{\alpha}'$$

where $\hat{x}'$ and $\hat{\alpha}'$ denote the appearance model and coefficient matrix obtained in the current frame, and η is the learning rate;
S1000: repeat S300–S900 until the video has been fully processed.
3. The method according to claim 1 or 2, characterized in that in S200 the aspect ratio of the target region is calculated and the aspect-ratio thresholds are set to T1 = 1.4 and T2 = 0.6; when the aspect ratio is greater than or equal to T1, the target is halved horizontally; when the aspect ratio is less than or equal to T2, the target is halved vertically; when the aspect ratio is greater than T2 and less than T1, the target is divided into four quarters.
4. The method according to any one of claims 1 to 3, characterized in that in S300 the superpixel blocks are computed as follows:
S310: initialize the cluster centers with a selected number of seeds and delimit the initial region of each class;
S320: compute the gradient of all pixels in the 3×3 neighborhood of each seed point and move the seed point to the position with the smallest gradient;
S330: take all pixels within a certain neighborhood of each cluster center, compute the distance from each pixel to the cluster centers, and assign each pixel to the nearest cluster center;
S340: repeat S330 until the error converges;
S350: perform connectivity processing on the image.
5. The method according to any one of claims 1 to 4, characterized in that in S400 the angular texture feature is obtained with a one-dimensional Gabor filter, the one-dimensional real Gabor filter being expressed as:

$$g(x)=\exp\left(-\frac{(x-x_{0})^{2}}{2\sigma^{2}}\right)\cos\left(2\pi u_{0}(x-x_{0})\right)$$

where σ is the standard deviation of the Gaussian function, x0 is the center coordinate of the function, and u0 is the center frequency of the cosine wave; the center frequencies of the Gabor filter are set to 3/64, 3/32 and 3/16, giving three filter templates; the template width is set to 1 and the angular step to 15°, i.e. 24 directions are used;
the texture feature is obtained with the above three filters as follows:
S410: map the image gray levels into the interval [0, 1];
S420: align the reference point of each of the three filter templates with pixel s, obtain the filter responses along each direction j, j ∈ {1, 2, 3, …, 24}, and normalize them;
S430: for each direction, compute the weighted average of the three normalized filter responses and the normalized spectral value as the texture value of the pixel in that direction, giving 24 texture values per pixel;
S440: compare the 24 texture values of pixel s and take the minimum as the final angular texture feature of pixel s;
S450: select each pixel in turn and repeat S410–S440 to obtain the complete angular texture feature values.
CN201810808535.XA 2018-04-12 2018-07-20 Kernelized correlation filter tracking method with adaptive block partitioning Pending CN109064491A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2018103296509 2018-04-12
CN201810329650 2018-04-12

Publications (1)

Publication Number Publication Date
CN109064491A true CN109064491A (en) 2018-12-21

Family

ID=64835318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810808535.XA Pending CN109064491A (en) 2018-04-12 2018-07-20 Kernelized correlation filter tracking method with adaptive block partitioning

Country Status (1)

Country Link
CN (1) CN109064491A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110349183A (en) * 2019-05-30 2019-10-18 西安电子科技大学 KCF-based tracking method and apparatus, electronic device and storage medium
CN110414439A (en) * 2019-07-30 2019-11-05 武汉理工大学 Anti-occlusion pedestrian tracking method based on multi-peak detection
CN110706252A (en) * 2019-09-09 2020-01-17 西安理工大学 Robot kernel correlation filter tracking algorithm under the guidance of a motion model
CN110942472A (en) * 2019-11-28 2020-03-31 广西师范大学 Kernel correlation filter tracking method based on feature fusion and adaptive block partitioning
CN112348847A (en) * 2020-10-26 2021-02-09 南京邮电大学 Target scale self-adaptive tracking method
CN116823737A (en) * 2023-06-05 2023-09-29 中铁九局集团电务工程有限公司 Tunnel wall abnormity detection method and system in low-texture environment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6012458A (en) * 1998-03-20 2000-01-11 Mo; Larry Y. L. Method and apparatus for tracking scan plane motion in free-hand three-dimensional ultrasound scanning using adaptive speckle correlation
CN107123130A (en) * 2017-03-06 2017-09-01 华南理工大学 Kernel correlation filtering target tracking method based on superpixel and hybrid hash
CN107154024A (en) * 2017-05-19 2017-09-12 南京理工大学 Dimension self-adaption method for tracking target based on depth characteristic core correlation filter
CN108090919A (en) * 2018-01-02 2018-05-29 华南理工大学 Improved kernel correlation filtering tracking method based on super-pixel optical flow and adaptive learning factor

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6012458A (en) * 1998-03-20 2000-01-11 Mo; Larry Y. L. Method and apparatus for tracking scan plane motion in free-hand three-dimensional ultrasound scanning using adaptive speckle correlation
CN107123130A (en) * 2017-03-06 2017-09-01 华南理工大学 Kernel correlation filtering target tracking method based on superpixel and hybrid hash
CN107154024A (en) * 2017-05-19 2017-09-12 南京理工大学 Dimension self-adaption method for tracking target based on depth characteristic core correlation filter
CN108090919A (en) * 2018-01-02 2018-05-29 华南理工大学 Improved kernel correlation filtering tracking method based on super-pixel optical flow and adaptive learning factor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
孙智华: "Research on Single-Target Tracking Technology Based on Adaptive Block Partitioning and an Online Discriminative Classifier", China Master's Theses Full-text Database (Information Science and Technology) *
栾悉道: "Multimedia Intelligence Processing Technology", 30 May 2016 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110349183A (en) * 2019-05-30 2019-10-18 西安电子科技大学 KCF-based tracking method and apparatus, electronic device and storage medium
CN110349183B (en) * 2019-05-30 2022-12-09 西安电子科技大学 Tracking method and device based on KCF, electronic equipment and storage medium
CN110414439A (en) * 2019-07-30 2019-11-05 武汉理工大学 Anti-occlusion pedestrian tracking method based on multi-peak detection
CN110414439B (en) * 2019-07-30 2022-03-15 武汉理工大学 Anti-occlusion pedestrian tracking method based on multi-peak detection
CN110706252A (en) * 2019-09-09 2020-01-17 西安理工大学 Robot kernel correlation filter tracking algorithm under the guidance of a motion model
CN110706252B (en) * 2019-09-09 2020-10-23 西安理工大学 Robot kernel correlation filter tracking algorithm under the guidance of a motion model
CN110942472A (en) * 2019-11-28 2020-03-31 广西师范大学 Kernel correlation filter tracking method based on feature fusion and adaptive block partitioning
CN110942472B (en) * 2019-11-28 2023-10-13 江苏砺行能源科技有限公司 Kernel correlation filter tracking method based on feature fusion and adaptive block partitioning
CN112348847A (en) * 2020-10-26 2021-02-09 南京邮电大学 Target scale self-adaptive tracking method
CN112348847B (en) * 2020-10-26 2023-08-15 南京邮电大学 Target scale self-adaptive tracking method
CN116823737A (en) * 2023-06-05 2023-09-29 中铁九局集团电务工程有限公司 Tunnel wall abnormity detection method and system in low-texture environment
CN116823737B (en) * 2023-06-05 2024-05-07 中铁九局集团电务工程有限公司 Tunnel wall abnormity detection method and system in low-texture environment

Similar Documents

Publication Publication Date Title
CN109064491A (en) Kernelized correlation filter tracking method with adaptive block partitioning
CN102324030B (en) Target tracking method and system based on image block characteristics
CN110210360B (en) Rope skipping counting method based on video image target recognition
CN106997597A (en) A target tracking method based on supervised saliency detection
CN108682017A (en) Superpixel image edge detection method based on the Node2Vec algorithm
CN108596951A (en) A target tracking method using fused features
CN107886507B (en) A salient region detection method based on image background and spatial position
CN103632137B (en) A kind of human eye iris segmentation method
CN107392968A (en) An image saliency detection method fusing a color contrast map and a color spatial distribution map
CN112837344A (en) A target tracking method based on a conditional adversarial generative Siamese network
CN106384363B (en) A fast adaptive-weight stereo matching method
CN107944437B (en) A face detection method based on neural networks and integral images
CN108615229B (en) Collision detection optimization method based on curvature point clustering and decision tree
CN101872112B (en) Three-dimensional camera shooting automatic collecting system
CN105957107A (en) Pedestrian detecting and tracking method and device
CN109886267A (en) A soft-image saliency detection method based on optimal feature selection
CN104376334A (en) Pedestrian comparison method based on multi-scale feature fusion
CN104599288A (en) Skin color template based feature tracking method and device
Zhang et al. A prior-based graph for salient object detection
CN106447662A (en) Combined distance based FCM image segmentation algorithm
CN110334581A (en) A multi-source remote sensing image change detection method
CN106874843A (en) A target tracking method and device
CN114511803B (en) Target shielding detection method for visual tracking task
CN110910417A (en) Weak and small moving target detection method based on super-pixel adjacent frame feature comparison
CN107506400B (en) An image retrieval method based on cognitive characteristics and manifold ranking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20181221)