CN117392179B - Target tracking method based on correlation filter and edge frame - Google Patents

Target tracking method based on correlation filter and edge frame

Info

Publication number
CN117392179B
CN117392179B (application CN202311686913.9A)
Authority
CN
China
Prior art keywords
target
frame
edge
tracking
search window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311686913.9A
Other languages
Chinese (zh)
Other versions
CN117392179A (en)
Inventor
李东晨
陈春
高升久
李毅捷
李非桃
冉欢欢
李和伦
陈益
褚俊波
王丹
董平凯
陈未东
杨伟
夏添
罗瀚森
肖枭
何建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Desheng Xinda Brain Intelligence Technology Co ltd
Original Assignee
Sichuan Desheng Xinda Brain Intelligence Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Desheng Xinda Brain Intelligence Technology Co ltd filed Critical Sichuan Desheng Xinda Brain Intelligence Technology Co ltd
Priority to CN202311686913.9A priority Critical patent/CN117392179B/en
Publication of CN117392179A publication Critical patent/CN117392179A/en
Application granted granted Critical
Publication of CN117392179B publication Critical patent/CN117392179B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/223 Analysis of motion using block-matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/77 Determining position or orientation of objects or cameras using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target tracking method based on a correlation filter and edge boxes, relating to the field of target tracking in computer vision, and comprising the following steps: locking onto a target in the initial frame, acquiring its image information, and generating a target template; computing the HOG features and color-histogram features of the target in the initial frame, and initializing a HOG feature model and a color feature model, respectively; acquiring the image information of the current frame; obtaining a plurality of candidate boxes with an edge-box-based method to determine the position of the target search window; judging the size of the target from its image information in the initial frame, and selecting a search window suited to that size; and computing the position and size of the target in the current frame from the response values of the features within the search window, completing the tracking. The invention can track a target in video accurately and in real time even when the moving target changes and the background is complex.

Description

Target tracking method based on correlation filter and edge frame
Technical Field
The invention relates to the technical field of target tracking in computer vision, and in particular to a target tracking method based on a correlation filter and edge boxes.
Background
As a core technology of systems such as military guidance and security surveillance, moving-target tracking is a hot research topic in computer vision. Much recent research on video target tracking, in theory and in application, still rests on method designs tailored to particular constraints; such methods run into great difficulty when the background is complex or the target changes, sometimes losing the target and failing outright, and are generally suitable only for scenarios with a relatively simple background, small target changes, or other specific working conditions. A typical existing tracking algorithm builds a search window around the target coordinates of the previous frame, correlates the template with the search window, takes the position with the largest response as the target coordinates, and then refreshes the template to adapt to possible changes in appearance and background. This works well when the target moves slowly, the camera is static, and the background is simple, but the target is easily lost when it moves too fast or the camera shakes suddenly. In real life, and especially in the military field, fast target motion, complex backgrounds, and camera shake are common, making tracking difficult. How to characterize a target efficiently and accurately, and how to track a moving target accurately in real time when the target changes and the background is complex, is therefore a major difficulty in current video target tracking research.
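For concreteness, the classic loop just described can be sketched in a few lines of numpy. This is only an illustration of the typical prior-art algorithm under simplifying assumptions (grayscale patches of equal size, circular FFT correlation), not an implementation of the method disclosed below.

```python
import numpy as np

def correlation_response(template, window):
    """Circular cross-correlation of a template with a same-size search
    window via the FFT; the peak of the response map gives the new
    target position."""
    t_hat = np.fft.fft2(template)
    w_hat = np.fft.fft2(window)
    response = np.real(np.fft.ifft2(np.conj(t_hat) * w_hat))
    y, x = np.unravel_index(np.argmax(response), response.shape)
    return (x, y), response

# Each frame, the typical tracker re-centers the search window on the
# peak and then refreshes the template to absorb appearance changes.
```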
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a target tracking method based on a correlation filter and edge boxes that achieves target tracking in complex environments.
The aim of the invention is achieved by the following technical scheme:
A target tracking method based on a correlation filter and edge boxes, characterized in that it comprises the following steps:
S1: locking onto the target in the initial frame, acquiring its image information, and generating a target template;
S2: computing the HOG features and color-histogram features of the target in the initial frame, and initializing a HOG feature model and a color feature model, respectively;
S3: acquiring the image information of the current frame;
S4: obtaining a plurality of candidate boxes using an edge-box-based method to determine the position of the target search window;
S5: judging the size of the target from its image information in the initial frame, and selecting a search window suited to that size;
S6: computing the position and size of the target in the current frame from the response values of the features within the search window, completing the tracking.
Further, the image information of the target includes the target's pixel information and the center-point coordinates, height, and width of its rectangular box.
Further, the HOG feature model and the color feature model each come in two sizes, 32×32 and 64×64.
Further, step S4 specifically includes:
S41: detecting image edges with a structured edge detection algorithm, and thinning the edges with non-maximum suppression;
S42: searching along 8-connected edge points until the orientation difference between the starting edge point and the ending edge point exceeds 90°, yielding N edge groups, each a set of edge points lying nearly on the same straight line;
S43: calculating the similarity between every pair of edge groups;
S44: calculating the weight of each edge group;
S45: deleting candidate boxes whose width and height do not meet the search-window requirements;
S46: scoring each remaining box, and taking the k highest-scoring boxes as candidate boxes;
S47: cropping the candidate boxes to the size of the target template, computing the correlation between the gray-level co-occurrence matrices of the k candidate boxes and that of the target template, and taking the center point of the most correlated candidate box as the center of the search window;
S48: if all the obtained correlations are below a threshold, the target is deemed occluded; occlusion is reported and tracking of subsequent frames is suspended until the target reappears, whereupon re-tracking begins.
Further, the similarity between two edge groups s_i and s_j is calculated by the following formula:
a(s_i, s_j) = |cos(θ_i - θ_ij)·cos(θ_j - θ_ij)|^γ
where θ_i is the mean orientation of edge group s_i, whose points have mean position (x_i, y_i); θ_j is the mean orientation of edge group s_j, whose points have mean position (x_j, y_j); θ_ij is the angle between the mean positions of the two groups; and γ is a hyper-parameter.
Further, the weight of edge group s_i is calculated by the following formula:
w(s_i) = 1 - max_T ∏_{j=1}^{|T|-1} a(t_j, t_{j+1})
where T = {t_1, t_2, ..., t_{j+1}} denotes an ordered set of edge groups forming a contour from the boundary of the box to s_i, and the maximum is taken over all such contours.
Further, the specific process of deleting boxes that do not meet the search-window width and height requirements is: judge whether the box width is between 0.8 and 1.2 times the width of the target rectangle, inclusive, and whether the box height is between 0.8 and 1.2 times the height of the target rectangle, inclusive; if so, keep the box, otherwise delete it;
the score of each box remaining after screening is obtained by the following formula:
h_b = Σ_i w(s_i)·m_i / (2·(b_w + b_h)^κ)
where b_w is the box width, b_h is the box height, m_i is the edge magnitude of edge group s_i (the summed magnitudes of its points), and κ is a hyper-parameter.
Further, the specific process of step S5 is to judge the size of the target in the initial frame: if both the height and width of the target are within 21 pixels, a 32×32 search window is taken; if they are between 21 and 42 pixels, a 64×64 search window is taken.
Further, step S6 specifically includes:
S61: calculating the HOG features, color-histogram features, and scale features within the search window;
S62: obtaining the response points of each feature of the target;
S63: according to f(x) = γ_tmpl·f_tmpl(x) + γ_hist·f_hist(x), finding the point with the largest response value as the coordinate point of the target, where f_tmpl is the score of the HOG feature model, f_hist is the score of the color feature model, γ_tmpl is the scoring weight of the HOG feature model, and γ_hist is the scoring weight of the color feature model;
S64: taking the scale whose response is largest as the scale of the target.
Further, the method further comprises:
S7: after the position and size of the target in the current frame are obtained, updating all model parameters and the target template used by the edge-box method.
The beneficial effects of the invention are as follows:
1) Determining the position of the search window with the edge-box-based method avoids the tracking failure that occurs when the target flies out of the search window because it moves too fast or the camera shakes.
2) A scale model is added on top of the correlation filter, avoiding the tracking failure caused by changes in the target's scale.
Drawings
FIG. 1 is a flow chart of the steps for implementing the present invention.
Detailed Description
The technical solutions of the invention are described below clearly and completely with reference to the embodiments; obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive effort fall within the scope of the invention.
Referring to FIG. 1, the invention provides the following technical solution:
A target tracking method based on a correlation filter and edge boxes, characterized in that it comprises the following steps:
S1: locking onto the target in the initial frame, acquiring its image information, and generating a target template; in this embodiment the image information of the target includes its pixel information and the center-point coordinates (x, y), height h, and width w of its rectangular box.
S2: computing the HOG features and color-histogram features of the target in the initial frame, and initializing a HOG feature model and a color feature model, respectively; each model comes in two sizes, 32×32 and 64×64.
S3: acquiring the image information of the current frame.
S4: obtaining a plurality of candidate boxes using an edge-box-based method to determine the position of the target search window.
In this embodiment, the specific steps of obtaining a plurality of candidate boxes with the edge-box-based method are:
S41: detecting image edges with a structured edge detection algorithm, and thinning the edges with non-maximum suppression;
S42: searching along 8-connected edge points until the orientation difference between the starting edge point and the ending edge point exceeds 90°, yielding N edge groups, each a set of edge points lying nearly on the same straight line;
S43: calculating the similarity between every pair of edge groups; two edge groups lying on the same straight line have higher similarity;
S44: calculating the weight of each edge group;
S45: deleting candidate boxes whose width and height do not meet the search-window requirements;
S46: scoring each remaining box, and taking the k highest-scoring boxes as candidate boxes;
S47: cropping the candidate boxes to the size of the target template, computing the correlation between the gray-level co-occurrence matrices of the k candidate boxes and that of the target template, and taking the center point of the most correlated candidate box as the center of the search window; in this embodiment the 10 highest-scoring boxes are taken as candidates (a sketch of the co-occurrence check follows this list).
S48: if all the obtained correlations are below a threshold, the target is deemed occluded; occlusion is reported and tracking of subsequent frames is suspended until the target reappears, whereupon re-tracking begins.
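The co-occurrence check of step S47 can be sketched as follows. This is a minimal numpy sketch under assumptions the text does not fix: a single (1, 0) pixel offset, 16 gray levels, and Pearson correlation between flattened matrices as the correlation measure; the threshold TAU in the usage note is likewise an assumed name.

```python
import numpy as np

def glcm(patch, levels=16):
    """Normalized gray-level co-occurrence matrix of a uint8 patch for
    the single pixel offset (dx, dy) = (1, 0)."""
    q = (patch.astype(np.int32) * levels // 256).clip(0, levels - 1)
    m = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(m, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1.0)
    return m / m.sum()

def glcm_correlation(patch_a, patch_b):
    """Pearson correlation between the flattened GLCMs of two patches."""
    a, b = glcm(patch_a).ravel(), glcm(patch_b).ravel()
    return float(np.corrcoef(a, b)[0, 1])

# Usage against the k candidate boxes (template and candidates are uint8
# grayscale patches already cropped to the template size):
#   scores = [glcm_correlation(template, c) for c in candidates]
#   if max(scores) < TAU: report occlusion and suspend tracking;
#   otherwise center the search window on the best candidate's center.
```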
The similarity between two edge groups s_i and s_j in this embodiment is calculated by the following formula:
a(s_i, s_j) = |cos(θ_i - θ_ij)·cos(θ_j - θ_ij)|^γ
where θ_i is the mean orientation of edge group s_i, whose points have mean position (x_i, y_i); θ_j is the mean orientation of edge group s_j, whose points have mean position (x_j, y_j); θ_ij is the angle between the mean positions of the two groups; and γ is a hyper-parameter.
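As a sketch of step S43, this similarity can be computed directly from each group's mean orientation and mean position (orientations in radians); the default gamma = 2.0 follows the original Edge Boxes work and is only an assumption here, since the text leaves γ as a hyper-parameter.

```python
import numpy as np

def affinity(theta_i, pos_i, theta_j, pos_j, gamma=2.0):
    """a(s_i, s_j) for two edge groups: high when both groups point
    along the line joining their mean positions.

    theta_i, theta_j: mean orientations of the two groups (radians);
    pos_i, pos_j: mean (x, y) positions of their points."""
    theta_ij = np.arctan2(pos_j[1] - pos_i[1], pos_j[0] - pos_i[0])
    return abs(np.cos(theta_i - theta_ij) * np.cos(theta_j - theta_ij)) ** gamma
```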
The weight of edge group s_i in this embodiment is calculated by the following formula:
w(s_i) = 1 - max_T ∏_{j=1}^{|T|-1} a(t_j, t_{j+1})
where T = {t_1, t_2, ..., t_{j+1}} denotes an ordered set of edge groups forming a contour from the boundary of the box to s_i, and the maximum is taken over all such contours.
The specific process of deleting boxes that do not meet the search-window width and height requirements in this embodiment is: judge whether the box width is between 0.8 and 1.2 times the width of the target rectangle, inclusive, and whether the box height is between 0.8 and 1.2 times the height of the target rectangle, inclusive; if so, keep the box, otherwise delete it;
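This filter reduces to two interval checks; a minimal sketch:

```python
def size_ok(b_w, b_h, target_w, target_h):
    """Step S45: keep a candidate box only if both its width and height
    lie within 0.8 to 1.2 times those of the target rectangle."""
    return (0.8 * target_w <= b_w <= 1.2 * target_w
            and 0.8 * target_h <= b_h <= 1.2 * target_h)
```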
the score of each box remaining after screening is obtained by the following formula:
h_b = Σ_i w(s_i)·m_i / (2·(b_w + b_h)^κ)
where b_w is the box width, b_h is the box height, m_i is the edge magnitude of edge group s_i (the summed magnitudes of its points), and κ is a hyper-parameter.
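The weight of step S44 and the box score of step S46 can be sketched directly from the two formulas above, assuming the contour paths and per-group edge magnitudes have already been gathered; kappa = 1.5 follows the original Edge Boxes work and is only an assumption, since the text leaves κ as a hyper-parameter.

```python
import numpy as np

def group_weight(paths):
    """Weight of an edge group (step S44):
    w(s_i) = 1 - max over contours T of the product of consecutive
    affinities a(t_j, t_{j+1}) along T.

    paths: list of 1-D arrays, one per candidate contour from the box
    boundary to the group, holding the consecutive affinities."""
    if not paths:  # no contour reaches the group: it lies wholly inside
        return 1.0
    return 1.0 - max(float(np.prod(p)) for p in paths)

def box_score(weights, magnitudes, b_w, b_h, kappa=1.5):
    """Score of a candidate box (step S46):
    h_b = sum_i w(s_i) * m_i / (2 * (b_w + b_h) ** kappa)."""
    return float(np.dot(weights, magnitudes)) / (2.0 * (b_w + b_h) ** kappa)
```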
Determining the position of the search window with the edge-box-based method avoids the situation in which the target flies out of the search window and tracking fails because the target moves too fast or the camera shakes.
S5: judging the size of the target from its image information in the initial frame, and selecting a search window suited to that size; the specific process in step S5 is to judge the size of the target in the initial frame: if both the height and width of the target are within 21 pixels, a 32×32 search window is taken; if they are between 21 and 42 pixels, a 64×64 search window is taken.
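A minimal sketch of this selection rule; treating targets larger than 42 pixels as also using the 64×64 window is an assumption, since the text does not specify that case:

```python
def pick_window_size(target_h, target_w):
    """Step S5: choose the model/search-window size from the initial
    target size (falls back to 64 above 42 px; an assumption)."""
    if target_h <= 21 and target_w <= 21:
        return 32
    return 64
```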
S6: computing the position and size of the target in the current frame from the response values of the features within the search window, completing the tracking.
In this embodiment, the specific steps of computing the position and size of the target in the current frame from the feature response values in the search window are:
S61: calculating the HOG features, color-histogram features, and scale features within the search window;
S62: obtaining the response points of each feature of the target;
S63: according to f(x) = γ_tmpl·f_tmpl(x) + γ_hist·f_hist(x), finding the point with the largest response value as the coordinate point of the target (see the sketch after this list), where f_tmpl is the score of the HOG feature model, f_hist is the score of the color feature model, γ_tmpl is the scoring weight of the HOG feature model, and γ_hist is the scoring weight of the color feature model;
S64: taking the scale whose response is largest as the scale of the target.
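A sketch of the fusion in step S63, assuming the two models have already produced same-size response maps over the search window; the 0.7/0.3 weights are placeholders, since the text does not fix the two scoring weights.

```python
import numpy as np

def fuse_responses(f_tmpl, f_hist, g_tmpl=0.7, g_hist=0.3):
    """Step S63: f(x) = g_tmpl * f_tmpl(x) + g_hist * f_hist(x).

    f_tmpl, f_hist: same-size response maps of the HOG model and the
    color feature model. Returns the peak coordinate and its value."""
    f = g_tmpl * f_tmpl + g_hist * f_hist
    y, x = np.unravel_index(np.argmax(f), f.shape)
    return (x, y), float(f[y, x])
```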
Furthermore, the method further comprises:
S7: after the position and size of the target in the current frame are obtained, updating all parameters of the HOG feature model and the color feature model and the target template used by the edge-box method. The parameters of the 64×64 and 32×32 HOG and color feature models are updated mainly by a weighted sum of the parameters computed from the current frame and the existing model parameters. If the 32×32 model was selected, the 32×32 matrix must be enlarged to 64×64 by interpolation before the weighted sum that updates the 64×64 model; if the 64×64 model was selected, the 64×64 matrix must be reduced to 32×32 when updating the 32×32 model.
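A sketch of this dual-size update, with a per-frame learning rate lr that the text does not fix; nearest-neighbor enlargement stands in for the unspecified interpolation, and 2×2 block averaging for the reduction.

```python
import numpy as np

def update_models(model32, model64, new_params, used_32, lr=0.02):
    """Step S7: blend current-frame parameters into both model sizes.

    Only one size is measured each frame, so the other size is updated
    from a resampled copy of new_params."""
    if used_32:                       # new_params is 32x32
        p32 = new_params
        p64 = np.repeat(np.repeat(new_params, 2, axis=0), 2, axis=1)
    else:                             # new_params is 64x64
        p64 = new_params
        p32 = new_params.reshape(32, 2, 32, 2).mean(axis=(1, 3))
    model32 = (1.0 - lr) * model32 + lr * p32
    model64 = (1.0 - lr) * model64 + lr * p64
    return model32, model64
```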
The foregoing is merely a preferred embodiment of the invention. It should be understood that the invention is not limited to the form disclosed herein, and this description is not to be construed as excluding other embodiments; the invention is capable of use in various other combinations, modifications, and environments, and can be changed within the scope of the inventive concept described herein through the above teachings or the skill and knowledge of the relevant art. Modifications and variations made by those skilled in the art that do not depart from the spirit and scope of the invention are intended to fall within the scope of the appended claims.

Claims (8)

1. A target tracking method based on a correlation filter and edge boxes, comprising:
S1: locking onto the target in the initial frame, acquiring its image information, and generating a target template;
S2: computing the HOG features and color-histogram features of the target in the initial frame, and initializing a HOG feature model and a color feature model, respectively;
S3: acquiring the image information of the current frame;
S4: obtaining a plurality of candidate boxes using an edge-box-based method to determine the position of the target search window;
S5: judging the size of the target from its image information in the initial frame, and selecting a search window suited to that size;
S6: computing the position and size of the target in the current frame from the response values of the features within the search window, completing the tracking;
wherein step S4 specifically includes:
S41: detecting image edges with a structured edge detection algorithm, and thinning the edges with non-maximum suppression;
S42: searching along 8-connected edge points until the orientation difference between the starting edge point and the ending edge point exceeds 90°, yielding N edge groups, each a set of edge points lying nearly on the same straight line;
S43: calculating the similarity between every pair of edge groups;
S44: calculating the weight of each edge group;
S45: deleting candidate boxes whose width and height do not meet the search-window requirements;
S46: scoring each remaining box, and taking the k highest-scoring boxes as candidate boxes;
S47: cropping the candidate boxes to the size of the target template, computing the correlation between the gray-level co-occurrence matrices of the k candidate boxes and that of the target template, and taking the center point of the most correlated candidate box as the center of the search window;
S48: if all the obtained correlations are below a threshold, deeming the target occluded, reporting the occlusion, and suspending tracking of subsequent frames until the target reappears, whereupon re-tracking begins;
and step S6 specifically includes:
S61: calculating the HOG features, color-histogram features, and scale features within the search window;
S62: obtaining the response points of each feature of the target;
S63: according to f(x) = γ_tmpl·f_tmpl(x) + γ_hist·f_hist(x), finding the point with the largest response value as the coordinate point of the target, where f_tmpl is the score of the HOG feature model, f_hist is the score of the color feature model, γ_tmpl is the scoring weight of the HOG feature model, and γ_hist is the scoring weight of the color feature model;
S64: taking the scale whose response is largest as the scale of the target.
2. The target tracking method based on a correlation filter and edge boxes according to claim 1, wherein the image information of the target includes the target's pixel information and the center-point coordinates, height, and width of its rectangular box.
3. The target tracking method based on a correlation filter and edge boxes according to claim 1, wherein the HOG feature model and the color feature model each come in two sizes, 32×32 and 64×64.
4. The target tracking method based on a correlation filter and edge boxes according to claim 1, wherein the similarity between two edge groups s_i and s_j is calculated by the following formula:
a(s_i, s_j) = |cos(θ_i - θ_ij)·cos(θ_j - θ_ij)|^γ
where θ_i is the mean orientation of edge group s_i, whose points have mean position (x_i, y_i); θ_j is the mean orientation of edge group s_j, whose points have mean position (x_j, y_j); θ_ij is the angle between the mean positions of the two groups; and γ is a hyper-parameter.
5. The target tracking method based on a correlation filter and edge boxes according to claim 4, wherein the weight of edge group s_i is calculated by the following formula:
w(s_i) = 1 - max_T ∏_{j=1}^{|T|-1} a(t_j, t_{j+1})
where T = {t_1, t_2, ..., t_{j+1}} denotes an ordered set of edge groups forming a contour from the boundary of the box to s_i, and the maximum is taken over all such contours.
6. The target tracking method based on a correlation filter and edge boxes according to claim 5, wherein:
the specific process of deleting boxes that do not meet the search-window width and height requirements is: judge whether the box width is between 0.8 and 1.2 times the width of the target rectangle, inclusive, and whether the box height is between 0.8 and 1.2 times the height of the target rectangle, inclusive; if so, keep the box, otherwise delete it;
the score of each box remaining after screening is obtained by the following formula:
h_b = Σ_i w(s_i)·m_i / (2·(b_w + b_h)^κ)
where b_w is the box width, b_h is the box height, m_i is the edge magnitude of edge group s_i (the summed magnitudes of its points), and κ is a hyper-parameter.
7. The target tracking method based on a correlation filter and edge boxes according to claim 3, wherein the specific process of step S5 is to judge the size of the target in the initial frame: if both the height and width of the target are within 21 pixels, a 32×32 search window is taken; if they are between 21 and 42 pixels, a 64×64 search window is taken.
8. The target tracking method based on a correlation filter and edge boxes according to claim 1, further comprising:
S7: after the position and size of the target in the current frame are obtained, updating all parameters of the HOG feature model and the color feature model and the target template used by the edge-box method.
CN202311686913.9A 2023-12-11 2023-12-11 Target tracking method based on correlation filter and edge frame Active CN117392179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311686913.9A CN117392179B (en) 2023-12-11 2023-12-11 Target tracking method based on correlation filter and edge frame

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311686913.9A CN117392179B (en) 2023-12-11 2023-12-11 Target tracking method based on correlation filter and edge frame

Publications (2)

Publication Number Publication Date
CN117392179A CN117392179A (en) 2024-01-12
CN117392179B true CN117392179B (en) 2024-02-27

Family

ID=89465140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311686913.9A Active CN117392179B (en) 2023-12-11 2023-12-11 Target tracking method based on correlation filter and edge frame

Country Status (1)

Country Link
CN (1) CN117392179B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015171084A1 (en) * 2014-05-08 2015-11-12 Aselsan Elektronik Sanayi Ve Ticaret Anonim Şirketi A real- time, semi-automatic method for target tracking window initialization in thermal imagery
CN105868776A (en) * 2016-03-25 2016-08-17 中国科学院自动化研究所 Transformer equipment recognition method and device based on image processing technology
CN105930803A (en) * 2016-04-22 2016-09-07 北京智芯原动科技有限公司 Preceding vehicle detection method based on Edge Boxes and preceding vehicle detection device thereof
CN106250812A (en) * 2016-07-15 2016-12-21 汤平 A kind of model recognizing method based on quick R CNN deep neural network
CN106651913A (en) * 2016-11-29 2017-05-10 开易(北京)科技有限公司 Target tracking method based on correlation filtering and color histogram statistics and ADAS (Advanced Driving Assistance System)
CN106971176A (en) * 2017-05-10 2017-07-21 河海大学 Tracking infrared human body target method based on rarefaction representation
CN108803655A (en) * 2018-06-08 2018-11-13 哈尔滨工程大学 A kind of UAV Flight Control platform and method for tracking target
CN108876818A (en) * 2018-06-05 2018-11-23 国网辽宁省电力有限公司信息通信分公司 A kind of method for tracking target based on like physical property and correlation filtering
CN109087322A (en) * 2018-07-18 2018-12-25 华中科技大学 A kind of Moving small targets detection method of Aerial Images
CN110472577A (en) * 2019-08-15 2019-11-19 江南大学 Video tracing method when a kind of long based on adaptive correlation filtering
CN111915649A (en) * 2020-07-27 2020-11-10 北京科技大学 Strip steel moving target tracking method under shielding condition
CN113888586A (en) * 2021-09-01 2022-01-04 河北汉光重工有限责任公司 Target tracking method and device based on correlation filtering
CN116228817A (en) * 2023-03-10 2023-06-06 东南大学 Real-time anti-occlusion anti-jitter single target tracking method based on correlation filtering
CN116665097A (en) * 2023-05-25 2023-08-29 南京理工大学 Self-adaptive target tracking method combining context awareness

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Applying Detection Proposals to Visual Tracking for Scale and Aspect Ratio Adaptability; Dafei Huang et al.; Springer; 2016-12-26; pp. 1-18 *
Correlation Filter Based Moving Object Tracking With Scale Adaptation and Online Re-Detection; Md Mojahidul Islam et al.; IEEE Access; 2018-11-27; pp. 1-15 *
A survey of deep-learning-based target detection and tracking methods for UAV aerial photography; 欧阳权 et al.; Electronics Optics & Control; 2023-11-16; pp. 1-10 *
Research on target tracking algorithms based on correlation filtering; 闫培亮; China Masters' Theses Full-text Database, Information Science and Technology; 2020-02-15; No. 02; I135-554 *
Target re-detection and tracking based on correlation filtering; 姜珊 et al.; Infrared and Laser Engineering; 2021-02-28; Vol. 50, No. 2; pp. 1-12 *
An adaptive correlation filtering algorithm for long-term tracking; 肖逸清 et al.; Journal of Computer-Aided Design & Computer Graphics; 2020-01-31; Vol. 32, No. 1; pp. 121-129 *

Also Published As

Publication number Publication date
CN117392179A (en) 2024-01-12

Similar Documents

Publication Publication Date Title
US20220366576A1 (en) Method for target tracking, electronic device, and storage medium
CN111508002B (en) Small-sized low-flying target visual detection tracking system and method thereof
KR100647322B1 (en) Apparatus and method of generating shape model of object and apparatus and method of automatically searching feature points of object employing the same
CN111598916A (en) Preparation method of indoor occupancy grid map based on RGB-D information
CN112836640B (en) Single-camera multi-target pedestrian tracking method
CN103093198B (en) A kind of crowd density monitoring method and device
CN110647836B (en) Robust single-target tracking method based on deep learning
CN111445497B (en) Target tracking and following method based on scale context regression
US10482584B1 (en) Learning method and learning device for removing jittering on video acquired through shaking camera by using a plurality of neural networks for fault tolerance and fluctuation robustness in extreme situations, and testing method and testing device using the same
CN112446882A (en) Robust visual SLAM method based on deep learning in dynamic scene
CN110111370B (en) Visual object tracking method based on TLD and depth multi-scale space-time features
CN111192294A (en) Target tracking method and system based on target detection
CN108710879B (en) Pedestrian candidate region generation method based on grid clustering algorithm
CN114708300A (en) Anti-blocking self-adaptive target tracking method and system
CN112308879A (en) Image processing apparatus, method of tracking target object, and storage medium
CN117392179B (en) Target tracking method based on correlation filter and edge frame
CN113781523A (en) Football detection tracking method and device, electronic equipment and storage medium
CN111738085B (en) System construction method and device for realizing automatic driving simultaneous positioning and mapping
Islam et al. ARD-SLAM: Accurate and robust dynamic SLAM using dynamic object identification and improved multi-view geometrical approaches
Stumper et al. Offline object extraction from dynamic occupancy grid map sequences
CN115511920A (en) Detection tracking method and system based on deep sort and deep EMD
CN110930519B (en) Semantic ORB-SLAM sensing method and device based on environment understanding
Maier et al. Surprise-driven acquisition of visual object representations for cognitive mobile robots
CN113284228B (en) Indoor scene room layout dividing method based on point cloud
CN107563284B (en) Pedestrian tracking method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant