CN112200827B - Far and near scene-based infrared image tracking algorithm evaluation method and platform - Google Patents

Far and near scene-based infrared image tracking algorithm evaluation method and platform Download PDF

Info

Publication number
CN112200827B
CN112200827B (application CN202010937960.6A)
Authority
CN
China
Prior art keywords
tracking
module
image
algorithm
evaluation
Prior art date
Legal status
Active
Application number
CN202010937960.6A
Other languages
Chinese (zh)
Other versions
CN112200827A (en)
Inventor
李婷
徐传刚
姚克明
刘国文
王悦行
Current Assignee
Tianjin Jinhang Institute of Technical Physics
Original Assignee
Tianjin Jinhang Institute of Technical Physics
Priority date
Filing date
Publication date
Application filed by Tianjin Jinhang Institute of Technical Physics filed Critical Tianjin Jinhang Institute of Technical Physics
Priority to CN202010937960.6A priority Critical patent/CN112200827B/en
Publication of CN112200827A publication Critical patent/CN112200827A/en
Application granted granted Critical
Publication of CN112200827B publication Critical patent/CN112200827B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/97 - Determining parameters from multiple pictures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10048 - Infrared image
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a far and near scene-based infrared image tracking algorithm evaluation method and evaluation platform, which solve the problem that qualitative evaluation of existing tracking algorithms cannot meet the needs of practical application. In the evaluation method, a test image library is established from infrared images of far and near field scenes, and the real target center point position and real target area range of each frame of image in an image sequence are marked to form a configuration file. The configuration file is loaded, the image sequence is tracked by the algorithm to be evaluated, and tracking precision, tracking success rate and stability are then evaluated by comparing the tracking results against the marked real target center point positions and real target area ranges. Because result data and marking data are read in and batch-processed through the configuration file, the evaluation process requires no human participation, which improves evaluation efficiency and precision, realizes automatic evaluation of tracking algorithms in far and near application scenes, and facilitates both tracking algorithm selection and rapid iterative optimization during algorithm development.

Description

Far and near scene-based infrared image tracking algorithm evaluation method and platform
Technical Field
The invention belongs to the field of computer vision and image processing, and particularly relates to an infrared image tracking algorithm evaluation method and an evaluation platform based on far and near scenes.
Background
With the development of computer vision, the number of image tracking algorithms keeps growing, and internationally published visual target tracking competitions (VOT, Benchmark, etc.) are equipped with universal test data sets and universal test platforms, providing a unified evaluation platform for many algorithms.
In the prior art, tracking algorithms are mainly evaluated by accuracy and success rate, where accuracy is the position deviation error of the center point and success rate is the overlap ratio of the tracking area. However, in infrared application scenes where the target moves from far to near, especially in the infrared aviation or military fields, these two indexes cannot adequately reflect the performance characteristics of a tracking algorithm in practical application, and the publicly available evaluation platforms are no longer applicable.
Disclosure of Invention
In view of the above defects or shortcomings in the prior art, the invention aims to provide a far and near scene-based infrared image tracking algorithm evaluation method and evaluation platform, which offer a unified performance evaluation platform for infrared image tracking algorithms, improve evaluation efficiency, and realize automatic evaluation of tracking algorithm performance in fields such as far and near infrared aviation or military applications.
In order to achieve the above purpose, the embodiment of the present invention adopts the following technical scheme:
in a first aspect, an embodiment of the present invention provides an evaluation method of an infrared image tracking algorithm based on a far and near scene, where the evaluation method of the tracking algorithm includes:
step S1, a test image library is established based on infrared image data under far and near field scenes, the position of a real target center point and the range of a real target area of each frame of image in an image sequence are marked, and a configuration file is formed, wherein the configuration file comprises an algorithm to be evaluated;
s2, loading a configuration file, tracking the image sequence according to an algorithm to be evaluated, and outputting a tracking result;
s3, carrying out data analysis according to the tracking result, the position of the center point of the real target and the range mark of the region of the real target;
in the above scheme, the step S3 of data analysis compares the tracking result with the real marking result on the basis of the marked infrared image big data test image library, and normalizes the comparison result according to the distance from far to near.
In the above scheme, the data analysis includes: evaluating tracking precision, evaluating tracking success rate and evaluating stability;
the true target center point is (x target ,y target ) Real target areaDomain range S target The tracking center point coordinates are (x tracker ,y tracker );
the tracking precision Pre is defined as:
Pre = sqrt( ((x_tracker - x_target) / S_target,x)^2 + ((y_tracker - y_target) / S_target,y)^2 ) (1)
in formula (1), S_target,x denotes the length of the target area in the x direction and S_target,y denotes the length of the target area in the y direction;
the tracking success rate Suc is defined as:
Suc = (x_tracker, y_tracker) ∈ S_target (2)
formula (2) indicates that a tracking point inside the target area represents successful tracking, and a tracking point outside the target area represents tracking failure;
stability of the tracking algorithm is defined as:
Sta = L_tracker / N (3)
in formula (3), L_tracker denotes the maximum number of image frames that the tracking algorithm tracks successfully in succession in the test sequence, and N denotes the total number of image frames in the current image sequence.
In the above scheme, the configuration file records the storage path of the image sequence in the test image library, the name of the image sequence and the algorithm to be evaluated.
In the above scheme, the algorithm to be evaluated is recorded in the configuration file in the form of an algorithm abbreviation or an algorithm code.
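The configuration file described above can be pictured with a small sketch. The patent specifies only the fields it records (library save path, sequence names, algorithm abbreviation or code), not the concrete file format, so the JSON layout and every field name below are assumptions for illustration:

```python
import json

# Hypothetical configuration layout: the patent states that the file records
# the save path of the test image library, the image sequence names, and the
# algorithm to be evaluated (as a pre-agreed abbreviation or algorithm code).
config = {
    "library_path": "test_image_library",            # save path (assumed name)
    "sequences": ["far_scene_01", "near_scene_02"],  # image sequence names
    "algorithm": "KCF",                              # pre-agreed abbreviation
}

text = json.dumps(config)   # what would be written to the configuration file
loaded = json.loads(text)   # what step S201 (loading) would read back
```

Because the abbreviation is fixed and agreed in advance, the platform can map `loaded["algorithm"]` to the corresponding algorithm code without human input.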
In the above solution, the step S2 includes the following steps:
step S201, loading a configuration file;
step S202, judging whether the image sequence needs to be specified; if the image sequence is required to be specified, evaluating the algorithm through the specific image sequence, and then entering step S203; otherwise, go to step S204;
step S203, a specific image sequence is designated;
step S204, acquiring the save path of the test image library;
step S205, sequentially analyzing an ith image sequence in a storage path; i starts from 1;
step S206, judging whether the current frame is the first frame of the current image sequence; when the frame is the first frame, the process proceeds to step S207; otherwise, directly enter step S208;
step S207, acquiring an initial tracking position and initializing tracking;
step S208, judging whether the current frame is the last frame of the current image sequence; when it is the last frame, the process proceeds to step S209; otherwise, the next frame is taken and the process returns to step S206;
step S209, judging whether all tracking image sequences have been traversed; when traversed, go to step S3; otherwise, the process advances to step S205.
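The sequence and frame loops of steps S205 to S209 can be sketched as follows. This is a minimal sketch: the tracker interface (`init`/`update`) and the data shapes are assumptions, not an API defined by the patent:

```python
def run_tracking(sequences, tracker, init_positions):
    """Sketch of the loops in steps S205-S209.

    sequences: dict mapping sequence name -> list of already-loaded frames,
    init_positions: dict mapping sequence name -> marked initial position
    (step S207 takes the initial tracking position from the first frame's mark).
    The tracker interface (init/update) is a hypothetical stand-in.
    """
    results = {}
    for name, frames in sequences.items():      # S205/S209: next image sequence
        track = []
        for k, frame in enumerate(frames):      # S206-S208: frame loop
            if k == 0:                          # S206/S207: first frame:
                tracker.init(frame, init_positions[name])   # initialize tracking
                track.append(init_positions[name])
            else:                               # subsequent frames: update
                track.append(tracker.update(frame))
        results[name] = track                   # tracking result of this sequence
    return results
```

The per-frame outputs collected here are what step S3 later compares against the marked real target positions.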
In a second aspect, the embodiment of the invention also provides an infrared image tracking algorithm evaluation platform based on a far-near scene, wherein the tracking algorithm evaluation platform comprises a test image library creation module, a tracking algorithm implementation module, a tracking result analysis module and an evaluation result output module which are sequentially connected; wherein:
the test image library creation module is used for creating a test image library based on infrared image big data, marking the position of a real target center point and the range of a real target area of each frame of image in the image sequence, and forming a configuration file;
the tracking algorithm implementation module is used for loading a configuration file, tracking the image sequence according to an algorithm to be evaluated and outputting a tracking result;
and the tracking result analysis module is used for carrying out data analysis according to the tracking result and the real marks, completing the evaluation of the algorithm;
and the evaluation result output module is used for completing and outputting an algorithm evaluation result according to the data analysis.
In the above scheme, the tracking algorithm implementation module includes: an initialization loading module, an image sequence designating module, a path saving module, an image sequence analyzing module, an image frame judging module, an initial frame tracking module, an image tracking module and an image sequence judging module; wherein:
the initialization loading module is connected to both the image sequence designating module and the path saving module, the image sequence designating module is connected to the path saving module, the path saving module is in turn connected to the image sequence analyzing module, the image sequence analyzing module is connected to the image frame judging module, the image frame judging module is connected to both the initial frame tracking module and the image tracking module, the initial frame tracking module is connected to the image tracking module, and the image tracking module is connected to the image sequence judging module; the image sequence judging module is connected to both the image sequence analyzing module and the tracking result analysis module.
In the above scheme, the tracking result analysis module comprises a tracking precision evaluation module, a tracking success rate evaluation module and a stability evaluation module;
the tracking accuracy evaluation module, the tracking success rate evaluation module and the stability evaluation module are all connected with the image sequence judgment module, respectively obtain tracking results of the image tracking module and are simultaneously connected with the evaluation result output module.
The invention has the following beneficial effects:
According to the algorithm tracking evaluation method and evaluation platform based on infrared image big data provided by the embodiments of the invention, the tracking results are compared with the real marking results on the basis of the marked infrared image big data test image library, the comparison results are normalized according to the far-to-near distance, and tracking algorithm performance is analyzed through three indexes: tracking precision, tracking success rate and stability. The tracking precision is normalized as the deviation of the target tracking point relative to the target area, compensating for the evaluation deviation caused by targets occupying different numbers of pixels in far and near field scene applications. For the tracking success rate, a tracking point inside the target area represents successful tracking and a point outside it represents failure; this success rate evaluation method can guide practical engineering application and remains applicable to tracking algorithms that cannot output a target area. The stability of the tracking algorithm is the maximum number of images that the algorithm tracks successfully in succession in the test sequence.
The platform automatically identifies the algorithm through interaction of the configuration file with the test image library and the tracking algorithm, and invokes the corresponding algorithm code to realize the tracking process of the algorithm. Result data and marking data are read in and batch-processed through the configuration file, and the deviation curves and statistical data are output directly; the evaluation process requires no human participation, which improves its efficiency. This realizes automatic evaluation of tracking algorithm performance in the far and near infrared aviation or military fields, provides a unified performance evaluation platform for tracking algorithms, and facilitates both tracking algorithm selection and rapid iterative optimization during algorithm development.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 is a flowchart of an evaluation method based on an infrared image tracking algorithm of far and near scenes according to an embodiment of the present invention;
FIG. 2 is a flowchart of a tracking algorithm implementation in an embodiment of the present invention;
FIG. 3 is an exemplary graph of a precision deviation curve in accordance with an embodiment of the present invention;
FIG. 4 is a diagram showing a partial example of a statistics table of tracking results of a tracking algorithm according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an evaluation platform structure based on an infrared image tracking algorithm of far and near scenes according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a structure of a tracking algorithm implementation module in a tracking algorithm evaluation platform according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a tracking result analysis module in a tracking algorithm evaluation platform according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be noted that, for convenience of description, only the portions related to the invention are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
The embodiments of the invention provide a far and near scene-based infrared image tracking algorithm evaluation method and evaluation platform. The test image library, built from infrared image data of far and near field scenes, provides a reference for quantitative evaluation of algorithm performance. The normalized performance evaluation method gives the performance evaluation of tracking algorithms in the far and near infrared aviation or military fields more practical engineering value. The automated software evaluation platform realizes quantitative evaluation of the performance indexes of a tracking algorithm and provides a platform for automatic and efficient algorithm evaluation. By outputting precision curves and statistical data tables, the evaluation platform makes algorithm performance results more intuitive: when the test image library is large, the statistics reflect the characteristics of an algorithm more directly, while the precision curves let an algorithm developer quickly locate where an algorithm goes wrong. Together these provide an effective and convenient tool for rapid iterative optimization during algorithm development and for comparing tracking algorithm performance.
Fig. 1 shows a flowchart of an evaluation method based on an infrared image tracking algorithm of far and near scenes according to an embodiment of the present invention. As shown in fig. 1, the evaluation method of the tracking algorithm includes:
step S1, based on the infrared image data under far and near field scenes, a test image library is established, and the real target center point position (x target ,y target ) And a true target area range S target Marking is carried out, and a configuration file is formed.
In this step, the configuration file records the save path of the image sequences in the test image library, the image sequence names and the algorithm to be evaluated. Preferably, the algorithm is recorded as an algorithm abbreviation: a fixed, pre-agreed short name that corresponds to the algorithm code.
Preferably, the infrared image data follow a normalized image data format: each infrared image sequence is stored as one image data file, which records the common attribute parameters of the image sequence and the associated attribute parameters of each image; the image data in each file are cleaned data containing valid targets. The common attribute parameters and the associated attribute parameters are stored as file header information.
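A minimal sketch of such an image data file's header is given below. The patent says only that common sequence attributes and per-image attributes are stored as header information; every concrete field name here is an assumed example, not part of the patent's format:

```python
from dataclasses import dataclass, field

@dataclass
class ImageDataFile:
    """Hypothetical header layout for one image data file: common attribute
    parameters of the sequence plus one attribute record per image. The
    patent leaves the concrete fields unspecified; these are assumed."""
    sequence_name: str          # common attribute (assumed example)
    frame_count: int            # common attribute (assumed example)
    width: int                  # common attribute (assumed example)
    height: int                 # common attribute (assumed example)
    per_frame: list = field(default_factory=list)  # one record per image
```

A file for a two-frame sequence would then carry a header such as `ImageDataFile("far_scene_01", 2, 640, 512, [...])`, with the cleaned image data following the header.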
And S2, loading a configuration file, tracking the image sequence according to an algorithm to be evaluated, and outputting a tracking result.
In this step, the configuration file formed in step S1 is first read and parsed; each image sequence in the test image library is located according to the path and image sequence names in the configuration file, together with the abbreviation of the algorithm to be evaluated. Alternatively, a particular image sequence can be designated for testing. A result save path may then be selected; if none is specified, results are saved by default to the folder containing the image sequence. The algorithm to be evaluated then tracks each test image sequence, with the initial tracking position taken from the marked position of the first image of the sequence. Finally, the tracking results are compared with the marking results, and data processing, analysis and statistics are carried out according to the three evaluation indexes. Fig. 2 shows the tracking implementation in detail.
As shown in fig. 2, in tracking, the method specifically comprises the following steps of:
step S201, loading a configuration file;
step S202, it is determined whether or not the image sequence needs to be specified. If the image sequence is required to be specified, evaluating the algorithm through the specific image sequence, and then entering step S203; otherwise, go to step S204;
step S203, a specific image sequence is designated;
step S204, acquiring the save path of the test image library;
in step S205, the i-th image sequence is sequentially analyzed in the save path. i starts from 1.
Step S206, judging whether the current frame is the first frame of the current image sequence; when the frame is the first frame, the process proceeds to step S207; otherwise, directly enter step S208;
step S207, acquiring an initial tracking position and initializing tracking;
step S208, judging whether the current frame is the last frame of the current image sequence; when it is the last frame, the process proceeds to step S209; otherwise, the next frame is taken and the process returns to step S206;
step S209, judging whether all tracking image sequences have been traversed; when traversed, go to step S3; otherwise, the process advances to step S205.
The above is the detailed tracking process in step S2. And the tracking results of all frames form the tracking result of the current image sequence to be tested.
Step S3, carrying out data analysis according to the tracking result and the real marks, finishing the evaluation of the algorithm.
In this step, the data analysis is a normalized performance evaluation process, including: tracking accuracy evaluation, tracking success rate evaluation and stability evaluation.
The tracking precision evaluation compares the tracking result with the real marking result on the basis of the marked infrared image big data test image library and normalizes the comparison result according to the far-to-near distance. Let the tracking center point coordinates be (x_tracker, y_tracker); if the tracking algorithm outputs a tracking area, it is converted to a tracking center point position. Since a tracking algorithm selects a target point to track in actual use, judging the tracking accuracy of that target point can guide practical use. The tracking precision Pre is defined as:
Pre = sqrt( ((x_tracker - x_target) / S_target,x)^2 + ((y_tracker - y_target) / S_target,y)^2 ) (1)
In formula (1), S_target,x denotes the length of the target area in the x direction and S_target,y denotes its length in the y direction. The transverse and longitudinal deviations of the tracking point are each divided by the corresponding target area length, normalizing the deviation of the target tracking point relative to the target area; as shown in fig. 3, this compensates for the evaluation deviation caused by targets occupying different numbers of pixels in far and near field scene applications.
The tracking success rate Suc is defined as:
Suc = (x_tracker, y_tracker) ∈ S_target (2)
Formula (2) indicates that a tracking point inside the target area represents successful tracking, while a tracking point outside the target area represents tracking failure. Compared with the intersection-over-union ratio of tracking areas, the success rate evaluation method in this embodiment can guide practical engineering application, and it remains applicable to tracking algorithms that cannot output a target area.
Stability of the tracking algorithm is defined as:
Sta = L_tracker / N (3)
In formula (3), L_tracker denotes the maximum number of image frames that the tracking algorithm tracks successfully in succession in the test sequence, and N denotes the total number of image frames in the current test sequence.
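The three indexes can be sketched in code as follows. The precision function is a reconstruction of equation (1) from the surrounding description (deviations divided by the respective target area lengths); treat it as an illustration under that assumption, not the patent's exact expression:

```python
import math

def precision(track_pt, target_pt, target_size):
    """Eq. (1) sketch: center-point deviation normalized by the target
    area lengths S_target,x and S_target,y (reconstructed form)."""
    dx = (track_pt[0] - target_pt[0]) / target_size[0]
    dy = (track_pt[1] - target_pt[1]) / target_size[1]
    return math.sqrt(dx * dx + dy * dy)

def success(track_pt, target_box):
    """Eq. (2): tracking succeeds iff the tracking point lies inside the
    real target area, here given as (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = target_box
    return x0 <= track_pt[0] <= x1 and y0 <= track_pt[1] <= y1

def stability(success_flags):
    """Eq. (3): L_tracker / N, the longest run of consecutively successful
    frames divided by the total frame count of the sequence."""
    best = run = 0
    for ok in success_flags:
        run = run + 1 if ok else 0   # reset the streak on any failure
        best = max(best, run)
    return best / len(success_flags)
```

For example, a sequence with per-frame success flags [True, True, False, True] has L_tracker = 2 and N = 4, giving a stability of 0.5.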
And S4, completing and outputting an algorithm evaluation result according to data analysis. As shown in fig. 4, quantitative tracking performance indexes of the tracking algorithm on the test image library are obtained through tracking precision evaluation, tracking success rate evaluation and stability evaluation.
From the above, it can be seen that the far and near scene-based infrared image tracking algorithm evaluation method of the embodiment of the invention uses normalized performance evaluation: on the basis of the marked infrared image big data test image library, the tracking results are compared with the real marking results, the comparison results are normalized according to the far-to-near distance, and tracking algorithm performance is analyzed through the three indexes of tracking precision, tracking success rate and stability. The tracking precision is normalized as the deviation of the target tracking point relative to the target area, compensating for the evaluation deviation caused by targets occupying different numbers of pixels in far and near field scene applications. For the tracking success rate, a tracking point inside the target area represents successful tracking and a point outside it represents failure; this success rate evaluation method can guide practical engineering application and remains applicable to tracking algorithms that cannot output a target area. The stability of the tracking algorithm is the maximum number of images that the algorithm tracks successfully in succession in the test sequence.
Fig. 5 shows a schematic diagram of an evaluation platform structure based on an infrared image tracking algorithm of far and near scenes according to an embodiment of the present invention. The tracking algorithm evaluation platform according to the embodiment enables the tracking algorithm evaluation method in the embodiment to be realized. As shown in fig. 5, the tracking algorithm evaluation platform includes a test image library creation module 10, a tracking algorithm implementation module 20, a tracking result analysis module 30, and an evaluation result output module 40, which are sequentially connected; wherein:
the test image library creating module 10 is configured to create a test image library based on infrared image data under far-field and near-field scenes, mark a real target center point position and a real target area range of each frame of image in an image sequence, and form a configuration file;
the tracking algorithm implementation module 20 is configured to load a configuration file, track the image sequence according to an algorithm to be evaluated, and output a tracking result.
The tracking result analysis module 30 is configured to perform data analysis according to the tracking result and the real mark. Finishing the evaluation of the algorithm;
the evaluation result output module 40 is configured to complete and output an algorithm evaluation result according to the data analysis.
The tracking algorithm implementation module 20 specifically includes: an initialization loading module 21, an image sequence specifying module 22, a path saving module 23, an image sequence analyzing module 24, an image frame judging module 25, an initial frame tracking module 26, an image tracking module 27 and an image sequence judging module 28.
As shown in fig. 6, the initialization loading module 21 is connected to the image sequence designating module 22 and the path preserving module 23 at the same time, the image sequence designating module 22 is connected to the path preserving module 23, the path preserving module 23 is connected to the image sequence analyzing module 24 downwards, the image sequence analyzing module 24 is connected to the image frame judging module 25, the image frame judging module 25 is connected to the initial frame tracking module 26 and the image tracking module 27 at the same time, the initial frame tracking module 26 is connected to the image tracking module 27, and the image tracking module 27 is connected to the image sequence judging module 28; the image sequence determination module 28 is coupled to both the image sequence analysis module 24 and the tracking result analysis module 30.
The tracking result analysis module 30 includes a tracking accuracy evaluation module 31, a tracking success rate evaluation module 32, and a stability evaluation module 33. As shown in fig. 7, the tracking accuracy evaluation module 31, the tracking success rate evaluation module 32, and the stability evaluation module 33 are all connected to the image sequence determination module 28, and can respectively obtain the tracking results of the image tracking module 27 and are also connected to the evaluation result output module 40.
The modules in this embodiment are implemented by computer hardware such as a CPU and a PLC, and the modules with a storage function have storage elements such as a RAM or a hard disk, for example, the test image library creation module 10 itself has a function of storing a test image library, and the storage of the image library is implemented by the hard disk.
The tracking algorithm evaluation platform in this embodiment corresponds to the tracking algorithm evaluation method in the foregoing embodiment, and description and limitation of the tracking method are also applicable to the tracking algorithm evaluation platform in this embodiment, and are not described herein again.
From the above, it can be seen that in the far and near scene-based infrared image tracking algorithm evaluation platform of the embodiment of the invention, the configuration file interacts with the test image library and the tracking algorithm: a tracking algorithm is embedded into the evaluation platform via its algorithm abbreviation, and the platform automatically recognizes the abbreviation and invokes the corresponding algorithm code, realizing the tracking process of the algorithm. Result data and marking data are read in and batch-processed through the configuration file, and the deviation curves and statistical data are output directly; the evaluation process requires no human participation, which improves its efficiency. This realizes automatic evaluation of tracking algorithm performance in the far and near infrared aviation or military fields, provides a unified performance evaluation platform for tracking algorithms, and facilitates both tracking algorithm selection and rapid iterative optimization during algorithm development.
The above description covers only the preferred embodiments of the present invention and the principles of the technology employed. Persons skilled in the art will appreciate that the scope of the invention is not limited to the specific combinations of the technical features described above, and also covers other technical solutions formed by any combination of those features or their equivalents without departing from the inventive concept, such as solutions in which the above features are interchanged with technical features of similar function disclosed in the present invention (but not limited thereto).

Claims (6)

1. An infrared image tracking algorithm evaluation method based on far and near scenes is characterized by comprising the following steps of:
step S1, a test image library is established based on infrared image data under far and near field scenes, the position of a real target center point and the range of a real target area of each frame of image in an image sequence are marked, and a configuration file is formed;
s2, loading a configuration file, tracking the image sequence according to an algorithm to be evaluated, and outputting a tracking result;
s3, carrying out data analysis according to the tracking result, the position of the center point of the real target and the range mark of the region of the real target; comparing the tracking result with the real marking result on the basis of the marked infrared image big data test image library, and normalizing the comparison result according to the distance from far to near;
s4, completing and outputting an algorithm evaluation result according to data analysis;
the data analysis includes: evaluating tracking precision, evaluating tracking success rate and evaluating stability;
the true target center point is (x_target, y_target), the range of the real target area is S_target, and the tracking center point coordinates are (x_tracker, y_tracker);
the tracking precision Pre is:

Pre = sqrt( ((x_tracker - x_target) / S_target,x)^2 + ((y_tracker - y_target) / S_target,y)^2 )   (1)

in formula (1), S_target,x denotes the length of the target region in the x-direction and S_target,y the length of the target region in the y-direction; the tracking precision Pre is the deviation of the tracking point from the target center, normalized by the size of the target area, and this normalized deviation compensates for the evaluation bias caused by the target occupying different numbers of pixels in far-field and near-field scene applications;
the tracking success rate Suc is defined as:

Suc = (x_tracker, y_tracker) ∈ S_target   (2)

formula (2) indicates that tracking succeeds when the tracking point lies inside the target area, and fails when the tracking point lies outside the target area;
stability of the tracking algorithm is defined as:

Sta = L_tracker / N   (3)

in formula (3), L_tracker denotes the maximum number of image frames that the tracking algorithm continuously and successfully tracks in the test sequence, and N denotes the total number of image frames in the current image sequence.
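The three metrics defined in claim 1 can be sketched as follows; this is an illustrative reading of the claim, not the patent's own implementation, and the function names, the axis-aligned rectangular target region, and the exact form of the precision formula are assumptions:

```python
import math

def tracking_precision(tracker_pt, target_pt, target_size):
    """Deviation of the tracking point from the true target center,
    normalized by the target size (S_target,x, S_target,y)."""
    dx = (tracker_pt[0] - target_pt[0]) / target_size[0]
    dy = (tracker_pt[1] - target_pt[1]) / target_size[1]
    return math.sqrt(dx * dx + dy * dy)

def tracking_success(tracker_pt, target_region):
    """Formula (2): success iff the tracking point lies inside the
    target region, given here as (x_min, y_min, x_max, y_max)."""
    x, y = tracker_pt
    x_min, y_min, x_max, y_max = target_region
    return x_min <= x <= x_max and y_min <= y <= y_max

def stability(success_flags):
    """Formula (3): longest run of consecutive successful frames
    L_tracker, divided by the total frame count N of the sequence."""
    longest = run = 0
    for ok in success_flags:
        run = run + 1 if ok else 0
        longest = max(longest, run)
    return longest / len(success_flags)
```

Note that the precision normalization by (S_target,x, S_target,y) is what makes a fixed pixel offset count for less on a large near-field target than on a small far-field one.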
2. The method for evaluating a tracking algorithm according to claim 1, wherein the configuration file records the save path of an image sequence in the test image library, the image sequence name, and the algorithm to be evaluated.
3. The tracking algorithm evaluation method according to claim 2, wherein the algorithm to be evaluated is recorded in the configuration file in the form of an algorithm abbreviation or an algorithm code.
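As one possible realization of claims 2 and 3 (the file format, field names, and values below are illustrative assumptions; the patent does not specify a format), the configuration file could be a small INI file carrying the save path, the sequence name, and the algorithm abbreviation:

```python
import configparser

# Hypothetical configuration file content: save path of the test
# image library, image sequence name, and the abbreviation of the
# algorithm to be evaluated (claim 3).
CONFIG_TEXT = """
[evaluation]
image_library_path = /data/ir_test_library
sequence_name = far_scene_seq_01
algorithm = KCF
"""

config = configparser.ConfigParser()
config.read_string(CONFIG_TEXT)
section = config["evaluation"]
print(section["algorithm"])  # abbreviation the platform maps to algorithm code
```

The platform would look up the abbreviation (here "KCF") in its registry of embedded algorithms and invoke the corresponding code.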
4. A tracking algorithm evaluation method according to any one of claims 1 to 3, wherein said step S2 comprises the steps of:
step S201, loading a configuration file;
step S202, judging whether the image sequence needs to be specified; if the image sequence is required to be specified, evaluating the algorithm through the specific image sequence, and then entering step S203; otherwise, go to step S204;
step S203, a specific image sequence is designated;
step S204, acquiring the save path of the test image library;
step S205, sequentially analyzing an ith image sequence in a storage path; i starts from 1;
step S206, judging whether the current frame is the first frame of the current image sequence; when the frame is the first frame, the process proceeds to step S207; otherwise, directly enter step S208;
step S207, acquiring an initial tracking position and initializing tracking;
step S208, judging whether the frame is the last frame of the current image sequence; when it is the last frame, the process proceeds to step S209; otherwise, i=i+1, and the process proceeds to step S206;
step S209, judging whether all tracking image sequences have been traversed; when traversed, go to step S3; otherwise, the process advances to step S205.
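The control flow of steps S201 to S209 above can be sketched as a nested loop over sequences and frames; the tracker interface (init/track methods) and the data layout are placeholder assumptions for illustration:

```python
class ConstantTracker:
    """Toy stand-in tracker (illustrative only): reports the
    initial position for every subsequent frame."""
    def init(self, frame, position):
        self.position = position
    def track(self, frame):
        return self.position

def evaluate(sequences, tracker, specific=None):
    """Steps S201-S209: optionally restrict evaluation to one
    specified sequence (S202/S203), then for each sequence
    initialize tracking on the first frame (S206/S207) and track
    the remaining frames (S208) until all sequences are
    traversed (S209)."""
    chosen = {specific: sequences[specific]} if specific else sequences
    results = {}
    for name, (init_pos, frames) in chosen.items():  # S205
        tracker.init(frames[0], init_pos)            # S206/S207
        results[name] = [tracker.track(f) for f in frames[1:]]  # S208
    return results                                   # S209, then on to S3
```

Restricting to a specific sequence mirrors the branch at step S202; passing `specific=None` evaluates the whole library.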
5. The infrared image tracking algorithm evaluation platform based on the far and near scenes is characterized by sequentially comprising a test image library creation module, a tracking algorithm implementation module, a tracking result analysis module and an evaluation result output module; wherein:
the test image library creation module is used for creating a test image library based on infrared image data under far and near field scenes, marking the position of a real target center point and the range of a real target area of each frame of image in an image sequence, and forming a configuration file;
the tracking algorithm implementation module is used for loading a configuration file, tracking the image sequence according to an algorithm to be evaluated and outputting a tracking result;
the tracking result analysis module is used for carrying out data analysis according to the tracking result, the real marking position, and the evaluation method, comparing the tracking result with the real marking result on the basis of the marked infrared image big data test image library, and normalizing the comparison result according to the distance from far to near; the tracking result analysis module comprises a tracking precision evaluation module, a tracking success rate evaluation module and a stability evaluation module; the tracking accuracy evaluation module, the tracking success rate evaluation module and the stability evaluation module are connected with the image sequence judgment module, respectively obtain tracking results of the image tracking module and are simultaneously connected with the evaluation result output module;
the evaluation result output module is used for completing and outputting an algorithm evaluation result according to data analysis;
the tracking precision evaluation module is used for executing the following steps:
the true target center point is (x_target, y_target), the range of the real target area is S_target, and the tracking center point coordinates are (x_tracker, y_tracker);
the tracking precision Pre is:

Pre = sqrt( ((x_tracker - x_target) / S_target,x)^2 + ((y_tracker - y_target) / S_target,y)^2 )   (1)

in formula (1), S_target,x denotes the length of the target region in the x-direction and S_target,y the length of the target region in the y-direction; the tracking precision Pre is the deviation of the tracking point from the target center, normalized by the size of the target area, and this normalized deviation compensates for the evaluation bias caused by the target occupying different numbers of pixels in far-field and near-field scene applications;
the tracking success rate evaluation module is used for executing the following steps:
the tracking success rate Suc is defined as:

Suc = (x_tracker, y_tracker) ∈ S_target   (2)

formula (2) indicates that tracking succeeds when the tracking point lies inside the target area, and fails when the tracking point lies outside the target area;
the stability evaluation module is used for executing the following steps:
stability of the tracking algorithm is defined as:

Sta = L_tracker / N   (3)

in formula (3), L_tracker denotes the maximum number of image frames that the tracking algorithm continuously and successfully tracks in the test sequence, and N denotes the total number of image frames in the current image sequence.
6. The far and near scene based infrared image tracking algorithm evaluation platform of claim 5, wherein the tracking algorithm implementation module comprises: an initialization loading module, an image sequence designating module, a path saving module, an image sequence analyzing module, an image frame judging module, an initial frame tracking module, an image tracking module and an image sequence judging module; wherein:
the initialization loading module is connected with the image sequence designating module and the path saving module at the same time, the image sequence designating module is connected with the path saving module, the path saving module is downwards connected with the image sequence analyzing module, the image sequence analyzing module is connected with the image frame judging module, the image frame judging module is connected with the initial frame tracking module and the image tracking module at the same time, the initial frame tracking module is connected with the image tracking module, and the image tracking module is connected with the image sequence judging module; the image sequence judging module is connected with the image sequence analyzing module and the tracking result analyzing module at the same time.
CN202010937960.6A 2020-09-09 2020-09-09 Far and near scene-based infrared image tracking algorithm evaluation method and platform Active CN112200827B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010937960.6A CN112200827B (en) 2020-09-09 2020-09-09 Far and near scene-based infrared image tracking algorithm evaluation method and platform


Publications (2)

Publication Number Publication Date
CN112200827A CN112200827A (en) 2021-01-08
CN112200827B true CN112200827B (en) 2023-06-09

Family

ID=74005463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010937960.6A Active CN112200827B (en) 2020-09-09 2020-09-09 Far and near scene-based infrared image tracking algorithm evaluation method and platform

Country Status (1)

Country Link
CN (1) CN112200827B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105787950A (en) * 2016-03-24 2016-07-20 电子科技大学 Infrared image sea-sky-line detection algorithm based on line gradient accumulation
CN109345560A (en) * 2018-09-20 2019-02-15 网易(杭州)网络有限公司 The motion tracking method for testing precision and device of augmented reality equipment
CN110443827A (en) * 2019-07-22 2019-11-12 浙江大学 A kind of UAV Video single goal long-term follow method based on the twin network of improvement
CN110796093A (en) * 2019-10-30 2020-02-14 上海眼控科技股份有限公司 Target tracking method and device, computer equipment and storage medium
CN111239766A (en) * 2019-12-27 2020-06-05 北京航天控制仪器研究所 Water surface multi-target rapid identification and tracking method based on laser radar


Non-Patent Citations (2)

Title
"A New Method for Image Feature Description Based on Region Edge Statistics"; Yu Wangsheng et al.; Chinese Journal of Computers (《计算机学报》); June 2014; pp. 1398-1410 *

Also Published As

Publication number Publication date
CN112200827A (en) 2021-01-08

Similar Documents

Publication Publication Date Title
EP2479726B9 (en) Image comparison system and image comparison method
CN109344727B (en) Identity card text information detection method and device, readable storage medium and terminal
US20200402242A1 (en) Image analysis method and apparatus, and electronic device and readable storage medium
US9158963B2 (en) Fitting contours to features
CN109325961B (en) Unmanned aerial vehicle video multi-target tracking method and device
CN108846404B (en) Image significance detection method and device based on related constraint graph sorting
CN111598049B (en) Cheating identification method and device, electronic equipment and medium
CN111444964B (en) Multi-target rapid image matching method based on adaptive ROI (region of interest) division
CN114781514A (en) Floater target detection method and system integrating attention mechanism
CN115115825B (en) Method, device, computer equipment and storage medium for detecting object in image
CN112559341A (en) Picture testing method, device, equipment and storage medium
CN111445496B (en) Underwater image recognition tracking system and method
CN112200827B (en) Far and near scene-based infrared image tracking algorithm evaluation method and platform
CN113989604A (en) Tire DOT information identification method based on end-to-end deep learning
Cai et al. Single shot multibox detector for honeybee detection
CN111126286A (en) Vehicle dynamic detection method and device, computer equipment and storage medium
CN112200217B (en) Identification algorithm evaluation method and system based on infrared image big data
CN113784026B (en) Method, apparatus, device and storage medium for calculating position information based on image
CN110345919A (en) Space junk detection method based on three-dimensional space vector and two-dimensional plane coordinate
Embarak et al. Intelligent image detection system based on internet of things and cloud computing
CN111124862B (en) Intelligent device performance testing method and device and intelligent device
CN112861652B (en) Video target tracking and segmentation method and system based on convolutional neural network
CN114155471A (en) Design drawing and object verification method, device, computer equipment and system
CN111881746B (en) Face feature point positioning method and system based on information fusion
CN114241495B (en) Data enhancement method for off-line handwritten text recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant