CN112631333A - Target tracking method and device of unmanned aerial vehicle and image processing chip - Google Patents


Info

Publication number
CN112631333A
CN112631333A (application CN202011565949.8A; granted publication CN112631333B)
Authority
CN
China
Prior art keywords
target
identification
score
list
recognition
Prior art date
Legal status
Granted
Application number
CN202011565949.8A
Other languages
Chinese (zh)
Other versions
CN112631333B (en)
Inventor
李彬
丁国斌
蔡思航
巨擘
费媛媛
雷锦成
蔡宏伟
文岐月
巫伟林
李星
Current Assignee
Southern Power Grid Digital Grid Research Institute Co Ltd
Original Assignee
Southern Power Grid Digital Grid Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Southern Power Grid Digital Grid Research Institute Co Ltd filed Critical Southern Power Grid Digital Grid Research Institute Co Ltd
Priority to CN202011565949.8A priority Critical patent/CN112631333B/en
Publication of CN112631333A publication Critical patent/CN112631333A/en
Application granted granted Critical
Publication of CN112631333B publication Critical patent/CN112631333B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of unmanned aerial vehicle tracking, and in particular to a target tracking method and device for an unmanned aerial vehicle and an image processing chip. The method comprises the following steps: identifying a target image transmitted back by the unmanned aerial vehicle; calculating a first recognition result from the target image; performing score comparison between the recognition frames in a second recognition result and all recognition frames in the first recognition result, and constructing and screening a target list with the highest conformity; calculating the relative offset direction of each recognition frame in the target list; determining that the target devices in the target image and the previous frame of target image are in the best matching relationship; and tracking a plurality of target devices in the target image. By applying the target tracking method provided by the embodiment of the invention, real-time tracking of a specific target device can be maintained while the pan-tilt camera of the unmanned aerial vehicle turns toward that device during inspection, and a photograph is taken when the device appears at the center of the picture, thereby improving the imaging quality of the inspection photos.

Description

Target tracking method and device of unmanned aerial vehicle and image processing chip
Technical Field
The invention relates to the technical field of unmanned aerial vehicle tracking, in particular to a target tracking method and device of an unmanned aerial vehicle and an image processing chip.
Background
With the continuous development of unmanned aerial vehicle (UAV) technology, the range of UAV applications has become ever wider. UAV use has expanded comprehensively into power transmission line inspection, which is transitioning from manually operated UAV inspection to fully automatic, unmanned inspection.
The inventors discovered, in the process of implementing the invention, that although automatic UAV inspection improves efficiency, limitations in UAV position control accuracy, GPS positioning accuracy, and the like mean that in the inspection pictures shot by the UAV, the photographed target device may not be at the center of the picture but at its edge, which affects subsequent analysis of the inspection pictures.
Disclosure of Invention
In view of the above technical problems, embodiments of the present invention provide a target tracking method and apparatus for an unmanned aerial vehicle, and an image processing chip, to solve the problem that a target device photographed by a conventional unmanned aerial vehicle during inspection is not at the center of the picture, which affects subsequent picture analysis.
A first aspect of an embodiment of the present invention provides a target tracking method for an unmanned aerial vehicle, including: identifying a target image transmitted back by the unmanned aerial vehicle; calculating a first recognition result for all target devices from the target image, wherein the first recognition result comprises the target device recognition frames; recording the recognition results of all target devices in one frame of target image as a second recognition result, and performing score comparison between each recognition frame in the second recognition result and all recognition frames in the first recognition result to construct matching lists that satisfy the conditions; screening the target list with the highest conformity from the matching lists according to a preset screening condition; calculating the relative offset direction of each recognition frame in the target list; if the relative offset directions of the recognition frames are consistent, determining that the target devices in the target image and the previous frame of target image are in the best matching relationship; and tracking a plurality of target devices in the target image.
Optionally, performing score comparison between each recognition frame in the second recognition result and all recognition frames in the first recognition result to construct matching lists that satisfy the conditions includes: taking one recognition frame Bi out of the second recognition result, comparing Bi in turn with the recognition frames Aj in the first recognition result, and calculating the comparison score Sij according to a preset scoring method, where i belongs to [1, m], j belongs to [1, n], m is the number of recognition frames in the second recognition result, and n is the number of recognition frames in the first recognition result; screening out the set of Aj matched with Bi according to the score Sij, and constructing the matching lists that satisfy the conditions; and repeating the above steps to construct new lists until no new list can be constructed.
Optionally, screening out the set of Aj matched with Bi according to the score Sij includes: recording each Aj with Sij > 0 together with its corresponding score, and taking those Aj as the set of Aj matched with Bi.
Optionally, the score Sij is the sum of a distance score, a ratio score, and a matching degree score.
Optionally, the distance score is calculated as follows: calculate the recognition distance between the center point of recognition frame Bi and the center point of recognition frame Aj; if the recognition distance is greater than a distance threshold, the distance score is 0, and if it is smaller than the distance threshold, the smaller the recognition distance, the higher the distance score. The ratio score is calculated as follows: calculate the ratio of the intersection region to the union region of recognition frames Bi and Aj; if the ratio is 0, the ratio score is 0, and if the ratio is greater than 0, the larger the ratio, the higher the ratio score. The matching degree score is calculated as follows: calculate the image feature values of the regions where recognition frames Bi and Aj are located and compute the feature matching result; if the matching degree is smaller than a matching degree threshold, the matching degree score is 0, and otherwise, the higher the matching degree, the higher the matching degree score.
Optionally, screening the target list with the highest conformity from the matching lists according to a preset screening condition includes: sorting the matching lists from high to low by Sij score, and taking the list with the highest score as the target list with the highest conformity.
Optionally, calculating the relative offset direction of each recognition frame in the target list specifically includes: calculating the offset direction of Bi relative to Aj for all [Bi, Aj] combinations in the target list.
Optionally, the first recognition result and the second recognition result specifically include an upper left corner coordinate and a lower right corner coordinate of the target device recognition box.
A second aspect of an embodiment of the present invention provides a target tracking apparatus for an unmanned aerial vehicle, including: an identification module for identifying a target image transmitted back by the unmanned aerial vehicle; a first calculation module for calculating from the target image a first recognition result of the target devices, the first recognition result comprising the target device recognition frames, with the recognition results of all target devices in one frame of target image recorded as a second recognition result; a comparison module for performing score comparison between each recognition frame in the second recognition result and all recognition frames in the first recognition result to construct matching lists that satisfy the conditions; a screening module for screening the target list with the highest conformity from the matching lists according to a preset screening condition; a second calculation module for calculating the relative offset direction of each recognition frame in the target list; a determining module for determining, if the relative offset directions of the recognition frames are consistent, that the target devices in the target image and the previous frame of target image are in the best matching relationship; and a tracking module for tracking a plurality of target devices in the target image.
A third aspect of an embodiment of the present invention provides an image processing chip, including: a processor and a memory communicatively coupled to the processor; the memory has stored therein computer program instructions which, when invoked by the processor, cause the processor to perform a target tracking method as described above.
The target tracking method of the unmanned aerial vehicle provided by the embodiment of the invention has the following effects:
(1) In the embodiment of the invention, after the target devices in two consecutive frames of target images are identified using a deep neural network model, the association of the target devices across the two frames is obtained from the recognition results of the two frames and the similarity of the recognition frame regions, thereby realizing the tracking of a plurality of target devices.
(2) The embodiment of the invention comprehensively considers the displacement of the recognition frames, the degree of coincidence of the recognition frames, the moving direction of the recognition frames, the similarity of the local images in the recognition frame regions, and so on across the recognition results of the two frames of target images, making target tracking more accurate.
(3) The embodiment of the invention uses the overall matching of all tracked targets in the target image as the assessment standard of the matching algorithm, so the resulting matching and tracking effects better fit the actual situation.
By applying the target tracking method provided by the embodiment of the invention, real-time tracking of a specific target device can be maintained while the pan-tilt camera of the unmanned aerial vehicle turns toward that device during inspection, and a photograph is taken when the device appears at the center of the picture, thereby improving the imaging quality of the inspection photos. The target tracking method provided by the embodiment of the invention also enables the unmanned aerial vehicle pan-tilt to track a plurality of target devices during inspection, supporting tracking and shooting of multiple target devices at a single shooting point and thereby improving inspection efficiency.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals refer to similar elements; the figures are not to scale unless otherwise specified.
Fig. 1 is an application scene diagram of the unmanned aerial vehicle inspection provided by the embodiment of the invention;
FIG. 2 is a block diagram of an image processing chip according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart diagram of a target tracking method provided by an embodiment of the present invention;
fig. 4 is a block diagram of a target tracking apparatus according to an embodiment of the present invention.
Detailed Description
In order to facilitate an understanding of the invention, the invention is described in more detail below with reference to the accompanying drawings and specific examples. It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may be present. As used in this specification, the terms "upper," "lower," "inner," "outer," "bottom," and the like are used in the orientation or positional relationship indicated in the drawings for convenience in describing the invention and simplicity in description, and do not indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are not to be considered limiting of the invention. Furthermore, the terms "first," "second," "third," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In UAV inspection, fully automatic unmanned inspection needs to track target devices in the captured picture, so the position of the target device in the picture is an important index. By applying the target tracking method provided by the embodiment of the invention, the photograph can be taken when the specific target device appears at the center of the picture, preventing the photographed target device from lying at the edge of the picture rather than its center and affecting subsequent analysis of the inspection pictures.
Fig. 1 is an application scenario of the unmanned aerial vehicle inspection provided by the embodiment of the invention. As shown in fig. 1, in this application scenario, a drone 10, a smart terminal 20, and a wireless network 30 are included.
The drone 10 may be any type of powered unmanned aerial vehicle, including but not limited to a quadcopter, a fixed-wing aircraft, or a model helicopter. It can be provided with the appropriate size and power for actual requirements, so as to offer sufficient load capacity, flight speed, and flight endurance.
The intelligent terminal 20 may be any type of intelligent device for establishing a communication connection with the drone, such as a mobile phone, a tablet computer, a PC terminal, or an intelligent remote controller. The intelligent terminal 20 may be equipped with one or more different user interaction devices for collecting user instructions or presenting and feeding back information to the user.
The wireless network 30 may be a wireless communication network for establishing a data transmission channel between two nodes based on any type of data transmission principle, such as a bluetooth network, a WiFi network, a wireless cellular network or a combination thereof located in a specific signal band.
Fig. 2 is a block diagram of an image processing chip according to an embodiment of the present invention. The image processing chip may be used to implement all or part of the functions in the target tracking method in the embodiments described below.
As shown in fig. 2, the image processing chip 100 may include: a processor 110 and a memory 120.
The processor 110 and the memory 120 are communicatively connected to each other by way of a bus 130.
The processor 110 may be of any type, with one or more processing cores. It can execute single-threaded or multi-threaded operations, parsing instructions to perform operations such as fetching data, executing logical operation functions, and issuing operation results.
The memory 120 serves as a non-volatile computer-readable storage medium, such as at least one magnetic disk storage device, flash memory device, distributed storage device remotely located from the processor 110, or other non-volatile solid-state storage device. The memory 120 may have a program storage area for storing non-volatile software programs, non-volatile computer-executable programs, and modules for calling by the processor 110 to cause the processor 110 to perform one or more method steps. The memory 120 may further have a data storage area for storing the operation processing result issued and output by the processor 110.
With reference to fig. 1, in actual use the unmanned aerial vehicle 10 may shoot images of the target device at different times during inspection, so that in some shots the target device is located at the center of the picture while in others it is located at the edge. The invention therefore provides a target tracking method to avoid the problem of the target device not being at the center of the picture when shooting.
Fig. 3 is a schematic flow chart of a target tracking method according to an embodiment of the present invention, and as shown in fig. 3, the method includes the following steps:
Step 310, identifying the target image returned by the unmanned aerial vehicle.
The unmanned aerial vehicle itself has a functional module capable of acquiring images. During inspection it can acquire image data at specific times and transmit the acquired image back to the image processing chip as the target image, so that the image processing chip can analyze the inspection environment from that target image.
Step 320, calculating first recognition results of all target devices from the target image, wherein the first recognition results comprise the target device recognition frames.
The recognition algorithm for recognizing the target device from the target image in the invention may be an image recognition algorithm based on a deep neural network model; such algorithms are common in the field and are not described again here.
The pictures in the target images returned at different times differ, and the positions of the target devices in those pictures differ as well. Here, the calculation on the target image is directed at the position of the target device recognition frame within the shot picture; specifically, the first recognition result, and a second recognition result described below, are the coordinates of the target device recognition frames in the shot picture, for example the upper-left and lower-right corner coordinates. It should be noted that one frame of target image may contain a plurality of target devices; the first recognition result here includes the recognition frame positions of all of them.
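The corner-coordinate representation of a recognition result described above can be sketched as a small data structure. This is a hypothetical illustration; the patent only states that the upper-left and lower-right corner coordinates of each recognition frame are stored, and the names and values below are assumptions:

```python
from dataclasses import dataclass

# Hypothetical representation of one recognition frame: the text only
# specifies that the upper-left and lower-right corner coordinates of
# each target device's recognition frame are recorded.
@dataclass
class RecognitionBox:
    x1: float  # upper-left corner x
    y1: float  # upper-left corner y
    x2: float  # lower-right corner x
    y2: float  # lower-right corner y

    def center(self):
        return ((self.x1 + self.x2) / 2.0, (self.y1 + self.y2) / 2.0)

# A first recognition result is then the list of boxes for all target
# devices detected in one frame (illustrative values).
first_result = [RecognitionBox(100, 80, 180, 160), RecognitionBox(300, 200, 380, 280)]
```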
Step 330, recording the recognition results of all target devices in one frame of target image as a second recognition result, and performing score comparison between each recognition frame in the second recognition result and all recognition frames in the first recognition result to construct matching lists that satisfy the conditions.
To know whether the target devices in the target image are located in the middle of the target image frame, the target image needs to be compared with the previous frame of target image. (It should be noted that, in the embodiment of the present invention, the frame returned from the drone that currently needs to be processed is referred to as the target image, and the frame returned and processed immediately before it is referred to as the previous frame of target image.) That is, the first recognition result is compared with the second recognition result of the previous frame of target image to calculate the correlation between the two frames. Specifically, score comparison may be performed between each recognition frame in the second recognition result and all recognition frames in the first recognition result, and matching lists that satisfy the conditions are constructed to represent the degree of coincidence of the recognition frames in the two frames of target images.
Further, score comparison is performed on any recognition frame in the second recognition result and all recognition frames in the first recognition result, and a matching list meeting conditions is constructed, including:
1. Take one recognition frame Bi out of the second recognition result, compare Bi in turn with the recognition frames Aj in the first recognition result, and calculate the comparison score Sij according to a preset scoring method, where i belongs to [1, m], j belongs to [1, n], m is the number of recognition frames in the second recognition result, and n is the number of recognition frames in the first recognition result.
2. Screen out the set of Aj matched with Bi according to the score Sij, and construct the matching lists that satisfy the conditions.
Specifically, a corresponding satisfying condition may be set to screen the set of Aj matched with Bi. In a specific embodiment, each Aj with Sij > 0 and its corresponding score may be recorded, and those Aj taken as the set of Aj matched with Bi.
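The screening just described might look like the following minimal sketch, where boxes are plain (x1, y1, x2, y2) tuples, the scoring method is passed in as a callable, and all names are illustrative rather than taken from the patent:

```python
def build_match_sets(b_boxes, a_boxes, score_fn):
    """For each Bi in the second recognition result, record the indices j
    of the Aj in the first recognition result with Sij > 0, together with
    the scores (the "set of Aj matched with Bi")."""
    match_sets, scores = [], {}
    for i, bi in enumerate(b_boxes):
        matched = []
        for j, aj in enumerate(a_boxes):
            s = score_fn(bi, aj)
            if s > 0:
                matched.append(j)
                scores[(i, j)] = s  # keep Sij for later list ranking
        match_sets.append(matched)
    return match_sets, scores
```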
Specifically, in the embodiment of the present invention, the score Sij is the sum of a distance score, a ratio score, and a matching degree score. These scores characterize the displacement of the recognition frames, the degree of coincidence of the recognition frames, the moving direction of the recognition frames, the similarity of the local images in the recognition frame regions, and so on across the two frames of target images, making target tracking more accurate.
The distance score is calculated as follows: calculate the recognition distance between the center point of recognition frame Bi and the center point of recognition frame Aj. If the recognition distance is greater than a distance threshold, the distance score is 0; the distance threshold is related to the size of the recognition frames, for example the threshold LMi = (RBi + RAj)/2, where RBi is the distance from the center point of recognition frame Bi to its vertex and RAj is the distance from the center point of recognition frame Aj to its vertex. If the recognition distance is smaller than the distance threshold, the smaller the recognition distance, the higher the distance score.
The ratio score is calculated as follows: calculate the ratio of the intersection region to the union region of recognition frames Bi and Aj. If the ratio is 0, the ratio score is 0; if the ratio is greater than 0, the larger the ratio, the higher the ratio score.
The matching degree score is calculated as follows: calculate the image feature values of the regions where recognition frames Bi and Aj are located and compute the feature matching result. If the matching degree is smaller than a matching degree threshold (the threshold is related to the specific feature-value matching algorithm, and those skilled in the art can determine it according to the chosen algorithm), the matching degree score is 0; otherwise, the higher the matching degree, the higher the matching degree score.
The distance threshold and the matching degree threshold are selected by those skilled in the art according to the specific application scenario and are not described again here. When calculating the feature matching degree, the image feature values of the recognition frame regions can be computed using a feature extraction and matching algorithm such as SIFT (Scale-Invariant Feature Transform). Finally, the distance score, the ratio score, and the matching degree score are added to obtain the final score Sij.
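The three score components described above can be sketched as follows. This is a minimal Python illustration assuming boxes are (x1, y1, x2, y2) tuples; the linear decay inside each threshold is an assumption (the text only says "the smaller the distance, the higher the score"), and the SIFT-based matching degree is not implemented here, so it is passed in as a plain number:

```python
import math

def center(box):
    # center point of a (x1, y1, x2, y2) box
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def half_diag(box):
    # distance from the box center to a corner vertex (RBi / RAj in the text)
    x1, y1, x2, y2 = box
    return math.hypot(x2 - x1, y2 - y1) / 2.0

def distance_score(bi, aj):
    # threshold LMi = (RBi + RAj) / 2 as in the text; linear decay inside
    # the threshold is an assumption of this sketch
    (bx, by), (ax, ay) = center(bi), center(aj)
    d = math.hypot(bx - ax, by - ay)
    lm = (half_diag(bi) + half_diag(aj)) / 2.0
    return 0.0 if d >= lm else 1.0 - d / lm

def ratio_score(bi, aj):
    # intersection-over-union of the two boxes; 0 when they are disjoint
    ix1, iy1 = max(bi[0], aj[0]), max(bi[1], aj[1])
    ix2, iy2 = min(bi[2], aj[2]), min(bi[3], aj[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_b = (bi[2] - bi[0]) * (bi[3] - bi[1])
    area_a = (aj[2] - aj[0]) * (aj[3] - aj[1])
    union = area_b + area_a - inter
    return inter / union if union > 0 else 0.0

def score_sij(bi, aj, match_score=0.0):
    # Sij = distance score + ratio score + matching degree score;
    # match_score would come from a feature matcher such as SIFT (not shown)
    return distance_score(bi, aj) + ratio_score(bi, aj) + match_score
```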
3. And repeating the steps, and continuing to construct a new list until the new list cannot be constructed.
This step constructs all possible lists of [Bi, Aj] matching combinations (each combination list characterizes one way of matching the two recognition results B and A). Each of the constructed lists is composed of a plurality of [Bi, Aj] pairs and satisfies the following 3 conditions:
(1) i belongs to [1, m ], j belongs to [1, n ], and Sij > 0;
(2) for the [Bi, Aj] combinations in one list, neither i nor j may occur repeatedly;
(3) each list should contain as many [Bi, Aj] combinations as possible while satisfying the first two conditions.
Specifically, the list of [ Bi, Aj ] matching combinations can be constructed in the following way:
(1) Take out B1, then take one recognition frame, denoted Ax1, from the set of Aj matched with B1, and put the matching combination (B1, Ax1) into the list;
(2) Take out B2, then take one recognition frame, denoted Ax2, from the set of Aj matched with B2, where Ax2 must not repeat any recognition frame of recognition result A already taken out; put the matching combination (B2, Ax2) into the list;
(3) Repeat step 2 until Bm is taken out and the matching combination (Bm, Axm) is put into the list, completing the construction of this list. If, during construction, the set of Aj matched with some Bi is empty, or all recognition frames in that set repeat recognition frames of A already taken out, then no matching combination [Bi, Aj] for that Bi is put into the list being constructed;
(4) After a list is built, repeat the above steps to build further new lists, ensuring that each new list does not repeat any list already built, until no new list can be constructed; the result is the full set of [Bi, Aj] matching lists.
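The construction steps above can be sketched as a small backtracking enumeration. This is one illustrative reading of steps 1 through 4 under the stated assumptions (match sets are given as index lists, and a Bi with no available partner simply contributes no pair); the names and exact enumeration order are not from the patent:

```python
def build_lists(match_sets):
    """match_sets[i] = candidate indices j for Bi (those with Sij > 0).
    Enumerate all matching lists: for each Bi in order, pick an unused Aj
    when one is available, otherwise skip Bi (step 3 of the construction).
    Each returned list is a list of (i, j) pairs."""
    results = []

    def recurse(i, used, current):
        if i == len(match_sets):
            results.append(list(current))
            return
        candidates = [j for j in match_sets[i] if j not in used]
        if not candidates:
            # Bi has no available partner: it contributes no pair
            recurse(i + 1, used, current)
            return
        for j in candidates:
            used.add(j)
            current.append((i, j))
            recurse(i + 1, used, current)
            current.pop()
            used.remove(j)

    recurse(0, set(), [])
    return results
```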
Step 340, screening a target list with the highest conformity from the matching lists according to preset screening conditions.
In this step, the list with the highest Sij score can be screened out of the matching lists as the target list with the highest conformity. That is, the matching lists are sorted by Sij score from high to low, and the list with the highest score is taken as the target list with the highest conformity.
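Selecting the highest-scoring list might then be as simple as the following sketch, where a list's score is assumed to be the sum of the Sij of its pairs (the patent does not spell out the aggregation):

```python
def best_list(candidate_lists, scores):
    # scores[(i, j)] holds Sij; the list whose pairs have the highest
    # total score is taken as the target list with the highest conformity
    return max(candidate_lists, key=lambda lst: sum(scores[pair] for pair in lst))
```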
Step 350, calculating the relative offset direction of each recognition box in the target list.
Here, the offset direction of Bi relative to Aj in all [Bi, Aj] combinations in the target list of step 340 is calculated as follows:
(1) calculate the center points (BiX, BiY) of Bi and (AjX, AjY) of Aj;
(2) if |BiX - AjX| < XM and |BiY - AjY| < YM, the offset is defined as small;
(3) otherwise, if BiX > AjX and BiY > AjY, it is defined as offset direction 1;
(4) otherwise, if BiX > AjX and BiY < AjY, it is defined as offset direction 2;
(5) otherwise, if BiX < AjX and BiY > AjY, it is defined as offset direction 3;
(6) otherwise, if BiX < AjX and BiY < AjY, it is defined as offset direction 4.
XM and YM are center-point thresholds whose values can be selected according to characteristics such as the size of the target image and the number of target devices in it.
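The classification above can be sketched directly. Note that axis-aligned cases, where one coordinate difference is exactly zero while the other exceeds its threshold, are not defined by the text; this sketch treats them as "small", which is an assumption:

```python
def offset_direction(b_center, a_center, xm, ym):
    """Classify the offset of Bi's center relative to Aj's center.
    xm/ym are the XM/YM thresholds below which the offset counts as
    "small"; directions 1-4 correspond to the four quadrant cases."""
    dx = b_center[0] - a_center[0]
    dy = b_center[1] - a_center[1]
    if abs(dx) < xm and abs(dy) < ym:
        return "small"
    if dx > 0 and dy > 0:
        return 1
    if dx > 0 and dy < 0:
        return 2
    if dx < 0 and dy > 0:
        return 3
    if dx < 0 and dy < 0:
        return 4
    return "small"  # axis-aligned case, undefined in the text (assumption)
```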
Step 360, if the relative offset directions of the identification frames are consistent, determining that the target equipment between the target image and the target image of the previous frame is in the best matching relationship.
If, in all [Bi, Aj] combinations of the extracted list, the offset directions of Bi relative to Aj are consistent (offsets that are small enough to be negligible may be ignored), then all [Bi, Aj] combinations in the list are considered the best matching relationship between the recognition results B of all target devices in this frame and the recognition results A of all target devices in the previous frame. If this condition is not satisfied, no best matching relationship is considered to exist; the list is removed, and the offset direction of Bi relative to Aj is recalculated for the next list.
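The consistency check of step 360 might be sketched as follows, assuming (as the text suggests) that negligible "small" offsets are ignored when comparing directions:

```python
def directions_consistent(directions):
    # negligible "small" offsets are ignored; the remaining offset
    # directions must all agree for the list to count as the best
    # matching relationship between the two frames
    significant = [d for d in directions if d != "small"]
    return len(set(significant)) <= 1
```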
Step 370, tracking a plurality of target devices in the target image.
If the best matching relationship between the recognition results B of all target devices in this frame and the recognition results A of all target devices in the previous frame is obtained in step 360, the target devices in the target image can be tracked simultaneously according to that best matching relationship between the two frames of target images. Among the plurality of target devices tracked in the picture, the condition that a specific target device appears at the center of the picture can then be detected, ensuring the imaging quality of the inspection photo.
In summary, the target tracking method for an unmanned aerial vehicle provided by the embodiments of the invention has the following effects:
(1) After the target devices in two consecutive frames of target images are identified with a deep neural network model, the association between the target devices in the two frames is obtained from the two frames' recognition results and the similarity of the identification-frame regions, so that a plurality of target devices can be tracked.
(2) The displacement of the identification frames, their degree of overlap, their direction of movement, the similarity of the local images within the identification-frame regions, and other factors in the recognition results of the two frames are considered together, making the target tracking more accurate.
(3) The overall matching of all tracked targets in the target image is used as the evaluation criterion of the matching algorithm, so the resulting matching and tracking effects better reflect the actual situation.
By applying the target tracking method provided by the embodiments of the invention, the unmanned aerial vehicle gimbal can track target devices during inspection, improving the imaging quality of inspection photos. The method also enables the gimbal to track a plurality of target devices during inspection, supporting the tracking and shooting of multiple target devices from a single shooting point and thereby improving inspection efficiency.
The present invention further provides a target tracking apparatus 400 for an unmanned aerial vehicle. Referring to fig. 4, the target tracking apparatus 400 includes: an identification module 41, a first calculation module 42, a comparison module 43, a screening module 44, a second calculation module 45, a determination module 46 and a tracking module 47.
The identification module 41 is configured to identify a target image returned by the unmanned aerial vehicle;
the first calculation module 42 is configured to calculate a first recognition result of the target devices from the target image, where the first recognition result includes the target device identification frames, and the recognition results of all target devices in one frame of target image are recorded as a second recognition result;
the comparison module 43 is configured to perform score comparison between any identification frame in the second recognition result and all identification frames in the first recognition result, so as to construct a matching list meeting the conditions;
the screening module 44 is configured to screen a target list with the highest conformity from the matching list according to a preset screening condition;
the second calculation module 45 is configured to calculate the relative offset direction of each identification frame in the target list;
the determination module 46 is configured to determine that the target devices in the target image and in the previous frame of target image are in the best matching relationship if the relative offset directions of the identification frames are consistent;
the tracking module 47 is configured to track a plurality of target devices in the target image.
The target tracking method of the above embodiments also applies to the target tracking apparatus of the present invention and is not repeated here. The target tracking apparatus of the embodiments of the invention enables the unmanned aerial vehicle gimbal to track target devices during inspection, improving the imaging quality of inspection photos; it also enables the gimbal to track a plurality of target devices during inspection, supporting the tracking and shooting of multiple target devices from a single shooting point and thereby improving inspection efficiency.
It will be further appreciated by those skilled in the art that the steps of the exemplary target tracking method described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the components and steps of the various examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation.
Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. The computer software may be stored in a computer-readable storage medium; when executed, the program may include the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory, or a random access memory.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Within the idea of the invention, technical features in the above or in different embodiments may be combined, steps may be performed in any order, and many other variations of the different aspects of the invention exist as described above, which are not provided in detail for the sake of brevity. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A target tracking method of an unmanned aerial vehicle is characterized by comprising the following steps:
identifying a target image transmitted back by the unmanned aerial vehicle;
calculating first recognition results of all target devices from the target image, wherein the first recognition results comprise the target device recognition frames;
recording the recognition results of all target devices in a frame of target image as a second recognition result, and respectively performing score comparison on any recognition frame in the second recognition result and all recognition frames in the first recognition result to construct a matching list meeting conditions;
screening a target list with the highest conformity from the matching list according to a preset screening condition;
calculating a relative offset direction of each identified box in the target list;
if the relative offset directions of the identification frames are consistent, determining that the target equipment between the target image and the target image of the previous frame is in the optimal matching relationship;
tracking a plurality of target devices in the target image.
2. The target tracking method according to claim 1, wherein performing score comparison between any identification frame in the second recognition result and all identification frames in the first recognition result to construct a matching list meeting the conditions comprises:
taking one identification frame Bi from the second recognition result, comparing Bi in turn with the identification frames Aj in the first recognition result, and calculating the comparison score Sij according to a preset scoring method;
wherein i ∈ [1, m] and j ∈ [1, n]; m is the number of identification frames in the second recognition result; n is the number of identification frames in the first recognition result;
screening out the Aj set matching Bi according to the score Sij, and constructing a matching list meeting the conditions;
and repeating the above steps to construct new lists until no new list can be constructed.
3. The target tracking method according to claim 2, wherein screening out the Aj set matching Bi according to the score Sij comprises:
recording each Aj for which Sij > 0 together with its corresponding score;
and taking the Aj with Sij > 0 as the Aj set matching Bi.
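For illustration only, the candidate-list construction described in claims 2-3 can be sketched as the loop below. The `score_fn` callback standing in for the preset scoring method, and the list-of-tuples layout, are assumptions for the sketch:

```python
def build_match_lists(boxes_b, boxes_a, score_fn):
    """For each identification frame Bi, collect the Aj set with Sij > 0.

    `boxes_b` / `boxes_a` hold the identification frames of the second and
    first recognition results; `score_fn(b, a)` returns the score Sij for
    one [Bi, Aj] comparison.
    """
    match_lists = []
    for i, b in enumerate(boxes_b):
        # Compare Bi against every Aj, then keep only positive scores.
        candidates = [(j, score_fn(b, a)) for j, a in enumerate(boxes_a)]
        candidates = [(j, s) for j, s in candidates if s > 0]
        if candidates:
            match_lists.append((i, candidates))
    return match_lists
```

The resulting lists could then be sorted by total score to find the target list with the highest conformity, as claim 6 describes.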
4. The target tracking method of claim 3, wherein the score Sij is a sum of a distance score, a ratio score, and a degree of match score.
5. The target tracking method of claim 4,
the distance score is calculated as follows:
calculating the identification distance between the center point of the identification frame Bi and the center point of the identification frame Aj, and if the identification distance is greater than a distance threshold value, the distance score is 0; if the identification distance is smaller than the distance threshold, the smaller the identification distance is, the higher the distance score is;
the ratio score is calculated as follows:
calculating the ratio of the intersection region to the union region of identification frame Bi and identification frame Aj; if the ratio is 0, the ratio score is 0, and if the ratio is greater than 0, the larger the ratio, the higher the ratio score;
the matching degree score is calculated as follows:
calculating image characteristic values of the regions where identification frame Bi and identification frame Aj are located, and calculating a feature matching result; if the matching degree is smaller than a matching degree threshold, the matching degree score is 0; otherwise, the higher the matching degree, the higher the matching degree score.
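For illustration only, one possible reading of claims 4-5 in code is sketched below. The thresholds, the normalizations, and the use of cosine similarity for the matching degree are illustrative assumptions (the claims do not fix them); boxes are (x1, y1, x2, y2) corner tuples as in claim 8:

```python
import math

def pair_score(box_b, box_a, dist_thresh=100.0, match_thresh=0.5,
               feat_b=None, feat_a=None):
    """Sij as the sum of a distance score, a ratio (IoU) score and a
    matching degree score, per claims 4-5. All constants are assumptions."""
    def centre(b):
        return ((b[0] + b[2]) / 2.0, (b[1] + b[3]) / 2.0)

    # Distance score: 0 beyond the threshold, otherwise higher when closer.
    (bx, by), (ax, ay) = centre(box_b), centre(box_a)
    d = math.hypot(bx - ax, by - ay)
    dist_score = 0.0 if d >= dist_thresh else 1.0 - d / dist_thresh

    # Ratio score: intersection over union of the two identification frames.
    ix1, iy1 = max(box_b[0], box_a[0]), max(box_b[1], box_a[1])
    ix2, iy2 = min(box_b[2], box_a[2]), min(box_b[3], box_a[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    union = area_b + area_a - inter
    iou = inter / union if union > 0 else 0.0

    # Matching degree score: cosine similarity of region feature vectors,
    # zeroed below the threshold; features are optional in this sketch.
    match_score = 0.0
    if feat_b is not None and feat_a is not None:
        num = sum(p * q for p, q in zip(feat_b, feat_a))
        den = (math.sqrt(sum(p * p for p in feat_b))
               * math.sqrt(sum(q * q for q in feat_a)))
        sim = num / den if den else 0.0
        match_score = sim if sim >= match_thresh else 0.0

    return dist_score + iou + match_score
```

A real implementation might compute the matching degree with normalized template matching or deep features of the cropped regions rather than the raw cosine similarity assumed here.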
6. The target tracking method according to claim 5, wherein screening a target list with the highest conformity from the matching list according to a preset screening condition comprises:
sorting the matching list from high to low according to Sij scores;
and taking the list with the highest score as a target list with the highest conformity.
7. The target tracking method according to claim 6, wherein calculating the relative offset direction of each identification frame in the target list comprises: calculating the offset direction of Bi relative to Aj in all [ Bi, Aj ] combinations in the target list.
8. The target tracking method according to claim 1, wherein the first recognition result and the second recognition result each include the upper-left corner coordinates and the lower-right corner coordinates of the target device identification frame.
9. A target tracking apparatus for an unmanned aerial vehicle, characterized by comprising:
the identification module is used for identifying a target image transmitted back by the unmanned aerial vehicle;
the first calculation module is used for calculating a first recognition result of the target devices from the target image, wherein the first recognition result includes the target device identification frames, and the recognition results of all the target devices in one frame of target image are recorded as a second recognition result;
the comparison module is used for respectively performing score comparison on any identification frame in the second identification result and all identification frames in the first identification result to construct a matching list meeting conditions;
the screening module is used for screening a target list with the highest conformity from the matching list according to preset screening conditions;
the second calculation module is used for calculating the relative offset direction of each identification box in the target list;
the determining module is used for determining that the target equipment between the target image and the target image of the previous frame is in the best matching relationship if the relative offset directions of the identification frames are consistent;
and the tracking module is used for tracking a plurality of target devices in the target image.
10. An image processing chip, comprising: a processor and a memory communicatively coupled to the processor;
the memory has stored therein computer program instructions which, when invoked by the processor, cause the processor to perform the object tracking method of any one of claims 1-8.
CN202011565949.8A 2020-12-25 2020-12-25 Target tracking method and device of unmanned aerial vehicle and image processing chip Active CN112631333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011565949.8A CN112631333B (en) 2020-12-25 2020-12-25 Target tracking method and device of unmanned aerial vehicle and image processing chip

Publications (2)

Publication Number Publication Date
CN112631333A true CN112631333A (en) 2021-04-09
CN112631333B CN112631333B (en) 2024-04-12

Family

ID=75325362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011565949.8A Active CN112631333B (en) 2020-12-25 2020-12-25 Target tracking method and device of unmanned aerial vehicle and image processing chip

Country Status (1)

Country Link
CN (1) CN112631333B (en)



Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105814888A (en) * 2013-11-08 2016-07-27 弗劳恩霍夫应用研究促进协会 Multi-aperture device and method for detecting object region
US9737757B1 (en) * 2016-11-16 2017-08-22 Wawgd, Inc Golf ball launch monitor target alignment method and system
CN108476288A (en) * 2017-05-24 2018-08-31 深圳市大疆创新科技有限公司 Filming control method and device
CA3006155A1 (en) * 2017-09-25 2019-03-25 The Boeing Company Positioning system for aerial non-destructive inspection
CN109556577A (en) * 2017-09-25 2019-04-02 波音公司 Positioning system for aerial nondestructive inspection
CN109671103A (en) * 2018-12-12 2019-04-23 易视腾科技股份有限公司 Method for tracking target and device
US20200265608A1 (en) * 2019-02-15 2020-08-20 Keyence Corporation Image Processing Apparatus
CN110472496A (en) * 2019-07-08 2019-11-19 长安大学 A kind of traffic video intelligent analysis method based on object detecting and tracking
CN111539986A (en) * 2020-03-25 2020-08-14 西安天和防务技术股份有限公司 Target tracking method and device, computer equipment and storage medium
CN111462180A (en) * 2020-03-30 2020-07-28 西安电子科技大学 Object tracking method based on AND-OR graph AOG
CN111739056A (en) * 2020-06-23 2020-10-02 杭州海康威视数字技术股份有限公司 Trajectory tracking system
CN111882579A (en) * 2020-07-03 2020-11-03 湖南爱米家智能科技有限公司 Large infusion foreign matter detection method, system, medium and equipment based on deep learning and target tracking
CN111899285A (en) * 2020-07-08 2020-11-06 浙江大华技术股份有限公司 Method and device for determining tracking track of target object and storage medium
CN111986227A (en) * 2020-08-26 2020-11-24 杭州海康威视数字技术股份有限公司 Trajectory generation method and apparatus, computer device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YUANPENG LIU, et al.: "Regression-Based Three-Dimensional Pose Estimation for Texture-Less Objects", IEEE TRANSACTIONS ON MULTIMEDIA, vol. 21, no. 11, XP011752249, DOI: 10.1109/TMM.2019.2913321 *
ZHAO HEDONG, et al.: "Convolutional Neural Network Target Tracking Combining Directional Perturbation and HOG Features", Journal of Computer-Aided Design & Computer Graphics, vol. 31, no. 10 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139986A (en) * 2021-04-30 2021-07-20 东风越野车有限公司 Integrated environment perception and multi-target tracking system
CN115063452A (en) * 2022-06-13 2022-09-16 中国船舶重工集团公司第七0七研究所九江分部 Cloud deck camera tracking method for offshore target
CN115063452B (en) * 2022-06-13 2024-03-26 中国船舶重工集团公司第七0七研究所九江分部 Cloud deck camera tracking method for offshore targets


Similar Documents

Publication Publication Date Title
CN108037770B (en) Unmanned aerial vehicle power transmission line inspection system and method based on artificial intelligence
CN106529538A (en) Method and device for positioning aircraft
CN105955308A (en) Aircraft control method and device
CN111765974B (en) Wild animal observation system and method based on miniature refrigeration thermal infrared imager
CN109357679B (en) Indoor positioning method based on significance characteristic recognition
CN111382808A (en) Vehicle detection processing method and device
CN112631333A (en) Target tracking method and device of unmanned aerial vehicle and image processing chip
CN109635797A (en) Coil of strip sequence precise positioning method based on multichip carrier identification technology
CN114035606A (en) Pole tower inspection system, pole tower inspection method, control device and storage medium
CN112802027A (en) Target object analysis method, storage medium and electronic device
CN110782484A (en) Unmanned aerial vehicle video personnel identification and tracking method
CN114020039A (en) Automatic focusing system and method for unmanned aerial vehicle inspection tower
CN114037895A (en) Unmanned aerial vehicle pole tower inspection image identification method
CN112037255A (en) Target tracking method and device
CN112633114A (en) Unmanned aerial vehicle inspection intelligent early warning method and device for building change event
CN112329616A (en) Target detection method, device, equipment and storage medium
CN112585945A (en) Focusing method, device and equipment
CN116503807A (en) Equipment inspection method, device, electronic equipment and computer program product
CN108416880B (en) Video-based identification method
CN115830342A (en) Method and device for determining detection frame, storage medium and electronic device
CN113021355B (en) Agricultural robot operation method for predicting sheltered crop picking point
CN111328099B (en) Mobile network signal testing method, device, storage medium and signal testing system
CN113469135A (en) Method and device for determining object identity information, storage medium and electronic device
CN113469130A (en) Shielded target detection method and device, storage medium and electronic device
CN111079617A (en) Poultry identification method and device, readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant