CN112631333B - Target tracking method and device of unmanned aerial vehicle and image processing chip - Google Patents


Publication number
CN112631333B
Authority
CN
China
Prior art keywords
list, target, identification, matching, score
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011565949.8A
Other languages
Chinese (zh)
Other versions
CN112631333A (en)
Inventor
李彬
丁国斌
蔡思航
巨擘
费媛媛
雷锦成
蔡宏伟
文岐月
巫伟林
李星
Current Assignee
Southern Power Grid Digital Grid Research Institute Co Ltd
Original Assignee
Southern Power Grid Digital Grid Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Southern Power Grid Digital Grid Research Institute Co Ltd
Priority to CN202011565949.8A
Publication of CN112631333A
Application granted
Publication of CN112631333B
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of unmanned aerial vehicle tracking, and in particular to a target tracking method and device for an unmanned aerial vehicle, and an image processing chip. The method comprises the following steps: identifying a target image returned by the unmanned aerial vehicle; calculating a first recognition result from the target image; comparing the score of each recognition frame in the second recognition result against all recognition frames in the first recognition result, and constructing and screening the target list with the highest coincidence degree; calculating the relative offset direction of each recognition frame in the target list; determining that the target devices in the target image and in the previous frame of target image form the best matching relationship; and tracking a plurality of target devices in the target image. By applying the target tracking method provided by the embodiments of the invention, the unmanned aerial vehicle gimbal camera keeps real-time tracking of a specific target device while turning toward it during inspection, and shooting is performed again once the device appears at the center of the picture, thereby improving the imaging quality of inspection photos.

Description

Target tracking method and device of unmanned aerial vehicle and image processing chip
Technical Field
The invention relates to the technical field of unmanned aerial vehicle tracking, in particular to a target tracking method and device of an unmanned aerial vehicle and an image processing chip.
Background
With the continuous development of unmanned aerial vehicle technology, the range of unmanned aerial vehicle applications keeps widening. Their use in power-system transmission-line inspection has been expanding comprehensively, shifting from manually operated unmanned aerial vehicle inspection to fully automatic unmanned inspection.
The inventors found, in the course of implementing the present invention, that while automatic unmanned aerial vehicle inspection improves inspection efficiency, problems such as unmanned aerial vehicle position-control accuracy and GPS positioning accuracy mean that in the inspection photos the photographed target device is sometimes not at the center of the picture but at its edge, which affects subsequent analysis of the inspection photos.
Disclosure of Invention
In view of the above technical problems, the embodiments of the present invention provide a target tracking method and device for an unmanned aerial vehicle, and an image processing chip, so as to solve one or more of the problems whereby a target device photographed by a conventional unmanned aerial vehicle during inspection is not at the center of the picture, affecting subsequent photo analysis.
A first aspect of the embodiments of the present invention provides a target tracking method for an unmanned aerial vehicle, including: identifying a target image returned by the unmanned aerial vehicle; calculating a first recognition result of all target devices from the target image, wherein the first recognition result comprises target device recognition frames; recording the recognition results of all target devices in the previous frame of target image as a second recognition result, comparing the score of each recognition frame in the second recognition result against all recognition frames in the first recognition result, and constructing matching lists that meet the conditions; screening the target list with the highest coincidence degree from the matching lists according to preset screening conditions; calculating the relative offset direction of each recognition frame in the target list; if the relative offset directions of the recognition frames are consistent, determining that the target devices in the target image and in the previous frame of target image form the best matching relationship; and tracking a plurality of target devices in the target image.
Optionally, comparing the score of any recognition frame in the second recognition result against all recognition frames in the first recognition result and constructing matching lists that meet the conditions includes: taking out one recognition frame from the second recognition result, denoting it Bi, comparing Bi in turn with each recognition frame Aj in the first recognition result, and calculating the comparison score Sij according to a preset scoring method, where i ∈ [1, m] and j ∈ [1, n], m being the number of recognition frames in the second recognition result and n the number of recognition frames in the first recognition result; screening the Aj set matched with Bi according to the score Sij and constructing the matching lists that meet the conditions; and repeating the above steps, continuing to construct new lists until no new list can be constructed.
Optionally, screening the Aj set matched with Bi according to the score Sij includes: recording each Aj for which Sij > 0 together with its score, and taking the set of such Aj as the Aj set matched with Bi.
Optionally, the score Sij is the sum of a distance score, a ratio score, and a matching-degree score.
Optionally, the distance score is calculated as follows: calculate the recognition distance between the center point of recognition frame Bi and the center point of recognition frame Aj; if the recognition distance is greater than a distance threshold, the distance score is 0; if the recognition distance is smaller than the distance threshold, the smaller the recognition distance, the higher the distance score. The ratio score is calculated as follows: calculate the ratio of the intersection area to the union area of recognition frames Bi and Aj; if the ratio is 0, the ratio score is 0; if the ratio is greater than 0, the larger the ratio, the higher the ratio score. The matching-degree score is calculated as follows: calculate the image feature values of the areas where recognition frames Bi and Aj are located and compute the feature matching result; if the matching degree is smaller than a matching-degree threshold, the matching-degree score is 0; otherwise, the higher the matching degree, the higher the matching-degree score.
Optionally, screening the target list with the highest coincidence degree from the matching lists according to the preset screening condition includes: sorting the matching lists by Sij score from high to low, and taking the list with the highest score as the target list with the highest coincidence degree.
Optionally, calculating the relative offset direction of each recognition frame in the target list specifically comprises: calculating the offset direction of Bi relative to Aj for all [Bi, Aj] combinations in the target list.
Optionally, the first recognition result and the second recognition result specifically include the upper-left corner coordinates and lower-right corner coordinates of the target device recognition frames.
A second aspect of the embodiments of the present invention provides a target tracking device for an unmanned aerial vehicle, including: a recognition module for recognizing the target image returned by the unmanned aerial vehicle; a first computing module for calculating a first recognition result of the target devices from the target image, the first recognition result comprising target device recognition frames, and for recording the recognition results of all target devices in the previous frame of target image as a second recognition result; a comparison module for comparing the score of each recognition frame in the second recognition result against all recognition frames in the first recognition result to construct matching lists that meet the conditions; a screening module for screening the target list with the highest coincidence degree from the matching lists according to preset screening conditions; a second calculation module for calculating the relative offset direction of each recognition frame in the target list; a determining module for determining that the target devices in the target image and in the previous frame of target image form the best matching relationship if the relative offset directions of the recognition frames are consistent; and a tracking module for tracking a plurality of target devices in the target image.
A third aspect of an embodiment of the present invention provides an image processing chip, including: a processor and a memory communicatively coupled to the processor; the memory has stored therein computer program instructions that, when invoked by the processor, cause the processor to perform the target tracking method as described above.
The target tracking method of the unmanned aerial vehicle provided by the embodiment of the invention has the following effects:
(1) According to the embodiments of the invention, after the target devices in two consecutive frames of target images are recognized using a deep neural network model, the association of the target devices across the two frames is obtained from the recognition results of the two frames and the similarity of the recognition-frame areas, thereby realizing tracking of a plurality of target devices.
(2) The embodiments of the invention comprehensively consider the displacement of the recognition frames, their degree of coincidence, their moving direction, and the similarity of the local images in the recognition-frame areas across the two frames of target images, making target tracking more accurate.
(3) The embodiments of the invention use the overall matching of all tracked targets in the target image as the evaluation standard of the matching algorithm, so the resulting matching and tracking effects better fit the actual situation.
By applying the target tracking method provided by the embodiments of the invention, during inspection the unmanned aerial vehicle gimbal camera keeps real-time tracking of a specific target device while turning toward it, and shooting is performed again when that device appears at the center of the picture, improving the imaging quality of inspection photos. The target tracking method also enables the gimbal to track a plurality of target devices during inspection, supporting the tracking and shooting of multiple target devices at a single shooting point and thereby improving inspection efficiency.
Drawings
One or more embodiments are illustrated by way of example, and not limitation, in the figures of the accompanying drawings, in which like reference numerals indicate similar elements. The figures are not drawn to scale unless otherwise indicated.
Fig. 1 is an application scenario diagram of unmanned aerial vehicle inspection provided by an embodiment of the present invention;
fig. 2 is a block diagram of an image processing chip according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a target tracking method according to an embodiment of the present invention;
fig. 4 is a block diagram of a target tracking apparatus according to an embodiment of the present invention.
Detailed Description
In order that the invention may be readily understood, a more particular description thereof will be rendered by reference to specific embodiments that are illustrated in the appended drawings. It will be understood that when an element is referred to as being "fixed" to another element, it can be directly on the other element or one or more intervening elements may be present therebetween. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or one or more intervening elements may be present therebetween. The terms "upper," "lower," "inner," "outer," "bottom," and the like as used in this specification are used in an orientation or positional relationship based on that shown in the drawings, merely to facilitate the description of the invention and to simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the invention. Furthermore, the terms "first," "second," "third," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The term "and/or" as used in this specification includes any and all combinations of one or more of the associated listed items.
In unmanned aerial vehicle inspection, fully automatic unmanned inspection needs to track the target device in the captured picture, so the position of the target device within the picture is a very important index. By applying the target tracking method provided by the embodiments of the present invention, shooting can be performed again when the specific target device appears at the center of the picture, preventing the photographed target device from sitting at the edge of the picture rather than the center and affecting subsequent analysis of the inspection photo.
Fig. 1 shows an application scenario of unmanned aerial vehicle inspection provided by an embodiment of the invention. As shown in fig. 1, this application scenario includes the unmanned aerial vehicle 10, the intelligent terminal 20, and the wireless network 30.
The unmanned aerial vehicle 10 may be any type of powered unmanned aerial vehicle, including but not limited to a four-rotor unmanned aerial vehicle, a fixed-wing aircraft, a helicopter model, and the like. It may have a size and power appropriate to the actual situation, so as to provide load-carrying capacity, flight speed, endurance mileage, and the like that meet the use requirements.
The intelligent terminal 20 may be any type of intelligent device for establishing a communication connection with a drone, such as a mobile phone, tablet computer, PC-side or intelligent remote control, etc. The intelligent terminal 20 may be equipped with one or more different user interaction means for gathering user instructions or presenting and feeding back information to the user.
The wireless network 30 may be a wireless communication network based on any type of data transmission principle, located in a specific signal band and used to establish a data transmission channel between two nodes, such as a Bluetooth network, a WiFi network, a wireless cellular network, or a combination thereof.
Fig. 2 is a block diagram of an image processing chip according to an embodiment of the present invention. The image processing chip may be used to realize all or part of the functions of the target tracking method in the embodiments described below.
As shown in fig. 2, the image processing chip 100 may include: a processor 110 and a memory 120.
The communication connection between the processor 110 and the memory 120 is established by way of a bus 130.
Processor 110 may be any type of processor having one or more processing cores. It may perform single-threaded or multi-threaded operations for parsing instructions to perform operations such as fetching data, performing logical operation functions, and delivering operational processing results.
The memory 120 is a non-volatile computer-readable storage medium, such as at least one magnetic disk storage device, a flash memory device, a distributed storage device remotely located relative to the processor 110, or another non-volatile solid-state storage device. The memory 120 may have a program storage area for storing the non-volatile software programs, non-volatile computer-executable programs, and modules used by the processor 110 to perform one or more method steps. The memory 120 may also have a data storage area for storing the operation results produced by the processor 110.
With continued reference to fig. 1, in actual use the unmanned aerial vehicle 10 captures images at different times for the target device during inspection, so in some image frames the target device is at the center of the image while in others it is at the edge. On this basis, the present invention provides a target tracking method to avoid shooting while the target device is not at the center of the picture.
Fig. 3 is a flow chart of a target tracking method according to an embodiment of the present invention, as shown in fig. 3, where the method includes the following steps:
step 310, identifying a target image returned by the unmanned aerial vehicle.
The unmanned aerial vehicle has a functional module that can collect images. During inspection it can collect image data at specific times and return each collected image to the image processing chip as the target image, so that the image processing chip can analyze the inspected environment from it.
Step 320, calculating a first recognition result of all target devices from the target image, wherein the first recognition result comprises the target device recognition frame.
The recognition algorithm used in the present invention to recognize the target device from the target image may be an image recognition algorithm based on a deep neural network model, which is common in the art and is not described here.
The target images returned at different times have different photographed pictures, and the positions of the target devices within those pictures differ. The first recognition result, and the second recognition result described below, are the coordinates of the target device recognition frames in the photographed picture, for example the upper-left and lower-right corner coordinates. It should be noted that one frame of target image may contain a plurality of target devices; the first recognition result includes the recognition-frame positions of all of them.
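As a concrete illustration of this representation, the sketch below models a recognition frame by its upper-left and lower-right corners, as the text specifies. The `Box` class and its helper methods are hypothetical names introduced here for illustration only; the patent does not prescribe a data structure.

```python
from dataclasses import dataclass

# Hypothetical sketch of one recognition frame. The patent only specifies
# that a recognition result stores upper-left and lower-right coordinates;
# the class name and helpers are illustrative assumptions.
@dataclass
class Box:
    x1: float  # upper-left x
    y1: float  # upper-left y
    x2: float  # lower-right x
    y2: float  # lower-right y

    def center(self):
        return ((self.x1 + self.x2) / 2, (self.y1 + self.y2) / 2)

    def area(self):
        return max(0.0, self.x2 - self.x1) * max(0.0, self.y2 - self.y1)

# A recognition result for one frame is then a list of boxes, one per device:
first_result = [Box(10, 10, 50, 50), Box(100, 40, 160, 90)]
print(first_result[0].center())  # (30.0, 30.0)
```

The center and area helpers are the only geometry the later matching steps need.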
Step 330, recording the recognition results of all target devices in the previous frame of target image as the second recognition result, and comparing the score of each recognition frame in the second recognition result against all recognition frames in the first recognition result to construct matching lists that meet the conditions.
To know whether the target device in the target image is at the middle of the frame, the target image needs to be compared with the previous frame of target image. (In the embodiments of the invention, the current frame to be processed that is returned by the unmanned aerial vehicle is referred to as the target image, and the frame returned immediately before it as the previous frame of target image.) This step compares the first recognition result with the second recognition result from the previous frame of target image, so as to calculate the correlation between the target image and the previous frame. Specifically, each recognition frame in the second recognition result can be score-matched against all recognition frames in the first recognition result, and matching lists that meet the conditions are constructed to characterize the coincidence degree of the recognition frames across the two frames of target images.
Further, comparing the score of each recognition frame in the second recognition result against all recognition frames in the first recognition result and constructing matching lists that meet the conditions includes the following steps:
1. Take out one recognition frame from the second recognition result, denote it Bi, compare Bi in turn with each recognition frame Aj in the first recognition result, and calculate the comparison score Sij according to a preset scoring method, where i ∈ [1, m] and j ∈ [1, n], m being the number of recognition frames in the second recognition result and n the number of recognition frames in the first recognition result.
2. Screen the Aj set matched with Bi according to the score Sij, and construct the matching lists that meet the conditions.
Specifically, a corresponding qualifying condition may be set to screen the Aj set matched with Bi. In one embodiment, each Aj for which Sij > 0 is recorded together with its score, and the set of such Aj is taken as the Aj set matched with Bi.
Specifically, in the embodiments of the invention, the score Sij is the sum of a distance score, a ratio score, and a matching-degree score; together these characterize the displacement of the recognition frames, their degree of coincidence, their moving direction, and the similarity of the local images in the recognition-frame areas across the two frames of target images, making target tracking more accurate.
The distance score is calculated as follows: calculate the recognition distance between the center point of recognition frame Bi and the center point of recognition frame Aj. If the recognition distance is greater than a distance threshold, the distance score is 0. The distance threshold is related to the size of the recognition frames; for example, a threshold LM = (RBi + RAj)/2 may be set, where RBi is the distance from the center point of recognition frame Bi to its vertex and RAj is the distance from the center point of recognition frame Aj to its vertex. If the recognition distance is smaller than the distance threshold, the smaller the recognition distance, the higher the distance score.
The ratio score is calculated as follows: calculate the ratio of the intersection area to the union area of recognition frames Bi and Aj; if the ratio is 0, the ratio score is 0, and if the ratio is greater than 0, the larger the ratio, the higher the ratio score.
The matching-degree score is calculated as follows: calculate the image feature values of the areas where recognition frames Bi and Aj are located and compute the feature matching result. If the matching degree is smaller than a matching-degree threshold, the matching-degree score is 0 (the threshold depends on the specific feature-value matching algorithm, and a person skilled in the art can determine it according to the algorithm selected); otherwise, the higher the matching degree, the higher the matching-degree score.
The distance threshold and the matching-degree threshold are selected by a person skilled in the art according to the specific application scenario and are not described here. When calculating the feature matching degree, feature extraction and matching algorithms such as SIFT can be used to compute the image feature values of the area where each recognition frame is located. Finally, the distance score, the ratio score, and the matching-degree score are added to obtain the final score Sij.
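The three score components above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the linear fall-off of the distance score, the use of plain intersection-over-union for the ratio score, and the `match_degree` input standing in for a SIFT-style feature-matching result are all assumptions. Boxes are `(x1, y1, x2, y2)` corner tuples.

```python
import math

def center(b):
    return ((b[0] + b[2]) / 2, (b[1] + b[3]) / 2)

def area(b):
    return max(0.0, b[2] - b[0]) * max(0.0, b[3] - b[1])

def distance_score(bi, aj):
    # Recognition distance between the two center points.
    (bx, by), (ax, ay) = center(bi), center(aj)
    d = math.hypot(bx - ax, by - ay)
    # Distance threshold tied to box size, as in the text: LM = (RB + RA) / 2,
    # where RB and RA are the center-to-vertex distances of the two boxes.
    rb = math.hypot(bi[2] - bi[0], bi[3] - bi[1]) / 2
    ra = math.hypot(aj[2] - aj[0], aj[3] - aj[1]) / 2
    lm = (rb + ra) / 2
    # Score 0 beyond the threshold; otherwise smaller distance -> higher score.
    return 0.0 if d >= lm else 1.0 - d / lm

def ratio_score(bi, aj):
    # Ratio of intersection area to union area; 0 when the boxes are disjoint.
    ix = max(0.0, min(bi[2], aj[2]) - max(bi[0], aj[0]))
    iy = max(0.0, min(bi[3], aj[3]) - max(bi[1], aj[1]))
    inter = ix * iy
    return inter / (area(bi) + area(aj) - inter) if inter > 0 else 0.0

def score(bi, aj, match_degree, match_threshold=0.5):
    # match_degree in [0, 1] stands in for the feature-matching result;
    # below the threshold it contributes 0, as the text requires.
    m = match_degree if match_degree >= match_threshold else 0.0
    return distance_score(bi, aj) + ratio_score(bi, aj) + m

b = (0.0, 0.0, 10.0, 10.0)
print(score(b, b, 1.0))  # 3.0: identical boxes score 1 + 1 + 1
```

Each term is zero exactly in the cases the text singles out (distance beyond threshold, disjoint boxes, matching degree below threshold) and grows monotonically otherwise.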
3. Repeat the above steps, continuing to construct new lists until no new list can be constructed.
This step constructs lists of all possible [Bi, Aj] matching combinations (each list characterizes one matching relationship between recognition results B and A). Each constructed list is composed of a number of [Bi, Aj] pairs and satisfies the following 3 conditions:
(1) i belongs to [1, m], j belongs to [1, n], and Sij > 0;
(2) within a single list, no i and no j may repeat across its [Bi, Aj] combinations;
(3) subject to the first two conditions, each list should contain as many [Bi, Aj] combinations as possible.
Specifically, the lists of [Bi, Aj] matching combinations can be constructed as follows:
(1) Take out B1, then take one recognition frame from the Aj set matched with B1, denote it Ax1, and put the matched combination (B1, Ax1) into the list;
(2) Take out B2, then take one recognition frame from the Aj set matched with B2 and denote it Ax2, where Ax2 must not repeat any recognition frame of recognition result A already taken out; put the matched combination (B2, Ax2) into the list;
(3) Repeat step (2) until Bm is taken out and the matching combination (Bm, Axm) is placed into the list, at which point the list is complete. If, when some Bi is taken out during construction, its matched Aj set is empty, or every recognition frame in that set repeats a recognition frame of recognition result A already taken out, then no matched combination [Bi, Aj] for that Bi is put into the list being constructed;
(4) After one list is constructed, repeat the above steps to construct new lists, ensuring that each new list does not repeat an already constructed one, until no new list can be constructed; this yields all lists built from the Aj sets matched with each Bi.
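The enumeration described in steps (1) through (4) can be sketched as a small backtracking search. This is one illustrative reading of the procedure, not the patent's code: `candidates[i]` holds the indices of the Aj set matched with Bi (those with Sij > 0), and each produced list pairs every placeable Bi with one unused Aj.

```python
def build_lists(candidates):
    """Enumerate candidate match lists of (i, j) pairs with no repeated i or j.

    candidates[i] is the list of Aj indices matched with Bi (Sij > 0).
    A Bi whose matched set is empty or fully used is skipped, as in step (3).
    """
    lists = []

    def extend(i, used_a, current):
        if i == len(candidates):
            if current and list(current) not in lists:
                lists.append(list(current))
            return
        options = [a for a in candidates[i] if a not in used_a]
        if not options:
            extend(i + 1, used_a, current)  # Bi cannot be placed; skip it
            return
        for a in options:
            current.append((i, a))
            extend(i + 1, used_a | {a}, current)
            current.pop()

    extend(0, set(), [])
    return lists

# Example: B0 can match A0 or A1, B1 only A1.
print(build_lists([[0, 1], [1]]))  # [[(0, 0), (1, 1)], [(0, 1)]]
```

In the example, choosing A1 for B0 leaves B1 with no unused option, so B1 is skipped in the second list, matching the skip rule of step (3).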
Step 340, screening the target list with the highest coincidence degree from the matching lists according to a preset screening condition.
In this step, the list with the highest Sij score can be selected from the matching lists as the target list with the highest coincidence degree. That is, the matching lists are sorted by Sij score from high to low, and the list with the highest score is taken out as the target list with the highest coincidence degree.
Step 350, calculating a relative offset direction of each identification frame in the target list.
Here, the offset direction of Bi relative to Aj in all [Bi, Aj] combinations in the target list of step 340 is calculated as follows:
(1) Calculate the center point (BiX, BiY) of Bi and the center point (AjX, AjY) of Aj;
(2) If |BiX - AjX| < XM and |BiY - AjY| < YM, the offset is defined as small;
(3) Otherwise, if BiX > AjX and BiY > AjY, this is defined as offset direction 1;
(4) Otherwise, if BiX > AjX and BiY < AjY, this is defined as offset direction 2;
(5) Otherwise, if BiX < AjX and BiY > AjY, this is defined as offset direction 3;
(6) Otherwise, if BiX < AjX and BiY < AjY, this is defined as offset direction 4.
XM and YM are center-point thresholds whose values can be chosen according to characteristics such as the size of the target image and the number of target devices in it.
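Steps (1) through (6) can be sketched as follows. This reading assumes the small-offset test of step (2) compares absolute coordinate differences against XM and YM; the default threshold values and the use of code 0 for a negligible or purely axis-aligned offset are illustrative choices, not specified by the text.

```python
def offset_direction(bi_center, aj_center, xm=5.0, ym=5.0):
    """Classify the offset of Bi's center relative to Aj's center.

    Returns 0 when the offset is small, otherwise direction codes 1-4
    as defined in steps (3) through (6).
    """
    dx = bi_center[0] - aj_center[0]
    dy = bi_center[1] - aj_center[1]
    if abs(dx) < xm and abs(dy) < ym:
        return 0  # offset is small
    if dx > 0 and dy > 0:
        return 1
    if dx > 0 and dy < 0:
        return 2
    if dx < 0 and dy > 0:
        return 3
    if dx < 0 and dy < 0:
        return 4
    return 0  # purely axis-aligned shift: treated as negligible (assumption)

print(offset_direction((30, 30), (20, 20)))  # 1
```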
Step 360, if the relative offset directions of the recognition frames are consistent, determining that the target devices in the target image and in the previous frame of target image form the best matching relationship.
If the offset directions of Bi relative to Aj are consistent for all [Bi, Aj] combinations in the list (combinations whose offset is small enough to be negligible are ignored), the list is considered the best matching relationship between recognition result B of all target devices in this frame and recognition result A of all target devices in the previous frame. If this condition is not satisfied, the list is not considered the best match; it is removed, and the offset calculation is repeated on the next candidate list.
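The consistency test can be sketched as below, under the assumption that each pair in the candidate list has already been given a direction code (for example by an offset classifier like the one in step 350, with 0 meaning a negligible offset that is ignored):

```python
def is_best_match(directions):
    """A candidate list is the best match when every pair that actually
    moved shares a single offset direction; negligible offsets (code 0)
    are ignored, as the text allows."""
    moved = [d for d in directions if d != 0]
    return len(set(moved)) <= 1

print(is_best_match([1, 1, 0, 1]))  # True: consistent, small offsets ignored
print(is_best_match([1, 2]))        # False: inconsistent directions
```

Lists failing this test would be discarded and the next candidate list examined, as described above.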
Step 370, tracking a plurality of target devices in the target image.
If the best matching relationship between recognition result B of all target devices in this frame and recognition result A of all target devices in the previous frame is obtained in step 360, the plurality of target devices in the target image can be tracked simultaneously according to that best matching relationship between the target devices in the two frames. Tracking the target devices in the picture makes it possible to wait until a specific target device appears at the center of the picture, ensuring the imaging quality of the inspection photo.
In summary, the target tracking method of the unmanned aerial vehicle provided by the embodiments of the invention has the following effects:
(1) After the target devices in two consecutive frames of target images are recognized by a deep neural network model, the association between the target devices in the two frames is obtained from the two recognition results and the similarity of the identification frame regions, so that multiple target devices can be tracked.
(2) The displacement of the identification frames, their degree of coincidence, their direction of movement, and the similarity of the local images in the identification frame regions across the two frames are considered together, making the target tracking more accurate.
(3) The overall matching of all tracked targets in the target image is used as the evaluation criterion of the matching algorithm, so the resulting matching and tracking effects better reflect the actual situation.
By applying the target tracking method provided by the embodiments of the invention, the unmanned aerial vehicle gimbal can track target devices during inspection, improving the imaging quality of inspection photos. The method also enables the gimbal to track multiple target devices during inspection, supporting the tracking and shooting of multiple target devices at a single shooting point and thereby improving inspection efficiency.
The present invention also provides a target tracking apparatus 400 for an unmanned aerial vehicle. Referring to fig. 4, the target tracking apparatus 400 includes: an identification module 41, a first calculation module 42, a comparison module 43, a screening module 44, a second calculation module 45, a determination module 46 and a tracking module 47.
The identification module 41 is configured to identify a target image returned by the unmanned aerial vehicle;
the first calculation module 42 is configured to calculate a first recognition result of the target devices from the target image, where the first recognition result includes target device identification frames, and to record the recognition results of all target devices in a frame of target image as second recognition results;
the comparison module 43 is configured to compare, by score, any identification frame in the second recognition result with all identification frames in the first recognition result, so as to construct a matching list that meets the conditions;
the screening module 44 is configured to screen the target list with the highest degree of coincidence from the matching list according to preset screening conditions;
the second calculation module 45 is configured to calculate the relative offset direction of each identification frame in the target list;
the determination module 46 is configured to determine, if the relative offset directions of the identification frames are consistent, that the matching between the target devices in the target image and those in the previous frame's target image is the best matching relationship;
the tracking module 47 is configured to track a plurality of target devices in the target image.
The target tracking method in the above embodiments also applies to the target tracking apparatus of the present invention and will not be repeated here. With the target tracking apparatus, the unmanned aerial vehicle gimbal can track target devices during inspection, improving the imaging quality of inspection photos; the apparatus also enables the gimbal to track multiple target devices during inspection, supporting the tracking and shooting of multiple target devices at a single shooting point and thereby improving inspection efficiency.
Those skilled in the art should further appreciate that the steps of the exemplary target tracking methods described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the exemplary components and steps have been described above in general terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the solution.
Those skilled in the art may implement the described functionality in different ways for each particular application, but such implementations are not intended to be limiting. The computer software may be stored in a computer-readable storage medium; when executed, the program may include the flows of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random access memory, or the like.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. The technical features of the above embodiments, or of different embodiments, may be combined within the idea of the invention, the steps may be implemented in any order, and many other variations of the different aspects of the invention exist as described above; for brevity, they are not provided in detail. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (6)

1. A target tracking method of an unmanned aerial vehicle, characterized by comprising the following steps:
identifying a target image returned by the unmanned aerial vehicle;
calculating first recognition results of all target devices from the target image, wherein the first recognition results comprise target device identification frames;
recording the recognition results of all target devices in a frame of target image as second recognition results, comparing, by score, any identification frame in the second recognition results with all identification frames in the first recognition results, and constructing a matching list that meets the conditions;
screening the target list with the highest degree of coincidence from the matching list according to preset screening conditions;
calculating the relative offset direction of each identification frame in the target list;
if the relative offset directions of the identification frames are consistent, determining that the matching between the target devices in the target image and those in the previous frame's target image is the best matching relationship;
tracking a plurality of target devices in the target image;
the step of comparing, by score, any identification frame in the second recognition result with all identification frames in the first recognition result and constructing a matching list that meets the conditions comprises the following steps:
taking out one identification frame from the second recognition result, marking it as Bi, comparing Bi in turn with each identification frame Aj in the first recognition result, and calculating a comparison score Sij according to a preset scoring method;
wherein i belongs to [1, m] and j belongs to [1, n]; m is the number of identification frames in the first recognition result; n is the number of identification frames in the second recognition result;
screening the Aj set matched with Bi according to the score Sij, and constructing a matching list that meets the conditions:
constructing lists of all possible [Bi, Aj] matching combinations, wherein each combination list characterizes one matching relation between the two recognition results B and A, and each constructed list consists of a plurality of [Bi, Aj] pairs and satisfies the following three conditions:
(1) i belongs to [1, m], j belongs to [1, n], and Sij > 0;
(2) within each list, neither i nor j is repeated across the [Bi, Aj] combinations;
(3) each list contains as many [Bi, Aj] combinations as possible while the first two conditions are met;
the lists of [Bi, Aj] matching combinations are constructed as follows:
1) taking out B1, then taking one identification frame from the Aj set matched with B1, marking it as Ax1, and putting the matching combination (B1, Ax1) into the list;
2) taking out B2, then taking one identification frame from the Aj set matched with B2 and marking it as Ax2, wherein Ax2 must not repeat any identification frame of recognition result A already taken out before, and putting the matching combination (B2, Ax2) into the list;
3) repeating step 2) until Bm is taken out and the matching combination (Bm, Axm) is put into the list, completing the construction of the list; wherein, if during construction the Aj set matched with a taken-out Bi is an empty set, or all identification frames in that set repeat identification frames of recognition result A already taken out before, no matching combination [Bi, Aj] for that Bi is put into the list being constructed;
4) after one list is completed, repeating the above steps to construct a new list, ensuring that the new list does not repeat any list already constructed, until no new list can be constructed, thereby obtaining all candidate matching lists;
and the screening of the Aj set matched with Bi according to the score Sij comprises:
recording each Aj for which Sij > 0, together with the corresponding score;
taking the Aj for which Sij > 0 as the Aj set matched with Bi;
the screening of the target list with the highest degree of coincidence from the matching lists according to the preset screening conditions comprises:
sorting the matching lists by Sij score from high to low;
taking out the list with the highest score as the target list with the highest degree of coincidence;
the calculating of the relative offset direction of each identification frame in the target list is specifically: calculating the offset direction of Bi relative to Aj for all [Bi, Aj] combinations in the target list;
the tracking of the plurality of target devices in the target image comprises: simultaneously tracking the plurality of target devices in the target image according to the best matching relationship of the target devices between the target image and the previous frame image.
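The list-construction and screening steps of claim 1 can be sketched as follows. This is an illustrative reading, not the claimed implementation: the claim leaves open whether lists are ranked by a single Sij or by their total, so the sketch ranks each list by its total score, and it enforces condition (3) by keeping only the lists of maximal length:

```python
def build_lists(scores):
    """scores: dict {(i, j): Sij} holding only the Sij > 0 entries,
    with i indexing recognition result B and j indexing result A.
    Enumerates candidate matching lists as in steps 1)-4): each Bi in
    turn is paired with one unused Aj from its matched set, or skipped
    when that set is empty or exhausted. Returns the maximal lists."""
    b_ids = sorted({i for i, _ in scores})
    lists = []

    def extend(k, used_a, current):
        if k == len(b_ids):
            lists.append(list(current))
            return
        i = b_ids[k]
        candidates = [j for (bi, j) in scores if bi == i and j not in used_a]
        if not candidates:
            # Aj set empty or all taken: Bi stays unmatched (step 3)
            extend(k + 1, used_a, current)
            return
        for j in candidates:
            current.append((i, j))
            extend(k + 1, used_a | {j}, current)
            current.pop()

    extend(0, frozenset(), [])
    # condition (3): keep only lists with as many pairs as possible
    best_len = max(len(lst) for lst in lists)
    return [lst for lst in lists if len(lst) == best_len]


def pick_target_list(scores):
    """Screening step: rank the candidate lists by total Sij, highest
    first, and return the top one as the target list."""
    lists = build_lists(scores)
    return max(lists, key=lambda lst: sum(scores[p] for p in lst))
```

For two boxes on each side with scores S11 = 0.9, S12 = 0.5, S21 = 0.4, S22 = 0.8, the two candidate lists are [(B1, A1), (B2, A2)] and [(B1, A2), (B2, A1)], and the first wins with a total of 1.7.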
2. The target tracking method according to claim 1, wherein the score Sij is a sum of a distance score, a ratio score, and a matching degree score.
3. The target tracking method according to claim 2, wherein
the distance score is calculated as follows:
calculating the identification distance between the center point of identification frame Bi and the center point of identification frame Aj; if the identification distance is greater than a distance threshold, the distance score is 0; if the identification distance is smaller than the distance threshold, the smaller the identification distance, the higher the distance score;
the ratio score is calculated as follows:
calculating the ratio of the intersection area to the union area of identification frame Bi and identification frame Aj; if the ratio is 0, the ratio score is 0; if the ratio is greater than 0, the larger the ratio, the higher the ratio score;
the matching degree score is calculated as follows:
calculating image feature values of the regions where identification frame Bi and identification frame Aj are located and computing a feature matching result; if the matching degree is smaller than a matching degree threshold, the matching degree score is 0; otherwise, the higher the matching degree, the higher the matching degree score.
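The three score terms of claims 2 and 3 can be sketched as follows. The patent fixes only each term's thresholds and monotonic behavior, so the linear forms, the default threshold values, and the normalized `match_degree` input (standing in for a local-image feature-matching result) are all assumptions:

```python
def iou(box_b, box_a):
    """Ratio of intersection area to union area for two boxes given as
    (x1, y1, x2, y2) upper-left / lower-right corners (claim 4)."""
    ix1, iy1 = max(box_b[0], box_a[0]), max(box_b[1], box_a[1])
    ix2, iy2 = min(box_b[2], box_a[2]), min(box_b[3], box_a[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    union = area_b + area_a - inter
    return inter / union if union else 0.0


def score(box_b, box_a, match_degree,
          dist_threshold=100.0, match_threshold=0.5):
    """Sij = distance score + ratio score + matching degree score.
    match_degree is assumed to be normalized to [0, 1]."""
    # distance score: 0 beyond the threshold, higher as centers get closer
    bcx, bcy = (box_b[0] + box_b[2]) / 2, (box_b[1] + box_b[3]) / 2
    acx, acy = (box_a[0] + box_a[2]) / 2, (box_a[1] + box_a[3]) / 2
    dist = ((bcx - acx) ** 2 + (bcy - acy) ** 2) ** 0.5
    dist_score = max(0.0, 1.0 - dist / dist_threshold)

    # ratio score: 0 when the boxes do not overlap, higher with larger IoU
    ratio_score = iou(box_b, box_a)

    # matching degree score: 0 below the threshold, higher above it
    match_score = match_degree if match_degree >= match_threshold else 0.0

    return dist_score + ratio_score + match_score
```

With this normalization each term lies in [0, 1], so Sij lies in [0, 3] and identical boxes with a perfect feature match score 3.0.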
4. The target tracking method according to claim 1, characterized in that the first recognition result and the second recognition result each comprise the upper-left corner coordinates and the lower-right corner coordinates of the target device identification frames.
5. A target tracking apparatus of an unmanned aerial vehicle, characterized by comprising:
an identification module, configured to identify a target image returned by the unmanned aerial vehicle;
a first calculation module, configured to calculate a first recognition result of the target devices from the target image, wherein the first recognition result comprises target device identification frames, and to record the recognition results of all target devices in a frame of target image as second recognition results;
a comparison module, configured to compare, by score, any identification frame in the second recognition result with all identification frames in the first recognition result, so as to construct a matching list that meets the conditions;
a screening module, configured to screen the target list with the highest degree of coincidence from the matching list according to preset screening conditions;
a second calculation module, configured to calculate the relative offset direction of each identification frame in the target list;
a determination module, configured to determine, if the relative offset directions of the identification frames are consistent, that the matching between the target devices in the target image and those in the previous frame's target image is the best matching relationship;
a tracking module, configured to track a plurality of target devices in the target image;
wherein the comparing, by score, of any identification frame in the second recognition result with all identification frames in the first recognition result and the constructing of a matching list that meets the conditions comprise:
taking out one identification frame from the second recognition result, marking it as Bi, comparing Bi in turn with each identification frame Aj in the first recognition result, and calculating a comparison score Sij according to a preset scoring method;
wherein i belongs to [1, m] and j belongs to [1, n]; m is the number of identification frames in the first recognition result; n is the number of identification frames in the second recognition result;
screening the Aj set matched with Bi according to the score Sij, and constructing a matching list that meets the conditions:
constructing lists of all possible [Bi, Aj] matching combinations, wherein each combination list characterizes one matching relation between the two recognition results B and A, and each constructed list consists of a plurality of [Bi, Aj] pairs and satisfies the following three conditions:
(1) i belongs to [1, m], j belongs to [1, n], and Sij > 0;
(2) within each list, neither i nor j is repeated across the [Bi, Aj] combinations;
(3) each list contains as many [Bi, Aj] combinations as possible while the first two conditions are met;
the lists of [Bi, Aj] matching combinations are constructed as follows:
1) taking out B1, then taking one identification frame from the Aj set matched with B1, marking it as Ax1, and putting the matching combination (B1, Ax1) into the list;
2) taking out B2, then taking one identification frame from the Aj set matched with B2 and marking it as Ax2, wherein Ax2 must not repeat any identification frame of recognition result A already taken out before, and putting the matching combination (B2, Ax2) into the list;
3) repeating step 2) until Bm is taken out and the matching combination (Bm, Axm) is put into the list, completing the construction of the list; wherein, if during construction the Aj set matched with a taken-out Bi is an empty set, or all identification frames in that set repeat identification frames of recognition result A already taken out before, no matching combination [Bi, Aj] for that Bi is put into the list being constructed;
4) after one list is completed, repeating the above steps to construct a new list, ensuring that the new list does not repeat any list already constructed, until no new list can be constructed, thereby obtaining all candidate matching lists;
and the screening of the Aj set matched with Bi according to the score Sij comprises:
recording each Aj for which Sij > 0, together with the corresponding score;
taking the Aj for which Sij > 0 as the Aj set matched with Bi;
the screening of the target list with the highest degree of coincidence from the matching lists according to the preset screening conditions comprises:
sorting the matching lists by Sij score from high to low;
taking out the list with the highest score as the target list with the highest degree of coincidence;
the calculating of the relative offset direction of each identification frame in the target list is specifically: calculating the offset direction of Bi relative to Aj for all [Bi, Aj] combinations in the target list;
the tracking of the plurality of target devices in the target image comprises: simultaneously tracking the plurality of target devices in the target image according to the best matching relationship of the target devices between the target image and the previous frame image.
6. An image processing chip, comprising: a processor and a memory communicatively coupled to the processor;
the memory stores computer program instructions which, when invoked by the processor, cause the processor to perform the target tracking method of any one of claims 1 to 4.
CN202011565949.8A 2020-12-25 2020-12-25 Target tracking method and device of unmanned aerial vehicle and image processing chip Active CN112631333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011565949.8A CN112631333B (en) 2020-12-25 2020-12-25 Target tracking method and device of unmanned aerial vehicle and image processing chip


Publications (2)

Publication Number Publication Date
CN112631333A CN112631333A (en) 2021-04-09
CN112631333B true CN112631333B (en) 2024-04-12


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139986A (en) * 2021-04-30 2021-07-20 东风越野车有限公司 Integrated environment perception and multi-target tracking system
CN115063452B (en) * 2022-06-13 2024-03-26 中国船舶重工集团公司第七0七研究所九江分部 Cloud deck camera tracking method for offshore targets

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105814888A (en) * 2013-11-08 2016-07-27 弗劳恩霍夫应用研究促进协会 Multi-aperture device and method for detecting object region
US9737757B1 (en) * 2016-11-16 2017-08-22 Wawgd, Inc Golf ball launch monitor target alignment method and system
CN108476288A (en) * 2017-05-24 2018-08-31 深圳市大疆创新科技有限公司 Filming control method and device
CA3006155A1 (en) * 2017-09-25 2019-03-25 The Boeing Company Positioning system for aerial non-destructive inspection
CN109671103A (en) * 2018-12-12 2019-04-23 易视腾科技股份有限公司 Method for tracking target and device
CN110472496A (en) * 2019-07-08 2019-11-19 长安大学 A kind of traffic video intelligent analysis method based on object detecting and tracking
CN111462180A (en) * 2020-03-30 2020-07-28 西安电子科技大学 Object tracking method based on AND-OR graph AOG
CN111539986A (en) * 2020-03-25 2020-08-14 西安天和防务技术股份有限公司 Target tracking method and device, computer equipment and storage medium
CN111739056A (en) * 2020-06-23 2020-10-02 杭州海康威视数字技术股份有限公司 Trajectory tracking system
CN111882579A (en) * 2020-07-03 2020-11-03 湖南爱米家智能科技有限公司 Large infusion foreign matter detection method, system, medium and equipment based on deep learning and target tracking
CN111899285A (en) * 2020-07-08 2020-11-06 浙江大华技术股份有限公司 Method and device for determining tracking track of target object and storage medium
CN111986227A (en) * 2020-08-26 2020-11-24 杭州海康威视数字技术股份有限公司 Trajectory generation method and apparatus, computer device and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7164461B2 (en) * 2019-02-15 2022-11-01 株式会社キーエンス Image processing device


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Regression-Based Three-Dimensional Pose Estimation for Texture-Less Objects; Yuanpeng Liu, et al.; IEEE Transactions on Multimedia; Vol. 21, No. 11; full text *
Convolutional Neural Network Target Tracking Combining Directional Perturbation and HOG Features; Zhao Hedong, et al.; Journal of Computer-Aided Design & Computer Graphics; Vol. 31, No. 10; full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant