CN112233141A - Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene - Google Patents

Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene

Info

Publication number
CN112233141A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
target
tracking
moving target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011038186.1A
Other languages
Chinese (zh)
Other versions
CN112233141B (en)
Inventor
冯雪
徐晓华
杜猛俊
钱锦
孙剑
徐汉麟
徐李冰
樊笑利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Original Assignee
Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd filed Critical Hangzhou Power Supply Co of State Grid Zhejiang Electric Power Co Ltd
Priority to CN202011038186.1A priority Critical patent/CN112233141B/en
Publication of CN112233141A publication Critical patent/CN112233141A/en
Application granted granted Critical
Publication of CN112233141B publication Critical patent/CN112233141B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure provides a moving target tracking method and system based on unmanned aerial vehicle vision in an electric power scene, comprising the following steps: positioning and tracking the moving target in real time to obtain the position of the moving target in the current frame; and controlling the flight of the unmanned aerial vehicle based on the obtained position, so that the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the movement of the target and the moving target is always kept at the center of the unmanned aerial vehicle's field of view. The accuracy and real-time performance of moving target tracking in the power system are thereby improved.

Description

Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene
Technical Field
The disclosure belongs to the technical field of moving target tracking, and particularly relates to a moving target tracking method and system based on unmanned aerial vehicle vision in an electric power scene.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
With the rapid development of China's economy, the safety of the power system, the lifeline of the national economy, has become an increasingly prominent concern. Outdoor power transmission lines are threatened by moving objects such as birds, balloons and plastic floaters, as well as by maintenance, climbing and other operations performed by workers, so timely tracking of moving targets around transmission lines has become an important link in ensuring the safe operation of the power system.
In electric power scenes, tracking must be both accurate and real-time. Traditional moving object tracking mainly relies on manual inspection, which is labor-intensive, slow and inefficient, and cannot meet the development needs of the industry. In recent years, with the rapid development of artificial intelligence technology, moving object tracking for power systems has received increasing attention from researchers. Although existing methods have achieved some success, most of them are based on surveillance video recorded by fixed cameras. Such a setup cannot track a moving target flexibly, and the target is lost once it leaves the camera's field of view.
Disclosure of Invention
To overcome the shortcomings of the prior art, the present disclosure provides a moving target tracking method based on unmanned aerial vehicle vision in an electric power scene, in which the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the movement of the target, thereby improving the performance of vision-based moving target tracking by unmanned aerial vehicles in electric power scenes.
In order to achieve the above object, one or more embodiments of the present disclosure provide the following technical solutions:
in a first aspect, a method for tracking a moving target based on unmanned aerial vehicle vision in an electric power scene is disclosed, which comprises:
carrying out real-time positioning and tracking on the moving target to obtain the position of the moving target of the current frame;
based on the obtained current frame moving target position, the unmanned aerial vehicle is controlled to fly, so that the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the target moving condition, and the moving target is ensured to be always kept at the center of the visual field of the unmanned aerial vehicle.
In a further technical solution, the position of the moving target in the current frame is obtained by combining a target tracking algorithm with a target detection algorithm.
In a further technical solution, during target tracking, the real-time video stream captured by the unmanned aerial vehicle is modeled, and kernel correlation filtering based on a Gaussian kernel function is adopted for target tracking, so as to obtain the coordinate position of the moving target to be tracked in each frame.
In a further technical solution, a target detection algorithm is used to recapture a moving target that may have been lost, which specifically includes:
if the similarity of the color histograms of the target areas of two adjacent frames predicted by target tracking is smaller than a given threshold, or no moving target is detected in the current frame, reselecting the target area for subsequent target tracking.
In a further technical solution, a heuristic flight strategy is adopted when controlling the flight of the unmanned aerial vehicle: the flying speed of the unmanned aerial vehicle is controlled based on the distance by which the position of the moving target in the current frame is offset from that in the previous frame, so that the speed of the unmanned aerial vehicle stays consistent with the speed of the moving object.
In a further technical solution, if the difference between the x-axis coordinates of the target in the two frames is greater than zero, the unmanned aerial vehicle moves at the computed speed along the positive half of the x-axis; otherwise, it moves along the negative half of the x-axis.
In a further technical solution, a data-driven flight strategy is adopted when controlling the flight of the unmanned aerial vehicle: the per-frame displacement of the unmanned aerial vehicle is predicted with a gated recurrent neural network from a pre-extracted sequence of displacements recorded while the unmanned aerial vehicle tracked the target.
The method specifically comprises the following steps: performing tracking on each video and, during tracking, randomly moving a simulation window that simulates the unmanned aerial vehicle on each frame of the video until the moving target can no longer be tracked, and, after multiple random movements, finally selecting the displacement sequences corresponding to the longest such sequences as ground-truth labels;
for the i-th sequence, obtaining the position information of the moving target in the j-th frame;
and predicting the flight displacement of the unmanned aerial vehicle in the current frame with the gated recurrent neural network, based on the acquired position of the moving target in the j-th frame, the corresponding position of the unmanned aerial vehicle's simulation window, and the displacement of the target center point between adjacent frames.
In a second aspect, a moving target tracking system based on unmanned aerial vehicle vision in an electric power scene is disclosed, comprising:
a location acquisition module configured to: carrying out real-time positioning and tracking on the moving target to obtain the position of the moving target of the current frame;
an adaptively adjusting module configured to: based on the obtained current frame moving target position, the unmanned aerial vehicle is controlled to fly, so that the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the target moving condition, and the moving target is ensured to be always kept at the center of the visual field of the unmanned aerial vehicle.
In a further technical solution, the adaptive adjustment module comprises a heuristic unmanned aerial vehicle flight control module and a data-driven unmanned aerial vehicle flight control module, the heuristic unmanned aerial vehicle flight control module being configured to:
control the flying speed of the unmanned aerial vehicle based on the distance by which the position of the moving target in the current frame is offset from that in the previous frame, so that the speed of the unmanned aerial vehicle stays consistent with the speed of the moving object;
if the difference between the x-axis coordinates of the target in the two frames is greater than zero, move the unmanned aerial vehicle at the computed speed along the positive half of the x-axis, and otherwise along the negative half of the x-axis;
and the data-driven unmanned aerial vehicle flight control module being configured to: perform tracking on each video and, during tracking, randomly move a simulation window that simulates the unmanned aerial vehicle on each frame of the video until the moving target can no longer be tracked, and, after multiple random movements, finally select the displacement sequences corresponding to the longest such sequences as ground-truth labels;
for the i-th sequence, obtain the position information of the moving target in the j-th frame;
and predict the flight displacement of the unmanned aerial vehicle in the current frame with the gated recurrent neural network, based on the acquired position of the moving target in the j-th frame, the corresponding position of the unmanned aerial vehicle's simulation window, and the displacement of the target center point between adjacent frames.
The above one or more technical solutions have the following beneficial effects:
the technical scheme of the method includes the steps that firstly, the hidden danger target is positioned and tracked in a real-time low-power-consumption mode on the basis of a target detection algorithm YOLOv3 and a target tracking algorithm KCF. Secondly, the invention provides two flight control methods: heuristic flight control strategies and data-driven flight control strategies. The heuristic flight strategy aims at adaptively adjusting the flight speed and direction of the unmanned aerial vehicle aiming at moving objects with different speeds so as to enable a hidden danger target to be always positioned in a picture center; the data-driven flight strategy mainly predicts the displacement of each frame of unmanned aerial vehicle based on a gated cyclic neural network according to a pre-extracted unmanned aerial vehicle tracking target displacement sequence, and improves the accuracy and the real-time performance of tracking a moving target in a power system.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and are not to limit the disclosure.
Fig. 1 is a block diagram of the overall system flow of an embodiment of the present disclosure.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
Existing methods that combine target detection with target tracking usually assume that the moving target moves slowly, i.e. its displacement between consecutive frames is small. Therefore, how to effectively track objects moving at various speeds and reduce tracking loss while still guaranteeing real-time performance is the primary challenge of this work.
Keeping the moving target at the center of the unmanned aerial vehicle's field of view requires effective control of the unmanned aerial vehicle's flying speed and direction, which is another important problem addressed by the embodiments of the present disclosure.
Example one
The embodiment discloses a moving target tracking method based on unmanned aerial vehicle vision in an electric power scene, which comprises the following steps:
S1: locating and tracking the potential-hazard target in real time with low power consumption by combining a target tracking algorithm with a target detection algorithm, to obtain the position of the moving target in the current frame.
S2: introducing a heuristic unmanned aerial vehicle flight control module based on the current-frame moving target position acquired in S1.
S3: introducing a data-driven unmanned aerial vehicle flight control module based on the current-frame moving target position acquired in S1.
The construction of the tracking method in step S1 further includes:
S11: The invention models the real-time video stream captured by the unmanned aerial vehicle as $\{I_j\}_{j=1}^{N}$ together with the initial position $t_0=(x_0,y_0,w_0,h_0)$ of the moving target to be tracked, where $I_j$ denotes the $j$-th frame image in the video stream and $N$ is the number of frames in the real-time video stream.
To ensure the tracking effect, the invention selects a Kernel Correlation Filter (KCF) based on a Gaussian kernel function to track the target, with the kernel correlation expressed as:
$$k^{xx'}=\exp\left(-\frac{1}{\sigma^{2}}\left(\|x\|^{2}+\|x'\|^{2}-2\,\mathcal{F}^{-1}\left(\hat{x}^{*}\odot\hat{x}'\right)\right)\right)$$
where $x$ and $x'$ denote two arbitrary samples, $\sigma$ is the standard deviation of the Gaussian kernel, $\mathcal{F}^{-1}$ is the inverse Fourier transform, $\odot$ is the element-wise product, $\hat{x}'$ is the Fourier transform of $x'$, and $\hat{x}^{*}$ is the complex conjugate of the Fourier transform of $x$. According to the KCF algorithm, the invention obtains the coordinate position of the moving target to be tracked in each frame as $t_j=(x_j,y_j,w_j,h_j)$, where $(x_j,y_j)$ are the coordinates of the upper-left corner of the target box in frame $j$, and $w_j$ and $h_j$ denote the width and height of the target box, respectively.
S12: in order to reduce the possibility of loss of the tracking target, the present invention proposes to introduce a target detection algorithm (i.e. YOLOv3) to recapture the moving target which may be lost, so as to improve the model effect. Specifically, if the target area t of two adjacent frames is predicted based on the KCF algorithmjAnd t(j-1)The similarity of the color histograms is less than a certain threshold value gamma1Or no moving object is detected in the current frame, the object detection algorithm YOLOv3 is triggered to reselect the object region to perform subsequent object tracking. Let H(j-1)And HjRespectively representing the target area t of the j-1 th and j-th frames(j-1)And tjThe color histogram vector of (1). Current frame IjAnd the previous frame I(j-1)The histogram similarity S of the target region of (1) is calculated as follows:
Figure BDA0002705789900000063
wherein the content of the first and second substances,
Figure BDA0002705789900000064
and
Figure BDA0002705789900000065
respectively, represent histogram means. And alpha is the number of intervals of the histogram.
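As a sketch of this re-detection trigger, the snippet below builds an OpenCV hue histogram for the tracked box and evaluates the correlation-style similarity above; the hue-only histogram, the bin count, and the function names are illustrative assumptions, not choices specified by the patent.

```python
import cv2
import numpy as np

def color_histogram(frame, box, bins=32):
    """Normalized hue histogram of the target region box = (x, y, w, h)."""
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w]
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [bins], [0, 180]).flatten()
    return hist / (hist.sum() + 1e-8)

def histogram_similarity(h_prev, h_cur):
    """Correlation-style similarity S(H_{j-1}, H_j) between two histogram vectors."""
    a = h_prev - h_prev.mean()
    b = h_cur - h_cur.mean()
    return float((a * b).sum() / (np.sqrt((a ** 2).sum() * (b ** 2).sum()) + 1e-8))

def need_redetection(h_prev, h_cur, gamma1=0.5, target_found=True):
    """Trigger YOLOv3 re-detection when the similarity drops below gamma1
    or when no moving target was found in the current frame."""
    return (not target_found) or histogram_similarity(h_prev, h_cur) < gamma1
```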
S13: through the detection and judgment of S12, if the target in the tracking process is lost, the invention adopts a YOLOv3 network structure
Figure BDA0002705789900000066
The target detection is carried out on the moving target so as to make up for the problem of target loss in the tracking process. The algorithm of YOLOv3 can be described as:
Figure BDA0002705789900000067
wherein the content of the first and second substances,
Figure BDA0002705789900000068
is a YOLOv3 network
Figure BDA0002705789900000069
Is determined by the parameters of (a) and (b),
Figure BDA00027057899000000610
indicating the coordinates of the upper left corner of the target frame in the j-th frame based on the moving target detected by YOLOv3,
Figure BDA00027057899000000611
and
Figure BDA00027057899000000612
the width and height of the target box are indicated, respectively.
Figure BDA00027057899000000613
The coordinate position of the moving object in the j frame image detected based on YOLOv3 is represented as an algorithm output. The present invention predicts the target area of the current frame according to Yolov3
Figure BDA00027057899000000614
Target area t of previous frame predicted by KCF tracking algorithm(j-1)Of (2) similarity threshold gamma2Determining the target area detected by the current Yolov3
Figure BDA00027057899000000615
Whether it should be selected as a new target for subsequent KCF tracking. Here, the present invention considers not only the similarity based on the histogram, but also the spatial similarity of the current frame and the previous frame target frame, i.e. the intersection ratio, and sets the threshold value to δ, and the calculation formula is as follows:
Figure BDA0002705789900000071
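A short sketch of the intersection-over-union check used to decide whether the re-detected box replaces the tracked one; the (x, y, w, h) box layout, the helper names, and the example thresholds are assumptions of this sketch.

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix1, iy1 = max(ax, bx), max(ay, by)                     # intersection corners
    ix2, iy2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def accept_redetection(det_box, prev_box, appearance_sim, gamma2=0.6, delta=0.3):
    """Accept the YOLOv3 box as the new KCF target only if it matches the previous
    tracked box in appearance (histogram similarity >= gamma2) and in position
    (IoU >= delta)."""
    return appearance_sim >= gamma2 and iou(det_box, prev_box) >= delta
```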
the process of constructing the heuristic unmanned aerial vehicle flight control module in the step S2 further includes:
s21: obtained based on S1The position of the moving target of the current frame and the speed of the unmanned aerial vehicle are kept consistent with the speed of the moving object, so that the problems of swinging caused by too high speed of the unmanned aerial vehicle, target loss caused by too low speed and the like are prevented. For this purpose, the invention calculates the unmanned aerial vehicle speed v according to the following formulajAs follows:
vj=ψ(d(c(j-1),cj))
ψ(d)=wd+b
Figure BDA0002705789900000072
wherein w and b are weights and bias values in the linear function,
Figure BDA0002705789900000073
and
Figure BDA0002705789900000074
respectively representing two frames of images I(j-1)And IjAnd coordinates of the center point of the middle target frame. d (c)(j-1),cj) Refers to the distance between two center points, i.e. the distance that the position of the moving object in the current frame is shifted from the previous frame. The greater this distance, the faster the object moves, and the faster the speed configuration of the drone should be. In order to ensure that the unmanned aerial vehicle and the moving target are kept in the same visual field and facilitate simulation experiments, the invention adopts a linear function psi (d) to scale the distance between the central points of the target frames of the front frame and the rear frame.
S22: considering that most unmanned aerial vehicles cannot move in multiple directions simultaneously in a real scene, the invention designs the following heuristic flight strategy: if it is not
Figure BDA0002705789900000075
Make unmanned aerial vehicle along positive semi-axis direction of its x axle with vjThe speed of (2) is moved; on the contrary, v is the negative semiaxis direction of the x-axisjIs moved at the speed of (1). If one-time unidirectional movement is adopted, the moving target cannot be kept on the x axis in the visual field of the unmanned aerial vehicleIn (3), continue to calculate vj+1And the next unidirectional movement is performed. The strategy is also applicable in the y-axis direction in the field of view of the unmanned aerial vehicle.
The construction of the data-driven unmanned aerial vehicle flight control module in step S3 further includes:
S31: The heuristic flight control module of S2 is strongly influenced by human experience and requires multiple movements of the unmanned aerial vehicle. The invention therefore further adds data-driven flight control. Specifically, the invention first performs tracking on each video. During tracking, a simulation window that simulates the unmanned aerial vehicle is moved randomly on each frame of the video until the moving target can no longer be tracked. After multiple random movements, the displacement sequences $D_{i}=\left\{d^{u}_{i1},d^{u}_{i2},\ldots,d^{u}_{iQ_{i}}\right\}$, $i=1,\ldots,M$, corresponding to the $M$ longest sequences are finally selected as ground-truth labels to train the flight control, where $d^{u}_{ij}$ denotes the displacement of the drone from frame $j$ to frame $j+1$ of the sequence and $Q_{i}$ represents the length of the $i$-th sequence. For the $i$-th sequence, the tracking method above yields the position information $t_{ij}$ of the moving target in the $j$-th frame.
S32: Based on the position information $t_{ij}$ of the moving target in the $j$-th frame acquired in S31, the corresponding position information $p_{ij}$ of the unmanned aerial vehicle's simulation window, and the displacement $d_{ij}$ of the target center point between the two adjacent frames, the invention uses a gated recurrent neural network to predict the flight displacement $\hat{d}^{u}_{ij}$ of the unmanned aerial vehicle in the current frame. The details are as follows:
$$z_{ij}=t_{ij}\,\|\,p_{ij}\,\|\,d_{ij}$$
$$z'_{ij}=W_{1}z_{ij}+b_{1}$$
$$h_{ij}=\mathrm{GRU}\left(z'_{ij},h_{i(j-1)}\right)$$
$$\hat{d}^{u}_{ij}=W_{2}h_{ij}+b_{2}$$
where $\|$ represents the concatenation of vectors, $\mathrm{GRU}(\cdot)$ represents the gated recurrent neural network, and $W_{1},W_{2}$ and $b_{1},b_{2}$ are the weight and bias parameters of the fully connected layers, respectively. With the mean squared error loss, the cost function can be expressed as:
$$\mathcal{L}=\sum_{i=1}^{M}\frac{1}{Q_{i}}\sum_{j=1}^{Q_{i}}\left\|\hat{d}^{u}_{ij}-d^{u}_{ij}\right\|^{2}$$
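A minimal PyTorch sketch of this data-driven predictor; the layer sizes, the 2-D displacement output, and the class name are illustrative assumptions rather than dimensions specified by the patent.

```python
import torch
import torch.nn as nn

class DroneDisplacementPredictor(nn.Module):
    """Predict the drone's per-frame displacement from the target box t_ij,
    the simulated drone window p_ij and the target-center shift d_ij."""
    def __init__(self, box_dim=4, win_dim=4, shift_dim=2, hidden=64):
        super().__init__()
        in_dim = box_dim + win_dim + shift_dim           # z_ij = t_ij || p_ij || d_ij
        self.fc_in = nn.Linear(in_dim, hidden)           # z'_ij = W1 z_ij + b1
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.fc_out = nn.Linear(hidden, 2)               # predicted displacement (dx, dy)

    def forward(self, t_seq, p_seq, d_seq):
        # Each input has shape (batch, sequence_length, feature_dim).
        z = torch.cat([t_seq, p_seq, d_seq], dim=-1)
        h, _ = self.gru(self.fc_in(z))                   # h_ij = GRU(z'_ij, h_i(j-1))
        return self.fc_out(h)

# Training against the extracted displacement labels with mean squared error
model = DroneDisplacementPredictor()
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```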
the embodiment of the disclosure discloses a moving target tracking method based on unmanned aerial vehicle vision in an electric power scene, aiming at improving the accuracy and the real-time performance of moving target tracking in an electric power system. The method is used for positioning and tracking the hidden danger target in a real-time low-power-consumption manner on the basis of a target detection algorithm YOLOv3 and a target tracking algorithm KCF. Secondly, the invention provides two flight control methods: a heuristic flight control method and a data-driven flight control method. The heuristic flight strategy aims at adaptively adjusting the flight speed and direction of the unmanned aerial vehicle aiming at moving objects with different speeds so as to enable a hidden danger target to be always positioned in a picture center; the data-driven flight strategy is mainly characterized in that the displacement of each frame of unmanned aerial vehicle is predicted based on a gated cyclic neural network according to a pre-extracted unmanned aerial vehicle tracking target displacement sequence.
Example II
This embodiment discloses a moving target tracking system based on unmanned aerial vehicle vision under electric power scene, includes:
a tracking module configured to: carrying out real-time positioning and tracking on the moving target to obtain the position of the moving target of the current frame;
an adaptively adjusting module configured to: based on the obtained current frame moving target position, the unmanned aerial vehicle is controlled to fly, so that the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the target moving condition, and the moving target is ensured to be always kept at the center of the visual field of the unmanned aerial vehicle.
Specifically, the adaptive adjustment module comprises a heuristic unmanned aerial vehicle flight control module and a data-driven unmanned aerial vehicle flight control module, the heuristic unmanned aerial vehicle flight control module being configured to:
control the flying speed of the unmanned aerial vehicle based on the distance by which the position of the moving target in the current frame is offset from that in the previous frame, so that the speed of the unmanned aerial vehicle stays consistent with the speed of the moving object;
if the difference between the x-axis coordinates of the target in the two frames is greater than zero, move the unmanned aerial vehicle at the computed speed along the positive half of the x-axis, and otherwise along the negative half of the x-axis;
and the data-driven unmanned aerial vehicle flight control module being configured to: perform tracking on each video and, during tracking, randomly move a simulation window that simulates the unmanned aerial vehicle on each frame of the video until the moving target can no longer be tracked, and, after multiple random movements, finally select the displacement sequences corresponding to the longest such sequences as ground-truth labels;
for the i-th sequence, obtain the position information of the moving target in the j-th frame;
and predict the flight displacement of the unmanned aerial vehicle in the current frame with the gated recurrent neural network, based on the acquired position of the moving target in the j-th frame, the corresponding position of the unmanned aerial vehicle's simulation window, and the displacement of the target center point between adjacent frames.
Referring again to fig. 1, the tracking module calculates the direction and displacement of the target between the current frame and the previous frame; in method 1 these are used directly to control the flying speed and direction of the unmanned aerial vehicle.
In method 2, the first fully connected layer maps the three different inputs into a common space so that they can conveniently be fed to the GRU network for encoding; the second fully connected layer maps the information produced by the GRU into flight control commands applicable to the drone.
EXAMPLE III
The present embodiment is directed to a computing device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor executes the computer program to implement the specific steps of the method.
Example four
An object of the present embodiment is to provide a computer-readable storage medium.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the specific steps of the above-mentioned method.
The steps involved in the apparatuses of the above second, third and fourth embodiments correspond to the first embodiment of the method, and the detailed description thereof can be found in the relevant description of the first embodiment. The term "computer-readable storage medium" should be taken to include a single medium or multiple media containing one or more sets of instructions; it should also be understood to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor and that cause the processor to perform any of the methods of the present disclosure.
Those skilled in the art will appreciate that the modules or steps of the present disclosure described above can be implemented using general purpose computer means, or alternatively, they can be implemented using program code executable by computing means, whereby the modules or steps may be stored in memory means for execution by the computing means, or separately fabricated into individual integrated circuit modules, or multiple modules or steps thereof may be fabricated into a single integrated circuit module. The present disclosure is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present disclosure and is not intended to limit the present disclosure, and various modifications and changes may be made to the present disclosure by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.
Although the present disclosure has been described with reference to specific embodiments, it should be understood that the scope of the present disclosure is not limited thereto, and those skilled in the art will appreciate that various modifications and changes can be made without departing from the spirit and scope of the present disclosure.

Claims (10)

1. A moving target tracking method based on unmanned aerial vehicle vision in an electric power scene is characterized by comprising the following steps:
carrying out real-time positioning and tracking on the moving target to obtain the position of the moving target of the current frame;
based on the obtained current frame moving target position, the unmanned aerial vehicle is controlled to fly, so that the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the target moving condition, and the moving target is ensured to be always kept at the center of the visual field of the unmanned aerial vehicle.
2. The method for tracking the moving target based on the vision of the unmanned aerial vehicle in the electric power scene as claimed in claim 1, wherein the position of the moving target of the current frame is obtained based on a combination of a target tracking algorithm and a target detection algorithm.
3. The method for tracking the moving target based on the vision of the unmanned aerial vehicle in the electric power scene as claimed in claim 2, wherein during the target tracking, a real-time video stream shot by the unmanned aerial vehicle is used for modeling, and the target tracking is performed by adopting a kernel correlation filtering based on a Gaussian kernel function, so as to obtain the coordinate position of the moving target to be tracked in each frame.
4. The method for tracking the moving object based on the vision of the unmanned aerial vehicle in the power scene as claimed in claim 2, wherein the target detection algorithm is used to recapture the moving object which may be lost, and specifically comprises:
and if the similarity of the color histograms of the target areas of the two adjacent frames predicted by target tracking is smaller than a certain threshold value or no moving target is detected in the current frame, reselecting the target area to perform subsequent target tracking.
5. The method for tracking the moving target based on the vision of the unmanned aerial vehicle in the power scene as claimed in claim 1, wherein when the flight of the unmanned aerial vehicle is controlled, a heuristic flight strategy is adopted: and controlling the running speed of the unmanned aerial vehicle based on the offset distance of the position of the moving target in the current frame relative to the previous frame, so that the speed of the unmanned aerial vehicle is consistent with the speed of the moving object.
6. The method for tracking the moving object based on the vision of the unmanned aerial vehicle in the electric power scene as claimed in claim 5, wherein the difference between the coordinates of the x-axis of the two frames of images is greater than zero, so that the unmanned aerial vehicle moves at a certain speed along the positive half-axis direction of the x-axis, and vice versa.
7. The method for tracking the moving target based on the vision of the unmanned aerial vehicle in the electric power scene as claimed in claim 1, wherein when the flight of the unmanned aerial vehicle is controlled, a data-driven flight strategy is adopted: predicting the displacement of each frame of unmanned aerial vehicle based on a gated cyclic neural network according to a pre-extracted unmanned aerial vehicle tracking target displacement sequence;
the method specifically comprises the following steps: tracking each video, in the tracking process, randomly moving a simulation window for simulating an unmanned aerial vehicle on each frame of the video until a moving target cannot be tracked, and finally selecting displacement sequences corresponding to a plurality of sequences with the longest length as real labels through multiple random movements;
aiming at the ith sequence, obtaining the position information of the moving target in the jth frame by the method;
and predicting the flight displacement of the current frame of the unmanned aerial vehicle by using the gating cyclic neural network, corresponding to the position information of the simulation window of the unmanned aerial vehicle and the displacement of the target central points of the front frame and the rear frame based on the acquired position information of the moving target in the jth frame.
8. Moving target tracking system based on unmanned aerial vehicle vision under electric power scene, characterized by includes:
a tracking module configured to: carrying out real-time positioning and tracking on the moving target to obtain the position of the moving target of the current frame;
an adaptively adjusting module configured to: based on the obtained current frame moving target position, the unmanned aerial vehicle is controlled to fly, so that the flying speed and direction of the unmanned aerial vehicle are adaptively adjusted according to the target moving condition, and the moving target is ensured to be always kept at the center of the visual field of the unmanned aerial vehicle.
9. A computing device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the method of any of claims 1-7 when executing the program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method steps of any one of the claims 1-7.
CN202011038186.1A 2020-09-28 2020-09-28 Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene Active CN112233141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011038186.1A CN112233141B (en) 2020-09-28 2020-09-28 Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011038186.1A CN112233141B (en) 2020-09-28 2020-09-28 Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene

Publications (2)

Publication Number Publication Date
CN112233141A true CN112233141A (en) 2021-01-15
CN112233141B CN112233141B (en) 2022-10-14

Family

ID=74120203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011038186.1A Active CN112233141B (en) 2020-09-28 2020-09-28 Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene

Country Status (1)

Country Link
CN (1) CN112233141B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191294A (en) * 2021-05-11 2021-07-30 山东浪潮科学研究院有限公司 Machine vision-based large-sized floating object collision dam prevention and detection method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096927A (en) * 2011-01-26 2011-06-15 北京林业大学 Target tracking method of independent forestry robot
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
CN107230219A (en) * 2017-05-04 2017-10-03 复旦大学 A kind of target person in monocular robot is found and follower method
WO2017185503A1 (en) * 2016-04-29 2017-11-02 高鹏 Target tracking method and apparatus
CN107909600A (en) * 2017-11-04 2018-04-13 南京奇蛙智能科技有限公司 The unmanned plane real time kinematics target classification and detection method of a kind of view-based access control model
CN108399642A (en) * 2018-01-26 2018-08-14 上海深视信息科技有限公司 A kind of the general target follower method and system of fusion rotor wing unmanned aerial vehicle IMU data
US20180231985A1 (en) * 2016-12-22 2018-08-16 TCL Research America Inc. System and method for vision-based flight self-stabilization by deep gated recurrent q-networks
CN110147122A (en) * 2019-06-14 2019-08-20 深圳市道通智能航空技术有限公司 A kind of method for tracing, device and the unmanned plane of mobile target
CN110222581A (en) * 2019-05-13 2019-09-10 电子科技大学 A kind of quadrotor drone visual target tracking method based on binocular camera
CN111680713A (en) * 2020-04-26 2020-09-18 中国科学院上海微系统与信息技术研究所 Unmanned aerial vehicle ground target tracking and approaching method based on visual detection

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102096927A (en) * 2011-01-26 2011-06-15 北京林业大学 Target tracking method of independent forestry robot
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
WO2017185503A1 (en) * 2016-04-29 2017-11-02 高鹏 Target tracking method and apparatus
US20180231985A1 (en) * 2016-12-22 2018-08-16 TCL Research America Inc. System and method for vision-based flight self-stabilization by deep gated recurrent q-networks
CN107230219A (en) * 2017-05-04 2017-10-03 复旦大学 A kind of target person in monocular robot is found and follower method
CN107909600A (en) * 2017-11-04 2018-04-13 南京奇蛙智能科技有限公司 The unmanned plane real time kinematics target classification and detection method of a kind of view-based access control model
CN108399642A (en) * 2018-01-26 2018-08-14 上海深视信息科技有限公司 A kind of the general target follower method and system of fusion rotor wing unmanned aerial vehicle IMU data
CN110222581A (en) * 2019-05-13 2019-09-10 电子科技大学 A kind of quadrotor drone visual target tracking method based on binocular camera
CN110147122A (en) * 2019-06-14 2019-08-20 深圳市道通智能航空技术有限公司 A kind of method for tracing, device and the unmanned plane of mobile target
CN111680713A (en) * 2020-04-26 2020-09-18 中国科学院上海微系统与信息技术研究所 Unmanned aerial vehicle ground target tracking and approaching method based on visual detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
AHMED DIRIR et al.: "Object Tracking Framework for Unmanned Aerial Vehicles", 2019 IEEE GLOBAL CONFERENCE ON INTERNET OF THINGS (GCIOT) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191294A (en) * 2021-05-11 2021-07-30 山东浪潮科学研究院有限公司 Machine vision-based large-sized floating object collision dam prevention and detection method
CN113191294B (en) * 2021-05-11 2022-06-17 山东浪潮科学研究院有限公司 Machine vision-based large-sized floating object collision dam prevention and detection method

Also Published As

Publication number Publication date
CN112233141B (en) 2022-10-14

Similar Documents

Publication Publication Date Title
CN109800689B (en) Target tracking method based on space-time feature fusion learning
Cui et al. Remote sensing object tracking with deep reinforcement learning under occlusion
CN110009665A (en) A kind of target detection tracking method blocked under environment
CN106022239A (en) Multi-target tracking method based on recurrent neural network
CN111176309B (en) Multi-unmanned aerial vehicle self-group mutual inductance understanding method based on spherical imaging
CN105913455A (en) Local image enhancement-based object tracking method
Hamdi et al. SADA: semantic adversarial diagnostic attacks for autonomous applications
CN112818783B (en) Improved confrontation sample generation method based on traffic sign target detector
CN102663775A (en) Target tracking method oriented to video with low frame rate
CN109658442A (en) Multi-object tracking method, device, equipment and computer readable storage medium
CN110070565A (en) A kind of ship trajectory predictions method based on image superposition
CN111680713A (en) Unmanned aerial vehicle ground target tracking and approaching method based on visual detection
Zhang et al. Transmission line abnormal target detection based on machine learning yolo v3
CN109508686A (en) A kind of Human bodys' response method based on the study of stratification proper subspace
CN112233141B (en) Moving target tracking method and system based on unmanned aerial vehicle vision in electric power scene
Liu et al. Data augmentation technology driven by image style transfer in self-driving car based on end-to-end learning
CN111833378A (en) Multi-unmanned aerial vehicle single-target tracking method and device based on proxy sharing network
Gao et al. Enhance sample efficiency and robustness of end-to-end urban autonomous driving via semantic masked world model
Jin et al. Graph neural network based relation learning for abnormal perception information detection in self-driving scenarios
Li A hierarchical autonomous driving framework combining reinforcement learning and imitation learning
Lu et al. Hybrid deep learning based moving object detection via motion prediction
CN110377033B (en) RGBD information-based small football robot identification and tracking grabbing method
Wang et al. Vehicle key information detection algorithm based on improved SSD
Wang et al. Research on improved pedestrian detection algorithm based on convolutional neural network
CN111862158B (en) Staged target tracking method, device, terminal and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant