CN112507885A - Method for identifying intrusion of inspection unmanned aerial vehicle - Google Patents


Info

Publication number
CN112507885A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
camera
state
intruding object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011455467.7A
Other languages
Chinese (zh)
Other versions
CN112507885B (en)
Inventor
付理祥
高洁
张祥罗
夏阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Power Supply Branch State Grid Jiangxi Province Electric Power Co ltd
State Grid Corp of China SGCC
Original Assignee
Nanchang Power Supply Branch State Grid Jiangxi Province Electric Power Co ltd
State Grid Corp of China SGCC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Power Supply Branch State Grid Jiangxi Province Electric Power Co ltd, State Grid Corp of China SGCC filed Critical Nanchang Power Supply Branch State Grid Jiangxi Province Electric Power Co ltd
Priority to CN202011455467.7A priority Critical patent/CN112507885B/en
Publication of CN112507885A publication Critical patent/CN112507885A/en
Application granted granted Critical
Publication of CN112507885B publication Critical patent/CN112507885B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Astronomy & Astrophysics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method for an inspection unmanned aerial vehicle to identify external intrusion, comprising the following steps: S1, coordinate calculation: according to the unmanned aerial vehicle camera's identification of the intruding object, establish the coordinates of the intruding object in the camera coordinate system; S2, state prediction: establish a state observer to predict the motion state of the intruding object; S3, risk identification: calculate, from the predicted state of the intruding object, whether it risks hitting the unmanned aerial vehicle. The method assists the line patrol unmanned aerial vehicle in identifying whether man-made intrusions such as thrown stones or slingshot projectiles threaten it; if they do, the unmanned aerial vehicle can avoid the danger in advance.

Description

Method for identifying intrusion of inspection unmanned aerial vehicle
Technical Field
The invention belongs to the technical field of power transmission and distribution, and particularly relates to a method for a line patrol unmanned aerial vehicle to identify external intrusion.
Background
Line patrol by unmanned aerial vehicle has greatly reduced the workload of grassroots teams and improved inspection quality. In certain rural areas, however, children may, out of curiosity or mischief, try to knock the unmanned aerial vehicle down with stones, slingshots or similar objects. Because the unmanned aerial vehicle is of high value, the resulting economic loss is often considerable.
Existing unmanned aerial vehicles mainly provide active obstacle avoidance methods, which handle active obstacle avoidance well; passive defense methods are rarely applied.
Disclosure of Invention
In order to avoid the threat of external man-made intrusion, the invention provides a method for a line patrol unmanned aerial vehicle to identify external intrusion. The method comprises three steps: computing the coordinates of the intruding object; predicting the state of the intruding object with a state observer built on the coordinate computation; and calculating the risk of the object hitting the unmanned aerial vehicle, thereby identifying whether man-made intrusions such as stones or slingshot projectiles threaten the unmanned aerial vehicle.
In order to achieve the purpose, the invention adopts the following technical scheme:
a method for identifying external intrusion by a line patrol unmanned aerial vehicle comprises the following steps:
s1, coordinate calculation: according to the unmanned aerial vehicle camera's identification of the intruding object, establishing the coordinates of the intruding object in the camera coordinate system;
s2, state prediction: establishing a state observer to predict the motion state of the intruding object;
s3, risk identification: calculating, from the predicted state of the intruding object, whether it risks hitting the unmanned aerial vehicle.
In step S1, the position of the intruding object is set as point P, and the unmanned aerial vehicle camera coordinate system overlaps with the world coordinate system, so that the spatial coordinate of point P is (X, Y, Z), and Z is the vertical distance from point P to the optical center of the camera; setting the intersection point of the P point and the image surface as a point P, the pixel coordinate as (x, y), the Z as the depth, and the f as the focal length of the camera;
the projection relationship is as follows:
x = fX/Z (1)
y = fY/Z (2)
The above formulas assume that the origin is at the center of the image, which is offset from the pixel coordinate system of the image; if the pixel coordinates of the optical center on the image are (c_x, c_y), the projection relationship is corrected to:
x = fX/Z + c_x (3)
y = fY/Z + c_y (4)
The focal length f and c_x, c_y are camera intrinsic parameters, and x, y are the coordinates of the intruding object on the image; the coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
X = (x - c_x)Z/f, Y = (y - c_y)Z/f, Z = Z (5)
The coordinates are more convenient to use after conversion: the world coordinate system is transformed into the camera coordinate system, where R is a rotation matrix with three rows and three columns and T is a translation vector with three rows and one column;
sx = K[RX + T] (6)
where s is a scale factor, x is the homogeneous pixel coordinate, and K is the camera intrinsic parameter matrix.
S2, the specific process of state prediction is as follows:
decomposing the movement of the intruding object into the x, y and z directions, which are mutually independent; from the camera's viewpoint, x is the left-right direction, y the up-down direction, and z the front-back direction;
the x direction:
\dot{x} = v_x (7)
\dot{v}_x = 0 (8)
wherein: v_x is the velocity in the left-right direction, and \dot{v}_x is the acceleration in the left-right direction;
the y direction:
\dot{y} = v_y (9)
\dot{v}_y = -g (10)
wherein: v_y is the velocity in the up-down direction, \dot{v}_y is the acceleration in the up-down direction, and g is the gravitational acceleration;
the z direction:
\dot{z} = v_z (11)
\dot{v}_z = 0 (12)
wherein: v_z is the velocity in the front-back direction, and \dot{v}_z is the acceleration in the front-back direction;
setting the state vector:
x = [x, v_x, y, v_y, z, v_z]^T (13)
then:
\dot{x} = [v_x, 0, v_y, -g, v_z, 0]^T (14)
the transformation into matrix form is then:
\dot{x} = [0 1 0 0 0 0; 0 0 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 0 0; 0 0 0 0 0 1; 0 0 0 0 0 0] x + [0; 0; 0; 1; 0; 0] u, with u = -g (15)
y = [1 0 0 0 0 0; 0 0 1 0 0 0; 0 0 0 0 1 0] x (16)
(15) and (16) can be abbreviated as:
\dot{x} = Ax + Bu (17)
y = Cx (18)
(17) and (18) form the state-space model of the intruding object; knowing the state at one moment, the state at the next moment can be predicted, and state feedback is added:
\dot{\hat{x}} = A\hat{x} + Bu + L(y - C\hat{x}) (19)
in (19), L is the observer gain; the estimation error e = x - \hat{x} then satisfies
\dot{e} = (A - LC)e (20)
in the formulas, A, B and C are the model matrices defined above, and u is the control input.
The beneficial effect of the invention is that it assists the line patrol unmanned aerial vehicle in identifying whether man-made intrusions such as stones or slingshot projectiles threaten it; if they do, the unmanned aerial vehicle can avoid the danger in advance.
Drawings
FIG. 1 is a flow chart of the present invention.
Fig. 2 is a schematic diagram of the projection of three-dimensional coordinates onto a two-dimensional plane in the present invention.
Fig. 3 is a schematic diagram of coordinate transformation in the present invention.
Fig. 4 is a schematic diagram of a state observer according to the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the following embodiments, and it should be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
A method for identifying intrusion of line patrol unmanned aerial vehicle is completed through coordinate calculation, state prediction and risk identification, and referring to FIG. 1, the method comprises the following steps:
s1, coordinate calculation: and establishing coordinates of the intruding object in a camera coordinate system according to the identification of the unmanned aerial vehicle camera on the intruding object.
S2, state prediction: and establishing a state observer to predict the motion state of the invader.
S3, risk identification: and calculating whether the intruding object has the risk of hitting the unmanned aerial vehicle according to the state prediction condition of the intruding object.
S1, the specific process of coordinate calculation is as follows:
referring to fig. 2, the position of the intruding object is taken as point P, and the unmanned aerial vehicle camera coordinate system is assumed to coincide with the world coordinate system, so that the spatial coordinates of P are (X, Y, Z), with Z the perpendicular distance from P to the optical center of the camera. The intersection of the ray through P with the image plane is the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera.
The projection relationship is as follows:
x = fX/Z (1)
y = fY/Z (2)
The above formulas assume that the origin is at the center of the image, which is offset from the pixel coordinate system of the image. If the pixel coordinates of the optical center on the image are (c_x, c_y), the projection relationship is corrected to:
x = fX/Z + c_x (3)
y = fY/Z + c_y (4)
The focal length f and c_x, c_y are camera intrinsic parameters, and x, y are the coordinates of the intruding object on the image. The coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
X = (x - c_x)Z/f, Y = (y - c_y)Z/f, Z = Z (5)
as shown in fig. 3, the coordinates need to be converted for more convenient use: the world coordinate system is transformed into the camera coordinate system, where R is a rotation matrix with three rows and three columns and T is a translation vector with three rows and one column.
sx = K[RX + T] (6)
where s is a scale factor, x is the homogeneous pixel coordinate, and K is the camera intrinsic parameter matrix.
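As a quick sanity check, the back-projection from pixel coordinates and depth to the three-dimensional point (X, Y, Z), together with the rigid transform of equation (6), can be sketched in a few lines. The patent's own code examples are in MATLAB; the Python below is only an illustration, and the camera intrinsics and pixel values in it are made up for the example.

```python
import numpy as np

def back_project(x, y, Z, f, cx, cy):
    """Recover the 3-D point (X, Y, Z) in the camera frame from the
    pixel coordinates (x, y) of the intruding object and its depth Z."""
    X = (x - cx) * Z / f
    Y = (y - cy) * Z / f
    return np.array([X, Y, Z])

def world_to_camera(X_world, R, T):
    """Rigid transform of equation (6): rotate by the 3x3 matrix R and
    translate by the 3x1 vector T."""
    return R @ np.asarray(X_world, float) + T

# Hypothetical intrinsics: f = 800 px, optical center (320, 240);
# object seen at pixel (400, 180) at a depth of 10 m.
P = back_project(400.0, 180.0, 10.0, 800.0, 320.0, 240.0)
# P is (1.0, -0.75, 10.0)
```

Projecting P back through the corrected pinhole model (3)-(4) returns the original pixel, which is an easy way to unit-test the intrinsic parameters in practice.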
S2, the specific process of state prediction is as follows:
the motion of the intruding object is decomposed into x, y and z directions, the motion in the x, y and z directions is independent, and the x is the left-right direction, the y is the up-down direction and the z is the front-back direction according to the view angle of the camera.
The x direction:
\dot{x} = v_x (7)
\dot{v}_x = 0 (8)
wherein: v_x is the velocity in the left-right direction, and \dot{v}_x is the acceleration in the left-right direction.
The y direction:
\dot{y} = v_y (9)
\dot{v}_y = -g (10)
wherein: v_y is the velocity in the up-down direction, \dot{v}_y is the acceleration in the up-down direction, and g is the gravitational acceleration.
The z direction:
\dot{z} = v_z (11)
\dot{v}_z = 0 (12)
wherein: v_z is the velocity in the front-back direction, and \dot{v}_z is the acceleration in the front-back direction.
Setting the state vector:
x = [x, v_x, y, v_y, z, v_z]^T (13)
then:
\dot{x} = [v_x, 0, v_y, -g, v_z, 0]^T (14)
the transformation into a matrix is then:
Figure BDA0002828602790000068
Figure BDA0002828602790000069
(15) and (16) can be abbreviated as:
\dot{x} = Ax + Bu (17)
y = Cx (18)
(17) and (18) form the state-space model of the intruding object; knowing the state at one moment, the state at the next moment can be predicted, and state feedback is added, as shown in fig. 4.
\dot{\hat{x}} = A\hat{x} + Bu + L(y - C\hat{x}) (19)
In (19), L is the observer gain. The estimation error e = x - \hat{x} then satisfies
\dot{e} = (A - LC)e (20)
where A, B and C are the model matrices defined above, and u is the control input.
If the observer error is required to converge within 100 ms, the eigenvalues of A - LC can be placed at [-10, -9, -10, -9, -10, -9]; the calculation in MATLAB then yields:
L=[19 90 0 0 0 0;0 0 19 90 0 0;0 0 0 0 19 90] (21)
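The gain in (21) can be cross-checked numerically. The sketch below assumes the state ordering x = [x, v_x, y, v_y, z, v_z]^T (an assumption of this reconstruction, not spelled out in the text) and reads the 3x6 matrix of (21) as the transpose of the 6x3 observer gain L; under those assumptions the error dynamics A - LC have eigenvalues -10 and -9 on each axis, matching the chosen roots.

```python
import numpy as np

# Double-integrator model on each axis, assumed state ordering
# [x, vx, y, vy, z, vz]: each position's derivative is its velocity.
A = np.zeros((6, 6))
A[0, 1] = A[2, 3] = A[4, 5] = 1.0
# The camera measures the three positions only.
C = np.zeros((3, 6))
C[0, 0] = C[1, 2] = C[2, 4] = 1.0
# Gain of equation (21), read here as the transpose of the 6x3 gain L.
L = np.array([[19.0, 90, 0, 0, 0, 0],
              [0, 0, 19, 90, 0, 0],
              [0, 0, 0, 0, 19, 90]]).T

eigs = np.sort(np.linalg.eigvals(A - L @ C).real)
# Each axis contributes the characteristic polynomial
# s^2 + 19 s + 90 = (s + 10)(s + 9), so eigs is three copies of {-10, -9}.
```

With all eigenvalues near -10, the estimation error of (20) decays with a time constant of roughly 0.1 s, consistent with the 100 ms convergence target.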
s3, risk identification:
using the spatial geometric relationship, the intersection point of the velocity vector with the plane containing the camera is found; at the same time the time to intersection is estimated, the drop caused by gravity over that time is calculated, and the intersection position in the camera plane is predicted.
An example of programming with MATLAB is as follows:
function [result] = get_meetpoint(planevec, planepoint, linevec, linepoint)
% Intersect the line p(t) = linepoint + t*linevec with the plane
% through planepoint whose normal is planevec.
vp1 = planevec(1); vp2 = planevec(2); vp3 = planevec(3);
n1 = planepoint(1); n2 = planepoint(2); n3 = planepoint(3);
v1 = linevec(1); v2 = linevec(2); v3 = linevec(3);
m1 = linepoint(1); m2 = linepoint(2); m3 = linepoint(3);
vpt = v1*vp1 + v2*vp2 + v3*vp3;   % dot product of line direction and plane normal
if (vpt == 0)
    result = [];                  % line is parallel to the plane: no intersection
else
    t = ((n1-m1)*vp1 + (n2-m2)*vp2 + (n3-m3)*vp3)/vpt;
    result = [m1+v1*t, m2+v2*t, m3+v3*t, t];
end
end
the first three components of the function output are the three-dimensional coordinates of the intersection point, and the fourth is the flight time.
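For readers working outside MATLAB, the same ray-plane intersection, plus the gravity drop over the flight time mentioned in step S3, can be sketched in Python; the function name is kept for comparison only, and the stone trajectory numbers are invented for the example.

```python
import numpy as np

def get_meetpoint(planevec, planepoint, linevec, linepoint):
    """Intersect the line p(t) = linepoint + t * linevec with the plane
    through `planepoint` whose normal is `planevec`.  Returns
    (x, y, z, t), or None when the line is parallel to the plane."""
    planevec = np.asarray(planevec, float)
    linevec = np.asarray(linevec, float)
    linepoint = np.asarray(linepoint, float)
    vpt = float(np.dot(linevec, planevec))
    if vpt == 0.0:
        return None
    t = float(np.dot(np.asarray(planepoint, float) - linepoint, planevec)) / vpt
    x, y, z = linepoint + t * linevec
    return (x, y, z, t)

# Stone at (0, 5, 20) m moving at (0, 0, -10) m/s toward the camera
# plane z = 0 (normal (0, 0, 1)) -- illustrative numbers only.
x, y, z, t = get_meetpoint([0, 0, 1], [0, 0, 0], [0, 0, -10], [0, 5, 20])
# The drop due to gravity over the flight time t:
drop = 0.5 * 9.8 * t ** 2
```

Comparing the predicted intersection point, corrected by `drop`, with the unmanned aerial vehicle's position in the camera plane is then a direct way to implement the risk decision of step S3.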
The above description covers only preferred embodiments of the invention, but the scope of protection of the invention is not limited thereto; any equivalent substitution or modification made by a person skilled in the art according to the technical solution and inventive concept of the invention, within the technical scope disclosed by the invention, shall fall within the scope of protection of the invention.

Claims (4)

1. A method for identifying external intrusion of a line patrol unmanned aerial vehicle is characterized by comprising the following steps:
s1, coordinate calculation: according to the unmanned aerial vehicle camera's identification of the intruding object, establishing the coordinates of the intruding object in the camera coordinate system;
s2, state prediction: establishing a state observer to predict the motion state of the intruding object;
s3, risk identification: calculating, from the predicted state of the intruding object, whether it risks hitting the unmanned aerial vehicle.
2. The method for identifying external intrusion by a line patrol unmanned aerial vehicle according to claim 1, wherein:
in step S1, the position of the intruding object is taken as point P, and the unmanned aerial vehicle camera coordinate system is assumed to coincide with the world coordinate system, so that the spatial coordinates of P are (X, Y, Z), with Z the perpendicular distance from P to the optical center of the camera; the intersection of the ray through P with the image plane is the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera;
the projection relationship is as follows:
x = fX/Z (1)
y = fY/Z (2)
The above formulas assume that the origin is at the center of the image, which is offset from the pixel coordinate system of the image; if the pixel coordinates of the optical center on the image are (c_x, c_y), the projection relationship is corrected to:
x = fX/Z + c_x (3)
y = fY/Z + c_y (4)
The focal length f and c_x, c_y are camera intrinsic parameters, and x, y are the coordinates of the intruding object on the image; the coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
X = (x - c_x)Z/f, Y = (y - c_y)Z/f, Z = Z (5)
The coordinates are more convenient to use after conversion: the world coordinate system is transformed into the camera coordinate system, where R is a rotation matrix with three rows and three columns and T is a translation vector with three rows and one column;
sx = K[RX + T] (6)
where s is a scale factor, x is the homogeneous pixel coordinate, and K is the camera intrinsic parameter matrix.
3. The method for identifying external intrusion by a line patrol unmanned aerial vehicle according to claim 1, wherein in step S2 the specific process of state prediction is as follows:
decomposing the movement of the intruding object into the x, y and z directions, which are mutually independent; from the camera's viewpoint, x is the left-right direction, y the up-down direction, and z the front-back direction;
the x direction:
\dot{x} = v_x (7)
\dot{v}_x = 0 (8)
wherein: v_x is the velocity in the left-right direction, and \dot{v}_x is the acceleration in the left-right direction;
the y direction:
\dot{y} = v_y (9)
\dot{v}_y = -g (10)
wherein: v_y is the velocity in the up-down direction, \dot{v}_y is the acceleration in the up-down direction, and g is the gravitational acceleration;
the z direction:
\dot{z} = v_z (11)
\dot{v}_z = 0 (12)
wherein: v_z is the velocity in the front-back direction, and \dot{v}_z is the acceleration in the front-back direction;
setting the state vector:
x = [x, v_x, y, v_y, z, v_z]^T (13)
then:
\dot{x} = [v_x, 0, v_y, -g, v_z, 0]^T (14)
the transformation into matrix form is then:
\dot{x} = [0 1 0 0 0 0; 0 0 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 0 0; 0 0 0 0 0 1; 0 0 0 0 0 0] x + [0; 0; 0; 1; 0; 0] u, with u = -g (15)
y = [1 0 0 0 0 0; 0 0 1 0 0 0; 0 0 0 0 1 0] x (16)
(15) and (16) can be abbreviated as:
\dot{x} = Ax + Bu (17)
y = Cx (18)
(17) and (18) form the state-space model of the intruding object; knowing the state at one moment, the state at the next moment can be predicted, and state feedback is added:
\dot{\hat{x}} = A\hat{x} + Bu + L(y - C\hat{x}) (19)
in (19), L is the observer gain; the estimation error e = x - \hat{x} then satisfies
\dot{e} = (A - LC)e (20)
in the formulas, A, B and C are the model matrices defined above, and u is the control input.
4. The method for identifying external intrusion by a line patrol unmanned aerial vehicle according to claim 1, wherein the spatial geometric relationship is used to obtain the intersection point of the velocity vector with the plane containing the camera, the time to intersection is estimated, the drop caused by gravity over that time is calculated, and the intersection position in the camera plane is predicted.
CN202011455467.7A 2020-12-10 2020-12-10 Method for identifying intrusion by line inspection unmanned aerial vehicle Active CN112507885B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011455467.7A CN112507885B (en) 2020-12-10 2020-12-10 Method for identifying intrusion by line inspection unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN112507885A true CN112507885A (en) 2021-03-16
CN112507885B CN112507885B (en) 2023-07-21

Family

ID=74973442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011455467.7A Active CN112507885B (en) 2020-12-10 2020-12-10 Method for identifying intrusion by line inspection unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112507885B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090080701A1 (en) * 2007-09-20 2009-03-26 Mirko Meuter Method for object tracking
CN107314771A (en) * 2017-07-04 2017-11-03 合肥工业大学 Unmanned plane positioning and attitude angle measuring method based on coded target
CN108829130A (en) * 2018-06-11 2018-11-16 重庆大学 A kind of unmanned plane patrol flight control system and method
CN109540126A (en) * 2018-12-03 2019-03-29 哈尔滨工业大学 A kind of inertia visual combination air navigation aid based on optical flow method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
毛厚晨; 宋敏; 高文明; 甘旭升: "UAV collision avoidance control method based on situation prediction" (基于态势预测的无人机防相撞控制方法), Fire Control & Command Control (火力与指挥控制), no. 11

Also Published As

Publication number Publication date
CN112507885B (en) 2023-07-21

Similar Documents

Publication Publication Date Title
CN111735445B (en) Monocular vision and IMU (inertial measurement Unit) integrated coal mine tunnel inspection robot system and navigation method
CN107315410B (en) Automatic obstacle removing method for robot
CN109872372A (en) A kind of small-sized quadruped robot overall Vision localization method and system
CN109669474B (en) Priori knowledge-based multi-rotor unmanned aerial vehicle self-adaptive hovering position optimization algorithm
CN110433467A (en) Picking up table tennis ball robot operation method and equipment based on binocular vision and ant group algorithm
Ma et al. Crlf: Automatic calibration and refinement based on line feature for lidar and camera in road scenes
CN103558868B (en) The nozzle spray angle opertaing device of concrete sprayer, method and engineering machinery
CN116645649A (en) Vehicle pose and size estimation method, device and storage medium
CN112509054A (en) Dynamic calibration method for external parameters of camera
Leung et al. Hybrid terrain traversability analysis in off-road environments
CN112507885A (en) Method for identifying intrusion of inspection unmanned aerial vehicle
Barasuol et al. Reactive trotting with foot placement corrections through visual pattern classification
Xin et al. Visual navigation for mobile robot with Kinect camera in dynamic environment
Zhao et al. Environmental perception and sensor data fusion for unmanned ground vehicle
CN108733076B (en) Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment
CN106933233A (en) A kind of unmanned plane obstacle avoidance system and method based on interval flow field
CN112947570B (en) Unmanned aerial vehicle obstacle avoidance method and device and storage medium
CN114661051A (en) Front obstacle avoidance system based on RGB-D
CN114973037A (en) Unmanned aerial vehicle intelligent detection and synchronous positioning multi-target method
CN112686963B (en) Target positioning method of aerial work robot for shielding
Ren et al. Teleoperation of unmanned ground vehicles based on 3D trajectory prediction
Yang et al. A new algorithm for obstacle segmentation in dynamic environments using a RGB-D sensor
Aggarwal et al. Vision based collision avoidance by plotting a virtual obstacle on depth map
Jiang et al. Adaptive image-based visual servoing of nonholonomic mobile robot with on-board camera
Wang et al. Visual servoing control of video tracking system for tracking a flying target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant