CN112507885B - Method for identifying intrusion by line inspection unmanned aerial vehicle - Google Patents
- Publication number
- CN112507885B (application CN202011455467.7A)
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- camera
- state
- invasion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Astronomy & Astrophysics (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a method for a line inspection unmanned aerial vehicle (UAV) to identify external intrusions, comprising the following steps: S1, coordinate calculation: establishing the coordinates of the intruding object in a camera coordinate system from the UAV camera's identification of the object; S2, state prediction: establishing a state observer to predict the motion state of the intruding object; S3, risk identification: calculating, from the predicted state of the intruding object, whether it will strike the UAV. The method helps the line inspection UAV determine whether man-made intrusions such as stones or slingshot projectiles threaten the UAV; if a threat exists, the UAV can take evasive action in advance.
Description
Technical Field
The invention belongs to the technical field of power transmission and distribution, and particularly relates to a method for a line inspection unmanned aerial vehicle to identify external intrusions.
Background
UAV line inspection has greatly reduced the workload of front-line crews and improved inspection quality. However, in some areas, especially rural ones, children acting out of curiosity or mischief occasionally try to bring down the UAV with stones, slingshot projectiles, or similar objects. Because UAVs are expensive, the resulting economic loss can be considerable.
Existing UAVs mainly provide active obstacle avoidance, which handles that task well, but passive defense methods have seen little application.
Disclosure of Invention
In order to avoid the threat of external man-made intrusions, the invention provides a method for a line inspection unmanned aerial vehicle to identify external intrusions. The method comprises three steps: coordinate calculation for the intruding object, state prediction, and risk identification. Based on the calculated coordinates, a state observer predicts the motion state of the intruding object, and the risk of the object striking the UAV is calculated, so that the method can judge whether man-made intrusions such as stones or slingshot projectiles threaten the UAV.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
A method for a line inspection unmanned aerial vehicle to identify an external intrusion comprises the following steps:
S1, coordinate calculation: establishing the coordinates of the intruding object in a camera coordinate system from the UAV camera's identification of the object;
S2, state prediction: establishing a state observer to predict the motion state of the intruding object;
S3, risk identification: calculating, from the predicted state of the intruding object, whether it will strike the UAV.
In step S1, the position of the intruding object is taken as a point P. Since the UAV camera coordinate system coincides with the world coordinate system, the coordinates of P in space are (X, Y, Z), where Z is the perpendicular distance from P to the camera optical center. Let the projection of P onto the image plane be the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera.

The projection relations are:

$$x = f\frac{X}{Z} \tag{1}$$

$$y = f\frac{Y}{Z} \tag{2}$$

The formulas above assume the origin lies at the center of the image, which is offset from the image's pixel coordinate system. Letting the pixel coordinates of the optical center on the image be $(c_x, c_y)$, the projection relations are corrected to:

$$x = f\frac{X}{Z} + c_x \tag{3}$$

$$y = f\frac{Y}{Z} + c_y \tag{4}$$

The focal length f and the offsets $c_x$, $c_y$ are intrinsic parameters of the camera; here x and y are the coordinates of the intruding object on the image. The coordinates (X, Y, Z) of the intruding object in three-dimensional space are then:

$$X = \frac{(x - c_x)Z}{f}, \qquad Y = \frac{(y - c_y)Z}{f} \tag{5}$$

The coordinates are more convenient to use after conversion from the world coordinate system to the camera coordinate system, where R is a 3×3 rotation matrix and T is a 3×1 displacement vector:

$$s\mathbf{x} = K[R\mathbf{X} + T] \tag{6}$$

where s is a scale factor, $\mathbf{x}$ is the homogeneous pixel coordinate vector, and K is the matrix of camera intrinsic parameters.
The specific process of step S2, state prediction, is as follows:

The motion of the intruding object is decomposed into the x, y and z directions, which are mutually independent; viewed from the camera, x is the left-right direction, y the up-down direction, and z the front-back direction.

x direction:

$$\dot{p}_x = v_x \tag{7}$$

$$\dot{v}_x = a_x \approx 0 \tag{8}$$

where $v_x$ is the velocity and $a_x$ the acceleration in the left-right direction;

y direction:

$$\dot{p}_y = v_y \tag{9}$$

$$\dot{v}_y = a_y = -g \tag{10}$$

where $v_y$ is the velocity and $a_y$ the acceleration in the up-down direction, and g is the gravitational acceleration;

z direction:

$$\dot{p}_z = v_z \tag{11}$$

$$\dot{v}_z = a_z \approx 0 \tag{12}$$

where $v_z$ is the velocity and $a_z$ the acceleration in the front-back direction.

Setting:

$$\mathbf{x} = [p_x\ \ v_x\ \ p_y\ \ v_y\ \ p_z\ \ v_z]^{\mathrm T} \tag{13}$$

then:

$$\dot{\mathbf{x}} = [v_x\ \ 0\ \ v_y\ \ {-g}\ \ v_z\ \ 0]^{\mathrm T} \tag{14}$$

In matrix form:

$$\dot{\mathbf{x}} = \begin{bmatrix} 0&1&0&0&0&0\\ 0&0&0&0&0&0\\ 0&0&0&1&0&0\\ 0&0&0&0&0&0\\ 0&0&0&0&0&1\\ 0&0&0&0&0&0 \end{bmatrix}\mathbf{x} + \begin{bmatrix} 0\\0\\0\\1\\0\\0 \end{bmatrix}u, \qquad u = -g \tag{15}$$

$$\mathbf{y} = \begin{bmatrix} 1&0&0&0&0&0\\ 0&0&1&0&0&0\\ 0&0&0&0&1&0 \end{bmatrix}\mathbf{x} \tag{16}$$

Equations (15) and (16) can be abbreviated as:

$$\dot{\mathbf{x}} = A\mathbf{x} + Bu \tag{17}$$

$$\mathbf{y} = C\mathbf{x} \tag{18}$$

Equations (17) and (18) are the state space model of the intruding object: knowing the state at one moment, the state at the next moment can be predicted. Adding state feedback gives the observer:

$$\dot{\hat{\mathbf{x}}} = A\hat{\mathbf{x}} + Bu + L(\mathbf{y} - C\hat{\mathbf{x}}) \tag{19}$$

In (19), L is the observer gain; rearranged:

$$\dot{\hat{\mathbf{x}}} = (A - LC)\hat{\mathbf{x}} + Bu + L\mathbf{y} \tag{20}$$

where A, B and C are the matrix parameters of (15) and (16) and u is the control input.
The beneficial effect of the invention is that it helps the line inspection UAV identify whether man-made intrusions such as stones or slingshot projectiles threaten the UAV; if a threat exists, the UAV can take evasive action in advance.
Drawings
Fig. 1 is a flow chart of the present invention.
Fig. 2 is a schematic view of three-dimensional coordinate projection onto a two-dimensional plane in the present invention.
Fig. 3 is a schematic diagram of coordinate transformation in the present invention.
Fig. 4 is a schematic diagram of a state observer in the present invention.
Detailed Description
The technical solutions of the present invention are described below clearly and completely in connection with the embodiments; obviously, the described embodiments are only some, not all, of the embodiments of the present invention.
The method by which the line inspection unmanned aerial vehicle identifies an external intrusion is completed through coordinate calculation, state prediction and risk identification. Referring to fig. 1, the steps are as follows:
S1, coordinate calculation: establish the coordinates of the intruding object in the camera coordinate system from the UAV camera's identification of the object.
S2, state prediction: establish a state observer to predict the motion state of the intruding object.
S3, risk identification: from the predicted state of the intruding object, calculate whether it will strike the UAV.
The specific process of step S1, coordinate calculation, is as follows:

Referring to fig. 2, the position of the intruding object is taken as a point P. Since the UAV camera coordinate system coincides with the world coordinate system, the coordinates of P in space are (X, Y, Z), where Z is the perpendicular distance from P to the camera optical center. Let the projection of P onto the image plane be the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera.

The projection relations are:

$$x = f\frac{X}{Z} \tag{1}$$

$$y = f\frac{Y}{Z} \tag{2}$$

The formulas above assume the origin lies at the center of the image, which is offset from the image's pixel coordinate system. Letting the pixel coordinates of the optical center on the image be $(c_x, c_y)$, the projection relations are corrected to:

$$x = f\frac{X}{Z} + c_x \tag{3}$$

$$y = f\frac{Y}{Z} + c_y \tag{4}$$

The focal length f and the offsets $c_x$, $c_y$ are intrinsic parameters of the camera; here x and y are the coordinates of the intruding object on the image.

The coordinates (X, Y, Z) of the intruding object in three-dimensional space are then:

$$X = \frac{(x - c_x)Z}{f}, \qquad Y = \frac{(y - c_y)Z}{f} \tag{5}$$

As shown in fig. 3, the coordinates are more convenient to use after conversion from the world coordinate system to the camera coordinate system, where R is a 3×3 rotation matrix and T is a 3×1 displacement vector:

$$s\mathbf{x} = K[R\mathbf{X} + T] \tag{6}$$

where s is a scale factor, $\mathbf{x}$ is the homogeneous pixel coordinate vector, and K is the matrix of camera intrinsic parameters.
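The back-projection of equations (3)–(5) can be sketched as follows. The patent's own listing is MATLAB; this is a Python stand-in, and the intrinsic values `f`, `cx`, `cy` are illustrative assumptions, not values from the patent:

```python
# Sketch of equations (3)-(5): recover the intruding object's camera-frame
# coordinates (X, Y, Z) from its pixel position and measured depth.
# The intrinsics f, cx, cy below are assumed for illustration only.

def project(X, Y, Z, f, cx, cy):
    """Forward pinhole projection with principal-point offset, eqs (3)-(4)."""
    return (f * X / Z + cx, f * Y / Z + cy)

def back_project(x_px, y_px, depth, f, cx, cy):
    """Inverse mapping of eq (5): (x, y, Z) -> (X, Y, Z)."""
    return ((x_px - cx) * depth / f, (y_px - cy) * depth / f, depth)

# Round trip with assumed intrinsics: a point 10 m in front of the camera
f, cx, cy = 800.0, 320.0, 240.0
x_px, y_px = project(1.0, -0.5, 10.0, f, cx, cy)
P = back_project(x_px, y_px, 10.0, f, cx, cy)
print(P)  # (1.0, -0.5, 10.0)
```

Note that the inversion requires the depth Z from elsewhere (the patent treats it as known), since a single pixel alone only fixes a viewing ray.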
The specific process of step S2, state prediction, is as follows:

The motion of the intruding object is decomposed into the x, y and z directions, which are mutually independent; viewed from the camera, x is the left-right direction, y the up-down direction, and z the front-back direction.

x direction:

$$\dot{p}_x = v_x \tag{7}$$

$$\dot{v}_x = a_x \approx 0 \tag{8}$$

where $v_x$ is the velocity and $a_x$ the acceleration in the left-right direction.

y direction:

$$\dot{p}_y = v_y \tag{9}$$

$$\dot{v}_y = a_y = -g \tag{10}$$

where $v_y$ is the velocity and $a_y$ the acceleration in the up-down direction, and g is the gravitational acceleration.

z direction:

$$\dot{p}_z = v_z \tag{11}$$

$$\dot{v}_z = a_z \approx 0 \tag{12}$$

where $v_z$ is the velocity and $a_z$ the acceleration in the front-back direction.

Setting:

$$\mathbf{x} = [p_x\ \ v_x\ \ p_y\ \ v_y\ \ p_z\ \ v_z]^{\mathrm T} \tag{13}$$

then:

$$\dot{\mathbf{x}} = [v_x\ \ 0\ \ v_y\ \ {-g}\ \ v_z\ \ 0]^{\mathrm T} \tag{14}$$

In matrix form:

$$\dot{\mathbf{x}} = \begin{bmatrix} 0&1&0&0&0&0\\ 0&0&0&0&0&0\\ 0&0&0&1&0&0\\ 0&0&0&0&0&0\\ 0&0&0&0&0&1\\ 0&0&0&0&0&0 \end{bmatrix}\mathbf{x} + \begin{bmatrix} 0\\0\\0\\1\\0\\0 \end{bmatrix}u, \qquad u = -g \tag{15}$$

$$\mathbf{y} = \begin{bmatrix} 1&0&0&0&0&0\\ 0&0&1&0&0&0\\ 0&0&0&0&1&0 \end{bmatrix}\mathbf{x} \tag{16}$$

Equations (15) and (16) can be abbreviated as:

$$\dot{\mathbf{x}} = A\mathbf{x} + Bu \tag{17}$$

$$\mathbf{y} = C\mathbf{x} \tag{18}$$

Equations (17) and (18) are the state space model of the intruding object: knowing the state at one moment, the state at the next moment can be predicted. Adding state feedback gives the observer shown in fig. 4:

$$\dot{\hat{\mathbf{x}}} = A\hat{\mathbf{x}} + Bu + L(\mathbf{y} - C\hat{\mathbf{x}}) \tag{19}$$

In (19), L is the observer gain; rearranged:

$$\dot{\hat{\mathbf{x}}} = (A - LC)\hat{\mathbf{x}} + Bu + L\mathbf{y} \tag{20}$$

where A, B and C are the matrix parameters of (15) and (16) and u is the control input.
If the observer error is required to converge within roughly 100 ms, the eigenvalues of A − LC can be placed at [−10, −9, −10, −9, −10, −9]; computing the gain in MATLAB then yields:

L = [19 90 0 0 0 0; 0 0 19 90 0 0; 0 0 0 0 19 90]ᵀ (21)
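As a cross-check of the MATLAB result, the short Python computation below (a stand-in sketch, not the patent's code) confirms that the per-axis gain pair (19, 90) places the eigenvalues of A − LC at −10 and −9, and that an initial 1 m estimation error largely dies out within 100 ms:

```python
import math

# Each axis of the model is a double integrator: A = [[0, 1], [0, 0]],
# C = [1, 0], gain l = [l1, l2], so A - LC = [[-l1, 1], [-l2, 0]] with
# characteristic polynomial s^2 + l1*s + l2. Gains from equation (21):
l1, l2 = 19.0, 90.0

# Roots of s^2 + l1*s + l2 = 0 via the quadratic formula
disc = l1 * l1 - 4.0 * l2
roots = sorted([(-l1 - math.sqrt(disc)) / 2.0, (-l1 + math.sqrt(disc)) / 2.0])
print(roots)  # [-10.0, -9.0] -- the chosen characteristic roots

# Euler simulation of the error dynamics e' = (A - LC) e over 100 ms,
# starting from a 1 m position-estimate error
dt, e = 0.001, [1.0, 0.0]
for _ in range(100):
    de = [-l1 * e[0] + e[1], -l2 * e[0]]
    e = [e[0] + dt * de[0], e[1] + dt * de[1]]
print(abs(e[0]))  # residual position error: a few percent of the initial 1 m
```

The same check applies to each of the three axes, since the gain matrix is block-diagonal across x, y and z.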
Step S3, risk identification:

Using the spatial geometric relationship, compute the intersection of the velocity vector with the plane of the camera, estimate the time to intersection, calculate the drop caused by gravity over that interval, and predict the position at which the object will cross the camera plane.
An example programmed in MATLAB is as follows:

function [result] = get_meetpoint(planevec, planepoint, linevec, linepoint)
% Intersect the line p(t) = linepoint + t*linevec with the plane
% through planepoint whose normal is planevec.
vp1 = planevec(1); vp2 = planevec(2); vp3 = planevec(3);
n1 = planepoint(1); n2 = planepoint(2); n3 = planepoint(3);
v1 = linevec(1); v2 = linevec(2); v3 = linevec(3);
m1 = linepoint(1); m2 = linepoint(2); m3 = linepoint(3);
vpt = v1*vp1 + v2*vp2 + v3*vp3;   % dot(linevec, planevec)
if (vpt == 0)
    result = [];                  % line parallel to the plane: no intersection
else
    t = ((n1-m1)*vp1 + (n2-m2)*vp2 + (n3-m3)*vp3) / vpt;
    result = [m1+v1*t, m2+v2*t, m3+v3*t, t];
end
end
The first three elements of the function's output are the three-dimensional coordinates of the intersection point; the fourth is the time of flight.
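For readers without MATLAB, a Python transliteration of the routine above (same algorithm; the function name is kept for clarity) behaves as follows:

```python
# Python equivalent of the MATLAB get_meetpoint routine: intersect the line
# p(t) = linepoint + t * linevec with the plane through planepoint whose
# normal is planevec. Returns (x, y, z, t), or None when the velocity vector
# is parallel to the camera plane.

def get_meetpoint(planevec, planepoint, linevec, linepoint):
    vpt = sum(v * n for v, n in zip(linevec, planevec))  # dot(linevec, planevec)
    if vpt == 0:
        return None  # no intersection: object never crosses the plane
    t = sum((n - m) * p for n, m, p in zip(planepoint, linepoint, planevec)) / vpt
    x, y, z = (m + v * t for m, v in zip(linepoint, linevec))
    return (x, y, z, t)

# Example: an object at (0, 0, 10) flying at (0, 0, -5) toward the camera
# plane z = 0 (normal (0, 0, 1) through the origin)
print(get_meetpoint((0, 0, 1), (0, 0, 0), (0, 0, -5), (0, 0, 10)))
# (0.0, 0.0, 0.0, 2.0) -- crosses at the optical center after 2 seconds
```

A negative returned t means the object is moving away from the plane, which can be treated as no threat.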
The foregoing is only a preferred embodiment of the present invention, but the scope of the present invention is not limited thereto. Any person skilled in the art may make equivalent substitutions or modifications within the technical scope disclosed herein according to the technical scheme and inventive concept of the present invention, and such substitutions or modifications shall fall within the protection scope of the present invention.
Claims (2)
1. A method for a line inspection unmanned aerial vehicle to identify an external intrusion, characterized by comprising the following steps:
S1, coordinate calculation: establishing the coordinates of the intruding object in a camera coordinate system from the UAV camera's identification of the object; the position of the intruding object is taken as a point P, and the UAV camera coordinate system coincides with the world coordinate system, so that the coordinates of P in space are (X, Y, Z), Z being the perpendicular distance from P to the camera optical center; the projection of P onto the image plane is the point p, with pixel coordinates (x, y), Z being the depth and f the focal length of the camera;
the projection relations are:
$$x = f\frac{X}{Z} \tag{1}$$
$$y = f\frac{Y}{Z} \tag{2}$$
the formulas above assume the origin lies at the center of the image, which is offset from the image's pixel coordinate system; letting the pixel coordinates of the optical center on the image be $(c_x, c_y)$, the projection relations are corrected to:
$$x = f\frac{X}{Z} + c_x \tag{3}$$
$$y = f\frac{Y}{Z} + c_y \tag{4}$$
the focal length f and the offsets $c_x$, $c_y$ are intrinsic parameters of the camera, and x and y are the coordinates of the intruding object on the image;
the coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
$$X = \frac{(x - c_x)Z}{f}, \qquad Y = \frac{(y - c_y)Z}{f} \tag{5}$$
the coordinates are more convenient to use after conversion from the world coordinate system to the camera coordinate system, where R is a 3×3 rotation matrix and T is a 3×1 displacement vector:
$$s\mathbf{x} = K[R\mathbf{X} + T] \tag{6}$$
where s is a scale factor, $\mathbf{x}$ is the homogeneous pixel coordinate vector, and K is the camera intrinsic parameter matrix;
S2, state prediction: establishing a state observer to predict the motion state of the intruding object;
the motion of the intruding object is decomposed into the x, y and z directions, which are mutually independent; viewed from the camera, x is the left-right direction, y the up-down direction, and z the front-back direction;
x direction:
$$\dot{p}_x = v_x \tag{7}$$
$$\dot{v}_x = a_x \approx 0 \tag{8}$$
where $v_x$ is the velocity and $a_x$ the acceleration in the left-right direction;
y direction:
$$\dot{p}_y = v_y \tag{9}$$
$$\dot{v}_y = a_y = -g \tag{10}$$
where $v_y$ is the velocity and $a_y$ the acceleration in the up-down direction, and g is the gravitational acceleration;
z direction:
$$\dot{p}_z = v_z \tag{11}$$
$$\dot{v}_z = a_z \approx 0 \tag{12}$$
where $v_z$ is the velocity and $a_z$ the acceleration in the front-back direction;
setting:
$$\mathbf{x} = [p_x\ \ v_x\ \ p_y\ \ v_y\ \ p_z\ \ v_z]^{\mathrm T} \tag{13}$$
then:
$$\dot{\mathbf{x}} = [v_x\ \ 0\ \ v_y\ \ {-g}\ \ v_z\ \ 0]^{\mathrm T} \tag{14}$$
in matrix form:
$$\dot{\mathbf{x}} = \begin{bmatrix} 0&1&0&0&0&0\\ 0&0&0&0&0&0\\ 0&0&0&1&0&0\\ 0&0&0&0&0&0\\ 0&0&0&0&0&1\\ 0&0&0&0&0&0 \end{bmatrix}\mathbf{x} + \begin{bmatrix} 0\\0\\0\\1\\0\\0 \end{bmatrix}u, \qquad u = -g \tag{15}$$
$$\mathbf{y} = \begin{bmatrix} 1&0&0&0&0&0\\ 0&0&1&0&0&0\\ 0&0&0&0&1&0 \end{bmatrix}\mathbf{x} \tag{16}$$
equations (15) and (16) are abbreviated as:
$$\dot{\mathbf{x}} = A\mathbf{x} + Bu \tag{17}$$
$$\mathbf{y} = C\mathbf{x} \tag{18}$$
equations (17) and (18) are the state space model of the intruding object; knowing the state at one moment, the state at the next moment is predicted, and state feedback is added:
$$\dot{\hat{\mathbf{x}}} = A\hat{\mathbf{x}} + Bu + L(\mathbf{y} - C\hat{\mathbf{x}}) \tag{19}$$
in formula (19), L is the observer gain; rearranged:
$$\dot{\hat{\mathbf{x}}} = (A - LC)\hat{\mathbf{x}} + Bu + L\mathbf{y} \tag{20}$$
wherein A, B and C are the matrix parameters of (15) and (16) and u is the control input;
S3, risk identification: calculating, from the predicted state of the intruding object, whether it will strike the UAV.
2. The method for a line inspection unmanned aerial vehicle to identify an external intrusion according to claim 1, characterized in that the intersection of the velocity vector with the plane of the camera is obtained using the spatial geometric relationship, the time to intersection is estimated, the drop caused by gravity over that interval is calculated, and the position at which the object crosses the camera plane is predicted.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011455467.7A CN112507885B (en) | 2020-12-10 | 2020-12-10 | Method for identifying intrusion by line inspection unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011455467.7A CN112507885B (en) | 2020-12-10 | 2020-12-10 | Method for identifying intrusion by line inspection unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112507885A CN112507885A (en) | 2021-03-16 |
CN112507885B (en) | 2023-07-21
Family
ID=74973442
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011455467.7A Active CN112507885B (en) | 2020-12-10 | 2020-12-10 | Method for identifying intrusion by line inspection unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112507885B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107314771A (en) * | 2017-07-04 | 2017-11-03 | 合肥工业大学 | Unmanned plane positioning and attitude angle measuring method based on coded target |
CN108829130A (en) * | 2018-06-11 | 2018-11-16 | 重庆大学 | A kind of unmanned plane patrol flight control system and method |
CN109540126A (en) * | 2018-12-03 | 2019-03-29 | 哈尔滨工业大学 | A kind of inertia visual combination air navigation aid based on optical flow method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ATE523864T1 (en) * | 2007-09-20 | 2011-09-15 | Delphi Tech Inc | OBJECT TRACKING METHOD |
- 2020-12-10: CN application CN202011455467.7A filed; patent CN112507885B active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107314771A (en) * | 2017-07-04 | 2017-11-03 | 合肥工业大学 | Unmanned plane positioning and attitude angle measuring method based on coded target |
CN108829130A (en) * | 2018-06-11 | 2018-11-16 | 重庆大学 | A kind of unmanned plane patrol flight control system and method |
CN109540126A (en) * | 2018-12-03 | 2019-03-29 | 哈尔滨工业大学 | A kind of inertia visual combination air navigation aid based on optical flow method |
Non-Patent Citations (1)
Title |
---|
UAV collision-avoidance control method based on situation prediction; Mao Houchen, Song Min, Gao Wenming, Gan Xusheng; Fire Control & Command Control (No. 11); full text *
Also Published As
Publication number | Publication date |
---|---|
CN112507885A (en) | 2021-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113485453B (en) | Method and device for generating inspection flight path of marine unmanned aerial vehicle and unmanned aerial vehicle | |
CN106125730B (en) | A kind of robot navigation's map constructing method based on mouse cerebral hippocampal spatial cell | |
CN109872372A (en) | A kind of small-sized quadruped robot overall Vision localization method and system | |
CN108453738A (en) | A kind of quadrotor based on Opencv image procossings independently captures the control method of operation in the air | |
CN108885791A (en) | ground detection method, related device and computer readable storage medium | |
CN106500669A (en) | A kind of Aerial Images antidote based on four rotor IMU parameters | |
US20220306157A1 (en) | Vehicle-mounted camera gimbal servo system and control method | |
CN109848996B (en) | Large-scale three-dimensional environment map creation method based on graph optimization theory | |
CN109946564B (en) | Distribution network overhead line inspection data acquisition method and inspection system | |
CN112097769A (en) | Homing pigeon brain-hippocampus-imitated unmanned aerial vehicle simultaneous positioning and mapping navigation system and method | |
CN103743394A (en) | Light-stream-based obstacle avoiding method of mobile robot | |
CN113593035A (en) | Motion control decision generation method and device, electronic equipment and storage medium | |
CN109673529A (en) | Police dog gesture recognition data vest and gesture recognition method based on multisensor | |
CN112507885B (en) | Method for identifying intrusion by line inspection unmanned aerial vehicle | |
CN114527294A (en) | Target speed measuring method based on single camera | |
CN112509054A (en) | Dynamic calibration method for external parameters of camera | |
CN117274566B (en) | Real-time weeding method based on deep learning and inter-plant weed distribution conditions | |
CN113961013A (en) | Unmanned aerial vehicle path planning method based on RGB-D SLAM | |
CN102175227B (en) | Quick positioning method for probe car in satellite image | |
CN117058209A (en) | Method for calculating depth information of visual image of aerocar based on three-dimensional map | |
CN116171962B (en) | Efficient targeted spray regulation and control method and system for plant protection unmanned aerial vehicle | |
Xin et al. | Visual navigation for mobile robot with Kinect camera in dynamic environment | |
CN112947570B (en) | Unmanned aerial vehicle obstacle avoidance method and device and storage medium | |
CN106840137A (en) | A kind of four-point development machine is automatically positioned orientation method | |
CN113705115B (en) | Ground unmanned vehicle chassis motion and target striking cooperative control method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||