CN112507885A - Method for identifying intrusion of inspection unmanned aerial vehicle - Google Patents
Info
- Publication number
- CN112507885A (application CN202011455467.7A)
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- camera
- state
- intruding object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Astronomy & Astrophysics (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a method for a line patrol unmanned aerial vehicle to identify external intrusion, comprising the following steps: S1, coordinate calculation: from the camera's identification of an intruding object, establish the object's coordinates in the camera coordinate system; S2, state prediction: build a state observer to predict the motion state of the intruding object; S3, risk identification: from the predicted state, calculate whether the intruding object risks hitting the unmanned aerial vehicle. The method assists the line patrol unmanned aerial vehicle in recognizing whether man-made intrusions such as thrown stones or slingshot projectiles threaten it; if so, the unmanned aerial vehicle can take evasive action in advance.
Description
Technical Field
The invention belongs to the technical field of power transmission and distribution, and particularly relates to a method for identifying external intrusion of a line patrol unmanned aerial vehicle.
Background
Line patrol by unmanned aerial vehicle has greatly reduced the workload of grassroots teams and improved patrol quality. In some rural areas, however, children may, out of curiosity or mischief, try to bring the unmanned aerial vehicle down with stones, slingshots or similar objects; because the vehicle is expensive, the resulting economic loss can be substantial.
Existing unmanned aerial vehicles mainly provide active obstacle avoidance, which handles obstacles in the flight path well, but passive defense against thrown objects is rarely applied.
Disclosure of Invention
To avoid the threat of man-made external intrusion, the invention provides a method for a line patrol unmanned aerial vehicle to identify external intrusion. The method comprises three steps: computing the coordinates of the intruding object; predicting the object's state with a state observer built on those coordinates; and calculating the risk of the object hitting the unmanned aerial vehicle, thereby identifying whether man-made intrusions such as stones or slingshot projectiles threaten the vehicle.
In order to achieve the purpose, the invention adopts the following technical scheme:
A method for a line patrol unmanned aerial vehicle to identify external intrusion comprises the following steps:
S1, coordinate calculation: establishing the coordinates of the intruding object in the camera coordinate system according to the identification of the intruding object by the unmanned aerial vehicle camera;
S2, state prediction: establishing a state observer to predict the motion state of the intruding object;
S3, risk identification: calculating, from the predicted state of the intruding object, whether it risks hitting the unmanned aerial vehicle.
In step S1, let the position of the intruding object be the point P, and take the unmanned aerial vehicle camera coordinate system to coincide with the world coordinate system, so that the spatial coordinates of P are (X, Y, Z), where Z is the perpendicular distance from P to the optical center of the camera; the intersection of the ray through P with the image plane is the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera;
the projection relationship is:
x = f*X/Z, y = f*Y/Z
the above formulas assume the origin at the center of the image, which is offset from the pixel coordinate system of the image; with the optical center projecting to the pixel coordinates (cx, cy), the projection relationship is corrected to:
x = f*X/Z + cx, y = f*Y/Z + cy
the focal length f and the offsets cx, cy are camera intrinsic parameters, and x, y are the coordinates of the intruding object on the image; the coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
X = (x - cx)*Z/f, Y = (y - cy)*Z/f, Z = Z
for convenience the coordinates are transformed from the world coordinate system into the camera coordinate system, where R is a 3x3 rotation matrix and T is a 3x1 translation vector:
sx = K[RX + T] (6)
where sx denotes the pixel coordinates x scaled by a depth-dependent factor s, and K is the matrix of camera intrinsic parameters.
S2, the specific process of state prediction is as follows:
the motion of the intruding object is decomposed into independent x, y and z components; viewed from the camera, x is the left-right direction, y the up-down direction and z the front-back direction;
the x direction:
dpx/dt = vx, dvx/dt = ax
where vx is the left-right velocity and ax the left-right acceleration;
the y direction:
dpy/dt = vy, dvy/dt = ay - g
where vy is the up-down velocity, ay the up-down acceleration and g the gravitational acceleration;
the z direction:
dpz/dt = vz, dvz/dt = az
where vz is the front-back velocity and az the front-back acceleration;
setting the state vector x = [px, vx, py, vy, pz, vz]^T, the input u = [ax, ay - g, az]^T and the output y = [px, py, pz]^T, the three axes combine into one linear system; in matrix form:
dx/dt = A x + B u (15)
y = C x (16)
with
A = [0 1 0 0 0 0; 0 0 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 0 0; 0 0 0 0 0 1; 0 0 0 0 0 0],
B = [0 0 0; 1 0 0; 0 0 0; 0 1 0; 0 0 0; 0 0 1],
C = [1 0 0 0 0 0; 0 0 1 0 0 0; 0 0 0 0 1 0];
(15)(16) can be abbreviated as:
dx/dt = Ax + Bu (17)
y = Cx (18)
equations (17) and (18) form the state-space model of the intruding object: knowing the state at one moment, the state at the next can be predicted; adding state feedback gives the observer:
dx_hat/dt = A*x_hat + B*u + L*(y - C*x_hat) (19)
in (19), L is the observer gain and x_hat the state estimate; A, B and C are the system matrices and u is the control input.
The beneficial effect of the invention is that it assists the line patrol unmanned aerial vehicle in identifying whether man-made intrusions such as stones or slingshot projectiles threaten it; if so, the unmanned aerial vehicle can take evasive action in advance.
Drawings
FIG. 1 is a flow chart of the present invention.
Fig. 2 is a schematic diagram of the projection of three-dimensional coordinates onto a two-dimensional plane in the present invention.
Fig. 3 is a schematic diagram of coordinate transformation in the present invention.
Fig. 4 is a schematic diagram of a state observer according to the present invention.
Detailed Description
The technical solutions of the invention are described clearly and completely with reference to the following embodiments; it should be understood that the described embodiments are only some, not all, of the embodiments of the invention.
A method for a line patrol unmanned aerial vehicle to identify external intrusion is completed through coordinate calculation, state prediction and risk identification. Referring to FIG. 1, the method comprises the following steps:
S1, coordinate calculation: establishing the coordinates of the intruding object in the camera coordinate system according to the identification of the intruding object by the unmanned aerial vehicle camera.
S2, state prediction: establishing a state observer to predict the motion state of the intruding object.
S3, risk identification: calculating, from the predicted state of the intruding object, whether it risks hitting the unmanned aerial vehicle.
S1, the specific process of coordinate calculation is as follows:
Referring to fig. 2, let the position of the intruding object be the point P, and take the unmanned aerial vehicle camera coordinate system to coincide with the world coordinate system, so that the spatial coordinates of P are (X, Y, Z), where Z is the perpendicular distance from P to the optical center of the camera. Let the intersection of the ray through P with the image plane be the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera.
The projection relationship is:
x = f*X/Z, y = f*Y/Z
The above formulas assume the origin at the center of the image, which is offset from the pixel coordinate system of the image. With the optical center projecting to the pixel coordinates (cx, cy), the projection relationship is corrected to:
x = f*X/Z + cx, y = f*Y/Z + cy
The focal length f and the offsets cx, cy are camera intrinsic parameters, and x, y are the coordinates of the intruding object on the image.
The coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
X = (x - cx)*Z/f, Y = (y - cy)*Z/f, Z = Z
As shown in fig. 3, for more convenient use the coordinates are transformed from the world coordinate system into the camera coordinate system, where R is a 3x3 rotation matrix and T is a 3x1 translation vector:
sx = K[RX + T] (6)
where sx denotes the pixel coordinates x scaled by a depth-dependent factor s, and K is the matrix of camera intrinsic parameters.
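The back-projection just described can be illustrated numerically. The following Python/NumPy sketch (the patent gives no code for this step; the intrinsic values f = 800 px and optical center (320, 240) are illustrative assumptions) back-projects a pixel observation plus depth into camera-frame coordinates and applies the forward mapping sx = K[RX + T]:

```python
import numpy as np

def pixel_to_camera(x, y, Z, f, cx, cy):
    """Back-project a pixel (x, y) with known depth Z into 3D camera-frame
    coordinates, inverting x = f*X/Z + cx and y = f*Y/Z + cy."""
    X = (x - cx) * Z / f
    Y = (y - cy) * Z / f
    return np.array([X, Y, Z])

def world_to_pixel(P_world, K, R, T):
    """Forward mapping s*x = K(R X + T): world point -> pixel coordinates."""
    p = K @ (R @ P_world + T)   # homogeneous pixel coordinates, scaled by s
    return p[:2] / p[2]         # divide out the depth-dependent scale s

# Illustrative (assumed) intrinsics: focal length 800 px, optical center (320, 240)
f, cx, cy = 800.0, 320.0, 240.0
K = np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])

P = pixel_to_camera(400.0, 300.0, 10.0, f, cx, cy)   # pixel + depth -> 3D point
# With R = I and T = 0 the forward projection must return the original pixel
x_back = world_to_pixel(P, K, np.eye(3), np.zeros(3))
```

With these assumed intrinsics, a pixel at (400, 300) seen at depth 10 m back-projects to a camera-frame point whose re-projection lands on the same pixel, confirming the two mappings are mutually inverse.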
S2, the specific process of state prediction is as follows:
The motion of the intruding object is decomposed into independent x, y and z components; viewed from the camera, x is the left-right direction, y the up-down direction and z the front-back direction.
The x direction:
dpx/dt = vx, dvx/dt = ax
where vx is the left-right velocity and ax the left-right acceleration.
The y direction:
dpy/dt = vy, dvy/dt = ay - g
where vy is the up-down velocity, ay the up-down acceleration and g the gravitational acceleration.
The z direction:
dpz/dt = vz, dvz/dt = az
where vz is the front-back velocity and az the front-back acceleration.
Setting the state vector x = [px, vx, py, vy, pz, vz]^T, the input u = [ax, ay - g, az]^T and the output y = [px, py, pz]^T, the three axes combine into one linear system. In matrix form:
dx/dt = A x + B u (15)
y = C x (16)
with
A = [0 1 0 0 0 0; 0 0 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 0 0; 0 0 0 0 0 1; 0 0 0 0 0 0],
B = [0 0 0; 1 0 0; 0 0 0; 0 1 0; 0 0 0; 0 0 1],
C = [1 0 0 0 0 0; 0 0 1 0 0 0; 0 0 0 0 1 0].
(15)(16) can be abbreviated as:
dx/dt = Ax + Bu (17)
y = Cx (18)
Equations (17) and (18) form the state-space model of the intruding object: knowing the state at one moment, the state at the next can be predicted; adding state feedback gives the observer shown in fig. 4:
dx_hat/dt = A*x_hat + B*u + L*(y - C*x_hat) (19)
In (19), L is the observer gain and x_hat the state estimate; A, B and C are the system matrices and u is the control input.
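The six-state model can be written out numerically. A Python/NumPy sketch follows; the state ordering [px, vx, py, vy, pz, vz] and the simple Euler prediction step are assumptions for illustration, since the patent specifies no implementation:

```python
import numpy as np

# One double-integrator block per axis: dp/dt = v, dv/dt = a
Ablk = np.array([[0.0, 1.0], [0.0, 0.0]])
Bblk = np.array([[0.0], [1.0]])

# Stack the three independent axes into dx/dt = A x + B u, y = C x
A = np.kron(np.eye(3), Ablk)            # 6x6 system matrix
B = np.kron(np.eye(3), Bblk)            # 6x3 input matrix for u = (ax, ay - g, az)
C = np.kron(np.eye(3), [[1.0, 0.0]])    # 3x6: only positions are measured

def predict(x, u, dt):
    """One Euler step of the state-space model: predict the next state."""
    return x + dt * (A @ x + B @ u)
```

For example, a state with unit velocity along x and zero input advances its x position by v*dt per step, as expected of a double integrator.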
If the observer error is required to converge within 100 ms, the characteristic roots of A-LC can be set to [-10, -9, -10, -9, -10, -9]; computing the gain in MATLAB then yields:
L=[19 90 0 0 0 0;0 0 19 90 0 0;0 0 0 0 19 90] (21)
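The gain in (21) can be checked numerically: with the double-integrator model of the previous step, the eigenvalues of A-LC should land at the chosen roots, -10 and -9 on each axis. A Python/NumPy sketch (note that the patent prints the 6x3 observer gain L transposed, as a 3x6 array):

```python
import numpy as np

A = np.kron(np.eye(3), [[0.0, 1.0], [0.0, 0.0]])   # 6x6 double-integrator model
C = np.kron(np.eye(3), [[1.0, 0.0]])               # 3x6 position measurements

# Gain from equation (21); the patent lists it row-wise as a 3x6 array,
# i.e. the transpose of the 6x3 observer gain L.
L = np.array([[19, 90, 0, 0, 0, 0],
              [0, 0, 19, 90, 0, 0],
              [0, 0, 0, 0, 19, 90]], dtype=float).T

eig = np.linalg.eigvals(A - L @ C)   # poles of the observer error dynamics
```

Each 2x2 axis block of A-LC has characteristic polynomial s^2 + 19s + 90 = (s + 9)(s + 10), so the poles land exactly at the requested locations.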
S3, the specific process of risk identification is as follows:
Using the spatial geometric relationship, find the intersection of the velocity vector with the plane containing the camera, estimate the time to intersection, calculate the drop caused by gravity over that time, and predict the position at which the object meets the camera plane.
An example of programming with MATLAB is as follows:
function [result] = get_meetpoint(planevec, planepoint, linevec, linepoint)
% Intersection of the line through linepoint with direction linevec
% and the plane through planepoint with normal planevec.
vp1 = planevec(1);
vp2 = planevec(2);
vp3 = planevec(3);
n1 = planepoint(1);
n2 = planepoint(2);
n3 = planepoint(3);
v1 = linevec(1);
v2 = linevec(2);
v3 = linevec(3);
m1 = linepoint(1);
m2 = linepoint(2);
m3 = linepoint(3);
vpt = v1*vp1 + v2*vp2 + v3*vp3;   % velocity component along the plane normal
if (vpt == 0)
    result = [];                  % line parallel to the plane: no intersection
else
    t = ((n1-m1)*vp1 + (n2-m2)*vp2 + (n3-m3)*vp3) / vpt;
    result = [m1+v1*t, m2+v2*t, m3+v3*t, t];
end
end
The first three entries of the function output are the three-dimensional coordinates of the intersection point; the fourth entry is the time of flight to it.
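For readers without MATLAB, an equivalent Python sketch of the intersection routine is given below, together with the gravity drop over the flight time used in the risk-identification step; the function names and the 0.5*g*t^2 drop formula are illustrative choices, not taken from the patent text:

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def get_meetpoint(planevec, planepoint, linevec, linepoint):
    """Intersection of a line (point + direction) with a plane
    (normal + point). Returns (x, y, z, t) or None if parallel."""
    planevec, planepoint = np.asarray(planevec, float), np.asarray(planepoint, float)
    linevec, linepoint = np.asarray(linevec, float), np.asarray(linepoint, float)
    vpt = linevec @ planevec                 # velocity component along the normal
    if vpt == 0:
        return None                          # line parallel to plane: no intersection
    t = (planepoint - linepoint) @ planevec / vpt
    return (*(linepoint + linevec * t), t)

def gravity_drop(t):
    """Vertical drop of a ballistic object over flight time t (illustrative)."""
    return 0.5 * G * t * t

# Object at (0, 0, 20) flying at (0, 0, -10) toward the camera plane z = 0
hit = get_meetpoint([0, 0, 1], [0, 0, 0], [0, 0, -10], [0, 0, 20])
```

Here the object meets the camera plane at the origin after 2 s of flight, over which a free-falling projectile would additionally drop about 19.6 m.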
The above description covers only preferred embodiments of the invention, and the scope of the invention is not limited thereto; any equivalent substitution or modification of the technical solution and inventive concept that a person skilled in the art could make within the scope disclosed shall fall within the scope of protection of the invention.
Claims (4)
1. A method for identifying external intrusion of a line patrol unmanned aerial vehicle, characterized by comprising the following steps:
S1, coordinate calculation: establishing the coordinates of the intruding object in the camera coordinate system according to the identification of the intruding object by the unmanned aerial vehicle camera;
S2, state prediction: establishing a state observer to predict the motion state of the intruding object;
S3, risk identification: calculating, from the predicted state of the intruding object, whether it risks hitting the unmanned aerial vehicle.
2. The method for identifying external intrusion of a line patrol unmanned aerial vehicle according to claim 1, characterized in that:
in step S1, the position of the intruding object is taken as the point P, and the unmanned aerial vehicle camera coordinate system coincides with the world coordinate system, so that the spatial coordinates of P are (X, Y, Z), where Z is the perpendicular distance from P to the optical center of the camera; the intersection of the ray through P with the image plane is the point p, with pixel coordinates (x, y); Z is the depth and f is the focal length of the camera;
the projection relationship is:
x = f*X/Z, y = f*Y/Z
the above formulas assume the origin at the center of the image, which is offset from the pixel coordinate system of the image; with the optical center projecting to the pixel coordinates (cx, cy), the projection relationship is corrected to:
x = f*X/Z + cx, y = f*Y/Z + cy
the focal length f and the offsets cx, cy are camera intrinsic parameters, and x, y are the coordinates of the intruding object on the image;
the coordinates (X, Y, Z) of the intruding object in three-dimensional space are:
X = (x - cx)*Z/f, Y = (y - cy)*Z/f, Z = Z
for convenience the coordinates are transformed from the world coordinate system into the camera coordinate system, where R is a 3x3 rotation matrix and T is a 3x1 translation vector:
sx = K[RX + T] (6)
where sx denotes the pixel coordinates x scaled by a depth-dependent factor s, and K is the matrix of camera intrinsic parameters.
3. The method for identifying external intrusion of a line patrol unmanned aerial vehicle according to claim 1, wherein in step S2 the specific process of state prediction is as follows:
the motion of the intruding object is decomposed into independent x, y and z components; viewed from the camera, x is the left-right direction, y the up-down direction and z the front-back direction;
the x direction:
dpx/dt = vx, dvx/dt = ax
where vx is the left-right velocity and ax the left-right acceleration;
the y direction:
dpy/dt = vy, dvy/dt = ay - g
where vy is the up-down velocity, ay the up-down acceleration and g the gravitational acceleration;
the z direction:
dpz/dt = vz, dvz/dt = az
where vz is the front-back velocity and az the front-back acceleration;
setting the state vector x = [px, vx, py, vy, pz, vz]^T, the input u = [ax, ay - g, az]^T and the output y = [px, py, pz]^T, the three axes combine into one linear system; in matrix form:
dx/dt = A x + B u (15)
y = C x (16)
with
A = [0 1 0 0 0 0; 0 0 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 0 0; 0 0 0 0 0 1; 0 0 0 0 0 0],
B = [0 0 0; 1 0 0; 0 0 0; 0 1 0; 0 0 0; 0 0 1],
C = [1 0 0 0 0 0; 0 0 1 0 0 0; 0 0 0 0 1 0];
(15)(16) can be abbreviated as:
dx/dt = Ax + Bu (17)
y = Cx (18)
equations (17) and (18) form the state-space model of the intruding object: knowing the state at one moment, the state at the next can be predicted; adding state feedback gives the observer:
dx_hat/dt = A*x_hat + B*u + L*(y - C*x_hat) (19)
in (19), L is the observer gain and x_hat the state estimate; A, B and C are the system matrices and u is the control input.
4. The method for identifying external intrusion of a line patrol unmanned aerial vehicle according to claim 1, characterized in that the spatial geometric relationship is used to find the intersection of the velocity vector with the plane containing the camera, the time to intersection is estimated, the drop caused by gravity over that time is calculated, and the position at which the object meets the camera plane is predicted.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011455467.7A CN112507885B (en) | 2020-12-10 | 2020-12-10 | Method for identifying intrusion by line inspection unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112507885A true CN112507885A (en) | 2021-03-16 |
CN112507885B CN112507885B (en) | 2023-07-21 |
Family
ID=74973442
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011455467.7A Active CN112507885B (en) | 2020-12-10 | 2020-12-10 | Method for identifying intrusion by line inspection unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112507885B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090080701A1 (en) * | 2007-09-20 | 2009-03-26 | Mirko Meuter | Method for object tracking |
CN107314771A (en) * | 2017-07-04 | 2017-11-03 | 合肥工业大学 | Unmanned plane positioning and attitude angle measuring method based on coded target |
CN108829130A (en) * | 2018-06-11 | 2018-11-16 | 重庆大学 | A kind of unmanned plane patrol flight control system and method |
CN109540126A (en) * | 2018-12-03 | 2019-03-29 | 哈尔滨工业大学 | A kind of inertia visual combination air navigation aid based on optical flow method |
Non-Patent Citations (1)
Title |
---|
毛厚晨; 宋敏; 高文明; 甘旭升: "UAV collision avoidance control method based on situation prediction" (基于态势预测的无人机防相撞控制方法), Fire Control & Command Control (火力与指挥控制), no. 11 * |
Also Published As
Publication number | Publication date |
---|---|
CN112507885B (en) | 2023-07-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |