CN111741263A - Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle - Google Patents


Info

Publication number: CN111741263A
Authority: CN (China)
Prior art keywords: transformer substation, unmanned aerial vehicle, inspection
Legal status: Granted
Application number: CN202010561811.4A
Other languages: Chinese (zh)
Other versions: CN111741263B (en)
Inventors: 张永挺, 谢幸生, 王干军, 吴啟民, 冯灿成, 江玉欢, 李福鹏, 丁宗宝, 张勇志, 林永昌, 朱翚, 邱桂洪, 梁简
Current Assignee: Guangdong Power Grid Co Ltd; Zhongshan Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee: Guangdong Power Grid Co Ltd; Zhongshan Power Supply Bureau of Guangdong Power Grid Co Ltd
Application filed by Guangdong Power Grid Co Ltd and Zhongshan Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202010561811.4A
Publication of CN111741263A
Application granted
Publication of CN111741263B
Legal status: Active

Classifications

    • H04N 7/181: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • G06V 20/40: Scenes; scene-specific elements in video content
    • G06V 20/46: Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G07C 1/20: Checking timed patrols, e.g. of watchman


Abstract

The invention provides a multi-view situation perception navigation method for a transformer substation inspection unmanned aerial vehicle, comprising the following steps: collecting the video streams shot by the monitoring cameras and the inspection unmanned aerial vehicle in the transformer substation; decoding the video streams to obtain multi-source synchronized multi-frame pictures of different areas in the transformer substation, and storing them under a specified storage path; establishing a transformer substation space coordinate system; reading the synchronized multi-frame pictures and performing object recognition analysis on them with an improved SSD algorithm to obtain a classification result for the power equipment in the transformer substation; adopting a multi-source SLAM algorithm to perform situation perception on the real-time environment of the transformer substation and obtain the running state of the inspection unmanned aerial vehicle in the transformer substation; and inputting the classification result of the power equipment and the running state of the inspection unmanned aerial vehicle into the dispatching-system back end and matching them against the feature data of the actual transformer substation environment there; when they are consistent, the multi-view situation perception navigation of the inspection unmanned aerial vehicle is complete.

Description

Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle navigation, in particular to a multi-view situation perception navigation method for a transformer substation inspection unmanned aerial vehicle.
Background
At present, unmanned aerial vehicle (UAV) inspection has found preliminary application in power-system distribution-network inspection, and such intelligent inspection means have greatly promoted innovation in the traditional operation and maintenance mode of the power system. Using UAVs for inspection inside a transformer substation can further improve the degree of unattended operation and automation of the substation.
A transformer substation contains a large amount of power equipment, such as power transformers, voltage and current transformers and high-voltage circuit breakers, and substation extension and reconstruction projects make the wiring inside the substation complicated and the equipment spacing dense, so there is a risk that a UAV collides with live equipment in the substation. Moreover, compared with a transmission line, the actual substation environment may present completely different inspection objects at different times, and the targets faced by UAV substation inspection are more complicated and varied. As a result, the existing navigation and positioning mode in which the UAV flies along a fixed route is not applicable, and the accuracy of UAV navigation and positioning is low. In addition, substation inspection technology also suffers from a low utilization rate of substation monitoring video and an immature UAV substation environment situation perception technology. Because the volume of substation monitoring video data is large and effective partition and classification management is lacking, checking relies mainly on manual work, so existing substation video management has the problem of low checking efficiency, which in turn reduces the utilization rate of substation monitoring video.
Disclosure of Invention
The invention provides a multi-view situation perception navigation method for a transformer substation inspection unmanned aerial vehicle, aiming to overcome the defects of the prior art: the low accuracy of navigation and positioning of the substation inspection unmanned aerial vehicle and the low efficiency of monitoring-video management and checking.
In order to solve the technical problems, the technical scheme of the invention is as follows:
a multi-view situation perception navigation method for a transformer substation inspection unmanned aerial vehicle comprises the following steps:
s1: collecting a video stream shot by a monitoring camera and an inspection unmanned aerial vehicle in a transformer substation;
s2: decoding the video stream to obtain multi-source synchronous multi-frame pictures in different areas in the transformer substation, and storing the multi-source synchronous multi-frame pictures in a specified storage path;
s3: establishing a transformer substation space coordinate system;
s4: reading synchronous multi-frame pictures, and performing object identification analysis on the synchronous multi-frame pictures by adopting an improved SSD (Single Shot Multi Box Detector) algorithm to identify and obtain a classification result of power equipment in the transformer substation;
s5: adopting a multi-source SLAM (Simultaneous Localization and Mapping, instant positioning and map construction) algorithm and carrying out situation perception on the real-time environment of the transformer substation to obtain the running state of the inspection unmanned aerial vehicle in the transformer substation;
s6: inputting the recognized classification result of the power equipment in the transformer substation and the running state of the inspection unmanned aerial vehicle in the transformer substation into a dispatching system background, matching the classification result with the actual environment characteristic data of the transformer substation in the dispatching system background, and finishing multi-view situation perception navigation of the inspection unmanned aerial vehicle if the input data is consistent with the actual environment characteristic data of the transformer substation in a matching manner; and if the input data is not matched with the actual environment characteristic data of the transformer substation, skipping to execute the step S5.
In this technical scheme, a multi-source SLAM algorithm and an improved SSD algorithm are adopted to intelligently analyze the video streams shot by the monitoring cameras and the inspection unmanned aerial vehicle in the transformer substation, solving the problems of multi-target recognition, map reconstruction and navigation prediction for the unmanned aerial vehicle in the substation, realizing situation perception of the inspection unmanned aerial vehicle in the substation environment, and improving the automation and intelligence level of unmanned aerial vehicle navigation during substation inspection.
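As a rough illustration, the overall flow above can be seen as a classify-then-perceive loop: classify the equipment once (S4), then repeat situation perception (S5) until the back-end match (S6) succeeds. The following Python sketch is illustrative only; every callable (`classify`, `perceive`, `match`) is a hypothetical stand-in for the algorithms described in the claims, not the patented implementation.

```python
def navigate(frames, env_features, classify, perceive, match, max_iters=100):
    """Illustrative S4-S6 loop: classify once, then repeat situation
    perception until the dispatching-system back end reports a match.
    All callables are hypothetical stand-ins for the claimed algorithms."""
    classes = classify(frames)                     # S4: improved-SSD equipment classification
    for _ in range(max_iters):
        state = perceive(frames)                   # S5: multi-source SLAM situation perception
        if match(classes, state, env_features):    # S6: back-end feature matching
            return classes, state                  # multi-view navigation complete
    raise RuntimeError("situation perception did not converge to a match")
```

The retry edge from S6 back to S5 is what the `for` loop models: only the perception step is repeated, while the classification result is computed once and reused.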
Preferably, the specific steps of the step S1 are as follows:
s1.1: acquiring a video stream shot by a monitoring camera in the transformer substation according to a polling date, dividing a transformer substation area into a plurality of areas from a certain azimuth angle according to the position distribution of objects shot by the monitoring camera in a clockwise or anticlockwise approximately equal rectangular manner, numbering the cameras in each area in sequence, and establishing corresponding documents to store monitoring videos of different areas and different monitoring cameras;
s1.2: and acquiring the video stream shot by the inspection unmanned aerial vehicle, classifying the video stream shot by the inspection unmanned aerial vehicle according to the plurality of areas divided in the step S1.1, and storing the video stream in a corresponding document.
Preferably, the specific steps of the step S2 are as follows:
s2.1: acquiring a video stream which is classified and managed, and reading the transformer substation inspection video information frame by frame for the video stream;
s2.2: and writing the read substation inspection video information into the pictures frame by frame according to the time sequence of the video stream to obtain multi-source synchronous multi-frame pictures of different regions of the substation, and storing the multi-source synchronous multi-frame pictures in a specified storage path.
Preferably, in the step S3, the specific step of establishing the transformer substation space coordinate system includes: any position in a transformer substation area is selected as a coordinate origin O, a certain azimuth angle direction is taken as the positive direction of an x axis, a y axis and a z axis are determined according to a right-hand rule, and a transformer substation space coordinate system is established.
Preferably, in the step S4, the specific steps are as follows:
s4.1: reading a multi-view inspection decoding picture of the unmanned aerial vehicle corresponding to the video stream shot by the inspection unmanned aerial vehicle in the synchronous multi-frame pictures;
s4.2: generating a target window of the power equipment in the transformer substation in the unmanned aerial vehicle multi-view inspection decoding picture;
s4.3: preliminarily extracting features in the unmanned aerial vehicle multi-view inspection decoded picture by adopting a convolutional network;
s4.4: carrying out multi-layer feature extraction on the features extracted in the step S4.3 by adopting an inverted pyramid type convolution network to obtain a feature map;
s4.5: calculating a category score and a loss function of the target window in each feature map;
s4.6: generating a primary identification result of the power equipment in the transformer substation by adopting a non-maximum value inhibition step;
s4.7: abstracting a clustering model from the primary identification result of the power equipment in the transformer substation, setting standard parameters of different transformer substation equipment as reference points, and calculating a local density function gamma of the transformer substation equipmentiFunction of distancei(ii) a Wherein the local density function gamma of the substation equipmentiFunction of distanceiThe following formula is satisfied:
γi=∑λ(dij-dc)
Figure BDA0002546506330000031
wherein d isijDenotes polyDistance between data point i and data point j in class model, dcRepresents a truncation distance; lambda is a coefficient, and when the abscissa of the data point i is a negative number, the value of lambda is 1, and when the abscissa of the data point i is a positive number, the value of lambda is 0;
s4.8: as a function of local densityiFunction of distanceiRespectively serving as horizontal and vertical coordinates, determining a clustering center of the power equipment in the transformer substation in the clustering model, and identifying the category of the power equipment in the transformer substation;
s4.9: performing class assignment on the remaining data points according to a density descending order, and determining a classification result of the power equipment in the transformer substation;
s4.10: comparing the classification result of the power equipment in the transformer substation with the classification result of the actual power transformation equipment, and if the classification result of the power equipment in the transformer substation is consistent with the classification result of the actual power transformation equipment, outputting the classification result of the power equipment in the transformer substation; and if the two are not consistent, skipping to execute the step S4.2.
Preferably, the step S4 further includes the following step: establishing a safe-distance cube centered on the centroid of any power equipment in the transformer substation, wherein the safe-distance cube is used for modeling and calculating the three-dimensional space environment of the transformer substation; the coordinates (X_n, Y_n, Z_n) of the 8 vertices of the safe-distance cube satisfy the following formulas:

(X_n, Y_n, Z_n) = (X_oi ± R_i, Y_oi ± R_i, Z_oi ± R_i)

R_i = R_di + L_i/2

where n denotes the serial number of the 8 vertices of the safe-distance cube, i.e. n = 1, 2, 3, ..., 8; (X_oi, Y_oi, Z_oi) are the centroid coordinates of the power equipment; i denotes the type number of the power equipment in the transformer substation; R_i is the safe radius about the centroid, R_di is the safety distance of the power equipment in the transformer substation, and L_i is the average apparent size of the power equipment in the transformer substation.
Preferably, in the step S5, the specific steps are as follows:
s5.1: reading a monitoring camera decoding picture corresponding to a video stream shot by a monitoring camera in a synchronous multi-frame picture;
s5.2: determining observation cameras of the inspection unmanned aerial vehicle at different flight positions, and calibrating internal and external parameters of the observation cameras;
s5.3: establishing a transformer substation inspection multi-view camera coordinate model;
s5.4: correcting the transformer substation inspection multi-view camera coordinate model by using the interference matrix E, and determining the real-time position of the inspection unmanned aerial vehicle;
s5.5: and introducing the step S4 to identify the obtained classification result of the power equipment in the transformer substation and the real-time position of the inspection unmanned aerial vehicle, and determining the motion state of the inspection unmanned aerial vehicle.
Preferably, in the step S5.3, the specific steps of establishing the substation inspection multi-view camera coordinate model are as follows: define the pixel coordinates of the left camera as (u_l, v_l) and its projection matrix as M_l; define the pixel coordinates of the right camera as (u_r, v_r) and its projection matrix as M_r; then the following model is established between the coordinates (X, Y, Z) of an observed point in the transformer substation and the pixel coordinates of the observation cameras:

Z_l [u_l, v_l, 1]^T = M_l [X, Y, Z, 1]^T

Z_r [u_r, v_r, 1]^T = M_r [X, Y, Z, 1]^T

where Z_l and Z_r respectively denote the depths of the observed point in the transformer substation along the optical-axis directions of the left and right cameras; eliminating Z_l and Z_r and combining the two projection equations yields an over-determined linear system in the coordinates P = (X, Y, Z)^T of the observed point; writing C for the resulting 4 × 3 coefficient matrix and D for the resulting 4 × 1 right-hand-side vector, the system can be simplified to C · P = D;

according to this formula, the coordinates P of the observed point in the transformer substation can be solved in the least-squares sense; the coordinate P of the observed point is the theoretical real-time position of the inspection unmanned aerial vehicle in the transformer substation, and satisfies the following formula:

P = (C^T C)^{-1} C^T D.
preferably, in step S5.4, the interference matrix E is a space vector of 3 × 1 dimension; the interference matrix E is used for correcting the substation inspection multi-view camera coordinate model, and then the calculation formula for determining the real-time position of the inspection unmanned aerial vehicle is determined as follows:
P=(CTC)-1CTD+E。
preferably, in the step S5.5, the specific steps include: the transformer substation patrol inspection decoding pictures are stored and arranged according to a frame sequence, wherein the time intervals among the transformer substation patrol inspection decoding pictures of each frame are the same, the linear velocity and the angular velocity of the patrol inspection unmanned aerial vehicle in the transformer substation are obtained according to the pose change of the patrol inspection unmanned aerial vehicle among the transformer substation patrol inspection decoding pictures of different frames relative to the same physical space reference point, and the calculation formula is as follows:
vn,k=Gnvk+vv,k
Figure BDA0002546506330000053
wherein v isn,kFor inspecting the linear velocity observation vector, omega, of the unmanned aerial vehiclen,kObserving vectors for the angular velocity of the inspection unmanned aerial vehicle; gnA discrete cosine transformation matrix, H, representing the transformation from the camera coordinate system to the substation coordinate systemnRepresenting a rotation rate transformation matrix from a camera coordinate system to a transformer substation coordinate system; the delta T is the time difference of shooting of the two selected frames of pictures, namely the time interval of the change of the motion state of the unmanned aerial vehicle during inspection; v. ofv,kAnd vω,kIs gaussian white noise.
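The velocity observation can be sketched as a finite difference of the pose change over ΔT, transformed into the substation frame. Assumptions are flagged in the comments: the angular-velocity observation is taken to mirror the linear one, and noise is optional rather than sampled.

```python
import numpy as np

def velocity_observation(p0, p1, theta0, theta1, dT, Gn, Hn,
                         noise_v=None, noise_w=None):
    """Sketch of the velocity observation: finite-difference the pose
    change between two frames over dT, then map from the camera frame
    to the substation frame with Gn (linear) and Hn (angular).
    Assumption: the angular formula mirrors the linear one."""
    v_k = (np.asarray(p1, float) - np.asarray(p0, float)) / dT      # linear velocity
    w_k = (np.asarray(theta1, float) - np.asarray(theta0, float)) / dT  # angular velocity
    v_obs = Gn @ v_k + (0 if noise_v is None else np.asarray(noise_v))
    w_obs = Hn @ w_k + (0 if noise_w is None else np.asarray(noise_w))
    return v_obs, w_obs
```

With identity transformation matrices and zero noise, the observation reduces to the plain finite-difference velocities, which makes the transformation step easy to check in isolation.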
Compared with the prior art, the technical scheme of the invention has the following beneficial effects: decoding the collected video streams and storing them under a specified storage path can improve the utilization rate of the monitoring video and makes retrieval convenient for review and manual checking; adopting the multi-source SLAM algorithm and the improved SSD algorithm realizes object recognition, map reconstruction, navigation prediction and other steps, realizes real-time situation perception and map reconstruction for the inspection unmanned aerial vehicle in the transformer substation, and can effectively solve the problem of low navigation and positioning accuracy of the unmanned aerial vehicle in the substation.
Drawings
Fig. 1 is a flow chart of the multi-view situation awareness navigation method of the substation inspection unmanned aerial vehicle.
Fig. 2 is a flowchart of object recognition analysis performed on a synchronous multi-frame picture according to the present invention.
Fig. 3 is a flowchart of situational awareness for a real-time environment of a substation according to the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
This embodiment provides a multi-view situation awareness navigation method for a transformer substation inspection unmanned aerial vehicle; fig. 1 is a flowchart of the method.
The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle comprises the following steps:
step 1: collecting a video stream shot by a monitoring camera and an inspection unmanned aerial vehicle in a transformer substation; the method comprises the following specific steps:
step 1.1: acquiring a video stream shot by a monitoring camera in the transformer substation according to a polling date, dividing a transformer substation area into a plurality of areas from a certain azimuth angle according to the position distribution of objects shot by the monitoring camera in a clockwise or anticlockwise approximately equal rectangular manner, numbering the cameras in each area in sequence, and establishing corresponding documents to store monitoring videos of different areas and different monitoring cameras;
step 1.2: and acquiring the video stream shot by the inspection unmanned aerial vehicle, classifying the video stream shot by the inspection unmanned aerial vehicle according to the plurality of areas divided in the step 1.1, and storing the video stream in corresponding documents.
In this embodiment, the video streams shot by the monitoring cameras in the transformer substation can be acquired from the master control center of the substation, the higher-level dispatching department and the like; the substation area is divided into regions A, B, C, D, E and F, proceeding clockwise in approximately equal rectangles from the northwest corner of the substation according to the position distribution of the objects shot by the monitoring cameras in the substation; the cameras in each region are numbered sequentially 1, 2, 3 and so on, and corresponding documents are established so that the monitoring videos of different regions and different monitoring cameras are stored in the corresponding documents, realizing unified management of the monitoring videos and providing a foundation for building the situation awareness model of the unmanned aerial vehicle.
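The per-region, per-camera storage scheme described above can be sketched as a directory tree. The folder names and camera count below are illustrative assumptions; the embodiment only requires one document per region and camera.

```python
from pathlib import Path

REGIONS = list("ABCDEF")  # six regions, clockwise from the northwest corner

def make_video_archive(root, cams_per_region=3):
    """Create one folder per region and per camera number (hypothetical
    layout; folder and camera names are assumptions, not from the patent)."""
    created = []
    for region in REGIONS:
        for cam in range(1, cams_per_region + 1):
            p = Path(root) / region / f"cam_{cam:02d}"
            p.mkdir(parents=True, exist_ok=True)
            created.append(p)
    return created
```

Keeping one folder per (region, camera) pair makes it trivial to route both the fixed-camera streams (step 1.1) and the UAV streams (step 1.2) to the same region-keyed layout.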
Step 2: decoding the video stream to obtain multi-source synchronous multi-frame pictures in different areas in the transformer substation, and storing the multi-source synchronous multi-frame pictures in a specified storage path; the method comprises the following specific steps:
step 2.1: acquiring a video stream which is classified and managed, and reading the transformer substation inspection video information frame by frame for the video stream;
step 2.2: and writing the read substation inspection video information into the pictures frame by frame according to the time sequence of the video stream to obtain multi-source synchronous multi-frame pictures of different regions of the substation, and storing the multi-source synchronous multi-frame pictures in a specified storage path.
In a specific implementation process, MATLAB can be used to decode the video stream, for example with its VideoReader, hasFrame and readFrame functions, to obtain multi-source synchronized multi-frame pictures of the different substation regions. The VideoReader function obtains the video stream from the storage path and creates a structure for storing the video, recording the substation inspection video information of each frame, including the inspection video file name, file path, frame height, frame width, duration, frame rate, video type, total frame count and the like. The readFrame function then reads the information of each frame of the inspection video pictures, with the hasFrame function used for frame discrimination. The read inspection video information is written into pictures frame by frame in time order and stored under the specified storage path, yielding multi-source synchronized multi-frame pictures of the different substation regions; specifically, the decoded pictures are divided, according to the source of the video stream, into substation monitoring-camera decoded pictures and unmanned aerial vehicle multi-view inspection decoded pictures.
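An equivalent of the MATLAB VideoReader/hasFrame/readFrame loop can be sketched in Python with OpenCV. The file-naming scheme below (source tag, region, camera, frame index, timestamp) is a hypothetical convention chosen so that frames from different sources with the same timestamp can be paired later; it is not specified by the patent.

```python
from pathlib import Path

def frame_path(out_dir, source, region, cam, frame_idx, fps):
    """Hypothetical naming scheme embedding source, region, camera,
    frame index and timestamp for later multi-source synchronization."""
    t = frame_idx / fps
    name = f"{source}_{region}{cam:02d}_f{frame_idx:06d}_t{t:.3f}s.png"
    return Path(out_dir) / region / name

def decode_video(video_file, out_dir, source, region, cam):
    """Read a stream frame by frame and write each frame as a picture
    (OpenCV counterpart of MATLAB's VideoReader/hasFrame/readFrame)."""
    import cv2  # imported lazily so frame_path stays usable without OpenCV
    cap = cv2.VideoCapture(str(video_file))
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0   # fall back if fps is unreported
    written, idx = [], 0
    while True:
        ok, frame = cap.read()                # hasFrame + readFrame in one call
        if not ok:
            break
        p = frame_path(out_dir, source, region, cam, idx, fps)
        p.parent.mkdir(parents=True, exist_ok=True)
        cv2.imwrite(str(p), frame)
        written.append(p)
        idx += 1
    cap.release()
    return written
```

For example, `frame_path("out", "uav", "A", 1, 50, 25.0)` names the frame taken 2 s into the stream of camera 1 in region A.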
Step 3: establishing a transformer substation space coordinate system.
In this step, the specific steps of establishing the transformer substation space coordinate system include: any position in a transformer substation area is selected as a coordinate origin O, a certain azimuth angle direction is taken as the positive direction of an x axis, a y axis and a z axis are determined according to a right-hand rule, and a transformer substation space coordinate system is established.
In the embodiment, a substation control room is selected as a coordinate origin O in a substation area, the south-facing direction of the central axis is taken as the positive direction of the x axis, the y axis and the z axis are determined according to the right-hand rule, and a substation space coordinate system is established.
Step 4: reading the synchronized multi-frame pictures, performing object identification analysis on them with the improved SSD algorithm, and identifying the classification result of the power equipment in the transformer substation; fig. 2 is a flowchart of the object recognition analysis performed on the synchronized multi-frame pictures in this embodiment, with the following specific steps:
step 4.1: reading a multi-view inspection decoding picture of the unmanned aerial vehicle corresponding to the video stream shot by the inspection unmanned aerial vehicle in the synchronous multi-frame pictures;
step 4.2: generating a target window of the power equipment in the transformer substation in the unmanned aerial vehicle multi-view inspection decoding picture;
step 4.3: preliminarily extracting features in the unmanned aerial vehicle multi-view inspection decoded picture by adopting a convolutional network;
step 4.4: carrying out multi-layer feature extraction on the features extracted in the step 4.3 by adopting an inverted pyramid type convolution network to obtain a feature map;
step 4.5: calculating a category score and a loss function of the target window in each feature map;
step 4.6: generating a primary identification result of the power equipment in the transformer substation by adopting a non-maximum value inhibition step;
step 4.7: abstracting a clustering model from the primary identification result of the power equipment in the transformer substation, setting standard parameters of different transformer substation equipment as reference points, and calculating a local density function gamma of the transformer substation equipmentiFunction of distancei(ii) a Wherein the local density function gamma of the substation equipmentiFunction of distanceiThe following formula is satisfied:
γi=∑λ(dij-dc)
Figure BDA0002546506330000081
wherein d isijRepresents the distance between data point i and data point j in the clustering model, dcRepresents a truncation distance; lambda is a coefficient, and when the abscissa of the data point i is a negative number, the value of lambda is 1, and when the abscissa of the data point i is a positive number, the value of lambda is 0;
step 4.8: as a function of local densityiFunction of distanceiRespectively serving as horizontal and vertical coordinates, determining a clustering center of the power equipment in the transformer substation in the clustering model, and identifying the category of the power equipment in the transformer substation;
step 4.9: performing class assignment on the remaining data points according to a density descending order, and determining a classification result of the power equipment in the transformer substation;
step 4.10: comparing the classification result of the power equipment in the transformer substation with the classification result of the actual power transformation equipment, and if the classification result of the power equipment in the transformer substation is consistent with the classification result of the actual power transformation equipment, outputting the classification result of the power equipment in the transformer substation; and if the two are not consistent, skipping to execute the step 4.2.
In this embodiment, steps 4.2 to 4.6 use the existing SSD algorithm: the unmanned aerial vehicle multi-view inspection decoded picture is analyzed and calculated with VGG-16-Atrous as the base network, and a detection mode based on a feature pyramid (pyramidal feature hierarchy) is added to obtain the primary identification result of the power equipment in the substation. Considering that the initial parameters of the existing SSD algorithm are mainly set manually from experience and the result is generated through non-maximum suppression, the values of the initial parameters have a certain influence on feature extraction and the final recognition result. The improvement of the SSD algorithm in this embodiment is to combine a clustering model based on density peaks and to add a feedback discrimination step, namely steps 4.7 to 4.10, after the primary identification result of the power equipment in the substation is generated by the non-maximum suppression step in step 4.6.
The truncation distance dc in step 4.7 takes the value 5%. The local density function γi denotes the number of data points whose distance to the reference data point is less than the truncation distance dc; the distance function δi denotes the minimum distance from the reference point to any point whose local density is greater than that of the reference point. If the reference point itself has the maximum local density, δi instead takes the maximum distance from all other data points to the reference point.
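The density-peak quantities of steps 4.7 and 4.8 can be sketched as follows; this is a generic illustration in which the cutoff `dc` is passed as an absolute distance rather than derived as the 5% percentile used in the embodiment, and the sample points are assumed:

```python
import numpy as np

def density_peak_stats(points, dc):
    """Local density gamma_i (count of points closer than dc) and distance delta_i
    (min distance to a denser point; max distance for the densest point)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    gamma = (d < dc).sum(axis=1) - 1          # subtract the self-distance of 0
    delta = np.empty(len(points))
    for i in range(len(points)):
        denser = np.where(gamma > gamma[i])[0]
        delta[i] = d[i, denser].min() if denser.size else d[i].max()
    return gamma, delta

# two small clusters; the left one is slightly denser
pts = np.array([[0, 0], [0, 0.1], [0.1, 0], [0.05, 0.05],
                [5, 5], [5, 5.1], [5.1, 5]])
gamma, delta = density_peak_stats(pts, dc=0.5)
```

Points combining high γ and high δ sit in the upper-right of the (γ, δ) decision graph and are taken as cluster centers; the remaining points are then assigned in order of decreasing density, as in step 4.9.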
Step 4 in this embodiment further includes the following step: establishing a safe distance cube centered on the centroid of any power equipment in the transformer substation, the safe distance cube being used for modeling and calculating the three-dimensional space environment of the transformer substation; the coordinates (Xn, Yn, Zn) of the 8 vertices of the safe distance cube satisfy the following formula:
(Xn,Yn,Zn)=(Xoi±Ri,Yoi±Ri,Zoi±Ri)
[Equation image omitted: defines the safe radius Ri of the centroid in terms of the safety distance Rdi and the average apparent size Li of the power equipment.]
wherein n denotes the serial number of the 8 vertices of the safe distance cube, i.e., n = 1, 2, …, 8; (Xoi, Yoi, Zoi) are the centroid coordinates of the power equipment; i denotes the type number of the power equipment in the transformer substation; Ri is the safe radius of the centroid, Rdi is the safety distance of the power equipment in the substation, and Li is the average apparent size of the power equipment in the substation. From this formula the vertex coordinates of each safe distance cube can be calculated; the cube region is a forbidden zone that the flight route of the substation inspection unmanned aerial vehicle must avoid, so the actual operable range of the inspection unmanned aerial vehicle in the substation can be determined, enabling further perception of the substation environment.
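The vertex formula above can be sketched as follows; `safe_radius` stands for Ri and is assumed to be precomputed from the safety distance and the average apparent size (the exact combining formula is in the omitted equation image), and the centroid value is illustrative:

```python
from itertools import product

def cube_vertices(centroid, safe_radius):
    """8 vertices (Xo ± R, Yo ± R, Zo ± R) of the safe-distance cube."""
    x, y, z = centroid
    return [(x + sx * safe_radius, y + sy * safe_radius, z + sz * safe_radius)
            for sx, sy, sz in product((1, -1), repeat=3)]  # all 8 sign patterns

verts = cube_vertices((10.0, 20.0, 5.0), 2.5)
```

A flight planner would then reject any waypoint falling inside a cube, which is what makes these regions forbidden zones for the inspection route.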
Step 5: adopting a multi-source SLAM algorithm to perform situation awareness on the real-time environment of the transformer substation and obtain the running state of the inspection unmanned aerial vehicle in the transformer substation; as shown in fig. 3, the flowchart of situation awareness of the real-time environment of the substation in this embodiment comprises the following steps:
step 5.1: reading a monitoring camera decoding picture corresponding to a video stream shot by a monitoring camera in a synchronous multi-frame picture;
step 5.2: determining observation cameras of the inspection unmanned aerial vehicle at different flight positions, and calibrating internal and external parameters of the observation cameras;
step 5.3: establishing a transformer substation inspection multi-view camera coordinate model;
step 5.4: correcting the transformer substation inspection multi-view camera coordinate model by using the interference matrix E, and determining the real-time position of the inspection unmanned aerial vehicle;
step 5.5: introducing the classification result of the power equipment in the transformer substation obtained by identification and the real-time position of the inspection unmanned aerial vehicle, and determining the motion state of the inspection unmanned aerial vehicle.
In step 5.3, the specific steps of establishing the substation inspection camera coordinate model are as follows: the pixel coordinates of the left camera are defined as (ul, vl) and its projection matrix as Ml; the pixel coordinates of the right camera are defined as (ur, vr) and its projection matrix as Mr. The relationship between the coordinates (X, Y, Z) of the observed point in the substation and the pixel coordinates of the observation cameras is then modeled as follows:
$$Z_l \begin{bmatrix} u_l \\ v_l \\ 1 \end{bmatrix} = M_l \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, \qquad Z_r \begin{bmatrix} u_r \\ v_r \\ 1 \end{bmatrix} = M_r \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$
wherein Zl and Zr denote the depths of the observed point in the substation along the optical axes of the left and right cameras respectively. Writing Ml = (m^l_ij) and Mr = (m^r_ij) and eliminating Zl and Zr from the two projection equations yields four linear equations in the coordinates of the observed point. Denoting the 4 × 3 coefficient matrix of this system by C, the 4 × 1 right-hand side vector by D, and the coordinate vector by P = (X, Y, Z)^T, the system can be written compactly as C·P = D.
Solving this over-determined system in the least-squares sense gives the coordinate P of the observed point, which is the theoretical real-time position of the inspection unmanned aerial vehicle in the transformer substation:

P = (C^T C)^{-1} C^T D.
In step 5.4, considering that the inspection unmanned aerial vehicle may be interfered with by electromagnetic signals emitted by power equipment in the transformer substation, an interference matrix E is added to correct the theoretical real-time position formula. The interference matrix E is a 3 × 1 space vector calculated from historical data of the deviation between the actual position of the substation inspection unmanned aerial vehicle and its theoretically calculated position. After E corrects the substation inspection multi-view camera coordinate model, the calculation formula determining the real-time position of the inspection unmanned aerial vehicle becomes:

P = (C^T C)^{-1} C^T D + E.
In step 5.5, reconstruction of the transformer substation map and estimation of the motion state of the unmanned aerial vehicle are achieved mainly by a time-domain feature point tracking method, combined with the classification result of the power equipment in the substation and the ranges defined by the safe distance cubes. Specifically, the substation inspection decoded pictures are stored and arranged in frame order, with equal time intervals between successive frames; the linear velocity and angular velocity of the inspection unmanned aerial vehicle in the substation are obtained from its pose change, relative to the same physical space reference point, between decoded pictures of different frames, according to the following formulas:
v_{n,k} = G_n v_k + v_{v,k}

[Equation image omitted: the corresponding angular-velocity observation ω_{n,k}, expressed through the rotation-rate transformation matrix H_n, the inter-frame rotation over ΔT, and the noise term v_{ω,k}.]
wherein v_{n,k} is the linear velocity observation vector of the inspection unmanned aerial vehicle and ω_{n,k} its angular velocity observation vector; G_n denotes the direction cosine transformation matrix from the camera coordinate system to the substation coordinate system, and H_n the rotation-rate transformation matrix from the camera coordinate system to the substation coordinate system; the internal parameters of G_n and H_n are obtained in the multi-view camera calibration step, i.e., step 5.2. ΔT is the time difference between the shooting of the two selected frames, i.e., the time interval over which the motion state of the inspection unmanned aerial vehicle changes; v_{v,k} and v_{ω,k} are Gaussian white noise.
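A minimal sketch of the linear-velocity observation v_{n,k} = G_n v_k, assuming G_n is known from calibration (identity here for illustration) and omitting the noise term v_{v,k}; positions and ΔT are assumed example values:

```python
import numpy as np

def linear_velocity(p_prev, p_curr, dT, Gn=np.eye(3)):
    """Finite-difference camera-frame velocity, rotated into substation coordinates."""
    v_k = (np.asarray(p_curr) - np.asarray(p_prev)) / dT  # velocity between frames
    return Gn @ v_k

v = linear_velocity([0.0, 0.0, 1.0], [0.5, 1.0, 1.0], dT=0.25)
```

The angular-velocity observation is analogous, with H_n applied to the inter-frame rotation change divided by ΔT.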
The camera of the inspection unmanned aerial vehicle in this embodiment is a multi-view camera relying mainly on the binocular stereo imaging principle. On the basis of the numbering of the monitoring cameras in each substation performed in step 1, a pair of cameras near the current position of the inspection unmanned aerial vehicle is selected for synchronized binocular analysis, and the internal and external parameters of the cameras are calibrated at the same time. Considering that the spatial scale of the transformer substation is far larger than the size of the inspection unmanned aerial vehicle, step 5.3 abstracts the unmanned aerial vehicle as a particle for motion analysis, converting its situation-awareness problem into the positioning and control of a spatial observation point: a model relating the coordinates (X, Y, Z) of the observed point in the substation to the pixel coordinates of the observation cameras is established, the recognized classification result of the power equipment in the substation and the real-time position of the inspection unmanned aerial vehicle are introduced, and the motion state of the inspection unmanned aerial vehicle is determined.
Step 6: inputting the recognized classification result of the power equipment in the transformer substation and the running state of the inspection unmanned aerial vehicle into the background of the dispatching system of the transformer substation, and matching them against the actual environment characteristic data of the transformer substation held there; if the input data match the actual environment characteristic data, the multi-view situation awareness navigation of the inspection unmanned aerial vehicle is completed; if they do not match, execution jumps back to step 5.
In the multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle, the collected video streams are classified and sorted by recording time and target position, which improves the utilization rate of the monitoring videos and facilitates their retrieval for review and manual inspection. The method uses a multi-source SLAM algorithm and an improved SSD algorithm to realize object identification, map reconstruction, navigation prediction and related steps, achieves real-time situation awareness and map reconstruction for the inspection unmanned aerial vehicle in the transformer substation, and solves the problem of low navigation and positioning accuracy of unmanned aerial vehicles in substations.
The same or similar reference numerals correspond to the same or similar parts;
the terms describing positional relationships in the drawings are for illustrative purposes only and are not to be construed as limiting the patent;
it should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit the embodiments of the present invention. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. And are neither required nor exhaustive of all embodiments. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. A multi-view situation perception navigation method for a transformer substation inspection unmanned aerial vehicle is characterized by comprising the following steps:
s1: collecting a video stream shot by a monitoring camera and an inspection unmanned aerial vehicle in a transformer substation;
s2: decoding the video stream to obtain multi-source synchronous multi-frame pictures in different areas in the transformer substation, and storing the multi-source synchronous multi-frame pictures in a specified storage path;
s3: establishing a transformer substation space coordinate system;
s4: reading the synchronous multi-frame picture, carrying out object identification analysis on the synchronous multi-frame picture by adopting an improved SSD algorithm, and identifying to obtain a classification result of the power equipment in the transformer substation;
s5: the method comprises the steps that a multisource SLAM algorithm is adopted, situation perception is conducted on the real-time environment of the transformer substation, and the running state of the inspection unmanned aerial vehicle in the transformer substation is obtained;
s6: inputting the classification result of the power equipment in the transformer substation obtained by identification and the running state of the inspection unmanned aerial vehicle in the transformer substation into a dispatching system background, matching the classification result with the actual environment characteristic data of the transformer substation in the dispatching system background, and finishing multi-view situation perception navigation of the inspection unmanned aerial vehicle if the input data is consistent with the actual environment characteristic data of the transformer substation in a matching way; and if the input data is not matched with the actual environment characteristic data of the transformer substation, skipping to execute the step S5.
2. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 1, characterized in that: the specific steps of the step S1 are as follows:
s1.1: acquiring a video stream shot by a monitoring camera in the transformer substation according to a polling date, dividing a transformer substation area into a plurality of areas from a certain azimuth angle according to the position distribution of objects shot by the monitoring camera in a clockwise or anticlockwise approximately equal rectangular manner, numbering the cameras in each area in sequence, and establishing corresponding documents to store monitoring videos of different areas and different monitoring cameras;
s1.2: and acquiring a video stream shot by the inspection unmanned aerial vehicle, classifying the video stream shot by the inspection unmanned aerial vehicle according to the plurality of areas divided in the step S1.1, and storing the video stream in the corresponding document.
3. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 1, characterized in that: the specific steps of the step S2 are as follows:
s2.1: acquiring a video stream which is subjected to classification management, and reading transformer substation inspection video information frame by frame for the video stream;
s2.2: and writing the read substation inspection video information into the pictures frame by frame according to the time sequence of the video stream to obtain multi-source synchronous multi-frame pictures of different regions of the substation, and storing the multi-source synchronous multi-frame pictures in a specified storage path.
4. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 1, characterized in that: in the step S3, the specific step of establishing the transformer substation spatial coordinate system includes: any position in the transformer substation area is selected as the coordinate origin O, a certain azimuth direction is taken as the positive direction of the x axis, the y axis and z axis are determined according to the right-hand rule, and the transformer substation space coordinate system is established.
5. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 1, characterized in that: in the step S4, the specific steps are as follows:
s4.1: reading a multi-view inspection decoding picture of the unmanned aerial vehicle corresponding to the video stream shot by the inspection unmanned aerial vehicle in the synchronous multi-frame picture;
s4.2: generating a target window of power equipment in the transformer substation in the unmanned aerial vehicle multi-view inspection decoding picture;
s4.3: preliminarily extracting features in the unmanned aerial vehicle multi-view inspection decoded picture by adopting a convolutional network;
s4.4: carrying out multi-layer feature extraction on the features extracted in the step S4.3 by adopting an inverted pyramid type convolution network to obtain a feature map;
s4.5: calculating a class score and a loss function of a target window in each feature map;
s4.6: generating a primary identification result of the power equipment in the transformer substation by adopting a non-maximum value inhibition step;
s4.7: abstracting a clustering model from the preliminary identification result of the power equipment in the transformer substation, setting standard parameters of different substation equipment as reference points, and calculating the local density function γi and the distance function δi of the substation equipment; wherein the local density function γi and the distance function δi satisfy the following formulas:

γi = ∑j λ(dij − dc)

δi = min{ dij : γj > γi } (for the point of maximum density, δi takes the maximum of dij)

wherein dij represents the distance between data point i and data point j in the clustering model, and dc represents the truncation distance; λ is an indicator function whose value is 1 when its argument is negative and 0 otherwise;
s4.8: taking the local density function γi and the distance function δi as the horizontal and vertical coordinates respectively, determining the clustering center of the power equipment in the substation in the clustering model, and identifying the category of the power equipment in the substation;
s4.9: performing class assignment on the remaining data points according to a density descending order, and determining a classification result of the power equipment in the transformer substation;
s4.10: comparing the classification result of the power equipment in the transformer substation with the classification result of the actual power transformation equipment, and if the classification result of the power equipment in the transformer substation is consistent with the classification result of the actual power transformation equipment, outputting the classification result of the power equipment in the transformer substation; and if the two are not consistent, skipping to execute the step S4.2.
6. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 5, characterized in that: the step S4 further includes the following step: establishing a safe distance cube centered on the centroid of any power equipment in the transformer substation, the safe distance cube being used for modeling and calculating the three-dimensional space environment of the transformer substation; wherein the coordinates (Xn, Yn, Zn) of the 8 vertices of the safe distance cube satisfy the following formula:
(Xn,Yn,Zn)=(Xoi±Ri,Yoi±Ri,Zoi±Ri)
[Equation image omitted: defines the safe radius Ri of the centroid in terms of the safety distance Rdi and the average apparent size Li of the power equipment.]
wherein n denotes the serial number of the 8 vertices of the safe distance cube, i.e., n = 1, 2, …, 8; (Xoi, Yoi, Zoi) are the centroid coordinates of the power equipment; i represents the type number of the power equipment in the transformer substation; Ri is the safe radius of the centroid, Rdi is the safety distance of the power equipment in the substation, and Li is the average apparent size of the power equipment in the substation.
7. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 1, characterized in that: in the step S5, the specific steps are as follows:
s5.1: reading a monitoring camera decoding picture corresponding to a video stream shot by a monitoring camera in the synchronous multi-frame picture;
s5.2: determining observation cameras of the inspection unmanned aerial vehicle at different flight positions, and calibrating internal and external parameters of the observation cameras;
s5.3: establishing a transformer substation inspection multi-view camera coordinate model;
s5.4: correcting the substation inspection multi-view camera coordinate model by using the interference matrix E, and determining the real-time position of the inspection unmanned aerial vehicle;
s5.5: and (5) introducing the classification result of the power equipment in the transformer substation obtained by the identification in the step S4 and the real-time position of the inspection unmanned aerial vehicle, and determining the motion state of the inspection unmanned aerial vehicle.
8. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 7, characterized in that: in the step S5.3, the specific steps of establishing the substation inspection camera coordinate model are as follows: the pixel coordinates of the left camera are defined as (ul, vl) and its projection matrix as Ml; the pixel coordinates of the right camera are defined as (ur, vr) and its projection matrix as Mr; then the following model is established for the relationship between the coordinates (X, Y, Z) of the observed point in the substation and the pixel coordinates of the observation cameras:
$$Z_l \begin{bmatrix} u_l \\ v_l \\ 1 \end{bmatrix} = M_l \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, \qquad Z_r \begin{bmatrix} u_r \\ v_r \\ 1 \end{bmatrix} = M_r \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$
wherein Zl and Zr denote the depths of the observed point in the substation along the optical axes of the left and right cameras respectively. Writing Ml = (m^l_ij) and Mr = (m^r_ij) and eliminating Zl and Zr from the two projection equations yields four linear equations in the coordinates of the observed point. Denoting the 4 × 3 coefficient matrix of this system by C, the 4 × 1 right-hand side vector by D, and the coordinate vector by P = (X, Y, Z)^T, the system can be written compactly as C·P = D.
Solving this over-determined system in the least-squares sense gives the coordinate P of the observed point, which is the theoretical real-time position of the inspection unmanned aerial vehicle in the transformer substation:

P = (C^T C)^{-1} C^T D.
9. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 8, characterized in that: in the step S5.4, the interference matrix E is a 3 × 1 space vector; after the interference matrix E corrects the substation inspection multi-view camera coordinate model, the calculation formula determining the real-time position of the inspection unmanned aerial vehicle is:

P = (C^T C)^{-1} C^T D + E.
10. The multi-view situation awareness navigation method for the substation inspection unmanned aerial vehicle according to claim 7, characterized in that: the step S5.5 specifically includes: storing and arranging the substation inspection decoded pictures in frame order, with equal time intervals between the decoded pictures of successive frames, and solving the linear velocity and angular velocity of the inspection unmanned aerial vehicle in the substation from its pose change, relative to the same physical space reference point, between decoded pictures of different frames, according to the following formulas:

v_{n,k} = G_n v_k + v_{v,k}

[Equation image omitted: the corresponding angular-velocity observation ω_{n,k}, expressed through the rotation-rate transformation matrix H_n, the inter-frame rotation over ΔT, and the noise term v_{ω,k}.]

wherein v_{n,k} is the linear velocity observation vector of the inspection unmanned aerial vehicle and ω_{n,k} its angular velocity observation vector; G_n denotes the direction cosine transformation matrix from the camera coordinate system to the substation coordinate system, and H_n the rotation-rate transformation matrix from the camera coordinate system to the substation coordinate system; ΔT is the time difference between the shooting of the two selected frames, i.e., the time interval over which the motion state of the inspection unmanned aerial vehicle changes; v_{v,k} and v_{ω,k} are Gaussian white noise.
CN202010561811.4A 2020-06-18 2020-06-18 Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle Active CN111741263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010561811.4A CN111741263B (en) 2020-06-18 2020-06-18 Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010561811.4A CN111741263B (en) 2020-06-18 2020-06-18 Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN111741263A true CN111741263A (en) 2020-10-02
CN111741263B CN111741263B (en) 2021-08-31

Family

ID=72649820

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010561811.4A Active CN111741263B (en) 2020-06-18 2020-06-18 Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN111741263B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112729312A (en) * 2020-12-25 2021-04-30 云南电网有限责任公司昆明供电局 Unmanned aerial vehicle inspection method for high-voltage chamber of transformer substation
CN113506345A (en) * 2021-07-28 2021-10-15 上海纽盾科技股份有限公司 Unmanned aerial vehicle security situation sensing method and system for equipment inspection
CN113592350A (en) * 2021-08-12 2021-11-02 浙江创意声光电科技有限公司 Situation awareness system and method
CN114202819A (en) * 2021-11-29 2022-03-18 山东恒创智控科技有限公司 Robot-based substation inspection method and system and computer
CN115909183A (en) * 2022-09-16 2023-04-04 北京燃气平谷有限公司 Monitoring system and monitoring method for external environment of gas delivery
CN116233219A (en) * 2022-11-04 2023-06-06 国电湖北电力有限公司鄂坪水电厂 Inspection method and device based on personnel positioning algorithm
CN117906615A (en) * 2024-03-15 2024-04-19 苏州艾吉威机器人有限公司 Fusion positioning method and system of intelligent carrying equipment based on environment identification code
CN117906615B (en) * 2024-03-15 2024-06-04 苏州艾吉威机器人有限公司 Fusion positioning method and system of intelligent carrying equipment based on environment identification code

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105159297A (en) * 2015-09-11 2015-12-16 南方电网科学研究院有限责任公司 Power transmission line unmanned plane inspection obstacle avoidance system and method
KR20170087340A (en) * 2016-01-20 2017-07-28 한국전력공사 System and method for connection power line using dron
CN108549098A (en) * 2018-04-24 2018-09-18 湘潭大学 A kind of patrol unmanned machine localization method of indoor substation
CN109623839A (en) * 2018-12-24 2019-04-16 西南交通大学 Power distribution station indoor equipment air-ground coordination inspection device and its method for inspecting
US20190155314A1 (en) * 2017-07-06 2019-05-23 Top Flight Technologies, Inc. Navigation system for a drone
US20190159444A1 (en) * 2017-11-30 2019-05-30 Florida Power & Light Company Unmanned aerial vehicle system for deterring avian species from sensitive areas
CN110737212A (en) * 2018-07-18 2020-01-31 华为技术有限公司 Unmanned aerial vehicle control system and method
US20200041560A1 (en) * 2018-08-01 2020-02-06 Florida Power & Light Company Remote autonomous inspection of utility system components utilizing drones and rovers
CN210090988U (en) * 2019-04-11 2020-02-18 株洲时代电子技术有限公司 Unmanned aerial vehicle system of patrolling and examining
CN110989686A (en) * 2019-12-31 2020-04-10 深圳市贝贝特科技实业有限公司 Unmanned aerial vehicle and transformer substation actuating mechanism interaction method and system
CN110989685A (en) * 2019-12-31 2020-04-10 深圳市贝贝特科技实业有限公司 Unmanned aerial vehicle cruise detection system and cruise detection method thereof
KR20200052670A (en) * 2018-11-07 2020-05-15 한국전력공사 Drone control method and drone control server using the same
CN111272172A (en) * 2020-02-12 2020-06-12 深圳壹账通智能科技有限公司 Unmanned aerial vehicle indoor navigation method, device, equipment and storage medium


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
J. CHEN, P. KIM, Y.K. CHO, J. UEDA: "Object-sensitive potential fields for mobile robot navigation and mapping in indoor environments", 《2018 15TH INTERNATIONAL》 *
李栋: "基于无人机视觉的输电线路安全距离巡检系统研究", 《中国优秀硕士学位论文全文数据库》 *
薛志成,巫伟男,崔志文,张欣,张裕汉: "基于无人机辅助的变电站巡检机器人定位导航方法", 《2019中国自动化大会(CAC2019)论文集》 *


Also Published As

Publication number Publication date
CN111741263B (en) 2021-08-31

Similar Documents

Publication Publication Date Title
CN111741263B (en) Multi-view situation perception navigation method for substation inspection unmanned aerial vehicle
CN110415342B (en) Three-dimensional point cloud reconstruction device and method based on multi-fusion sensor
Wang et al. Tiny object detection in aerial images
CN114004941B (en) Indoor scene three-dimensional reconstruction system and method based on neural radiance fields
Xu et al. Power line-guided automatic electric transmission line inspection system
CN106780560B (en) Bionic robot fish visual tracking method based on feature fusion particle filtering
CN111275015A (en) Unmanned aerial vehicle-based power line inspection electric tower detection and identification method and system
CN113205116A (en) Automatic extraction and flight path planning method for unmanned aerial vehicle inspection shooting target point of power transmission line
CN113298035A (en) Unmanned aerial vehicle electric power tower detection and autonomous cruise method based on image recognition
CN109063549A (en) High-resolution based on deep neural network is taken photo by plane video moving object detection method
CN114666564A (en) Method for synthesizing virtual viewpoint image based on implicit neural scene representation
CN109816780A (en) A kind of the transmission line of electricity three-dimensional point cloud generation method and device of binocular sequential images
CN116719339A (en) Unmanned aerial vehicle-based power line inspection control method and system
CN115451964A (en) Ship scene simultaneous mapping and positioning method based on multi-mode mixed features
CN115116137A (en) Pedestrian detection method based on lightweight YOLO v5 network model and space-time memory mechanism
CN114723944A (en) Image analysis method, storage medium, and electronic device
CN114419444A (en) Lightweight high-resolution bird group identification method based on deep learning network
CN112816939A (en) Substation unmanned aerial vehicle positioning method based on Internet of things
CN113867410B (en) Unmanned aerial vehicle aerial photographing data acquisition mode identification method and system
Lim et al. MSDPN: Monocular depth prediction with partial laser observation using multi-stage neural networks
CN110826432B (en) Power transmission line identification method based on aviation picture
CN116758409A (en) Remote sensing image target detection method based on single anchor frame sampling
CN113869122A (en) Distribution network engineering reinforced control method
CN113628251A (en) Smart hotel terminal monitoring method
CN112818837A (en) Aerial-photography vehicle re-identification method based on attitude correction and hard-sample perception

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant