CN113989682A - Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing - Google Patents

Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing

Info

Publication number
CN113989682A
Authority
CN
China
Prior art keywords
inspection
image
buoy
reason
navigation mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111632391.5A
Other languages
Chinese (zh)
Other versions
CN113989682B (en)
Inventor
季克淮
霍虎伟
毛建峰
李铁
李金鹏
李栋
何彩柳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Navigation Guarantee Center Of North China Sea (ngcn) Mot
Tianjin Tianyuanhai Technology Development Co ltd
Original Assignee
Navigation Guarantee Center Of North China Sea (ngcn) Mot
Tianjin Tianyuanhai Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Navigation Guarantee Center Of North China Sea (ngcn) Mot and Tianjin Tianyuanhai Technology Development Co ltd
Priority to CN202111632391.5A
Publication of CN113989682A
Application granted
Publication of CN113989682B
Legal status: Active (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Abstract

The invention relates to the field of navigation mark inspection, and in particular to a navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing. Inspection images of the navigation mark are collected, inspection features that can be analyzed are obtained from the inspection images, and the abnormality condition of the navigation mark is judged from the inspection features. When an abnormality occurs, a second round of imaging is carried out for the category of the abnormality to acquire troubleshooting-reason images, and the troubleshooting-reason features corresponding to the abnormality category are obtained from those images, so the collected information is more comprehensive and is obtained conveniently and quickly. The application also screens the captured inspection images to avoid producing a large amount of unclear or repeated useless image information. The method thereby increases the working efficiency of navigation mark maintenance, enhances the emergency response capability of the navigation mark, comprehensively improves navigation mark maintenance and service quality, and reduces the inspection cost.

Description

Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing
Technical Field
The application relates to the field of navigation mark inspection, in particular to a navigation mark inspection system and an inspection method based on unmanned aerial vehicle remote sensing.
Background
In recent years, against the background of the continuous development of China's shipping economy, the number of ports and ships has kept increasing, and sudden maritime incidents and traffic accidents occur from time to time. The demand on navigation marks therefore grows steadily, the workload of navigation mark departments increases day by day, and their responsibility for ensuring safety becomes ever more important. Maintenance and management of navigation marks have thus become a key task: to ensure the safety of the navigation channel, navigation marks must be inspected regularly so that abnormal conditions such as damage or displacement can be grasped and handled in time. At present, navigation mark assurance mainly relies on technical means such as on-site inspection, ship-based inspection and near-shore monitoring. The advantages of on-site inspection are obvious, but so are its shortcomings: it is affected and restricted by meteorological conditions, visual marks must be checked by climbing the mark and covering the light to simulate their appearance, the process is time-consuming, operators tire easily, and the danger coefficient rises. Ship-based inspection and similar modes suffer from slow response, high cost, strong dependence on weather, limited operating capability and range, occasional false and missed alarms, and the inability to show information such as the appearance of the navigation mark body. The development of remote sensing technology provides a remote-sensing inspection mode for navigation mark inspection, which makes up for many of the shortcomings of on-site inspection, and the maintenance level of navigation marks has improved markedly with the continuous development of remote sensing measurement and control technology. On this basis, the idea of applying unmanned aerial vehicle remote sensing to the navigation mark field has been proposed: the condition of the navigation mark is shown through video images, and unmanned remote-sensing patrols can provide a powerful management basis for navigation mark departments, allowing the current condition of the navigation mark to be grasped in real time and targeted management and maintenance to be carried out according to the actual situation, thereby improving the efficiency of navigation mark maintenance. An unmanned aerial vehicle (UAV) is an unmanned aircraft operated by a radio remote control device or by its own program control device, and generally includes unmanned helicopters and fixed-wing drones. Applying UAVs to the navigation assurance field can give full play to their advantages of low cost, convenient transport, simple operation, fast response, high flexibility and autonomous flight, compensate for the shortcomings of current technical means, provide strong technical support for navigation assurance, and comprehensively improve the navigation assurance service level.
In recent years, as ships develop towards larger sizes and higher speeds, higher requirements have been placed on the forward planning, accuracy and timeliness of navigation mark fault recovery. Patrolling with a single unmanned aerial vehicle is not only time-consuming, but faults also easily occur during inspection and the service life of the vehicle is affected; in areas with complex channels or long navigation distances in particular, cruising with a single vehicle raises numerous efficiency problems. In such situations several unmanned aerial vehicles are therefore used for regional inspection, and users continuously raise needs such as how to distribute the inspection workload effectively, how to save the vehicles' electric power and how to extend their service life. At the same time, questions arise of planning the routes of multiple vehicles, judging the abnormality condition of the navigation marks, obtaining the cause of an abnormality from the abnormality condition, and appropriately adjusting a route when the acquired information is insufficient. On navigation marks in certain special positions, devices are additionally installed to acquire information on the mark and the surrounding water area, serving as information stations of the navigation mark; the acquired information can further be used to build a navigation mark big-data system, whose safe operation must also be ensured during inspection. Therefore, how to increase the working efficiency of navigation mark maintenance, enhance the emergency response capability of navigation marks, comprehensively improve navigation mark maintenance and service quality, and reduce the inspection cost is a problem urgently to be solved in current navigation mark management.
Disclosure of Invention
In order to solve the problems, the application provides a beacon inspection system based on unmanned aerial vehicle remote sensing, which comprises a control system, an unmanned aerial vehicle and a beacon;
the control system comprises an inspection route distribution module, an inspection image acquisition module, an image characteristic module, an image screening module, an abnormality judgment module, a troubleshooting reason module and a troubleshooting reason image acquisition module;
the control system also comprises a historical inspection information base and a historical maintenance information base, wherein the historical inspection information base comprises inspection characteristics of the navigation mark in historical inspection and the cruise weight of the navigation mark; the historical maintenance information base comprises troubleshooting reason categories corresponding to abnormal categories when the buoy is abnormal, and the troubleshooting reason categories can be judged through troubleshooting reason characteristics;
the inspection route distribution module is used for distributing the unmanned aerial vehicle for performing inspection image acquisition on the navigation mark; the inspection image acquisition module is used for acquiring an inspection image of the navigation mark; the image characteristic module is used for obtaining the inspection characteristics in the inspection image; the image screening module is used for taking the inspection images with the same category of inspection features as a feature image group and screening the inspection images in the feature image group; the abnormality judging module is used for obtaining the abnormality type of the navigation mark through the inspection features in the inspection image; the troubleshooting reason module is used for acquiring troubleshooting reason types corresponding to the abnormality types; the troubleshooting reason image acquisition module is used for acquiring a troubleshooting reason image of the navigation mark and acquiring troubleshooting reason characteristics belonging to the troubleshooting reason category through the troubleshooting reason image;
the inspection host comprises acquisition equipment, and the acquisition equipment is used for acquiring inspection images and troubleshooting reason images of the buoy.
The acquisition equipment includes a photoelectric pod and a laser rangefinder.
The navigation mark comprises a navigation mark body, and the navigation mark body comprises at least one of a lighthouse, a buoy, a post and a lightship.
The inspection information categories specifically include: lamp brightness, buoy color, buoy code, flicker frequency, body shape and spatial position.
The application also provides an inspection method using the navigation mark inspection system based on unmanned aerial vehicle remote sensing, comprising the following steps:
S10, the unmanned aerial vehicle P0 flies the inspection route distributed by the control system and collects inspection images of the buoy R0;
S20, from the inspection images of the buoy R0 collected by the unmanned aerial vehicle, the inspection information of the buoy R0 is obtained and classified into M categories, and the inspection feature categories of the buoy R0 are set as R0L = [R0L1, R0L2, …, R0LM];
S30, the unmanned aerial vehicle P0 is set to collect N inspection images of the buoy R0; the inspection images containing the i-th category inspection feature R0Li are taken as the i-th feature image group of the buoy R0, and the i-th feature image group is set to contain αN (α ≤ 1) inspection images;
S40, the inspection images of the i-th feature image group are screened, and unqualified images are removed from the i-th feature image group;
S50, the i-th category inspection feature of the buoy R0 is obtained from the inspection images in the i-th feature image group, and whether the i-th category inspection feature of the buoy R0 is abnormal is judged; if yes, go to step S51; if not, go to step S52;
S51, when the i-th category inspection feature of the buoy R0 is abnormal, the i-th category is added to the abnormality category list as an abnormality category;
S52, when the i-th category inspection feature of the buoy R0 is normal, the j-th category inspection feature (j ≠ i) of the buoy R0 is judged next;
S60, the abnormality category list RS of the buoy R0 is obtained; when RS = null, the inspection of the buoy R0 is ended;
when the abnormality category list RS ≠ null, the abnormality category list of the buoy R0 is RS = [S1, S2, …, SL]; the i-th abnormality category in the list is denoted Si, and the troubleshooting-reason features corresponding to the abnormality category Si are obtained from the historical maintenance database;
S70, the control system establishes a reason-investigation task, the unmanned aerial vehicle P0 collects troubleshooting-reason images of the buoy R0, and the troubleshooting-reason features are obtained from the troubleshooting-reason images;
the troubleshooting-reason features corresponding to the abnormality category Si are set to comprise K categories, with troubleshooting-reason feature categories SIT = [SIT1, SIT2, …, SITK]; when there is a category of troubleshooting-reason feature SITj for which no troubleshooting-reason image can be acquired, go to step S71;
when the troubleshooting-reason features of all categories of the abnormality category Si are acquired in the troubleshooting-reason images, go to step S72;
S71, the inspection route of the unmanned aerial vehicles is adjusted, and an unmanned aerial vehicle Pi (Pi ≠ P0) collects troubleshooting-reason images for the SITj-category troubleshooting-reason feature of the abnormality category Si of the buoy R0;
S72, the cause of the abnormality category is judged from the obtained troubleshooting-reason features.
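For illustration only, the per-category workflow of steps S10 to S72 can be sketched as follows in Python; this sketch is not part of the patent text, and the function names (is_abnormal, acquire_reason_feature) and data structures are hypothetical placeholders.

from typing import Callable, Dict, List

def inspect_buoy(
    feature_groups: Dict[int, list],                    # i -> screened inspection images of the i-th feature group (after S40)
    is_abnormal: Callable[[int, list], bool],           # S50: judge the i-th category inspection feature
    reason_categories: Dict[int, List[str]],            # Si -> troubleshooting-reason categories SIT
    acquire_reason_feature: Callable[[int, str], bool], # S70: try to obtain one reason feature from reason images
) -> Dict[int, List[str]]:
    """Return {abnormality category: reason categories that still need another UAV (step S71)}."""
    # S50-S60: build the abnormality category list RS
    RS = [i for i, images in feature_groups.items() if is_abnormal(i, images)]
    if not RS:
        return {}  # RS is empty: the inspection of this buoy ends (S60)
    unresolved: Dict[int, List[str]] = {}
    for Si in RS:
        # S70/S71: reason categories whose images could not be acquired by the first UAV
        missing = [sit for sit in reason_categories.get(Si, []) if not acquire_reason_feature(Si, sit)]
        if missing:
            unresolved[Si] = missing
    return unresolved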
In step S40, the method further includes:
step S401, the αN images in the i-th image group are converted to grayscale to obtain grayscale images;
step S402, the average gray value EH and the smoothness EP of each pixel in a grayscale image are calculated;
step S403, the inspection images in the i-th image group are screened according to the average gray value EH and smoothness EP of the pixels and the position of the buoy on each inspection image, and unqualified images are removed from the i-th image group.
In step S401, when performing the grayscale calculation, the component proportions of R, G, B are selected as 0.3, 0.59 and 0.11, and the gray value of pixel E is given by E(x, y) = 0.3R(x, y) + 0.59G(x, y) + 0.11B(x, y), where R is the red value, G the green value and B the blue value of pixel E(x, y).
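As a hedged illustration (not from the patent), the weighted-sum grayscale conversion of step S401 could be written as follows in Python with NumPy; the H x W x 3 RGB array layout is an assumption.

import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Grayscale per E(x, y) = 0.3*R + 0.59*G + 0.11*B for an H x W x 3 RGB array."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    return 0.3 * r + 0.59 * g + 0.11 * b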
In step S402, the pixel E(x, y) is the pixel in row x and column y, and its F adjacent pixels are all pixels within the f rows and f columns surrounding pixel E, i.e. the pixels of the (2f + 1) × (2f + 1) window centered on E, excluding E itself;
the number of adjacent pixels of pixel E is therefore F = (2f + 1)^2 − 1, where f ≥ 1 and f is an integer;
the average gray value of pixel E is EH = (1/F) Σ E(x + u, y + v), summed over −f ≤ u ≤ f and −f ≤ v ≤ f with (u, v) ≠ (0, 0);
meanwhile, the average variance EX and the excessive chromatic aberration EG of pixel E over the same neighborhood can also be obtained, with the average variance EX = (1/F) Σ (E(x + u, y + v) − EH)^2 (the EG formula is given in the source only as an embedded image);
the smoothness of pixel E is then EP = EX + EG.
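A minimal Python sketch of the neighborhood statistics of step S402, assuming an interior pixel (edge pixels are skipped per the text); because the EG formula is not reproduced in the source, EG is left as a placeholder here.

import numpy as np

def neighborhood_stats(gray: np.ndarray, x: int, y: int, f: int = 1):
    """EH, EX and EP over the (2f+1) x (2f+1) window around pixel (x, y), excluding the center."""
    window = gray[x - f:x + f + 1, y - f:y + f + 1].astype(np.float64)
    mask = np.ones_like(window, dtype=bool)
    mask[f, f] = False                      # exclude the center pixel E itself
    neighbors = window[mask]                # F = (2f + 1)**2 - 1 values
    EH = neighbors.mean()                   # average gray value
    EX = ((neighbors - EH) ** 2).mean()     # average variance (reconstructed form)
    EG = 0.0                                # placeholder: the EG formula is not given in the source text
    EP = EX + EG                            # smoothness of pixel E
    return EH, EX, EP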
The grayscale image C corresponding to an inspection image in the i-th image group contains a region A and a region B, where A ⊆ B ⊆ C; the distance between region A and the wide side of the grayscale image C is a1, and the distance between region A and the long side of C is a2; the distance between region B and the wide side of C is b1, and the distance between region B and the long side of C is b2;
where b1 < a1 and b2 < a2; a smoothness threshold WR0 is set;
the smoothness of the buoy R0 region in the grayscale image is R0EP, the sum of the smoothness EP of all pixels in that region;
when the position of the buoy R0 in the grayscale image is within region A, that is, the distance H1 between the left and right ends of the buoy R0 and the wide side of the grayscale image satisfies H1 > a1 and the distance H2 between the upper and lower ends of the buoy R0 and the long side satisfies H2 > a2, go to step S411;
when the position of the buoy R0 in the grayscale image is within region B and outside region A, go to step S412;
when the position of the buoy R0 in the grayscale image is outside region B, the grayscale image is removed from the i-th image group;
step S411: f = 1 is taken and the smoothness R0EP of the buoy R0 region in the grayscale image is obtained; when R0EP < WR0, the grayscale image is removed from the i-th image group;
step S412: when b1 ≤ H1 ≤ a1, the neighborhood size f is computed from H1, a1, b1 and the inspection weight Rθ0 by the formula given in the source only as an embedded image; when b2 ≤ H2 ≤ a2, f is likewise computed from H2, a2, b2 and Rθ0; f is rounded up, and Rθ0 is the inspection weight corresponding to the navigation mark R; when R0EP < WR0, the grayscale image is removed from the i-th image group.
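A hedged sketch of the screening logic of steps S411 and S412, reusing neighborhood_stats from the sketch above; the buoy bounding box is assumed to be known, and the f-formula of step S412, which the source gives only as an image, is passed in as an external callable.

def screen_grayscale_image(gray, bbox, a1, a2, b1, b2, WR0, f_of=None):
    """Keep (True) or discard (False) one grayscale inspection image of the i-th image group."""
    H, W = gray.shape
    x0, x1, y0, y1 = bbox                  # row/column bounds of the buoy region (assumed known)
    H1 = min(y0, W - y1)                   # distance of the buoy's left/right ends to the wide sides
    H2 = min(x0, H - x1)                   # distance of the buoy's upper/lower ends to the long sides

    if H1 < b1 or H2 < b2:                 # buoy lies outside region B: discard the image
        return False
    if H1 > a1 and H2 > a2:                # buoy lies inside region A (step S411)
        f = 1
    else:                                  # inside B but outside A (step S412)
        f = f_of(H1, H2) if f_of else 1    # f-formula not reproduced in the source

    # region smoothness R0EP: sum of per-pixel smoothness EP over the buoy region
    R0EP = sum(
        neighborhood_stats(gray, x, y, f)[2]
        for x in range(max(x0, f), min(x1, H - f))
        for y in range(max(y0, f), min(y1, W - f))
    )
    return R0EP >= WR0                     # images with R0EP < WR0 are removed from the image group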
When f = 1, the number of pixels adjacent to pixel E is F = 8, where E(x, y) denotes the gray value of the pixel in row x and column y of the current image; the eight adjacent pixels are E(x, y + 1), E(x, y − 1), E(x − 1, y), E(x − 1, y + 1), E(x − 1, y − 1), E(x + 1, y − 1), E(x + 1, y) and E(x + 1, y + 1); the average gray value of pixel E is then
EH = [E(x, y + 1) + E(x, y − 1) + E(x − 1, y) + E(x − 1, y + 1) + E(x − 1, y − 1) + E(x + 1, y − 1) + E(x + 1, y) + E(x + 1, y + 1)] / 8,
where EH denotes the average gray value of pixel E.
The beneficial effects realized by the application are as follows:
Through the collection of inspection images of the navigation mark, the application obtains the inspection features that can be analyzed from the inspection images and judges the abnormality condition of the navigation mark from the inspection features. When an abnormality occurs, the category of the abnormality is given and the unmanned aerial vehicle carries out a second round of imaging to acquire troubleshooting-reason images, from which the troubleshooting-reason features corresponding to the abnormality category are obtained; the collected information is thus more comprehensive and is obtained conveniently and quickly. The application also screens the captured inspection images, avoiding the production of a large amount of unclear or repeated useless image information. Through the above method the application can increase the working efficiency of navigation mark maintenance, enhance the emergency response capability of the navigation mark, comprehensively improve navigation mark maintenance and service quality, and reduce the inspection cost.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 is a flow chart of the navigation mark inspection method based on unmanned aerial vehicle remote sensing.
FIG. 2 shows a pixel in a grayscale image of an acquired inspection image together with its surrounding adjacent pixels.
FIG. 3 shows the distribution of the effective and ineffective areas in a grayscale image of an acquired inspection image.
Detailed Description
The technical solutions in the embodiments of the present application are clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As shown in FIGS. 1-3, the application provides a navigation mark inspection system based on unmanned aerial vehicle remote sensing, comprising navigation marks to be inspected, a ground control system and inspection hosts. The ground control system is used to distribute the inspection routes of the inspection hosts and to judge, from the navigation mark image information captured by an inspection host, whether a navigation mark is abnormal; when a navigation mark is judged abnormal it is marked as an abnormal navigation mark, and the inspection host is controlled to collect images of the cause of the abnormality. When the inspection host of the inspection route to which the abnormal navigation mark belongs cannot acquire the abnormality information of that mark, the inspection routes are adjusted and another inspection host acquires the abnormality information of the abnormal navigation mark;
the navigation mark inspection system further comprises a navigation mark historical information base, and the navigation mark historical information base stores basic navigation mark information and historical inspection information of navigation marks obtained in historical inspection. The basic navigation mark information comprises the coordinates, the serial number, the navigation mark type, the initial navigation mark characteristic, the use start time and the like of the navigation mark, and the historical patrol inspection information of the navigation mark comprises the historical hidden danger degree phi and the position importance degree epsilon of the navigation mark.
The navigation mark comprises a navigation mark body and a navigation mark information station. The navigation mark body comprises devices such as lighthouses, buoys, posts and lightships that are fixed along a route or in the water to indicate directions to navigating ships; the navigation mark information station comprises monitoring devices installed on the navigation mark body for collecting hydrological and meteorological information, flow velocity and direction information, ship traffic information and the like, which can be used to build big-data information on meteorology, water flow and ship traffic.
The inspection host is an unmanned aircraft operated by a radio remote control device or by its own program control device, generally an unmanned helicopter or a fixed-wing drone, and comprises a collection component for collecting navigation mark information and abnormality information; the collection component includes, but is not limited to, image collection equipment such as a panoramic camera, a photoelectric pod, a laser range finder, an infrared measuring instrument and a surveying instrument. In the inspection task planned and designed by the ground control system, the inspection host acquires navigation mark information with the photoelectric pod or other imaging and measuring equipment along a preset inspection route and transmits the acquired information to the ground control system in real time, so that the system can make its judgments.
In the specific inspection process, for the n navigation marks to be inspected, R = [R1, R2, R3, …, Rn], the ground control system obtains from the basic navigation mark information in the navigation mark historical information base the inspection weights Rθ = [Rθ1, Rθ2, Rθ3, …, Rθn] corresponding to the navigation marks R, where the inspection weight of the i-th navigation mark Ri is Rθi = εi φi, with φi the historical hidden-danger degree of navigation mark Ri and εi its position importance degree; at the same time a removal list YR is created, which includes a first removal list YR1 and a second removal list YR2.
The ground control system establishes an inspection task A comprising m inspection routes AH = [AH1, AH2, AH3, …, AHm], each inspection route being flown by one inspection host P, i.e. the m inspection routes AH are flown by m inspection hosts P = [P1, P2, P3, …, Pm]. The i-th inspection host Pi inspects on route AHi the navigation marks IR = [IR1, IR2, IR3, …, IRk], with corresponding inspection weights IRθ = [IRθ1, IRθ2, IRθ3, …, IRθk]; the j-th navigation mark IRj to be inspected on route AHi has inspection weight IRθj. The total weight of inspection route AHi is θi = IRθ1 + IRθ2 + … + IRθk, and with the route weight threshold set to θ0 the inspection route AHi must satisfy θi ≤ θ0. At the same time, the number k of navigation marks to be inspected on route AHi must not exceed a threshold computed from θ0 and the natural constant e (the formula is given in the source only as an embedded image), with θ0 < 2e.
Specifically, in one embodiment, the number of navigation marks to be inspected is 60, 4 inspection hosts are used for inspection, and the ground control system sets the total weight threshold of an inspection route to 5 as required; the resulting threshold on the number of navigation marks to be inspected on an inspection route AHi is 16.
The inspection task A is then obtained: it comprises 5 inspection routes AH1, AH2, AH3, AH4 and AH5, correspondingly flown by 5 inspection hosts P1, P2, P3, P4 and P5.
The 3rd inspection host P3 inspects 15 navigation marks on inspection route AH3, namely IR1, IR2, IR3, IR4, IR5, IR6, IR7, IR8, IR9, IR10, IR11, IR12, IR13, IR14 and IR15, with corresponding inspection weights 0.2, 0.5, 0.1, 0.5, 0.3, 0.2, 0.5, 0.4, 0.3, 0.2, 0.5, 0.1, 0.2, 0.2 and 0.6; the total weight of inspection route AH3 is θ3 = 4.8 < 5.
According to the requirements of the inspection task, the inspection host Pi inspects the navigation marks IR = [IR1, IR2, IR3, …, IRk] on inspection route AHi and obtains inspection information such as inspection images of the marks IR; when the control system judges from the inspection information of the j-th navigation mark IRj on route AHi that IRj is abnormal, the ground control system controls the inspection host Pi to collect the abnormality information of the abnormal navigation mark IRj.
the ground control system can obtain the routing inspection characteristics such as painting, structure and mark position of the navigation mark through the acquired routing inspection information image, can establish a neural network model through acquiring the routing inspection characteristics in the navigation mark image and through navigation mark characteristic data in a historical navigation mark information base, takes the acquired routing inspection characteristics as input, and judges whether the navigation mark corresponding to the routing inspection image is abnormal according to output, and the specific method comprises the following steps: setting and inputting D routing inspection characteristics x = [ x ]1; x2 ; …; xD]Corresponding to weight w = [ w = [ w ]1; w2;…; w D]Setting a bias b epsilon R; then we can get the weighted sum z of the input features, the specific formula is:
Figure DEST_PATH_IMAGE012
using the ReLU function as the activation function, then
Figure DEST_PATH_IMAGE013
In a multi-layer feedforward neural network, let
Figure DEST_PATH_IMAGE014
Then feed-forward neural networkBy continuous iteration, the propagation formula layer by layer is:
Figure DEST_PATH_IMAGE015
the composite function is:
Figure DEST_PATH_IMAGE016
wherein
Figure DEST_PATH_IMAGE017
And
Figure DEST_PATH_IMAGE018
representing the connection weights and offsets for all layers in the network,
Figure DEST_PATH_IMAGE019
is the number of layers of the neural network,
Figure DEST_PATH_IMAGE020
is as follows
Figure 647320DEST_PATH_IMAGE019
The number of layer neurons;
Figure DEST_PATH_IMAGE021
is as follows
Figure DEST_PATH_IMAGE022
Layer to layer
Figure 519461DEST_PATH_IMAGE019
A weight matrix of the layer;
Figure DEST_PATH_IMAGE023
is as follows
Figure 919350DEST_PATH_IMAGE022
Layer to layer
Figure 408100DEST_PATH_IMAGE019
Biasing of the layers;
Figure DEST_PATH_IMAGE024
is as follows
Figure 676270DEST_PATH_IMAGE019
Output of layer neurons;
using a cross-entropy loss function, for sample (x, y) the loss function is:
Figure DEST_PATH_IMAGE025
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE026
representing by a one-hot vector corresponding to y;
given a training set of
Figure DEST_PATH_IMAGE027
Each sample is sampled
Figure DEST_PATH_IMAGE028
Input to the pre-neural network to obtain the network output of
Figure DEST_PATH_IMAGE029
The risk function on the data set is:
Figure DEST_PATH_IMAGE030
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE031
is a regularization term; λ is a long parameter, and W is closer to 0 as λ is larger;
in each iteration of the gradient descent method, a learning rate α is set to obtain an update mode of the parameters W and b:
Figure DEST_PATH_IMAGE032
calculating the gradient of the l-th layer weight and bias, δ(l)Error term for layer i:
Figure DEST_PATH_IMAGE033
Figure DEST_PATH_IMAGE034
obtaining an iterative formula:
Figure DEST_PATH_IMAGE035
With this neural network model, the inspection features in the inspection images obtained by the inspection host are input into the model to obtain an output; when the output is 0 the corresponding navigation mark is abnormal, and when the output is 1 the corresponding navigation mark is normal.
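A hedged NumPy sketch of such a feedforward classifier (not the patent's implementation; the layer sizes, the softmax output layer and the data shapes are assumptions added for the example):

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

class FeedforwardClassifier:
    """Tiny L-layer network: ReLU hidden layers, softmax output, cross-entropy loss."""

    def __init__(self, sizes, seed=0):
        # sizes = [D, M1, ..., C]; W[l]: (sizes[l+1], sizes[l]), b[l]: (sizes[l+1], 1)
        rng = np.random.default_rng(seed)
        self.W = [rng.normal(0.0, 0.1, (m, d)) for d, m in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros((m, 1)) for m in sizes[1:]]

    def forward(self, x):
        a = x                                   # a^(0) = x, x shape (D, batch)
        acts = [a]
        for l, (W, b) in enumerate(zip(self.W, self.b)):
            z = W @ a + b                       # z^(l) = W^(l) a^(l-1) + b^(l)
            a = softmax(z) if l == len(self.W) - 1 else relu(z)
            acts.append(a)
        return acts

    def train_step(self, x, y_onehot, alpha=0.01, lam=1e-4):
        acts = self.forward(x)
        n = x.shape[1]
        delta = acts[-1] - y_onehot             # output error term for softmax + cross-entropy
        for l in reversed(range(len(self.W))):
            dW = delta @ acts[l].T / n + lam * self.W[l]   # gradient plus L2 regularization term
            db = delta.sum(axis=1, keepdims=True) / n
            if l > 0:                           # back-propagate: delta = (W^(l))^T delta * relu'(z)
                delta = (self.W[l].T @ delta) * (acts[l] > 0)
            self.W[l] -= alpha * dW             # gradient-descent update
            self.b[l] -= alpha * db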
When a navigation mark is abnormal, the ground control system sends an instruction to the inspection host, and the inspection host collects the cause of the abnormality of the abnormal navigation mark: specifically, it can photograph the appearance details of the mark, photograph the surrounding water area, photograph the geological structure, and collect humidity and temperature. For example, when a navigation mark has been displaced, the land or the vessel carrying the mark can be photographed, so that the control system can judge whether a soil slope has loosened or the pontoon has failed.
When the ground control system judges from the abnormality-cause images obtained by the inspection host Pi that the collection of the abnormality cause is complete, the inspection host Pi continues its inspection along inspection route AHi.
When the control system judges from the abnormality-cause images obtained by the inspection host Pi that the collection of the abnormality cause is not complete, for example because, due to the tide, the inspection host Pi cannot photograph the part of the navigation mark below the water surface, the navigation mark IRj is removed from the inspection route AHi, the inspection route of the inspection host Pi is updated to obtain a new route, and the above steps are repeated to obtain the removed navigation mark list YR = [YR1, YR2, YR3, …, YRc], with c < k. A distance threshold D0 is set: with the flight speed of inspection host Pi being vi and the flight speed of inspection host Pj being vj, D0 is computed from vi, vj and Dmax (the formula is given in the source only as an embedded image), where Dmax is the inspection distance when a single host inspects all n navigation marks.
When the distance Dz from the z-th abnormal navigation mark YRz in the removed navigation mark list YR to an inspection route AHj satisfies Dz ≤ D0, YRz is added to the j-th inspection route AHj (j ≠ i), the inspection route AHj is updated to obtain a new inspection route AHj′, and the corresponding inspection host Pj inspects according to the updated route.
When there is no inspection route AHj such that the distance Dz from YRz to AHj satisfies Dz ≤ D0, the navigation marks in R that have not yet been inspected and the navigation marks in the removed navigation mark list YR are together taken as the navigation marks to be inspected, the inspection task is re-established, and the above steps are repeated according to the method used to establish the inspection task at the beginning.
For example, in one implementation, after the inspection host P3 inspects the navigation mark IR7 on inspection route AH3, the ground control system obtains the inspection information of IR7 and judges that the navigation mark IR7 is abnormal; the ground control system then instructs the inspection host P3 to photograph the abnormality information of IR7, the specific content being, for example, the carrier of the navigation mark, such as the ground, or the details of the navigation mark body.
When the obtained abnormality information allows the ground control system to judge the cause of the abnormality of IR7, the inspection host P3 continues to inspect the next navigation mark;
when the obtained abnormality information does not allow the ground control system to judge the cause of the abnormality of IR7, the navigation mark IR7 is removed from the inspection route AH3 and IR7 is added to the first removal list.
patrol and examine host computer P3And patrol inspection host P4The flying speed is the same, the total inspection distance is 55km when a single machine is used for inspecting all 60 to-be-inspected sails, and the current time has 5 cruise host machines to execute 5 routing inspection tasks
Then get the inspection host P4Location routing inspection route and IR7The distance threshold is 2.2.2 km;
when IR is measured7And another patrol route AH4Within 2.2.km, the navigation mark IR7Transferring the first removal list into the routing inspection route AH4In (1), updating AH simultaneously4The routing inspection route; when patrolling route AH4Patrol and examine host computer P4Patrol and examine fairway buoy IR7Due to the navigation mark IR7Transferred from the first removal list, illustrating the navigation mark IR7The navigation mark information is acquired, so that the patrol main unit P4Without reacquiring the navigation mark IR7Navigation mark information of, directly to, navigation mark IR7Shooting for acquiring the abnormality information; when P is present4Obtained navigation mark IR7Enables the ground control system to determine IR7When the abnormality is caused, the inspection host P4Continuously inspecting the next navigation mark; when P is present4Obtained navigation mark IR7The malfunction information makes the ground control system still unable to judge IR7When the reason is abnormal, the navigation mark is IR7Slave inspection host P4Patrol route AH4The first removal list is converted back, and the inspection host P is updated simultaneously4The routing inspection route;
setting of an execution-error beacon IR7The cruise host of the abnormal information acquisition task is an abnormal navigation mark IR7The filtering host set without executing the abnormal navigation mark IR7The cruise host of the abnormal information acquisition task is the abnormal navigation mark IR7The non-filtering host; the malformed fairway signs in the first removal list can only be switched into the cruise route in which the non-filtering host computer is located. In addition, when it is reacted with IR7Patrol of unfiltered host within 2.2km distanceWhen there is more than one inspection route, the inspection route with the shortest preferred distance is used for identifying the navigation mark IR7Adding;
when navigation mark IR7When the distance between the navigation mark and the cruising route of any non-filtering host computer is more than 2.2km, the navigation mark IR7And (4) switching to a second removal list from the first removal list, taking the second removal list as a navigation mark to be patrolled after the patrolling tasks of all the cruise hosts are finished, and distributing the cruise hosts by the ground control system again according to information such as cruise weight and the like, and returning to the initial operation step of the embodiment.
In other embodiments of the application, an unmanned aerial vehicle P0 is provided to collect the inspection information of a buoy R0 to be inspected, and the categories of inspection information of the buoy R0 that the unmanned aerial vehicle needs to acquire may specifically include: lamp brightness, buoy color, buoy code, flicker frequency, body shape, spatial position and the like.
in some embodiments, the drone P0 captures images of the buoy R0 by capturing images or videos using a panoramic camera, a photoelectric pod, or the like, and may also capture information such as the location of the buoy R0 using a laser range finder, an infrared measuring instrument, or a surveying instrument.
The ground control system can obtain the inspection features of each inspection information category of the buoy R0 from the images acquired by the unmanned aerial vehicle and transmitted in real time, and can judge from the inspection features corresponding to each category whether that inspection information category of the buoy R0 is an abnormality category. When any category of inspection information of the buoy R0 is an abnormality category, the buoy R0 is judged to be an abnormal buoy; when all inspection information categories of the buoy R0 are normal, the buoy R0 is judged to be a normal buoy.
specifically, the unmanned aerial vehicle P0 is set to take N images of the float R0, and to determine whether the float R0 is a malfunctioning float, M patrol information types are determined for the image of the float R0, and the patrol information type R0L = [ R0L ] of the float R0 is set1, R0L2,…, R0LM]Wherein, the inspection type R0L of the i-th type is arranged in the N images of the float R0iAs an i-class diagram of the buoy R0The image group sets alpha N (alpha is less than or equal to 1) in the i-type image group;
and (3) screening the images of the i-type image group:
(1) The αN images in the i-th image group are converted to grayscale to obtain grayscale images. In a specific implementation the cvtColor function can be used for the conversion; in some embodiments the component proportions of R, G, B are selected as 0.3, 0.59 and 0.11 for the grayscale calculation, and the gray value of pixel E is E(x, y) = 0.3R(x, y) + 0.59G(x, y) + 0.11B(x, y), where R is the red value, G the green value and B the blue value of pixel E(x, y).
In other embodiments an integer approach may also be used, E(x, y) = (R(x, y) × 30 + G(x, y) × 59 + B(x, y) × 11) / 100, as in the sketch below.
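For illustration (a sketch only; note that cv2.cvtColor uses the slightly different ITU-R weights 0.299/0.587/0.114 and expects BGR channel order):

import numpy as np

def to_gray_int(rgb: np.ndarray) -> np.ndarray:
    """Integer variant E = (30*R + 59*G + 11*B) // 100 for an H x W x 3 RGB array."""
    r = rgb[..., 0].astype(np.uint32)
    g = rgb[..., 1].astype(np.uint32)
    b = rgb[..., 2].astype(np.uint32)
    return ((30 * r + 59 * g + 11 * b) // 100).astype(np.uint8)

# Alternative using OpenCV:
# import cv2
# gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)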
(2) For each grayscale image, every pixel except the edge pixels of the image is traversed, and the gray average over the F adjacent pixels surrounding each pixel is calculated as the average gray value EH of that pixel. As shown in FIG. 2, the pixel E(x, y) lies in row x and column y, and its F adjacent pixels are all pixels within the f rows and f columns surrounding pixel E, i.e. the pixels of the (2f + 1) × (2f + 1) window centered on E, excluding E itself; the number of adjacent pixels of pixel E is therefore F = (2f + 1)^2 − 1, where f ≥ 1 and f is an integer.
The average gray value of pixel E is then EH = (1/F) Σ E(x + u, y + v), summed over −f ≤ u ≤ f and −f ≤ v ≤ f with (u, v) ≠ (0, 0).
For example, when f = 1, the number of pixels adjacent to pixel E is F = 8, where E(x, y) denotes the gray value of the pixel in row x and column y of the current image; the eight neighbors are E(x, y + 1), E(x, y − 1), E(x − 1, y), E(x − 1, y + 1), E(x − 1, y − 1), E(x + 1, y − 1), E(x + 1, y) and E(x + 1, y + 1), and the average gray value of pixel E is
EH = [E(x, y + 1) + E(x, y − 1) + E(x − 1, y) + E(x − 1, y + 1) + E(x − 1, y − 1) + E(x + 1, y − 1) + E(x + 1, y) + E(x + 1, y + 1)] / 8.
Meanwhile, the average variance EX and the excessive chromatic aberration EG of pixel E can also be obtained over the same neighborhood, with the average variance EX = (1/F) Σ (E(x + u, y + v) − EH)^2 (the EG formula is given in the source only as an embedded image); the smoothness of pixel E is EP = EX + EG.
(3) As shown in FIG. 3, the position of the buoy on the captured image is determined. The grayscale image corresponding to a captured inspection image contains an effective region A and an effective region B, with region A inside region B; the distance between region A and the wide side of the image is a1, the distance between region A and the long side of the image is a2, the distance between region B and the wide side of the image is b1, and the distance between region B and the long side of the image is b2. When the position of the buoy is within region A, that is, the distance H1 between the left and right ends of the buoy and the wide side of the image satisfies H1 > a1 and the distance H2 between the upper and lower ends of the buoy and the long side satisfies H2 > a2, f = 1 is taken.
When the position of the buoy on the captured image is within region B and outside region A, that is, when b1 ≤ H1 ≤ a1 or b2 ≤ H2 ≤ a2, the neighborhood size f is computed from H1, a1, b1 (respectively H2, a2, b2) and the inspection weight Rθ0 corresponding to the navigation mark R by the formula given in the source only as an embedded image.
When the position of the buoy on the captured image is in the region C outside region B, i.e. H1 < b1 or H2 < b2, the image is removed from the i-th image group.
In this embodiment, because the buoy D is a light buoy, the inspection information categories that need to be inspected generally include: lamp brightness, buoy color, buoy code, body shape and spatial position, and may additionally include the flicker frequency and the like. Taking the spatial position of the buoy as an example, among the N images taken, the αN images (α ≤ 1) from which the spatial position of the buoy can be completely acquired are set as the images of the position feature group.
When the i-th inspection feature of the buoy D is judged to be abnormal, the i-th inspection category is added to the abnormality category list as an abnormality category and the next inspection category is judged; when the i-th inspection category of the buoy D is judged to be normal, the next category is judged directly; these steps are repeated until all M categories have been judged.
An abnormality category list DS of the buoy D is thus obtained. When DS = null, the buoy D has no abnormality, the unmanned aerial vehicle finishes the inspection of the buoy D and continues with the inspection task of the next buoy; when the abnormality category list DS contains inspection categories, a reason-investigation task is established according to the troubleshooting-reason features corresponding to the abnormality categories in the maintenance database, and the unmanned aerial vehicle is used to collect troubleshooting-reason images of the buoy D again.
Let DS = [S1, S2, …, SR] and let the i-th abnormality category be Si; the investigation categories SIT = [DST1, DST2, …, DSTK] corresponding to the abnormality category Si are obtained. When there is a j-th investigation category whose troubleshooting-reason features cannot be obtained from the troubleshooting images of the buoy D taken by the unmanned aerial vehicle, the inspection task of the unmanned aerial vehicle is readjusted and that category is investigated again.
When the troubleshooting-reason features of all investigation categories SIT of the abnormality category Si have been obtained, the cause of occurrence of the abnormality category Si is judged from the troubleshooting-reason features.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A beacon inspection system based on unmanned aerial vehicle remote sensing, comprising a control system, an unmanned aerial vehicle and a beacon;
the control system comprises an inspection route distribution module, an inspection image acquisition module, an image characteristic module, an image screening module, an abnormality judgment module, a troubleshooting reason module and a troubleshooting reason image acquisition module;
the control system also comprises a historical inspection information base and a historical maintenance information base, wherein the historical inspection information base comprises inspection characteristics of the navigation mark in historical inspection and the cruise weight of the navigation mark; the historical maintenance information base comprises troubleshooting reason categories corresponding to abnormal categories when the buoy is abnormal, and the troubleshooting reason categories can be judged through troubleshooting reason characteristics;
the inspection route distribution module is used for distributing the unmanned aerial vehicle for performing inspection image acquisition on the navigation mark; the inspection image acquisition module is used for acquiring an inspection image of the navigation mark; the image characteristic module is used for obtaining the inspection characteristics in the inspection image; the image screening module is used for taking the inspection images with the same category of inspection features as a feature image group and screening the inspection images in the feature image group; the abnormality judging module is used for obtaining the abnormality type of the navigation mark through the inspection features in the inspection image; the troubleshooting reason module is used for acquiring troubleshooting reason types corresponding to the abnormality types; the troubleshooting reason image acquisition module is used for acquiring a troubleshooting reason image of the navigation mark and acquiring troubleshooting reason characteristics belonging to the troubleshooting reason category through the troubleshooting reason image;
the unmanned aerial vehicle comprises acquisition equipment, and the acquisition equipment is used for collecting the inspection images and the troubleshooting-reason images of the buoy.
2. The unmanned aerial vehicle remote sensing-based beacon inspection system according to claim 1, wherein the acquisition equipment includes a photoelectric pod and a laser rangefinder.
3. The unmanned aerial vehicle remote sensing-based beacon inspection system according to claim 1, wherein the beacon includes a beacon body comprising at least one of a lighthouse, a buoy, a post and a lightship.
4. The unmanned aerial vehicle remote sensing-based beacon inspection system according to claim 3, wherein the inspection information categories specifically include: lamp brightness, buoy color, buoy code, flicker frequency, body shape and spatial position.
5. An inspection method using the unmanned aerial vehicle remote sensing-based beacon inspection system according to any one of claims 1 to 4, comprising the following steps:
S10, the unmanned aerial vehicle P0 flies the inspection route distributed by the control system and collects inspection images of the buoy R0;
S20, from the inspection images of the buoy R0 collected by the unmanned aerial vehicle, the inspection information of the buoy R0 is obtained and classified into M categories, and the inspection feature categories of the buoy R0 are set as R0L = [R0L1, R0L2, …, R0LM];
S30, the unmanned aerial vehicle P0 is set to collect N inspection images of the buoy R0; the inspection images containing the i-th category inspection feature R0Li are taken as the i-th feature image group of the buoy R0, and the i-th feature image group is set to contain αN (α ≤ 1) inspection images;
S40, the inspection images of the i-th feature image group are screened, and unqualified images are removed from the i-th feature image group;
S50, the i-th category inspection feature of the buoy R0 is obtained from the inspection images in the i-th feature image group, and whether the i-th category inspection feature of the buoy R0 is abnormal is judged; if yes, go to step S51; if not, go to step S52;
S51, when the i-th category inspection feature of the buoy R0 is abnormal, the i-th category is added to the abnormality category list as an abnormality category;
S52, when the i-th category inspection feature of the buoy R0 is normal, the j-th category inspection feature (j ≠ i) of the buoy R0 is judged next;
S60, the abnormality category list RS of the buoy R0 is obtained; when RS = null, the inspection of the buoy R0 is ended;
when the abnormality category list RS ≠ null, the abnormality category list of the buoy R0 is RS = [S1, S2, …, SL]; the i-th abnormality category in the list is denoted Si, and the troubleshooting-reason features corresponding to the abnormality category Si are obtained from the historical maintenance database;
S70, the control system establishes a reason-investigation task, the unmanned aerial vehicle P0 collects troubleshooting-reason images of the buoy R0, and the troubleshooting-reason features are obtained from the troubleshooting-reason images;
the troubleshooting-reason features corresponding to the abnormality category Si are set to comprise K categories, with troubleshooting-reason feature categories SIT = [SIT1, SIT2, …, SITK]; when there is a category of troubleshooting-reason feature SITj for which no troubleshooting-reason image can be acquired, go to step S71;
when the troubleshooting-reason features of all categories of the abnormality category Si are acquired in the troubleshooting-reason images, go to step S72;
S71, the inspection route of the unmanned aerial vehicles is adjusted, and an unmanned aerial vehicle Pi (Pi ≠ P0) collects troubleshooting-reason images for the SITj-category troubleshooting-reason feature of the abnormality category Si of the buoy R0;
S72, the cause of the abnormality category is judged from the obtained troubleshooting-reason features.
6. The inspection method according to claim 5, wherein step S40 further comprises:
step S401, the αN images in the i-th image group are converted to grayscale to obtain grayscale images;
step S402, the average gray value EH and the smoothness EP of each pixel in a grayscale image are calculated;
step S403, the inspection images in the i-th image group are screened according to the average gray value EH and smoothness EP of the pixels and the position of the buoy on each inspection image, and unqualified images are removed from the i-th image group.
7. The inspection method according to claim 6, wherein in step S401, when performing the grayscale calculation, the component proportions of R, G, B are selected as 0.3, 0.59 and 0.11, and the gray value of pixel E is given by E(x, y) = 0.3R(x, y) + 0.59G(x, y) + 0.11B(x, y), where R is the red value, G the green value and B the blue value of pixel E(x, y).
8. The inspection method according to claim 6, wherein in step S402, a pixel E(x, y) is set in the x-th row and y-th column, and its F adjacent pixels are selected, namely all pixels within f rows and f columns around the pixel E are set as the adjacent pixels of the pixel E;
the number of adjacent pixels of the pixel E is F = (2f + 1)² − 1, wherein f ≥ 1 and f is an integer;
the average gray value of the pixel E is obtained as EH = (E1 + E2 + … + EF) / F, wherein E1, E2, …, EF are the gray values of the F adjacent pixels of the pixel E;
meanwhile, the average variance EX and the excess color difference EG of the pixel E can also be obtained according to the corresponding formulas [formula images in original]; the smoothness of the pixel E is then EP = EX + EG.
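A rough sketch of the per-pixel statistics of claim 8, under stated assumptions: `gray` is a 2-D array, the pixel is taken to lie at least f pixels from the image border, EX is implemented here as the ordinary variance of the neighborhood gray values, and EG is injected as a stub callable because the patent gives the EX and EG formulas only as images.

```python
import numpy as np

def neighborhood(gray, x, y, f=1):
    """Gray values of the F = (2f+1)^2 - 1 pixels around (x, y), excluding (x, y) itself."""
    vals = []
    for dx in range(-f, f + 1):
        for dy in range(-f, f + 1):
            if dx == 0 and dy == 0:
                continue                      # skip the center pixel E(x, y)
            vals.append(float(gray[x + dx, y + dy]))
    return np.array(vals)

def smoothness(gray, x, y, f=1, excess_color_diff=lambda nb, eh: 0.0):
    """EP = EX + EG for pixel (x, y); EG defaults to 0 as a placeholder."""
    nb = neighborhood(gray, x, y, f)
    eh = nb.mean()                            # average gray value EH of the F neighbors
    ex = float(((nb - eh) ** 2).mean())       # average variance EX (assumed form)
    eg = excess_color_diff(nb, eh)            # EG: exact formula not recovered from the patent
    return ex + eg
```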
9. The inspection method according to claim 8, wherein the grayscale image C corresponding to each inspection image in the i-th type image group comprises an area A and an area B, wherein A ⊆ B ⊆ C; the distance between the area A and the wide sides of the grayscale image C is a1, and the distance between the area A and the long sides of the grayscale image C is a2; the distance between the area B and the wide sides of the grayscale image C is b1, and the distance between the area B and the long sides of the grayscale image C is b2;
wherein b1 < a1 and b2 < a2; a smoothness threshold WR0 is set;
the smoothness of the region of the buoy R0 in the grayscale image is R0EP, which is the sum of the smoothness EP of all pixel points in that region;
when the position of the buoy R0 in the grayscale image is within the area A, i.e. the distance H1 between the left and right ends of the buoy R0 and the wide sides of the grayscale image satisfies H1 > a1, and the distance H2 between the upper and lower ends of the buoy R0 and the long sides of the grayscale image satisfies H2 > a2, the process proceeds to step S411;
when the position of the buoy R0 in the grayscale image is within the area B and outside the area A, the process proceeds to step S412;
when the position of the buoy R0 in the grayscale image is outside the area B, the grayscale image is moved out of the i-th type image group;
step S411, taking f = 1, the smoothness R0EP of the region of the buoy R0 in the grayscale image is obtained; when R0EP < WR0, the grayscale image is moved out of the i-th type image group;
step S412, when the distance H1 between the left and right ends of the buoy R0 and the wide sides of the grayscale image satisfies b1 ≤ H1 ≤ a1, f is calculated according to the corresponding formula [formula image in original]; when the distance H2 between the upper and lower ends of the buoy and the long sides of the grayscale image satisfies b2 ≤ H2 ≤ a2, f is calculated according to the corresponding formula [formula image in original]; wherein f is rounded up, and Rθ0 is the inspection weight corresponding to the navigation mark R0; when R0EP < WR0, the grayscale image is moved out of the i-th type image group.
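To illustrate the branch selection of claim 9, the sketch below maps the buoy-to-edge distances H1 and H2 onto the three cases (inside area A, inside B but outside A, outside B). Deriving H1/H2 from a bounding box and the step-S412 formula for f are left to the caller, since the latter is only given as an image in the original.

```python
def screening_branch(h1, h2, a1, a2, b1, b2):
    """Select the claim-9 branch from the buoy-to-edge distances H1 and H2.

    a1/a2 are the margins of area A, b1/b2 those of area B (b1 < a1, b2 < a2).
    """
    if h1 > a1 and h2 > a2:
        return "S411"       # buoy lies inside area A: take f = 1
    if h1 >= b1 and h2 >= b2:
        return "S412"       # inside area B but outside A: f comes from the patent's formula
    return "discard"        # buoy lies outside area B: move the image out of the group


def keep_image(region_smoothness_r0ep, wr0):
    """An image stays in the group only if its region smoothness R0EP reaches WR0."""
    return region_smoothness_r0ep >= wr0
```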
10. The inspection method according to claim 8, wherein when f = 1, the number of adjacent pixels of the pixel E is F = 8, wherein E(x, y) represents the gray value of the pixel point in the x-th row and y-th column of the current image; E(x, y+1) represents the gray value of the pixel point in the x-th row and (y+1)-th column; E(x, y-1) represents the gray value of the pixel point in the x-th row and (y-1)-th column; E(x-1, y) represents the gray value of the pixel point in the (x-1)-th row and y-th column; E(x-1, y+1) represents the gray value of the pixel point in the (x-1)-th row and (y+1)-th column; E(x-1, y-1) represents the gray value of the pixel point in the (x-1)-th row and (y-1)-th column; E(x+1, y-1) represents the gray value of the pixel point in the (x+1)-th row and (y-1)-th column; E(x+1, y) represents the gray value of the pixel point in the (x+1)-th row and y-th column; E(x+1, y+1) represents the gray value of the pixel point in the (x+1)-th row and (y+1)-th column; the average gray value of the pixel E is then
EH = [E(x-1, y-1) + E(x-1, y) + E(x-1, y+1) + E(x, y-1) + E(x, y+1) + E(x+1, y-1) + E(x+1, y) + E(x+1, y+1)] / 8,
wherein EH denotes the average gray value of the pixel E.
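For the f = 1 case of claim 10, the 8-neighbor mean can be written out directly (a sketch; the pixel (x, y) is assumed not to lie on the image border):

```python
def average_gray_f1(gray, x, y):
    """EH for f = 1: the mean of the 8 neighbors of pixel (x, y)."""
    neighbors = [
        gray[x - 1][y - 1], gray[x - 1][y], gray[x - 1][y + 1],
        gray[x][y - 1],                     gray[x][y + 1],
        gray[x + 1][y - 1], gray[x + 1][y], gray[x + 1][y + 1],
    ]
    return sum(neighbors) / 8.0
```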
CN202111632391.5A 2021-12-29 2021-12-29 Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing Active CN113989682B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111632391.5A CN113989682B (en) 2021-12-29 2021-12-29 Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing

Publications (2)

Publication Number Publication Date
CN113989682A (en) 2022-01-28
CN113989682B (en) 2022-05-17

Family

ID=79734833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111632391.5A Active CN113989682B (en) 2021-12-29 2021-12-29 Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing

Country Status (1)

Country Link
CN (1) CN113989682B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107977018A (en) * 2017-12-12 2018-05-01 吉林大学 Crop straw burning monitoring method based on binocular stereo vision and unmanned plane
CN110570537A (en) * 2019-08-27 2019-12-13 厦门蓝海天信息技术有限公司 Navigation mark monitoring method based on video identification and shipborne navigation mark intelligent inspection equipment
CN210835732U (en) * 2019-10-29 2020-06-23 福建师范大学 Beacon inspection device based on unmanned aerial vehicle
CN111985435A (en) * 2020-08-29 2020-11-24 浙江工业大学 Unmanned aerial vehicle water area monitoring and cruising method based on machine vision
CN113204245A (en) * 2021-05-19 2021-08-03 广州海事科技有限公司 Navigation mark inspection method, system, equipment and storage medium based on unmanned aerial vehicle
CN113313703A (en) * 2021-06-17 2021-08-27 上海红檀智能科技有限公司 Unmanned aerial vehicle power transmission line inspection method based on deep learning image recognition
CN113705433A (en) * 2021-09-24 2021-11-26 池州学院 Power line detection method based on visible light aerial image
CN113744427A (en) * 2021-11-05 2021-12-03 天津天元海科技开发有限公司 Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing
CN113741527A (en) * 2021-09-13 2021-12-03 德仕能源科技集团股份有限公司 Oil well inspection method, equipment and medium based on multiple unmanned aerial vehicles

Similar Documents

Publication Publication Date Title
CN107943078A (en) More rotor dual systems unmanned plane inspection fault diagnosis systems and method
CN107992067A (en) Unmanned plane inspection fault diagnosis system based on integrated gondola and AI technologies
CN107729808A (en) A kind of image intelligent acquisition system and method for power transmission line unmanned machine inspection
CN110673628B (en) Inspection method for oil-gas pipeline of composite wing unmanned aerial vehicle
CN108109437A (en) It is a kind of that generation method is extracted from main shipping track based on the unmanned plane of map feature
CN203219298U (en) Unmanned helicopter system special for inspecting electric grid in mountain area
CN115861855B (en) Operation and maintenance monitoring method and system for photovoltaic power station
CN103078673A (en) Special unmanned helicopter system suitable for routing inspection on power grid in mountain area
CN106654987A (en) Power line multi-robot collaborative inspection method
CN113744427B (en) Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing
CN112327906A (en) Intelligent automatic inspection system based on unmanned aerial vehicle
CN110866483B (en) Dynamic and static combined visual detection and positioning method for airport runway foreign matter
CN108229587A (en) A kind of autonomous scan method of transmission tower based on aircraft floating state
CN114265418A (en) Unmanned aerial vehicle inspection and defect positioning system and method for photovoltaic power station
CN112364725B (en) Cotton pest three-dimensional monitoring method and system based on small unmanned aerial vehicle group
CN110189411A (en) Emergency management and rescue Search Area method for visualizing after a kind of accident of aircraft
CN111027422A (en) Emergency unmanned aerial vehicle inspection method and system applied to power transmission line corridor
CN112859905A (en) Unmanned aerial vehicle inspection route generation method and device for overhead power line and unmanned aerial vehicle
CN108061572A (en) A kind of ocean nuclear power platform comprehensive situation display & control system and method
CN112734970A (en) Automatic inspection system and method for wind power plant unmanned aerial vehicle based on LoRaWAN positioning technology
CN114240868A (en) Unmanned aerial vehicle-based inspection analysis system and method
CN113589837A (en) Electric power real-time inspection method based on edge cloud
CN116310891A (en) Cloud-edge cooperative transmission line defect intelligent detection system and method
CN106320173B (en) Vehicle-mounted unmanned aerial vehicle bridge routine safety detection system and detection method
CN113989682B (en) Navigation mark inspection system and inspection method based on unmanned aerial vehicle remote sensing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant