CN115345830A - Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium - Google Patents

Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium

Info

Publication number
CN115345830A
CN115345830A (application CN202210830549.8A)
Authority
CN
China
Prior art keywords
node
routing inspection
inspection
path
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210830549.8A
Other languages
Chinese (zh)
Inventor
应自炉
谭梓峻
翟懿奎
王文琪
廖锦锐
江子义
周建宏
李文霸
梁长钊
李青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuyi University
Original Assignee
Wuyi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuyi University
Priority to CN202210830549.8A
Publication of CN115345830A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/12Computing arrangements based on biological models using genetic models
    • G06N3/126Evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Physiology (AREA)
  • Genetics & Genomics (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a ceiling detection method and device based on unmanned aerial vehicle routing inspection, an unmanned aerial vehicle and a medium, wherein the method comprises the following steps: determining a routing inspection area, determining an initial routing inspection path according to a plurality of routing inspection nodes of the routing inspection area, and determining a target path with optimal energy consumption through a genetic algorithm; performing routing inspection at least twice according to the target path, and shooting a node image at each routing inspection node during each routing inspection; inputting at least two node images of each routing inspection node into a GAN, and performing semantic segmentation and change detection through the GAN to obtain a change segmentation graph; and determining the ceiling detection result of the routing inspection node according to the change segmentation graph. According to the technical scheme of the embodiment, path planning can be carried out through the genetic algorithm to obtain the routing inspection path with optimal energy consumption, change detection is carried out through the GAN on the node images shot during two routing inspections, and ceiling detection is carried out through the change segmentation graph, so that ceiling detection is automated and its efficiency is effectively improved.

Description

Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium
Technical Field
The invention belongs to the technical field of unmanned aerial vehicles, and particularly relates to a ceiling detection method and device based on unmanned aerial vehicle routing inspection, an unmanned aerial vehicle and a medium.
Background
A suspended ceiling is a common roof structure. When the ceiling is not constructed according to standard practice, the floor structure is easily damaged or other potential safety hazards arise; ceilings therefore need to be inspected so that problems are discovered in time. The traditional manual inspection method consumes a large amount of manpower, and it is difficult to inspect all buildings quickly. With the development of unmanned aerial vehicles and image recognition technology, using the flight capability of an unmanned aerial vehicle to inspect ceilings from the air has effectively improved work efficiency. However, a conventional unmanned aerial vehicle needs to be remotely controlled by an operator to take photographs and must return before the images are identified and processed, so the degree of automation is not high and the efficiency is low.
Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
The embodiment of the invention provides a ceiling detection method and device based on unmanned aerial vehicle inspection, an unmanned aerial vehicle and a medium, which can realize automatic ceiling inspection by the unmanned aerial vehicle, complete ceiling detection in the inspection process and improve the ceiling detection efficiency.
In a first aspect, an embodiment of the present invention provides a ceiling detection method based on unmanned aerial vehicle routing inspection, which is applied to an unmanned aerial vehicle, and includes:
determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes;
determining an initial routing inspection path according to the plurality of routing inspection nodes, wherein each routing inspection node appears exactly once in the initial routing inspection path;
taking the initial routing inspection path as an initialization population of a genetic algorithm, and determining a target path with optimal energy consumption through the genetic algorithm;
performing routing inspection at least twice according to the target path, and shooting a node image at each routing inspection node in each routing inspection process;
inputting at least two node images of each inspection node into a pre-trained generative adversarial network (GAN), and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph;
and determining a ceiling detection result of the routing inspection node according to the change segmentation graph.
In some embodiments, the determining, by a genetic algorithm, an energy-consumption-optimal target path includes:
determining initial path energy consumption according to the initial routing inspection path;
acquiring a preset fitness function, and selecting a plurality of nodes to be adjusted from the routing inspection nodes according to the fitness function;
sequentially performing crossover and mutation on the nodes to be adjusted to obtain an intermediate routing inspection path;
and iterating according to the intermediate routing inspection path, and obtaining the target path after completing a preset number of iterations.
In some embodiments, the initial path energy consumption is determined from the initial routing inspection path by the following formula:

E = \sum_{j=1}^{N+1} W t_j

wherein E is the initial path energy consumption, N is the number of routing inspection nodes, W is the load weight of the unmanned aerial vehicle, d_j is the distance of the j-th leg of the initial routing inspection path, j ∈ {1, 2, ..., N+1}, and t_j is the flight time required for each leg between two of the routing inspection nodes,

t_j = d_j / v

wherein v is the flight speed of the drone.
In some embodiments, the node image of each inspection node at least includes a first node image and a second node image, the first node image and the second node image have different shooting times, and the shooting angles of the first node image and the second node image are the same.
In some embodiments, the GAN includes a semantic segmentation network and a discrimination network, and performing semantic segmentation and change detection on at least two node images through the GAN to obtain a change segmentation map includes:
inputting the first node image and the second node image into the semantic segmentation network for feature extraction to obtain a first semantic feature map and a second semantic feature map;
and inputting the first semantic feature map and the second semantic feature map into the discrimination network for change detection, and determining a confidence map output by the discrimination network as the change segmentation map.
In some embodiments, the semantic segmentation network and the discrimination network are trained by minimizing a loss function.
In some embodiments, determining the ceiling detection result of the routing inspection node according to the change segmentation graph comprises:
when the change segmentation graph indicates that the first node image and the second node image are different, determining that the ceiling detection result is that the detection is not passed;
or,
when the change segmentation graph indicates that the first node image is the same as the second node image, determining that the ceiling detection result is that the detection is passed.
In a second aspect, an embodiment of the present invention provides a ceiling detection device based on unmanned aerial vehicle routing inspection, including:
the routing inspection area determining unit is used for determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes;
the inspection unit is used for determining an initial routing inspection path according to the plurality of routing inspection nodes, wherein each routing inspection node appears exactly once in the initial routing inspection path;
the path planning unit is used for determining a target path with optimal energy consumption through a genetic algorithm by taking the initial routing inspection path as an initialization population of the genetic algorithm;
the image acquisition unit is used for carrying out at least two times of routing inspection according to the target path and shooting a node image at each routing inspection node in each routing inspection process;
the detection unit is used for inputting at least two node images of each inspection node into a pre-trained generative adversarial network (GAN), and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph;
and the result judging unit is used for determining the ceiling detection result of the routing inspection node according to the change segmentation graph.
In a third aspect, an embodiment of the present invention provides an unmanned aerial vehicle, including: the system comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the computer program to realize the unmanned aerial vehicle inspection tour-based ceiling detection method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, which stores a computer program, where the computer program is configured to execute the method for detecting a ceiling based on unmanned aerial vehicle inspection according to the first aspect.
The embodiment of the invention comprises the following steps: determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes; determining an initial routing inspection path according to the routing inspection nodes, wherein each routing inspection node appears exactly once in the initial routing inspection path; taking the initial routing inspection path as the initialization population of a genetic algorithm, and determining a target path with optimal energy consumption through the genetic algorithm; performing routing inspection at least twice according to the target path, and shooting a node image at each routing inspection node during each routing inspection; inputting at least two node images of each routing inspection node into a pre-trained GAN, and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph; and determining the ceiling detection result of the routing inspection node according to the change segmentation graph. According to the technical scheme of this embodiment, path planning can be carried out through a genetic algorithm to obtain the routing inspection path with optimal energy consumption, change detection is carried out through the GAN on node images shot during two routing inspections, and ceiling detection is carried out through the change segmentation graph, so that ceiling detection is automated and its efficiency is effectively improved.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the example serve to explain the principles of the invention and not to limit the invention.
Fig. 1 is a flowchart of a ceiling detection method based on unmanned aerial vehicle inspection according to an embodiment of the present invention;
FIG. 2 is a flow chart of path planning provided by another embodiment of the present invention;
FIG. 3 is a flow chart of change discrimination provided by another embodiment of the present invention;
FIG. 4 is a schematic diagram of a semantic segmentation network provided by another embodiment of the present invention;
FIG. 5 is a schematic diagram of a discrimination network provided by another embodiment of the present invention;
fig. 6 is a flowchart for determining a ceiling detection result according to another embodiment of the present invention;
fig. 7 is a structural diagram of a ceiling detection device based on unmanned aerial vehicle routing inspection according to another embodiment of the invention;
fig. 8 is an apparatus diagram of a drone provided by another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It is noted that while functional block divisions are provided in device diagrams and logical sequences are shown in flowcharts, in some cases, steps shown or described may be performed in sequences other than block divisions within devices or flowcharts. The terms "first," "second," and the like in the description, in the claims, or in the drawings, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The invention provides a ceiling detection method and device based on unmanned aerial vehicle inspection, an unmanned aerial vehicle and a medium, wherein the method comprises the following steps: determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes; determining an initial routing inspection path according to the plurality of routing inspection nodes, wherein each routing inspection node appears exactly once in the initial routing inspection path; taking the initial routing inspection path as the initialization population of a genetic algorithm, and determining a target path with optimal energy consumption through the genetic algorithm; performing routing inspection at least twice according to the target path, and shooting a node image at each routing inspection node during each routing inspection; inputting at least two node images of each routing inspection node into a pre-trained GAN, and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph; and determining a ceiling detection result of the routing inspection node according to the change segmentation graph. According to the technical scheme of this embodiment, path planning can be carried out through a genetic algorithm to obtain the routing inspection path with optimal energy consumption, change detection is carried out through the GAN on node images shot during two routing inspections, and ceiling detection is carried out through the change segmentation graph, so that ceiling detection is automated and its efficiency is effectively improved.
As shown in fig. 1, fig. 1 is a flowchart of a method for detecting a ceiling based on unmanned aerial vehicle inspection according to an embodiment of the present invention, and the method for detecting a ceiling based on unmanned aerial vehicle inspection is applied to an unmanned aerial vehicle, and includes, but is not limited to, the following steps:
step S110, determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes;
step S120, determining an initial routing inspection path according to the plurality of routing inspection nodes, wherein each routing inspection node appears exactly once in the initial routing inspection path;
step S130, taking the initial routing inspection path as an initialization population of a genetic algorithm, and determining a target path with optimal energy consumption through the genetic algorithm;
step S140, performing at least two times of routing inspection according to the target path, and shooting a node image at each routing inspection node in each routing inspection process;
step S150, inputting at least two node images of each inspection node into a pre-trained GAN, and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph;
and step S160, determining the ceiling detection result of the routing inspection node according to the change segmentation graph.
It should be noted that the unmanned aerial vehicle of this embodiment can be of any type, as long as its endurance is sufficient to complete the routing inspection; in order to obtain the node images, remote sensing equipment can be carried on the unmanned aerial vehicle. This embodiment does not improve the specific hardware structure, as long as the functions can be realized. The routing inspection area can be set according to actual requirements, and includes a starting point and a plurality of routing inspection nodes. During the routing inspection, the unmanned aerial vehicle starts from the starting point, passes by all the nodes in the area, finally returns to the starting point, and acquires an image with the remote sensing equipment when passing each node.
It should be noted that GAN is a supervised learning technology; this embodiment combines a change detection technology with the GAN so as to implement semi-supervised learning, and identification can be completed with a small amount of labeled data and a larger amount of unlabeled data, thereby effectively reducing the number of node images that require manual labeling, facilitating automatic identification, and improving ceiling detection efficiency.
It should be noted that, in the genetic algorithm, each individual of the population is regarded as a chromosome participating in inheritance and represents a potential feasible solution. In this embodiment, a routing inspection path can be used as a chromosome; after the routing inspection area is determined, c_i is used to represent the i-th node, i ∈ {1, 2, ..., N}, so each chromosome has N gene positions, i.e., each candidate routing inspection path needs to include all routing inspection nodes, and the coding sequence in the chromosome represents the path trajectory of the drone. When planning a path with the genetic algorithm, it is necessary to ensure that each chromosome (routing inspection path) does not include repeated nodes; if a chromosome obtained by the genetic algorithm contains repeated nodes, it can be discarded, thereby avoiding repeated inspection of the same node.
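For illustration only, the chromosome encoding and the duplicate check described above could be sketched in Python as follows; the function names and the population-initialization details are assumptions, not part of the patent:

```python
import random

def random_chromosome(num_nodes):
    """A chromosome is a permutation of the inspection-node indices 1..N;
    the starting point is implicit at both ends of the path."""
    genes = list(range(1, num_nodes + 1))
    random.shuffle(genes)
    return genes

def is_valid(chromosome, num_nodes):
    """Reject chromosomes that would revisit a node (repeated individuals)."""
    return len(set(chromosome)) == num_nodes

def init_population(num_nodes, population_size):
    """Initialize the GA population with valid random inspection paths."""
    population = []
    while len(population) < population_size:
        candidate = random_chromosome(num_nodes)
        if is_valid(candidate, num_nodes):
            population.append(candidate)
    return population
```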
It should be noted that, after the target path is obtained, the unmanned aerial vehicle can be controlled to perform multiple routing inspections along the target path, shooting one node image at each node per inspection. The images taken at different times are compared to obtain a change segmentation graph; when the change segmentation graph shows that the ceiling has changed, a risk may exist, and the ceiling detection result is determined accordingly.
In addition, in an embodiment, referring to fig. 2, step S130 of the embodiment shown in fig. 1 further includes, but is not limited to, the following steps:
step S210, determining initial path energy consumption according to the initial routing inspection path;
step S220, a preset fitness function is obtained, and a plurality of nodes to be adjusted are selected from the inspection nodes according to the fitness function;
step S230, sequentially performing crossover and mutation on the plurality of nodes to be adjusted to obtain an intermediate routing inspection path;
and step S240, iterating according to the intermediate routing inspection path, and obtaining the target path after completing the preset number of iterations.
The initial path energy consumption is obtained by the following formula:
E = \sum_{j=1}^{N+1} W t_j

wherein E is the initial path energy consumption, N is the number of routing inspection nodes, W is the load weight of the unmanned aerial vehicle, d_j is the distance of the j-th leg of the initial routing inspection path, j ∈ {1, 2, ..., N+1}, and t_j = d_j / v is the flight time required for each leg between two routing inspection nodes, with v the flight speed of the drone.
It should be noted that, after the routing inspection area is determined, the population consisting of the N nodes may be initialized and encoded according to the encoding rule, the number of individuals in the initialized population being N_c. Starting from the starting-point node, the unmanned aerial vehicle first flies to the first node, covering a first distance d_1. The whole journey can thus be divided into N+1 legs, and the distance of the j-th leg is recorded as d_j, j ∈ {1, 2, ..., N+1}. The time required for each leg is then

t_j = d_j / v

The energy consumed by the unmanned aerial vehicle is related to the flight time and the carried weight, and since the whole journey is divided into multiple legs, the initial path energy consumption is

E = \sum_{j=1}^{N+1} W t_j

On this basis, in order to obtain the target path with optimal energy consumption through the genetic algorithm, an optimization objective function may be determined according to the initial path energy consumption; for example, the optimization objective function obtained from the expression of the initial path energy consumption is

f = \min E = \min \sum_{j=1}^{N+1} W \frac{d_j}{v}
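As a minimal sketch of the energy model and objective above (assuming the node coordinates are known and the closed path starts and ends at the starting point; all helper names are illustrative):

```python
import math

def path_energy(path, coords, start, load_weight, speed):
    """Compute E = sum_j W * t_j with t_j = d_j / v over the N+1 legs of a
    closed inspection path; `coords` maps node index -> (x, y) position."""
    waypoints = [start] + [coords[i] for i in path] + [start]
    energy = 0.0
    for a, b in zip(waypoints[:-1], waypoints[1:]):
        distance = math.dist(a, b)           # d_j
        flight_time = distance / speed       # t_j = d_j / v
        energy += load_weight * flight_time  # W * t_j
    return energy

def fitness(path, coords, start, load_weight, speed):
    """The optimization objective is to minimize E, so E itself serves as the
    fitness value (lower is better)."""
    return path_energy(path, coords, start, load_weight, speed)
```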
It should be noted that the fitness function is a deterministic index describing the survival chance of an individual in the population, and each individual in the genetic algorithm has a fitness value; the optimization objective function can therefore be chosen as the fitness function, so that path planning reduces the energy consumption of the unmanned aerial vehicle as far as possible. In the genetic algorithm, the individual-selection strategy can be adjusted according to actual requirements, with better individuals having a higher probability of being selected into the next generation. Using the reciprocal of the individual fitness values, the probability that the i-th chromosome is selected is

P_i = \frac{1 / f_i}{\sum_{k=1}^{N_c} 1 / f_k}

wherein f_i is the fitness value of the i-th chromosome. In this way, the nodes to be adjusted can be randomly selected from the routing inspection nodes, routing inspection paths with lower energy consumption are obtained by applying crossover and mutation to the nodes to be adjusted, and the process is iterated until the preset number of iterations is completed or the energy consumption of the iterated path meets a threshold, so that the target path with optimal energy consumption is obtained and the routing inspection of the unmanned aerial vehicle can be completed with less energy consumption.
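The reciprocal-fitness selection rule could be sketched as follows; summing the reciprocals over the whole population is an assumption made for normalization:

```python
import random

def select(population, fitness_values):
    """Roulette-wheel selection where the probability of choosing chromosome i
    is (1 / f_i) / sum_k (1 / f_k), so lower-energy paths are favoured."""
    inverse = [1.0 / f for f in fitness_values]
    total = sum(inverse)
    probabilities = [w / total for w in inverse]
    return random.choices(population, weights=probabilities, k=len(population))
```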
In addition, the crossover in this embodiment can adopt single-point crossover, in which a segment of genes of two individuals is exchanged and recombined to generate a new individual. Because the unmanned aerial vehicle cannot pass through a routing inspection node repeatedly, if the individual obtained after crossover contains the same code more than once, that crossover is regarded as unsuccessful.
Because of this coding constraint, mutation is performed under the control of a mutation probability: two variation points are selected on an individual, and the part of the chromosome between the two variation points is inverted to obtain a new individual. In order to prevent the population from falling into local convergence and to improve the global search capability of the algorithm, the mutation probability can be adjusted according to actual requirements, which is not limited in this embodiment.
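A sketch of the crossover and inversion mutation operators under the constraints described above; rejecting an invalid child by keeping the parent is an assumption about how an unsuccessful crossover is handled:

```python
import random

def single_point_crossover(parent_a, parent_b):
    """Exchange the tail segments of two parents; if the child revisits a node,
    the crossover is treated as unsuccessful and the first parent is kept."""
    point = random.randint(1, len(parent_a) - 1)
    child = parent_a[:point] + parent_b[point:]
    return child if len(set(child)) == len(parent_a) else parent_a

def inversion_mutation(chromosome, mutation_prob):
    """With probability `mutation_prob`, pick two variation points and invert
    the gene segment between them, which always yields a valid permutation."""
    chromosome = list(chromosome)
    if random.random() < mutation_prob:
        i, j = sorted(random.sample(range(len(chromosome)), 2))
        chromosome[i:j + 1] = reversed(chromosome[i:j + 1])
    return chromosome
```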
In addition, in an embodiment, the node image of each routing inspection node at least comprises a first node image and a second node image, the shooting time of the first node image is different from that of the second node image, and the shooting angles of the first node image and the second node image are the same.
It should be noted that the first node image and the second node image have the same shooting position and angle and different shooting time, so that whether the ceiling is changed or not is determined by the change of the image at the same position at different times, and ceiling detection is realized.
It is worth noting that the node images can be shot by the remote sensing equipment carried by the unmanned aerial vehicle. For low-altitude remote sensing images, change detection needs to achieve high accuracy, and the remote sensing images should not be affected by differences in sensor position and attitude, distance, image quality, weather, and the like. Therefore, before change detection is carried out on the first node image and the second node image, image preprocessing such as accurate geometric correction, atmospheric correction and mutual registration is performed on the remote sensing images acquired in different periods; this is an essential step and improves the accuracy of change detection. The purpose of image registration is to reduce errors caused by differences in shooting angle and the like, so that the same spatial position corresponds across different pictures; otherwise, when the extracted feature points are compared one by one, the feature points of one picture would be matched against feature points at other positions of the other picture, and the change detection result would have a high false detection rate.
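The patent does not name a specific registration algorithm; as one possible sketch, ORB feature matching with a RANSAC homography (OpenCV) can align the second-pass node image to the first-pass image:

```python
import cv2
import numpy as np

def register_images(image_t1, image_t2, max_features=500):
    """Warp the second-pass node image onto the first-pass image so that the
    same spatial position corresponds pixel-to-pixel before change detection."""
    orb = cv2.ORB_create(max_features)
    kp1, des1 = orb.detectAndCompute(image_t1, None)
    kp2, des2 = orb.detectAndCompute(image_t2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    # points in image_t2 (train) mapped onto points in image_t1 (query)
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    homography, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    height, width = image_t1.shape[:2]
    return cv2.warpPerspective(image_t2, homography, (width, height))
```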
In addition, in an embodiment, the GAN includes a semantic segmentation network and a discriminant network, and referring to fig. 3, step S150 of the embodiment shown in fig. 1 further includes, but is not limited to, the following steps:
step S310, inputting the first node image and the second node image into a semantic segmentation network for feature extraction to obtain a first semantic feature map and a second semantic feature map;
and step S320, inputting the first semantic feature map and the second semantic feature map into a discrimination network for change detection, and determining a confidence map output by the discrimination network as a change segmentation map.
It should be noted that the structure of the semantic segmentation network in this embodiment may follow the DeepLabV3 structure shown in fig. 4; the images in this embodiment are the first node image and the second node image, and may also include labeled images and unlabeled images at the same time. The segmentation network takes as input a group of first node images and second node images with dimension H × W × 3 and outputs a change-region segmentation probability map with dimension H × W × C, where C represents the number of categories to be segmented. The whole network consists of an encoder and a decoder. The input image first passes through a feature extraction network, and the extracted features are fed into two paths: one path produces a new feature map through a 1 × 1 convolution; the other path performs deep feature extraction through the five branches of the ASPP layer, whose outputs are concatenated and passed through a 1 × 1 convolutional layer, and the resulting feature map is upsampled by a factor of four. The feature maps obtained from the two paths are concatenated, passed through a 1 × 1 convolutional layer, and upsampled by a factor of four to obtain the final segmentation map. The ASPP layer is divided into five parts: a 1 × 1 convolutional layer, a 3 × 3 convolutional layer with a dilation rate of 6, a 3 × 3 convolutional layer with a dilation rate of 12, a 3 × 3 convolutional layer with a dilation rate of 18, and a global average pooling layer. The outputs of the four convolutional layers are batch-normalized, and the output of the global average pooling layer is followed by a 1 × 1 convolutional layer and then batch-normalized.
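A minimal PyTorch sketch of the ASPP layer described above (1 × 1 convolution, three 3 × 3 atrous convolutions with rates 6, 12 and 18, and a global-average-pooling branch); the channel widths and the fusion step are assumptions rather than values taken from the patent:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ASPP(nn.Module):
    """Atrous Spatial Pyramid Pooling with the five branches described above."""
    def __init__(self, in_channels, out_channels=256):
        super().__init__()
        def conv_branch(kernel, dilation):
            padding = 0 if kernel == 1 else dilation
            return nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel,
                          padding=padding, dilation=dilation, bias=False),
                nn.BatchNorm2d(out_channels), nn.ReLU(inplace=True))
        self.branch1 = conv_branch(1, 1)    # 1x1 convolution
        self.branch2 = conv_branch(3, 6)    # 3x3, dilation rate 6
        self.branch3 = conv_branch(3, 12)   # 3x3, dilation rate 12
        self.branch4 = conv_branch(3, 18)   # 3x3, dilation rate 18
        self.pool = nn.Sequential(          # global average pooling branch
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_channels, out_channels, 1, bias=False),
            nn.BatchNorm2d(out_channels), nn.ReLU(inplace=True))
        self.project = nn.Sequential(       # fuse the five branches
            nn.Conv2d(5 * out_channels, out_channels, 1, bias=False),
            nn.BatchNorm2d(out_channels), nn.ReLU(inplace=True))

    def forward(self, x):
        size = x.shape[-2:]
        pooled = F.interpolate(self.pool(x), size=size, mode="bilinear",
                               align_corners=False)
        features = torch.cat([self.branch1(x), self.branch2(x),
                              self.branch3(x), self.branch4(x), pooled], dim=1)
        return self.project(features)
```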
It should be noted that, as for the structure of the discrimination network of this embodiment shown in fig. 5, the discrimination network takes as input either a change-region segmentation probability map from the segmentation network or the one-hot encoded real label of a labeled change region, and generates a confidence map. During training with labeled data, the loss function optimized by the segmentation network comprises two terms: the cross entropy L_ce between the segmentation result and the real label, and the adversarial term L_adv given by the discrimination network. For unlabeled data, the high-confidence region given by the discrimination network is used as a pseudo label, and the cross entropy L_semi between the segmentation result and the pseudo label is used as the corresponding training loss. The whole network can be supervised by labeled images and unlabeled images at the same time, and the semantic segmentation network and the discrimination network are trained adversarially at the same time.
It should be noted that the discrimination network D of this embodiment is a standard binary classification network composed of four 4 × 4 convolutional layers with 64, 128, 256 and 512 channels respectively; a Leaky ReLU activation function with a negative slope of 0.2 follows each convolutional layer, and a global average pooling layer and a fully connected layer follow the last convolutional layer. The specific network layer parameters may be adjusted according to actual requirements.
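A PyTorch sketch of such a discrimination network; because the discrimination network is elsewhere described as outputting an H × W × 1 confidence map, this sketch replaces the global-pooling and fully connected head with a 1 × 1 convolutional classifier that is upsampled back to the input size — an adaptation, not the patent's exact layer list:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Discriminator(nn.Module):
    """Fully convolutional discriminator: 4x4 convolutions with 64/128/256/512
    channels and LeakyReLU(0.2), followed by a 1x1 classifier upsampled back
    to the input resolution to give an HxWx1 confidence map."""
    def __init__(self, num_classes):
        super().__init__()
        channels = [num_classes, 64, 128, 256, 512]
        layers = []
        for c_in, c_out in zip(channels[:-1], channels[1:]):
            layers += [nn.Conv2d(c_in, c_out, kernel_size=4, stride=2, padding=1),
                       nn.LeakyReLU(0.2, inplace=True)]
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Conv2d(512, 1, kernel_size=1)

    def forward(self, probability_map):
        size = probability_map.shape[-2:]
        score = self.classifier(self.features(probability_map))
        # per-pixel confidence in [0, 1] at the original resolution
        return F.interpolate(torch.sigmoid(score), size=size,
                             mode="bilinear", align_corners=False)
```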
In addition, in one embodiment, the semantic segmentation network and the discrimination network are trained by minimizing a loss function.
It is to be noted that, in order to better illustrate the GAN of the present embodiment, an exemplary training process of the GAN is described below. In this example, the first node image is denoted as T1 and the second node image as T2, each with dimension H × W × 3 corresponding to the height, the width and the three RGB color channels of the image. S(·) denotes the segmentation network, and S(T1, T2) denotes a change-region classification probability map with dimension H × W × C, where C is the number of classes. D(·) denotes the discrimination network, whose input is either the result Y_n obtained by one-hot encoding the real label or the segmentation result S(T1, T2) of the segmentation network, and whose output is a confidence map of dimension H × W × 1.
(1) Discrimination network training
The discrimination network is trained using only the labeled data, by minimizing the loss function L_D:

L_D = -\sum_{h,w} \left[ (1 - y_n) \log\left(1 - D(S(T1,T2))^{(h,w)}\right) + y_n \log\left(D(Y_n)^{(h,w)}\right) \right]

If the sample comes from the segmentation network, y_n = 0; if the sample comes from a real label, y_n = 1. Here D(S(T1,T2))^{(h,w)} denotes the value at position (h, w) of the confidence map obtained after T1 and T2 pass through the segmentation network and the discrimination network, and D(Y_n)^{(h,w)} denotes the value at position (h, w) of the confidence map obtained from the one-hot encoded real label of the true change region. The one-hot encoding scheme is as follows: if pixel X(h,w)_n belongs to class c, then Y(h,w,c)_n takes the value 1; if pixel X(h,w)_n does not belong to class c, then Y(h,w,c)_n takes the value 0.
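The loss L_D above is a per-pixel binary cross-entropy; a sketch assuming the discrimination network outputs a confidence map with values in [0, 1]:

```python
import torch
import torch.nn.functional as F

def discriminator_loss(confidence_map, from_real_label):
    """L_D: binary cross-entropy summed over (h, w), with y_n = 1 for one-hot
    real labels and y_n = 0 for segmentation-network outputs."""
    target = torch.full_like(confidence_map, 1.0 if from_real_label else 0.0)
    return F.binary_cross_entropy(confidence_map, target, reduction="sum")

# usage (training D only, segmentation output detached):
#   loss_d = discriminator_loss(D(seg_probs.detach()), from_real_label=False) \
#          + discriminator_loss(D(one_hot_labels), from_real_label=True)
```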
(2) Semantic segmentation network training
The segmentation network is trained by minimizing the loss function L_seg:

L_seg = L_ce + \lambda_{adv} L_{adv} + \lambda_{semi} L_{semi}

where L_ce, L_adv and L_semi denote the classification cross-entropy loss, the adversarial loss and the semi-supervised loss respectively, and λ_adv and λ_semi are two weights used to balance the terms of the loss function.

For labeled data, with input image X_n, one-hot encoding Y_n of its real label, and segmentation-network prediction S(T1, T2), the cross-entropy loss is

L_ce = -\sum_{h,w} \sum_{c \in C} Y_n^{(h,w,c)} \log\left(S(T1,T2)^{(h,w,c)}\right)

and the adversarial loss L_adv is

L_adv = -\sum_{h,w} \log\left(D(S(T1,T2))^{(h,w)}\right)

For unlabeled data, training is first performed with the labeled data, so that the segmentation network acquires preliminary segmentation capability and the discrimination network acquires basic discrimination capability, i.e., it can recognize the segmentation results produced by the segmentation network to a certain extent and generate a confidence map. The classification probability map obtained by passing the unlabeled data through the segmentation network is then input into the discrimination network, and the output confidence map is binarized to generate pseudo labels for self-training. Because of the pseudo labels, the corresponding segmentation loss and adversarial loss can be calculated for both labeled and unlabeled data. When the segmentation-network parameters are updated, the discrimination-network parameters are kept fixed; the segmentation-network parameters are updated through semi-supervised training for unlabeled data and in a fully supervised manner for labeled data. The resulting semi-supervised loss is defined as

L_semi = -\sum_{h,w} \sum_{c \in C} I\left(D(S(T1,T2))^{(h,w)} > T_{semi}\right) \hat{Y}_n^{(h,w,c)} \log\left(S(T1,T2)^{(h,w,c)}\right)

where T_semi is the threshold used to binarize the confidence map into 0 and 1. Let c* = argmax_c S(T1,T2)^{(h,w,c)}; when c = c*, \hat{Y}_n^{(h,w,c)} = 1, otherwise \hat{Y}_n^{(h,w,c)} = 0. I(·) is an indicator function used to select high-confidence pixel classifications and construct the pseudo labels for semi-supervised training.
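A sketch of the combined loss L_seg = L_ce + λ_adv·L_adv + λ_semi·L_semi with pseudo-label construction from the confidence map; the tensor shapes, weight values and masking convention are assumptions:

```python
import torch
import torch.nn.functional as F

def segmentation_loss(seg_probs, confidence_map, labels=None,
                      lambda_adv=0.01, lambda_semi=0.1, t_semi=0.2):
    """seg_probs: N x C x H x W class probabilities from the segmentation net;
    confidence_map: N x 1 x H x W discriminator output;
    labels: N x H x W ground-truth class indices, or None for unlabeled data."""
    eps = 1e-10
    # adversarial term: push the discriminator to rate the output as "real"
    loss_adv = -torch.log(confidence_map + eps).sum()

    if labels is not None:
        # fully supervised cross-entropy against the real labels
        loss_ce = F.nll_loss(torch.log(seg_probs + eps), labels, reduction="sum")
        return loss_ce + lambda_adv * loss_adv

    # semi-supervised term: pseudo labels from high-confidence pixels
    pseudo_labels = seg_probs.argmax(dim=1)               # c* = argmax_c S(...)
    mask = (confidence_map.squeeze(1) > t_semi).float()   # I(D(S(...)) > T_semi)
    log_probs = torch.log(seg_probs + eps)
    picked = log_probs.gather(1, pseudo_labels.unsqueeze(1)).squeeze(1)
    loss_semi = -(mask * picked).sum()
    return lambda_semi * loss_semi + lambda_adv * loss_adv
```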
In addition, in an embodiment, referring to fig. 6, step S160 of the embodiment shown in fig. 1 further includes, but is not limited to, the following steps:
step S610, when the change segmentation chart indicates that the first node image and the second node image are different, determining that the ceiling detection result is that the detection does not pass;
or,
and S620, when the change segmentation chart indicates that the first node image is the same as the second node image, determining that the ceiling detection result is that the detection is passed.
It should be noted that, since a ceiling is a type of building structure, it is usually static and of constant appearance. On this basis, when the change segmentation graph indicates that the node images acquired in two consecutive inspections are different, the ceiling has changed and needs to be checked in time; otherwise, the ceiling has not changed. In this way the efficiency of ceiling inspection is effectively improved.
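As a sketch of this decision rule; the change-ratio threshold is an illustrative assumption, since the patent only distinguishes "different" from "the same":

```python
import numpy as np

def ceiling_detection_result(change_map, changed_ratio_threshold=0.01):
    """change_map: H x W binary/float map from the GAN, where nonzero pixels
    mark regions that differ between the first and second node images."""
    changed_ratio = float(np.count_nonzero(change_map > 0.5)) / change_map.size
    if changed_ratio > changed_ratio_threshold:
        return "detection not passed"  # the ceiling has changed; check it promptly
    return "detection passed"          # no change between the two inspections
```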
In addition, referring to fig. 7, an embodiment of the present invention provides a ceiling detection apparatus based on unmanned aerial vehicle inspection, where the ceiling detection apparatus 700 includes
A routing inspection area determining unit 710 for determining a routing inspection area, which includes a plurality of routing inspection nodes;
the inspection unit 720 is used for determining an initial inspection path according to the plurality of inspection nodes, wherein each inspection node appears exactly once in the initial inspection path;
the path planning unit 730 is used for determining a target path with optimal energy consumption through a genetic algorithm by taking the initial routing inspection path as an initialization population of the genetic algorithm;
the image acquisition unit 740 is configured to perform at least two rounds of routing inspection according to the target path, and capture a node image at each routing inspection node in each routing inspection process;
the detection unit 750 is used for inputting at least two node images of each inspection node into a pre-trained generative adversarial network (GAN), and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph;
and a result judging unit 760 for determining the ceiling detection result of the routing inspection node according to the change division graph.
Additionally, referring to fig. 8, an embodiment of the present invention also provides a drone, the drone 800 including: memory 810, processor 820, and a computer program stored on memory 810 and executable on processor 820.
The processor 820 and memory 810 may be connected by a bus or other means.
Non-transitory software programs and instructions required to implement the drone inspection-based ceiling detection method of the above embodiment are stored in the memory 810, and when executed by the processor 820, perform the drone inspection-based ceiling detection method of the above embodiment, for example, performing the above-described method steps S110 to S160 in fig. 1, method steps S210 to S240 in fig. 2, method steps S310 to S320 in fig. 3, and method steps S610 to S620 in fig. 6.
The above-described embodiments of the apparatus are merely illustrative, wherein the units illustrated as separate components may or may not be physically separate, i.e. may be located in one place, or may also be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Furthermore, an embodiment of the present invention also provides a computer-readable storage medium, where the computer program is stored, and the computer program is executed by a processor or a controller, for example, by a processor in the above-mentioned unmanned aerial vehicle embodiment, so that the processor may execute the method for ceiling detection based on unmanned aerial vehicle inspection in the above-mentioned embodiment, for example, execute the above-mentioned method steps S110 to S160 in fig. 1, method steps S210 to S240 in fig. 2, method steps S310 to S320 in fig. 3, and method steps S610 to S620 in fig. 6. It will be understood by those of ordinary skill in the art that all or some of the steps, means, and/or steps of the methods disclosed above may be implemented as software, firmware, hardware, or any suitable combination thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer-readable storage media, which may include computer storage media (or non-transitory storage media) and communication storage media (or transitory storage media). The term computer storage media includes volatile and nonvolatile, removable and non-removable storage media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other storage medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication storage media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery storage media as is well known to those of ordinary skill in the art.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
While the preferred embodiments of the present invention have been described, the present invention is not limited to the above embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present invention, and such equivalent modifications or substitutions are to be included within the scope of the present invention defined by the appended claims.

Claims (10)

1. A ceiling detection method based on unmanned aerial vehicle routing inspection, characterized by comprising:
determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes;
determining an initial routing inspection path according to the routing inspection nodes, wherein the initial routing inspection path passes through each routing inspection node exactly once;
taking the initial routing inspection path as an initialization population of a genetic algorithm, and determining a target path with optimal energy consumption through the genetic algorithm;
performing routing inspection at least twice according to the target path, and shooting a node image at each routing inspection node in each routing inspection process;
inputting at least two node images of each inspection node into a pre-trained generative adversarial network (GAN), and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph;
and determining a ceiling detection result of the routing inspection node according to the change segmentation graph.
2. The unmanned aerial vehicle inspection tour based ceiling detection method according to claim 1, wherein the determining an optimal target path for energy consumption through a genetic algorithm comprises:
determining initial path energy consumption according to the initial routing inspection path;
acquiring a preset fitness function, and selecting a plurality of nodes to be adjusted from the routing inspection nodes according to the fitness function;
sequentially performing crossover and mutation on the nodes to be adjusted to obtain an intermediate routing inspection path;
and iterating according to the intermediate routing inspection path, and obtaining the target path after completing the preset number of iterations.
3. The ceiling detection method based on unmanned aerial vehicle inspection according to claim 2, wherein the initial path energy consumption is determined according to the initial inspection path and is obtained through the following formula:
E = \sum_{j=1}^{N+1} W t_j

wherein E is the initial path energy consumption, N is the number of routing inspection nodes, W is the load weight of the unmanned aerial vehicle, d_j is the distance of the j-th leg in the initial routing inspection path, j ∈ {1, 2, ..., N+1}, and t_j is the flight time required for each leg between two of the routing inspection nodes,

t_j = d_j / v

wherein v is the flight speed of the unmanned aerial vehicle.
4. The ceiling detection method based on unmanned aerial vehicle inspection according to claim 1, wherein the node images of each inspection node at least include a first node image and a second node image, the first node image and the second node image are shot at different times, and the shooting angles of the first node image and the second node image are the same.
5. The ceiling detection method based on unmanned aerial vehicle inspection according to claim 4, wherein the GAN includes a semantic segmentation network and a discrimination network, and the obtaining of the change segmentation map by performing semantic segmentation and change detection on at least two node images through the GAN includes:
inputting the first node image and the second node image into the semantic segmentation network for feature extraction to obtain a first semantic feature map and a second semantic feature map;
and inputting the first semantic feature map and the second semantic feature map into the discrimination network for change detection, and determining a confidence map output by the discrimination network as the change segmentation map.
6. The ceiling detection method based on unmanned aerial vehicle inspection according to claim 5, wherein: the semantic segmentation network and the discrimination network are obtained through training by minimizing a loss function.
7. The ceiling detection method based on unmanned aerial vehicle inspection according to claim 4, wherein determining the ceiling detection result of the inspection node according to the change segmentation graph comprises:
when the change segmentation graph indicates that the first node image and the second node image are different, determining that the ceiling detection result is that the detection is not passed;
or,
when the change segmentation graph indicates that the first node image is the same as the second node image, determining that the ceiling detection result is that the detection is passed.
8. A ceiling detection device based on unmanned aerial vehicle routing inspection, characterized by comprising:
the routing inspection area determining unit is used for determining a routing inspection area, wherein the routing inspection area comprises a plurality of routing inspection nodes;
the inspection unit is used for determining an initial routing inspection path according to the routing inspection nodes, wherein the initial routing inspection path passes through each routing inspection node exactly once;
the path planning unit is used for determining a target path with optimal energy consumption through a genetic algorithm by taking the initial routing inspection path as an initialization population of the genetic algorithm;
the image acquisition unit is used for carrying out at least two times of routing inspection according to the target path and shooting a node image at each routing inspection node in each routing inspection process;
the detection unit is used for inputting at least two node images of each inspection node into a pre-trained generative adversarial network (GAN), and performing semantic segmentation and change detection on the at least two node images through the GAN to obtain a change segmentation graph;
and the result judging unit is used for determining the ceiling detection result of the routing inspection node according to the change segmentation graph.
9. An unmanned aerial vehicle, comprising: memory, processor and computer program stored on the memory and executable on the processor, characterized in that the processor when executing the computer program implements the unmanned aerial vehicle inspection tour based ceiling detection method according to any of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, wherein the computer program is configured to execute the unmanned aerial vehicle inspection-based ceiling detection method according to any one of claims 1 to 7.
CN202210830549.8A 2022-07-15 2022-07-15 Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium Pending CN115345830A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210830549.8A CN115345830A (en) 2022-07-15 2022-07-15 Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210830549.8A CN115345830A (en) 2022-07-15 2022-07-15 Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium

Publications (1)

Publication Number Publication Date
CN115345830A (en) 2022-11-15

Family

ID=83947740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210830549.8A Pending CN115345830A (en) 2022-07-15 2022-07-15 Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium

Country Status (1)

Country Link
CN (1) CN115345830A (en)

Similar Documents

Publication Publication Date Title
CN110956651B (en) Terrain semantic perception method based on fusion of vision and vibrotactile sense
Liu et al. 3DCNN-DQN-RNN: A deep reinforcement learning framework for semantic parsing of large-scale 3D point clouds
Costea et al. Creating roadmaps in aerial images with generative adversarial networks and smoothing-based optimization
CN109636049B (en) Congestion index prediction method combining road network topological structure and semantic association
KR102328734B1 (en) Method for automatically evaluating labeling reliability of training images for use in deep learning network to analyze images, and reliability-evaluating device using the same
CN110633632A (en) Weak supervision combined target detection and semantic segmentation method based on loop guidance
CN111382686B (en) Lane line detection method based on semi-supervised generation confrontation network
CN108230291B (en) Object recognition system training method, object recognition method, device and electronic equipment
CN110889318A (en) Lane detection method and apparatus using CNN
CN110879961A (en) Lane detection method and apparatus using lane model
CN110059646A (en) The method and Target Searching Method of training action plan model
CN113781519A (en) Target tracking method and target tracking device
CN114708518A (en) Bolt defect detection method based on semi-supervised learning and priori knowledge embedding strategy
Yang et al. Toward country scale building detection with convolutional neural network using aerial images
CN113111814A (en) Regularization constraint-based semi-supervised pedestrian re-identification method and device
CN113129336A (en) End-to-end multi-vehicle tracking method, system and computer readable medium
CN116630801A (en) Remote sensing image weak supervision target detection method based on pseudo-instance soft label
CN111291785A (en) Target detection method, device, equipment and storage medium
CN115345830A (en) Ceiling detection method and device based on unmanned aerial vehicle routing inspection, unmanned aerial vehicle and medium
CN116311357A (en) Double-sided identification method for unbalanced bovine body data based on MBN-transducer model
CN115546668A (en) Marine organism detection method and device and unmanned aerial vehicle
CN114581819A (en) Video behavior identification method and system
CN116343132B (en) Complex scene power equipment defect identification method and device and computer equipment
Ogawa et al. Identifying Parking Lot Occupancy with YOLOv5
CN113705648B (en) Data processing method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination