CN118115902A - Unmanned aerial vehicle state detection image recognition method, device, equipment and medium - Google Patents

Unmanned aerial vehicle state detection image recognition method, device, equipment and medium

Info

Publication number
CN118115902A
CN118115902A (application number CN202410378740.2A)
Authority
CN
China
Prior art keywords
image
aerial vehicle
unmanned aerial
mask
recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410378740.2A
Other languages
Chinese (zh)
Inventor
张佳辉
吕陆
彭玉宾
赵铁钢
章超
张子扬
许桐浩
谢毅铭
齐帅
黄嘉鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Corp of China SGCC
State Grid Beijing Electric Power Co Ltd
Original Assignee
State Grid Corp of China SGCC
State Grid Beijing Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Corp of China SGCC, State Grid Beijing Electric Power Co Ltd filed Critical State Grid Corp of China SGCC
Priority to CN202410378740.2A priority Critical patent/CN118115902A/en
Publication of CN118115902A publication Critical patent/CN118115902A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/10 - Terrestrial scenes
    • G06V 20/17 - Terrestrial scenes taken from planes or by drones
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/20 - Image preprocessing
    • G06V 10/30 - Noise filtering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/40 - Extraction of image or video features
    • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the field of image detection and particularly discloses an unmanned aerial vehicle state detection image recognition method, device, equipment and medium. According to the method, the unmanned aerial vehicle acquires an image and transmits it to the cloud; the cloud receives and preprocesses the image; feature recognition and processing are then performed on the preprocessed image to obtain an image recognition result. By analysing the characteristics of line-discharge pictures in depth, automatic recognition of line-discharge images is achieved simply and effectively with image-processing techniques. Compared with a deep-learning algorithm, the method achieves the target function without a complex deep-learning model and saves a great deal of manpower and material cost. Recognition accuracy is high, reaching more than 98%, close to that of deep-learning recognition methods.

Description

Unmanned aerial vehicle state detection image recognition method, device, equipment and medium
Technical Field
The invention belongs to the field of image detection, and particularly relates to an unmanned aerial vehicle state detection image recognition method, device, equipment and medium.
Background
With the gradual spread of unmanned aerial vehicle technology in the power industry, the power-inspection profession is developing rapidly towards intelligence, efficiency and convenience. Unmanned aerial vehicle line inspection has the following advantages. First, strong real-time capability: the unmanned aerial vehicle can carry a high-definition camera and sensors, detect line defects and anomalies in real time, and transmit the data immediately, making problems easy to discover and resolve promptly. Second, high safety: line patrol by unmanned aerial vehicle can be conducted at high altitude or in complex environments, avoiding the safety risks that manual inspection may face. Third, low cost: the cost of unmanned aerial vehicle line patrol mainly comprises equipment purchase and maintenance and operator wages, which is lower than traditional manual inspection. Fourth, freedom from terrain constraints: the unmanned aerial vehicle can easily fly over complex terrain and dangerous areas for inspection, whereas manual inspection in such areas carries great safety hazards. Fifth, reusability: the unmanned aerial vehicle can be reused after completing a task, improving equipment utilisation. Sixth, intelligent management: unmanned aerial vehicle line patrol can be combined with an intelligent management platform to realise automatic data processing, analysis and early warning, improving management efficiency and accuracy.
At present, when the unmanned aerial vehicle autonomously patrols the line and performs state-detection image recognition, it carries a high-definition camera and sensors and flies along the power line under remote control or in autonomous flight mode. According to the flight settings, the captured images are uploaded in real time to a ground station or data centre for analysis and processing (or stored temporarily in onboard memory and copied off after the flight is finished). Ground-station or data-centre staff analyse the data returned by the unmanned aerial vehicle, judge whether the line has faults or potential safety hazards, and draw up corresponding maintenance and overhaul plans.
At present, defect recognition in unmanned aerial vehicle state-detection images relies mainly on manual inspection or on models built with deep-learning methods. Manual recognition suffers from unstable accuracy, high labour cost and similar problems. Model-based algorithms are efficient but require building a database and rely on hardware such as GPUs (export of top-tier GPUs to China is restricted), so they also consume a large amount of manpower and material resources.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle state detection image recognition method, device, equipment and medium, which are used for solving the problems of unstable accuracy and high labor cost of the existing unmanned aerial vehicle state detection image recognition method.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
The invention provides an unmanned aerial vehicle state detection image recognition method, which comprises the following steps:
The unmanned aerial vehicle acquires an image and transmits the image to the cloud;
The cloud receives the image and preprocesses the image;
And carrying out feature recognition and processing on the preprocessed image to obtain an image recognition result.
Further, the image transmission from the unmanned aerial vehicle to the cloud is specifically:
Setting a flight route for the unmanned aerial vehicle; the unmanned aerial vehicle carries a live-state detection device, patrols according to the preset route, and takes photographs when it reaches specified positions; after shooting is completed, the unmanned aerial vehicle transmits the images to the cloud in real time over a wireless link, or the captured images are stored on the unmanned aerial vehicle's built-in memory card and copied to the cloud for subsequent processing after the flight task is completed.
Further, after the cloud receives the image shot by the unmanned aerial vehicle, denoising is performed through a Gaussian filtering method, and mask processing is performed on the image after denoising, so that a preprocessed image is obtained.
Further, the mask processing specifically includes:
Converting the image from RGB format to HSV format, and processing the image by using masks with different colors;
in HSV format, when masking with blue, the blue mask parameters are: the value range of H is [100, 124], the value range of S is [43, 255], and the value range of V is [43, 255];
When the masking process is performed using yellow, the yellow masking parameters are: the value range of H is [26, 77], the value range of S is [43, 255], and the value range of V is [43, 255];
when masking is performed by adopting other colors, setting is performed according to the HSV value range.
Further, the performing feature recognition and processing on the preprocessed image to obtain an image recognition result specifically includes:
when the blue mask and the yellow mask are adopted to perform feature recognition and processing on the preprocessed image:
performing feature recognition on the image using the Hough circle transform;
and carrying out feature processing on the image subjected to feature recognition to obtain an image recognition result.
Further, performing feature recognition on the image using the Hough circle transform specifically comprises the following steps:
when the blue mask and the yellow mask are adopted to perform feature recognition and processing on the preprocessed image:
detecting the preprocessed image with the Hough circle transform to obtain the centre coordinates and radius of each characteristic circle;
setting the Hough circle transform parameters:
dp=1、minDist=1000、param1=10、param2=10、minRadius=1、maxRadius=100;
Wherein dp is the inverse ratio of the accumulator resolution to the image resolution; minDist is the minimum distance between detected circle centres; param1 is the high threshold of the Canny edge detector; param2 is the accumulator threshold, i.e. the minimum number of votes a candidate centre must receive; minRadius is the minimum radius of a detected circle; maxRadius is the maximum radius of a detected circle;
image feature values under the blue mask: Rb = {(xb_i, yb_i, rb_i) | i = 1, 2, …, n}, with 3n data values;
image feature values under the yellow mask: Ry = {(xy_j, yy_j, ry_j) | j = 1, 2, …, m}, with 3m data values.
Further, the performing feature processing on the image after feature recognition to obtain an image recognition result specifically includes:
judging the relative positions of the feature circles under the blue mask and the yellow mask;
when a yellow circle is contained within a blue circle, the image exhibits a discharge phenomenon; otherwise, the image contains no discharge information;
a yellow circle is entirely contained within a blue circle when the following condition holds:
d_ij < rb_i - ry_j
wherein d_ij is the centre-to-centre distance between the i-th circle under the blue mask and the j-th circle under the yellow mask; rb_i is the radius of the i-th circle under the blue mask; ry_j is the radius of the j-th circle under the yellow mask;
and when the judging condition is met, recognizing that the discharge phenomenon exists in the image, and obtaining an image recognition result.
In a second aspect of the present invention, there is provided an unmanned aerial vehicle state detection image recognition apparatus, including:
The image acquisition module is used for acquiring images through the unmanned aerial vehicle and transmitting the images to the cloud;
the preprocessing module is used for receiving the image through the cloud and preprocessing the image;
And the identification module is used for carrying out feature identification and processing on the preprocessed image to obtain an image identification result.
In a third aspect of the present invention, there is provided an electronic device comprising a processor and a memory, the processor being configured to execute a computer program stored in the memory to implement a method for identifying a status detection image of a drone as described in any one of the preceding claims.
In a fourth aspect of the present invention, there is provided a computer readable storage medium storing at least one instruction which when executed by a processor implements a method of unmanned aerial vehicle state detection image recognition as described in any one of the above.
The beneficial effects of the invention are as follows:
1. According to the method, the unmanned aerial vehicle acquires an image and transmits it to the cloud; the cloud receives and preprocesses the image; feature recognition and processing are then performed on the preprocessed image to obtain an image recognition result. By analysing the characteristics of line-discharge pictures in depth, automatic recognition of line-discharge images is achieved simply and effectively with image-processing techniques. Compared with a deep-learning algorithm, the method achieves the target function without a complex deep-learning model and saves a great deal of manpower and material cost. Recognition accuracy is high, reaching more than 98%, close to that of deep-learning recognition methods.
2. The invention has low hardware requirements: it does not need to run on a high-performance processor, since an ordinary processing unit meets the functional requirements, and a large amount of hardware cost can therefore be saved. Starting from the characteristics of the picture itself, the target function is achieved without building a picture library. Portability is strong: the implementation process is relatively simple and the program small, so the method can be widely applied on various hardware devices and can be mounted directly on an unmanned aerial vehicle for real-time recognition.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application. In the drawings:
Fig. 1 is a schematic flow chart of a method for recognizing a state detection image of an unmanned aerial vehicle;
fig. 2 is a state detection picture taken after the unmanned aerial vehicle is equipped with the state detection device;
FIG. 3 is a photograph of an image after filtering;
FIG. 4 is a blue mask processed image;
FIG. 5 is a graph showing the first blue mask feature recognition result without tuning optimization;
FIG. 6 is a graph showing the result of a second blue mask feature identification without tuning optimization;
FIG. 7 is a graph of blue mask feature recognition results after tuning optimization;
fig. 8 is a block diagram of a configuration of an unmanned aerial vehicle state detection image recognition device;
Fig. 9 is a block diagram of an electronic device.
Detailed Description
The application will be described in detail below with reference to the drawings in connection with embodiments. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
The following detailed description is exemplary and is intended to provide further details of the application. Unless defined otherwise, all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments in accordance with the application.
Example 1
As shown in fig. 1, an unmanned aerial vehicle state detection image recognition method includes:
s1: the unmanned aerial vehicle acquires an image and transmits the image to the cloud;
Setting a flight route for the unmanned aerial vehicle; the unmanned aerial vehicle carries a live-state detection device, patrols according to the preset route, and takes photographs when it reaches specified positions; after shooting is completed, the unmanned aerial vehicle can transmit the images to the cloud in real time over a wireless link, or the captured images are stored on the unmanned aerial vehicle's built-in memory card and copied to the cloud for subsequent processing after the flight task is completed;
During shooting, the unmanned aerial vehicle focuses automatically according to the shooting conditions, so that the captured pictures are clear and undistorted, as shown in fig. 2;
s2: the cloud receives the image and preprocesses the image;
specifically, after the cloud receives an image shot by the unmanned aerial vehicle, denoising is performed through a Gaussian filtering method, and then masking is performed on the image to obtain a preprocessed image;
s21: the image shot by the unmanned aerial vehicle usually contains some random noise, and for this purpose, gaussian filtering is used to denoise the image, so as to obtain a denoised image, as shown in fig. 3;
S22: performing mask processing (image thresholding) on the denoised image to obtain a preprocessed image;
In order to identify the discharge cloud pattern in an image shot by the unmanned aerial vehicle (if the line discharges, a cloud pattern appears in the image; see the boxed region in fig. 3), the image is processed with a mask method; the image after mask processing is shown in fig. 4. The cloud pattern usually contains at least two colors, and the invention adopts two masks, blue and yellow;
After obtaining the preprocessed image, it can be observed that an image with a discharge phenomenon contains one or more circles; the subsequent judgment method is based mainly on these characteristic circles. In fact, to obtain the discharge intensity at the discharge point, the live-detection apparatus usually focuses, so the cloud pattern presents a circular distribution.
As can be seen from the cloud-pattern analysis of fig. 2, the blue circle is the outermost layer and the yellow circle lies inside it; this characteristic is used to judge the discharge information in the image;
As can be seen from fig. 3, if the line discharges, the image taken by the drone will contain a cloud pattern, which typically contains at least two colors, blue and yellow. The color types are related to the discharge intensity: as the discharge intensity increases, the cloud pattern gradually gains further colors, such as green and red, as shown in fig. 3. For mask processing, the image is converted from RGB format to HSV format, and masks of different colors are then applied. The blue mask parameters can be obtained from a lookup table; in HSV format they are: H in [100, 124], S in [43, 255], V in [43, 255]. The yellow mask parameters are: H in [26, 77], S in [43, 255], V in [43, 255]. If masks of other colors are needed, they can be set according to the HSV value ranges. The image processed by the blue mask is shown in fig. 4.
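The HSV thresholding described above can be sketched directly in numpy so that the mask logic is explicit (in an OpenCV pipeline, cv2.cvtColor followed by cv2.inRange would be the usual route; the function name hsv_mask is ours, not the patent's):

```python
import numpy as np

# HSV ranges stated in the text (OpenCV convention: H in 0..179, S and V in 0..255)
BLUE_HSV = ((100, 43, 43), (124, 255, 255))
YELLOW_HSV = ((26, 43, 43), (77, 255, 255))

def hsv_mask(hsv, lower, upper):
    """Return a binary mask (0 or 255) marking pixels whose H, S and V
    all fall within [lower, upper]; `hsv` is an (H, W, 3) array."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    inside = np.all((hsv >= lower) & (hsv <= upper), axis=-1)
    return inside.astype(np.uint8) * 255
```

Applying hsv_mask with BLUE_HSV and YELLOW_HSV to the converted image yields the two masked images on which circle detection is then run.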
S3: performing feature recognition and processing on the preprocessed image to obtain an image recognition result;
In order to obtain the feature information of the blue and yellow masks (namely the characteristic-circle information), the image is recognised with the Hough circle transform, which yields the information of each characteristic circle (namely centre coordinates and radius). The relative positions of the characteristic circles under the blue and yellow masks are then judged: if a yellow circle is contained within a blue circle, the image is considered to exhibit a discharge phenomenon; otherwise the image contains no discharge information.
In particular, the method comprises the steps of,
S31: performing feature recognition on the preprocessed image to obtain all features of the blue and yellow masks;
As shown in fig. 4, a ring shape (in the blue frame) is present in the image after blue-mask processing, and this feature serves as an important basis for identifying whether the image contains discharge information (in fact, in live detection the apparatus must focus in order to determine the discharge intensity, in which case the cloud pattern becomes a circle). Therefore, to extract the characteristic ring in fig. 3, the invention detects it with the Hough circle transform;
By this method, the information of all circles in the blue-mask-processed image, namely the centre coordinates and radius of each circle, is obtained; this is called the image feature value under the blue mask: Rb = {(xb_i, yb_i, rb_i) | i = 1, 2, …, n}, with 3n data values;
Similarly, the image feature value under the yellow mask can be obtained: Ry = {(xy_j, yy_j, ry_j) | j = 1, 2, …, m}, with 3m data values;
In fact, when the image is detected with the Hough circle transform, improperly chosen parameters cause either a large amount of spurious circle information to be detected or nothing to be detected at all; figs. 5 and 6 are detection screenshots without tuning. To reduce the computational load and improve detection accuracy, the Hough circle transform parameters are tuned; one reasonable set of values is:
dp=1、minDist=1000、param1=10、param2=10、minRadius=1、maxRadius=100;
Wherein dp is the inverse ratio of the accumulator resolution to the image resolution; minDist is the minimum distance between detected circle centres; param1 is the high threshold of the Canny edge detector; param2 is the accumulator threshold, i.e. the minimum number of votes a candidate centre must receive; minRadius is the minimum radius of a detected circle; maxRadius is the maximum radius of a detected circle;
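For illustration, the tuned values above can be collected into a parameter table and handed to a Hough circle detector; OpenCV's cv2.HoughCircles is assumed as the underlying implementation, since the text does not name a library, and the function name detect_circles is ours:

```python
# Tuned Hough circle transform parameters as stated in the text.
HOUGH_PARAMS = dict(dp=1, minDist=1000, param1=10, param2=10,
                    minRadius=1, maxRadius=100)

def detect_circles(mask_img):
    """Detect characteristic circles in a single-channel mask image and
    return them as a list of (x, y, r) tuples."""
    import cv2  # OpenCV; imported lazily so the parameter table is usable without it
    found = cv2.HoughCircles(mask_img, cv2.HOUGH_GRADIENT, **HOUGH_PARAMS)
    if found is None:
        return []
    return [(float(x), float(y), float(r)) for x, y, r in found[0]]
```

The large minDist suppresses clusters of near-duplicate centres, which is what produces the noise circles of figs. 5 and 6 when the parameter is left small.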
Fig. 7 is the result of detecting the image with the above parameters. It can be seen that the target circles have been detected, and the feature value under the blue mask is obtained: (358.5, 270.5, 45.2); using the same parameters, the feature value under the yellow mask is obtained: (358.5, 269.5, 35.5);
Obtaining all features of the blue and yellow masks;
S32: performing feature processing on the image subjected to feature recognition to obtain an image recognition result;
After obtaining all the features of the blue and yellow masks, it is analysed whether a yellow circle is contained within a blue circle; if so, a discharge cloud pattern exists in the image, i.e. the inspected line section exhibits a discharge phenomenon;
The specific judging process is as follows:
A yellow circle is entirely contained within a blue circle when the following condition holds:
d_ij < rb_i - ry_j
wherein d_ij is the centre-to-centre distance between the i-th circle under the blue mask and the j-th circle under the yellow mask; rb_i is the radius of the i-th circle under the blue mask; ry_j is the radius of the j-th circle under the yellow mask;
In fact, once the Hough circle-detection parameters have been tuned, the circles detected under the two masks are very limited, so the inequality above needs to be evaluated only a few times; this also demonstrates the necessity of tuning the Hough circle-detection parameters;
When the inequality is satisfied by the blue-mask and yellow-mask data obtained above, the image is considered to contain a cloud pattern, i.e. a discharge phenomenon exists in the image:
and when the judging condition is met, recognizing that the discharge phenomenon exists in the image, and obtaining an image recognition result.
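The containment test d_ij < rb_i - ry_j can be sketched in a few lines of plain Python (function names are ours; the feature tuples follow the (x, y, r) form used above):

```python
import math

def yellow_inside_blue(blue, yellow):
    """True when the yellow circle lies entirely inside the blue circle,
    i.e. centre distance d < rb - ry. Each circle is an (x, y, r) tuple."""
    d = math.hypot(blue[0] - yellow[0], blue[1] - yellow[1])
    return d < blue[2] - yellow[2]

def discharge_detected(blue_circles, yellow_circles):
    """The image contains a discharge cloud pattern iff some yellow
    circle is contained within some blue circle."""
    return any(yellow_inside_blue(b, y)
               for b in blue_circles for y in yellow_circles)
```

With the feature values recovered in fig. 7, discharge_detected([(358.5, 270.5, 45.2)], [(358.5, 269.5, 35.5)]) evaluates the inequality 1.0 < 9.7 and reports a discharge.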
Example 2
As shown in fig. 8, based on the same inventive concept as the above embodiment, the present invention also provides an unmanned aerial vehicle state detection image recognition apparatus, including:
The image acquisition module is used for acquiring images through the unmanned aerial vehicle and transmitting the images to the cloud;
the preprocessing module is used for receiving the image through the cloud and preprocessing the image;
And the identification module is used for carrying out feature identification and processing on the preprocessed image to obtain an image identification result.
Example 3
As shown in fig. 9, the present invention further provides an electronic device 100 for implementing an unmanned aerial vehicle state detection image recognition method;
The electronic device 100 comprises a memory 101, at least one processor 102, a computer program 103 stored in the memory 101 and executable on the at least one processor 102, and at least one communication bus 104.
The memory 101 may be used to store a computer program 103, and the processor 102 implements a unmanned aerial vehicle state detection image recognition method step of embodiment 1 by running or executing the computer program stored in the memory 101 and invoking data stored in the memory 101.
The memory 101 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and application programs required for at least one function (such as a sound-playing function or an image-playing function); the data storage area may store data created according to the use of the electronic device 100 (such as audio data). In addition, the memory 101 may include non-volatile memory, such as a hard disk, memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one magnetic-disk storage device, a flash-memory device, or another non-volatile solid-state storage device.
The at least one processor 102 may be a central processing unit (CPU), but may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. The processor 102 may be a microprocessor or any conventional processor; it is the control centre of the electronic device 100, connecting the various parts of the whole electronic device 100 through various interfaces and lines.
The memory 101 in the electronic device 100 stores a plurality of instructions to implement a method for identifying a status detection image of a drone, the processor 102 may execute the plurality of instructions to implement:
The unmanned aerial vehicle acquires an image and transmits the image to the cloud;
The cloud receives the image and preprocesses the image;
performing feature recognition and processing on the preprocessed image to obtain an image recognition result;
Example 4
The modules/units integrated with the electronic device 100 may be stored in a computer readable storage medium if implemented in the form of software functional units and sold or used as a stand alone product. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each method embodiment described above may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, executable files or in some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, and a Read-Only Memory (ROM).
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the description of the present specification, the descriptions of the terms "one embodiment," "example," "specific example," and the like, mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the above embodiments, those of ordinary skill in the art will understand that modifications and equivalent substitutions may be made to the specific embodiments without departing from the spirit and scope of the invention, all of which are intended to be covered by the claims.

Claims (10)

1. The unmanned aerial vehicle state detection image recognition method is characterized by comprising the following steps of:
The unmanned aerial vehicle acquires an image and transmits the image to the cloud;
The cloud receives the image and preprocesses the image;
And carrying out feature recognition and processing on the preprocessed image to obtain an image recognition result.
2. The unmanned aerial vehicle state detection image recognition method according to claim 1, wherein the unmanned aerial vehicle acquiring an image and transmitting it to the cloud specifically comprises:
setting a flight route for the unmanned aerial vehicle; the unmanned aerial vehicle, carrying a charged state detection device, performs inspection along the preset route and shoots when it reaches a designated position; after shooting is completed, the unmanned aerial vehicle either transmits the image to the cloud in real time via a wireless signal, or stores the captured image on a built-in storage card and copies it to the cloud for subsequent processing after the flight task is completed.
3. The unmanned aerial vehicle state detection image recognition method according to claim 1, wherein the cloud receives the image captured by the unmanned aerial vehicle, denoises it by Gaussian filtering, and performs mask processing on the denoised image to obtain the preprocessed image.
4. A method for identifying a status detection image of an unmanned aerial vehicle according to claim 3, wherein the masking process specifically comprises:
Converting the image from RGB format to HSV format, and processing the image by using masks with different colors;
in HSV format, when masking with blue, the blue mask parameters are: the value range of H is [100, 124], the value range of S is [43, 255], and the value range of V is [43, 255];
when masking with yellow, the yellow mask parameters are: the value range of H is [26, 77], the value range of S is [43, 255], and the value range of V is [43, 255];
when masking with another color, the parameters are set according to the corresponding HSV value range.
5. The method for recognizing the unmanned aerial vehicle state detection image according to claim 1, wherein the performing feature recognition and processing on the preprocessed image to obtain the image recognition result specifically comprises:
when the blue mask and the yellow mask are adopted to perform feature recognition and processing on the preprocessed image:
carrying out feature recognition on the image by using the Hough circle transform;
and carrying out feature processing on the image subjected to feature recognition to obtain an image recognition result.
6. The unmanned aerial vehicle state detection image recognition method according to claim 5, wherein the performing feature recognition on the image by using the Hough circle transform specifically comprises:
when the blue mask and the yellow mask are adopted to perform feature recognition and processing on the preprocessed image:
detecting the preprocessed image with the Hough circle transform to obtain the center coordinates and radius information of the feature circles;
setting the Hough circle transform parameters:
dp=1, minDist=1000, param1=10, param2=10, minRadius=1, maxRadius=100;
wherein dp is the accumulator resolution; minDist is the minimum distance between circle centers; param1 is the high threshold of the Canny edge detector; param2 is the number of accumulator votes a candidate center must receive; minRadius is the minimum radius of circles to detect; maxRadius is the maximum radius of circles to detect;
image feature values under the blue mask: Rb = {(xb_i, yb_i, rb_i) | i = 1, 2, …, n}, with a data amount of 3n;
image feature values under the yellow mask: Ry = {(xy_j, yy_j, ry_j) | j = 1, 2, …, m}, with a data amount of 3m.
7. The method for recognizing the unmanned aerial vehicle state detection image according to claim 6, wherein the feature processing is performed on the image after feature recognition, and the obtaining of the image recognition result specifically comprises:
judging the relative positions of the feature circles under the blue mask and the yellow mask;
when a yellow circle is contained within a blue circle, the image exhibits a discharge phenomenon; otherwise, the image does not contain discharge information;
whether a yellow circle is fully contained within a blue circle is judged on the following basis:
d_ij < rb_i - ry_j
wherein d_ij is the distance between the centers of the ith circle under the blue mask and the jth circle under the yellow mask; rb_i is the radius of the ith circle under the blue mask; ry_j is the radius of the jth circle under the yellow mask;
and when the judging condition is met, recognizing that the discharge phenomenon exists in the image, and obtaining an image recognition result.
8. An unmanned aerial vehicle state detection image recognition device, characterized by comprising:
The image acquisition module is used for acquiring images through the unmanned aerial vehicle and transmitting the images to the cloud;
the preprocessing module is used for receiving the image through the cloud and preprocessing the image;
And the identification module is used for carrying out feature identification and processing on the preprocessed image to obtain an image identification result.
9. An electronic device comprising a processor and a memory, the processor being configured to execute a computer program stored in the memory to implement a method of unmanned aerial vehicle state detection image recognition as claimed in any one of claims 1 to 7.
10. A computer readable storage medium storing at least one instruction that when executed by a processor implements a method of unmanned aerial vehicle condition detection image recognition according to any of claims 1 to 7.
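The Gaussian-filter denoising step of claim 3 can be sketched in pure Python (an illustrative reconstruction, not part of the patent disclosure; in practice OpenCV's cv2.GaussianBlur would typically be used, and the kernel size and sigma below are assumed values, since the claim does not specify them):

```python
import math

def gaussian_kernel(size=5, sigma=1.0):
    # Normalized 2-D Gaussian kernel, as used for image denoising.
    half = size // 2
    k = [[math.exp(-(x * x + y * y) / (2 * sigma * sigma))
          for x in range(-half, half + 1)] for y in range(-half, half + 1)]
    total = sum(map(sum, k))
    return [[v / total for v in row] for row in k]

def gaussian_blur(img, size=5, sigma=1.0):
    # Convolve a grayscale image (list of rows) with the kernel,
    # replicating edge pixels at the border.
    k = gaussian_kernel(size, sigma)
    half = size // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(-half, half + 1):
                for dx in range(-half, half + 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx] * k[dy + half][dx + half]
            out[y][x] = acc
    return out
```

Because the kernel is normalized, blurring spreads a noisy spike over its neighborhood without changing the total brightness of an interior region.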
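The HSV masking of claim 4 reduces to a per-pixel range test against the stated (H, S, V) bounds. A minimal sketch with hypothetical helper names (the usual OpenCV route would be cv2.cvtColor followed by cv2.inRange):

```python
# HSV ranges from claim 4 (OpenCV-style H in [0, 179], S and V in [0, 255]).
BLUE = ((100, 43, 43), (124, 255, 255))
YELLOW = ((26, 43, 43), (77, 255, 255))

def in_hsv_range(h, s, v, lo, hi):
    # True when the pixel falls inside the mask's HSV box.
    return (lo[0] <= h <= hi[0] and
            lo[1] <= s <= hi[1] and
            lo[2] <= v <= hi[2])

def mask_image(hsv_pixels, lo, hi):
    # Binary mask over a list-of-rows image of (h, s, v) tuples.
    return [[1 if in_hsv_range(h, s, v, lo, hi) else 0
             for (h, s, v) in row] for row in hsv_pixels]
```

Other colors are handled, as the claim states, by substituting the corresponding HSV bounds.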
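The parameters listed in claim 6 (dp, minDist, param1, param2, minRadius, maxRadius) match the signature of OpenCV's cv2.HoughCircles. The voting idea behind the transform can be sketched in pure Python for a single known radius (a toy reconstruction under that simplifying assumption, not the patent's implementation):

```python
import math

def hough_circle_centers(edge_points, radius, width, height):
    # Each edge point votes for every grid cell lying `radius` away
    # from it; the cell with the most votes is the most likely center.
    acc = {}
    for (x, y) in edge_points:
        for theta_deg in range(0, 360, 5):
            t = math.radians(theta_deg)
            a = round(x - radius * math.cos(t))
            b = round(y - radius * math.sin(t))
            if 0 <= a < width and 0 <= b < height:
                acc[(a, b)] = acc.get((a, b), 0) + 1
    # Return the best-voted center (param2 in cv2.HoughCircles plays
    # the role of a minimum-vote threshold on this accumulator).
    return max(acc, key=acc.get)
```

cv2.HoughCircles generalizes this by scanning the radius range [minRadius, maxRadius], taking edge points from a Canny detector whose high threshold is param1, and suppressing centers closer than minDist.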
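The discharge criterion of claim 7 (d_ij < rb_i - ry_j, i.e. a yellow circle lying entirely inside a blue circle) follows directly from the detected (x, y, r) triples; the function name below is illustrative:

```python
import math

def discharge_detected(blue_circles, yellow_circles):
    # blue_circles / yellow_circles: (x, y, r) triples, e.g. the sets
    # Rb and Ry produced by the Hough detection step.
    for (xb, yb, rb) in blue_circles:
        for (xy, yy, ry) in yellow_circles:
            d = math.hypot(xb - xy, yb - yy)  # center distance d_ij
            if d < rb - ry:                   # yellow strictly inside blue
                return True
    return False
```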
CN202410378740.2A 2024-03-29 2024-03-29 Unmanned aerial vehicle state detection image recognition method, device, equipment and medium Pending CN118115902A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410378740.2A CN118115902A (en) 2024-03-29 2024-03-29 Unmanned aerial vehicle state detection image recognition method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410378740.2A CN118115902A (en) 2024-03-29 2024-03-29 Unmanned aerial vehicle state detection image recognition method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN118115902A true CN118115902A (en) 2024-05-31

Family

ID=91208604

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410378740.2A Pending CN118115902A (en) 2024-03-29 2024-03-29 Unmanned aerial vehicle state detection image recognition method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN118115902A (en)

Similar Documents

Publication Publication Date Title
TWI709109B (en) A non-transitory computer-readable medium and system for detecting objects from aerial imagery and a method for detecting objects from aerial imagery
KR101854554B1 (en) Method, device and storage medium for calculating building height
JP6710426B2 (en) Obstacle detection method and device
US10824885B2 (en) Method and apparatus for detecting braking behavior of front vehicle of autonomous vehicle
CN106851229B (en) Security and protection intelligent decision method and system based on image recognition
CN108961276B (en) Distribution line inspection data automatic acquisition method and system based on visual servo
CN105572541A (en) High-voltage line patrol fault detection method and system based on visual attention mechanism
CN102298698A (en) Remote sensing image airplane detection method based on fusion of angle points and edge information
CN104601956A (en) Power transmission line online monitoring system and method based on fixed-wing unmanned aerial vehicle
CN114445440A (en) Obstacle identification method applied to self-walking equipment and self-walking equipment
CN110046584B (en) Road crack detection device and detection method based on unmanned aerial vehicle inspection
CN109297978B (en) Binocular imaging-based power line unmanned aerial vehicle inspection and defect intelligent diagnosis system
CN111340833B (en) Power transmission line extraction method for least square interference-free random Hough transformation
CN111192326A (en) Method and system for visually identifying direct-current charging socket of electric automobile
CN106251337A (en) A kind of drogue space-location method and system
CN112578405A (en) Method and system for removing ground based on laser radar point cloud data
CN110782484A (en) Unmanned aerial vehicle video personnel identification and tracking method
CN110084587B (en) Automatic dinner plate settlement method based on edge context
CN118115902A (en) Unmanned aerial vehicle state detection image recognition method, device, equipment and medium
CN117197700A (en) Intelligent unmanned inspection contact net defect identification system
CN108335308A (en) A kind of orange automatic testing method, system and intelligent robot retail terminal
CN104899854A (en) Detection method and detection device of grain piling height line
CN111985497B (en) Crane operation identification method and system under overhead transmission line
CN112016418B (en) Secant recognition method and device, electronic equipment and storage medium
CN204884183U (en) Parking area bluetooth WIFI read head and cell -phone APP assist license plate recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination