CN114937027A - Fan blade defect detection method and device, electronic equipment and storage medium - Google Patents

Fan blade defect detection method and device, electronic equipment and storage medium

Info

Publication number
CN114937027A
Authority
CN
China
Prior art keywords
image
fan blade
structural similarity
mean
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210700976.4A
Other languages
Chinese (zh)
Other versions
CN114937027B (en)
Inventor
刘海莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innovation Wisdom Shanghai Technology Co ltd
Original Assignee
Innovation Wisdom Shanghai Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innovation Wisdom Shanghai Technology Co ltd filed Critical Innovation Wisdom Shanghai Technology Co ltd
Priority to CN202210700976.4A priority Critical patent/CN114937027B/en
Publication of CN114937027A publication Critical patent/CN114937027A/en
Application granted granted Critical
Publication of CN114937027B publication Critical patent/CN114937027B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E10/00Energy generation through renewable energy sources
    • Y02E10/70Wind energy
    • Y02E10/72Wind turbines with rotation axis in wind direction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application belongs to the technical field of image processing, and discloses a method, a device, electronic equipment and a storage medium for detecting the defects of a fan blade, wherein the method comprises the steps of segmenting a local image of the fan blade from an image to be detected; inputting a local image of the fan blade into an image restoration model to obtain a fan blade restoration image, wherein the image restoration model is used for filling a missing part of the fan blade; determining the structural similarity between the local image of the fan blade and the restored image of the fan blade; and determining the detection result of the fan blade defects according to the structural similarity. Therefore, the complex steps of detection are simplified, the consumed labor cost and time cost are reduced, and the detection efficiency and accuracy are improved.

Description

Fan blade defect detection method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of image processing, in particular to a method and a device for detecting defects of fan blades, electronic equipment and a storage medium.
Background
The traditional wind power blade fault inspection mode is mainly visual inspection, including high-power telescope inspection, high-altitude rope-descent visual inspection and blade maintenance platform inspection; it has the drawbacks of long detection time, high cost, complex operation steps and low efficiency, and is therefore not suitable for daily inspection.
Therefore, a method for detecting defects of a fan blade is needed to reduce the complexity and cost of detecting the fan blade and improve the detection efficiency.
Disclosure of Invention
The embodiment of the application aims to provide a method and a device for detecting the defects of the fan blade, electronic equipment and a storage medium, which are used for reducing the complexity and cost of fan blade detection and improving the detection efficiency when the fan blade is detected.
In one aspect, a method for detecting a defect of a fan blade is provided, which includes:
segmenting a local image of the fan blade from an image to be detected;
inputting a local image of the fan blade into an image restoration model to obtain a fan blade restoration image, wherein the image restoration model is used for filling a missing part of the fan blade;
determining the structural similarity between the local image of the fan blade and the restored image of the fan blade;
and determining the detection result of the fan blade defects according to the structural similarity.
In the implementation process, the fan blade restoration image is obtained through the image restoration model, the detection result is obtained through the structural similarity between the images, the complex steps of detection are simplified, the consumed labor cost and time cost are reduced, and the detection efficiency and accuracy are improved.
In one embodiment, segmenting a local image of a fan blade from an image to be detected includes:
generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position;
adjusting the distance lower than the distance threshold value to a set distance from the foreground depth map to obtain a background depth map;
generating a background depth map of the image to be detected based on the distance between each pixel in the background of the image to be detected and the target position;
obtaining a subtraction image based on a difference between the background depth map and the foreground depth map;
and extracting a local image of the fan blade based on the subtraction image.
In the implementation process, the local image of the fan blade is extracted through the depth map, and the detected data processing amount is reduced.
In one embodiment, extracting a local image of the fan blade based on the subtraction image includes:
carrying out binarization on the subtraction image to obtain a binarized image;
and segmenting a local image of the fan blade from the binary image.
In the implementation process, the edge extraction effect is optimized through binarization.
In one embodiment, segmenting a local image of the fan blade from the binarized image includes:
filtering the binarized image by adopting particle filtering to obtain a filtered image;
and segmenting a local image of the fan blade from the filtering image.
In the above implementation process, particle filtering can fill in partial particles and remove interference.
In one embodiment, determining structural similarity between a local image of a fan blade and a restored image of the fan blade includes:
determining a first mean value and a first variance of a first pixel value of each pixel in a local image of the fan blade;
determining a second mean value and a second variance of a second pixel value of each pixel in the fan blade restoration image;
determining a covariance between each first pixel value and each second pixel value;
determining a product between the first mean value and the second mean value to obtain a mean value product;
and obtaining structural similarity based on the mean product, the covariance, the first variance, the second variance, the first mean and the second mean, wherein the structural similarity is positively correlated with the mean product and the covariance and negatively correlated with the first variance, the second variance, the square of the first mean and the square of the second mean.
In the implementation process, the similarity between the images can be determined through the structure of the fan blade.
In one embodiment, determining a fan blade defect detection result according to structural similarity includes:
if the structural similarity is higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade is normal;
and if the structural similarity is not higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade has defects.
In the implementation process, whether defects exist is judged through structural similarity between the images, and the detection accuracy is improved.
In one embodiment, the method further comprises:
and if the fan blade is determined to have defects based on the fan blade defect detection result, comparing the local image of the fan blade with the fan blade restoration image to obtain a defect area of the fan blade.
In the above implementation, an accurate defect region can be obtained.
In one aspect, a device for detecting defects of a fan blade is provided, which includes:
the segmentation unit is used for segmenting a local image of the fan blade from the image to be detected;
the fan blade restoration method comprises a restoration unit, a calculation unit and a calculation unit, wherein the restoration unit is used for inputting a local image of a fan blade into an image restoration model to obtain a fan blade restoration image, and the image restoration model is used for filling a missing part of the fan blade;
the matching unit is used for determining the structural similarity between the local image of the fan blade and the restored image of the fan blade;
and the detection unit is used for determining the detection result of the fan blade defects according to the structural similarity.
In one embodiment, the segmentation unit is configured to:
generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position;
adjusting the distance lower than the distance threshold value to a set distance from the foreground depth map to obtain a background depth map;
generating a background depth map of the image to be detected based on the distance between each pixel in the background of the image to be detected and the target position;
obtaining a subtraction image based on a difference between the background depth map and the foreground depth map;
and extracting a local image of the fan blade based on the subtraction image.
In one embodiment, the segmentation unit is configured to:
binarizing the subtraction image to obtain a binarized image;
and segmenting a local image of the fan blade from the binarized image.
In one embodiment, the segmentation unit is configured to:
filtering the binarized image by adopting particle filtering to obtain a filtered image;
and segmenting a local image of the fan blade from the filtering image.
In one embodiment, the matching unit is configured to:
determining a first mean value and a first variance of a first pixel value of each pixel in a local image of the fan blade;
determining a second mean value and a second variance of a second pixel value of each pixel in the fan blade restoration image;
determining a covariance between each first pixel value and each second pixel value;
determining a product between the first mean value and the second mean value to obtain a mean value product;
and obtaining structural similarity based on the mean product, the covariance, the first variance, the second variance, the first mean and the second mean, wherein the structural similarity is positively correlated with the mean product and the covariance, and is negatively correlated with the first variance, the second variance, the square of the first mean and the square of the second mean.
In one embodiment, the detection unit is configured to:
if the structural similarity is higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade is normal;
and if the structural similarity is not higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade has defects.
In one embodiment, the detection unit is further configured to:
and if the fan blade is determined to have defects based on the fan blade defect detection result, comparing the local image of the fan blade with the fan blade restoration image to obtain a defect area of the fan blade.
In one aspect, an electronic device is provided, comprising a processor and a memory, the memory storing computer readable instructions which, when executed by the processor, perform the steps of the method provided in any of the various alternative implementations of fan blade defect detection described above.
In one aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, performs the steps of the method as provided in the various alternative implementations of fan blade defect detection as described in any of the above.
In one aspect, a computer program product is provided, which when run on a computer causes the computer to perform the steps of the method as provided in any of the various alternative implementations of fan blade defect detection described above.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
FIG. 1 is a flow chart of a method for detecting a defect in a fan blade according to an embodiment of the present disclosure;
fig. 2 is an exemplary diagram of a foreground depth map provided in an embodiment of the present application;
fig. 3 is an exemplary diagram of a background depth map provided in an embodiment of the present application;
FIG. 4 is an exemplary diagram of a defect image provided by an embodiment of the present application;
fig. 5 is an example of a restored image according to an embodiment of the present disclosure;
FIG. 6 is a diagram illustrating an exemplary defect in a fan blade according to an embodiment of the present disclosure;
fig. 7 is a flowchart of an implementation of an image segmentation method according to an embodiment of the present application;
FIG. 8 is a block diagram of a device for detecting defects of a fan blade according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined or explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Some terms referred to in the embodiments of the present application will be described first to facilitate understanding by those skilled in the art.
The terminal equipment: may be a mobile terminal, a fixed terminal, or a portable terminal such as a mobile handset, station, unit, device, multimedia computer, multimedia tablet, internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system device, personal navigation device, personal digital assistant, audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination thereof, including the accessories and peripherals of these devices, or any combination thereof. It is also contemplated that the terminal device can support any type of interface to the user (e.g., wearable device), and the like.
A server: may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data and artificial intelligence platforms.
In order to reduce the complexity and cost of fan blade detection and improve the detection efficiency when detecting the fan blade, the embodiment of the application provides a method and a device for detecting the defect of the fan blade, an electronic device and a storage medium.
Referring to fig. 1, a flowchart of a method for detecting a fan blade defect according to an embodiment of the present disclosure is shown, and the method may be applied to an electronic device, where the electronic device may be a server or a terminal device. The specific implementation flow of the method is as follows:
step 100: segmenting a local image of the fan blade from the image to be detected; step 101: inputting a local image of the fan blade into an image restoration model to obtain a fan blade restoration image, wherein the image restoration model is used for filling a missing part of the fan blade; step 102: determining the structural similarity between the local image of the fan blade and the restored image of the fan blade; step 103: and determining the detection result of the fan blade defects according to the structural similarity.
Optionally, the image to be detected may be stored locally by the electronic device, may also be sent by other devices, and may also be a video frame selected from a video.
In one embodiment, a fan blade is monitored to obtain a monitoring video, and a video frame at the current moment is captured from the monitoring video and used as an image to be detected.
In order to reduce the subsequent data processing amount, the implementation process of segmenting the local image of the fan blade from the image to be detected in step 100 may include: s1001: generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position; s1002: from the foreground depth map, adjusting a distance lower than a distance threshold (optionally, the distance threshold may be set according to an actual application scene, and is not limited herein) to a set distance (optionally, the set distance may be set according to the actual application scene, and is not limited herein), and obtaining a background depth map; s1003: obtaining a subtraction image based on a difference between the background depth map and the foreground depth map; s1004: and extracting a local image of the fan blade based on the subtraction image.
In one embodiment, the implementation process of S1001 may include: determining the distance between each pixel in the image to be detected and the camera (namely, the target position), and generating a foreground depth map based on the distance of each pixel.
It should be noted that in a depth map, each pixel value represents the distance between the corresponding point in the scene and the camera.
Fig. 2 shows an example of a foreground depth map. Fig. 3 is an exemplary diagram of a background depth map. Fig. 2 and Fig. 3 differ in the pixel values of the foreground portion (e.g., the fan blade). It should be noted that Fig. 2 and Fig. 3 are only used to illustrate the depths of different areas; if any lines in Fig. 2 and Fig. 3 are unclear, this does not affect the clarity of the description.
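As an illustrative, non-limiting example, the synthesis of the background depth map and the subtraction image in S1002 and S1003 can be sketched as follows. The Python/NumPy code, the function name blade_foreground_mask and the example threshold and set-distance values are assumptions made for illustration only and are not part of the original disclosure.

```python
import numpy as np

def blade_foreground_mask(foreground_depth, distance_threshold, set_distance):
    """Sketch of S1002-S1003: synthesise a background depth map and subtract.

    Pixels nearer than distance_threshold (assumed to belong to the blade)
    are pushed to a fixed far distance, so the difference between the two
    maps is non-zero only over the foreground (blade) region.
    """
    # S1002: replace distances below the threshold with the set distance.
    background_depth = np.where(foreground_depth < distance_threshold,
                                set_distance, foreground_depth)
    # S1003: the subtraction image is the difference between the two maps.
    subtraction = np.abs(background_depth - foreground_depth)
    return subtraction

# Usage with a dummy depth map (values in metres; the blade is assumed to lie
# closer than 30 m, and both numeric values are placeholders).
depth = np.random.uniform(10.0, 60.0, size=(480, 640)).astype(np.float32)
subtraction = blade_foreground_mask(depth, distance_threshold=30.0, set_distance=60.0)
```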
Further, to optimize the edge extraction effect, the implementation process of extracting the local image of the fan blade based on the subtraction image in S1004 may include: binarizing the subtraction image to obtain a binarized image; and segmenting a local image of the fan blade from the binarized image.
Further, in order to fill in partial particles and remove interference, the implementation process of segmenting the local image of the fan blade from the binarized image in S1004 may include: filtering the binarized image by adopting particle filtering to obtain a filtered image; and segmenting a local image of the fan blade from the filtered image.
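As an illustrative, non-limiting example, the binarization and particle-filtering steps can be sketched as follows. Here "particle filtering" is interpreted as removing small connected components and filling small holes; the use of OpenCV, Otsu thresholding, morphological closing and the min_area value are assumptions for illustration only, not a quotation of the original method.

```python
import cv2
import numpy as np

def binarize_and_filter(subtraction, min_area=500):
    """Sketch of the binarization and particle-filtering steps.

    The subtraction image is binarized, small gaps inside the blade region
    are filled, and small isolated "particles" (connected components below
    min_area pixels) are removed as interference.
    """
    # Binarize the subtraction image (Otsu picks the threshold automatically).
    sub8 = cv2.normalize(subtraction, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, binary = cv2.threshold(sub8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Fill small gaps inside the blade region.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    # Remove small connected components (interference).
    n, labels, stats, _ = cv2.connectedComponentsWithStats(closed, connectivity=8)
    filtered = np.zeros_like(closed)
    for i in range(1, n):
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            filtered[labels == i] = 255
    return filtered
```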
As an example, the image restoration model in step 101 may be based on partial convolution (PConv) image restoration.
Image restoration fills in the missing part of a target object in an image, so that the result is visually and semantically plausible. As an example, referring to Fig. 4, an exemplary diagram of a defect image is shown. An example of a restored image is shown in Fig. 5. The image in Fig. 4 has a missing portion, and this missing portion is filled in by the image restoration method to obtain the restored image shown in Fig. 5. It should be noted that Fig. 4 and Fig. 5 are only used to illustrate an image with a defect and the image after restoration; if any lines in Fig. 4 and Fig. 5 are unclear, this does not affect the clarity of the description.
In the embodiment of the application, the image restoration model is trained on a large number of normal fan blade samples, so that the model captures the characteristics of a normal fan blade; pixel-level abnormalities of the fan blade (for example, blade carbonization) are then exposed by comparing the original image with the restored image, thereby realizing the restoration of a defective fan blade image. For actual deployment, the image restoration model can be trained simply by collecting a large number of easily obtained samples of the fan blade in its normal state.
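As an illustrative, non-limiting example, the core building block of a PConv-style restoration network (a partial convolution layer, after Liu et al.) can be sketched as follows. The PyTorch implementation, the class name PartialConv2d and all hyperparameters are assumptions for illustration; the application only names PConv-based restoration and does not provide an implementation. A full inpainting network would stack such layers in an encoder-decoder and, as described above, be trained only on images of normal blades.

```python
import torch
import torch.nn.functional as F
from torch import nn

class PartialConv2d(nn.Module):
    """Minimal partial-convolution layer.

    The convolution is computed only over valid (mask == 1) pixels and is
    re-normalised by the fraction of valid pixels in each window; the mask
    is then updated so that any window containing at least one valid pixel
    becomes valid for the next layer.
    """
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding, bias=True)
        # Fixed all-ones kernel used to count valid pixels per window.
        self.register_buffer("mask_kernel",
                             torch.ones(1, 1, kernel_size, kernel_size))
        self.stride, self.padding = stride, padding

    def forward(self, x, mask):
        # mask: 1 where the pixel is known, 0 where it is missing, shape (N, 1, H, W).
        with torch.no_grad():
            valid_count = F.conv2d(mask, self.mask_kernel,
                                   stride=self.stride, padding=self.padding)
        valid = (valid_count > 0).float()
        out = self.conv(x * mask)
        bias = self.conv.bias.view(1, -1, 1, 1)
        # Re-normalise by the number of valid pixels; fully-invalid windows are zeroed.
        scale = self.mask_kernel.numel() / valid_count.clamp(min=1.0)
        out = (out - bias) * scale * valid + bias
        return out, valid

# Example: one partial-convolution step on a 1-channel image with a hole mask.
x = torch.rand(1, 1, 64, 64)
mask = torch.ones(1, 1, 64, 64)
mask[:, :, 20:40, 20:40] = 0.0  # simulated missing region
y, updated_mask = PartialConv2d(1, 8)(x, mask)
```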
In order to improve the accuracy of determining the structural similarity, the implementation process of determining the structural similarity between the local image of the fan blade and the restored image of the fan blade in step 102 may include: s1021: determining a first mean value and a first variance of a first pixel value of each pixel in a local image of the fan blade; s1022: determining a second mean value and a second variance of a second pixel value of each pixel in the fan blade restoration image; s1023: determining a covariance between each first pixel value and each second pixel value; s1024: determining a product between the first mean value and the second mean value to obtain a mean value product; s1025: and obtaining structural similarity based on the mean product, the covariance, the first variance, the second variance, the first mean and the second mean, wherein the structural similarity is positively correlated with the mean product and the covariance and negatively correlated with the first variance, the second variance, the square of the first mean and the square of the second mean.
In one embodiment, the structural similarity SSIM may be determined using the following formula:

SSIM(x, y) = \frac{(2\mu_x\mu_y + c_1)(2\sigma_{xy} + c_2)}{(\mu_x^2 + \mu_y^2 + c_1)(\sigma_x^2 + \sigma_y^2 + c_2)}

wherein x denotes the first pixel values of the local image of the fan blade, y denotes the second pixel values of the restored image of the fan blade, \mu_x is the first mean, \mu_y is the second mean, \sigma_x^2 is the first variance, \sigma_y^2 is the second variance, \sigma_{xy} is the covariance, and c_1 = (k_1 L)^2 and c_2 = (k_2 L)^2 are constants used to maintain stability. L is the dynamic range of the pixel values (e.g., 255), and k_1 and k_2 are constants, e.g., k_1 = 0.01 and k_2 = 0.03. The structural similarity SSIM ranges from -1 to +1, i.e., SSIM ∈ [-1, 1]. When the two images are identical, the value of SSIM is equal to 1.
It should be noted that natural images are generally highly structured: there are strong correlations between image pixels, especially between spatially adjacent pixels. These correlations carry important information about the structure of objects in the visual scene. The human visual system mainly acquires structural information from the visible region, so approximate information about image distortion can be perceived by detecting whether the structural information changes. SSIM obtains structural similarity by comparing the structure of a distorted image with that of a reference image. The luminance (i.e., the mean of the pixel values) and the contrast (i.e., the variance of the pixel values) associated with the object structure are used to define the structural information of the image. Since the luminance and contrast in a scene are always changing, more accurate results can be obtained by processing these components separately.
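As an illustrative, non-limiting example, the structural similarity above can be computed with scikit-image's structural_similarity, which implements this formula with K1 = 0.01 and K2 = 0.03 by default; the choice of library and the helper function name blade_ssim are assumptions for illustration only.

```python
import numpy as np
from skimage.metrics import structural_similarity

def blade_ssim(blade_local, blade_restored):
    """Mean SSIM between the blade local image and its restored image.

    data_range corresponds to the dynamic range L (255 for 8-bit images);
    full=True also returns the per-pixel SSIM map, which can later help
    localise defective regions.
    """
    score, ssim_map = structural_similarity(
        blade_local, blade_restored, data_range=255, full=True)
    return score, ssim_map

# Example with two identical 8-bit grayscale images: the mean SSIM is 1.0.
img = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
score, _ = blade_ssim(img, img.copy())
```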
In order to obtain an accurate fan blade defect detection result, the implementation process of determining the fan blade defect detection result according to the structural similarity in step 103 may include: S1031: if the structural similarity is higher than a structural similarity threshold value (for example, 0.5), obtaining a fan blade defect detection result representing that the fan blade is normal; S1032: if the structural similarity is not higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade has defects.
In practical application, the structural similarity threshold may be set according to a practical application scenario, which is not limited herein.
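As a small illustration of S1031 and S1032, the decision reduces to a single comparison; the default threshold of 0.5 below is only the example value mentioned above and would in practice be tuned to the application scenario.

```python
def blade_defect_result(ssim_value, ssim_threshold=0.5):
    """S1031/S1032: structural similarity above the threshold means the fan
    blade is normal; otherwise the fan blade is reported as defective."""
    return "normal" if ssim_value > ssim_threshold else "defective"
```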
Further, when the fan blade is determined to have defects, the area where the fan blade has defects can be obtained.
In one embodiment, if it is determined that the fan blade has a defect based on the fan blade defect detection result, the local image of the fan blade is compared with the restored image of the fan blade to obtain a defect area of the fan blade.
Further, when a fan blade defect detection result indicating that the fan blade has defects is obtained, abnormal warning information including the fan blade defect detection result and the defect area of the fan blade may be sent to the management device.
FIG. 6 is a diagram illustrating an exemplary fan blade defect. The black areas in Fig. 6 are defective areas of the fan blade. Fig. 6 is only used to illustrate that a defect area exists in the image; if any lines in Fig. 6 are unclear, this does not affect the clarity of the description.
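As an illustrative, non-limiting example, one possible way to localise the defect area is to threshold the per-pixel disagreement between the local image and the restored image; the absolute-difference comparison, the OpenCV calls and the threshold value are assumptions for illustration, since the application only states that the two images are compared.

```python
import cv2

def defect_regions(blade_local, blade_restored, diff_threshold=30):
    """Sketch: mark pixels where the local image and the restored image
    disagree strongly and return the bounding boxes of those regions."""
    diff = cv2.absdiff(blade_local, blade_restored)
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours]
```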
Fig. 7 is a flowchart illustrating an implementation of an image segmentation method. The step of segmenting the local image of the fan blade in Fig. 1 is illustrated with reference to Fig. 7. The specific implementation flow of the method is as follows:
step 701: and generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position.
Step 702: and adjusting the distance lower than the distance threshold value to be a set distance from the foreground depth map to obtain a background depth map.
Step 703: based on the difference between the background depth map and the foreground depth map, a subtraction image is obtained.
Step 704: and carrying out binarization on the subtraction image to obtain a binarized image.
Step 705: filtering the binarized image by adopting particle filtering to obtain a filtered image.
Step 706: and segmenting a local image of the fan blade from the filtered image.
Specifically, when step 701 to step 706 are executed, the specific steps refer to step 100 described above, and are not described herein again.
In the embodiment of the application, the fan blade local image is segmented from the image to be detected, which reduces the amount of data processed in subsequent detection; binarization and particle filtering optimize the edge extraction effect, fill in partial particles and remove interference; the image restoration model can be trained by collecting only easily obtained normal-state samples of the fan blade, so that image restoration is carried out through the image restoration model, simplifying the complex steps of detection; and comparison through structural similarity improves the detection accuracy.
Based on the same inventive concept, the embodiment of the application also provides a device for detecting the defects of the fan blade, and as the principle of solving the problems of the device and the equipment is similar to the method for detecting the defects of the fan blade, the implementation of the device can refer to the implementation of the method, and repeated parts are not repeated.
As shown in fig. 8, a schematic structural diagram of a device for detecting a defect of a fan blade according to an embodiment of the present application includes:
the segmentation unit 801 is used for segmenting a local image of the fan blade from the image to be detected;
a restoring unit 802, configured to input the local image of the fan blade into an image restoring model, to obtain a fan blade restoring image, where the image restoring model is used to fill a missing portion of the fan blade;
the matching unit 803 is configured to determine structural similarity between the local image of the fan blade and the restored image of the fan blade;
and the detection unit 804 is used for determining a fan blade defect detection result according to the structural similarity.
In one embodiment, the segmentation unit 801 is configured to:
generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position;
adjusting the distance lower than the distance threshold value to a set distance from the foreground depth map to obtain a background depth map;
generating a background depth map of the image to be detected based on the distance between each pixel in the background of the image to be detected and the target position;
obtaining a subtraction image based on a difference between the background depth map and the foreground depth map;
and extracting a local image of the fan blade based on the subtraction image.
In one embodiment, the segmentation unit 801 is configured to:
binarizing the subtraction image to obtain a binarized image;
and segmenting a local image of the fan blade from the binarized image.
In one embodiment, the segmentation unit 801 is configured to:
filtering the binarized image by adopting particle filtering to obtain a filtered image;
and segmenting a local image of the fan blade from the filtering image.
In one embodiment, the matching unit 803 is configured to:
determining a first mean value and a first variance of a first pixel value of each pixel in a local image of the fan blade;
determining a second mean value and a second variance of a second pixel value of each pixel in the fan blade restoration image;
determining a covariance between each first pixel value and each second pixel value;
determining a product between the first mean value and the second mean value to obtain a mean value product;
and obtaining structural similarity based on the mean product, the covariance, the first variance, the second variance, the first mean and the second mean, wherein the structural similarity is positively correlated with the mean product and the covariance and negatively correlated with the first variance, the second variance, the square of the first mean and the square of the second mean.
In one embodiment, the detecting unit 804 is configured to:
if the structural similarity is higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade is normal;
and if the structural similarity is not higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade has defects.
In one embodiment, the detecting unit 804 is further configured to:
and if the fan blade is determined to have defects based on the fan blade defect detection result, comparing the local image of the fan blade with the fan blade restoration image to obtain a defect area of the fan blade.
In the method, the device, the electronic equipment and the storage medium for detecting the fan blade defects, the local image of the fan blade is segmented from the image to be detected; inputting a local image of the fan blade into an image restoration model to obtain a fan blade restoration image, wherein the image restoration model is used for filling a missing part of the fan blade; determining the structural similarity between the local image of the fan blade and the restored image of the fan blade; and determining the detection result of the fan blade defects according to the structural similarity. Therefore, the fan blade restoration images are obtained through the image restoration model, the detection results are obtained through the structural similarity between the images, the complex steps of detection are simplified, the consumed labor cost and time cost are reduced, and the detection efficiency and accuracy are improved.
Fig. 9 shows a schematic structural diagram of an electronic device 9000. Referring to fig. 9, an electronic device 9000 comprises: a processor 9010 and a memory 9020, which may optionally further include a power supply 9030, a display unit 9040, and an input unit 9050.
The processor 9010 is a control center of the electronic device 9000, connects various components by various interfaces and lines, and executes software programs and/or data stored in the memory 9020 to perform various functions of the electronic device 9000, thereby monitoring the electronic device 9000 as a whole.
In this embodiment of the present application, when the processor 9010 calls the computer program stored in the memory 9020, the steps in the above embodiments are performed.
Optionally, the processor 9010 may include one or more processing units; preferably, the processor 9010 may integrate an application processor, which mainly handles the operating system, user interfaces, applications and the like, and a modem processor, which mainly handles wireless communication. It is to be understood that the modem processor may not be integrated into the processor 9010. In some embodiments, the processor and the memory may be implemented on a single chip, or they may be implemented separately on separate chips.
The memory 9020 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, various applications, and the like; the data storage area may store data created from use of the electronic device 9000, and the like. Further, the memory 9020 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The electronic device 9000 further comprises a power supply 9030 (e.g., a battery) for supplying power to various components, which may be logically connected to the processor 9010 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
The display unit 9040 can be used to display information input by a user or information provided to the user, as well as various menus of the electronic device 9000. In the embodiment of the present application, the display unit is mainly used to display the display interface of each application in the electronic device 9000 and objects such as texts and pictures displayed in the display interface. The display unit 9040 may include a display panel 9041. The display panel 9041 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The input unit 9050 may be configured to receive information such as numbers or characters input by a user. The input unit 9050 may include a touch panel 9051 and other input devices 9052. Among other things, the touch panel 9051, also referred to as a touch screen, may collect touch operations by a user thereon or nearby (e.g., operations by a user on or near the touch panel 9051 using a finger, a touch pen, or any other suitable object or accessory).
Specifically, the touch panel 9051 may detect a touch operation of the user, detect signals generated by the touch operation, convert the signals into touch point coordinates, send the touch point coordinates to the processor 9010, receive a command sent from the processor 9010, and execute the command. In addition, the touch panel 9051 may be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. Other input devices 9052 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, power on/off keys, etc.), a trackball, a mouse, a joystick, and the like.
Of course, the touch panel 9051 may cover the display panel 9041; when the touch panel 9051 detects a touch operation on or near it, the touch operation is transmitted to the processor 9010 to determine the type of the touch event, and the processor 9010 then provides a corresponding visual output on the display panel 9041 according to the type of the touch event. Although in Fig. 9 the touch panel 9051 and the display panel 9041 are two separate components implementing the input and output functions of the electronic device 9000, in some embodiments the touch panel 9051 and the display panel 9041 may be integrated to implement the input and output functions of the electronic device 9000.
The electronic device 9000 can also include one or more sensors, such as a pressure sensor, a gravitational acceleration sensor, a proximity light sensor, and the like. Of course, the electronic device 9000 may further comprise other components such as a camera, which are not shown in fig. 9 and will not be described in detail herein since these components are not the components used in this embodiment of the present application.
Those skilled in the art will appreciate that fig. 9 is merely an example of an electronic device and is not intended to limit the electronic device and may include more or fewer components than those shown, or some components may be combined, or different components.
In an embodiment of the present application, a computer-readable storage medium has a computer program stored thereon, and when the computer program is executed by a processor, the electronic device may perform the steps in the above embodiments.
For convenience of description, the above parts are separately described as modules (or units) according to functional division. Of course, the functionality of the various modules (or units) may be implemented in the same one or more pieces of software or hardware when implementing the present application.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the present application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (15)

1. A method for detecting defects of a fan blade is characterized by comprising the following steps:
segmenting a local image of the fan blade from the image to be detected;
inputting the local image of the fan blade into an image restoration model to obtain a fan blade restoration image, wherein the image restoration model is used for filling the missing part of the fan blade;
determining the structural similarity between the local image of the fan blade and the restored image of the fan blade;
and determining a fan blade defect detection result according to the structural similarity.
2. The method of claim 1, wherein the segmenting a local image of the fan blade from the image to be detected comprises:
generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position;
adjusting the distance lower than a distance threshold value to a set distance from the foreground depth map to obtain a background depth map;
generating a background depth map of the image to be detected based on the distance between each pixel in the background of the image to be detected and the target position;
obtaining a subtraction image based on a difference between the background depth map and the foreground depth map;
and extracting the local image of the fan blade based on the subtraction image.
3. The method of claim 2, wherein said extracting the fan blade local image based on the subtraction image comprises:
carrying out binarization on the subtraction image to obtain a binarized image;
and segmenting the local image of the fan blade from the binary image.
4. The method as claimed in claim 3, wherein the segmenting the fan blade local image from the binarized image comprises:
filtering the binarized image by adopting particle filtering to obtain a filtered image;
and segmenting the local image of the fan blade from the filtering image.
5. The method of any of claims 1-4, wherein the determining the structural similarity between the fan blade local image and the fan blade restoration image comprises:
determining a first mean value and a first variance of a first pixel value of each pixel in the local image of the fan blade;
determining a second mean value and a second variance of a second pixel value of each pixel in the fan blade restoration image;
determining a covariance between each first pixel value and each second pixel value;
determining the product between the first average value and the second average value to obtain an average value product;
obtaining the structural similarity based on the mean product, the covariance, the first variance, the second variance, the first mean and the second mean, the structural similarity being positively correlated with the mean product and the covariance and negatively correlated with the first variance, the second variance, the square of the first mean and the square of the second mean.
6. The method of any one of claims 1-4, wherein determining a fan blade defect detection result based on the structural similarity comprises:
if the structural similarity is higher than a structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade is normal;
and if the structural similarity is not higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade has defects.
7. The method of claim 6, wherein the method further comprises:
and if the fan blade is determined to have defects based on the fan blade defect detection result, comparing the local image of the fan blade with the restored image of the fan blade to obtain a defect area of the fan blade.
8. A fan blade defect detection device, comprising:
the segmentation unit is used for segmenting a local image of the fan blade from the image to be detected;
the restoration unit is used for inputting the local image of the fan blade into an image restoration model to obtain a fan blade restoration image, and the image restoration model is used for filling the missing part of the fan blade;
the matching unit is used for determining the structural similarity between the local image of the fan blade and the restored image of the fan blade;
and the detection unit is used for determining the detection result of the defects of the fan blades according to the structural similarity.
9. The apparatus of claim 8, wherein the segmentation unit is to:
generating a foreground depth map of the image to be detected based on the distance between each pixel in the image to be detected and the target position;
adjusting the distance lower than a distance threshold value to a set distance from the foreground depth map to obtain a background depth map;
generating a background depth map of the image to be detected based on the distance between each pixel in the background of the image to be detected and the target position;
obtaining a subtraction image based on a difference between the background depth map and the foreground depth map;
and extracting the local image of the fan blade based on the subtraction image.
10. The apparatus of claim 9, wherein the segmentation unit is to:
carrying out binarization on the subtraction image to obtain a binarized image;
and segmenting the local image of the fan blade from the binary image.
11. The apparatus of claim 10, wherein the segmentation unit is to:
filtering the binarized image by adopting particle filtering to obtain a filtered image;
and segmenting the local image of the fan blade from the filtering image.
12. The apparatus of any one of claims 8-11, wherein the matching unit is to:
determining a first mean value and a first variance of a first pixel value of each pixel in the local image of the fan blade;
determining a second mean value and a second variance of a second pixel value of each pixel in the fan blade restoration image;
determining a covariance between each first pixel value and each second pixel value;
determining a product between the first mean value and the second mean value to obtain a mean value product;
obtaining the structural similarity based on the mean product, the covariance, the first variance, the second variance, the first mean and the second mean, the structural similarity being positively correlated with the mean product and the covariance and negatively correlated with the first variance, the second variance, the square of the first mean and the square of the second mean.
13. The apparatus of any one of claims 8-11, wherein the detection unit is to:
if the structural similarity is determined to be higher than a structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade is normal;
and if the structural similarity is not higher than the structural similarity threshold value, obtaining a fan blade defect detection result representing that the fan blade has defects.
14. An electronic device comprising a processor and a memory, the memory storing computer readable instructions that, when executed by the processor, perform the method of any of claims 1-7.
15. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202210700976.4A 2022-06-20 2022-06-20 Fan blade defect detection method and device, electronic equipment and storage medium Active CN114937027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210700976.4A CN114937027B (en) 2022-06-20 2022-06-20 Fan blade defect detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210700976.4A CN114937027B (en) 2022-06-20 2022-06-20 Fan blade defect detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114937027A true CN114937027A (en) 2022-08-23
CN114937027B CN114937027B (en) 2024-03-15

Family

ID=82867678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210700976.4A Active CN114937027B (en) 2022-06-20 2022-06-20 Fan blade defect detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114937027B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200033822A (en) * 2018-03-15 2020-03-30 (주)니어스랩 Apparatus and Method for Detecting/Analyzing Defect of Windturbine Blade
CN111489348A (en) * 2020-04-16 2020-08-04 创新奇智(重庆)科技有限公司 Magnetic material product surface defect simulation method and device
CN112419318A (en) * 2020-12-17 2021-02-26 深圳市华汉伟业科技有限公司 Multi-path cascade feedback-based anomaly detection method and device and storage medium
CN113393430A (en) * 2021-06-09 2021-09-14 东方电气集团科学技术研究院有限公司 Thermal imaging image enhancement training method and device for fan blade defect detection
CN113240673A (en) * 2021-07-09 2021-08-10 武汉Tcl集团工业研究院有限公司 Defect detection method, defect detection device, electronic equipment and storage medium
CN114219762A (en) * 2021-11-16 2022-03-22 杭州三米明德科技有限公司 Defect detection method based on image restoration

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315368A (en) * 2023-10-23 2023-12-29 龙坤(无锡)智慧科技有限公司 Intelligent operation and maintenance inspection method for large-scale data center
CN117315368B (en) * 2023-10-23 2024-04-23 龙坤(无锡)智慧科技有限公司 Intelligent operation and maintenance inspection method for large-scale data center

Also Published As

Publication number Publication date
CN114937027B (en) 2024-03-15

Similar Documents

Publication Publication Date Title
CN110381369B (en) Method, device and equipment for determining recommended information implantation position and storage medium
CN111124888B (en) Method and device for generating recording script and electronic device
CN111753701B (en) Method, device, equipment and readable storage medium for detecting violation of application program
CN111325271B (en) Image classification method and device
CN110889379B (en) Expression package generation method and device and terminal equipment
WO2022089170A1 (en) Caption area identification method and apparatus, and device and storage medium
CN108986125B (en) Object edge extraction method and device and electronic equipment
CN111062854A (en) Method, device, terminal and storage medium for detecting watermark
CN110378276B (en) Vehicle state acquisition method, device, equipment and storage medium
CN116168038B (en) Image reproduction detection method and device, electronic equipment and storage medium
CN114937027B (en) Fan blade defect detection method and device, electronic equipment and storage medium
CN115631122A (en) Image optimization method and device for edge image algorithm
CN116168351A (en) Inspection method and device for power equipment
CN115580450A (en) Method and device for detecting flow, electronic equipment and computer readable storage medium
CN111444819A (en) Cutting frame determining method, network training method, device, equipment and storage medium
CN105678301A (en) Method, system and device for automatically identifying and segmenting text image
CN110895811A (en) Image tampering detection method and device
CN115170400A (en) Video repair method, related device, equipment and storage medium
CN108062405B (en) Picture classification method and device, storage medium and electronic equipment
CN111368128B (en) Target picture identification method, device and computer readable storage medium
CN112270238A (en) Video content identification method and related device
CN103093213A (en) Video file classification method and terminal
CN115497100A (en) Character recognition method and electronic equipment
CN115601684A (en) Emergency early warning method and device, electronic equipment and storage medium
CN112966272B (en) Internet of things Android malicious software detection method based on countermeasure network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant