CN115034973A - Part image enhancement method based on texture stability

Part image enhancement method based on texture stability

Info

Publication number
CN115034973A
CN115034973A
Authority
CN
China
Prior art keywords
pixel point
image
gray
target pixel
stability
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210453887.4A
Other languages
Chinese (zh)
Inventor
张秀梅 (Zhang Xiumei)
王仁 (Wang Ren)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Baohong Metal Industry Co ltd
Original Assignee
Haimen Baohong Machinery Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Haimen Baohong Machinery Co ltd filed Critical Haimen Baohong Machinery Co ltd
Priority to CN202210453887.4A priority Critical patent/CN115034973A/en
Publication of CN115034973A publication Critical patent/CN115034973A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20004 Adaptive image processing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the field of artificial intelligence, and provides a part image enhancement method based on texture stability, which comprises the following steps: S101, acquiring a part gray image; S102, processing the gray image to obtain a multi-layer image set for each frequency level; S103, up-sampling the images to obtain multi-layer image sets at the same scale as the gray image; S201, extracting a target pixel point and obtaining its contrast mean value; S202, obtaining the stability of the pixel point between every two adjacent layers and the stability change rate mean; S203, obtaining an enhancement weight and calculating the average stability; S204, repeating S201 to S203 for all frequency levels and calculating the gray stretching coefficient of the pixel point; S205, obtaining the enhanced gray value of the pixel point; S206, repeating S201 to S205 for the remaining pixel points to obtain the enhanced gray image. The method assigns different enhancement coefficients to pixels of different stability levels, so that coarse features that reflect object categories are represented more strongly.

Description

Part image enhancement method based on texture stability
Technical Field
The invention relates to the field of artificial intelligence, in particular to a part image enhancement method based on texture stability.
Background
Due to factors such as illumination, a captured part picture may exhibit a haze-like (atomization) effect, or the part itself may have shallow texture; both hinder further extraction of part information from the picture, so the image needs to be enhanced to highlight its texture information.
Conventional image enhancement methods mostly adopt histogram equalization or linear gray-scale stretching. Histogram equalization enhances an image by expanding the gray-scale quantization interval in proportion to the probability density of each gray level, so the processed result often shows false contours and loses some texture details. Linear gray-scale stretching does not distinguish texture pixels from ordinary pixels and applies a uniform stretching coefficient to all gray values, which may also cause loss of texture information.
Disclosure of Invention
In order to overcome the shortcomings of the prior art, the invention provides a part image enhancement method based on texture stability.
In order to achieve the above object, the present invention adopts the following technical solution: a part image enhancement method based on texture stability, comprising the following steps:
S101, acquiring a part gray image;
S102, processing the part gray image with Gaussian filters of different frequency levels and image down-sampling to obtain the first-layer images of the different frequency levels, performing Gaussian filtering and down-sampling on each first-layer image at its corresponding frequency level to obtain the second-layer images, and repeating these steps to obtain a multi-layer image set for each frequency level;
S103, up-sampling each layer of image at each frequency level to obtain multi-layer image sets at the same scale as the part gray image;
S201: extracting a pixel point as the target pixel point, selecting the corresponding pixel point in each layer of image at any frequency level, and obtaining a contrast mean value from the gray value of the pixel point and the gray values of its adjacent pixel points;
S202, extracting the gray values of the target pixel point and its adjacent pixel points to obtain the stability of the pixel point between every two adjacent layers, and obtaining the stability change rate mean at this frequency level from the stabilities;
S203: obtaining the enhancement weight of the pixel point from the contrast mean and the stability change rate mean of the target pixel point, and calculating the average stability at this frequency level from the stabilities between adjacent layers;
S204, repeating S201 to S203 to obtain the enhancement weight and average stability of the target pixel point at all frequency levels, and calculating the gray stretching coefficient of the pixel point from them;
S205: enhancing the gray value of the target pixel point with the gray stretching coefficient to obtain the enhanced gray value of the pixel point;
S206: repeating S201 to S205 to calculate the enhanced gray values of the other pixel points, and obtaining the enhanced gray image from the enhanced gray values of all pixel points.
Further, in the method for enhancing a part image based on texture stability, in S201, the method for obtaining a mean contrast value by using the gray value of the pixel point and the gray value of the pixel point adjacent to the pixel point is as follows:
extracting a gray value of a target pixel point;
calculating the mean value of the gray values in eight neighborhoods of the target pixel point;
subtracting the mean of the gray values in the eight-neighborhood of the target pixel point from the gray value of the target pixel point to obtain the contrast of the target pixel point;
and obtaining a contrast mean value by using the contrast of the target pixel points of each layer of image.
Further, in the method for enhancing a part image based on texture stability, the expression of the contrast mean value is as follows:

[formula image: contrast mean of the j-th level frequency target pixel point]

in the formula: $\overline{dbo}_j$ denotes the contrast mean of the j-th level frequency target pixel point, $dbo_{ij}$ denotes the contrast of the target pixel point of the i-th layer image at the j-th level frequency, $k^{j-1}\sigma$ denotes the variance of the Gaussian filter corresponding to the j-th level frequency, $i$ denotes the i-th layer image, and $\mu$ denotes the total number of image layers.
Further, in the method for enhancing a part image based on texture stability, the method for extracting the gray values of the target pixel point and the adjacent pixel point of the pixel point to obtain the stability between the two adjacent layers of the pixel point in S202 is as follows:
extracting the gray values of each layer of target pixel points and the adjacent pixel points of the pixel points to obtain the direction amplitude vector of each layer of target pixel points;
screening a direction angle corresponding to the maximum gray amplitude in each layer as a gradient direction of the target pixel point;
calculating the direction amplitude similarity and gradient direction deviation between two adjacent layers of the target pixel point;
and calculating the stability between the two adjacent layers of the target pixel point through the direction amplitude similarity and the gradient direction deviation between the two adjacent layers of the target pixel point.
Further, in the method for enhancing a part image based on texture stability, the expression of the enhanced gray value of the pixel point in S205 is as follows:

[formula image: enhanced gray value of the target pixel point]

in the formula: $(x_o, y_o)$ denotes the coordinates of the target pixel point, $p(x_o, y_o)$ denotes the gray value of the target pixel point in the part gray image, $p1(x_o, y_o)$ denotes the enhanced gray value of the target pixel point in the part gray image, $p(x_f, y_f)$ denotes the gray value of the pixel point at coordinates $(x_f, y_f)$ in the part gray image, $M$ denotes the number of pixel points in the part gray image, $f$ denotes the f-th pixel point in the part gray image, and $LSo$ denotes the gray stretching coefficient of the target pixel point.
Further, in the method for enhancing a part image based on texture stability, the expression of the gray stretching coefficient of the pixel point is as follows:

[formula image: gray stretching coefficient of the target pixel point]

in the formula: $LSo$ denotes the gray stretching coefficient of the target pixel point, $j$ denotes the j-th level frequency, $L$ denotes the number of frequency levels, $Wdo_j$ denotes the average stability of the j-th level frequency target pixel point, and $Qzo_j$ denotes the enhancement weight of the j-th level frequency target pixel point.
The invention has the beneficial effects that: the stability characteristics of texture are analyzed in combination with a pyramid algorithm, and different enhancement coefficients are assigned to pixels of different stability levels, so that coarse features that reflect object categories are represented more strongly; the invention also considers, during enhancement, the possibility that a texture belongs to noise, and effectively suppresses textures that are likely to be noise.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or in the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flow chart of a method for enhancing a part image based on texture stability according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an enhancement weight curve.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
The scenario addressed by the present embodiment is: a part picture is shot under natural light, and a gray stretching coefficient is calculated by analyzing the stability of texture features in the picture, so that the picture undergoes adaptive enhancement and the enhanced picture carries richer texture information.
The embodiment of the invention provides a part image enhancement method based on texture stability, as shown in fig. 1, comprising the following steps:
acquiring a part gray level image;
A part picture is shot under natural light, and the collected picture is converted from RGB space to gray space to obtain the part gray image.
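The patent contains no reference code; purely as an illustration, the sketches in this embodiment use Python with OpenCV and NumPy, and every quantity whose formula appears only as an image in the original is replaced by a clearly labelled assumption. A minimal sketch of the acquisition step (the file name "part.jpg" is hypothetical):

```python
import cv2

# Read the captured part photo (OpenCV loads color images in BGR order)
# and convert it to a single-channel gray image.
bgr = cv2.imread("part.jpg")  # hypothetical file name
gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
```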
Processing the part gray image with Gaussian filters of different frequency levels and image down-sampling to obtain the first-layer images of the different frequency levels; performing Gaussian filtering and down-sampling on each first-layer image at its corresponding frequency level to obtain the second-layer images; repeating these steps to obtain a multi-layer image set for each frequency level;
Processing the part gray image with a pyramid: the image $T_0$ (the part gray image) is filtered with 5 × 5 Gaussian filters whose variances are $[\sigma, k\sigma, k^2\sigma, \ldots, k^{L-1}\sigma]$; after filtering, each result is down-sampled by deleting its even rows and even columns, which yields the first-layer image of each frequency level. Here $\sigma$ typically takes 1.6, $k$ typically takes 2, and $L$ typically takes 8. The first-layer images are then filtered with the Gaussian filter of their corresponding variance and down-sampled again to obtain the second-layer image set. The filtering and down-sampling are repeated until a 5-layer image set is obtained for each of the $L$ frequency levels [the image-set notation is given only as formula images in the original].
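A sketch of this pyramid construction, assuming the stated 5 × 5 kernel, the variance schedule $k^{j-1}\sigma$, and down-sampling by deleting every second row and column; the function name and the float conversion are illustrative choices:

```python
import numpy as np
import cv2

def build_frequency_pyramids(gray, sigma=1.6, k=2.0, L=8, n_layers=5):
    """pyramids[j][i] is the i-th layer image at frequency level j.

    Level j (0-based here; the patent counts from 1) uses a 5x5 Gaussian
    filter of variance k**j * sigma; each filtered image is down-sampled
    by deleting every second row and column.
    """
    pyramids = []
    for j in range(L):
        std = float(np.sqrt((k ** j) * sigma))  # the patent specifies the variance
        layers, img = [], gray.astype(np.float32)
        for _ in range(n_layers):
            img = cv2.GaussianBlur(img, (5, 5), std)
            img = img[::2, ::2]  # delete even rows and columns (1-indexed)
            layers.append(img)
        pyramids.append(layers)
    return pyramids
```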
Up-sampling each layer of image in each frequency level to obtain a multi-layer image set with the same scale as the gray level image of the part;
Taking the j-th level frequency as an example: each layer of the image set obtained by filtering with the Gaussian filter of variance $k^{j-1}\sigma$ and down-sampling is up-sampled to the same scale as the original part gray image, and the results are recorded as the multi-layer image set of that frequency level [set notation given as a formula image in the original].
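A matching sketch of the up-sampling step; bilinear interpolation is an assumption, since the patent only requires each layer to be returned to the original scale:

```python
import cv2

def upsample_to_original(pyramids, original_shape):
    """Resize every layer of every frequency level back to the scale of the
    original part gray image so pixels can be compared across layers."""
    h, w = original_shape
    return [[cv2.resize(layer, (w, h), interpolation=cv2.INTER_LINEAR)
             for layer in level] for level in pyramids]
```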
Extracting a pixel point as a target pixel point, selecting the pixel point corresponding to each layer of image in any frequency level, and obtaining a contrast mean value by utilizing the gray value of the pixel point and the gray value of the adjacent pixel point of the pixel point;
Taking the i-th layer image at the j-th level frequency as an example, let the coordinates of the target pixel point be $(x_o, y_o)$. The difference between the gray value of the pixel at this position and the mean gray value of the pixels in its eight-neighborhood is the contrast of this pixel point:

$$dbo_{ij} = p_{ij}(x_o, y_o) - \overline{p}_{ij}(x_o, y_o)$$

in the formula: $(x_o, y_o)$ denotes the coordinates of the target pixel point, $\overline{p}_{ij}(x_o, y_o)$ denotes the mean gray value of the pixels in the eight-neighborhood of the target pixel point, $p_{ij}(x_o, y_o)$ denotes the gray value of the target pixel point of the i-th layer image at the j-th level frequency, and $dbo_{ij}$ denotes the contrast of the target pixel point of the i-th layer image at the j-th level frequency.
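This contrast can be computed for all pixels of a layer at once; a sketch in which the 8-neighbour mean is obtained from a 3 × 3 box filter (an implementation choice, not part of the patent):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def contrast_map(layer):
    """Per-pixel contrast: gray value minus the mean of its 8 neighbours."""
    layer = layer.astype(np.float32)
    box_sum = uniform_filter(layer, size=3) * 9.0  # sum over the 3x3 window
    neighbour_mean = (box_sum - layer) / 8.0       # drop the centre pixel
    return layer - neighbour_mean
```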
The higher the frequency level, the higher the requirement on texture contrast, so higher levels should receive larger weight values. When the contrast weight of each level is calculated, the number of layers, the size of the Gaussian filter kernel, and the variance of the Gaussian filter all need to be considered: the larger the filter kernel, the larger the contrast interval, and the larger the variance, the larger the interval between layers. The contrast mean value is therefore obtained from the contrast of the target pixel point in each layer image of the j-th level frequency, with the following expression:

[formula image: contrast mean of the j-th level frequency target pixel point]

in the formula: $\overline{dbo}_j$ denotes the contrast mean of the j-th level frequency target pixel point, $dbo_{ij}$ denotes the contrast of the target pixel point of the i-th layer image at the j-th level frequency, $k^{j-1}\sigma$ denotes the variance of the Gaussian filter corresponding to the j-th level frequency, $i$ denotes the i-th layer image, and $\mu$ denotes the total number of image layers.
Extracting gray values of a target pixel point and an adjacent pixel point of the pixel point to obtain the stability between two adjacent layers of the pixel point; obtaining the stability change rate average value under the frequency by utilizing the stability;
Taking the i-th layer image as an example, the gray amplitudes of the target pixel point in the i-th layer gray image at the j-th level frequency are extracted in all directions: a 5 × 5 sliding window is moved over the i-th layer image so that its centre pixel coincides with the target pixel point, and the gray values of the centre pixel and of the pixels along each direction are read from the window.

The gray amplitude of the target pixel point in the horizontal direction is computed from the gray values $p_{ij}(x_o \pm 1, y_o)$ and $p_{ij}(x_o \pm 2, y_o)$ [formula image]; at 22.5 degrees to the horizontal, from $p_{ij}(x_o + 2, y_o + 1)$ and $p_{ij}(x_o - 2, y_o - 1)$ [formula image]; at 45 degrees, from $p_{ij}(x_o \pm 1, y_o \pm 1)$ and $p_{ij}(x_o \pm 2, y_o \pm 2)$ [formula image]; at 67.5 degrees, from $p_{ij}(x_o + 1, y_o + 2)$ and $p_{ij}(x_o - 1, y_o - 2)$ [formula image]; at 90 degrees, from $p_{ij}(x_o, y_o \pm 1)$ and $p_{ij}(x_o, y_o \pm 2)$ [formula image]; at 112.5 degrees, from $p_{ij}(x_o - 1, y_o + 2)$ and $p_{ij}(x_o + 1, y_o - 2)$ [formula image]; at 135 degrees, from $p_{ij}(x_o - 1, y_o + 1)$, $p_{ij}(x_o - 2, y_o + 2)$, $p_{ij}(x_o + 1, y_o - 1)$ and $p_{ij}(x_o + 2, y_o - 2)$ [formula image]; and at 157.5 degrees, from $p_{ij}(x_o - 2, y_o + 1)$ and $p_{ij}(x_o + 2, y_o - 1)$ [formula image]. In each formula, $p_{ij}(x, y)$ denotes the gray value of the pixel point at $(x, y)$ in the i-th layer image at the j-th level frequency, and $(x_o, y_o)$ denotes the coordinates of the target pixel point.

The direction amplitude vector $fz_{ij}$ of the target pixel point of the i-th layer image at the j-th level frequency is then assembled from these eight amplitudes [formula image].

The angle $\theta_{ij}$ corresponding to the maximum gray amplitude is selected as the gradient direction of the target pixel point of the i-th layer image. By the same method, the direction amplitude vector and gradient direction of the target pixel point of each layer image at the j-th level frequency can be obtained.
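A sketch of the directional amplitude extraction. The per-direction amplitude formulas appear only as images in the patent, so the amplitude is assumed here to be the mean absolute gray difference between the centre pixel and the listed pixels of each direction; the offsets follow the coordinates quoted above, and (x, y) must lie at least two pixels inside the border:

```python
import numpy as np

# In-window pixel offsets (dx, dy) for each direction of the 5x5 window,
# matching the coordinates listed above (angles in degrees).
DIRECTION_OFFSETS = {
    0.0:   [(-2, 0), (-1, 0), (1, 0), (2, 0)],
    22.5:  [(-2, -1), (2, 1)],
    45.0:  [(-2, -2), (-1, -1), (1, 1), (2, 2)],
    67.5:  [(-1, -2), (1, 2)],
    90.0:  [(0, -2), (0, -1), (0, 1), (0, 2)],
    112.5: [(-1, 2), (1, -2)],
    135.0: [(-2, 2), (-1, 1), (1, -1), (2, -2)],
    157.5: [(-2, 1), (2, -1)],
}

def direction_amplitude_vector(layer, x, y):
    """Return (fz, theta): the 8-direction amplitude vector of the pixel at
    (x, y) and its gradient direction (the angle of the largest amplitude).
    The amplitude definition is an assumption; the patent's formulas are
    given only as images."""
    centre = float(layer[y, x])
    fz = np.array([
        np.mean([abs(float(layer[y + dy, x + dx]) - centre)
                 for dx, dy in offsets])
        for offsets in DIRECTION_OFFSETS.values()
    ])
    theta = list(DIRECTION_OFFSETS)[int(np.argmax(fz))]
    return fz, theta
```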
Calculating the direction amplitude similarity of the target pixel points of two adjacent layers at the j-th level frequency, with the expression:

$$Xso_{ij} = \langle fz_{ij}, fz_{(i+1)j} \rangle$$

in the formula: $fz_{ij}$ denotes the direction amplitude vector of the target pixel point of the i-th layer at the j-th level frequency, $fz_{(i+1)j}$ denotes the direction amplitude vector of the target pixel point of the (i+1)-th layer at the j-th level frequency, and $Xso_{ij}$ denotes the direction amplitude similarity of the target pixel points of the i-th and (i+1)-th layer images at the j-th level frequency; $\langle fz_{ij}, fz_{(i+1)j} \rangle$ denotes the similarity of the two vectors, which is obtained by calculating the Euclidean distance between them.

By analogy with this formula, the direction amplitude similarity of the target pixel points of every two adjacent layers of images is calculated.
Calculating the gradient direction deviation of the target pixel points of two adjacent layers, with the expression:

[formula image: gradient direction deviation $\Delta\theta_{ij}$]

in the formula: $\Delta\theta_{ij}$ denotes the gradient direction deviation of the target pixel points of the i-th and (i+1)-th layer images at the j-th level frequency, $\theta_{i+1}$ denotes the gradient direction of the target pixel point of the (i+1)-th layer image, and $\theta_i$ denotes the gradient direction of the target pixel point of the i-th layer image.

The gradient direction deviation of the target pixel points of every two adjacent layers of images is calculated with this formula.
Calculating the stability from the gradient direction deviation and the direction amplitude similarity, with the expression:

[formula image: stability of adjacent-layer target pixel points]

in the formula: $Wdo_{ij}$ denotes the stability of the target pixel points of the i-th and (i+1)-th layer images at the j-th level frequency, $Xso_{ij}$ denotes their direction amplitude similarity, and $\Delta\theta_{ij}$ denotes their gradient direction deviation.

By analogy with this formula, the stability corresponding to every two adjacent layers of target pixel points can be obtained.
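A hedged sketch of the stability computation. The combining formula is an image in the patent; the form below, which grows as both the Euclidean distance between the amplitude vectors and the gradient direction deviation shrink, is an assumption:

```python
import numpy as np

def stability(fz_i, fz_ip1, theta_i, theta_ip1):
    """Stability Wdo_ij of a target pixel between layers i and i+1.
    Xso is the Euclidean distance between the two direction amplitude
    vectors (the patent's similarity measure); the 1/(1 + ...) combination
    is an assumed form."""
    xso = float(np.linalg.norm(np.asarray(fz_i) - np.asarray(fz_ip1)))
    dtheta = abs(float(theta_ip1) - float(theta_i))  # gradient direction deviation
    return 1.0 / (1.0 + xso + dtheta)
```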
Calculating the stability change rate mean of the j-th level frequency target pixel point, with the expression:

[formula image: stability change rate mean]

in the formula: $Wlo_j$ denotes the stability change rate mean of the j-th level frequency target pixel point, $Wdo_{(i+1)j}$ denotes the stability of the target pixel points of the (i+1)-th and (i+2)-th layer images at the j-th level frequency, $Wdo_{ij}$ denotes the stability of the target pixel points of the i-th and (i+1)-th layer images at the j-th level frequency, $i$ denotes the i-th layer image, and $\mu$ denotes the total number of image layers.
Obtaining the enhanced weight of the pixel point by using the contrast average value and the stability change rate average value corresponding to the target pixel point; calculating the average stability under the frequency by using the stability between two adjacent layers of the pixel points;
Generally, a pixel point with stronger contrast and a faster stability change rate is more likely to be noise, so the enhancement weight of such a pixel point should be reduced. The expression of the enhancement weight is:

[formula image: enhancement weight of the j-th level frequency target pixel point]

in the formula: $Qzo_j$ denotes the enhancement weight of the j-th level frequency target pixel point.

This formula reduces the enhancement weight of a pixel point as the product of its stability change rate mean and its contrast mean increases, and keeps the enhancement weight at a stable, relatively high value while that product is small, as shown in FIG. 2.
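A hedged sketch of the enhancement weight. Only the curve shape of FIG. 2 is described (a stable high value while the product is small, decaying as it grows); an exponential decay with a hypothetical steepness parameter `alpha` reproduces that behaviour:

```python
import numpy as np

def enhancement_weight(wlo_j, dbo_mean_j, alpha=1.0):
    """Qzo_j as a decreasing function of the product of the stability
    change rate mean and the contrast mean (assumed exponential form)."""
    return float(np.exp(-alpha * wlo_j * dbo_mean_j))
```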
Calculating the average stability of the j-th level frequency target pixel point, with the expression:

[formula image: average stability]

in the formula: $Wdo_j$ denotes the average stability of the j-th level frequency target pixel point, $Wdo_{ij}$ denotes the stability of the target pixel points of the i-th and (i+1)-th layer images at the j-th level frequency, $i$ denotes the i-th layer image, and $\mu$ denotes the total number of image layers.
Obtaining the enhancement weight and the average stability of the target pixel point under all frequencies, and calculating the gray scale stretching coefficient of the pixel point through the enhancement weight and the average stability of the pixel point under all frequencies;
The enhancement weight and average stability of the target pixel point at the different frequency levels are calculated by the above process, and the gray stretching coefficient of the target pixel point is calculated from them, with the expression:

[formula image: gray stretching coefficient]

in the formula: $LSo$ denotes the gray stretching coefficient of the target pixel point, $j$ denotes the j-th level frequency, $L$ denotes the number of frequency levels, $Wdo_j$ denotes the average stability of the j-th level frequency target pixel point, and $Qzo_j$ denotes the enhancement weight of the j-th level frequency target pixel point.
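A hedged sketch of the gray stretching coefficient. The exact formula is an image in the patent; a weight-normalised combination of the per-level average stabilities, offset by 1 so that stable pixels are stretched rather than compressed, is assumed:

```python
import numpy as np

def gray_stretch_coefficient(wdo, qzo):
    """LSo from the average stabilities Wdo_j and enhancement weights Qzo_j
    over the L frequency levels (assumed weight-normalised form)."""
    wdo = np.asarray(wdo, dtype=float)
    qzo = np.asarray(qzo, dtype=float)
    # The +1 offset is an assumption so that the coefficient enlarges
    # gray-level differences instead of shrinking them.
    return 1.0 + float(np.sum(qzo * wdo) / (np.sum(qzo) + 1e-8))
```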
Enhancing the gray value of the target pixel point by utilizing the gray stretching coefficient to obtain the enhanced gray value of the pixel point;
The expression of the enhanced gray value of the pixel point is:

[formula image: enhanced gray value of the target pixel point]

in the formula: $(x_o, y_o)$ denotes the coordinates of the target pixel point, $p(x_o, y_o)$ denotes the gray value of the target pixel point in the part gray image, $p1(x_o, y_o)$ denotes the enhanced gray value of the target pixel point in the part gray image, $p(x_f, y_f)$ denotes the gray value of the pixel point at coordinates $(x_f, y_f)$ in the part gray image, $M$ denotes the number of pixel points in the part gray image, $f$ denotes the f-th pixel point in the part gray image, and $LSo$ denotes the gray stretching coefficient of the target pixel point.
And calculating the enhanced gray values of other pixel points, and obtaining the enhanced gray image through the enhanced gray values of all the pixel points.
By analogy with the method, the gray value of each pixel point in the part gray image after pixel enhancement can be obtained, and the enhanced gray image can be obtained through the gray values of all the pixel points after the pixel enhancement.
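A hedged sketch of applying the coefficient. The legend's sum over all M pixels suggests a stretch about the global mean of the part gray image; p1 = mean + LSo · (p − mean), clipped to the valid range, is assumed:

```python
import numpy as np

def enhance(gray, lso):
    """Apply per-pixel gray stretching coefficients `lso` (same shape as
    `gray`, or a scalar) about the global mean of the part gray image."""
    gray = gray.astype(np.float32)
    mean = gray.mean()  # mean over all M pixels of the part gray image
    out = mean + np.asarray(lso, dtype=np.float32) * (gray - mean)
    return np.clip(out, 0, 255).astype(np.uint8)
```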
The stability characteristics of texture are analyzed in combination with a pyramid algorithm, and different enhancement coefficients are assigned to pixels of different stability levels, so that coarse features that reflect object categories are represented more strongly; the possibility that a texture belongs to noise is also considered during enhancement, and textures that are likely to be noise are effectively suppressed.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (6)

1. A part image enhancement method based on texture stability is characterized by comprising the following steps:
S101, acquiring a part gray image;
S102, processing the part gray image with Gaussian filters of different frequency levels and image down-sampling to obtain the first-layer images of the different frequency levels, performing Gaussian filtering and down-sampling on each first-layer image at its corresponding frequency level to obtain the second-layer images, and repeating these steps to obtain a multi-layer image set for each frequency level;
S103, up-sampling each layer of image at each frequency level to obtain multi-layer image sets at the same scale as the part gray image;
S201: extracting a pixel point as the target pixel point, selecting the corresponding pixel point in each layer of image at any frequency level, and obtaining a contrast mean value from the gray value of the pixel point and the gray values of its adjacent pixel points;
S202, extracting the gray values of the target pixel point and its adjacent pixel points to obtain the stability of the pixel point between every two adjacent layers, and obtaining the stability change rate mean at this frequency level from the stabilities;
S203: obtaining the enhancement weight of the pixel point from the contrast mean and the stability change rate mean of the target pixel point, and calculating the average stability at this frequency level from the stabilities between adjacent layers;
S204, repeating S201 to S203 to obtain the enhancement weight and average stability of the target pixel point at all frequency levels, and calculating the gray stretching coefficient of the pixel point from them;
S205: enhancing the gray value of the target pixel point with the gray stretching coefficient to obtain the enhanced gray value of the pixel point;
S206: repeating S201 to S205 to calculate the enhanced gray values of the other pixel points, and obtaining the enhanced gray image from the enhanced gray values of all pixel points.
2. The method for enhancing the image of the part based on the texture stability as claimed in claim 1, wherein the method for obtaining the mean contrast value by using the gray value of the pixel point and the gray values of the pixel points adjacent to the pixel point in S201 is:
extracting the gray value of the target pixel point;
calculating the mean value of the gray values in eight neighborhoods of the target pixel point;
subtracting the mean of the gray values in the eight-neighborhood of the target pixel point from the gray value of the target pixel point to obtain the contrast of the target pixel point;
and obtaining a contrast mean value by using the contrast of the target pixel points of each layer of image.
3. The method for enhancing the part image based on texture stability as claimed in claim 2, wherein the expression of the contrast mean is:

[formula image: contrast mean of the j-th level frequency target pixel point]

in the formula: $\overline{dbo}_j$ denotes the contrast mean of the j-th level frequency target pixel point, $dbo_{ij}$ denotes the contrast of the target pixel point of the i-th layer image at the j-th level frequency, $k^{j-1}\sigma$ denotes the variance of the Gaussian filter corresponding to the j-th level frequency, $i$ denotes the i-th layer image, and $\mu$ denotes the total number of image layers.
4. The method for enhancing the image of the part based on the texture stability as claimed in claim 1, wherein the method for extracting the gray values of the target pixel and the adjacent pixel of the pixel to obtain the stability between the two adjacent layers of the pixel in S202 is as follows:
extracting the gray values of each layer of target pixel points and the adjacent pixel points of the pixel points to obtain the direction amplitude vector of each layer of target pixel points;
screening a direction angle corresponding to the maximum gray amplitude in each layer as a gradient direction of the target pixel point;
calculating the direction amplitude similarity and gradient direction deviation between two adjacent layers of the target pixel point;
and calculating the stability between the two adjacent layers of the target pixel point through the direction amplitude similarity and the gradient direction deviation between the two adjacent layers of the target pixel point.
5. The method for enhancing the part image based on texture stability as claimed in claim 1, wherein the expression of the enhanced gray value of the pixel points in S205 is:

[formula image: enhanced gray value of the target pixel point]

in the formula: $(x_o, y_o)$ denotes the coordinates of the target pixel point, $p(x_o, y_o)$ denotes the gray value of the target pixel point in the part gray image, $p1(x_o, y_o)$ denotes the enhanced gray value of the target pixel point in the part gray image, $p(x_f, y_f)$ denotes the gray value of the pixel point at coordinates $(x_f, y_f)$ in the part gray image, $M$ denotes the number of pixel points in the part gray image, $f$ denotes the f-th pixel point in the part gray image, and $LSo$ denotes the gray stretching coefficient of the target pixel point.
6. The method as claimed in claim 5, wherein the expression of the gray stretching coefficient of the pixel point is:

[formula image: gray stretching coefficient]

in the formula: $LSo$ denotes the gray stretching coefficient of the target pixel point, $j$ denotes the j-th level frequency, $L$ denotes the number of frequency levels, $Wdo_j$ denotes the average stability of the j-th level frequency target pixel point, and $Qzo_j$ denotes the enhancement weight of the j-th level frequency target pixel point.
CN202210453887.4A 2022-04-24 2022-04-24 Part image enhancement method based on texture stability Withdrawn CN115034973A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210453887.4A CN115034973A (en) 2022-04-24 2022-04-24 Part image enhancement method based on texture stability

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210453887.4A CN115034973A (en) 2022-04-24 2022-04-24 Part image enhancement method based on texture stability

Publications (1)

Publication Number Publication Date
CN115034973A (en) 2022-09-09

Family

ID=83119053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210453887.4A Withdrawn CN115034973A (en) 2022-04-24 2022-04-24 Part image enhancement method based on texture stability

Country Status (1)

Country Link
CN (1) CN115034973A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116168039A (en) * 2023-04-26 2023-05-26 济宁市新华电力特种材料有限公司 Environment-friendly energy-saving aluminum silicate plate quality detection method
CN116665137A (en) * 2023-08-01 2023-08-29 聊城市彩烁农业科技有限公司 Livestock breeding wastewater treatment method based on machine vision
CN116665137B (en) * 2023-08-01 2023-10-10 聊城市彩烁农业科技有限公司 Livestock breeding wastewater treatment method based on machine vision

Similar Documents

Publication Publication Date Title
CN109360156B (en) Single image rain removing method based on image block generation countermeasure network
Gurrola-Ramos et al. A residual dense u-net neural network for image denoising
CN111127336B (en) Image signal processing method based on self-adaptive selection module
CN115034973A (en) Part image enhancement method based on texture stability
CN111091503B (en) Image defocusing and blurring method based on deep learning
CN108510451B (en) Method for reconstructing license plate based on double-layer convolutional neural network
CN112614136B (en) Infrared small target real-time instance segmentation method and device
CN110930327B (en) Video denoising method based on cascade depth residual error network
Chen et al. Densely connected convolutional neural network for multi-purpose image forensics under anti-forensic attacks
CN117253154B (en) Container weak and small serial number target detection and identification method based on deep learning
CN113436112B (en) Image enhancement method, device and equipment
CN114723630A (en) Image deblurring method and system based on cavity double-residual multi-scale depth network
CN115471746A (en) Ship target identification detection method based on deep learning
CN115880495A (en) Ship image target detection method and system under complex environment
CN109003247B (en) Method for removing color image mixed noise
KR102466061B1 (en) Apparatus for denoising using hierarchical generative adversarial network and method thereof
CN113592740A (en) Image noise removing method in air tightness detection based on artificial intelligence
CN117315336A (en) Pollen particle identification method, device, electronic equipment and storage medium
Van Noord et al. Light-weight pixel context encoders for image inpainting
CN112541873B (en) Image processing method based on bilateral filter
US20220398696A1 (en) Image processing method and device, and computer-readable storage medium
CN115170812A (en) Image denoising model training and denoising method, device and storage medium thereof
Richmond et al. Image deblurring using multi-scale dilated convolutions in a LSTM-based neural network
CN115205148A (en) Image deblurring method based on double-path residual error network
CN114529518A (en) Image pyramid and NLM-based image enhancement method for cryoelectron microscope

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230725

Address after: 226399 South of Dongjiang Road and East of Fuhai Road, Tongzhou Bay River Sea Linkage Development Demonstration Zone, Nantong City, Jiangsu Province

Applicant after: Jiangsu Baohong Metal Industry Co.,Ltd.

Address before: Room 1, No. 999-1, Gangxi Avenue, Baochang Town, Haimen City, Nantong City, Jiangsu Province 226100

Applicant before: Haimen Baohong Machinery Co.,Ltd.

TA01 Transfer of patent application right
WW01 Invention patent application withdrawn after publication

Application publication date: 20220909

WW01 Invention patent application withdrawn after publication