CN114723632A - Part abnormal exposure image correction method and device based on texture information

Info

Publication number
CN114723632A
Authority
CN
China
Prior art keywords: image, texture, obtaining, exposure, integrity
Prior art date
Legal status
Pending
Application number
CN202210348628.5A
Other languages
Chinese (zh)
Inventor
黄桂华
高航
Current Assignee
Nantong Jixing Fastener Technology Co ltd
Original Assignee
Nantong Jixing Fastener Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nantong Jixing Fastener Technology Co ltd filed Critical Nantong Jixing Fastener Technology Co ltd
Priority to CN202210348628.5A
Publication of CN114723632A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10024 Color image

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method and a device for correcting abnormally exposed images of parts based on texture information, relating to the field of computer vision; the overexposed area and the underexposed area in an abnormally exposed image can be corrected simultaneously. The method mainly comprises the following steps: acquiring a first image and an exposure value of the first image; obtaining the texture integrity and a correction necessity index of the first image according to the edge features of the first image and the frequencies of gray values in a gray histogram of the first image, and correcting the first image when the correction necessity index is larger than a preset first threshold; obtaining a lower limit and an upper limit of an adjustment value according to the texture integrity of the first image, and obtaining an exposure value range; adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values; and calculating the fusion weight of each second image, and fusing the plurality of second images according to the fusion weights to obtain the corrected first image. The specific application scenario of the invention is the correction of abnormally exposed images.

Description

Part abnormal exposure image correction method and device based on texture information
Technical Field
The application relates to the field of computer vision, in particular to a method and a device for correcting an abnormal exposure image of a part based on texture information.
Background
During shooting of a part, the obtained part image may exhibit abnormal exposure because of the light intensity and the angle of the light source. Abnormal exposure includes overexposure and underexposure, and an overexposed or underexposed part image cannot present the complete texture details of the part. In addition, because the surface of the part is reflective, a shot part image contains many reflective areas; reducing the exposure value during shooting reduces the reflective areas but introduces a large number of dark areas. The abnormally exposed part image therefore needs to be corrected so that it presents the complete texture details of the part.
To address the above problems, the prior art usually changes the overall exposure of the image to reach a more appropriate exposure. However, this further increases the exposure of the overexposed portions or further decreases the exposure of the underexposed portions, so the overexposed and underexposed portions of the image cannot be corrected at the same time, and a good correction effect for abnormally exposed images cannot be achieved.
Disclosure of Invention
In order to solve the technical problems, the invention provides a method and a device for correcting an abnormal exposure image of a part based on texture information.
In a first aspect, a method for correcting an abnormal exposure image of a part based on texture information is provided, including:
and acquiring a first image and an exposure value of the first image, wherein the first image refers to an abnormal exposure image to be corrected.
And obtaining the texture integrity of the first image according to the edge characteristics of the first image and the frequency of gray values in a gray histogram of the first image, obtaining a correction necessity index according to the texture integrity of the first image, and correcting the first image when the correction necessity index is greater than a preset first threshold value.
And obtaining a lower adjustment value limit and an upper adjustment value limit according to the texture integrity of the first image, and obtaining an exposure value range according to the lower adjustment value limit, the upper adjustment value limit and the exposure value of the first image.
And adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values, wherein the exposure value of the second images is within the exposure value range.
And obtaining the texture integrity of each second image according to a method for obtaining the texture integrity of the first image, and obtaining the fusion weight of each second image according to the texture integrity of each second image.
And performing fusion superposition on the obtained fusion weights of all the second images to obtain the pixel value of the first image, and finishing the correction of the first image.
Further, in the method for correcting an abnormal exposure image of a part based on texture information, the texture integrity of the first image/the second image is obtained from the texture integrity of a bright area and the texture integrity of a dark area of the first image/the second image, and the specific method includes:
obtaining a gray scale image of the first image/the second image, and obtaining a bright area image and a dark area image according to the gray scale image, wherein the bright area image comprises a bright area in the gray scale image, and the dark area image comprises a dark area in the gray scale image.
And acquiring the texture integrity of the bright area according to the texture information of the bright area image.
And acquiring the texture integrity of the dark area according to the texture information of the dark area image.
And obtaining the texture integrity according to the texture integrity of the bright area and the texture integrity of the dark area.
Further, in the method for correcting the abnormal exposure image of the part based on texture information, the process of acquiring the texture integrity includes:
and acquiring the texture richness and the texture uniformity of the bright area image/the dark area image according to the texture features of the bright area image/the dark area image.
And multiplying the texture richness and the texture uniformity to obtain the texture integrity of the bright area image/the dark area image.
Further, in the method for correcting the abnormal exposure image of the part based on texture information, the process of acquiring the texture richness includes:
Subtracting the minimum gray value among all pixel points of the bright area image/dark area image from the maximum gray value to obtain the gray-value width.
And obtaining the histogram uniformity degree according to the variance of the occurrence times of different gray values in the gray histogram of the bright area image/the dark area image.
And multiplying the width of the gray value by the uniformity degree of the histogram to obtain the texture richness.
Further, in the method for correcting the abnormal exposure image of the part based on texture information, the process of obtaining the texture uniformity degree includes:
and performing edge detection on the bright area image/the dark area image to obtain edge characteristic points of the bright area image/the dark area image.
Taking, for each edge feature point in the bright area image/dark area image, the number of edge feature points among the nine pixel points consisting of that pixel point and its eight neighborhood pixel points as the frequency of the edge feature point.
And obtaining the line distribution uniformity degree of the edge characteristic points according to the frequency of the edge characteristic points of the lines in the bright area image/dark area image and the line distance characteristics of the edge characteristic points of the lines.
And obtaining the column distribution uniformity degree of the edge characteristic points according to the frequency of the edge characteristic points of the columns in the bright area image/dark area image and the column distance characteristics of the edge characteristic points of the columns.
And multiplying the uniform degree of the line distribution of the edge feature points by the uniform degree of the column distribution to obtain the uniform degree of the texture.
Further, in the method for correcting the abnormal exposure image of the part based on the texture information, the line distance feature refers to a distance between an edge feature point and another adjacent edge feature point in the same line, or a distance between an edge feature point and a boundary of a bright area image/a dark area image.
The column distance feature refers to a distance between an edge feature point and another adjacent edge feature point in the same column, or a distance between an edge feature point and a boundary of a bright area image/a dark area image.
Further, in the method for correcting the abnormal exposure image of the part based on texture information, fusing and superposing all the second images according to the obtained fusion weights to obtain the pixel values of the first image and complete the correction of the first image includes:
the fusion weight comprises a dark region fusion weight and a bright region fusion weight.
And performing weighted fusion on the dark region in the gray level image of the second image and the region in the second image corresponding to the dark region by using the dark region fusion weight.
And performing weighted fusion on the bright areas in the gray-scale image of the second image and the areas in the second image corresponding to the bright areas by using the bright area fusion weight.
In a second aspect, the present invention provides an apparatus for correcting an abnormal exposure image of a part based on texture information, comprising:
the image and exposure value acquisition module is used for acquiring a first image and an exposure value of the first image, wherein the first image refers to an abnormal exposure image to be corrected.
The image and exposure value acquisition module is used for acquiring a first image and an exposure value of the first image, wherein the first image refers to an abnormal exposure image to be corrected;
the first calculation module is used for obtaining the texture integrity of the first image according to the edge characteristics of the first image and the frequency of gray values in a gray histogram of the first image, obtaining a correction necessity index according to the texture integrity of the first image, and correcting the first image when the correction necessity index is larger than a preset first threshold value;
the second calculation module is used for obtaining an adjustment value lower limit and an adjustment value upper limit according to the texture integrity of the first image and obtaining an exposure value range according to the adjustment value lower limit, the adjustment value upper limit and the exposure value of the first image;
the image generation module is used for adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values, and the exposure value of each second image is within the exposure value range;
the third calculation module is used for obtaining the texture integrity of each second image according to the method for obtaining the texture integrity of the first image and obtaining the fusion weight of each second image according to the texture integrity of each second image;
and the image correction module is used for performing fusion and superposition on the obtained fusion weights of all the second images to obtain the pixel value of the first image and finishing the correction of the first image.
In view of the above technical problems, the present invention provides a method and an apparatus for correcting an abnormal exposure image of a part based on texture information, which mainly include:
acquiring a first image and an exposure value of the first image, wherein the first image refers to the abnormally exposed image to be corrected; obtaining the texture integrity of the first image according to the edge features of the first image and the frequencies of gray values in a gray histogram of the first image, obtaining a correction necessity index according to the texture integrity of the first image, and correcting the first image when the correction necessity index is greater than a preset first threshold; obtaining a lower adjustment-value limit and an upper adjustment-value limit according to the texture integrity of the first image, and obtaining an exposure value range according to the lower limit, the upper limit and the exposure value of the first image; adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values, wherein the exposure values of the second images lie within the exposure value range; obtaining the texture integrity of each second image according to the method for obtaining the texture integrity of the first image, and obtaining the fusion weight of each second image according to its texture integrity; and fusing and superposing all the second images according to the obtained fusion weights to obtain the pixel values of the first image, thereby completing the correction of the first image.
Compared with the prior art, the invention has the beneficial effects that: judging whether abnormal exposure exists or not by utilizing the texture information in the collected part image, and avoiding unnecessary processing on the image which does not need to be corrected; the texture information of the image is utilized to obtain an exposure value range, and the overexposure area and the underexposure area in the multiple images under different exposure values in the exposure value range are subjected to weighted fusion, so that the overexposure area and the underexposure area in the abnormal exposure image can be simultaneously corrected, and the image correction process is more targeted and accurate.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for correcting an abnormal exposure image of a part based on texture information according to embodiment 1 of the present invention.
Fig. 2 is a schematic flowchart of a device for correcting an abnormal exposure image of a part based on texture information according to embodiment 2 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature; in the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
Example 1
Embodiment 1 of the present invention provides a method for correcting an abnormal exposure image of a part based on texture information, as shown in fig. 1, including:
101. and acquiring a first image and an exposure value of the first image, wherein the first image refers to an abnormal exposure image to be corrected.
An image acquisition device is used to acquire the abnormally exposed image to be corrected, i.e. the first image. The first image is in RGB format; RGB is a color standard in which various colors are obtained by varying the three color channels red (R), green (G) and blue (B) and superimposing them, RGB representing the colors of the red, green and blue channels.
102. And obtaining the texture integrity of the first image according to the edge characteristics of the first image and the frequency of the gray values in the gray histogram of the first image, obtaining a correction necessity index according to the texture integrity of the first image, and correcting the first image when the correction necessity index is greater than a preset first threshold value.
Embodiment 102 specifically includes 1021, 1022, and 1023.
1021. A gray image of the first image is obtained, and a bright area image and a dark area image are obtained according to the gray image.
Specifically, in order to divide the bright areas and the dark areas in the grayscale image of the first image, this embodiment gives a segmentation threshold range for the bright areas and one for the dark areas: the gray-level threshold range of the bright areas is [204, 255] and that of the dark areas is [0, 51].
It should be noted that the pixel points of the grayscale image of the first image whose gray values lie within the threshold range [0, 51] are retained, and the pixel points whose gray values lie within (51, 255] are set to 0, so as to obtain the dark area image.
Specifically, the pixel points of the grayscale image of the first image whose gray values lie within the threshold range [204, 255] are retained, and the pixel points whose gray values lie within [0, 204) are set to 0, so as to obtain the bright area image. The bright area image includes the bright areas in the grayscale image of the first image, and the dark area image includes the dark areas in the grayscale image of the first image.
Specifically, in this embodiment the luminance information of the first image is obtained from the gray values of the image; the bright areas in the grayscale image of the first image correspond to the overexposed areas in the first image, and the dark areas in the grayscale image of the first image correspond to the underexposed areas in the first image.
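As an illustration of step 1021, the following Python sketch splits the grayscale image into the bright area image and the dark area image using the thresholds [204, 255] and [0, 51] given above; the function name and the use of OpenCV/NumPy are assumptions of this sketch, not part of the original disclosure.

import cv2
import numpy as np

def split_bright_dark(first_image_bgr):
    # Convert the RGB/BGR first image to a grayscale image.
    gray = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2GRAY)
    # Keep pixels in [204, 255] for the bright area image, set all others to 0.
    bright_area = np.where(gray >= 204, gray, 0).astype(np.uint8)
    # Keep pixels in [0, 51] for the dark area image, set all others to 0.
    dark_area = np.where(gray <= 51, gray, 0).astype(np.uint8)
    return gray, bright_area, dark_area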
1022. Obtaining the texture integrity of the bright area according to the texture information of the bright area image; and obtaining the texture integrity of the dark area according to the texture information of the dark area image.
It should be noted that in this embodiment the method for obtaining the texture integrity of the dark area is the same as that for the bright area; only the target image differs. The calculation of the texture integrity specifically includes 10221, 10222 and 10223. The texture integrity reflects how complete the texture in the image is and mainly consists of the texture richness and the texture uniformity.
10221. And obtaining the texture richness of the bright area image/dark area image.
Specifically, the notation "bright area image/dark area image" in this embodiment means the bright area image or the dark area image. Due to the influence of light, part of the texture in the bright or dark area is lost, which makes the texture distribution uneven; the degree of texture uniformity therefore also needs to be extracted.
First, the minimum gray value among all pixel points in the bright area image/dark area image is subtracted from the maximum gray value to obtain the gray-value width.
It should be noted that the texture richness of the bright area image and of the dark area image is calculated in the same way; the bright area image is taken as an example below.
Specifically, subtracting the minimum gray value in the bright area image from the maximum gray value gives the gray-value width wl of the bright area image, and subtracting the minimum gray value in the dark area image from the maximum gray value gives the gray-value width wa of the dark area image.
Secondly, the histogram uniformity degree is obtained from the variance of the numbers of occurrences of the different gray values in the gray histogram of the bright area image/dark area image. The richness of the texture is mainly reflected in the histogram: the richer the gray information contained in the histogram, the richer the texture features. The gray-value width is multiplied by the histogram uniformity degree to obtain the texture richness.
Specifically, the gray histogram of the bright area image is obtained, and the mean of its frequencies is calculated as [formula image], where p(h_j) is the frequency value corresponding to the j-th gray value h_j in the gray histogram of the bright area image and wl is the gray-value width of the bright area image. The variance of the frequencies of the gray histogram of the bright area image is then calculated as [formula image], from which the uniformity degree jh1 of the gray histogram of the bright area is obtained as [formula image].
It should be noted that the gray-histogram uniformity degree jh2 of the dark area image can be obtained in the same way as that of the bright area image.
Specifically, the texture richness of the bright area image is ff1 = wl × jh1, where wl is the gray-value width of the bright area image and jh1 is the uniformity degree of its gray histogram; likewise, the texture richness of the dark area image is ff2 = wa × jh2, where wa is the gray-value width of the dark area image and jh2 is the uniformity degree of its gray histogram.
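A minimal sketch of the texture-richness computation in step 10221 follows. Because the histogram-uniformity formula itself appears only as an image in the source text, the sketch substitutes an assumed monotone mapping jh = 1 / (1 + variance of the frequencies); the function name and this mapping are illustrative assumptions.

import numpy as np

def texture_richness(region_image):
    # region_image: bright or dark area image in which non-region pixels are 0.
    values = region_image[region_image > 0]
    if values.size == 0:
        return 0.0
    # Gray-value width: maximum gray value minus minimum gray value (wl or wa).
    width = int(values.max()) - int(values.min())
    # Frequencies p(h_j) of the gray values that actually occur in the region.
    hist, _ = np.histogram(values, bins=256, range=(0, 256))
    freqs = hist[hist > 0] / values.size
    # Assumed histogram-uniformity degree: decreases as the frequency variance grows.
    jh = 1.0 / (1.0 + np.var(freqs))
    # Texture richness ff = gray-value width x histogram uniformity degree.
    return width * jh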
10222. The texture uniformity of the bright area image/dark area image is obtained.
Specifically, the uniformity of the texture is mainly represented by the distance information between texture pixels and by the differences in texture frequency. The distance information is reflected by the distance magnitude and the distance deviation: the larger the distances between texture pixels, the more uneven the texture distribution; the larger the variance of the texture-pixel distances in a row or column, the more non-uniform the texture distribution of that row or column; and the larger the differences in texture density in a row or column, the more non-uniform the texture distribution of that row or column.
It should be noted that, the calculation method of the texture uniformity of the bright area image and the dark area image is the same, and the method for obtaining the texture uniformity is specifically described below by taking the bright area image as an example.
First, edge detection is performed on the bright area image to obtain the edge feature points in the bright area image; in this embodiment the Sobel operator is used to extract the edge feature points of the bright area image.
Then, for each edge feature point in the bright area image, the number of edge feature points among the nine pixel points consisting of that point and its eight neighborhood pixel points is taken as the frequency of the edge feature point; the line-distribution uniformity degree of the edge feature points is obtained according to the frequency of the edge feature points and the line-distance features of the edge feature points of each line; and the column-distribution uniformity degree of the edge feature points is obtained according to the frequency of the edge feature points and the column-distance features of the edge feature points of each column.
Specifically, the pixel distance between every two adjacent edge feature points is obtained: suppose that in the k-th line of the bright area image two adjacent edge feature points exist at (k, i) and (k, j); the distance between these two edge feature points is d1_kj = j - i.
It should be noted that when there is no other edge feature point between the edge feature point at (k, l) in the bright area image and the image boundary, the pixel distance of that edge feature point is the closest distance from the point to the image boundary, i.e. d1_kl = min(l, M - l), where M is the number of columns of the image.
A distance sequence between the edge feature points of the k-th line of the bright area image is thus obtained; using the frequencies of the edge feature points, the sequence formed by the frequencies of the edge feature points of the k-th line of the bright area image can also be obtained, from which the mean distance of the line edge feature points and the distance deviation of the edge feature points in the bright area image can be calculated.
Specifically, the mean distance between the edge feature points of the k-th line of the bright area image is [formula image], and the distance deviation between the edge feature points of the k-th line of the bright area image is [formula image], where d1_kh is the h-th distance value in the distance sequence of the edge feature points of the k-th line of the bright area image and r is the number of distance values in that sequence.
It should be noted that the frequency deviation of the edge feature points of the k-th line of the bright area image is [formula image], where M is the number of columns of the bright area image, f1_kg is the frequency value of the edge feature at the position of the k-th line and g-th column of the bright area, and [formula image] is the mean frequency of the pixel points of the k-th line.
Secondly, the distribution uniformity degree of the edge feature points of the k-th line of the bright area image is [formula image], where σd1_k is the distance deviation of the edge feature points of the k-th line of the bright area image, σf1_k is the frequency deviation of the edge features of the k-th line of the bright area image, and [formula image] is the mean distance of the edge feature points of the k-th line of the bright area image.
Further, the average of the line-distribution uniformity degrees of the edge features in the bright area image is [formula image], where Jy1_q is the edge-feature distribution uniformity degree of the q-th line of the bright area image and N is the number of lines of the bright area image; the average of the column-distribution uniformity degrees of the edge features in the bright area image can be obtained in the same way, giving the edge-distribution uniformity of the bright area image as [formula image]. Finally, the line-distribution uniformity degree and the column-distribution uniformity degree of the edge feature points are multiplied to obtain the texture uniformity degree Jy1 of the bright area image.
Further, the texture uniformity degree Jy2 of the dark area image can be obtained by the same method as the texture uniformity degree of the bright area image.
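The texture-uniformity computation of step 10222 can be sketched as below. The Sobel edge detection, the count over the nine-pixel neighbourhood used as edge-point frequency, and the multiplication of line and column uniformity follow the description above; the per-line combination 1 / (1 + distance deviation + frequency deviation) is only an assumed stand-in for the formula not reproduced in the source, and the edge threshold is likewise an assumption.

import cv2
import numpy as np

def edge_points(region_image, threshold=50):
    # Sobel edge detection; pixels with a large gradient magnitude are edge feature points.
    gx = cv2.Sobel(region_image, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(region_image, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.sqrt(gx ** 2 + gy ** 2)
    return (magnitude > threshold).astype(np.uint8)

def edge_frequency(edges):
    # Frequency of an edge point = number of edge points among the nine pixels
    # formed by the point and its eight neighbours.
    kernel = np.ones((3, 3), dtype=np.float64)
    counts = cv2.filter2D(edges.astype(np.float64), -1, kernel,
                          borderType=cv2.BORDER_CONSTANT)
    return counts * edges

def line_uniformity(edges, freq):
    # Average distribution uniformity over all lines; transpose the inputs for columns.
    n_lines, n_cols = edges.shape
    scores = []
    for k in range(n_lines):
        cols = np.flatnonzero(edges[k])
        if cols.size == 0:
            continue
        if cols.size >= 2:
            dists = np.diff(cols).astype(np.float64)   # distances between adjacent edge points
        else:
            # A lone edge point uses its closest distance to the image boundary.
            dists = np.array([min(cols[0], n_cols - cols[0])], dtype=np.float64)
        sigma_d = dists.std()          # distance deviation of the k-th line
        sigma_f = freq[k].std()        # frequency deviation of the k-th line
        scores.append(1.0 / (1.0 + sigma_d + sigma_f))   # assumed per-line uniformity
    return float(np.mean(scores)) if scores else 0.0

def texture_uniformity(region_image):
    edges = edge_points(region_image)
    freq = edge_frequency(edges)
    # Texture uniformity Jy = line-distribution uniformity x column-distribution uniformity.
    return line_uniformity(edges, freq) * line_uniformity(edges.T, freq.T)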
10223. And multiplying the texture richness and the texture uniformity to obtain the texture integrity of the bright area image/the dark area image.
The texture integrity of the bright area is Wz1 = ff1 × Jy1, where ff1 is the texture richness of the bright area image and Jy1 is the texture uniformity degree of the bright area image; the texture integrity of the dark area is Wz2 = ff2 × Jy2, where ff2 is the texture richness of the dark area image and Jy2 is the texture uniformity degree of the dark area image.
1023. And obtaining the texture integrity of the first image according to the texture integrity of the bright area and the texture integrity of the dark area.
Specifically, the texture integrity of the bright area and the texture integrity of the dark area are added to obtain the texture integrity of the first image, and the texture integrity can reflect the texture information of the first image, namely the abnormal exposure image.
The reciprocal of the texture integrity of the first image is the correction necessity index, which reflects how necessary it is to correct the first image. Whether the correction necessity index is greater than the preset first threshold is then judged; if so, the first image needs to be corrected, and if not, the image has no abnormal exposure and no subsequent correction is required.
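A brief sketch of steps 10223 and 1023, reusing the helper functions sketched above; the default threshold value is an arbitrary placeholder for the preset first threshold, which the source does not specify.

def needs_correction(bright_area, dark_area, first_threshold=0.5):
    # Region integrity = texture richness x texture uniformity (step 10223).
    wz1 = texture_richness(bright_area) * texture_uniformity(bright_area)
    wz2 = texture_richness(dark_area) * texture_uniformity(dark_area)
    # Image texture integrity = bright-area integrity + dark-area integrity (step 1023).
    integrity = wz1 + wz2
    # Correction necessity index = reciprocal of the texture integrity.
    necessity = 1.0 / integrity if integrity > 0 else float("inf")
    return necessity > first_threshold, wz1, wz2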
103. And obtaining an adjustment value lower limit and an adjustment value upper limit according to the texture integrity of the first image, and obtaining an exposure value range according to the adjustment value lower limit, the adjustment value upper limit and the exposure value of the first image.
Specifically, α is the lower limit of the adjustment value and β is the upper limit of the adjustment value; α is related to the texture information of the overexposed area in the first image, and β is related to the texture information of the underexposed area in the first image. When the texture information of the bright area is seriously lost, the exposure value should be reduced for the bright area; when the texture information of the dark area is seriously lost, the exposure value should be increased for the dark area. In this embodiment the lower limit of the adjustment value is [formula image] and the upper limit of the adjustment value is [formula image], where Wz1 is the texture integrity of the bright area, Wz2 is the texture integrity of the dark area, and the constant parameters can be selected by the implementer according to specific needs.
Further, the exposure value range [Bg - α, Bg + β] can be obtained, where Bg is the exposure value of the abnormally exposed image, i.e. the first image.
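Since the α and β formulas appear only as images in the source, the sketch below merely mirrors their stated behaviour: the lower the texture integrity of the bright (resp. dark) area, the larger α (resp. β). The form c / (1 + Wz) and the constant c are assumptions of this illustration.

def exposure_value_range(bg, wz1, wz2, c=2.0):
    # Lower-limit offset: grows when the bright (overexposed) area has lost texture.
    alpha = c / (1.0 + wz1)
    # Upper-limit offset: grows when the dark (underexposed) area has lost texture.
    beta = c / (1.0 + wz2)
    # Exposure value range [Bg - alpha, Bg + beta].
    return bg - alpha, bg + beta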
104. And adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values in the exposure value range.
Specifically, according to the exposure value range obtained in 103, a plurality of second images with different exposure values can be obtained on the basis of the first image; the second images differ from the first image only in exposure value, and the exposure values of all the second images lie within the exposure value range.
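The source does not state how the second images are produced; the sketch below simply samples v exposure values within the range and simulates each one by scaling the first image with a 2^(EV difference) gain, which is a common approximation and purely an assumption of this illustration.

import numpy as np

def generate_second_images(first_image_bgr, bg, ev_low, ev_high, v=5):
    # Sample v exposure values inside the exposure value range [ev_low, ev_high].
    second_images = []
    for ev in np.linspace(ev_low, ev_high, v):
        gain = 2.0 ** (ev - bg)   # relative exposure change with respect to the first image
        img = np.clip(first_image_bgr.astype(np.float64) * gain, 0, 255).astype(np.uint8)
        second_images.append((ev, img))
    return second_images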
105. And obtaining the texture integrity of each second image according to the method for obtaining the texture integrity of the first image, and obtaining the fusion weight of each second image according to the texture integrity of each second image.
Further, using the method for obtaining the texture integrity of an image described in 102 of this embodiment, the texture integrity of the bright area image and of the dark area image corresponding to each second image can be obtained.
It should be noted that the bright areas in the grayscale image of the first image correspond to the overexposed areas in the first image, and the dark areas in the grayscale image of the first image correspond to the underexposed areas in the first image. The fusion weight of the a-th second image for the overexposed area is then [formula image], and the fusion weight of the a-th second image for the underexposed area is [formula image], where v represents the total number of second images.
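The fusion-weight formulas are likewise only images in the source; one natural reading, assumed here, is that each second image is weighted by its bright-area (or dark-area) texture integrity normalised over all v second images.

import numpy as np

def fusion_weights(bright_integrities, dark_integrities):
    wz1 = np.asarray(bright_integrities, dtype=np.float64)   # Wz1 of each second image
    wz2 = np.asarray(dark_integrities, dtype=np.float64)     # Wz2 of each second image
    # Overexposed-area weights q1 and underexposed-area weights q2, normalised to sum to 1.
    q1 = wz1 / wz1.sum() if wz1.sum() > 0 else np.full_like(wz1, 1.0 / wz1.size)
    q2 = wz2 / wz2.sum() if wz2.sum() > 0 else np.full_like(wz2, 1.0 / wz2.size)
    return q1, q2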
106. All the second images are fused and superposed according to the obtained fusion weights to obtain the pixel values of the first image, completing the correction of the first image.
Specifically, according to the fusion weight of the second image, the overexposed region and the underexposed region in the first image are fused, the normal region in the first image is kept unchanged, and the obtained image is the corrected first image.
It should be noted that the pixel values in the overexposed area of the first image are corrected as [formula image], where [formula image] denotes, for a pixel point of a bright area in the grayscale image of the first image, the pixel value of the corresponding pixel point in the a-th second image; the pixel values in the underexposed area of the first image are corrected as [formula image], where [formula image] denotes, for a pixel point of a dark area in the grayscale image of the first image, the pixel value of the corresponding pixel point in the a-th second image. Xs_1 is the corrected pixel value of the overexposed area of the first image, and Xs_2 is the corrected pixel value of the underexposed area of the first image.
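A sketch of the weighted fusion of step 106 under the same assumptions: overexposed (bright-mask) pixels are replaced by the q1-weighted sum of the corresponding pixels of the second images, underexposed (dark-mask) pixels by the q2-weighted sum, and all other pixels of the first image are left unchanged.

import numpy as np

def correct_first_image(first_image_bgr, gray, second_images, q1, q2):
    bright_mask = gray >= 204   # overexposed area of the first image
    dark_mask = gray <= 51      # underexposed area of the first image
    # Stack the v second images into shape (v, H, W, 3).
    stack = np.stack([img.astype(np.float64) for _, img in second_images])
    # Weighted sums over the v second images for each region.
    fused_bright = np.tensordot(q1, stack, axes=1)
    fused_dark = np.tensordot(q2, stack, axes=1)
    corrected = first_image_bgr.astype(np.float64).copy()
    corrected[bright_mask] = fused_bright[bright_mask]
    corrected[dark_mask] = fused_dark[dark_mask]
    return np.clip(corrected, 0, 255).astype(np.uint8)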
Compared with the prior art, the invention has the beneficial effects that: judging whether abnormal exposure exists or not by utilizing the texture information in the collected part image, and avoiding unnecessary processing on the image which does not need to be corrected; the texture information of the image is utilized to obtain an exposure value range, and the overexposure area and the underexposure area in the multiple images under different exposure values in the exposure value range are subjected to weighted fusion, so that the overexposure area and the underexposure area in the abnormal exposure image can be simultaneously corrected, and the image correction process is more targeted and accurate.
Example 2
Embodiment 2 of the present invention provides an apparatus for correcting an abnormal exposure image of a part based on texture information, as shown in fig. 2, including:
the image and exposure value obtaining module 21 is configured to obtain a first image and an exposure value of the first image, where the first image is an abnormal exposure image to be corrected.
The first calculating module 22 is configured to obtain a texture integrity of the first image according to the edge feature of the first image and the frequency of the gray value in the gray histogram of the first image, obtain a modification necessity index according to the texture integrity of the first image, and modify the first image when the modification necessity index is greater than a preset first threshold.
The second calculating module 23 is configured to obtain a lower adjustment value limit and an upper adjustment value limit according to the texture integrity of the first image, and obtain an exposure value range according to the lower adjustment value limit, the upper adjustment value limit, and the exposure value of the first image.
And the image generation module 24 is configured to adjust the exposure value of the first image to obtain a plurality of second images with different exposure values, where the exposure value of the second image is within the exposure value range.
And the third calculating module 25 is configured to obtain the texture integrity of each second image according to the method for obtaining the texture integrity of the first image, and obtain the fusion weight of each second image according to the texture integrity of each second image.
And the image correction module 26 is configured to fuse and superpose all the second images according to the obtained fusion weights to obtain the pixel values of the first image, thereby completing the correction of the first image.
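Purely for illustration, the six modules of Embodiment 2 could be composed as follows, reusing the helper functions sketched in Embodiment 1; the class name, parameter defaults and wiring are assumptions, not part of the original disclosure.

class PartExposureCorrector:
    def __init__(self, first_threshold=0.5, num_second_images=5):
        self.first_threshold = first_threshold
        self.v = num_second_images

    def correct(self, first_image_bgr, exposure_value):
        # Image and exposure value acquisition module (21) provides the inputs.
        gray, bright, dark = split_bright_dark(first_image_bgr)
        # First calculation module (22): texture integrity and correction necessity.
        needs_fix, wz1, wz2 = needs_correction(bright, dark, self.first_threshold)
        if not needs_fix:
            return first_image_bgr
        # Second calculation module (23): exposure value range.
        ev_low, ev_high = exposure_value_range(exposure_value, wz1, wz2)
        # Image generation module (24): second images with different exposure values.
        seconds = generate_second_images(first_image_bgr, exposure_value, ev_low, ev_high, self.v)
        # Third calculation module (25): texture integrity and fusion weight of each second image.
        bright_int, dark_int = [], []
        for _, img in seconds:
            _, b, d = split_bright_dark(img)
            bright_int.append(texture_richness(b) * texture_uniformity(b))
            dark_int.append(texture_richness(d) * texture_uniformity(d))
        q1, q2 = fusion_weights(bright_int, dark_int)
        # Image correction module (26): weighted fusion of the second images.
        return correct_first_image(first_image_bgr, gray, seconds, q1, q2)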
In summary, compared with the prior art, the invention has the beneficial effects that: judging whether abnormal exposure exists or not by utilizing the texture information in the collected part image, and avoiding unnecessary processing on the image which does not need to be corrected; the texture information of the image is utilized to obtain an exposure value range, and the overexposure area and the underexposure area in the multiple images under different exposure values in the exposure value range are subjected to weighted fusion, so that the overexposure area and the underexposure area in the abnormal exposure image can be simultaneously corrected, and the image correction process is more targeted and accurate.
The use of words such as "including", "comprising" and "having" in this disclosure is open-ended, meaning "including but not limited to", and these words are used interchangeably. The word "or" as used herein means, and is used interchangeably with, the term "and/or", unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the method and system of the present invention, various components or steps may be decomposed and/or re-combined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure.
The above-mentioned embodiments are merely examples for clearly illustrating the present invention and do not limit the scope of the present invention. Other variations and modifications in the above description will occur to those skilled in the art and are not necessarily exhaustive of all embodiments. All designs identical or similar to the present invention are within the scope of the present invention.

Claims (8)

1. A part abnormal exposure image correction method based on texture information is characterized by comprising the following steps:
acquiring a first image and an exposure value of the first image, wherein the first image refers to an abnormal exposure image to be corrected;
obtaining texture integrity of the first image according to the edge feature of the first image and the frequency of gray values in a gray histogram of the first image, obtaining a correction necessity index according to the texture integrity of the first image, and correcting the first image when the correction necessity index is larger than a preset first threshold;
obtaining an adjustment value lower limit and an adjustment value upper limit according to the texture integrity of the first image, and obtaining an exposure value range according to the adjustment value lower limit, the adjustment value upper limit and the exposure value of the first image;
adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values, wherein the exposure value of the second images is within the exposure value range;
obtaining the texture integrity of each second image according to a method for obtaining the texture integrity of a first image, and obtaining the fusion weight of each second image according to the texture integrity of each second image;
and fusing and superposing all the second images according to the obtained fusion weights to obtain the pixel values of the first image, and finishing the correction of the first image.
2. The method for correcting the abnormal exposure image of the part based on the texture information as claimed in claim 1, wherein the texture integrity of the first/second image is obtained from the texture integrity of a bright area and the texture integrity of a dark area of the first/second image, and the method comprises the following steps:
obtaining a gray scale image of the first image/the second image, and obtaining a bright area image and a dark area image according to the gray scale image, wherein the bright area image comprises a bright area in the gray scale image, and the dark area image comprises a dark area in the gray scale image;
obtaining the texture integrity of the bright area according to the texture information of the bright area image;
acquiring the texture integrity of the dark area according to the texture information of the dark area image;
and obtaining the texture integrity according to the texture integrity of the bright area and the texture integrity of the dark area.
3. The method for correcting the abnormal exposure image of the part based on the texture information as claimed in claim 2, wherein the texture integrity acquiring process comprises:
acquiring texture richness and texture uniformity of the bright area image/dark area image according to texture features of the bright area image/dark area image;
and multiplying the texture richness and the texture uniformity to obtain the texture integrity of the bright area image/the dark area image.
4. The method for correcting the abnormal exposure image of the part based on the texture information as claimed in claim 3, wherein the texture richness acquiring process comprises:
subtracting the minimum gray value among all pixel points in the bright area image/dark area image from the maximum gray value to obtain the gray-value width;
obtaining the histogram uniformity degree according to the variance of the occurrence times of different gray values in the gray histogram of the bright area image/the dark area image;
and multiplying the width of the gray value by the uniformity degree of the histogram to obtain the texture richness.
5. The method for correcting the abnormal exposure image of the part based on the texture information as claimed in claim 3, wherein the method for obtaining the uniformity degree of the texture comprises:
performing edge detection on the bright area image/the dark area image to obtain edge characteristic points of the bright area image/the dark area image;
taking, for each edge feature point in the bright area image/dark area image, the number of edge feature points among the nine pixel points consisting of that pixel point and its eight neighborhood pixel points as the frequency of the edge feature point;
obtaining the line distribution uniformity degree of the edge characteristic points according to the frequency of the edge characteristic points of the lines in the bright area image/dark area image and the line distance characteristics of the edge characteristic points of the lines;
obtaining the column distribution uniformity degree of the edge characteristic points according to the frequency of the edge characteristic points of the columns in the bright area image/dark area image and the column distance characteristics of the edge characteristic points of the columns;
and multiplying the line distribution uniformity degree and the column distribution uniformity degree of the edge feature points to obtain the texture uniformity degree.
6. The method for correcting the abnormal exposure image of the part based on the texture information as claimed in claim 5, wherein the line distance feature is a distance from an edge feature point to another adjacent edge feature point in the same line, or a distance from an edge feature point to a boundary of a bright area image/a dark area image;
the column distance feature refers to a distance between an edge feature point and another adjacent edge feature point in the same column, or a distance between an edge feature point and a boundary of a bright area image/dark area image.
7. The method for correcting the abnormal exposure image of the part based on the texture information as claimed in claim 1, wherein fusing and superposing all the second images according to the obtained fusion weights to obtain the pixel values of the first image to complete the correction of the first image comprises the following steps:
the fusion weight comprises a dark region fusion weight and a bright region fusion weight;
carrying out weighted fusion on the dark area in the gray level image of the second image and the area in the second image corresponding to the dark area by using the dark area fusion weight;
and performing weighted fusion on the bright areas in the gray level image of the second image and the areas in the second image corresponding to the bright areas by using the bright area fusion weight.
8. An apparatus for correcting an abnormal exposure image of a part based on texture information, comprising:
the image and exposure value acquisition module is used for acquiring a first image and an exposure value of the first image, wherein the first image refers to an abnormal exposure image to be corrected;
the first calculation module is used for obtaining the texture integrity of the first image according to the edge characteristics of the first image and the frequency of gray values in a gray histogram of the first image, obtaining a correction necessity index according to the texture integrity of the first image, and correcting the first image when the correction necessity index is larger than a preset first threshold value;
the second calculation module is used for obtaining an adjustment value lower limit and an adjustment value upper limit according to the texture integrity of the first image and obtaining an exposure value range according to the adjustment value lower limit, the adjustment value upper limit and the exposure value of the first image;
the image generation module is used for adjusting the exposure value of the first image to obtain a plurality of second images with different exposure values, and the exposure value of each second image is within the exposure value range;
the third calculation module is used for obtaining the texture integrity of each second image according to the method for obtaining the texture integrity of the first image and obtaining the fusion weight of each second image according to the texture integrity of each second image;
and the image correction module is used for fusing and superposing all the second images according to the obtained fusion weights to obtain the pixel values of the first image and finishing the correction of the first image.
CN202210348628.5A 2022-04-01 2022-04-01 Part abnormal exposure image correction method and device based on texture information Pending CN114723632A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210348628.5A CN114723632A (en) 2022-04-01 2022-04-01 Part abnormal exposure image correction method and device based on texture information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210348628.5A CN114723632A (en) 2022-04-01 2022-04-01 Part abnormal exposure image correction method and device based on texture information

Publications (1)

Publication Number Publication Date
CN114723632A 2022-07-08

Family

ID=82241583

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210348628.5A Pending CN114723632A (en) 2022-04-01 2022-04-01 Part abnormal exposure image correction method and device based on texture information

Country Status (1)

Country Link
CN (1) CN114723632A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116188462A (en) * 2023-04-24 2023-05-30 深圳市翠绿贵金属材料科技有限公司 Noble metal quality detection method and system based on visual identification

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100157110A1 (en) * 2008-12-19 2010-06-24 Sanyo Electric Co., Ltd. Image Sensing Apparatus
JP2011199860A (en) * 2010-02-26 2011-10-06 Nikon Corp Imaging device and image generating program
CN104902168A (en) * 2015-05-08 2015-09-09 梅瑜杰 Image synthesis method, device and shooting equipment
CN110087003A (en) * 2019-04-30 2019-08-02 深圳市华星光电技术有限公司 More exposure image fusion methods

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100157110A1 (en) * 2008-12-19 2010-06-24 Sanyo Electric Co., Ltd. Image Sensing Apparatus
JP2011199860A (en) * 2010-02-26 2011-10-06 Nikon Corp Imaging device and image generating program
CN104902168A (en) * 2015-05-08 2015-09-09 梅瑜杰 Image synthesis method, device and shooting equipment
CN110087003A (en) * 2019-04-30 2019-08-02 深圳市华星光电技术有限公司 More exposure image fusion methods

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Q等: "Dual Illumination Estimation for Robust Exposure Correction", COMPUTER GRAPHICS FORUM, vol. 38, no. 7, 31 October 2019 (2019-10-31), pages 243 - 252, XP071489827, DOI: 10.1111/cgf.13833 *
王韬 (Wang Tao): "Research on recognition algorithms for fusion of non-aligned face features under complex illumination", China Master's Theses Full-text Database, Information Science and Technology, no. 02, 15 February 2022 (2022-02-15), pages 138-757 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116188462A (en) * 2023-04-24 2023-05-30 深圳市翠绿贵金属材料科技有限公司 Noble metal quality detection method and system based on visual identification
CN116188462B (en) * 2023-04-24 2023-08-11 深圳市翠绿贵金属材料科技有限公司 Noble metal quality detection method and system based on visual identification


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination