CN117710489A - Image color edge quantization method, apparatus, storage medium, and program product - Google Patents

Image color edge quantization method, apparatus, storage medium, and program product

Info

Publication number
CN117710489A
Authority
CN
China
Prior art keywords
color
image
target pixel
color difference
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310567260.6A
Other languages
Chinese (zh)
Inventor
周天一
李钱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310567260.6A priority Critical patent/CN117710489A/en
Publication of CN117710489A publication Critical patent/CN117710489A/en
Pending legal-status Critical Current

Abstract

The application provides an image color edge quantization method, apparatus, storage medium, and program product. In the method, an image to be quantized is acquired, where the image to be quantized includes a color edge. A target pixel point in the image to be quantized corresponds to a first color component, a second color component, and a third color component; the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point. The width of the color edge and/or the color difference of the color edge is determined according to the first color difference and the second color difference of the target pixel points in the image to be quantized, and the intensity of the color edge in the image to be quantized can then be accurately quantified according to that width and/or color difference. This solves the problem that subjective judgment cannot count or quantify the intensity of color edges in an image, so that the influence of color edges can be evaluated more objectively.

Description

Image color edge quantization method, apparatus, storage medium, and program product
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image color edge quantization method, an image color edge quantization device, a storage medium, and a program product.
Background
When an image acquisition device captures a picture, the image sensor generally acquires raw image data, which is then processed by an image signal processing algorithm and the like to obtain a commonly used image.
However, when the image acquisition device captures an image, a color edge may appear at the boundary between a highlight region and a low-light region of the image, for example at the boundary of an object in the image, due to hardware factors such as the image sensor or algorithmic factors such as the image signal processing algorithm; such a color edge is a false-color phenomenon. Once color edges appear in an image, the image quality is degraded and the visual experience is seriously affected.
Therefore, detecting the intensity of color edges in an image is of great significance for research on image acquisition devices and image signal processing algorithms. However, related color edge detection methods are often based on subjective judgment and cannot count or quantify the intensity of the color edges in an image.
Disclosure of Invention
The application provides an image color edge quantization method, apparatus, storage medium, and program product, which determine the width of a color edge and/or the color difference of the color edge in an image to be quantized, and then determine the intensity of the color edge from that width and/or color difference, so as to quantify the intensity of the color edge in the image to be quantized. This solves the problem that subjective judgment cannot count or quantify the intensity of color edges in an image, so that the influence of color edges can be evaluated more objectively.
In order to achieve the above purpose, the present application adopts the following technical solutions:
In a first aspect, an image color edge quantization method is provided. First, an image to be quantized is obtained, where the image to be quantized includes a color edge. Second, the width of the color edge and/or the color difference of the color edge is determined according to the first color difference and the second color difference of target pixel points in the image to be quantized, where each target pixel point corresponds to a first color component, a second color component, and a third color component, the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point. Finally, the intensity of the color edge in the image to be quantized is determined according to the width of the color edge and/or the color difference of the color edge.
Based on the above technical solution, an image to be quantized is obtained, where the image to be quantized includes a color edge. A target pixel point in the image to be quantized corresponds to three color components, namely a first color component, a second color component, and a third color component; the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point. In this embodiment, the width of the color edge and/or the color difference of the color edge is determined according to the first color difference and the second color difference of the target pixel points in the image to be quantized, and the intensity of the color edge in the image to be quantized is quantified according to that width and/or color difference. This solves the problem that subjective judgment cannot count or quantify the intensity of color edges in an image, so that the influence of color edges can be evaluated more objectively.
In a possible implementation manner of the first aspect, determining the intensity of the color edge in the image to be quantized according to the width of the color edge and the color difference of the color edge includes: performing a weighted average of the width of the color edge and the color difference of the color edge, using a preset first weight value corresponding to the width of the color edge and a preset second weight value corresponding to the color difference of the color edge, to obtain a target value, where the target value represents the intensity of the color edge in the image to be quantized.
Based on the above technical solution, the first weight value and the second weight value may be set by an evaluator. For example, the first weight value may be set equal to the second weight value; if the evaluator pays more attention to the width of the color edge, the first weight value may be set larger than the second weight value; if the evaluator pays more attention to the color difference of the color edge, the second weight value may be set larger than the first weight value. Different evaluators can set the first weight value and the second weight value according to different standards, so that the intensity of the color edge can be quantified according to different criteria.
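A minimal sketch of this weighted-average scoring is shown below, assuming the width and the color difference have already been computed; the function name, the weight values, and the idea of bringing both quantities to comparable ranges beforehand are illustrative assumptions, not details from the text.
```python
def fringe_intensity(width, chroma, w_width=0.5, w_chroma=0.5):
    """Combine edge width and edge color difference into one intensity score.

    width and chroma are assumed to be pre-normalized to comparable ranges;
    the weights are chosen by the evaluator and are assumed to sum to 1.
    """
    assert abs(w_width + w_chroma - 1.0) < 1e-9
    return w_width * width + w_chroma * chroma

# An evaluator who cares more about edge width than about edge color difference:
score = fringe_intensity(width=0.6, chroma=0.12, w_width=0.7, w_chroma=0.3)
```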
In a possible implementation manner of the first aspect, determining the width of the color edge according to the first color difference and the second color difference of the target pixel points in the image to be quantized includes: determining, according to the first color difference and the second color difference of the target pixel points in the image to be quantized, the number of target pixel points whose first color difference is greater than a first preset color difference threshold and whose second color difference is greater than a second preset color difference threshold; and determining the width of the color edge according to that number. Based on this technical solution, the width of the color edge can be accurately calculated.
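A sketch of the counting step, assuming first_diff and second_diff are arrays holding the first and second color differences of the target pixel points and that the counted number maps directly to the edge width in pixels; the threshold values are illustrative.
```python
import numpy as np

def fringe_width(first_diff, second_diff, thresh1=0.05, thresh2=0.05):
    """Count positions where both color differences exceed their preset thresholds."""
    mask = (first_diff > thresh1) & (second_diff > thresh2)
    return int(np.count_nonzero(mask))
```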
In a possible implementation manner of the first aspect, determining the color difference of the color edge according to the first color difference and the second color difference of the target pixel points in the image to be quantized includes: determining the maximum value of the first color difference and the maximum value of the second color difference of a plurality of target pixel points in the image to be quantized; and determining the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference. Based on this technical solution, the color difference of the color edge can be accurately calculated.
In a possible implementation manner of the first aspect, determining the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference includes: averaging the maximum value of the first color difference and the maximum value of the second color difference; and determining the color difference of the color edge according to the average value. Based on this technical solution, the color difference of the color edge can be accurately calculated.
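A sketch of this computation, assuming the same first_diff and second_diff arrays as above; taking the maximum of each array over the image to be quantized and averaging the two maxima follows the implementation manner just described.
```python
import numpy as np

def fringe_chroma(first_diff, second_diff):
    """Edge color difference: average of the two per-image maximum color differences."""
    return 0.5 * (np.max(first_diff) + np.max(second_diff))
```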
In a possible implementation manner of the first aspect, the image to be quantized includes N rows and M columns of target pixel points, where N and M are integers greater than or equal to 2. Before the width of the color edge and/or the color difference of the color edge is determined according to the first color difference and the second color difference of the target pixel points in the image to be quantized, the method further includes: performing a weighted average of the first color component, the second color component, and the third color component of each row or each column of target pixel points in the image to be quantized with preset weight coefficients, respectively, to obtain the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component corresponding to each row or each column of target pixel points; and determining the first color difference and the second color difference of each row or each column of target pixel points according to the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component corresponding to each row or each column of target pixel points.
Based on the above technical solution, in this embodiment the three color components of each row or each column of target pixel points are weighted-averaged according to preset weight coefficients, so as to suppress random errors such as noise introduced by the electronic components during measurement.
In a possible implementation manner of the first aspect, determining the first color difference and the second color difference of each row or each column of target pixel points according to the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component corresponding to each row or each column of target pixel points includes: normalizing the difference between the weighted-averaged first color component and the weighted-averaged third color component corresponding to each row or each column of target pixel points to obtain the first color difference corresponding to each row or each column of target pixel points; and normalizing the difference between the weighted-averaged second color component and the weighted-averaged third color component corresponding to each row or each column of target pixel points to obtain the second color difference corresponding to each row or each column of target pixel points.
The image to be quantized in this application may be a RAW-format image captured directly by the image sensor (file extensions such as .raw, .dng, or .arw), or a JPG/JPEG-format image obtained after processing by an image signal processing algorithm (file extensions such as .jpg or .jpeg). The bit depth of the data in a RAW-format image is generally 10, 12, or 14 bits, while the bit depth of the data in a JPG or JPEG image is generally 8 bits. In this embodiment, after the differences are normalized, the first color difference and the second color difference finally calculated for an image of any format fall within the same value range, which facilitates comparison between images of different formats.
In a possible implementation manner of the first aspect, the color of the color edge is purple; the first color component is a red color component, the second color component is a blue color component, and the third color component is a green color component.
Based on the above technical solution, the color of a color edge is generally purple, because the red component and the blue component deviate more than the green component. When the color of the color edge is purple, the green component is selected as the third color component; that is, the third color component is determined according to the mechanism by which the color edge is generated, so that the width and/or color difference obtained from the first color difference and the second color difference can accurately represent the width and/or color difference of the color edge.
In a possible implementation manner of the first aspect, the image to be quantized further includes a highlight region and a low-light region, and a difference between a brightness of a target pixel point in the highlight region and a brightness of a target pixel point in the low-light region is greater than a preset brightness difference.
Based on the above technical solution, since the color edge is generally located at the boundary between the highlight region and the low-light region in the image, the image to be quantized further includes the highlight region and the low-light region, so as to ensure that the width of the color edge in the image to be quantized is complete, thereby improving the accuracy of the calculated width of the color edge.
In a possible implementation manner of the first aspect, the image to be quantized is a partial image in the target image.
Based on the above technical solution, the whole image can be used as the target image, and a partial region of the target image that includes the color edge can be used as the image to be quantized, thereby improving the speed of quantifying the color edge in the image.
In a second aspect, there is provided an image color edge quantization apparatus comprising: the acquisition module is used for acquiring an image to be quantized, wherein the image to be quantized comprises a color edge; the processing module is used for determining the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of a target pixel point in the image to be quantized, wherein the target pixel point corresponds to a first color component, a second color component and a third color component, the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point; the processing module is further configured to determine the intensity of the color edge in the image to be quantized according to the width of the color edge and/or the color difference of the color edge.
Based on the above technical solution, the acquisition module is used to acquire an image to be quantized, where the image to be quantized includes a color edge. A target pixel point in the image to be quantized corresponds to three color components, namely a first color component, a second color component, and a third color component; the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point. The processing module in this embodiment is configured to determine the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of the target pixel points in the image to be quantized, and to quantify the intensity of the color edge in the image to be quantized according to that width and/or color difference. This solves the problem that subjective judgment cannot count or quantify the intensity of color edges in an image, so that the influence of color edges can be evaluated more objectively.
In a possible implementation manner of the second aspect, the processing module is specifically configured to perform weighted average according to the width of the color edge, the color difference of the color edge, a preset first weight value corresponding to the width of the color edge, and a preset second weight value corresponding to the color difference of the color edge, so as to obtain a target value, where the target value represents the intensity of the color edge in the image to be quantized.
Based on the above technical solution, the first weight value and the second weight value may be set by an evaluator. For example, the first weight value may be set equal to the second weight value; if the evaluator pays more attention to the width of the color edge, the first weight value may be set larger than the second weight value; if the evaluator pays more attention to the color difference of the color edge, the second weight value may be set larger than the first weight value. Different evaluators can set the first weight value and the second weight value according to different standards, so that the intensity of the color edge can be quantified according to different criteria.
In a possible implementation manner of the second aspect, the processing module is specifically configured to determine, according to the first color difference and the second color difference of the target pixel points in the image to be quantized, the number of target pixel points whose first color difference is greater than a first preset color difference threshold and whose second color difference is greater than a second preset color difference threshold; and is specifically configured to determine the width of the color edge according to that number. Based on this technical solution, the width of the color edge can be accurately calculated.
In a possible implementation manner of the second aspect, the processing module is specifically configured to determine the maximum value of the first color difference and the maximum value of the second color difference of a plurality of target pixel points in the image to be quantized; and is specifically configured to determine the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference. Based on this technical solution, the color difference of the color edge can be accurately calculated.
In a possible implementation manner of the second aspect, the processing module is specifically configured to average the maximum value of the first color difference and the maximum value of the second color difference; and is specifically configured to determine a color difference of the color edge according to the average value. Based on the technical scheme, the color difference of the color edge can be accurately calculated.
In a possible implementation manner of the second aspect, the image to be quantized includes N rows and M columns of target pixel points, where N and M are integers greater than or equal to 2; the processing module is further configured to, before determining the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized, respectively performing weighted average on the first color component, the second color component and the third color component of each row or each column of the target pixel point in the image to be quantized and a preset weight coefficient to obtain a weighted-average first color component, a weighted-average second color component and a weighted-average third color component of each row or each column of the target pixel point; and the method is also used for determining the first color difference and the second color difference of each row or each column of target pixel points according to the first color component after weighted averaging, the second color component after weighted averaging and the third color component after weighted averaging, which correspond to each row or each column of target pixel points.
Based on the above technical solution, in this embodiment the three color components of each row or each column of target pixel points are weighted-averaged according to preset weight coefficients, so as to suppress random errors such as noise introduced by the electronic components during measurement.
In a possible implementation manner of the second aspect, the processing module is specifically configured to normalize the difference between the weighted-averaged first color component and the weighted-averaged third color component corresponding to each row or each column of target pixel points to obtain the first color difference corresponding to each row or each column of target pixel points; and is specifically configured to normalize the difference between the weighted-averaged second color component and the weighted-averaged third color component corresponding to each row or each column of target pixel points to obtain the second color difference corresponding to each row or each column of target pixel points.
The image to be quantized in this application may be a RAW-format image captured directly by the image sensor (file extensions such as .raw, .dng, or .arw), or a JPG/JPEG-format image obtained after processing by an image signal processing algorithm (file extensions such as .jpg or .jpeg). The bit depth of the data in a RAW-format image is generally 10, 12, or 14 bits, while the bit depth of the data in a JPG or JPEG image is generally 8 bits. In this embodiment, after the differences are normalized, the first color difference and the second color difference finally calculated for an image of any format fall within the same value range, which facilitates comparison between images of different formats.
In a possible implementation manner of the second aspect, the color of the color edge is purple; the first color component is a red color component, the second color component is a blue color component, and the third color component is a green color component.
Based on the above technical solution, the color of a color edge is generally purple, because the red component and the blue component deviate more than the green component. When the color of the color edge is purple, the green component is selected as the third color component; that is, the third color component is determined according to the mechanism by which the color edge is generated, so that the width and/or color difference obtained from the first color difference and the second color difference can accurately represent the width and/or color difference of the color edge.
In a possible implementation manner of the second aspect, the image to be quantized further includes a highlight region and a low-light region, and a difference between brightness of the target pixel point in the highlight region and brightness of the target pixel point in the low-light region is greater than a preset brightness difference.
Based on the above technical solution, since the color edge is generally located at the boundary between the highlight region and the low-light region in the image, the image to be quantized further includes the highlight region and the low-light region, so as to ensure that the width of the color edge in the image to be quantized is complete, thereby improving the accuracy of the calculated width of the color edge.
In a possible implementation manner of the second aspect, the image to be quantized is a partial image in the target image.
Based on the above technical solution, the whole image can be used as the target image, and a partial region of the target image that includes the color edge can be used as the image to be quantized, thereby improving the speed of quantifying the color edge in the image.
In a third aspect, there is provided an image color edge quantization apparatus comprising a memory and a processor, the memory for storing instructions that, when executed by the processor, cause the image color edge quantization apparatus to perform the image color edge quantization method of the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, a computer readable storage medium is provided, the computer readable storage medium storing a computer program comprising program instructions which, when executed, implement the image color border quantization method of the first aspect or any one of the possible implementations of the first aspect.
In a fifth aspect, there is provided a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the image color edge quantization method of the first aspect or any of the possible implementations of the first aspect.
The implementations provided in the above aspects of the present application may be further combined to provide additional implementations.
Drawings
FIG. 1 is an exemplary flow chart of a method for image color edge quantization of a first aspect provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of an image color border provided in an embodiment of the present application;
FIG. 3 is a specific exemplary flow chart of step 130 of FIG. 1 provided in an embodiment of the present application;
FIG. 4 is a graph showing the three weighted-averaged color components in the same coordinate system provided by an embodiment of the present application;
FIG. 5 is a graph showing a first color difference and a second color difference in the same coordinate system provided by an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an image color edge quantization apparatus according to the second aspect provided in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an image color edge quantization apparatus according to the third aspect provided in an embodiment of the present application.
Detailed Description
The technical solutions in the present application will be described below with reference to the accompanying drawings.
In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
The principle of capturing an image is as follows: an image sensor, for example a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD), converts the captured light signal into a raw data image in digital form, i.e., a RAW-format image (file extensions such as .raw, .dng, or .arw); then the RAW-format image output by the front-end image sensor is processed by an image signal processing algorithm to obtain the commonly used JPG or JPEG format image (file extensions such as .jpg or .jpeg).
The color edge is a false color phenomenon generated at the juncture of a highlight region and a low-light region of an image due to hardware reasons such as an image sensor or algorithm reasons such as an image signal processing algorithm when the image acquisition device shoots. Color fringing generally occurs at object boundaries, and the occurrence of color fringing causes degradation of image quality, severely affecting visual experience. However, the related color edge detection method is often based on subjective judgment, and cannot count and quantify the color edge of the image.
In view of this, an embodiment of the present application provides an image color edge quantization method, in which an image to be quantized is obtained, where the image to be quantized includes a color edge. A target pixel point in the image to be quantized corresponds to three color components, namely a first color component, a second color component, and a third color component; the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point. In this embodiment, the width of the color edge and/or the color difference of the color edge is determined according to the first color difference and the second color difference of the target pixel points in the image to be quantized, and the intensity of the color edge in the image to be quantized is quantified according to that width and/or color difference. This solves the problem that subjective judgment cannot count or quantify the intensity of color edges in an image, so that the influence of color edges can be evaluated more objectively.
An image color edge quantization method according to an embodiment of the present application will be described in detail with reference to the accompanying drawings, and fig. 1 is an exemplary flowchart of the image color edge quantization method.
Step 110: acquire an image to be quantized, where the image to be quantized includes a color edge.
The image to be quantized in this application may be a raw data image, i.e., a RAW-format image, or a commonly used JPG or JPEG format image obtained after processing by an image signal processing algorithm.
The implementation of step 110 in this application is not limited; any manner in which an image to be quantized can be acquired may serve as the implementation of step 110. In one possible implementation, the image to be quantized may be obtained by reading or receiving it; in another possible implementation, the image to be quantized may be obtained by actively capturing it or by other active acquisition.
In the image shown in fig. 2, the area indicated by the dotted line is the area where the color edge is located, and in fig. 2, the area where the color edge is located is in a square ring shape. It will be appreciated that fig. 2 is only an illustration of the region where the color border is located, and that in practice the shape of the color border will vary depending on the shape of the object boundary.
The whole image shown in FIG. 2 may be taken as the image to be quantized; alternatively, the whole image shown in FIG. 2 may be taken as the target image, and a partial region of the target image that includes the color edge may be taken as the image to be quantized, for example a region of interest (Region of Interest, ROI) shown by a solid-line box in FIG. 2, such as ROI1 or ROI2, where w is the width of the ROI region and h is the length of the ROI region.
In one implementation, the image to be quantized further includes a highlight region and a low-light region, and the difference between the brightness of a target pixel point in the highlight region and the brightness of a target pixel point in the low-light region is greater than a preset brightness difference, where the preset brightness difference can be set by the evaluator according to actual conditions.
As shown in FIG. 2, the image to be quantized is shown as the solid-line box ROI1 or ROI2. It can be seen that the solid-line box includes not only the color edge but also a part of the highlight region adjacent to the color edge and a part of the low-light region adjacent to the color edge, so as to ensure that the width (i.e., d) of the color edge included in the image to be quantized is complete. If the solid-line box did not include the highlight region and the low-light region, the width of the color edge might be incomplete, which would affect the accuracy of the calculated width of the color edge and thus the accuracy of the quantized intensity of the color edge.
It should be noted that FIG. 2 is only intended to illustrate the image to be quantized and does not limit the content of the picture in the image to be quantized; in practical applications, the image to be quantized may include sky, trees, and so on in a natural environment, and the image to be quantized may also lack a highlight region and/or a low-light region.
In the present application, the ROI area may be determined in the target image as an image to be quantized in the following manner.
Mode one: after the target image is obtained, an evaluator visually identifies the region where the color edge is located in the target image and frames an ROI region in the target image with a selection box to serve as the image to be quantized. When framing the ROI as the image to be quantized, the evaluator may also include part of the highlight region and part of the low-light region located on the two sides of the color edge, so as to ensure that the width of the color edge included in the image to be quantized is complete.
Mode two: a trained neural network is used to frame the ROI region in the target image as the image to be quantized.
Mode three: the region where the color edge is located may be pre-detected. After the region where the color edge is located in the target image is determined by pre-detection, an ROI region containing that region is framed in the target image as the image to be quantized. For example, the brightness of the pixels on the two sides of the region where the color edge is located can be obtained, the difference between the brightness of the pixels on the two sides can be compared with the preset brightness difference, the pixels on the two sides whose brightness difference is greater than the preset brightness difference can be determined to be pixels in the highlight region or the low-light region, and the ROI region can then be framed in the target image as the image to be quantized according to the found highlight-region and low-light-region pixels.
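A rough sketch of the pre-detection idea in mode three is given below, assuming a normalized grayscale luminance image; the window size, the brightness-difference threshold, and the column-wise scan are illustrative assumptions rather than details fixed by the text.
```python
import numpy as np

def candidate_edge_columns(luma, brightness_diff=0.4, window=5):
    """Flag columns whose left and right neighborhoods differ strongly in brightness,
    i.e. likely highlight/low-light boundaries where color fringing tends to appear.

    luma: (H, W) array of luminance values in [0, 1].
    """
    cols = []
    for x in range(window, luma.shape[1] - window):
        left = luma[:, x - window:x].mean()
        right = luma[:, x:x + window].mean()
        if abs(left - right) > brightness_diff:
            cols.append(x)
    return cols
```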
Step 120: acquire the first color component, the second color component, and the third color component of the target pixel points in the image to be quantized.
In this embodiment, the ROI region in the target image is used as the image to be quantized, and the target pixel points are the pixel points in the ROI region, i.e., the pixel points in the image to be quantized. Each single color channel of a target pixel point in the RGB color space of the image to be quantized corresponds to a pixel gray value. For example, the pixel gray values of a certain target pixel point are: R (red) 80, G (green) 90, B (blue) 130; the different pixel gray values represent the brightness of the corresponding colors. The three color components in this application refer to the pixel gray values corresponding to the three color channels in the RGB color space of a target pixel point in the image to be quantized, that is, the brightness values corresponding to the three color channels.
Step 130: determine the first color difference and the second color difference of the target pixel points according to the first color component, the second color component, and the third color component of the target pixel points.
The target pixel points within the image to be quantized may be a single row, multiple rows, a single column, or multiple rows and columns, as required in practice. The greater the number of target pixel points in the image to be quantized, the less the calculation result is affected by random errors.
This application illustrates the processing of target pixel points arranged in multiple rows and columns. FIG. 3 further illustrates step 130 in FIG. 1, and step 130 is described in detail below with reference to FIG. 3:
Step 1301: perform a weighted average of the first color component, the second color component, and the third color component of each row or each column of target pixel points in the image to be quantized with the preset weight coefficients, respectively, to obtain the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component of each row or each column of target pixel points.
For example, taking ROI2 in FIG. 2 as the image to be quantized, when the first color component, the second color component, and the third color component of each row of target pixel points in the image to be quantized are weighted-averaged with the preset weight coefficients, the weight coefficients may be set for the row of target pixel points according to the color depth of the color edge: where the color of the color edge is darker, the weight coefficient of the target pixel point may be set larger, and where the color edge is lighter, the weight coefficient of the target pixel point may be set smaller.
As can be seen from FIG. 2, in the row direction (i.e., the h direction), the color of the color edge is darker at the middle target pixel positions and lighter at the target pixel positions on the two sides, so a larger weight coefficient may be set for the middle target pixel points and smaller weight coefficients for the target pixel points on the two sides. For example, if a certain row of the image to be quantized includes 10 target pixel points, the weight coefficients corresponding to the target pixel points from left to right may be: 0.0, 0.15, 0.2, 0.3, 0.05, 0.
For the target pixel points in the row (i.e., the h direction), the weight coefficient corresponding to a certain target pixel point may also apply to the other target pixel points in the column where that target pixel point is located; in other words, one weight coefficient may be set for a whole column of target pixel points, which is simpler than setting a weight coefficient for every individual target pixel point.
For example, the image to be quantized includes 10 rows and 10 columns of target pixel points, and the weight coefficients corresponding to the first row of target pixel points from left to right are: 0.0, 0.15, 0.2, 0.3, 0.05, 0. In this case, the weight coefficients of the first three columns of target pixel points may all be 0, the weight coefficients of the fourth column of target pixel points may all be 0.15, the weight coefficients of the fifth column of target pixel points may all be 0.2, and so on.
In this embodiment, by setting weight coefficients for each row of target pixel points in the image to be quantized and weighted-averaging the three color components of each row of target pixel points according to those weight coefficients, random errors, such as noise introduced by the electronic components during measurement, can be suppressed.
Similarly, taking ROI1 in FIG. 2 as the image to be quantized, when the first color component, the second color component, and the third color component of each column of target pixel points in the image to be quantized are weighted-averaged with the preset weight coefficients, the weight coefficients may be set for the column of target pixel points according to the color depth of the color edge: where the color of the color edge is darker, the weight coefficient of the target pixel point may be set larger, and where the color edge is lighter, the weight coefficient of the target pixel point may be set smaller.
As can be seen from FIG. 2, in the column direction (i.e., the h direction), the color of the color edge is darker at the middle target pixel positions and lighter at the target pixel positions on the two sides, so a larger weight coefficient may be set for the middle target pixel points and smaller weight coefficients for the target pixel points on the two sides. For example, if a certain column of the image to be quantized includes 10 target pixel points, the weight coefficients corresponding to the target pixel points from bottom to top may be: 0.0, 0.15, 0.2, 0.3, 0.05, 0.
For the target pixel points in the column (i.e., the h direction), the weight coefficient corresponding to a certain target pixel point may also apply to the other target pixel points in the row where that target pixel point is located; in other words, one weight coefficient may be set for a whole row of target pixel points, which is simpler than setting a weight coefficient for every individual target pixel point.
For example, the image to be quantized includes 10 rows and 10 columns of target pixel points, and the weight coefficients corresponding to the first column of target pixel points from bottom to top are: 0.0, 0.15, 0.2, 0.3, 0.05, 0. In this case, the weight coefficients of the first three rows of target pixel points may all be 0, the weight coefficients of the fourth row of target pixel points may all be 0.15, the weight coefficients of the fifth row of target pixel points may all be 0.2, and so on.
In this embodiment, by setting weight coefficients for each column of target pixel points in the image to be quantized and weighted-averaging the three color components of each column of target pixel points according to those weight coefficients, random errors, such as noise introduced by the electronic components during measurement, can be suppressed.
In one implementation, the weighting coefficients of the target pixel points corresponding to the three color components may be the same or different.
Taking ROI1 in FIG. 2 as the image to be quantized and assuming that the weight coefficients of the target pixel points are the same for the three color components, then for a column of the image to be quantized that includes 10 target pixel points, the weight coefficients of the target pixel points from bottom to top for all three color components may be: 0.0, 0.15, 0.2, 0.3, 0.05, 0.
Assuming that the weight coefficients of the target pixel points corresponding to the three color components are different, taking an example that a certain column in the image to be quantized includes 10 target pixel points, the weight coefficient of each target pixel point from bottom to top corresponding to the first color component may be: 0.0, 0.05, 0.15, 0.2, 0.25, 0.3, 0.05, 0; the weight coefficient of each target pixel point from bottom to top corresponding to the second color component may be: 0.0, 0.1, 0.15, 0.2, 0.3, 0.05, 0; the weight coefficient of each target pixel point from bottom to top corresponding to the third color component may be: 0.0, 0.15, 0.2, 0.3, 0.05, 0.
Accordingly, the arrangement in the row direction may refer to the arrangement in the column direction, and will not be described herein.
Step 1302: determine the first color difference and the second color difference of each row or each column of target pixel points according to the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component of each row or each column of target pixel points.
Taking ROI1 in FIG. 2 as the image to be quantized, the details are as follows:
in this embodiment, the first color difference is a color difference of the first color component after weighted averaging corresponding to each column of the target pixel points relative to the third color component after weighted averaging corresponding to each column of the target pixel points, and the second color difference is a color difference of the second color component after weighted averaging corresponding to each column of the target pixel points relative to the third color component after weighted averaging corresponding to each column of the target pixel points.
In one implementation, the first color difference and the second color difference may be values obtained after normalization processing.
For example: the difference between the first color component after weighted averaging corresponding to each row or each column of target pixel points and the third color component after weighted averaging corresponding to each row or each column of target pixel points can be normalized, and the obtained value is used as a first color difference; and normalizing the difference value between the second color component after weighted averaging and the third color component after weighted averaging corresponding to each row or each column of target pixel points, and taking the obtained value as a second color difference.
Alternatively, the first color component after weighted averaging, the second color component after weighted averaging, and the third color component after weighted averaging corresponding to each row or each column of the target pixel point may be normalized, and then the difference between the first color component after normalized processing and the third color component after normalized processing may be used as the first color difference, and the difference between the second color component after normalized processing and the third color component after normalized processing may be used as the second color difference.
In this embodiment, the first color difference and the second color difference may be values obtained after normalization, so that the first color difference and the second color difference finally calculated for an image of any format fall within the same value range, which facilitates comparison between images of different formats. For example, the image to be quantized in this application may be a RAW-format image, or a JPG or JPEG format image obtained after processing by an image signal processing algorithm; the bit depth of the data in a RAW-format image is generally 10, 12, or 14 bits, while the bit depth of the data in a JPG or JPEG image is generally 8 bits, and after normalization, the first color difference and the second color difference calculated from images with different data bit depths can be confined to the same value range.
Color edges of different colors have different generation mechanisms. A color edge is also called a purple fringe because its color is generally purple; the purple color arises because the red component and the blue component deviate more than the green component. Purple fringes can be roughly divided into bluish and reddish fringes: in a bluish fringe, the blue component is greater than the green component (the red component is not necessarily greater than the green component); in a reddish fringe, both the blue component and the red component are greater than the green component. The color of a color edge may also be cyan or yellow: a cyan color edge arises because the blue component and the green component deviate more than the red component, and a yellow color edge arises because the red component and the green component deviate more than the blue component.
Since color edges of different colors are generated by different mechanisms, in this embodiment the third color component is determined according to the generation mechanism of the color edge, so that the width and/or color difference obtained from the first color difference and the second color difference can accurately represent the width and/or color difference of the color edge.
For example, if the color of the color edge in the image to be quantized is purple, bluish purple, or reddish purple, the green component may be selected as the third color component; the first color difference of the target pixel point is then the difference between the red component and the green component, and the second color difference is the difference between the blue component and the green component; or the first color difference is the difference between the blue component and the green component and the second color difference is the difference between the red component and the green component.
For example, if the color of the color edge in the image to be quantized is cyan, selecting the red component as a third color component, wherein the first color difference of the target pixel point is the difference between the green component and the red component, and the second color difference is the difference between the blue component and the red component; or the first color difference is the difference between the blue component and the red component, and the second color difference is the difference between the green component and the red component.
For example, if the color of the color edge in the image to be quantized is yellow, selecting a blue component as a third color component, wherein the first color difference of the target pixel point is the difference between the red component and the blue component, and the second color difference is the difference between the green component and the blue component; or the first color difference is the difference between the green component and the blue component, and the second color difference is the difference between the red component and the blue component.
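The three examples above amount to choosing the least-deviating channel as the reference (third) color component. The mapping below summarizes them as a small illustrative helper; the dictionary and function names are assumptions for illustration, not terminology from the text.
```python
import numpy as np

# Reference (third) channel chosen according to the fringe color, per the examples above.
FRINGE_REFERENCE_CHANNEL = {
    "purple": "G",  # red and blue deviate more than green
    "cyan": "R",    # green and blue deviate more than red
    "yellow": "B",  # red and green deviate more than blue
}

def split_components(rgb, fringe_color):
    """Return (first, second, third) color component planes of an (H, W, 3) RGB array."""
    third = FRINGE_REFERENCE_CHANNEL[fringe_color]
    first, second = [c for c in "RGB" if c != third]
    idx = {"R": 0, "G": 1, "B": 2}
    return rgb[..., idx[first]], rgb[..., idx[second]], rgb[..., idx[third]]

# Example: for a purple fringe this yields the red, blue, and green planes, respectively.
r_plane, b_plane, g_plane = split_components(np.zeros((4, 4, 3)), "purple")
```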
The calculation of the first color difference and the second color difference is described in detail below, taking the case where the color of the color edge is purple and the third color component is the green component.
Assuming that ROI1 in fig. 2 is taken as an image to be quantized, the image to be quantized includes N rows and M columns of target pixels, where N and M are integers greater than or equal to 2.
(1) The first color component, the second color component, and the third color component of the target pixel points in the image to be quantized are acquired. The data matrix of the red (R) color component of the N rows and M columns of target pixel points is:
$$R=\begin{bmatrix} R(x_1,y_1) & R(x_1,y_2) & \cdots & R(x_1,y_M) \\ R(x_2,y_1) & R(x_2,y_2) & \cdots & R(x_2,y_M) \\ \vdots & \vdots & \ddots & \vdots \\ R(x_N,y_1) & R(x_N,y_2) & \cdots & R(x_N,y_M) \end{bmatrix}$$
The data matrix of the blue (B) color component is:
$$B=\begin{bmatrix} B(x_1,y_1) & \cdots & B(x_1,y_M) \\ \vdots & \ddots & \vdots \\ B(x_N,y_1) & \cdots & B(x_N,y_M) \end{bmatrix}$$
The data matrix of the green (G) color component is:
$$G=\begin{bmatrix} G(x_1,y_1) & \cdots & G(x_1,y_M) \\ \vdots & \ddots & \vdots \\ G(x_N,y_1) & \cdots & G(x_N,y_M) \end{bmatrix}$$
where $(x_1, y_1)$ denotes the coordinates of the target pixel point in the first row and first column, and $(x_N, y_M)$ denotes the coordinates of the target pixel point in the Nth row and Mth column.
(2) The three color components of each column of target pixel points in the image to be quantized are weighted-averaged respectively. Taking the weighted averaging of the red components of the first column of target pixel points as an example, and assuming that the preset weight coefficients of the target pixel points are the same within each row and are also the same for the three color components, the weighted-averaged red component of the first column of target pixel points can be calculated by the following formula 1-1:
$$\bar{R}(y_1)=\mu_1 R(x_1,y_1)+\mu_2 R(x_2,y_1)+\cdots+\mu_N R(x_N,y_1) \qquad (1\text{-}1)$$
where $\mu_1, \mu_2, \ldots, \mu_N$ are the weight coefficients of the target pixel points in rows 1 to N, respectively, and $\mu_1+\mu_2+\cdots+\mu_N=1$.
Thus, the data matrix of the weighted-averaged red component is:
$$\bar{R}=\begin{bmatrix} \bar{R}(y_1) & \bar{R}(y_2) & \cdots & \bar{R}(y_M) \end{bmatrix}$$
Similarly, the data matrix of the weighted-averaged blue component is:
$$\bar{B}=\begin{bmatrix} \bar{B}(y_1) & \bar{B}(y_2) & \cdots & \bar{B}(y_M) \end{bmatrix}$$
and the data matrix of the weighted-averaged green component is:
$$\bar{G}=\begin{bmatrix} \bar{G}(y_1) & \bar{G}(y_2) & \cdots & \bar{G}(y_M) \end{bmatrix}$$
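A sketch of formula 1-1 applied to all columns at once is shown below, assuming the per-row weights mu_1..mu_N sum to 1 and are shared by the R, B, and G channels; the function name and array shapes are illustrative assumptions.
```python
import numpy as np

def weighted_column_average(channel, mu):
    """channel: (N, M) array of one color component; mu: length-N weights summing to 1.

    Returns a length-M vector with one weighted-averaged value per column,
    i.e. the row vector written above for R-bar, B-bar, or G-bar.
    """
    mu = np.asarray(mu, dtype=float)
    assert np.isclose(mu.sum(), 1.0)
    return mu @ channel

# r_bar = weighted_column_average(R, mu); likewise for the blue and green channels.
```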
as shown in fig. 4, graphs of the first color component after weighted averaging, the second color component after weighted averaging, and the third color component after weighted averaging obtained in the actual test are shown in the same coordinate system, where the horizontal axis of the coordinate system is the position of the target pixel point column, and the vertical axis is the values of the three color components after weighted averaging. In fig. 4, curve 1 is a curve formed by a weighted average of red components, curve 2 is a curve formed by a weighted average of blue components, and curve 3 is a curve formed by a weighted average of green components.
It should be noted that, in fig. 4, the three color components after weighted average are limited to the interval of 0 to 100 for illustration only, and in practical application, the three color components after weighted average may be directly displayed in a graph, which is not limited in this application.
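As a non-authoritative illustration of step (2) above, the following Python/NumPy sketch shows one possible way to compute the weighted column averages; the ROI shape, the placeholder pixel values, and the uniform row weights are assumptions made for this example only.

```python
import numpy as np

# Hypothetical ROI of N rows x M columns with per-pixel R, B, G components
# (placeholder values, not measured data).
N, M = 50, 200
rng = np.random.default_rng(0)
R = rng.uniform(0, 255, (N, M))
B = rng.uniform(0, 255, (N, M))
G = rng.uniform(0, 255, (N, M))

# Row weights mu_1 ... mu_N; assumed uniform here, and they must sum to 1.
mu = np.full(N, 1.0 / N)

# Weighted average of each column, as in formula (1-1):
# R_bar(y_j) = mu_1*R(x_1, y_j) + ... + mu_N*R(x_N, y_j)
R_bar = mu @ R   # shape (M,): weighted-averaged red component per column
B_bar = mu @ B   # weighted-averaged blue component per column
G_bar = mu @ G   # weighted-averaged green component per column
```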
(3) And carrying out normalization processing on the difference value between the weighted average first color component and the weighted average third color component of each column of target pixel points to obtain a first color difference corresponding to each column of target pixel points. And carrying out normalization processing on the difference value between the second color component after weighted averaging and the third color component after weighted averaging of each column of target pixel points to obtain a second color difference corresponding to each column of target pixel points.
Taking the normalization of the difference between the weighted-averaged first color component and the weighted-averaged third color component of the first column of target pixel points as an example, the first color difference L_R(y_1) of the first column of target pixel points is calculated by the following formula (1-2), in which the difference is normalized by the weighted-averaged third (green) color component:

L_R(y_1) = (R̄(y_1) − Ḡ(y_1)) / Ḡ(y_1)   (1-2)
Thus, the data matrix L_R of the first color differences corresponding to the columns of target pixel points is as follows:

L_R = [ L_R(y_1)  L_R(y_2)  …  L_R(y_M) ]

Similarly, the data matrix L_B of the second color differences corresponding to the columns of target pixel points is as follows:

L_B = [ L_B(y_1)  L_B(y_2)  …  L_B(y_M) ]
as shown in fig. 5, a graph showing the first color difference and the second color difference in the same coordinate system in the actual test is shown, wherein the horizontal axis of the coordinate system is the position of the pixel column, and the vertical axis is the normalized value of the difference between the other two color components and the green component. In fig. 5, curve 4 is a curve formed by a first color difference corresponding to each column of target pixel points, and curve 5 is a curve formed by a second color difference corresponding to each column of target pixel points.
It should be noted that, in fig. 5, the values of the first color difference and the second color difference are limited to be within the interval of 0 to 10, which is only convenient for illustration, and in practical application, the first color difference and the second color difference may also be directly displayed in the graph, which is not limited in this application.
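The normalization of step (3) above can be sketched as follows; dividing by the weighted-averaged green component is only one plausible reading of the normalization described here, and is an assumption of this example rather than a statement of the exact formula used.

```python
import numpy as np

def color_differences(R_bar: np.ndarray, B_bar: np.ndarray, G_bar: np.ndarray,
                      eps: float = 1e-6):
    """Per-column first (red vs. green) and second (blue vs. green) color
    differences; normalizing by the green component is an assumption."""
    L_R = (R_bar - G_bar) / (G_bar + eps)   # first color difference per column
    L_B = (B_bar - G_bar) / (G_bar + eps)   # second color difference per column
    return L_R, L_B
```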
Step 140: and determining the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized, wherein the target pixel point corresponds to the first color component, the second color component and the third color component, the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point.
Illustratively, step 140 may be implemented in several ways:
according to the first mode, the width of the color edge is determined according to the first color difference and the second color difference of the target pixel point in the image to be quantized.
Specifically, the number of target pixel points whose first color difference is greater than a first preset color difference threshold and whose second color difference is greater than a second preset color difference threshold is determined according to the first color difference and the second color difference of the target pixel points in the image to be quantized, and the width of the color edge is determined according to this number of target pixel points.
Continuing with the description of ROI1 of Fig. 2 as the image to be quantized, assume that the first preset color difference threshold is α_1. The number of target pixel points whose first color difference is greater than the first preset color difference threshold, namely the width W_R of the red color difference region, can be calculated by the following formula (1-3):

W_R = Σ_t sign(L_R(t) − α_1)   (1-3)

where t takes each value in the data matrix L_R of the first color differences corresponding to the columns of target pixel points, and the sign() function is defined as follows: if the calculation result in parentheses is positive, sign() returns 1; if the result is negative, sign() returns 0.
Similarly, assume that the second preset color difference threshold is α_2. The number of target pixel points whose second color difference is greater than the second preset color difference threshold, namely the width W_B of the blue color difference region, can be calculated by the following formula (1-4):

W_B = Σ_t sign(L_B(t) − α_2)   (1-4)

where t takes each value in the data matrix L_B of the second color differences corresponding to the columns of target pixel points, and sign() is defined as above.
It should be noted that the values of the first preset color difference threshold and the second preset color difference threshold may be the same or different, and they are set by the evaluator. When setting them, the evaluator may, based on experimental results, select the smallest first color difference at which a person can subjectively notice a color edge as the first preset color difference threshold, and likewise select the smallest such second color difference as the second preset color difference threshold. For example, the first preset color difference threshold or the second preset color difference threshold may be 3%, 5%, or similar values.
The overlapping region of the red color difference region and the blue color difference region is the region where the color edge is located, and the width of the color edge is the width of this overlapping region. As shown in Fig. 5, the dotted-line region marked W_B is the blue color difference region obtained in the actual test, and the region marked W_R is the red color difference region obtained in the actual test; the width of their overlapping region is the width of the color edge. As can be seen from Fig. 5, the width W_purple of the color edge is the same as the width W_R of the red color difference region.
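A minimal sketch of the width calculation in mode one is given below; the 5% thresholds are example values of the kind an evaluator might choose, and the function name is hypothetical.

```python
import numpy as np

def color_edge_width(L_R: np.ndarray, L_B: np.ndarray,
                     alpha1: float = 0.05, alpha2: float = 0.05):
    """Widths of the red and blue color-difference regions (formulas (1-3)
    and (1-4)) and the width of their overlap, taken as the color-edge width."""
    red_region = L_R > alpha1      # columns where the first color difference exceeds alpha1
    blue_region = L_B > alpha2     # columns where the second color difference exceeds alpha2
    W_R = int(red_region.sum())    # width of the red color-difference region
    W_B = int(blue_region.sum())   # width of the blue color-difference region
    W_purple = int((red_region & blue_region).sum())  # overlap width = color-edge width
    return W_R, W_B, W_purple
```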
And secondly, determining the color difference of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized.
Specifically, the maximum value of the first color difference and the maximum value of the second color difference of the plurality of target pixel points in the image to be quantized are determined, and the color difference of the color edge is determined according to these two maxima.

Determining the maximum value of the first color difference of the plurality of target pixel points in the image to be quantized means finding the maximum value ω_R in the data matrix L_R of the first color differences; likewise, the maximum value ω_B of the second color difference is found in the data matrix L_B of the second color differences. The color difference of the color edge can then be determined from the maximum value of the first color difference and the maximum value of the second color difference.
In one possible implementation, determining the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference includes: averaging the maximum value of the first color difference and the maximum value of the second color difference, and determining the color difference of the color edge according to the average value.
Assuming that the color difference of the color edge is denoted ω_purple, it can be calculated by the following formula (1-5):

ω_purple = (ω_R + ω_B) / 2   (1-5)
In another possible implementation, determining the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference includes: performing a weighted average of the maximum value of the first color difference and the maximum value of the second color difference with preset weight coefficients respectively corresponding to the first color difference and the second color difference, to obtain the color difference of the color edge. The weight coefficient corresponding to the first color difference may differ from the weight coefficient corresponding to the second color difference, and their values may be set by the evaluator, for example according to the color cast of the color edge: if the color of the color edge is biased toward red, the weight coefficient of the first color difference is set larger than that of the second color difference; if the color of the color edge is biased toward blue, the weight coefficient of the first color difference is set smaller than that of the second color difference.
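The following sketch covers both implementations of mode two; with equal weights it reduces to the simple average of formula (1-5), and unequal weights correspond to the weighted variant. The default weight values are placeholders.

```python
import numpy as np

def color_edge_difference(L_R: np.ndarray, L_B: np.ndarray,
                          w_R: float = 0.5, w_B: float = 0.5) -> float:
    """Color difference of the color edge from the maxima of the first and
    second color differences (weighted average; equal weights give (1-5))."""
    omega_R = float(L_R.max())    # maximum of the first color difference
    omega_B = float(L_B.max())    # maximum of the second color difference
    return (w_R * omega_R + w_B * omega_B) / (w_R + w_B)
```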
And thirdly, determining the color difference and the width of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized.
The third mode is a combination of the first mode and the second mode, the width of the color edge can be determined by using the implementation mode in the first mode, and the color difference of the color edge is determined by using the implementation mode in the second mode, which is not described in detail in this embodiment.
Step 150: and determining the intensity degree of the color edge in the image to be quantized according to the width of the color edge and/or the color difference of the color edge.
This step corresponds to the previous step 140, and there are three implementations:
the first mode corresponds to the first mode in step 140, and the intensity of the color edge in the image to be quantized is determined according to the width of the color edge.
In this scheme, the intensity of the color edge in the image is quantified directly according to the calculated width of the color edge, and the larger the width of the color edge is, the stronger the intensity of the color edge is.
In one possible implementation, corresponding thresholds can be set to measure the intensity of the color edge. For example, the intensity of the color edge is divided into three levels: high, medium, and low, and a width threshold interval of the color edge is set for each level; when the width of the color edge falls within a given interval, the intensity of the color edge is determined to be high, medium, or low accordingly.
In another possible implementation, a full score of 100 is assigned to the strongest color edge, and the maximum color edge width corresponding to the full score of 100 is set; the intensity of the color edge is then scored linearly according to the actually calculated width of the color edge and this maximum width, so that the score reflects the intensity of the color edge and the influence of the color edge can be objectively evaluated from the result.
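A minimal sketch of the linear scoring implementation just described is shown below; the maximum color-edge width corresponding to the full score of 100 is an assumed example value.

```python
def edge_width_score(W_purple: float, W_max: float = 40.0) -> float:
    """Linear score (0-100) for color-edge intensity based on width.
    W_max, the width mapped to the full score of 100, is an example value."""
    return min(W_purple / W_max, 1.0) * 100.0
```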
The above two implementations are merely illustrative, and are not limiting to the present application, and any implementation of determining the intensity of the color edge in the image to be quantized according to the width of the color edge is within the scope of protection of the present application.
In a possible implementation manner, the image to be quantized is a part of the image in the target image, at this time, multiple images to be quantized of the target image may be obtained, the width of the color edge in each image to be quantized is calculated, and the intensity of the color edge of each image to be quantized is determined according to the width of the color edge in each image to be quantized. And the intensity degree of the color edges in the target image is evaluated by combining the intensity degrees of the color edges of the plurality of images to be quantized, so that the accuracy of quantizing the color edges is further improved.
The second mode corresponds to the second mode in step 140: the intensity of the color edge in the image to be quantized is determined according to the color difference of the color edge.
In the scheme, the intensity degree of the color edge in the image is quantified directly according to the calculated color difference of the color edge, and the larger the color difference of the color edge is, the stronger the intensity degree of the color edge is.
In one possible implementation, corresponding thresholds can be set to measure the intensity of the color edge. For example, the intensity of the color edge is divided into three levels: high, medium, and low, and a color difference threshold interval is set for each level; when the color difference of the color edge falls within a given interval, the intensity of the color edge is determined to be high, medium, or low accordingly.
In another possible implementation, a full score of 100 is assigned to the strongest color edge, and the maximum color-edge color difference corresponding to the full score of 100 is set; the intensity of the color edge is then scored linearly according to the actually calculated color difference of the color edge and this maximum color difference, so that the score reflects the intensity of the color edge and the influence of the color edge can be objectively evaluated from the result.
The above two implementations are merely illustrative, and are not limiting to the present application, and any implementation of determining the intensity of the color edge in the image to be quantized according to the color difference of the color edge is within the scope of protection of the present application.
In a possible implementation manner, the image to be quantized is a part of the image in the target image, at this time, multiple images to be quantized of the target image may be obtained, chromatic aberration of the color edge in each image to be quantized is calculated, and intensity of the color edge of each image to be quantized is determined according to the chromatic aberration of the color edge in each image to be quantized. And the intensity degree of the color edges in the target image is evaluated by combining the intensity degrees of the color edges of the plurality of images to be quantized, so that the accuracy of quantizing the color edges is further improved.
And thirdly, determining the intensity degree of the color edge in the image to be quantized according to the color difference and the width of the color edge.
In one possible implementation, determining the intensity of the color edge in the image to be quantized according to the width of the color edge and the color difference of the color edge includes: performing a weighted average according to the width of the color edge, the color difference of the color edge, a preset first weight value corresponding to the width of the color edge, and a preset second weight value corresponding to the color difference of the color edge, to obtain a target value, where the target value represents the intensity of the color edge in the image to be quantized.

Assuming that the target value representing the intensity of the color edge in the image to be quantized is denoted by F, the target value F can be calculated by the following formula (1-6):
F = (W_purple × β_1 + ω_purple × β_2) / (β_1 + β_2)   (1-6)

where β_1 is the first weight value corresponding to the width of the color edge and β_2 is the second weight value corresponding to the color difference of the color edge.
In this application, the first weight value β_1 and the second weight value β_2 are adjustable parameters that can be set by the evaluator. For example, β_1 and β_2 may be the same; if the evaluator cares more about the width of the color edge, β_1 can be set larger than β_2; if the evaluator cares more about the color difference of the color edge, β_2 can be set larger than β_1. Different evaluators can set the first weight value and the second weight value according to different criteria, so that the intensity of the color edge is quantified under different standards.
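As a sketch of formula (1-6), the target value F can be computed as follows; the default weight values β_1 = β_2 = 1 are placeholders for evaluator-chosen settings.

```python
def color_edge_target_value(W_purple: float, omega_purple: float,
                            beta1: float = 1.0, beta2: float = 1.0) -> float:
    """Target value F of formula (1-6): weighted average of the color-edge
    width and the color-edge color difference with evaluator-chosen weights."""
    return (W_purple * beta1 + omega_purple * beta2) / (beta1 + beta2)
```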
In one implementation manner, the image to be quantized is a partial image in the target image, at this time, multiple images to be quantized of the target image may be obtained, and the width of the color edge and the color difference of the color edge in each image to be quantized are calculated respectively, so as to determine a target value representing the intensity degree of the color edge in the image to be quantized, and comprehensively evaluate the intensity degree of the color edge in the target image according to the multiple target values, so that the accuracy of quantizing the color edge is further improved.
In this case, a weight value may be set in advance for each image to be quantized, and the target values of the images to be quantized are weighted-averaged with their corresponding weight values to obtain a total target value, where the total target value represents the intensity of the color edge in the target image. The weight value corresponding to each image to be quantized may be set by the evaluator, for example according to the position of the image to be quantized in the target image: the closer the image to be quantized is to the center of the target image, the larger its weight value.
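One way to combine the per-ROI target values into a total target value is sketched below; the per-ROI weights (for example, larger near the image center) are an evaluator's choice, and the handling of an all-zero weight list is illustrative only.

```python
def total_target_value(target_values, roi_weights):
    """Total target value of the target image: weighted average of the
    per-ROI target values with evaluator-chosen ROI weights."""
    num = sum(f * w for f, w in zip(target_values, roi_weights))
    den = sum(roi_weights)
    return num / den if den else 0.0
```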
The following illustrates a specific application scenario related to the image color border quantization method provided in the present application:
Scene one: for longitudinal evaluation of images from different sources.
The shooting scenes of images from different sources may differ. By quantizing the images respectively with the image color edge quantization method provided in this application and applying the same threshold standard, the intensity of the color edges in each image can be quantified, thereby realizing a longitudinal evaluation of the intensity of the color edges in images from different sources.
Scene two: For lateral evaluation of various image sensors.
Given a specific environment with a large contrast between bright and dark regions, each image sensor is used to shoot an image in that environment, yielding a plurality of RAW-format images. The plurality of RAW-format images are then quantized respectively according to the image color edge quantization method provided in this application, so that the intensity of the color edge corresponding to each RAW-format image is quantified. Finally, the hardware performance of the plurality of image sensors is ranked according to the quantized results characterizing the color-edge intensity, thereby realizing a lateral evaluation of the plurality of image sensors.
Scene three: For factory inspection of devices such as image sensors, cameras, and video cameras.
Given a specific environment with a large contrast between bright and dark regions, the device to be tested shoots an image in that environment. The image shot by the device to be tested is quantized according to the image color edge quantization method provided in this application, so that the intensity of the color edge corresponding to the device to be tested is quantified. The quantized result is then compared with a preset standard value: if the quantized result is less than or equal to the standard value, the device to be tested is qualified; if the quantized result is greater than the standard value, the device to be tested is unqualified and needs to be returned to the factory for repair.
Scene four: For determining whether an obvious color edge is caused by hardware or by the algorithm.
Given a specific environment with a large contrast between bright and dark regions, an image sensor acquires a RAW-format image in that environment, and an image in JPG, JPEG, or another format is obtained after the RAW-format image is processed by the image signal processing algorithm. The images in the two formats are quantized respectively according to the image color edge quantization method provided in this application, and the intensity of the color edge in each format is quantified. The quantization results can be compared with the corresponding preset standard values, and the difference between the quantization results of the two formats can be calculated, so as to comprehensively evaluate whether the hardware or the algorithm has the greater influence on the intensity of the color edge and to make targeted improvements.
It should be appreciated that the above illustration is to aid one skilled in the art in understanding the present embodiments and is not intended to limit the present embodiments to the specific values or particular scenarios illustrated. It will be apparent to those skilled in the art from the foregoing description that various equivalent modifications or variations can be made, and such modifications or variations are intended to be within the scope of the embodiments of the present application.
Fig. 6 is an image color edge quantization apparatus 200 provided in an embodiment of the present application, which includes an obtaining module 210 and a processing module 220.
The obtaining module 210 is configured to obtain an image to be quantized, where the image to be quantized includes a color edge. Optionally, in some embodiments, the image to be quantized is a partial image in the target image.
The processing module 220 is configured to determine a width of a color edge and/or a color difference of the color edge according to a first color difference and a second color difference of a target pixel point in the image to be quantized, where the target pixel point corresponds to a first color component, a second color component, and a third color component, the first color difference is a color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is a color difference of the second color component of the target pixel point relative to the third color component of the target pixel point.
The processing module 220 is further configured to determine the intensity of the color border in the image to be quantized according to the width of the color border and/or the color difference of the color border.
Optionally, in some embodiments, the processing module 220 is specifically configured to perform weighted average according to the width of the color edge, the color difference of the color edge, the preset first weight value of the width of the corresponding color edge, and the preset second weight value of the color difference of the corresponding color edge, to obtain a target value, where the target value represents the intensity of the color edge in the image to be quantized.
Optionally, in some embodiments, the processing module 220 is specifically configured to determine, according to a first color difference and a second color difference of a target pixel point in the image to be quantized, a number of target pixel points where the first color difference is greater than a first preset color difference threshold and the second color difference is greater than a second preset color difference threshold; and is specifically configured to determine the width of the color edge according to the number of target pixel points whose first color difference is greater than the first preset color difference threshold and whose second color difference is greater than the second preset color difference threshold.
Optionally, in some embodiments, the processing module 220 is specifically configured to determine a maximum value of the first color difference and a maximum value of the second color difference of the plurality of target pixel points in the image to be quantized; and is specifically configured to determine a color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference.
Optionally, in some embodiments, the processing module 220 is specifically configured to average the maximum value of the first color difference and the maximum value of the second color difference; and is specifically configured to determine a color difference of the color edge according to the average value.
Optionally, in some embodiments, the image to be quantized includes N rows and M columns of target pixels, where N and M are integers greater than or equal to 2; the processing module 220 is further configured to, before determining the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized, respectively performing weighted average on the first color component, the second color component, and the third color component of each row or each column of the target pixel point in the image to be quantized and the preset weight coefficient, so as to obtain a weighted-average first color component, a weighted-average second color component, and a weighted-average third color component of each row or each column of the target pixel point.
The processing module 220 is further configured to determine a first color difference and a second color difference of each row or each column of the target pixel point according to the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component corresponding to each row or each column of the target pixel point.
Optionally, in some embodiments, the processing module 220 is specifically configured to normalize a difference value between the weighted-averaged first color component and the weighted-averaged third color component corresponding to each row or each column of the target pixel points to obtain a first color difference corresponding to each row or each column of the target pixel points; and the method is specifically used for carrying out normalization processing on the difference value between the second color component after weighted averaging and the third color component after weighted averaging corresponding to each row or each column of target pixel points to obtain the second color difference corresponding to each row or each column of target pixel points.
Optionally, in some embodiments, the color of the color border is purple; the first color component is a red color component, the second color component is a blue color component, and the third color component is a green color component.
Optionally, in some embodiments, the image to be quantized further includes a highlight region and a low-light region, and a difference between the brightness of the target pixel point in the highlight region and the brightness of the target pixel point in the low-light region is greater than a preset brightness difference.
The image color-side quantization apparatus 200 of the present embodiment may correspond to performing the image color-side quantization method described in the embodiments of the present application, and the above and other operations and/or functions of each unit in the image color-side quantization apparatus 200 are respectively for implementing the corresponding flows of the methods in fig. 1 and 3, and are not repeated herein for brevity.
Fig. 7 is a schematic block diagram of an image color edge quantization apparatus 300 according to an embodiment of the present application. The image color edge quantization apparatus 300 includes: processor 310, memory 320, communication interface 330, bus 340.
It should be appreciated that the processor 310 in the image color edge quantization apparatus 300 shown in fig. 7 may correspond to the processing module 220 in the image color edge quantization apparatus 200 in fig. 6, and the communication interface 330 in the image color edge quantization apparatus 300 may correspond to the obtaining module 210 in the image color edge quantization apparatus 200.
Wherein the processor 310 may be coupled to a memory 320. The memory 320 may be used to store the program code and data. Accordingly, the memory 320 may be a storage unit internal to the processor 310, an external storage unit independent of the processor 310, or a component including a storage unit internal to the processor 310 and an external storage unit independent of the processor 310.
Optionally, the image color edge quantization apparatus 300 may further include a bus 340. The memory 320 and the communication interface 330 may be connected to the processor 310 through the bus 340. The bus 340 may be a peripheral component interconnect (Peripheral Component Interconnect, PCI) bus, an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, or the like. The bus 340 may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one line is shown in fig. 7, but this does not mean that there is only one bus or only one type of bus.
It should be appreciated that in embodiments of the present application, the processor 310 may be a central processing unit (central processing unit, CPU). The processor may also be another general purpose processor, a digital signal processor (digital signal processor, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. Alternatively, the processor 310 may employ one or more integrated circuits for executing associated programs to implement the techniques provided in embodiments of the present application.
The memory 320 may include read only memory and random access memory and provide instructions and data to the processor 310. A portion of the processor 310 may also include non-volatile random access memory. For example, the processor 310 may also store information of the device type.
The processor 310 executes computer-executable instructions in the memory 320 to perform the steps of the image color edge quantization method described above using hardware resources in the image color edge quantization device when the image color edge quantization device is running.
It should be understood that the image color side quantization apparatus 300 according to an embodiment of the present application may correspond to the image color side quantization apparatus 200 in an embodiment of the present application, and may correspond to a respective subject performing the methods shown in fig. 1 and 3 according to an embodiment of the present application, and that the above and other operations and/or functions of the respective modules in the image color side quantization apparatus 300 are respectively for implementing the respective flows of the methods in fig. 1 and 3, and are not repeated herein for brevity.
The application also provides a computer readable storage medium, and the computer readable storage medium stores a computer program, wherein the computer program comprises program instructions, and when the program instructions are executed, the image color edge quantization method provided by the embodiment of the application is realized.
The present application also provides a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the image color edge quantization method provided by the embodiments of the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any other combination. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded or executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable devices. The computer instructions may be stored in or transmitted from one computer-readable storage medium to another, for example, by wired (e.g., coaxial cable, optical fiber, digital Subscriber Line (DSL)), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data store such as a server, data center, etc. that contains one or more collections of available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk (solid state drive, SSD).
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
The foregoing is merely a specific implementation of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and all changes and substitutions are included in the protection scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (23)

1. A method for image color border quantization, the method comprising:
acquiring an image to be quantized, wherein the image to be quantized comprises a color edge;
determining the width of the color edge and/or the color difference of the color edge according to a first color difference and a second color difference of a target pixel point in the image to be quantized, wherein the target pixel point corresponds to a first color component, a second color component and a third color component, the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point;
and determining the intensity degree of the color edge in the image to be quantized according to the width of the color edge and/or the color difference of the color edge.
2. The image color edge quantization method according to claim 1, wherein said determining the intensity of the color edge in the image to be quantized based on the width of the color edge and the color difference of the color edge comprises:
and carrying out weighted average according to the width of the color edge, the color difference of the color edge, a preset first weight value corresponding to the width of the color edge and a preset second weight value corresponding to the color difference of the color edge to obtain a target value, wherein the target value represents the intensity degree of the color edge in the image to be quantized.
3. The method according to claim 1, wherein determining the width of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized comprises:
determining the number of target pixel points, of which the first color difference is larger than a first preset color difference threshold value and the second color difference is larger than a second preset color difference threshold value, according to the first color difference and the second color difference of the target pixel points in the image to be quantized;
and determining the width of the color edge according to the number of target pixel points, wherein the first color difference is larger than a first preset color difference threshold value and the second color difference is larger than a second preset color difference threshold value.
4. The method according to claim 1, wherein determining the color difference of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized comprises:
determining the maximum value of the first chromatic aberration and the maximum value of the second chromatic aberration of a plurality of target pixel points in the image to be quantized;
and determining the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference.
5. The method according to claim 4, wherein determining the color difference of the color edge based on the maximum value of the first color difference and the maximum value of the second color difference comprises:
averaging the maximum value of the first color difference and the maximum value of the second color difference,
and determining the color difference of the color edge according to the average value.
6. The image color-side quantization method according to any one of claims 1 to 5, wherein the image to be quantized includes N rows and M columns of target pixel points, N and M being integers greater than or equal to 2;
before the width of the color edge and/or the color difference of the color edge are determined according to the first color difference and the second color difference of the target pixel point in the image to be quantized, the method further comprises the following steps:
Respectively carrying out weighted average on the first color component, the second color component and the third color component of each row or each column of target pixel points in the image to be quantized and a preset weight coefficient to obtain a first color component after weighted average, a second color component after weighted average and a third color component after weighted average, wherein the first color component, the second color component and the third color component correspond to each row or each column of target pixel points;
and determining the first color difference and the second color difference of each row or each column of target pixel points according to the first color component after weighted averaging, the second color component after weighted averaging and the third color component after weighted averaging, which correspond to each row or each column of target pixel points.
7. The image color edge quantization method according to claim 6, wherein said determining the first color difference and the second color difference for each row or each column of target pixel points from the weighted-averaged first color component, the weighted-averaged second color component, and the weighted-averaged third color component for each row or each column of target pixel points comprises:
normalizing the difference value between the weighted and averaged first color component and the weighted and averaged third color component corresponding to each row or each column of target pixel points to obtain a first color difference corresponding to each row or each column of target pixel points;
And carrying out normalization processing on the difference value between the second color component after weighted averaging and the third color component after weighted averaging corresponding to each row or each column of target pixel points to obtain a second color difference corresponding to each row or each column of target pixel points.
8. The image color-side quantization method according to any one of claims 1 to 7, characterized in that the color of the color side is purple; the first color component is a red color component, the second color component is a blue color component, and the third color component is a green color component.
9. The image color-border quantization method according to any one of claims 1 to 8, wherein the image to be quantized further includes a highlight region and a low-highlight region, and a difference between the brightness of the target pixel in the highlight region and the brightness of the target pixel in the low-highlight region is greater than a preset brightness difference.
10. The image color-side quantization method according to any one of claims 1 to 9, characterized in that the image to be quantized is a partial image in a target image.
11. An image color edge quantization apparatus, comprising:
the acquisition module is used for acquiring an image to be quantized, wherein the image to be quantized comprises a color edge;
The processing module is used for determining the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of a target pixel point in the image to be quantized, wherein the target pixel point corresponds to a first color component, a second color component and a third color component, the first color difference is the color difference of the first color component of the target pixel point relative to the third color component of the target pixel point, and the second color difference is the color difference of the second color component of the target pixel point relative to the third color component of the target pixel point;
the processing module is further configured to determine the intensity of the color edge in the image to be quantized according to the width of the color edge and/or the color difference of the color edge.
12. The apparatus according to claim 11, wherein,
the processing module is specifically configured to perform weighted average according to the width of the color edge, the color difference of the color edge, a preset first weight value corresponding to the width of the color edge, and a preset second weight value corresponding to the color difference of the color edge, so as to obtain a target value, where the target value represents the intensity of the color edge in the image to be quantized.
13. The apparatus according to claim 11, wherein,
the processing module is specifically configured to determine, according to the first color difference and the second color difference of the target pixel point in the image to be quantized, the number of target pixel points where the first color difference is greater than a first preset color difference threshold and the second color difference is greater than a second preset color difference threshold;
the method is particularly used for determining the width of the color edge according to the number of target pixel points, wherein the first color difference is larger than a first preset color difference threshold value, and the second color difference is larger than a second preset color difference threshold value.
14. The apparatus according to claim 11, wherein,
the processing module is specifically configured to determine a maximum value of a first color difference and a maximum value of a second color difference of a plurality of target pixel points in the image to be quantized; and the method is specifically used for determining the color difference of the color edge according to the maximum value of the first color difference and the maximum value of the second color difference.
15. The apparatus according to claim 14, wherein,
the processing module is specifically configured to average the maximum value of the first color difference and the maximum value of the second color difference; and is specifically configured to determine a color difference of the color edge according to the average value.
16. The image color-side quantization apparatus according to any one of claims 11 to 15, wherein the image to be quantized includes N rows and M columns of target pixel points, N and M each being an integer greater than or equal to 2;
the processing module is further configured to, before determining the width of the color edge and/or the color difference of the color edge according to the first color difference and the second color difference of the target pixel point in the image to be quantized, respectively performing weighted average on the first color component, the second color component and the third color component of each row or each column of the target pixel point in the image to be quantized and a preset weight coefficient to obtain a weighted-average first color component, a weighted-average second color component and a weighted-average third color component of each row or each column of the target pixel point;
and the method is also used for determining the first color difference and the second color difference of each row or each column of target pixel points according to the first color component after weighted averaging, the second color component after weighted averaging and the third color component after weighted averaging, which correspond to each row or each column of target pixel points.
17. The apparatus according to claim 16, wherein,
The processing module is specifically configured to normalize a difference value between the weighted-averaged first color component and the weighted-averaged third color component corresponding to each row or each column of target pixel points to obtain a first color difference corresponding to each row or each column of target pixel points; and the method is specifically used for carrying out normalization processing on the difference value between the second color component after weighted averaging and the third color component after weighted averaging corresponding to each row or each column of target pixel points to obtain the second color difference corresponding to each row or each column of target pixel points.
18. The image color-side quantization apparatus according to any one of claims 11 to 17, wherein the color of the color side is purple; the first color component is a red color component, the second color component is a blue color component, and the third color component is a green color component.
19. The image color edge quantization apparatus according to any one of claims 11 to 18, wherein the image to be quantized further includes a highlight region and a low-light region, and a difference between a luminance of a target pixel in the highlight region and a luminance of a target pixel in the low-light region is greater than a preset luminance difference.
20. The image color-side quantization apparatus according to any one of claims 11 to 19, wherein the image to be quantized is a partial image in a target image.
21. An image color edge quantization device, characterized in that the device comprises a memory and a processor, the memory being for storing instructions which, when executed by the processor, cause the image color edge quantization device to perform the image color edge quantization method according to any of claims 1 to 10.
22. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program comprising program instructions which, when executed, implement the image color edge quantization method according to any one of claims 1 to 10.
23. A computer program product, the computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the image color edge quantization method according to any one of claims 1 to 10.