CN111815720A - Image processing method and device, readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN111815720A
CN111815720A
Authority
CN
China
Prior art keywords
image
pixel
determining
connected region
distance value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910295510.9A
Other languages
Chinese (zh)
Inventor
周兰
许译天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Horizon Robotics Technology Research and Development Co Ltd
Original Assignee
Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd filed Critical Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN201910295510.9A priority Critical patent/CN111815720A/en
Publication of CN111815720A publication Critical patent/CN111815720A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Color Image Communication Systems (AREA)

Abstract

The application discloses an image processing method and device, a readable storage medium and electronic equipment. The method comprises: acquiring an RGB image whose photographed object is a test card; determining at least one connected region of the RGB image; for each connected region, determining a set of distance values in that region, so as to obtain a distance value set corresponding to each of the at least one connected region; and determining the color difference of the RGB image according to the distance value set corresponding to each connected region, so as to correct the RGB image according to the color difference. By determining the distance value sets of the connected regions in the RGB image and deriving the color difference of the image from them, the measurement of image color difference is quantified, subjective judgment of color difference is avoided, and the accuracy of color difference measurement is improved.

Description

Image processing method and device, readable storage medium and electronic equipment
Technical Field
The invention relates to the technical field of image processing, in particular to an image processing method, an image processing device, a readable storage medium and electronic equipment.
Background
Chromatic Aberration (CA) is an optical phenomenon in which a lens cannot focus all colors to the same convergence point, because its refractive index differs for different wavelengths of light. Chromatic aberration can be divided into axial (longitudinal) chromatic aberration and lateral chromatic aberration. In the prior art, chromatic aberration is often eliminated at the lens by adjusting the lens or replacing it with a high-quality, expensive one; the high cost of this approach limits its wide application in cameras. In addition, those skilled in the art mostly judge the degree of chromatic aberration from the image quality at the image edges and then adjust the lens to eliminate it. Such judgment is subjective and has low accuracy, which limits, to a certain extent, the adjustment of the lens or the selection of its specification and model.
Disclosure of Invention
In order to solve the above technical problem, an image processing method, an image processing apparatus, a readable storage medium, and an electronic device according to the present application are provided.
According to an aspect of the present application, there is provided an image processing method including: acquiring an RGB image, wherein a shooting object contained in the RGB image is a test card; determining at least one connected region of the RGB image; for each connected region, determining a distance value set in the connected region to obtain a distance value set corresponding to each of the at least one connected region; and determining the color difference of the RGB image according to the distance value set corresponding to each of the at least one connected region, so as to correct the RGB image according to the color difference.
According to another aspect of the present application, there is provided an image processing apparatus including: an image acquisition module for acquiring an RGB image, wherein the shooting object contained in the RGB image is a test card; a first determining module for determining at least one connected region of the RGB image; a second determining module, configured to determine, for each connected region, a distance value set in the connected region to obtain a distance value set corresponding to each of the at least one connected region; and a third determining module for determining the color difference of the RGB image according to the distance value set corresponding to each of the at least one connected region, so as to correct the RGB image according to the color difference.
According to another aspect of the present application, there is provided a computer-readable storage medium having stored thereon a computer program for executing the method of any of the above.
According to another aspect of the present application, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; the processor is configured to perform any of the methods described above.
The embodiment of the application provides an image processing method, which determines a distance value set of a connected region in an RGB image and determines the color difference of the RGB image according to the distance value set so as to realize the measurement quantification of the image color difference, avoid the subjective judgment of the color difference and improve the color difference measurement precision.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular description of embodiments of the application, as illustrated in the accompanying drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application and not to limit the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a schematic application scenario diagram of the present application.
FIG. 2 is a block diagram of an exemplary test card of the present application.
Fig. 3 is a flowchart illustrating an image processing method according to a first exemplary embodiment of the present application.
Fig. 4 is a flowchart illustrating an image processing method according to a second exemplary embodiment of the present application.
Fig. 5 is a flowchart illustrating an image processing method according to a third exemplary embodiment of the present application.
Fig. 6 is a schematic diagram of a distance value set in the first exemplary embodiment of the present application.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to a first exemplary embodiment of the present application.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to a second exemplary embodiment of the present application.
Fig. 9 is a schematic structural diagram of an image processing apparatus according to a third exemplary embodiment of the present application.
Fig. 10 is a block diagram of an electronic device provided in an exemplary embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It is to be understood that the described embodiments are merely some, not all, of the embodiments of the present application, and that this disclosure is not limited to the exemplary embodiments described herein.
Chromatic aberration is an optical phenomenon in which a lens cannot focus light of all colors to the same focal point, because the lens has different refractive indices for light of different wavelengths.
Fig. 1 is a schematic application scenario diagram of the present application, and fig. 2 is a structural diagram of an exemplary test card of the present application. First, as shown in fig. 2, the test card provided in the present application may be a black rectangular plate 20 in which a plurality of through holes 21 are provided at equal intervals. In the application scenario shown in fig. 1, the system includes a fixing support 1, a white light source 2 installed at the bottom of the fixing support 1, the test card 20 of fig. 2 installed at the middle of the fixing support 1, and an image capturing device 3 (e.g., a camera) installed at the upper portion of the fixing support 1. When the white light source 2 is turned on, its light passes through the through holes 21 of the test card 20; the image capturing device 3 then captures an image of the test card 20, and the chromatic aberration phenomenon can be observed in this image.
The application provides an image processing method, which is used for measuring the chromatic aberration of an image acquired by an image acquisition device, so that the image is corrected according to the chromatic aberration, and the image quality is improved. The image processing method of the present application will be exemplarily described below with reference to the accompanying drawings, so that those skilled in the art can clearly and accurately understand the technical solutions and effects of the present application.
Fig. 3 is a flowchart illustrating an image processing method according to a first exemplary embodiment of the present application. The embodiment can be applied to an electronic device, as shown in fig. 3, and includes the following steps:
step 301, acquiring an RGB image, wherein a shooting object included in the RGB image is a test card.
Based on the image captured by the image capturing device as in fig. 1, in this step, the captured image may be acquired from the image capturing device. In the present application, the image acquired in this step may be an RGB image.
The RGB image can be obtained, for example, in any of the following ways. Scheme 1: a white light source illuminates one side of the test card, the image capturing device photographs the test card, and the captured image is separated into its R, G and B channels by image processing to obtain an RGB image. Scheme 2: the test card is illuminated in turn with red, green and blue light, and an image of the test card is captured under each color, thereby obtaining an RGB image. Scheme 3: the white light source is filtered through red, green and blue filters in turn, the filtered light illuminates the test card, and an image of the test card is captured under each filter, thereby obtaining an RGB image.
Step 302, at least one connected region of the RGB image is determined.
In this step, all connected regions are found from the RGB image. Illustratively, all connected regions may be determined from the RGB image by a flood fill method (Floodfill) or a seed fill method.
In the present application, the number of connected regions in the RGB image may equal the number of through holes of the test card that appear in the image. If the RGB image covers the entire test card, the number of connected regions is the total number of through holes; if the image covers only a local area of the test card, it is the number of through holes in that area. For example, if the test card has 150 through holes, the RGB image has 150 connected regions when it covers the whole card, and 40 connected regions when it covers a partial area containing 40 through holes.
Step 303, for each connected region, determining a set of distance values in the connected region.
In this application, the distance value set of each connected region may include an RG distance value, a GB distance value, and a distance value from the connected region to the center point of the RGB image.
Illustratively, for the R, G and B color channels in each connected region, a distance value between the R channel and the G channel and a distance value between the G channel and the B channel are determined respectively. The distance value in the present application may be a Euclidean distance.
And 304, determining the color difference of the RGB image according to the distance value set corresponding to each of the at least one connected region, so as to correct the RGB image according to the color difference.
In this step, the color difference of the RGB image is determined based on the distance values obtained in step 303. For any one connected region, the magnitude of the distance values contained in its distance value set indicates the degree of color difference in the RGB image: for example, when the distance values in the set are large, the color difference phenomenon in the RGB image is significant.
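For illustration, a judgment as in this step could aggregate the per-region distance values into scalar summaries. The sample data and the mean-based summary below are assumptions made for the sketch, not something specified by the application.

```python
# Hypothetical distance value sets for three connected regions,
# each as (RG distance, GB distance, distance R to image center).
regions = [(0.8, 0.5, 12.0), (1.9, 1.4, 55.0), (3.1, 2.6, 98.0)]

# One possible scalar summary of the color difference: the mean
# channel-separation distances over all connected regions.
mean_rg = sum(d_rg for d_rg, _, _ in regions) / len(regions)
mean_gb = sum(d_gb for _, d_gb, _ in regions) / len(regions)

# Larger means indicate a more significant color difference.
```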
In summary, the image processing method provided by the present application determines the distance value set of the connected region in the RGB image, and determines the color difference of the RGB image according to the distance value set, so as to implement the quantification of the measurement of the color difference of the image, avoid the subjective judgment of the color difference, and improve the accuracy of the color difference measurement.
Fig. 4 is a flowchart illustrating an image processing method according to a second exemplary embodiment of the present application. On the basis of the embodiment shown in fig. 3, step 302 may include the following steps:
step 3021, determining pixel values of all pixel points of any color channel image in the RGB image.
An image is composed of a plurality of pixel points. A pixel point can be viewed as a small square of the image that is assigned a value representing a color; in a digitized image, this numerical value is the pixel value. For an RGB image, the pixel value of a pixel point is the R, G, B value corresponding to that point.
And step 3022, converting the color channel image into a binary image based on the pixel values of all the pixel points.
In this step, based on the pixel value obtained in step 3021, the pixel value of each pixel point may be compared with a selected threshold, so as to achieve the purpose of performing binarization conversion on the RGB image, so that the RGB image has an obvious black-and-white effect after the binarization conversion.
Exemplarily, converting the RGB image into a binarized image may be implemented as follows. Step 1: compare the original pixel value of each pixel point with a preset pixel threshold. Step 2: if the original pixel value of the pixel point is less than or equal to the preset pixel threshold, set the pixel value of that point to a first pixel value; otherwise, set it to a second pixel value. Step 3: the image formed by all pixel points carrying the first pixel value and all pixel points carrying the second pixel value is determined as the binarized image. For example, if the preset pixel threshold is a and the pixel value of any pixel point in the RGB image is M, then the pixel value of that point is reset to 255 when M > a and to 0 when M ≤ a. After all pixel points in the RGB image have been traversed in this way, every pixel value is either 0 or 255, and the reset pixel points form the binarized image corresponding to the RGB image.
In the present application, the preset pixel threshold may be determined by an adaptive threshold algorithm or a global threshold algorithm, or may be a value selected by those skilled in the art according to experience; this is not limited herein.
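The binarization of steps 1-3 above can be sketched as follows; the function name `binarize` and the toy 3x3 channel are illustrative, and the 0/255 pixel values match the example above.

```python
import numpy as np

def binarize(channel: np.ndarray, threshold: int) -> np.ndarray:
    """Reset every pixel value to 255 if it exceeds the preset
    threshold, and to 0 otherwise, yielding a binarized image."""
    out = np.zeros_like(channel, dtype=np.uint8)
    out[channel > threshold] = 255
    return out

# A toy 3x3 color-channel image with one bright "hole" pixel.
channel = np.array([[10, 10, 10],
                    [10, 200, 10],
                    [10, 10, 10]], dtype=np.uint8)
binary = binarize(channel, threshold=128)
```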
And step 3023, traversing pixel points in the binarized image to determine a connected region of each color channel image.
In this step, for the pixel points in the binarized image, a connected region may be determined by, for example, the flood fill (Floodfill) method or the seed fill method. Taking the seed fill method as an example: the points around a seed point are examined, points with the same value as the seed are enqueued as new seeds, and the same expansion is applied to each newly enqueued seed, so that the whole area sharing the original seed's value is selected as one connected region. The expansion may be performed as a depth-first search or a breadth-first search; the present application is not limited in this respect.
Based on the above description of the seed fill method, determining a connected region in the present application may, for example, be implemented as follows. Step 1: select a reference pixel point in the binarized image and traverse the pixel points adjacent to it, until none of the adjacent pixel points has the same pixel value as the reference pixel point. Step 2: determine the image area formed by the reference pixel point and all adjacent pixel points having the same pixel value as the reference pixel point as one connected region, thereby obtaining the connected regions of each color channel image.
For ease of understanding, suppose a pixel point a is selected and its connected region is to be found. First search the pixel points around a whose pixel value equals that of a, call them b (there may be zero or more such points). For every point b found, search in turn the points around b with the same pixel value, and continue until no further pixel points with that value can be found. The image area formed by a and all the pixel points found in this way, which share the pixel value of a, is one connected region.
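The seed fill search just described can be sketched as a breadth-first expansion over 4-connected neighbors; the function name and the toy binarized image are illustrative.

```python
from collections import deque

import numpy as np

def connected_region(binary: np.ndarray, seed: tuple) -> set:
    """Collect all 4-connected pixel points that share the seed
    pixel's value, i.e. one connected region around the seed."""
    h, w = binary.shape
    value = binary[seed]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w
                    and (nr, nc) not in region
                    and binary[nr, nc] == value):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

binary = np.array([[255, 255, 0],
                   [255, 0, 0],
                   [0, 0, 255]], dtype=np.uint8)
hole = connected_region(binary, (0, 0))  # the top-left white hole
```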
And step 3024, marking the connected region of each color channel image respectively to obtain the connected regions of the RGB images.
Based on the connected regions determined in step 3023, in this step these connected regions are marked to obtain the connected regions of the RGB image. For example, connected regions at the same corresponding position in the three color channel images may be marked with the same identifier. In the present application, each white hole (pixel value 255) in a binarized image represents one connected region.
In this step, for example, for each of the R, G, B color channel images, the position of every white hole (pixel value 255) in its binarized image is marked 1 and the area outside the white holes (pixel value 0) is marked 0; the white-hole positions in the three color channel images are then put into one-to-one correspondence by these marks, yielding the connected regions of the RGB image.
Through this embodiment, converting the RGB image into a binarized image allows all connected regions in the image to be determined more quickly and efficiently, and marking the connected regions of each color channel image to finally obtain the connected regions of the RGB image improves the accuracy with which the connected regions are determined. This in turn improves the overall operation speed and accuracy on the basis of realizing a quantitative determination of color difference.
Fig. 5 is a flowchart illustrating an image processing method according to a third exemplary embodiment of the present application. As shown in fig. 5, based on the embodiment shown in fig. 3, step 303 may include the following steps:
Step 3031, determining the coordinate value of the centroid of each color channel according to the coordinates of the pixel points of the color channel in the connected region and the number of the pixel points.
In this step, for any one connected region, the number of pixel points contained in each color channel and the coordinates (Xi, Yi) of each of those pixel points may be determined. Within any color channel of any connected region, the coordinate values of all pixel points are summed; for example, if there are N pixel points in total, then X = X1 + X2 + X3 + ... + XN and Y = Y1 + Y2 + Y3 + ... + YN. The sums of the coordinate values are then divided by the total number of pixel points to obtain the centroid of the color channel, whose coordinates are (X/N, Y/N).
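The centroid computation of step 3031 can be sketched as follows; the pixel coordinates are made-up illustration data.

```python
def channel_centroid(pixels):
    """Centroid of one color channel inside a connected region:
    the mean of the X coordinates and the mean of the Y coordinates."""
    n = len(pixels)
    x_sum = sum(x for x, _ in pixels)
    y_sum = sum(y for _, y in pixels)
    return (x_sum / n, y_sum / n)

# Four hypothetical pixel points of the R channel in one region.
r_pixels = [(2, 2), (2, 4), (4, 2), (4, 4)]
cx, cy = channel_centroid(r_pixels)  # centroid at (3.0, 3.0)
```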
And step 3032, determining the distance value of the RG and the distance value of the GB based on the coordinate values of the centroids of the three RGB color channels.
Based on the centroid coordinates of the three RGB color channels in any connected region obtained in the above step, for example (xr, yr) for the R color channel, (xg, yg) for the G color channel and (xb, yb) for the B color channel, the RG distance value and the GB distance value are determined. According to an exemplary embodiment of the present application, the RG distance value and the GB distance value may be, but are not limited to, the Euclidean distance of RG and the Euclidean distance of GB, i.e. ΔRG = √((xr - xg)² + (yr - yg)²) and ΔGB = √((xg - xb)² + (yg - yb)²).
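Assuming the Euclidean distance mentioned above, the RG and GB distance values could be computed as follows; the three centroid coordinates are hypothetical values.

```python
import math

def euclidean(p, q):
    """Euclidean distance between two centroid coordinates."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical centroids of the R, G and B channels in one region.
r_centroid = (10.0, 10.0)
g_centroid = (13.0, 14.0)
b_centroid = (13.0, 10.0)

d_rg = euclidean(r_centroid, g_centroid)  # 5.0 (a 3-4-5 triangle)
d_gb = euclidean(g_centroid, b_centroid)  # 4.0
```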
And 3033, aiming at each connected region, determining the coordinate value of the centroid of the connected region according to the coordinates of the pixel points in the connected region and the number of the pixel points.
In an exemplary embodiment of the present application, the implementation process in this step may also refer to the implementation process in step 3031, and only in this step, the coordinates of all the pixel points in the connected region and the number of the pixel points are selected to be determined. And will not be described in detail herein.
In other exemplary embodiments, since the centroid coordinates of the three color channels have already been determined based on their pixel points in the foregoing step 3031, the coordinate value of the centroid of the connected region may be determined from those three centroids in this step. Exemplarily, let the coordinate of the centroid of the connected region be (xR, yR), where xR = (xr + xg + xb)/3 and yR = (yr + yg + yb)/3.
And step 3034, determining a distance value from the connected region to the center point of the RGB image based on the coordinate value of the centroid of the connected region.
Exemplarily, let the coordinate of the center point of the RGB image be (xc, yc). Then the distance value R from the connected region to the center point of the RGB image can be expressed as R = √((xR - xc)² + (yR - yc)²).
To facilitate understanding of the distance value sets in the present application, the following description is made with reference to the diagram shown in fig. 6. As shown in fig. 6, the left area is an image of the test card 20 and contains a plurality of connected regions (corresponding, for example, to the through holes 21 in the test card 20). Any one connected region A is selected; the three circles inside it correspond to its r, g and b color channels, ΔRG and Δbg shown in the figure are the RG distance value and the GB distance value respectively, and R is the distance value from the connected region A to the center point of the RGB image.
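The region-to-center distance R can be sketched as follows; the image size and the centroid coordinate are assumed values, and the image center is taken as (width/2, height/2).

```python
import math

def distance_to_center(centroid, height, width):
    """Distance R from a connected region's centroid (xR, yR) to the
    image center point (xc, yc) = (width / 2, height / 2)."""
    xc, yc = width / 2, height / 2
    return math.hypot(centroid[0] - xc, centroid[1] - yc)

# Hypothetical 640x480 RGB image, region centroid at (350, 280).
R = distance_to_center((350.0, 280.0), height=480, width=640)  # 50.0
```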
On the basis of the embodiment shown in fig. 5, the present application may further include another embodiment, in which the distance value from a connected region to the center point of the RGB image is input into a trained multi-layer feedforward neural network, which outputs the RG distance value and the GB distance value corresponding to that distance value. In the present application, a large number of training samples, each consisting of a region-to-center distance value together with its corresponding RG and GB distance values, are used to train the multi-layer feedforward neural network repeatedly, so that it learns the mapping between the distance from a connected region to the center point of the RGB image and the corresponding RG and GB distance values. With the trained network, when the distance value from a connected region to the center point of the RGB image is input, the corresponding RG and GB distance values are output, and the color difference of the RGB image can then be determined. Through this exemplary embodiment, determining the color difference of the RGB image by means of a neural network can increase the operation speed, improve the data processing capacity, and reduce the operation cost.
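A minimal sketch of the forward pass of such a multi-layer feedforward network follows. The layer sizes are assumptions, and the weights are random placeholders standing in for trained parameters, so the outputs are not meaningful distance values.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1 input (distance to image center) -> 8 hidden units -> 2 outputs
# (the predicted RG distance and GB distance).
W1 = rng.normal(size=(1, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 2))
b2 = np.zeros(2)

def forward(distance_to_center: float) -> np.ndarray:
    """One forward pass: linear layer, ReLU, linear layer."""
    x = np.array([[distance_to_center]])
    hidden = np.maximum(0.0, x @ W1 + b1)
    return (hidden @ W2 + b2).ravel()  # [predicted d_rg, d_gb]

pred = forward(50.0)
```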
The foregoing exemplary embodiments describe the image processing method of the present application in detail, and the image processing apparatus corresponding to the foregoing method embodiments will be further described below with reference to the accompanying drawings.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to a first exemplary embodiment of the present application. As shown in fig. 7, an image processing apparatus 700 may include an image acquisition module 710, a first determination module 720, a second determination module 730, and a third determination module 740.
The image obtaining module 710 may be configured to obtain an RGB image, where a shooting object included in the RGB image is a test card; the first determining module 720 may be configured to determine at least one connected region of the RGB image; the second determining module 730 may be configured to determine, for each connected region, a distance value set in the connected region to obtain a distance value set corresponding to each of at least one connected region; the third determining module 740 may be configured to determine a color difference of the RGB image according to the respective corresponding distance value sets of the at least one connected component, so as to correct the RGB image according to the color difference.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to a second exemplary embodiment of the present application. As shown in fig. 8, the first determining module 720 may include a first determining unit 721, an image conversion unit 722, a traversal unit 723, and a marking unit 724. The first determining unit 721 may be configured to determine the pixel values of all pixel points of any color channel image in the RGB image; the image conversion unit 722 may be configured to convert the color channel image into a binarized image based on the pixel values of all the pixel points; the traversal unit 723 may be configured to traverse the pixel points in the binarized image to determine the connected regions of each color channel image; the marking unit 724 may be configured to mark the connected regions of each color channel image, where connected regions at the same corresponding position in the three color channel images are marked with the same identifier, so as to obtain the connected regions of the RGB image.
Fig. 9 is a schematic structural diagram of an image processing apparatus according to a third exemplary embodiment of the present application. As shown in fig. 9, the second determining module 730 may include a second determining unit 731, a third determining unit 732, a fourth determining unit 733, and a fifth determining unit 734. The second determining unit 731 may be configured to determine a coordinate value of a centroid of the color channel according to the coordinate of the pixel point in each color channel in the connected region and the number of the pixel points; the third determining unit 732 may be configured to determine a distance value of RG and a distance value of GB based on coordinate values of centroids of the three RGB color channels; the fourth determining unit 733 may be configured to determine, for each connected region, a coordinate value of a centroid of the connected region according to the coordinate of the pixel point in the connected region and the number of the pixel points; the fifth determining unit 734 is configured to determine a distance value from the center point of the RGB image to the connected region based on the coordinate value of the centroid of the connected region.
According to an exemplary embodiment of the present application, the image processing apparatus 700 may further include an input module (not shown in the figure), and the input module may be configured to input the distance value from the connected region to the center point of the RGB image into the trained multi-layer feedforward neural network, so as to output the distance value of RG and the distance value of GB corresponding to the distance value.
According to the image processing device, the distance value set of the connected region in the RGB image is determined, and the color difference of the RGB image is determined according to the distance value set, so that the purposes of quantifying the measurement of the image color difference, avoiding the subjective judgment of the color difference and improving the color difference measurement precision are achieved.
FIG. 10 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 10, the electronic device 11 includes one or more processors 111 and memory 112.
Processor 111 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in electronic device 11 to perform desired functions.
Memory 112 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 111 to implement the image processing methods of the various embodiments of the present application described above and/or other desired functions. Various contents such as an input signal, a signal component, a noise component, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device 11 may further include: an input device 113 and an output device 114, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
For example, the input device 113 may be a camera, a microphone, or a microphone array as described above, for capturing an input signal of an image or a sound source. When the electronic device is a stand-alone device, the input device 113 may be a communication network connector for receiving the acquired input signals from the neural network processor.
The input device 113 may also include, for example, a keyboard, a mouse, and the like.
The output device 114 may output various information to the outside, including the determined output voltage, output current information, and the like. The output devices 114 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for the sake of simplicity, only some of the components related to the present application in the electronic device 11 are shown in fig. 10, and components such as a bus, an input/output interface, and the like are omitted. In addition, the electronic device 11 may include any other suitable components, depending on the particular application.
In addition to the above-described methods and apparatus, embodiments of the present application may also take the form of a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps of the image processing method according to various embodiments of the present application described above in the "exemplary methods" section of this specification.
The computer program product may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language, for performing the operations of embodiments of the present application. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform steps in an image processing method according to various embodiments of the present application described in the "exemplary methods" section above in this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments. However, it should be noted that the advantages, effects, and the like mentioned in the present application are merely examples, not limitations, and should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purposes of illustration and description only and is not intended to be exhaustive or to limit the application to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including", "comprising", and "having" are open-ended words that mean "including, but not limited to", and are used interchangeably therewith. The word "or", as used herein, means, and is used interchangeably with, "and/or", unless the context clearly dictates otherwise. The phrase "such as", as used herein, means, and is used interchangeably with, "such as, but not limited to".
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be disassembled and/or reassembled. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the claimed embodiments to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. An image processing method comprising:
acquiring an RGB image, wherein a shooting object contained in the RGB image is a test card;
determining at least one connected region of the RGB image;
for each connected region, determining a distance value set in the connected region to obtain a distance value set corresponding to each of the at least one connected region;
and determining the color difference of the RGB image according to the distance value set corresponding to each of the at least one connected region, so as to correct the RGB image according to the color difference.
2. The method of claim 1, wherein the determining at least one connected region of the RGB image comprises:
determining pixel values of all pixel points of any color channel image in the RGB image;
converting the color channel image into a binary image based on the pixel values of all the pixel points;
traversing pixel points in the binarized image to determine the connected regions of each color channel image;
and marking the connected regions of each color channel image respectively, wherein the connected regions at the same corresponding positions in the three color channel images are marked with the same identification, so as to obtain the connected regions of the RGB image.
3. The method of claim 2, wherein converting the color channel image into a binarized image based on the pixel values of all pixel points comprises:
aiming at each color channel image, comparing the original pixel value of each pixel point of the color channel image with a preset pixel threshold value;
if the original pixel value of any pixel point is smaller than the preset pixel threshold value, setting the pixel value of the pixel point as a first pixel value;
if the original pixel value of any pixel point is larger than the preset pixel threshold value, setting the pixel value of the pixel point as a second pixel value;
and determining an image formed by all pixel points with the pixel values being the first pixel values and all pixel points with the pixel values being the second pixel values as a binary image.
4. The method of claim 2, wherein said traversing pixel points in said binarized image to determine connected regions for each of said color channel images comprises:
selecting a reference pixel point from the binarized image, and traversing all pixel points adjacent to the reference pixel point until none of the adjacent pixel points has a pixel value the same as that of the reference pixel point;
and determining an image area formed by the reference pixel point and all adjacent pixel points with the same pixel value as the reference pixel point as a connected area so as to obtain the connected area of each color channel image.
5. The method of claim 1, wherein said determining the set of distance values in the connected component comprises:
determining the coordinate values of the centroid of each color channel according to the coordinates of the pixel points of that color channel in the connected region and the number of the pixel points;
determining the RG distance value and the GB distance value based on the coordinate values of the centroids of the three RGB color channels; and
for each connected region, determining the coordinate values of the centroid of the connected region according to the coordinates of the pixel points in the connected region and the number of the pixel points;
and determining the distance value from the connected region to the center point of the RGB image based on the coordinate values of the centroid of the connected region.
6. The method of claim 5, wherein the method further comprises:
and inputting the distance value from the connected region to the central point of the RGB image into a trained multilayer feedforward neural network so as to output the RG distance value and the GB distance value corresponding to the distance value.
7. An image processing apparatus comprising:
an image acquisition module, configured to acquire an RGB image, wherein the photographed object contained in the RGB image is a test card;
a first determining module for determining at least one connected region of the RGB image;
a second determining module, configured to determine, for each connected region, a distance value set in the connected region to obtain a distance value set corresponding to each of the at least one connected region;
and the third determining module is used for determining the color difference of the RGB image according to the distance value set corresponding to each of the at least one connected region so as to correct the RGB image according to the color difference.
8. The apparatus of claim 7, wherein the first determining module comprises:
the first determining unit is used for determining pixel values of all pixel points of any color channel image in the RGB image;
the image conversion unit is used for converting the color channel image into a binary image based on the pixel values of all the pixel points;
the traversal unit is used for traversing pixel points in the binarized image to determine the connected regions of each color channel image;
and the marking unit is used for marking the connected regions of each color channel image respectively, wherein the connected regions at the same corresponding positions in the three color channel images are marked with the same identification, so as to obtain the connected regions of the RGB image.
9. A computer-readable storage medium storing a computer program for executing the image processing method according to any one of claims 1 to 6.
10. An electronic device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to perform the image processing method of any one of claims 1-6.
CN201910295510.9A 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment Pending CN111815720A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910295510.9A CN111815720A (en) 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN111815720A true CN111815720A (en) 2020-10-23

Family

ID=72843996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910295510.9A Pending CN111815720A (en) 2019-04-12 2019-04-12 Image processing method and device, readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111815720A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113781451A * 2021-09-13 2021-12-10 长江存储科技有限责任公司 Wafer detection method and device, electronic equipment and computer readable storage medium
CN113781451B * 2021-09-13 2023-10-17 长江存储科技有限责任公司 Wafer detection method, device, electronic equipment and computer readable storage medium
CN115482308A * 2022-11-04 2022-12-16 平安银行股份有限公司 Image processing method, computer device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040032588A1 (en) * 2002-08-19 2004-02-19 Imperial Chemical Industries Plc Method for obtaining an approximate standard colour definition for a sample colour
CN101166285A (en) * 2006-10-16 2008-04-23 展讯通信(上海)有限公司 Automatic white balance method and device
US20090189997A1 (en) * 2008-01-28 2009-07-30 Fotonation Ireland Limited Methods and Apparatuses for Addressing Chromatic Abberations and Purple Fringing
US20120069226A1 (en) * 2010-09-21 2012-03-22 Canon Kabushiki Kaisha Image processing apparatus that corrects for chromatic aberration for taken image, image pickup apparatus, method of correcting for chromatic aberration of magnification therefor, and storage medium
CN102970459A (en) * 2011-08-31 2013-03-13 佳能株式会社 Image processing apparatus, image capture apparatus, and image processing method
CN109215090A (en) * 2017-06-30 2019-01-15 百度在线网络技术(北京)有限公司 For calculating the method, apparatus and server of color difference

Similar Documents

Publication Publication Date Title
CN102625043B (en) Image processing apparatus, imaging apparatus, and image processing method
KR102595704B1 (en) Image detection method, device, electronic device, storage medium, and program
WO2021057474A1 (en) Method and apparatus for focusing on subject, and electronic device, and storage medium
US8199246B2 (en) Image capturing apparatus, image capturing method, and computer readable media
US20170256036A1 (en) Automatic microlens array artifact correction for light-field images
CN108764358B (en) Terahertz image identification method, device and equipment and readable storage medium
US20120163704A1 (en) Apparatus and method for stereo matching
CN110400278B (en) Full-automatic correction method, device and equipment for image color and geometric distortion
CN109698944B (en) Projection area correction method, projection apparatus, and computer-readable storage medium
US10455163B2 (en) Image processing apparatus that generates a combined image, control method, and storage medium
CN108182421A (en) Methods of video segmentation and device
CN101983507A (en) Automatic redeye detection
WO2019210707A1 (en) Image sharpness evaluation method, device and electronic device
CN111815720A (en) Image processing method and device, readable storage medium and electronic equipment
US8482630B2 (en) Apparatus and method for adjusting automatic white balance by detecting effective area
JP6065656B2 (en) Pattern processing apparatus, pattern processing method, and pattern processing program
US20170085753A1 (en) Image data generating apparatus, printer, image data generating method, and non-transitory computer readable medium
CN117095417A (en) Screen shot form image text recognition method, device, equipment and storage medium
JP4984140B2 (en) Image processing apparatus, image processing method, imaging apparatus, imaging method, and program
CN113315995B (en) Method and device for improving video quality, readable storage medium and electronic equipment
CN116109543A (en) Method and device for quickly identifying and reading data and computer readable storage medium
CN114022367A (en) Image quality adjusting method, device, electronic equipment and medium
JP6390248B2 (en) Information processing apparatus, blur condition calculation method, and program
KR101491334B1 (en) Apparatus and method for detecting color chart in image
CN111836038B (en) Method and device for determining imaging quality, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination