CN114723602A - Image processing method, image processing apparatus, terminal, and readable storage medium

Info

Publication number: CN114723602A
Application number: CN202210369140.0A
Authority: CN (China)
Prior art keywords: color, image, value, clustering, image processing
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 李海军
Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Filing date: 2022-04-08
Publication date: 2022-07-08

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/04 Context-preserving transformations, e.g. by using an importance map
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G06F 18/232 Non-hierarchical techniques
    • G06F 18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches


Abstract

The application discloses an image processing method, an image processing device, a terminal and a non-volatile computer readable storage medium. The image processing method comprises the following steps: converting an image to be processed into an LAB color space to obtain a first image; clustering the first image to obtain a plurality of clustering results; and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result. The image processing method, the image processing device, the terminal and the nonvolatile computer readable storage medium of the embodiment of the application perform clustering processing by converting an image to be processed into an LAB color space, and obtain a theme color according to a color difference sum corresponding to a clustering result to generate the theme color conforming to human eye perception.

Description

Image processing method, image processing apparatus, terminal, and readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal, and a non-volatile computer-readable storage medium.
Background
The theme colors of an image are the colors extracted from the image that best represent its dominant hue. In a colorful picture, the proportion of each color corresponds to the number of pixels of that color, so a naive approach computes the theme color by counting the pixels of each color in the picture. However, the dominant hue of a color watermark (palette) is not simply the RGB value with the largest occurrence count; it should conform to the habits of human vision and correspond to the visual focus.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a terminal and a nonvolatile computer readable storage medium, which are used for generating theme colors according with human eye perception.
The image processing method of the embodiment of the application comprises the following steps: converting an image to be processed into an LAB color space to obtain a first image; clustering the first image to obtain a plurality of clustering results; and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
An image processing apparatus according to an embodiment of the present application includes a first obtaining module, a clustering module, and a color obtaining module. The first obtaining module is used for converting an image to be processed into an LAB color space to obtain a first image; the clustering module is used for clustering the first image to obtain a plurality of clustering results; and the color obtaining module is used for obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
The terminal of the embodiment of the application comprises: one or more processors, a memory; and one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs including instructions for executing the image processing method of the embodiment of the application. The image processing method of the embodiment of the application comprises the following steps: converting an image to be processed into an LAB color space to obtain a first image; clustering the first image to obtain a plurality of clustering results; and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
When the computer program contained in the non-transitory computer-readable storage medium of the embodiments of the present application is executed by one or more processors, it causes the processors to execute the image processing method of the embodiments of the present application. The image processing method comprises the following steps: converting an image to be processed into an LAB color space to obtain a first image; clustering the first image to obtain a plurality of clustering results; and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
The image processing method, the image processing device, the terminal and the nonvolatile computer readable storage medium of the embodiment of the application perform clustering processing by converting an image to be processed into an LAB color space, and obtain a theme color according to a color difference sum corresponding to a clustering result. The LAB color space is the color space which is most suitable for being perceived by human eyes, and the colors corresponding to the clustering result can be in accordance with the perception of the human eyes by clustering in the LAB color space so as to generate theme colors in accordance with the perception of the human eyes.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 2 is a schematic diagram of an image processing apparatus according to some embodiments of the present application;
FIG. 3 is a schematic block diagram of a terminal according to some embodiments of the present application;
FIG. 4 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 5 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 6 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 7 is a schematic view of a center point location of an image processing method according to some embodiments of the present application;
FIG. 8 is a schematic diagram of a clustering scene of an image processing method according to some embodiments of the present application;
FIG. 9 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 10 is a schematic diagram of a clustering scenario of an image processing method according to some embodiments of the present application;
FIG. 11 is a schematic diagram of a clustering scene of an image processing method according to some embodiments of the present application;
FIG. 12 is a schematic diagram of a clustering scenario of an image processing method according to some embodiments of the present application;
FIG. 13 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 14 is a schematic diagram of a clustering scenario of an image processing method according to some embodiments of the present application;
FIG. 15 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 16 is a schematic flow chart diagram of an image processing method according to some embodiments of the present application;
FIG. 17 is a schematic diagram of a YUV image and corresponding color watermark according to some embodiments of the present application;
FIG. 18 is a schematic diagram of a connection between a computer-readable storage medium and a processor according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1, an embodiment of the present application provides an image processing method. The image processing method comprises the following steps:
02: converting an image to be processed into an LAB color space to obtain a first image;
03: clustering the first image to obtain a plurality of clustering results; and
04: and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
Referring to fig. 2, an image processing apparatus 10 is further provided in the present embodiment. The image processing apparatus 10 includes a first obtaining module 11, a clustering module 13, and a color obtaining module 14. The first obtaining module 11 is used to implement the method in 02. The clustering module 13 is used to implement the method in 03. The color obtaining module 14 is used to implement the method in 04. That is, the first obtaining module 11 is configured to convert the image to be processed into the LAB color space to obtain the first image. The clustering module 13 is configured to perform clustering processing on the first image to obtain a plurality of clustering results. The color obtaining module 14 is configured to obtain a theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
Referring to fig. 3, the present embodiment further provides a terminal 100. The terminal 100 may be a mobile phone, a desktop computer, a notebook computer, a server, an intelligent appliance, an intelligent wearable device, a game machine, a digital camera, etc., which are not listed here. The terminal 100 includes one or more processors 80, memory 90, and one or more programs. Where one or more programs are stored in the memory 90 and executed by the one or more processors 80. Referring to fig. 1, the program includes instructions for executing the image processing method according to the embodiment of the present application. That is, the program includes instructions for executing the methods in 02, 03, and 04.
The image theme color is a plurality of colors extracted from an image and capable of representing the dominant hue of the image. In the embodiment of the application, the image to be processed and the first image are representations of the same frame of image in different color spaces. The image processing method, the image processing apparatus 10, and the terminal 100 according to the embodiment of the present application perform clustering processing by converting an image to be processed into an LAB color space, and obtain a theme color according to a color difference sum corresponding to a clustering result.
The LAB color space is a color-opponent space that includes three components: L, a, and b. The dimension L represents luminance, while a and b are the color-opponent dimensions: a represents the component from green to red, and b represents the component from blue to yellow. LAB is designed based on human perception of color, which arises from the light stimuli received by the human eye as three opponent sensations: white versus black, red versus green, and yellow versus blue. The three components L, a, b of the LAB color space are the color channels designed for these three visual sensations. If changes in the three parameters L, a, and b are of similar magnitude, they produce visual changes of similar magnitude. Therefore, the LAB color space is the color space that best matches human visual perception. Clustering in the LAB color space makes the colors corresponding to the clustering results conform to human perception. Obtaining the theme colors according to the color difference sums corresponding to the clustering results reduces gradient colors among the theme colors, and yields theme colors with distinct hues, clear characteristics, and a visual focus.
The following is further described with reference to the accompanying drawings.
The image to be processed may be in various formats. In one embodiment, if the image to be processed is already an image in the LAB color space, the image to be processed is taken as the first image for subsequent processing.
Referring to fig. 4, in some embodiments, the image to be processed is an image in XYZ color space, and pixels of the image to be processed include a first XYZ component, a second XYZ component, and a third XYZ component. 02: converting the image to be processed into an LAB color space to obtain a first image, comprising:
021: acquiring an LAB three-channel value of a pixel of the image to be processed according to a first XYZ component, a second XYZ component, a third XYZ component, a preset mapping relation and a preset hyper-parameter of the pixel of the image to be processed; and
022: and acquiring a first image according to the LAB three-channel value of each pixel of the image to be processed.
Referring to fig. 2, in some embodiments, the first obtaining module 11 can be further configured to implement the methods in 021 and 022. That is, the first obtaining module 11 can be further configured to: acquire the LAB three-channel values of the pixels of the image to be processed according to the first XYZ component, the second XYZ component, the third XYZ component, a preset mapping relation, and a preset hyper-parameter of the pixels of the image to be processed; and acquire the first image according to the LAB three-channel values of each pixel of the image to be processed.
Referring to FIG. 3, in some embodiments, the program further includes instructions for performing the methods of 021 and 022.
The XYZ color space is based on the RGB color system, replacing the actual RGB primaries with three ideal primaries. The XYZ color space includes X, Y, and Z coordinate axes, where the X coordinate of a color indicates the red proportion, the Y coordinate indicates the green proportion, and the Z coordinate indicates the blue proportion. An image pixel in the XYZ color space includes a first XYZ component, a second XYZ component, and a third XYZ component, and represents the ratio of red, green, and blue of the pixel, respectively. The following description will be made with the first XYZ component being X, the second XYZ component being Y, and the third XYZ component being Z, respectively.
Let the channel values of the L, A, B channels of the LAB color space be l, a, b, respectively. The following formula one, formula two, and formula three can be used to convert the to-be-processed image in the XYZ color space into the first image in the LAB color space.

Formula one:

l = 116 · f(Y / Yn) − 16

Formula two:

a = 500 · [f(X / Xn) − f(Y / Yn)]

Formula three:

b = 200 · [f(Y / Yn) − f(Z / Zn)]

wherein

f(t) = t^(1/3), if t > (6/29)^3; otherwise f(t) = (1/3) · (29/6)^2 · t + 4/29.

Xn, Yn, Zn are hyper-parameters. In one embodiment, Xn is 95.047, Yn is 100.000, and Zn is 108.883, to accurately obtain the mapping of the XYZ color space to the LAB color space.
As shown in formula one, L in the LAB color space represents luminance, and luminance in the XYZ color space is carried by the Y component. Formula one relates the brightness of the LAB color space to that of the XYZ color space to obtain the L channel value.

As shown in formula two, a in the LAB color space represents the component from green to red, while in the XYZ color space the X component represents the proportion of red and the Y component represents the proportion of green. Formula two relates the red-green relation between the LAB color space and the XYZ color space to obtain the a channel value.

As shown in formula three, b in the LAB color space represents the component from blue to yellow, while in the XYZ color space the Z component represents the proportion of blue and the Y component represents the proportion of green, and green is adjacent to yellow. Formula three relates the blue-green relation between the LAB color space and the XYZ color space to obtain the b channel value.
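To make formulas one to three concrete, the following is a minimal Python sketch of the XYZ-to-LAB conversion; the function names and the vectorized NumPy interface are illustrative choices, not part of the disclosed method.

```python
import numpy as np

# Hyper-parameters from the embodiment above (the D65 reference white).
XN, YN, ZN = 95.047, 100.000, 108.883

def f(t):
    """Piecewise mapping shared by formulas one to three."""
    t = np.asarray(t, dtype=np.float64)
    delta = 6.0 / 29.0
    return np.where(t > delta ** 3,
                    np.cbrt(t),
                    t / (3.0 * delta ** 2) + 4.0 / 29.0)

def xyz_to_lab(x, y, z):
    """Formulas one to three: XYZ components -> LAB three-channel values."""
    fx, fy, fz = f(x / XN), f(y / YN), f(z / ZN)
    l = 116.0 * fy - 16.0    # formula one: luminance from the Y component
    a = 500.0 * (fx - fy)    # formula two: green-to-red opponent channel
    b = 200.0 * (fy - fz)    # formula three: blue-to-yellow opponent channel
    return l, a, b
```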
Referring to fig. 1, in some embodiments, the image to be processed is an image in an RGB color space; the RGB color space is a color space based on the three primary colors red, green, and blue. Converting an image from the RGB color space to the LAB color space requires the XYZ color space as an intermediary. Let the pixel values of the image to be processed in the R, G, B color channels take values in [0, 255], and denote this value range as range. Formula four can then be used to convert the image in the RGB color space into the corresponding image in the XYZ color space.
Formula four:

[X, Y, Z]^T = (100 / range) · M · [R, G, B]^T

wherein M is the conventional RGB-to-XYZ conversion matrix for the D65 reference white:

M = [0.412453  0.357580  0.180423]
    [0.212671  0.715160  0.072169]
    [0.019334  0.119193  0.950227]

Finishing to obtain:

X = 100 · (0.412453 · R + 0.357580 · G + 0.180423 · B) / range
Y = 100 · (0.212671 · R + 0.715160 · G + 0.072169 · B) / range
Z = 100 · (0.019334 · R + 0.119193 · G + 0.950227 · B) / range
thus, X, Y, Z component values in the corresponding XYZ color space may be determined from the pixel values of the image to be processed at R, G, B three color channels to obtain a corresponding first image.
In some embodiments, since the conversion relationship between the XYZ color space and the LAB color space is relatively clear, an image to be processed in any color space may be converted into an image in the XYZ color space, and then converted into the LAB color space to obtain the first image.
Referring to fig. 5, in some embodiments, the image processing method further includes:
05: acquiring a YUV image;
06: performing down-sampling processing on the YUV image to obtain a second image;
07: converting the second image into an RGB color space to obtain a third image; and
08: the third image is converted to XYZ color space to obtain the image to be processed.
Referring to fig. 2, in some embodiments, the image processing apparatus 10 further includes a second obtaining module 12, a third obtaining module 15, a down-sampling module 16, and a fourth obtaining module 17. The second obtaining module 12 is used to implement the method in 05. The down-sampling module 16 is used to implement the method in 06. The third obtaining module 15 is used to implement the method in 07. The fourth obtaining module 17 is used to implement the method in 08. That is, the second obtaining module 12 is used to acquire the YUV image. The down-sampling module 16 is used to down-sample the YUV image to obtain the second image. The third obtaining module 15 is used to convert the second image into the RGB color space to obtain the third image. The fourth obtaining module 17 is used to convert the third image into the XYZ color space to obtain the image to be processed. Referring to fig. 3, in some embodiments, the program further includes instructions for performing the methods of 05, 06, 07, and 08.
Generally, images captured by a mobile phone or a camera are YUV images. According to the methods in 05, 06, 07, and 08, the YUV image can be converted into an image in the RGB color space for subsequent image processing. Down-sampling the YUV image increases the speed of the subsequent image processing and reduces the time a user waits for the theme colors to be generated. In one embodiment, the YUV image is reduced by a factor of 16 to obtain the second image; in other embodiments, the YUV image may be reduced by a factor of 8, 4, 2, 32, and so on, which are not listed here.
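As an illustration of the down-sampling in 06, a possible OpenCV sketch follows; the cv2.resize call and the per-axis factor-of-16 reduction are one plausible reading of "reduced by a factor of 16", not a detail fixed by the text.

```python
import cv2

def downsample(image, factor=16):
    """06: shrink each spatial dimension by `factor` to speed up later steps."""
    h, w = image.shape[:2]
    return cv2.resize(image, (max(1, w // factor), max(1, h // factor)),
                      interpolation=cv2.INTER_AREA)  # area averaging when shrinking
```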
In some embodiments, the second image may be converted to the RGB color space according to the following formula five to obtain the third image.

Formula five:

R = Y + 1.402 · (Cr − 128)
G = Y − 0.344136 · (Cb − 128) − 0.714136 · (Cr − 128)
B = Y + 1.772 · (Cb − 128)

wherein Y, Cb, Cr are the components of the three channels of the second image in the YUV color space, and R, G, B are the components of the three channels after the second image is converted into the RGB color space.
The third image is an image in an RGB color space, and with reference to formula four, the third image in the RGB color space may be converted into an image to be processed in an XYZ color space, so that with reference to formula one, formula two, and formula three, the image to be processed in the XYZ color space is converted into the first image in an LAB color space for processing.
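A sketch of formula five, assuming the standard BT.601 constants with a 128 offset on the chroma channels (the original constants are again an image placeholder):

```python
import numpy as np

def yuv_to_rgb(y, cb, cr):
    """Formula five (assumed BT.601 constants): Y, Cb, Cr -> R, G, B."""
    r = y + 1.402 * (cr - 128.0)
    g = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
    b = y + 1.772 * (cb - 128.0)
    return tuple(np.clip(c, 0.0, 255.0) for c in (r, g, b))
```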
Referring to fig. 6, in some embodiments, 03: clustering the first image to obtain a plurality of clustering results, comprising:
031: determining a plurality of central points in the first image according to the number of color classifications, wherein each central point corresponds to one color classification;
032: acquiring a first color value of each color classification and a second color value of each pixel point in the first image;
033: dividing the pixel points into corresponding color classifications according to the difference value between the second color value of each pixel point and the first color value of each color classification;
034: obtaining the average value of the second color values of all the pixel points in each color classification; and
035: and updating the central point and the first color value corresponding to each color classification according to the average value corresponding to each color classification.
Referring to fig. 2, in some embodiments, the clustering module 13 can be further configured to implement the methods in 031, 032, 033, 034, and 035. That is, the clustering module 13 can be further configured to: determine a plurality of central points in the first image according to the number of color classifications, each central point corresponding to one color classification; acquire a first color value of each color classification and a second color value of each pixel point in the first image; divide the pixel points into the corresponding color classifications according to the difference value between the second color value of each pixel point and the first color value of each color classification; obtain the average value of the second color values of the pixel points in each color classification; and update the central point and the first color value corresponding to each color classification according to the average value corresponding to each color classification. Referring to fig. 3, in some embodiments, the program further includes instructions for performing the methods of 031, 032, 033, 034, and 035.
In some embodiments, the number of color classifications is twice the target number of theme colors. The target number of theme colors is a preset value indicating how many colors are used to reflect the dominant hue of an image. For example, if the target number of theme colors is 5, then 5 different colors are used to reflect the dominant hue of an image. The number of clustering results is the same as the number of color classifications; that is, however many color classifications there are, that many clustering results are finally obtained. Referring to fig. 1, when the number of color classifications is twice the target number of theme colors, the number of clustering results is also twice the target number, so that when the theme colors are obtained according to the plurality of clustering results and the color difference sum corresponding to each clustering result, suitable colors can be screened from the colors corresponding to the plurality of clustering results; for example, colors with similar color differences among the clustering results are screened out, so that the finally obtained theme colors differ markedly from one another, yielding vivid theme colors. In other embodiments, the number of color classifications is not limited to twice the target number of theme colors and may be any number greater than the target number; for example, it may exceed the target number by 1, 2, or 3, or be 3, 4, or 5 times the target number, which are not listed here.
Referring to fig. 7, a center point is a selected position point in the first image; the image may equally be the to-be-processed image in the RGB color space or an initial image in another color space, which is not limited herein. In the embodiment illustrated in fig. 7, the target number of theme colors is 5, the number of color classifications is 10, and the number of center points is 10. In determining the center points, the image is divided into 4 parts in the lateral direction by 3 dividing lines and into 4 parts in the longitudinal direction by another 3 dividing lines. Among the intersection points D1, D2, D3, D4, D5, D6, D7, D8, D9 formed by the above 6 dividing lines, the intersection points D1, D2, D3, D4, D5, D7, D8, D9 are set as center points, and the midpoint D10 of the intersection points D4 and D5 and the midpoint D11 of the intersection points D5 and D6 are also set as center points, resulting in 10 center points. In other embodiments, the distribution of the center points is not limited to the distribution illustrated in fig. 7; for example, in yet another embodiment, the center points are evenly distributed in the image.
The clustering process classifies the pixel points in the first image, according to their colors, into the color classification with the closest color. The first color value is the color corresponding to the position of the central point, represented by the L, A, B channel values; for example, let the first color value corresponding to the center point Ai be Yai = (Li, Ai, Bi). The second color value is the color corresponding to the position of the pixel point, also represented by the L, A, B channel values; for example, let the second color value corresponding to the pixel point Pj be Ypj = (Lj, Aj, Bj). The difference value reflects the color difference between the first color value Yai and the second color value Ypj; the smaller the difference value, the closer the color corresponding to the first color value Yai is to the color corresponding to the second color value Ypj. In the clustering process, each pixel point Pj is divided into the color classification whose center point Ai has the closest color.
Taking the pixel point P1 as an example, the difference values between the pixel point P1 and each center point Ai are obtained to determine the cluster whose color is closest to that of P1. If the difference between the colors corresponding to the pixel point P1 and the center point A2 is the minimum, and the center point A2 corresponds to the color classification S2, the pixel point P1 is classified into the color classification S2. And so on: the difference value between each pixel point Pj and each center point Ai is obtained to determine the color classification Sk corresponding to each pixel point Pj, where k ∈ [1, i].
Referring to FIG. 8, as an example, the first image has a center point A1 and a center point A2, determined to correspond to color classification S1 and color classification S2, respectively. For convenience of explanation, other pixel points in the first image are omitted in the example of fig. 8, and clustering is explained with the pixel points P1, P2, P3, P4, P5, and P6. If the pixel points with the minimum difference value to the color of the center point A1 are P1, P2, and P3, and the pixel points with the minimum difference value to the color of the center point A2 are P4, P5, and P6, then P1, P2, and P3 are classified into color classification S1, and P4, P5, and P6 are classified into color classification S2. In fig. 8, color classification S1 is located above dividing line O1, and color classification S2 is located below dividing line O1.
After the primary division is performed, an average value of the second color value Ypj of each pixel point Pj in each color classification Sk is obtained, and the center point Ai and the first color value Yai corresponding to each color classification Sk are updated according to the average value corresponding to each color classification Sk. In this way, the average value of the second color values Ypj of the pixel points Pj included in the divided color classifications Sk is used as the first color value Yai corresponding to the color classifications Sk, so that the overall color characteristics of the divided color classifications Sk can be represented more accurately. The updated center point Ai is a position point of the updated first color value Yai corresponding to the LAB color space.
Referring to fig. 8, as an example, let the average value of the second color values Ypj of the pixel points Pj in each color classification Sk be Ydk. After the first division, the average value of the second color values Yp1, Yp2, and Yp3 of the pixel points P1, P2, and P3 in the color classification S1 is Yd1; Yd1 is used as the updated value of the first color value Ya1 of the color classification S1, and the center point corresponding to the updated color classification S1 is A3. The average value of the second color values Yp4, Yp5, and Yp6 of the pixel points P4, P5, and P6 in the color classification S2 is Yd2; Yd2 is used as the updated value of the first color value Ya2 of the color classification S2, and the center point corresponding to the updated color classification S2 is A4.
Referring to fig. 9, in some embodiments, 03: clustering the first image to obtain a plurality of clustering results, further comprising:
036: dividing the pixel points into corresponding color classifications according to a difference value between the second color value of each pixel point and the first color value updated by each color classification;
037: finishing clustering processing when the updating times of the central point corresponding to the color classification reach a preset time threshold value, and outputting a first color value corresponding to the central point as a clustering result; or
038: and finishing clustering processing under the condition that the central points corresponding to the color classifications before and after updating are not changed, and outputting the first color values corresponding to the central points as clustering results.
Referring to fig. 2, in some embodiments, the clustering module 13 may also be configured to implement the methods in 036, 037, or 038. That is, the clustering module 13 may also be configured to: divide the pixel points into the corresponding color classifications according to the difference value between the second color value of each pixel point and the updated first color value of each color classification; end the clustering processing when the number of updates of the center point corresponding to the color classification reaches a preset threshold, and output the first color value corresponding to the center point as the clustering result; or end the clustering processing when the center points corresponding to the color classifications do not change before and after updating, and output the first color values corresponding to the center points as the clustering results. Referring to fig. 3, in some embodiments, the program further includes instructions for performing the methods of 036, 037, or 038.
With reference to fig. 8, 10, and 11, after one update of the center points Ai and the first color values Yai is completed, the second color value Ypj of each pixel point Pj is clustered again according to the updated first color value Yai of each color classification Sk, until the number of updates of the center points Ai reaches the preset threshold, or until the updated first color values Yai are the same as the first color values Yai before updating, at which point the clustering ends. Ending the clustering means accepting the first color value Yai corresponding to each color classification Sk as a clustering result, i.e., accepting the first color value Yai as the representative color of the color classification Sk.
Referring to fig. 8, 10, and 11, fig. 10 shows the next round of clustering based on the clustering of fig. 8. For this round, suppose the pixel points with the minimum difference value to the color of the center point A3 are P1, P2, and P4, and the pixel points with the minimum difference value to the color of the center point A2 are P3, P5, and P6; then P1, P2, and P4 are classified into color classification S1, and P3, P5, and P6 are classified into color classification S2. In fig. 10, the color classification S1 is on the left side of the dividing line O1, and the color classification S2 is on the right side of the dividing line O1. As shown in fig. 11, after the division and the update of the center points, the center point of the color classification S1 is A5, and the value Yd1 of the corresponding first color value Ya1 is the average of the second color values Yp1, Yp2, Yp4 of the pixel points P1, P2, and P4; the center point of the color classification S2 is A6, and the value Yd2 of the corresponding first color value Ya2 is the average of the second color values Yp3, Yp5, Yp6 of the pixel points P3, P5, and P6.
Referring to fig. 12, fig. 12 shows the next round of clustering based on the clustering shown in fig. 11. After this round, the center point of the color classification S1 is updated to A7, and the center point of the color classification S2 is updated to A8. In one embodiment, if the position of A7 in the LAB color space is the same as the position of A5, and the position of A8 is the same as the position of A6, the clustering is ended, and the first color value Ya1 corresponding to the color classification S1 and the first color value Ya2 corresponding to the color classification S2 after this round are output as the clustering results. In yet another embodiment, the position of A7 in the LAB color space differs from that of A5 and the position of A8 differs from that of A6, but the number of updates of the center points has reached the preset threshold; the clustering is then also ended, and the first color value Ya1 corresponding to the color classification S1 and the first color value Ya2 corresponding to the color classification S2 after the final round are output as the clustering results.
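The loop described in 031 to 038 can be sketched as a K-means-style iteration over the LAB pixels. Here `dist_fn` stands in for the formula-six difference value introduced in the next subsection, and the fixed iteration cap plays the role of the preset threshold in 037; the function and parameter names are illustrative.

```python
import numpy as np

def cluster_lab(pixels, centers, dist_fn, max_updates=30):
    """031-038: iterative clustering of LAB pixels around center colors.

    pixels:  (N, 3) array of second color values Ypj.
    centers: (K, 3) array of initial first color values Yai.
    dist_fn: pairwise difference value, e.g. the formula-six difference
             described below (assumed to broadcast over (N, 1, 3) vs
             (1, K, 3) inputs and return an (N, K) array).
    """
    pixels = np.asarray(pixels, dtype=np.float64)
    centers = np.asarray(centers, dtype=np.float64).copy()
    for _ in range(max_updates):                     # 037: cap on update count
        d = dist_fn(pixels[:, None, :], centers[None, :, :])   # (N, K)
        labels = np.argmin(d, axis=1)                # 033/036: nearest classification
        new_centers = centers.copy()
        for k in range(len(centers)):                # 034: per-classification mean
            members = pixels[labels == k]
            if len(members):
                new_centers[k] = members.mean(axis=0)
        if np.allclose(new_centers, centers):        # 038: centers unchanged -> stop
            break
        centers = new_centers                        # 035: update first color values
    return centers, labels
```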
Referring to fig. 13, in some embodiments, 033: dividing the pixel points into corresponding color classifications according to a difference value between the second color value of each pixel point and the first color value of each color classification, including:
0331: acquiring a first LAB component of the first color value and a second LAB component of the second color value;
0332: acquiring a hue rotation term, a brightness compensation value, a chroma compensation value and a hue compensation value according to the first LAB component and the second LAB component;
0333: obtaining a difference value according to the first LAB component, the second LAB component, the tone rotation item, the brightness compensation value, the chroma compensation value and the tone compensation value; and
0334: and selecting the minimum difference value in the plurality of difference values between the second color value and the plurality of first color values of the pixel point, and dividing the pixel point into color classifications corresponding to the minimum difference value.
Referring to fig. 2, in some embodiments, the clustering module 13 can also be used to implement the methods of 0331, 0332, 0333, and 0334. That is, the clustering module 13 can also be used to: obtain a first LAB component of the first color value and a second LAB component of the second color value; acquire a hue rotation term, a brightness compensation value, a chroma compensation value, and a hue compensation value according to the first LAB component and the second LAB component; obtain the difference value according to the first LAB component, the second LAB component, the hue rotation term, the brightness compensation value, the chroma compensation value, and the hue compensation value; and select the minimum difference value among the plurality of difference values between the second color value of the pixel point and the plurality of first color values, and divide the pixel point into the color classification corresponding to the minimum difference value. Referring to fig. 3, in some embodiments, the program further includes instructions for performing the methods of 0331, 0332, 0333, and 0334.
Please refer to fig. 14, wherein the first color value of the center point A1 is Ya1 = (L1, a1, b1), the first color value of the center point A2 is Ya2 = (L2, a2, b2), and the second color value of the pixel point P1 is Yp1 = (L1*, a1*, b1*). The difference value between the center point A1 and the pixel point P1 is E1, and the difference value between the center point A2 and the pixel point P1 is E2. The difference value E1 can be calculated by the following formula six:
Formula six:

E1 = sqrt[ (ΔL/(KL·SL))² + (ΔC/(KC·SC))² + (ΔH/(KH·SH))² + RT·(ΔC/(KC·SC))·(ΔH/(KH·SH)) ]

wherein KL, KC, KH are all preset values; in one embodiment, KL, KC, and KH all take the value 1, and:

ΔL = L1 − L1*, ΔC = C1 − C2;

a1′ = (1 + G)·a1, a2′ = (1 + G)·a1*, with G = (1/2)·(1 − sqrt(Cab⁷/(Cab⁷ + 25⁷))), where Cab is the mean of sqrt(a1² + b1²) and sqrt(a1*² + b1*²);

C1 = sqrt(a1′² + b1²), C2 = sqrt(a2′² + b1*²);

h1 = atan2(b1, a1′) mod 360°, h2 = atan2(b1*, a2′) mod 360°;

Δh = h2 − h1 (adjusted by ±360° when |h2 − h1| > 180°), ΔH = 2·sqrt(C1·C2)·sin(Δh/2);

Lm = (L1 + L1*)/2, Cm = (C1 + C2)/2, Hm = (h1 + h2)/2 (adjusted by 180° when |h1 − h2| > 180°);

T = 1 − 0.17·cos(Hm − 30°) + 0.24·cos(2·Hm) + 0.32·cos(3·Hm + 6°) − 0.20·cos(4·Hm − 63°);

SL = 1 + 0.015·(Lm − 50)²/sqrt(20 + (Lm − 50)²); SC = 1 + 0.045·Cm; SH = 1 + 0.015·Cm·T;

Δθ = 30°·exp(−((Hm − 275°)/25)²); RC = 2·sqrt(Cm⁷/(Cm⁷ + 25⁷)); RT = −sin(2·Δθ)·RC.
Wherein RT is the hue rotation term. Introducing the hue rotation term RT adapts the difference value E1 to the blue region around the hue angle of 275°, avoiding inaccuracy of the difference in the blue region. SL is the brightness compensation value, which makes the difference value E1 accurately reflect the brightness difference. SC is the chroma compensation value, which makes the difference value E1 accurately reflect the chroma difference. SH is the hue compensation value, which makes the difference value E1 accurately reflect the hue difference. The difference value E2 is calculated similarly to E1, replacing the components L1, a1, b1 in formula six with L2, a2, b2. After the difference values E1 and E2 are calculated, the pixel point P1 is divided into the color classification corresponding to the smaller of E1 and E2; for example, if E1 < E2, the pixel point P1 is classified into the color classification S1. Similarly, when the number of center points is i, the difference value Ek between the second color value Yp1 of the pixel point P1 and each first color value Yak is calculated, and the pixel point P1 is divided into the color classification Skmin corresponding to the minimum difference value Ekmin, where k ∈ [1, i]. By analogy, each pixel point Pj can be divided into the corresponding color classification Sk.
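Formula six is, term for term, the CIEDE2000 color difference, so a sketch need not re-derive it; scikit-image's deltaE_ciede2000 exposes the same KL, KC, KH weights (all 1 in the embodiment above). The helper name is illustrative.

```python
import numpy as np
from skimage.color import deltaE_ciede2000

def difference_values(pixel_lab, center_labs):
    """Formula six: CIEDE2000 difference between one pixel and every center."""
    center_labs = np.asarray(center_labs, dtype=np.float64)
    pixel = np.broadcast_to(np.asarray(pixel_lab, dtype=np.float64),
                            center_labs.shape)
    return deltaE_ciede2000(pixel, center_labs, kL=1, kC=1, kH=1)

# 0334: divide the pixel into the classification with the minimum difference.
# k_min = int(np.argmin(difference_values(yp1, centers)))
```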
After the clustering results are obtained, the n clustering results with the largest color difference sums relative to the other clustering results are selected to generate n theme colors, so that the theme colors are distinct and representative, and similar gradient colors are prevented from being selected as theme colors.
Referring to fig. 15, in some embodiments, 04: obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result, comprising:
041: respectively obtaining the color difference sum corresponding to each clustering result; and
042: and acquiring the previous n clustering results in the order from large to small of the color difference sum to respectively generate n theme colors, wherein the value of n is the target number of the theme colors.
Referring to fig. 2, in some embodiments, the color obtaining module 14 may also be used to implement the methods in 041 and 042. That is, the color obtaining module 14 may also be used to: respectively obtain the color difference sum corresponding to each clustering result; and acquire the first n clustering results in descending order of color difference sum to respectively generate n theme colors, where the value of n is the target number of theme colors. Referring to FIG. 3, in some embodiments, the program further includes instructions for performing the methods of 041 and 042.
In some embodiments, let Jm be the clustering result, sum of color differences corresponding to Jm be Zm, and let m be 6, i.e. 6 clustering results are obtained after clustering, and n be 3, i.e. 3 theme colors are generated. Firstly, mapping the clustering result Jm back to an RGB color space to obtain an RGB color value Um corresponding to the clustering result Jm, wherein the color difference sum Zm of the color value Um is the sum of each color difference between the color value Um and other color values. See, for example, the following table:
[Table: pairwise color differences between the color values U1 to U6. The entry at the intersection of row Ui and column Uj is the color difference between Ui and Uj; summing each row gives the corresponding color difference sum Zi.]
In the table, the number at the intersection of a row and a column represents the color difference between the color value on the horizontal axis and the color value on the vertical axis; for example, the color difference between the color value U1 and itself is 0, and the color difference between the color value U1 and the color value U2 is 1. Z1, Z2, Z3, Z4, Z5, and Z6 are the color difference sums corresponding to the color values U1, U2, U3, U4, U5, and U6, respectively. The color difference sum Z6 is the maximum, indicating that the color value U6 corresponding to the clustering result J6 differs most from the color values of the other clustering results; the color difference sum Z1 is the minimum, indicating that the color value U1 corresponding to the clustering result J1 differs least from the color values of the other clustering results. When n is 3, the first 3 clustering results with the largest color difference sums are taken to generate the theme colors; in this example, these are the clustering results J6, J5, and J4, and the corresponding color values U6, U5, and U4 are taken as the theme colors. In other embodiments, after the first n clustering results with the largest color difference sums are obtained, the theme color is not limited to the color value in the RGB color space corresponding to the clustering result; color values in other color spaces, such as the HSV color space, the LAB color space, or the YUV color space, may also be used, which is not limited herein.
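A sketch of 041 and 042 under the table's conventions: each cluster color's differences to all the others are summed, and the n colors with the largest sums become the theme colors. The pairwise-difference callable is a placeholder for whatever metric produced the table.

```python
import numpy as np

def pick_theme_colors(cluster_colors, pairwise_diff, n):
    """041/042: keep the n cluster colors with the largest color difference sums."""
    colors = np.asarray(cluster_colors, dtype=np.float64)   # U1..Um
    m = len(colors)
    z = np.zeros(m)                                         # Z1..Zm
    for i in range(m):
        for j in range(m):
            z[i] += pairwise_diff(colors[i], colors[j])     # row sums of the table
    top_n = np.argsort(z)[::-1][:n]                         # largest sums first
    return colors[top_n]
```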
Referring to fig. 16, in some embodiments, the image processing method further includes:
09: converting the third color value corresponding to the theme color into a YUV color space to generate a color watermark; and
010: and sequencing the color watermarks according to the brightness value of the third color value and displaying the color watermarks and the YUV image.
Referring to fig. 2, in some embodiments, the image processing apparatus 10 further includes a watermark generating module 18 and a sorting module 19. The watermark generating module 18 is used to implement the method in 09. The sorting module 19 is used to implement the method in 010. That is, the watermark generating module 18 is used to convert the third color value corresponding to the theme color into the YUV color space to generate the color watermark, and the sorting module 19 is used to sort the color watermarks according to the brightness value of the third color value and display the color watermarks together with the YUV image. Referring to fig. 3, in some embodiments, the program further includes instructions for performing the methods of 09 and 010.
The third color value is the color value corresponding to the theme color in the RGB color space. The color watermark is usually displayed on the display screen of a terminal 100 such as a mobile phone, a camera, or a computer, and an image obtained by such a terminal 100 shooting an object is usually a YUV image, so generating the color watermark in the YUV format facilitates editing the YUV images stored by the terminal 100 using the color watermark.
In one embodiment, according to the above formulas one to five, the YUV image of the terminal device is sequentially converted into an RGB color space image, an XYZ color space image, and an LAB color space image to perform corresponding processing. And generating a theme color according to the LAB color space image corresponding to the YUV image, and converting the third color value corresponding to the theme color back to the YUV color space to generate and sort the color watermarks corresponding to the YUV color space.
Please refer to fig. 17, the sorted color watermarks are displayed together with the YUV image, so that the user can visually see the color watermarks conforming to the color theme of the YUV image. In some embodiments, the user can utilize the color watermark to realize functions of making a personal homepage, screening image tones and the like, and the immersive interactive experience when the user browses images is enhanced.
In one embodiment, in order to conform to the visual effect of human eyes, after the color watermarks are generated, the color watermarks are sorted according to the brightness value of the third color value, so that the color watermarks seen by a user are sequentially arranged from light to dark or from dark to light, and the visual effect is attractive.
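One plausible sketch of the sorting in 010: after the third color values are converted to YUV, the Y channel is the brightness, so the watermarks can simply be ordered by it. The BT.601 luma weights in the comment are an assumption, not quoted from the text.

```python
def sort_watermarks(yuv_colors, dark_to_light=True):
    """010: order watermark colors by their Y (brightness) channel."""
    return sorted(yuv_colors, key=lambda c: c[0], reverse=not dark_to_light)

# If only RGB values are at hand, the Y value can be recovered with the
# (assumed) BT.601 weights: y = 0.299 * r + 0.587 * g + 0.114 * b.
```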
Referring to fig. 18, the present application further provides a non-volatile computer-readable storage medium 800 containing a computer program 801. According to embodiments of the present application, when the computer program 801 contained in the one or more non-transitory computer-readable storage media 800 is executed by one or more processors 80, it causes the processors 80 to perform the image processing method of any one of the above embodiments, for example, to implement one or more of steps 02, 03, and 04.
For example, the computer program 801, when executed by the one or more processors 80, causes the processor 80 to perform the steps of:
02: converting an image to be processed into an LAB color space to obtain a first image;
03: clustering the first image to obtain a plurality of clustering results; and
04: and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and brought together by those skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.

Claims (12)

1. An image processing method, comprising:
converting an image to be processed into an LAB color space to obtain a first image;
clustering the first image to obtain a plurality of clustering results; and
and obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
2. The image processing method according to claim 1, wherein the pixels of the image to be processed include a first XYZ component, a second XYZ component, and a third XYZ component, and the converting the image to be processed to the LAB color space to obtain the first image includes:
acquiring an LAB three-channel value of the pixel of the image to be processed according to the first XYZ component, the second XYZ component, the third XYZ component, a preset mapping relation and a preset hyper-parameter of the pixel of the image to be processed; and
and acquiring the first image according to the LAB three-channel value of each pixel of the image to be processed.
3. The image processing method according to claim 1, wherein the clustering the first image to obtain a plurality of clustering results comprises:
determining a plurality of center points in the first image according to the number of color classifications, each of the center points corresponding to one of the color classifications;
acquiring a first color value of each color classification and a second color value of each pixel point in the first image;
dividing the pixel points into the corresponding color classifications according to the difference value between the second color value of each pixel point and the first color value of each color classification;
obtaining an average value of second color values of the pixel points in each color classification; and
and updating the central point and the first color value corresponding to each color classification according to the average value corresponding to each color classification.
4. The image processing method according to claim 3, wherein the clustering the first image to obtain a plurality of clustering results further comprises:
dividing the pixel points into the corresponding color classifications according to the difference value between the second color value of each pixel point and the first color value updated by each color classification;
finishing the clustering processing when the updating times of the central point corresponding to the color classification reach a preset time threshold value, and outputting a first color value corresponding to the central point as the clustering result; or
And finishing the clustering processing under the condition that the central point corresponding to the color classification is not changed before and after the updating, and outputting the first color value corresponding to the central point as the clustering result.
5. The image processing method according to claim 3, wherein the number of color classifications is twice the target number of the subject color.
6. The method of claim 3, wherein the classifying the pixels into the corresponding color classifications according to a difference between a second color value of each of the pixels and a first color value of each of the color classifications comprises:
obtaining a first LAB component of the first color value and a second LAB component of a second color value;
acquiring a hue rotation item, a brightness compensation value, a chroma compensation value and a hue compensation value according to the first LAB component and the second LAB component;
acquiring the difference value according to the first LAB component, the second LAB component, the tone rotation item, the brightness compensation value, the chrominance compensation value and the tone compensation value; and
selecting the minimum difference value of the plurality of difference values between the second color value and the plurality of first color values of the pixel point, and dividing the pixel point into the color classification corresponding to the minimum difference value.
7. The image processing method according to claim 1, wherein the obtaining a theme color according to the plurality of clustering results and a color difference sum corresponding to each clustering result comprises:
respectively obtaining the color difference sum corresponding to each clustering result; and
and acquiring the previous n clustering results in the order from large to small of the color difference sum to respectively generate n theme colors, wherein the value of n is the target number of the theme colors.
8. The image processing method according to claim 1, characterized in that the image processing method further comprises:
acquiring a YUV image;
performing down-sampling processing on the YUV image to obtain a second image;
converting the second image into an RGB color space to obtain a third image; and
and converting the third image into an XYZ color space to acquire the image to be processed.
9. The image processing method according to claim 8, characterized in that the image processing method further comprises:
converting the third color value corresponding to the theme color into a YUV color space to generate a color watermark; and
and sequencing the color watermarks according to the brightness value of the third color value and displaying the color watermarks and the YUV image together.
10. An image processing apparatus characterized by comprising:
the system comprises a first acquisition module, a second acquisition module and a processing module, wherein the first acquisition module is used for converting an image to be processed into an LAB color space to acquire a first image;
the clustering module is used for clustering the first image to obtain a plurality of clustering results; and
and the color obtaining module is used for obtaining the theme color according to the plurality of clustering results and the color difference sum corresponding to each clustering result.
11. A terminal, characterized in that the terminal comprises:
one or more processors, memory; and
one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs comprising instructions for performing the image processing method of any of claims 1 to 8.
12. A non-transitory computer-readable storage medium containing a computer program which, when executed by one or more processors, causes the processors to implement the instructions of the image processing method of any one of claims 1 to 8.
CN202210369140.0A 2022-04-08 2022-04-08 Image processing method, image processing apparatus, terminal, and readable storage medium Pending CN114723602A (en)



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination