CN112037160A - Image processing method, device and equipment - Google Patents

Image processing method, device and equipment

Info

Publication number
CN112037160A
CN112037160A (application number CN202010894912.3A)
Authority
CN
China
Prior art keywords
region
color
regions
preset
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010894912.3A
Other languages
Chinese (zh)
Other versions
CN112037160B (en)
Inventor
王紫嫣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010894912.3A priority Critical patent/CN112037160B/en
Publication of CN112037160A publication Critical patent/CN112037160A/en
Priority to PCT/CN2021/115837 priority patent/WO2022042754A1/en
Application granted granted Critical
Publication of CN112037160B publication Critical patent/CN112037160B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Color Image Communication Systems (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method, device and equipment, belonging to the technical field of image processing. The image processing method comprises the following steps: acquiring an original image and a reference image; determining, according to hue information, a first region in the reference image that corresponds to a target region in the original image; and processing the target region with the color of the first region to obtain a target image corresponding to the original image. The image processing method, device and equipment can improve the speed and efficiency of color migration.

Description

Image processing method, device and equipment
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, apparatus, and device.
Background
Image processing is a technique for processing an image with electronic equipment to achieve a desired result. Image processing includes, but is not limited to: image compression, brightness adjustment, contrast adjustment, background blurring, color gradation processing, filter processing, color migration processing, and the like.
Color migration refers to synthesizing a new image C based on the image a and the image B, the image C having shape information of the image a and color information of the image B.
In the color migration process in the related art, an image A and an image B in an RGB space are firstly converted into an Lab space, color migration is performed by using the mean value and the standard deviation of the images to obtain an image C in the Lab space, and then the image C is converted into the RGB space to obtain an image D. The image D at this time has shape information of the image a and color information of the image B.
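The related-art pipeline described in this paragraph can be sketched as follows. This is an illustrative sketch, not code from the patent: it assumes the images have already been converted from RGB to Lab (for example with an external color-space routine) and applies the per-channel mean/standard-deviation transfer; the function name `transfer_stats` is ours.

```python
import numpy as np

def transfer_stats(source_lab: np.ndarray, reference_lab: np.ndarray) -> np.ndarray:
    """Per-channel mean/std color transfer between two images assumed
    to be float H x W x 3 arrays already in Lab space."""
    out = np.empty_like(source_lab, dtype=np.float64)
    for c in range(3):
        src = source_lab[..., c].astype(np.float64)
        ref = reference_lab[..., c].astype(np.float64)
        src_std = src.std() or 1.0  # guard against a flat channel
        # Normalize the source channel, then rescale to the reference statistics.
        out[..., c] = (src - src.mean()) / src_std * ref.std() + ref.mean()
    return out
```

The output (image C in Lab space) would then be converted back to RGB to obtain image D, which is the extra conversion work the present application avoids.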
However, in the course of implementing the present application, the inventors found that at least the following problems exist in the related art: the color migration speed is slow and the efficiency is low.
Disclosure of Invention
The embodiment of the application aims to provide an image processing method, device and equipment, which can solve the problems of slow color migration and low efficiency.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring an original image and a reference image;
determining, according to hue information, a first area in the reference image that corresponds to a target area in the original image;
and processing the target area by using the color of the first area to obtain a target image corresponding to the original image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is used for acquiring an original image and a reference image;
the determining module is used for determining a first area in the reference image, which corresponds to the target area in the original image, according to the hue information;
and the processing module is used for processing the target area by using the color of the first area to obtain a target image corresponding to the original image.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the method according to the first aspect.
In the embodiment of the application, after the original image and the reference image are acquired, a first region corresponding to a target region in the original image in the reference image can be determined according to the hue information, and then the target region can be processed by using the color of the first region to obtain the target image corresponding to the original image. In this way, in the embodiment of the present application, the color of the first region corresponding to the target region of the original image in the reference image may be used to directly perform color processing on the target region, and color migration in different images may be implemented in an RGB space of the image without performing spatial conversion and calculating a mean value, a standard deviation, and the like of the image.
Drawings
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 2 is a first schematic diagram of dividing an original image and a reference image according to an embodiment of the present application;
FIG. 3 is a first diagram of a rank ordering scheme provided by an embodiment of the present application;
FIG. 4 is a first schematic diagram of a target image provided by an embodiment of the present application;
FIG. 5 is a second schematic diagram of dividing an original image and a reference image according to an embodiment of the present application;
FIG. 6 is a second exemplary diagram of the rank ordering provided by the embodiments of the present application;
FIG. 7 is a second schematic diagram of a target image provided by an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 10 is a hardware structure diagram of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequential or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second", and the like do not limit quantity; for example, a "first" object may be one or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects it joins.
The following describes in detail an image processing method, an image processing apparatus, and an image processing device provided in the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application. The image processing method may include:
s101: an original image and a reference image are acquired.
S102: and determining a first area in the reference image corresponding to the target area in the original image according to the hue information.
S103: and processing the target area by using the color of the first area to obtain a target image corresponding to the original image.
Specific implementations of the above steps will be described in detail below.
In the embodiment of the application, after the original image and the reference image are acquired, a first region corresponding to a target region in the original image in the reference image can be determined according to the hue information, and then the target region can be processed by using the color of the first region to obtain the target image corresponding to the original image. In this way, in the embodiment of the present application, the color of the first region corresponding to the target region of the original image in the reference image may be used to directly perform color processing on the target region, and color migration in different images may be implemented in an RGB space of the image without performing spatial conversion and calculating a mean value, a standard deviation, and the like of the image.
In some possible implementations of the embodiments of the present application, S102 may include: performing region division on the original image and the reference image respectively according to a first preset region division mode to obtain a plurality of second regions corresponding to the original image and a plurality of third regions corresponding to the reference image; generating a first hue sequence corresponding to the original image according to the hue information of the plurality of second regions, and generating a second hue sequence corresponding to the reference image according to the hue information of the plurality of third regions; and determining a first region in the reference image corresponding to the target region in the original image according to the first hue sequence and the second hue sequence.
In some possible implementations of the embodiment of the present application, the first preset region dividing manner may be set according to actual needs.
In the embodiment of the present application, a first preset region division manner is an average 4 × 5 region division manner, which is taken as an example for explanation. In the embodiment of the application, the division mode of the average 4 × 5 areas is 4 equal divisions in the horizontal direction and 5 equal divisions in the vertical direction.
For example, assume that the original image 100 and the reference image 200 each include 400 pixels in 25 rows and 16 columns.
The original image 100 is divided into 20 second regions and the reference image 200 is divided into 20 third regions according to the average 4-by-5 region division manner, as shown in fig. 2. Fig. 2 is a first schematic diagram of dividing an original image and a reference image according to an embodiment of the present application. The 20 second regions are, from left to right and from top to bottom, the 1st second region 201, the 2nd second region 202, …, and the 20th second region 220. The 20 third regions are, from left to right and from top to bottom, the 1st third region 301, the 2nd third region 302, …, and the 20th third region 320. It will be appreciated that each second region and each third region comprises 20 pixels in 5 rows and 4 columns.
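The average 4-by-5 division and the per-region average colors used below can be sketched as follows (an illustrative sketch; the function name and the assumption that the image divides evenly are ours):

```python
import numpy as np

def grid_average_colors(image: np.ndarray, cols: int = 4, rows: int = 5):
    """Split an H x W x 3 image into rows x cols equal regions
    (4 equal parts horizontally, 5 vertically) and return each
    region's mean RGB, ordered left-to-right, top-to-bottom."""
    h, w, _ = image.shape
    assert h % rows == 0 and w % cols == 0, "image must divide evenly"
    rh, rw = h // rows, w // cols
    averages = []
    for r in range(rows):
        for c in range(cols):
            block = image[r * rh:(r + 1) * rh, c * rw:(c + 1) * rw]
            averages.append(tuple(block.reshape(-1, 3).mean(axis=0).round().astype(int)))
    return averages
```

For the 25-row, 16-column images assumed above, each of the 20 returned entries averages a 5 x 4 block of 20 pixels.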
It is assumed that the average RGB colors of the 20 pixels included in each of the 1st to 20th second regions 201 to 220 are, in order, (34,253,5), (46,65,76), (155,46,76), (56,46,54), (46,54,32), (78,65,234), (36,65,49), (136,165,149), (136,87,58), (136,23,65), (136,34,87), (165,57,89), (189,200,43), (200,123,83), (246,234,231), (243,157,138), (254,158,156), (140,140,240) and (243,157,138).
The average RGB colors of the 20 pixels included in each of the 1st to 20th third regions 301 to 320 are, in order, (65,34,57), (146,165,176), (155,146,176), (156,146,154), (146,154,132), (178,165,34), (136,165,149), (36,165,149), (136,8,58), (16,23,65), (13,123,65), (136,134,87), (65,157,89), (189,200,143), (200,123,183), (246,34,21), (143,57,38), (54,58,56), (140,240,140), and (143,57,38).
The hue values corresponding to the 1st second region 201 to the 20th second region 220 are, respectively: 113°, 202°, 343°, 312°, 82°, 245°, 147°, 22°, 338°, 329°, 342°, 64°, 21°, 12°, 11°, 1°, 240°, and 11°.
The hue values corresponding to the 1st third region 301 to the 20th third region 320 are, respectively: 315°, 202°, 258°, 312°, 82°, 55°, 147°, 173°, 337°, 231°, 148°, 58°, 136°, 72°, 313°, 3°, 11°, 150°, 120°, and 11°.
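The hue values above are consistent with the standard RGB-to-HSV hue. A minimal sketch using Python's standard `colorsys` module (the helper name is ours): for example, the average color (34,253,5) gives approximately 113° and (46,65,76) gives approximately 202°, matching the first two values listed for the second regions.

```python
import colorsys

def hue_degrees(rgb):
    """Hue of an RGB color (0-255 per channel), in degrees [0, 360)."""
    r, g, b = (v / 255.0 for v in rgb)
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0
```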
Sorting the hue values corresponding to the 1st second region 201 to the 20th second region 220 from small to large generates the first hue sequence corresponding to the original image 100: 1°, 11°, 12°, 21°, 22°, 64°, 82°, 113°, 147°, 202°, 240°, 245°, 312°, 329°, 338°, 342°, and 343°.
Sorting the hue values corresponding to the 1st third region 301 to the 20th third region 320 from small to large generates the second hue sequence corresponding to the reference image 200: 3°, 11°, 55°, 58°, 72°, 82°, 120°, 136°, 147°, 148°, 150°, 173°, 202°, 231°, 258°, 312°, 313°, 315°, and 337°.
After the first hue sequence corresponding to the original image 100 and the second hue sequence corresponding to the reference image 200 are generated, a first region in the reference image 200 corresponding to the target region in the original image 100 is determined according to the first hue sequence and the second hue sequence.
In some possible implementations of the embodiments of the present application, determining a first region in the reference image corresponding to the target region in the original image according to the first hue sequence and the second hue sequence may include: segmenting the first hue sequence and the second hue sequence respectively according to a preset segmentation mode to obtain a plurality of first hue subsequences corresponding to the first hue sequence and a plurality of second hue subsequences corresponding to the second hue sequence; sorting the second regions corresponding to each first hue subsequence and the third regions corresponding to each second hue subsequence according to a first preset area-size order to obtain the sorting order of the second regions corresponding to each first hue subsequence and the sorting order of the third regions corresponding to each second hue subsequence; and determining, as the first region, the region in the reference image whose sorting order is the same as that of the target region.
The preset segmentation mode is not limited in the embodiment of the present application, and any available segmentation mode can be applied to the embodiment of the present application.
Assume that the preset segmentation mode is to divide each hue sequence into 3 hue subsequences. When the number of hue values in a hue sequence is divisible by 3, each hue subsequence contains an equal number of hue values. When the number is not divisible by 3, each hue subsequence except the last contains the quotient of the number of hue values divided by 3, and the last hue subsequence contains that quotient plus the remainder.
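This split rule can be sketched as follows (the function name is ours). For a sorted sequence of 17 hue values it yields subsequences of 5, 5, and 7 values, and for 19 values it yields 6, 6, and 7, matching the subsequence sizes in the example below.

```python
def split_into_subsequences(hues, parts=3):
    """Split a sorted hue sequence into `parts` subsequences: each of
    the first parts - 1 gets quotient = len(hues) // parts values, and
    the last gets the quotient plus the remainder."""
    q = len(hues) // parts
    subs = [hues[i * q:(i + 1) * q] for i in range(parts - 1)]
    subs.append(hues[(parts - 1) * q:])  # last subsequence absorbs the remainder
    return subs
```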
The first hue sequence is divided into 3 first hue subsequences. The first of these subsequences includes 5 hue values: 1°, 11°, 12°, 21°, and 22°. The second includes 5 hue values: 64°, 82°, 113°, 147°, and 202°. The third includes 7 hue values: 240°, 245°, 312°, 329°, 338°, 342°, and 343°.
The second hue sequence is divided into 3 second hue subsequences. The first of these subsequences includes 6 hue values: 3°, 11°, 55°, 58°, 72°, and 82°. The second includes 6 hue values: 120°, 136°, 147°, 148°, 150°, and 173°. The third includes 7 hue values: 202°, 231°, 258°, 312°, 313°, 315°, and 337°.
The second regions corresponding to the first of the first hue subsequences are: the 9th second region 209, the 15th second region 215, the 16th second region 216, the 17th second region 217, the 18th second region 218, and the 20th second region 220, for a total area of 6 second regions.
The second regions corresponding to the second of the first hue subsequences are: the 1st second region 201, the 2nd second region 202, the 5th second region 205, the 7th second region 207, the 8th second region 208, and the 14th second region 214, for a total area of 6 second regions.
The second regions corresponding to the third of the first hue subsequences are: the 3rd second region 203, the 4th second region 204, the 6th second region 206, the 10th second region 210, the 11th second region 211, the 12th second region 212, the 13th second region 213, and the 19th second region 219, for a total area of 8 second regions.
The third regions corresponding to the first of the second hue subsequences are: the 5th third region 305, the 6th third region 306, the 12th third region 312, the 14th third region 314, the 16th third region 316, the 17th third region 317, and the 20th third region 320, for a total area of 7 third regions.
The third regions corresponding to the second of the second hue subsequences are: the 7th third region 307, the 8th third region 308, the 11th third region 311, the 13th third region 313, the 18th third region 318, and the 19th third region 319, for a total area of 6 third regions.
The third regions corresponding to the third of the second hue subsequences are: the 1st third region 301, the 2nd third region 302, the 3rd third region 303, the 4th third region 304, the 9th third region 309, the 10th third region 310, and the 15th third region 315, for a total area of 7 third regions.
The first preset area-size order is not limited in the embodiments of the present application, and any available order may be applied, for example, from largest area to smallest, or from smallest area to largest.
Sorting the three groups of second regions in order of decreasing area, the second regions corresponding to the third of the first hue subsequences rank first, the second regions corresponding to the first of the first hue subsequences rank second, and the second regions corresponding to the second of the first hue subsequences rank third.
Sorting the three groups of third regions in order of decreasing area, the third regions corresponding to the first of the second hue subsequences rank first, the third regions corresponding to the third of the second hue subsequences rank second, and the third regions corresponding to the second of the second hue subsequences rank third.
The result of the rank ordering is shown in fig. 3, and fig. 3 is a first diagram of the rank ordering provided by the embodiment of the present application.
When the target region is the group of second regions corresponding to the third of the first hue subsequences, the third regions corresponding to the first of the second hue subsequences are determined as the first region. That is, when the target region includes the 3rd second region 203, the 4th second region 204, the 6th second region 206, the 10th second region 210, the 11th second region 211, the 12th second region 212, the 13th second region 213, and the 19th second region 219, the first region includes the 5th third region 305, the 6th third region 306, the 12th third region 312, the 14th third region 314, the 16th third region 316, the 17th third region 317, and the 20th third region 320.
When the target region is the group of second regions corresponding to the first of the first hue subsequences, the third regions corresponding to the third of the second hue subsequences are determined as the first region. That is, when the target region includes the 9th second region 209, the 15th second region 215, the 16th second region 216, the 17th second region 217, the 18th second region 218, and the 20th second region 220, the first region includes the 1st third region 301, the 2nd third region 302, the 3rd third region 303, the 4th third region 304, the 9th third region 309, the 10th third region 310, and the 15th third region 315.
When the target region is the group of second regions corresponding to the second of the first hue subsequences, the third regions corresponding to the second of the second hue subsequences are determined as the first region. That is, when the target region includes the 1st second region 201, the 2nd second region 202, the 5th second region 205, the 7th second region 207, the 8th second region 208, and the 14th second region 214, the first region includes the 7th third region 307, the 8th third region 308, the 11th third region 311, the 13th third region 313, the 18th third region 318, and the 19th third region 319.
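The matching above pairs the group of second regions and the group of third regions that occupy the same rank when both sets of groups are sorted by area (number of regions) in descending order. A sketch under our assumptions: groups are lists of region indices, and ties are broken by keeping the original group order, which reproduces the pairing in this example.

```python
def match_groups_by_area(source_groups, reference_groups):
    """Pair source and reference region groups holding the same rank
    when each list of groups is sorted by size, largest first.
    Returns a dict {source_group_index: reference_group_index}."""
    src_rank = sorted(range(len(source_groups)),
                      key=lambda i: len(source_groups[i]), reverse=True)
    ref_rank = sorted(range(len(reference_groups)),
                      key=lambda i: len(reference_groups[i]), reverse=True)
    # Groups at the same rank are matched to each other.
    return dict(zip(src_rank, ref_rank))
```

With group sizes 6, 6, 8 for the original image and 7, 6, 7 for the reference image, this maps the third group of second regions to the first group of third regions, the first to the third, and the second to the second, as described above.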
After the first region corresponding to the target region in the original image 100 in the reference image 200 is determined, the target region is processed by using the color of the first region, so as to obtain the target image 500 corresponding to the original image 100.
In some possible implementations of the embodiment of the present application, after S102 and before S103, the image processing method provided in the embodiment of the present application may further include: calculating an average color of the first region according to the color of each of the plurality of third regions included in the first region; and determining the average color as the color of the first region.
For example, the following description takes as an example that the target region includes the 3rd second region 203, the 4th second region 204, the 6th second region 206, the 10th second region 210, the 11th second region 211, the 12th second region 212, the 13th second region 213, and the 19th second region 219, and the first region includes the 5th third region 305, the 6th third region 306, the 12th third region 312, the 14th third region 314, the 16th third region 316, the 17th third region 317, and the 20th third region 320.
The average color of the first region is the average color of the 5th third region 305, the 6th third region 306, the 12th third region 312, the 14th third region 314, the 16th third region 316, the 17th third region 317, and the 20th third region 320.
The red component R of the average color of the first region is:
(146+178+136+189+246+143+143)/7 = 169.
The green component G of the average color of the first region is:
(154+165+134+200+34+57+57)/7 = 114.
The blue component B of the average color of the first region is:
(132+34+87+143+21+38+38)/7 = 70.
The average RGB color of the first region including the 5th third region 305, the 6th third region 306, the 12th third region 312, the 14th third region 314, the 16th third region 316, the 17th third region 317, and the 20th third region 320 is therefore (169, 114, 70). The average RGB color (169, 114, 70) is determined as the color of the first region.
Similarly, the color of the first region including the 1st third region 301, the 2nd third region 302, the 3rd third region 303, the 4th third region 304, the 9th third region 309, the 10th third region 310, and the 15th third region 315 is determined to be (125, 92, 124).
The color of a first region including the 7th third region 307, the 8th third region 308, the 11th third region 311, the 13th third region 313, the 18th third region 318, and the 19th third region 319 is determined to be (74, 151, 108).
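The per-channel averaging worked through above, with each channel rounded to the nearest integer, can be sketched as follows (the helper name is ours):

```python
def average_color(region_colors):
    """Average the RGB colors of the third regions that make up a
    first region, rounding each channel to the nearest integer."""
    n = len(region_colors)
    return tuple(round(sum(color[i] for color in region_colors) / n)
                 for i in range(3))

# Average colors of the seven third regions 305, 306, 312, 314, 316, 317 and 320:
first_region = [(146, 154, 132), (178, 165, 34), (136, 134, 87),
                (189, 200, 143), (246, 34, 21), (143, 57, 38), (143, 57, 38)]
```

`average_color(first_region)` gives (169, 114, 70), matching the computation above.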
After the first region corresponding to the target region in the original image 100 in the reference image 200 is determined, the target region is processed by using the color of the first region, so as to obtain the target image 500 corresponding to the original image 100.
In some possible implementations of embodiments of the present application, S103 may include: calculating an average color of the original color of the target region and the color of the first region; and adjusting the color of the target region to the average color.
In some possible implementations of the embodiment of the present application, S103 may include: fusing the original color of the target region and the color of the first region according to a preset fusion parameter to obtain a fused color; and adjusting the color of the target region to the fused color.
In some possible implementations of embodiments of the present application, the fused color may be calculated using equation (1).
F(r, g, b) = (b×Rr + a×R1, b×Gr + a×G1, b×Br + a×B1)    (1)
In formula (1), F(r, g, b) is the fused color; Rr, Gr, and Br are the red, green, and blue components of the original color of the target region; R1, G1, and B1 are the red, green, and blue components of the color of the first region; and a and b are fusion parameters.
In some possible implementations of the embodiment of the present application, the fusion parameter a and the fusion parameter b may be set according to actual needs.
It is assumed that the fusion parameter b is 100% and the fusion parameter a is 60%.
The following description will be given taking as an example that the target region includes a 3 rd second region 203, a 4 th second region 204, a 6 th second region 206, a 10 th second region 210, an 11 th second region 211, a 12 th second region 212, a 13 th second region 213, and a 19 th second region 219, and the color of the first region is (169,114, 70).
The fused color corresponding to the 3rd second region 203 is:
(155 + 60% × 169, 46 + 60% × 114, 76 + 60% × 70) = (256, 114, 118). Then, since the red component equals 256 and color values range over 0-255, (256, 114, 118) is converted to (0, 114, 118). The color of the 3rd second region 203 is adjusted from (155,46,76) to (0, 114, 118).
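Formula (1) with b = 100% and a = 60%, together with the wrap-around shown in the worked example (256 becomes 0), can be sketched as follows. The choice of wrapping modulo 256 rather than clamping, and the function name, follow from the worked example rather than from an explicit statement in the text.

```python
def fuse(original, first_region_color, a=0.6, b=1.0):
    """Blend per formula (1): each channel is b * original + a * first-region
    color, truncated to an integer and wrapped into the 0-255 range."""
    return tuple(int(b * o + a * f) % 256
                 for o, f in zip(original, first_region_color))
```

`fuse((155, 46, 76), (169, 114, 70))` gives (0, 114, 118); note that 46 + 60% × 114 = 114.4, which truncates to 114.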
Similarly, the colors of the 4th second region 204, the 6th second region 206, the 10th second region 210, the 11th second region 211, the 12th second region 212, the 13th second region 213, and the 19th second region 219 included in the target region may be adjusted.
Similarly, the colors of the second regions in the other two groups, namely the 9th second region 209, the 15th second region 215, the 16th second region 216, the 17th second region 217, the 18th second region 218, and the 20th second region 220, and the 1st second region 201, the 2nd second region 202, the 5th second region 205, the 7th second region 207, the 8th second region 208, and the 14th second region 214, may also be adjusted using the colors of their corresponding first regions.
The image obtained after the color adjustment is the target image 500, as shown in fig. 4. Fig. 4 is a first schematic diagram of a target image provided in an embodiment of the present application.
In the embodiment of the application, color migration does not need to perform spatial conversion and calculate the mean value, the standard deviation and the like of an image, and the color migration speed and efficiency can be improved.
In some possible implementations of the embodiments of the present application, S102 may include: performing region division on the original image and the reference image respectively according to a second preset region division mode to obtain a plurality of fourth regions corresponding to the original image and a plurality of fifth regions corresponding to the reference image; determining, according to the hue information of the fourth regions and a plurality of preset hue information divisions, a sixth region in the original image corresponding to each preset hue information division, and determining, according to the hue information of the fifth regions and the plurality of preset hue information divisions, a seventh region in the reference image corresponding to each preset hue information division; sorting the sixth regions and the seventh regions corresponding to the preset hue information divisions according to a second preset area-size order to obtain the sorting order of the sixth regions and the sorting order of the seventh regions; and determining, as the first region, the region in the reference image whose sorting order is the same as that of the target region.
In some possible implementations of the embodiment of the present application, the second preset region dividing manner may be set according to actual needs.
In the embodiment of the present application, the second preset region division manner is described by taking an average 4 × 5 region division manner as an example. In the embodiment of the application, the average 4 × 5 region division manner divides an image into 4 equal parts in the horizontal direction and 5 equal parts in the vertical direction.
For example, assume that the original image 100 and the reference image 200 each include 400 pixels in 25 rows and 16 columns.
The original image 100 is divided into 20 fourth regions and the reference image 200 is divided into 20 fifth regions according to the average 4 × 5 region division manner, as shown in fig. 5. Fig. 5 is a second schematic diagram of dividing an original image and a reference image according to an embodiment of the present application. The 20 fourth regions are, from left to right and from top to bottom, the 1st fourth region 401, the 2nd fourth region 402, ……, and the 20th fourth region 420, respectively. The 20 fifth regions are, from left to right and from top to bottom, the 1st fifth region 501, the 2nd fifth region 502, ……, and the 20th fifth region 520, respectively. It is understood that each fourth region and each fifth region includes 20 pixels in 5 rows and 4 columns.
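As an illustration (not part of the patent), the average 4 × 5 division described above can be sketched as follows; the function name and the box format are assumptions for this example.

```python
# Illustrative sketch of the average 4x5 region division manner described
# above: 4 equal parts horizontally, 5 equal parts vertically, regions
# numbered left to right, top to bottom.

def divide_regions(width, height, cols=4, rows=5):
    """Return (left, top, right, bottom) pixel boxes for each region."""
    # This sketch assumes the image divides evenly, as in the 16x25 example.
    region_w, region_h = width // cols, height // rows
    boxes = []
    for r in range(rows):          # top to bottom
        for c in range(cols):      # left to right
            boxes.append((c * region_w, r * region_h,
                          (c + 1) * region_w, (r + 1) * region_h))
    return boxes

# A 16-column by 25-row image yields 20 regions of 4x5 = 20 pixels each.
boxes = divide_regions(16, 25)
```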
It is assumed that the average RGB colors of the 20 pixels included in the 1 st fourth region 401 to the 20 th fourth region 420 are (34,253,5), (46,65,76), (155,46,76), (56,46,54), (46,54,32), (78,65,234), (36,65,49), (136,165,149), (136,87,58), (136,23,65), (136,34,87), (165,57,89), (189,200,43), (200,123,83), (246,234,231), (243,157,138), (254,158,156), (140,140,240) and (243,157,138), respectively, in this order.
The average RGB colors of 20 pixels included in the 1 st to 20 th fifth regions 501 to 520 are (65,34,57), (146,165,176), (155,146,176), (156,146,154), (146,154,132), (178,165,34), (136,165,149), (36,165,149), (136,8,58), (16,23,65), (13,123,65), (136,134,87), (65,157,89), (189,200,143), (200,123,183), (246,34,21), (143,57,38), (54,58,56), (140,240,140), and (143,57,38), respectively.
The hue values corresponding to the 1 st fourth region 401 to the 20 th fourth region 420 are respectively: 113 °, 202 °, 343 °, 312 °, 82 °, 245 °, 147 °, 22 °, 338 °, 329 °, 342 °, 64 °,21 °,12 °,11 °,1 °,240 °, and 11 °.
The hue values corresponding to the 1 st fifth area 501 to the 20 th fifth area 520 are respectively: 315 °, 202 °, 258 °, 312 °, 82 °, 55 °, 147 °, 173 °, 337 °,231 °, 148 °,58 °, 136 °, 72 °, 313 °,3 °,11 °, 150 °,120 ° and 11 °.
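The hue values above can be reproduced from the average RGB colors with a standard RGB-to-HSV conversion; the sketch below uses Python's `colorsys` and rounds to the nearest degree (the helper name is an assumption, not the patent's implementation).

```python
import colorsys

def rgb_to_hue_deg(r, g, b):
    """Hue in degrees [0, 360) of an RGB color with components in 0-255."""
    h, _s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return round(h * 360) % 360

# Average colors of the 1st fourth region 401 and the 1st fifth region 501.
print(rgb_to_hue_deg(34, 253, 5))   # hue of region 401: 113
print(rgb_to_hue_deg(65, 34, 57))   # hue of region 501: 315
```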
In some possible implementations of the embodiment of the present application, the preset hue information division interval may be set according to actual requirements.
Assume that there are three preset hue information division intervals: the first is [0°, 120°), the second is [120°, 240°), and the third is [240°, 360°).
According to the hue values corresponding to the 1st fourth region 401 to the 20th fourth region 420 and the three preset hue information division intervals, the sixth region corresponding to the first preset hue information division interval is determined to include: the 1st fourth region 401, the 5th fourth region 405, the 9th fourth region 409, the 14th fourth region 414, the 15th fourth region 415, the 16th fourth region 416, the 17th fourth region 417, the 18th fourth region 418, and the 20th fourth region 420, for a total of 9 fourth regions. The sixth region corresponding to the second preset hue information division interval is determined to include: the 2nd fourth region 402, the 7th fourth region 407, and the 8th fourth region 408, for a total of 3 fourth regions. The sixth region corresponding to the third preset hue information division interval is determined to include: the 3rd fourth region 403, the 4th fourth region 404, the 6th fourth region 406, the 10th fourth region 410, the 11th fourth region 411, the 12th fourth region 412, the 13th fourth region 413, and the 19th fourth region 419, for a total of 8 fourth regions.
According to the hue values corresponding to the 1st fifth region 501 to the 20th fifth region 520 and the three preset hue information division intervals, the seventh region corresponding to the first preset hue information division interval is determined to include: the 5th fifth region 505, the 6th fifth region 506, the 12th fifth region 512, the 14th fifth region 514, the 16th fifth region 516, the 17th fifth region 517, and the 20th fifth region 520, for a total of 7 fifth regions. The seventh region corresponding to the second preset hue information division interval is determined to include: the 2nd fifth region 502, the 7th fifth region 507, the 8th fifth region 508, the 11th fifth region 511, the 13th fifth region 513, the 18th fifth region 518, and the 19th fifth region 519, for a total of 7 fifth regions. The seventh region corresponding to the third preset hue information division interval is determined to include: the 1st fifth region 501, the 3rd fifth region 503, the 4th fifth region 504, the 9th fifth region 509, the 10th fifth region 510, and the 15th fifth region 515, for a total of 6 fifth regions.
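The grouping step above can be sketched as follows; for brevity, only the hue values of the first five fourth regions from the example are used, and all names are assumptions.

```python
# Sketch of assigning regions to the three preset hue information division
# intervals [0, 120), [120, 240) and [240, 360).

INTERVALS = [(0, 120), (120, 240), (240, 360)]

def bucket_by_hue(hue_values):
    """Map each interval index to the 1-based numbers of the regions whose hue falls in it."""
    buckets = {i: [] for i in range(len(INTERVALS))}
    for number, hue in enumerate(hue_values, start=1):
        for i, (low, high) in enumerate(INTERVALS):
            if low <= hue < high:
                buckets[i].append(number)
                break
    return buckets

# Hue values of the 1st fourth region 401 to the 5th fourth region 405.
buckets = bucket_by_hue([113, 202, 343, 312, 82])
# Regions 401 and 405 fall in the first interval, region 402 in the second,
# and regions 403 and 404 in the third, consistent with the grouping above.
```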
The second preset area size order is not limited in the embodiments of the present application, and any available area size order may be applied to the embodiments of the present application. For example, the areas are in the order of increasing to decreasing, or the areas are in the order of decreasing to increasing.
The sixth regions corresponding to the first, second, and third preset hue information division intervals are sorted in order of decreasing area: the sixth region corresponding to the first preset hue information division interval is in the first order, the sixth region corresponding to the third preset hue information division interval is in the second order, and the sixth region corresponding to the second preset hue information division interval is in the third order.
The seventh regions corresponding to the first, second, and third preset hue information division intervals are likewise sorted in order of decreasing area: the seventh region corresponding to the first preset hue information division interval is in the first order, the seventh region corresponding to the second preset hue information division interval is in the second order, and the seventh region corresponding to the third preset hue information division interval is in the third order.
The result of the rank ordering is shown in fig. 6, and fig. 6 is a second schematic diagram of the rank ordering provided in the embodiment of the present application.
When the target region is the sixth region corresponding to the first preset hue information division interval, the seventh region corresponding to the first preset hue information division interval is determined as the first region. That is, when the target region includes the 1st fourth region 401, the 5th fourth region 405, the 9th fourth region 409, the 14th fourth region 414, the 15th fourth region 415, the 16th fourth region 416, the 17th fourth region 417, the 18th fourth region 418, and the 20th fourth region 420, the first region includes the 5th fifth region 505, the 6th fifth region 506, the 12th fifth region 512, the 14th fifth region 514, the 16th fifth region 516, the 17th fifth region 517, and the 20th fifth region 520.
When the target region is the sixth region corresponding to the third preset hue information division interval, the seventh region corresponding to the second preset hue information division interval is determined as the first region. That is, when the target region includes the 3rd fourth region 403, the 4th fourth region 404, the 6th fourth region 406, the 10th fourth region 410, the 11th fourth region 411, the 12th fourth region 412, the 13th fourth region 413, and the 19th fourth region 419, the first region includes the 2nd fifth region 502, the 7th fifth region 507, the 8th fifth region 508, the 11th fifth region 511, the 13th fifth region 513, the 18th fifth region 518, and the 19th fifth region 519.
When the target region is the sixth region corresponding to the second preset hue information division interval, the seventh region corresponding to the third preset hue information division interval is determined as the first region. That is, when the target region includes the 2nd fourth region 402, the 7th fourth region 407, and the 8th fourth region 408, the first region includes the 1st fifth region 501, the 3rd fifth region 503, the 4th fifth region 504, the 9th fifth region 509, the 10th fifth region 510, and the 15th fifth region 515.
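The rank matching above amounts to sorting each image's hue intervals by total area and pairing intervals of equal rank; since every grid region here covers the same number of pixels, region counts serve as areas. The names below are assumptions for this sketch.

```python
def area_rank(bucket_sizes):
    """Bucket indices sorted by area, largest first (ties keep interval order)."""
    # sorted() is stable, so intervals with equal area stay in interval order.
    return sorted(range(len(bucket_sizes)), key=lambda i: bucket_sizes[i], reverse=True)

orig_sizes = [9, 3, 8]   # sixth regions per interval in the original image
ref_sizes = [7, 7, 6]    # seventh regions per interval in the reference image

orig_rank = area_rank(orig_sizes)          # [0, 2, 1]
ref_rank = area_rank(ref_sizes)            # [0, 1, 2]
matches = dict(zip(orig_rank, ref_rank))   # original interval -> reference interval
# first -> first, third -> second, second -> third, as in the text above.
```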
After the first region corresponding to the target region in the original image 100 in the reference image 200 is determined, the target region is processed by using the color of the first region, so as to obtain the target image 500 corresponding to the original image 100.
In some possible implementations of the embodiment of the present application, after S102 and before S103, the image processing method provided in the embodiment of the present application may further include: calculating an average color of the first region according to the color of each of the plurality of fifth regions included in the first region; and determining the average color as the color of the first region.
Illustratively, the target region includes a 1 st fourth region 401, a 5 th fourth region 405, a 9 th fourth region 409, a 14 th fourth region 414, a 15 th fourth region 415, a 16 th fourth region 416, a 17 th fourth region 417, an 18 th fourth region 418, and a 20 th fourth region 420, and the first region includes a 5 th fifth region 505, a 6 th fifth region 506, a 12 th fifth region 512, a 14 th fifth region 514, a 16 th fifth region 516, a 17 th fifth region 517, and a 20 th fifth region 520.
The average color of the first region is the average color of the 5 th fifth region 505, the 6 th fifth region 506, the 12 th fifth region 512, the 14 th fifth region 514, the 16 th fifth region 516, the 17 th fifth region 517 and the 20 th fifth region 520.
The red component R in the average color of the first region is: (146+178+136+189+246+143+143)/7 ≈ 169.
The green component G in the average color of the first region is: (154+165+134+200+34+57+57)/7 ≈ 114.
The blue component B in the average color of the first region is: (132+34+87+143+21+38+38)/7 ≈ 70.
the average RGB color of the first region including the 5 th fifth region 505, the 6 th fifth region 506, the 12 th fifth region 512, the 14 th fifth region 514, the 16 th fifth region 516, the 17 th fifth region 517 and the 20 th fifth region 520 is (169,114, 70). The average RGB color (169,114,70) is determined as the color of the first region.
Similarly, the color of the first region including the 2 nd fifth region 502, the 7 th fifth region 507, the 8 th fifth region 508, the 11 th fifth region 511, the 13 th fifth region 513, the 18 th fifth region 518, and the 19 th fifth region 519 is determined to be (84,153,118).
The color of a first region including the 1 st fifth region 501, the 3 rd fifth region 503, the 4 th fifth region 504, the 9 th fifth region 509, the 10 th fifth region 510, and the 15 th fifth region 515 is determined to be (121,80, 116).
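The averages above follow from a component-wise mean rounded to the nearest integer; a minimal sketch (the helper name is an assumption):

```python
def average_rgb(colors):
    """Component-wise mean of (R, G, B) tuples, rounded to the nearest integer."""
    n = len(colors)
    return tuple(round(sum(color[i] for color in colors) / n) for i in range(3))

# Average colors of the 5th, 6th, 12th, 14th, 16th, 17th and 20th fifth regions.
first_area_colors = [(146, 154, 132), (178, 165, 34), (136, 134, 87),
                     (189, 200, 143), (246, 34, 21), (143, 57, 38), (143, 57, 38)]
print(average_rgb(first_area_colors))   # (169, 114, 70)
```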
After the first region corresponding to the target region in the original image 100 in the reference image 200 is determined, the target region is processed by using the color of the first region, so as to obtain the target image 500 corresponding to the original image 100.
In some possible implementations of the embodiment of the present application, S103 may include: fusing the original color of the target region with the color of the first region according to a preset fusion parameter to obtain a fused color; and adjusting the color of the target region to the fused color.
In some possible implementations of embodiments of the present application, the fused color may be calculated using equation (1) above.
Let the fusion parameters b be 100% and a be 60%.
Taking the target area including the 1 st fourth area 401, the 5 th fourth area 405, the 9 th fourth area 409, the 14 th fourth area 414, the 15 th fourth area 415, the 16 th fourth area 416, the 17 th fourth area 417, the 18 th fourth area 418, and the 20 th fourth area 420 as an example, the first area has a color of (169,114, 70).
The fused color corresponding to the 1st fourth region 401 is:
(34 + 60% × 169, 253 + 60% × 114, 5 + 60% × 70) = (135, 321, 47). Then, since the green component 321 is greater than 255 and each color component must lie between 0 and 255, (135, 321, 47) wraps around to (135, 65, 47). The color of the 1st fourth region 401 is adjusted from (34, 253, 5) to (135, 65, 47).
Similarly, the colors of the 5th fourth region 405, the 9th fourth region 409, the 14th fourth region 414, the 15th fourth region 415, the 16th fourth region 416, the 17th fourth region 417, the 18th fourth region 418, and the 20th fourth region 420 included in the target region may be adjusted.
Similarly, the colors of the 3 rd fourth area 403, the 4 th fourth area 404, the 6 th fourth area 406, the 10 th fourth area 410, the 11 th fourth area 411, the 12 th fourth area 412, the 13 th fourth area 413, the 19 th fourth area 419, the 2 nd fourth area 402, the 7 th fourth area 407, and the 8 th fourth area 408 may also be adjusted.
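The fusion step above, with b = 100% and a = 60%, can be sketched as follows; the function name is an assumption, components above 255 wrap around by subtracting 256 as in the worked example, and round-to-nearest is used, which can differ by one from the worked figures depending on the rounding convention.

```python
def fuse_colors(original, reference, a=0.6, b=1.0):
    """Fuse two RGB colors as b * original + a * reference, wrapping values above 255."""
    fused = []
    for o, r in zip(original, reference):
        value = round(b * o + a * r)
        fused.append(value - 256 if value > 255 else value)
    return tuple(fused)

# Original color of the 1st fourth region 401 and the matched first-region color.
print(fuse_colors((34, 253, 5), (169, 114, 70)))
```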
The image obtained after the color adjustment is the target image 500, as shown in fig. 7. Fig. 7 is a second schematic diagram of a target image provided in an embodiment of the present application.
In the embodiment of the application, color migration requires neither color-space conversion nor the computation of image statistics such as the mean and standard deviation, so the speed and efficiency of color migration can be improved.
In some possible implementations of embodiments of the present application, the reference image may be an image selected by a user from pre-stored images or an image captured by an image capture component after an original image is acquired.
In the embodiment of the application, a user can select a reference image or shoot one, so that the obtained target image meets the user's requirements. When the user shoots an image as the reference image, the sense of atmosphere at the time of shooting can also be retained in the target image.
It should be noted that the execution subject of the image processing method provided in the embodiments of the present application may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. The image processing apparatus provided in the embodiments of the present application is described below by taking the case in which the image processing apparatus executes the image processing method as an example.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus may include:
an obtaining module 801, configured to obtain an original image and a reference image;
a determining module 802, configured to determine, according to the hue information, a first region in the reference image corresponding to a target region in the original image;
the processing module 803 is configured to process the target area with the color of the first area to obtain a target image corresponding to the original image.
In some possible implementations of the embodiment of the present application, the processing module 803 may be specifically configured to:
fusing the original color of the target area with the color of the first area according to preset fusion parameters to obtain a fused color;
and adjusting the color of the target area to the fused color.
In some possible implementations of the embodiments of the present application, the determining module 802 may include:
the first division submodule is used for performing region division on the original image and the reference image respectively according to a first preset region division manner to obtain a plurality of second regions corresponding to the original image and a plurality of third regions corresponding to the reference image;
the generating submodule is used for generating a first color phase sequence corresponding to the original image according to the color phase information of the second areas and generating a second color phase sequence corresponding to the reference image according to the color phase information of the third areas;
and the first determining submodule is used for determining a first area in the reference image, which corresponds to the target area in the original image, according to the first color phase sequence and the second color phase sequence.
In some possible implementations of the embodiment of the present application, the first determining submodule is specifically configured to:
segmenting the first hue sequence and the second hue sequence respectively according to a preset segmentation manner to obtain a plurality of first hue subsequences corresponding to the first hue sequence and a plurality of second hue subsequences corresponding to the second hue sequence;
sorting the second regions corresponding to the first hue subsequences and the third regions corresponding to the second hue subsequences according to a first preset area size order to obtain the sorting order of the second region corresponding to each first hue subsequence and the sorting order of the third region corresponding to each second hue subsequence;
and determining the region in the reference image whose sorting order is the same as that of the target region as the first region.
In some possible implementations of the embodiments of the present application, the determining module 802 may include:
the second division submodule is used for respectively carrying out region division on the original image and the reference image according to a second preset region division mode to obtain a plurality of fourth regions corresponding to the original image and a plurality of fifth regions corresponding to the reference image;
the second determining submodule is used for dividing regions according to the hue information of the fourth region and a plurality of preset hue information, determining a sixth region corresponding to each preset hue information divided region in the original image, dividing regions according to the hue information of the fifth region and the plurality of preset hue information, and determining a seventh region corresponding to each preset hue information divided region in the reference image;
the region sorting submodule is used for sorting sixth regions corresponding to the preset hue information division regions and seventh regions corresponding to the preset hue information division regions according to a second preset area size sequence to obtain sorting orders of the sixth regions corresponding to the preset hue information division regions and sorting orders of the seventh regions corresponding to the preset hue information division regions;
and the third determining submodule is used for determining the region with the same sorting order corresponding to the target region in the reference image as the first region.
In some possible implementations of the embodiments of the present application, the image processing apparatus provided in the embodiments of the present application may further include:
a calculation module for calculating an average color of the first region according to a color of each of a plurality of third regions included in the first region or a color of each of a plurality of fifth regions included in the first region; and determining the average color as the color of the first area.
In some possible implementations of embodiments of the present application, the reference image is an image selected by a user from pre-stored images or an image acquired by an image acquisition assembly after an original image is acquired.
The image processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the image processing apparatus in the image processing method embodiments of fig. 1 to fig. 7, and for avoiding repetition, details are not repeated here.
Optionally, as shown in fig. 9, an electronic device 900 is further provided in this embodiment of the present application, and includes a processor 901, a memory 902, and a program or an instruction stored in the memory 902 and executable on the processor 901, where the program or the instruction is executed by the processor 901 to implement each process of the foregoing image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 10 is a hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 1010 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, or combine some components, or arrange the components differently, which is not described in detail here.
The processor 1010 is configured to obtain an original image and a reference image; determining a first area corresponding to a target area in the original image in the reference image according to the hue information; and processing the target area by using the color of the first area to obtain a target image corresponding to the original image.
In some possible implementations of the embodiments of the present application, the processor 1010 may be specifically configured to:
fusing the original color of the target area with the color of the first area according to preset fusion parameters to obtain a fused color;
and adjusting the color of the target area to the fused color.
In some possible implementations of the embodiments of the present application, the processor 1010 may be specifically configured to:
respectively carrying out region division on the original image and the reference image according to a first preset region division mode to obtain a plurality of second regions corresponding to the original image and a plurality of third regions corresponding to the reference image;
generating a first hue sequence corresponding to the original image according to the hue information of the plurality of second areas, and generating a second hue sequence corresponding to the reference image according to the hue information of the plurality of third areas;
and determining a first region in the reference image corresponding to the target region in the original image according to the first hue sequence and the second hue sequence.
In some possible implementations of the embodiments of the present application, the processor 1010 may be specifically configured to:
segmenting the first hue sequence and the second hue sequence respectively according to a preset segmentation manner to obtain a plurality of first hue subsequences corresponding to the first hue sequence and a plurality of second hue subsequences corresponding to the second hue sequence;
sorting the second regions corresponding to the first hue subsequences and the third regions corresponding to the second hue subsequences according to a first preset area size order to obtain the sorting order of the second region corresponding to each first hue subsequence and the sorting order of the third region corresponding to each second hue subsequence;
and determining the region in the reference image whose sorting order is the same as that of the target region as the first region.
In some possible implementations of the embodiments of the present application, the processor 1010 may be specifically configured to:
according to a second preset region division mode, region division is respectively carried out on the original image and the reference image to obtain a plurality of fourth regions corresponding to the original image and a plurality of fifth regions corresponding to the reference image;
determining, in the original image, a sixth region corresponding to each preset hue information division interval according to the hue information of the fourth regions and a plurality of preset hue information division intervals, and determining, in the reference image, a seventh region corresponding to each preset hue information division interval according to the hue information of the fifth regions and the plurality of preset hue information division intervals;
sorting the sixth regions corresponding to the preset hue information division intervals and the seventh regions corresponding to the preset hue information division intervals according to a second preset area size order to obtain the sorting order of the sixth region corresponding to each preset hue information division interval and the sorting order of the seventh region corresponding to each preset hue information division interval;
and determining the region in the reference image whose sorting order is the same as that of the target region as the first region.
In some possible implementations of embodiments of the application, the processor 1010 may be further configured to:
calculating an average color of the first region according to a color of each of a plurality of third regions included in the first region or a color of each of a plurality of fifth regions included in the first region; and determining the average color as the color of the first area.
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1009 may be used to store software programs as well as various data, including but not limited to application programs and operating systems. Processor 1010 may integrate an application processor that handles primarily operating systems, user interfaces, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the embodiments of the present application have been described with reference to the accompanying drawings, the present application is not limited to the specific embodiments described above, which are meant to be illustrative rather than restrictive; those skilled in the art may make various changes without departing from the spirit and scope of the present application as defined by the appended claims.

Claims (13)

1. An image processing method, characterized in that the method comprises:
acquiring an original image and a reference image;
determining, according to hue information, a first region in the reference image corresponding to a target region in the original image;
and processing the target region by using the color of the first region, to obtain a target image corresponding to the original image.
2. The method of claim 1, wherein the processing the target region with the color of the first region comprises:
fusing the original color of the target region with the color of the first region according to preset fusion parameters, to obtain a fused color;
and adjusting the color of the target region to the fused color.
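The fusion in claim 2 is specified only up to "preset fusion parameters". A minimal sketch of one plausible reading, assuming a linear (alpha) blend over RGB channels; `fuse_colors` and `alpha` are hypothetical names, not from the patent:

```python
# Hedged sketch of the claim-2 fusion step: blend the target region's
# original color with the matched first region's color. A linear alpha
# blend is ASSUMED here; the patent leaves the fusion formula preset.

def fuse_colors(original_rgb, reference_rgb, alpha=0.5):
    """Blend two RGB triples. alpha plays the role of the preset fusion
    parameter: 0.0 keeps the original color, 1.0 adopts the reference
    region's color entirely."""
    return tuple(
        round((1.0 - alpha) * o + alpha * r)
        for o, r in zip(original_rgb, reference_rgb)
    )
```

With `alpha=0.5` the target region moves halfway toward the reference color, which matches the claim's idea of adjusting rather than wholesale replacing the original color.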
3. The method according to claim 1, wherein the determining the first region in the reference image corresponding to the target region in the original image according to the hue information comprises:
according to a first preset region division mode, respectively carrying out region division on the original image and the reference image to obtain a plurality of second regions corresponding to the original image and a plurality of third regions corresponding to the reference image;
generating a first hue sequence corresponding to the original image according to the hue information of the plurality of second regions, and generating a second hue sequence corresponding to the reference image according to the hue information of the plurality of third regions;
and determining the first region in the reference image corresponding to the target region in the original image according to the first hue sequence and the second hue sequence.
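Claim 3 reduces each divided region to hue information and strings the regions into a per-image sequence. A sketch under the assumption that each region is summarized by its mean per-pixel hue (the patent leaves the region-division mode and the summary statistic preset); `hue_sequence` is an illustrative helper name:

```python
# Hedged sketch of claim-3 hue-sequence generation. Each region is
# represented here as a list of per-pixel hue values in degrees (0-360);
# the sequence is the mean hue of each region, in region order.
# The mean-hue summary is an ASSUMPTION, not specified by the patent.

def hue_sequence(regions):
    """Map a list of regions (each a non-empty list of pixel hues)
    to the sequence of their mean hues."""
    return [sum(region) / len(region) for region in regions]
```

The same function would be applied once to the second regions of the original image and once to the third regions of the reference image to obtain the first and second hue sequences.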
4. The method of claim 3, wherein the determining the first region in the reference image corresponding to the target region in the original image according to the first hue sequence and the second hue sequence comprises:
segmenting the first hue sequence and the second hue sequence respectively according to a preset segmentation mode, to obtain a plurality of first hue subsequences corresponding to the first hue sequence and a plurality of second hue subsequences corresponding to the second hue sequence;
sorting the second regions corresponding to the plurality of first hue subsequences and the third regions corresponding to the plurality of second hue subsequences according to a first preset area size order, to obtain a sorting order of the second regions corresponding to the plurality of first hue subsequences and a sorting order of the third regions corresponding to the plurality of second hue subsequences;
and determining, as the first region, the region in the reference image whose sorting order is the same as that of the target region.
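The matching step in claim 4 pairs regions across the two images by their rank after an area sort. A sketch assuming a descending area order (the "preset area size order" is not specified) and hypothetical `(region_id, area)` inputs for two hue subsequences that have already been matched to each other:

```python
# Hedged sketch of claim-4 rank matching: within a matched pair of hue
# subsequences, sort the regions of each image by area and pair regions
# that land at the same rank. Descending area order is an ASSUMPTION.

def match_by_area_rank(orig_regions, ref_regions, target_id):
    """orig_regions / ref_regions: lists of (region_id, area) pairs.
    Returns the id of the reference region whose area rank equals the
    target region's area rank in the original image."""
    orig_sorted = sorted(orig_regions, key=lambda r: r[1], reverse=True)
    ref_sorted = sorted(ref_regions, key=lambda r: r[1], reverse=True)
    rank = next(i for i, (rid, _) in enumerate(orig_sorted) if rid == target_id)
    return ref_sorted[rank][0]
```

Rank-based pairing sidesteps any absolute area comparison between the two images, which is useful when the original and reference images differ in resolution.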
5. The method according to claim 1, wherein the determining the first region in the reference image corresponding to the target region in the original image according to the hue information comprises:
according to a second preset region division mode, respectively carrying out region division on the original image and the reference image to obtain a plurality of fourth regions corresponding to the original image and a plurality of fifth regions corresponding to the reference image;
determining, according to the hue information of the fourth regions and a plurality of preset hue information division regions, a sixth region in the original image corresponding to each preset hue information division region, and determining, according to the hue information of the fifth regions and the plurality of preset hue information division regions, a seventh region in the reference image corresponding to each preset hue information division region;
sorting the sixth regions corresponding to the preset hue information division regions and the seventh regions corresponding to the preset hue information division regions according to a second preset area size order, to obtain a sorting order of the sixth regions corresponding to the preset hue information division regions and a sorting order of the seventh regions corresponding to the preset hue information division regions;
and determining, as the first region, the region in the reference image whose sorting order is the same as that of the target region.
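Claim 5 groups regions by preset hue information division regions. A sketch assuming those divisions are contiguous hue bins defined by ascending boundary values; `assign_to_hue_bins`, the `(region_id, mean_hue)` input shape, and the bin edges are all illustrative assumptions:

```python
# Hedged sketch of claim-5 grouping: assign each region to the preset
# hue division region (modeled here as a hue bin) that contains its
# mean hue. Contiguous bins over [0, 360) are an ASSUMPTION.
import bisect

def assign_to_hue_bins(regions, bin_edges):
    """regions: list of (region_id, mean_hue) pairs.
    bin_edges: ascending hue boundaries; hues below bin_edges[0] fall
    in bin 0, and so on. Returns {bin_index: [region_id, ...]}."""
    bins = {}
    for rid, hue in regions:
        idx = bisect.bisect_right(bin_edges, hue)
        bins.setdefault(idx, []).append(rid)
    return bins
```

Each bin's member regions on the original-image side would form a sixth region, and on the reference-image side a seventh region, before the area sort of the next step.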
6. The method according to claim 4 or 5, wherein after the determining the first region in the reference image corresponding to the target region in the original image according to the hue information, and before the processing the target region by using the color of the first region, the method further comprises:
calculating an average color of the first region according to the color of each of a plurality of third regions included in the first region or the color of each of a plurality of fifth regions included in the first region;
and determining the average color as the color of the first region.
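The averaging in claim 6 can be sketched as an unweighted mean over the colors of the sub-regions making up the first region; whether the patent intends area weighting or another scheme is unspecified, so the plain mean is an assumption, and `average_color` is a hypothetical name:

```python
# Hedged sketch of claim-6 color averaging: the first region's color is
# the channel-wise mean of its constituent sub-region colors.
# Unweighted averaging is an ASSUMPTION.

def average_color(sub_region_colors):
    """sub_region_colors: non-empty list of RGB triples, one per
    third/fifth region contained in the first region.
    Returns the channel-wise mean as a float triple."""
    n = len(sub_region_colors)
    return tuple(sum(c[i] for c in sub_region_colors) / n for i in range(3))
```

The resulting color is what the claim-2 fusion step would then blend with the target region's original color.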
7. An image processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring an original image and a reference image;
the determining module is used for determining, according to hue information, a first region in the reference image corresponding to a target region in the original image;
and the processing module is used for processing the target region by using the color of the first region, to obtain a target image corresponding to the original image.
8. The apparatus of claim 7, wherein the processing module is specifically configured to:
fusing the original color of the target region with the color of the first region according to preset fusion parameters, to obtain a fused color;
and adjusting the color of the target region to the fused color.
9. The apparatus of claim 7, wherein the determining module comprises:
the first division submodule is used for respectively dividing the original image and the reference image into regions according to a first preset region division mode, to obtain a plurality of second regions corresponding to the original image and a plurality of third regions corresponding to the reference image;
the generating submodule is used for generating a first hue sequence corresponding to the original image according to the hue information of the plurality of second regions, and generating a second hue sequence corresponding to the reference image according to the hue information of the plurality of third regions;
and the first determining submodule is used for determining the first region in the reference image corresponding to the target region in the original image according to the first hue sequence and the second hue sequence.
10. The apparatus of claim 9, wherein the first determining submodule is specifically configured to:
segmenting the first hue sequence and the second hue sequence respectively according to a preset segmentation mode, to obtain a plurality of first hue subsequences corresponding to the first hue sequence and a plurality of second hue subsequences corresponding to the second hue sequence;
sorting the second regions corresponding to the plurality of first hue subsequences and the third regions corresponding to the plurality of second hue subsequences according to a first preset area size order, to obtain a sorting order of the second regions corresponding to the plurality of first hue subsequences and a sorting order of the third regions corresponding to the plurality of second hue subsequences;
and determining, as the first region, the region in the reference image whose sorting order is the same as that of the target region.
11. The apparatus of claim 7, wherein the determining module comprises:
the second division submodule is used for respectively carrying out region division on the original image and the reference image according to a second preset region division mode to obtain a plurality of fourth regions corresponding to the original image and a plurality of fifth regions corresponding to the reference image;
the second determining submodule is used for determining, according to the hue information of the fourth regions and a plurality of preset hue information division regions, a sixth region in the original image corresponding to each preset hue information division region, and determining, according to the hue information of the fifth regions and the plurality of preset hue information division regions, a seventh region in the reference image corresponding to each preset hue information division region;
the region sorting submodule is used for sorting the sixth regions corresponding to the preset hue information division regions and the seventh regions corresponding to the preset hue information division regions according to a second preset area size order, to obtain a sorting order of the sixth regions corresponding to the preset hue information division regions and a sorting order of the seventh regions corresponding to the preset hue information division regions;
and the third determining submodule is used for determining, as the first region, the region in the reference image whose sorting order is the same as that of the target region.
12. The apparatus of claim 10 or 11, further comprising:
a calculation module, configured to calculate an average color of the first region according to the color of each of a plurality of third regions included in the first region or the color of each of a plurality of fifth regions included in the first region, and to determine the average color as the color of the first region.
13. An electronic device, characterized in that the electronic device comprises: a processor, a memory and a program or instructions stored on the memory and executable on the processor, which when executed by the processor, implement the steps of the image processing method of any one of claims 1 to 6.
CN202010894912.3A 2020-08-31 2020-08-31 Image processing method, device and equipment Active CN112037160B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010894912.3A CN112037160B (en) 2020-08-31 2020-08-31 Image processing method, device and equipment
PCT/CN2021/115837 WO2022042754A1 (en) 2020-08-31 2021-08-31 Image processing method and apparatus, and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010894912.3A CN112037160B (en) 2020-08-31 2020-08-31 Image processing method, device and equipment

Publications (2)

Publication Number Publication Date
CN112037160A true CN112037160A (en) 2020-12-04
CN112037160B CN112037160B (en) 2024-03-01

Family

ID=73587673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010894912.3A Active CN112037160B (en) 2020-08-31 2020-08-31 Image processing method, device and equipment

Country Status (2)

Country Link
CN (1) CN112037160B (en)
WO (1) WO2022042754A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112686355A (en) * 2021-01-12 2021-04-20 树根互联技术有限公司 Image processing method and device, electronic equipment and readable storage medium
WO2022042754A1 (en) * 2020-08-31 2022-03-03 维沃移动通信有限公司 Image processing method and apparatus, and device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230005112A1 (en) * 2021-06-30 2023-01-05 V5 Technologies Co., Ltd. Image matching method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070292608A1 (en) * 2006-06-16 2007-12-20 Allan Blase Joseph Rodrigues Color chips prepared by color clustering used for matching refinish paints
CN102164238A (en) * 2006-01-10 2011-08-24 松下电器产业株式会社 Color correction device, color correction method, dynamic camera color correction device, and video search device using the same
CN102694958A (en) * 2012-02-10 2012-09-26 华为终端有限公司 Image hue determination method and wireless handheld device
WO2013131311A1 (en) * 2012-03-04 2013-09-12 Hou Kejie Method and system for carrying out vision perception high-fidelity transformation on color digital image
CN104899909A (en) * 2015-05-12 2015-09-09 福建天晴数码有限公司 Color mapping method and device thereof
US20160005154A1 (en) * 2011-09-28 2016-01-07 U.S. Army Research Laboratory Attn: Rdrl-Loc-I System and processor implemented method for improved image quality and generating an image of a target illuminated by quantum particles
WO2018133305A1 (en) * 2017-01-19 2018-07-26 华为技术有限公司 Method and device for image processing

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011221812A (en) * 2010-04-09 2011-11-04 Sony Corp Information processing device, method and program
CN102542544A (en) * 2010-12-30 2012-07-04 北京大学 Color matching method and system
US8867833B2 (en) * 2013-03-14 2014-10-21 Ili Technology Corporation Image processing method
CN103617596A (en) * 2013-10-12 2014-03-05 中山大学 Image color style transformation method based on flow pattern transition
US9697233B2 (en) * 2014-08-12 2017-07-04 Paypal, Inc. Image processing and matching
CN107093168A (en) * 2017-03-10 2017-08-25 厦门美图之家科技有限公司 Processing method, the device and system of skin area image
CN107862657A (en) * 2017-10-31 2018-03-30 广东欧珀移动通信有限公司 Image processing method, device, computer equipment and computer-readable recording medium
CN107945135B (en) * 2017-11-30 2021-03-02 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, storage medium, and electronic device
CN112037160B (en) * 2020-08-31 2024-03-01 维沃移动通信有限公司 Image processing method, device and equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102164238A (en) * 2006-01-10 2011-08-24 松下电器产业株式会社 Color correction device, color correction method, dynamic camera color correction device, and video search device using the same
US20070292608A1 (en) * 2006-06-16 2007-12-20 Allan Blase Joseph Rodrigues Color chips prepared by color clustering used for matching refinish paints
US20160005154A1 (en) * 2011-09-28 2016-01-07 U.S. Army Research Laboratory Attn: Rdrl-Loc-I System and processor implemented method for improved image quality and generating an image of a target illuminated by quantum particles
CN102694958A (en) * 2012-02-10 2012-09-26 华为终端有限公司 Image hue determination method and wireless handheld device
WO2013131311A1 (en) * 2012-03-04 2013-09-12 Hou Kejie Method and system for carrying out vision perception high-fidelity transformation on color digital image
CN104899909A (en) * 2015-05-12 2015-09-09 福建天晴数码有限公司 Color mapping method and device thereof
WO2018133305A1 (en) * 2017-01-19 2018-07-26 华为技术有限公司 Method and device for image processing

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022042754A1 (en) * 2020-08-31 2022-03-03 维沃移动通信有限公司 Image processing method and apparatus, and device
CN112686355A (en) * 2021-01-12 2021-04-20 树根互联技术有限公司 Image processing method and device, electronic equipment and readable storage medium
CN112686355B (en) * 2021-01-12 2024-01-05 树根互联股份有限公司 Image processing method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN112037160B (en) 2024-03-01
WO2022042754A1 (en) 2022-03-03

Similar Documents

Publication Publication Date Title
CN112037160B (en) Image processing method, device and equipment
CN107204034B (en) A kind of image processing method and terminal
US10554803B2 (en) Method and apparatus for generating unlocking interface, and electronic device
WO2022012657A1 (en) Image editing method and apparatus, and electronic device
CN111835982B (en) Image acquisition method, image acquisition device, electronic device, and storage medium
CN112437232A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN113014803A (en) Filter adding method and device and electronic equipment
CN112269522A (en) Image processing method, image processing device, electronic equipment and readable storage medium
CN113112428A (en) Image processing method and device, electronic equipment and readable storage medium
CN113870100A (en) Image processing method and electronic device
CN113284063A (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
CN112565909B (en) Video playing method and device, electronic equipment and readable storage medium
CN113628259A (en) Image registration processing method and device
CN112419218A (en) Image processing method and device and electronic equipment
CN112508820A (en) Image processing method and device and electronic equipment
CN113393391B (en) Image enhancement method, image enhancement device, electronic apparatus, and storage medium
CN112468794B (en) Image processing method and device, electronic equipment and readable storage medium
JP6155349B2 (en) Method, apparatus and computer program product for reducing chromatic aberration in deconvolved images
CN115439386A (en) Image fusion method and device, electronic equipment and storage medium
CN112529766B (en) Image processing method and device and electronic equipment
CN105279727B (en) Image processing method and device
CN114119392A (en) Image processing method and device and electronic equipment
CN113473012A (en) Virtualization processing method and device and electronic equipment
CN111512341A (en) Image processing method and device
CN113114930B (en) Information display method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant