WO2021035505A1 - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
WO2021035505A1
WO2021035505A1 (PCT/CN2019/102699; CN2019102699W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
hue
processed
hue value
target
Prior art date
Application number
PCT/CN2019/102699
Other languages
English (en)
French (fr)
Inventor
席迎来
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201980033736.XA priority Critical patent/CN112204608A/zh
Priority to PCT/CN2019/102699 priority patent/WO2021035505A1/zh
Publication of WO2021035505A1 publication Critical patent/WO2021035505A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • This application relates to the field of image processing technology, and in particular to an image processing method and device.
  • When processing images such as photos or video frames, a user who wants to highlight the subject of an image can use image processing tools such as Photoshop or the GNU Image Manipulation Program (GIMP) to remove the color of background regions, turning them black and white while keeping only the color of the subject, thereby achieving color selection retention.
  • Typically, to achieve this the user needs to create a new adjustment layer, set the fill color of the adjustment layer to black and white, select the range of colors to be retained in the original layer containing the image to be processed, and then merge the original layer with the adjustment layer.
  • Each of these steps requires manual intervention by the user. The process involves complicated interaction, is lengthy, and is inconvenient; the user must be familiar with the image processing tool in order to achieve color selection retention through these complex steps.
  • In view of this, the embodiments of the present application provide an image processing method and device, which are used to solve the prior-art problems of complicated interaction, lengthy processing, and inconvenience to the user when realizing color selection retention.
  • In a first aspect, an embodiment of the present application provides an image processing method, including: determining a target hue value expected to be retained in an image to be processed; determining, according to the target hue value, the degree of hue difference between the hue value of each of a plurality of pixels in the image to be processed and the target hue value; determining a fusion weight for each of the plurality of pixels according to those degrees of hue difference; and fusing, according to the fusion weight of each pixel, the to-be-processed image with the to-be-fused image corresponding to the to-be-processed image, to obtain a fused image.
  • In a second aspect, an embodiment of the present application provides an image processing device, including a processor and a memory; the memory is used to store program code; the processor calls the program code and, when the program code is executed, performs the following operations: determining a target hue value expected to be retained in an image to be processed; determining, according to the target hue value, the degree of hue difference between the hue value of each of a plurality of pixels in the image to be processed and the target hue value; determining a fusion weight for each of the plurality of pixels according to those degrees of hue difference; and fusing, according to the fusion weight of each pixel, the to-be-processed image with the to-be-fused image corresponding to the to-be-processed image, to obtain a fused image.
  • In a third aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program; the computer program includes at least one piece of code, and the at least one piece of code can be executed by a computer to control the computer to execute the method described in any implementation of the first aspect.
  • In a fourth aspect, an embodiment of the present application provides a computer program which, when executed by a computer, is used to implement the method described in any implementation of the first aspect.
  • The embodiments of the present application provide an image processing method and device. According to a target hue value expected to be retained in an image to be processed, the degree of hue difference between the hue value of each of a plurality of pixels in the image and the target hue value is determined; the fusion weight of each pixel is then determined from those degrees of hue difference, and the image to be processed is fused, according to the fusion weights, with the to-be-fused image corresponding to it, to obtain a fused image.
  • In this way, the fusion weights are determined automatically, the saturation of pixels whose hue is the same as or close to the target hue value is retained, and the saturation of the other pixels in the image to be processed is suppressed, finally realizing color selection retention of the image to be processed. The user's manual intervention in the color selection retention process is reduced, the interaction is simplified, the processing time is shortened, and the method is convenient for users to use.
  • FIG. 1A is a schematic diagram 1 of an application scenario of an image processing method provided by an embodiment of this application;
  • FIG. 1B is a second schematic diagram of an application scenario of the image processing method provided by an embodiment of this application.
  • FIG. 2 is a schematic flowchart of an image processing method provided by an embodiment of this application.
  • FIG. 3 is a schematic flowchart of an image processing method provided by another embodiment of this application.
  • FIG. 4 is a schematic flowchart of an image processing method provided by yet another embodiment of this application.
  • FIG. 5 is a schematic flowchart of an image processing method provided by another embodiment of this application.
  • FIG. 6 is a schematic flowchart of an image processing method provided by another embodiment of this application.
  • Figure 7 is a schematic diagram of a model of the HSL color space
  • FIG. 8 is a schematic diagram of a hue circle provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of retention saturation provided by an embodiment of the application.
  • FIG. 10 is a schematic diagram 1 of a user selecting a color according to an embodiment of the application.
  • FIG. 11 is a second schematic diagram of a user selecting a color according to an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the application.
  • the image processing method provided in the embodiments of the present application can be applied to any image processing process that requires color selection and retention, and the image processing method can be specifically executed by an image processing device.
  • the image processing device may be a device including an image acquisition module (for example, a camera).
  • Correspondingly, a schematic diagram of an application scenario of the image processing method provided in an embodiment of the present application may be as shown in FIG. 1A.
  • Specifically, the image acquisition module of the image processing device may acquire the image to be processed, and the processor of the image processing device may process the acquired image using the image processing method provided by the embodiments of the present application.
  • It should be noted that FIG. 1A is only a schematic diagram and does not limit the structure of the image processing device.
  • For example, an image filter may be connected between the image acquisition module and the processor, to filter the image to be processed that was collected by the image acquisition module.
  • the image processing device may also be a device that does not include an image acquisition module.
  • the application scenario diagram of the image processing method provided in the embodiment of the present application may be as shown in FIG. 1B.
  • Specifically, the communication interface of the image processing device may receive an image to be processed that was collected by another device or apparatus, and the processor of the image processing device may process the received image using the image processing method provided in the embodiments of the present application.
  • FIG. 1B is only a schematic diagram, and does not limit the structure of the image processing device and the connection manner between the image processing device and other devices or equipment.
  • the communication interface in the image processing device can be replaced with a transceiver.
  • The type of equipment that includes the image processing device is not limited in the embodiments of the present application; the equipment may be, for example, a desktop computer, an all-in-one computer, a notebook computer, a palmtop computer, a tablet computer, a smartphone, or a remote controller with a screen.
  • In the image processing method provided by the embodiments of the present application, the fusion weight of each pixel is determined automatically according to the target hue value expected to be retained in the image to be processed, and the image to be processed is fused, according to those fusion weights, with the to-be-fused image corresponding to it. This realizes color selection retention of the image to be processed, reduces the user's manual intervention in the color selection retention process, simplifies the interaction, shortens the processing time, and is convenient for users to use.
  • FIG. 2 is a schematic flowchart of an image processing method provided by an embodiment of the application.
  • the execution subject of this embodiment may be an image processing device, and specifically may be a processor of the image processing device.
  • the method of this embodiment may include:
  • Step 201 Determine the target hue value expected to be retained in the image to be processed.
  • In this step, the image to be processed refers to an image whose pixels have a hue attribute. Since black, white and gray have no hue attribute, the image to be processed is an image that contains colors other than black, white and gray. The image to be processed can be represented in a color space.
  • the color space may be, for example, a red (RED, R) green (GREEN, G) and blue (BLUE, B) color space, and the corresponding image to be processed may be an RGB image.
  • the color space may be, for example, a hue (Hue, H) saturation (Saturation, S) lightness (L) color space, and the corresponding image to be processed may be an HSL image.
  • each pixel in the RGB image is composed of three elements, R, G, and B, and each pixel in the HSL image is represented by the three elements of H, S, and L. It is understandable that conversion between different color spaces can be performed, for example, an RGB image can be converted into an HSL image.
  • the image to be processed may be a color image.
  • For example, a hue value input by the user may be used as the target hue value; for instance, the user may input a hue value of 100.
  • Alternatively, the color selected by the user on a color palette may be received, and the hue value corresponding to that color may be used as the target hue value. For example, if the user selects blue on the palette and the hue value corresponding to blue is 240, the target hue value is 240. In this way, the user can precisely specify the desired target hue value.
  • Alternatively, a trigger operation performed by the user on the image to be processed may be obtained, where the trigger operation indicates that the user has triggered a certain area of the original image, and the target hue value is determined according to the trigger operation.
  • The trigger operation may be a click, a long press, or the like, which is not limited here.
  • The area on which the trigger operation acts may include multiple pixels or a single pixel.
  • After the user's trigger operation is obtained, the target hue value is determined within the area on which the trigger operation acts.
  • Optionally, the target hue value is the hue value of one of the pixels in the area on which the trigger operation acts.
  • The hue of that target pixel will be retained as the target color. In this way, the target hue value can be determined through a simple operation on the image, which is convenient to operate.
  • the target hue value may be a preset hue value.
  • the target hue value may be determined by analyzing and processing the image to be processed, for example, it may be the hue value occupying the largest proportion in the image to be processed.
  • Step 202 Determine the degree of hue difference between the hue values of a plurality of pixels in the image to be processed and the target hue value according to the target hue value.
  • In this step, since hue is the most basic characteristic of a color and the main factor distinguishing different colors, the degree of hue difference can be used to characterize the degree of color difference; for example, the hue difference between red and blue is larger than the hue difference between red and orange.
  • Specifically, the greater the degree of hue difference between the hue value of a pixel and the target hue value, the greater the difference between the color of that pixel and the color expected to be retained; the smaller the degree of hue difference, the smaller that difference.
  • It should be noted that the degree of hue difference and the numerical difference between hue values do not mean the same thing. For example, the numerical difference between hue value 0 and hue value 259 is very large, yet because hue values wrap around a 360-degree circle, the degree of hue difference between them is much smaller.
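  • For illustration, a minimal sketch (added here, not part of the original application) of the wrap-around point above: the degree of hue difference between two hue values in [0, 360) can be measured as the shorter arc between them on the hue circle.

    def hue_difference(h1: float, h2: float) -> float:
        """Degree of hue difference between two hue values in [0, 360),
        measured as the shorter arc on the hue circle."""
        d = abs(h1 - h2) % 360.0
        return min(d, 360.0 - d)

    # Example: a large numerical difference can still be a small hue difference.
    print(hue_difference(0, 350))   # 10.0
    print(hue_difference(0, 180))   # 180.0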
  • Step 203 Determine the fusion weight of each pixel in the plurality of pixel points according to the degree of hue difference between the hue values of the plurality of pixel points and the target hue value.
  • In this step, since the degree of hue difference characterizes the degree of color difference, the degree to which a pixel's hue is retained when performing color selection retention on the image to be processed can be determined from the degree of hue difference between that pixel's hue value and the target hue value expected to be retained.
  • The degree of retention is realized through the fusion weight assigned to each pixel of the image to be processed when the image to be processed is fused with the to-be-fused image corresponding to it.
  • Step 204 According to the fusion weight of each pixel, the to-be-processed image and the to-be-fused image corresponding to the to-be-processed image are fused to obtain a fused image.
  • the image to be fused refers to an image composed of one or more colors of black, white, and gray.
  • the image to be fused may be a grayscale image, or a black and white image.
  • the image to be fused includes any one of the following: a preset image, a grayscale image corresponding to the image to be processed, and a black and white image corresponding to the image to be processed.
  • the fusion weight of the pixels in the image to be fused may be fixed, or the fusion weight of the pixels in the image to be fused may be negatively related to the fusion weight of the pixels in the image to be processed. It should be noted that when the image to be processed and the image to be fused are merged, specifically, the element values of the same elements corresponding to the pixels are merged according to the fusion weight of the pixels.
  • the specific method of fusing the image to be processed and the image to be fused is not limited in this application.
  • Since saturation refers to color purity (the higher the saturation, the more vivid the color; the lower the saturation, the closer the color is to black and white) and the saturation of the image to be fused is 0, the fused image obtained by fusing the image to be processed with the image to be fused according to the pixels' fusion weights differs from the image to be processed in the saturation of its pixels.
  • Moreover, because the fusion weight of a pixel in the image to be processed represents the degree to which its hue is retained, and that weight is determined from the degree of hue difference between the pixel's hue value and the target hue value expected to be retained, the saturation of pixels whose hue is the same as or close to the target hue value is retained while the saturation of the other pixels is suppressed, thereby realizing color selection retention of the image to be processed.
  • In this embodiment, according to the target hue value expected to be retained in the image to be processed, the degree of hue difference between the hue value of each of a plurality of pixels and the target hue value is determined; the fusion weight of each pixel is determined from those degrees of hue difference; and the image to be processed is fused, according to the fusion weights, with the to-be-fused image corresponding to it, to obtain the fused image.
  • The fusion weights are thus determined automatically, the saturation of pixels whose hue is the same as or close to the target hue value is retained, and the saturation of the other pixels in the image to be processed is suppressed, finally realizing color selection retention of the image to be processed. This reduces the user's manual intervention in the color selection retention process, simplifies the interaction, shortens the processing time, and is convenient for the user.
  • FIG. 3 is a schematic flowchart of an image processing method provided by another embodiment of the application. This embodiment mainly describes an optional implementation manner of determining the fusion weight according to the degree of hue difference based on the embodiment shown in FIG. 2. As shown in FIG. 3, the method of this embodiment may include:
  • Step 301 Determine the target hue value expected to be retained in the image to be processed.
  • step 301 is similar to step 201 and will not be repeated here.
  • Step 302 Determine the degree of hue difference between the hue values of a plurality of pixels in the image to be processed and the target hue value according to the target hue value.
  • the multiple pixels are all the pixels of the image to be processed, that is, the entire image to be processed is fused with the image to be fused.
  • the multiple pixels are pixels in the target area of the image to be processed, that is, the target area of the image to be processed is fused with the image to be fused.
  • the target area is set by the user.
  • step 302 may specifically include the following steps A1 and A2.
  • Step A1 Shift the hue value of each pixel according to the target hue value and a preset hue value to obtain the shifted hue value of each pixel, where the preset hue value is the hue value to which the target hue value is shifted.
  • Step A2 Use the degree of difference between the shifted hue value of each pixel and the preset hue value as the degree of hue difference between that pixel's hue value and the target hue value.
  • For example, the degree of difference between hue values may be the absolute value of their difference.
  • For example, the preset hue value is 180.
  • Through step A1, the difference between a pixel's shifted hue value and the preset hue value represents the degree of hue difference between the pixel's original (pre-shift) hue value and the target hue value; through step A2, the degree of hue difference can then be obtained directly from that difference, which simplifies the calculation.
  • step A1 may specifically include the following steps A11 and A12.
  • Step A11 Shift the target hue value in the hue ring to the preset hue value to obtain the shift relationship between the hue value before the shift and the hue value after the shift.
  • Step A12 Determine the shifted hue value of each pixel according to the shift relationship and the hue value of each pixel.
  • Here, the hue circle before shifting is the hue circle before the target hue value is shifted to the preset hue value, and the hue circle after shifting is the hue circle after that shift.
  • The shift relationship characterizes, for each hue value in the hue circle before the shift (the pre-shift hue value), the corresponding hue value in the hue circle after the shift (the shifted hue value); that is, it maps each pre-shift hue value to its shifted hue value.
  • Taking a target hue value of 80 and a preset hue value of 100 as an example, the shift relationship maps the pre-shift hue value 80 to the shifted hue value 100, the pre-shift hue value 90 to 110, ..., the pre-shift hue value 0 to 20, and so on.
  • Through step A11, the correspondence between pre-shift and shifted hue values is obtained, and through step A12 the shifted hue value of each pixel is determined from that correspondence. Determining the shifted hue value of a pixel by querying the shift relationship achieves the effect of simplifying the calculation.
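  • For illustration, a minimal sketch (added here) of the shift relationship in steps A11 and A12, reproducing the worked example above (target hue 80 shifted to the preset hue 100):

    def build_shift_relation(target_hue: int, preset_hue: int) -> list:
        """Map each pre-shift hue value (0..359) to its shifted hue value,
        so that target_hue is shifted to preset_hue."""
        offset = preset_hue - target_hue
        return [(h + offset) % 360 for h in range(360)]

    relation = build_shift_relation(target_hue=80, preset_hue=100)
    print(relation[80])   # 100
    print(relation[90])   # 110
    print(relation[0])    # 20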
  • Step 303 Determine the fusion weight of each pixel in the plurality of pixels based on the weighting strategy according to the degree of hue difference between the hue values of the plurality of pixels and the target hue value.
  • In this step, the weighting strategy includes a negative correlation between the fusion weight and the degree of hue difference: the greater the degree of hue difference, the smaller the fusion weight, and the smaller the degree of hue difference, the greater the fusion weight. A greater hue difference between a pixel's hue value and the target hue value indicates that the pixel's hue is farther from the hue expected to be retained, and a smaller hue difference indicates that it is closer; with this negative correlation, the saturation of pixels whose hue differs more from the target hue value is suppressed more strongly.
  • For example, the relationship between the fusion weight and the degree of hue difference satisfies a normal distribution; because it does, a smooth saturation transition is achieved in the fused image.
  • The standard deviation of the normal distribution adjusts the range of color retention: the larger the standard deviation, the larger the retention range, and the more colors similar to the target hue value are retained; the smaller the standard deviation, the smaller the retention range, and the more single and pure the retained color becomes.
  • For example, the standard deviation of the normal distribution may be a preset value, such as 40; or it may be set by the user, for example within a settable range of 30 to 60.
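  • For illustration, a minimal sketch of such a weighting strategy. The normal-distribution relationship and the example standard deviation of 40 come from the description above; the specific unnormalized form exp(-d^2 / (2*sigma^2)) is an assumption, chosen so that the weight is 1 at zero hue difference (full saturation retained at the target hue).

    import math

    def fusion_weight(hue_diff: float, sigma: float = 40.0) -> float:
        """Fusion weight that decreases with the degree of hue difference,
        following an (unnormalized) Gaussian with standard deviation sigma."""
        return math.exp(-(hue_diff ** 2) / (2.0 * sigma ** 2))

    # A larger sigma widens the range of hues whose saturation is retained.
    print(fusion_weight(0))              # 1.0  -> saturation fully retained
    print(round(fusion_weight(40), 3))   # ~0.607
    print(round(fusion_weight(120), 3))  # ~0.011 -> saturation almost fully suppressed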
  • Step 304 According to the fusion weight of each pixel, the to-be-processed image and the to-be-fused image corresponding to the to-be-processed image are fused to obtain a fused image.
  • In this step, optionally, when the image to be fused is the grayscale image corresponding to the image to be processed, the following step may be included before step 304: determining, according to the image to be processed, the grayscale image corresponding to it.
  • For example, determining the grayscale image corresponding to the image to be processed may specifically include: converting the image to be processed from the RGB color space to the YUV color space, and extracting the Y component of the image in the YUV color space to generate the grayscale image.
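  • For illustration, a minimal sketch of extracting the Y (luma) component for one pixel. The application only states that the Y component of the YUV image is used; the BT.601 coefficients below are a common choice and are an assumption here.

    def rgb_to_gray(r: float, g: float, b: float) -> float:
        """Y (luma) component of a pixel with normalized RGB in [0, 1],
        using the common BT.601 coefficients (an assumption)."""
        return 0.299 * r + 0.587 * g + 0.114 * b

    print(rgb_to_gray(1.0, 0.0, 0.0))  # 0.299 -> pure red maps to a mid-dark gray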
  • Optionally, when the image to be fused is the black-and-white image corresponding to the image to be processed, the following step may also be included before step 304: determining, according to the image to be processed, the black-and-white image corresponding to it.
  • In this embodiment, according to the target hue value expected to be retained, the degree of hue difference between the hue value of each of a plurality of pixels in the image to be processed and the target hue value is determined; based on the weighting strategy, the fusion weight of each pixel is determined from those degrees of hue difference; and the image to be processed is fused, according to the fusion weights, with the to-be-fused image corresponding to it, to obtain the fused image. This achieves the effect that the greater a pixel's degree of hue difference from the target hue value, the more strongly its saturation is suppressed, finally realizing color selection retention of the image to be processed.
  • FIG. 4 is a schematic flowchart of an image processing method provided by another embodiment of this application.
  • This embodiment mainly describes another optional implementation method for determining the fusion weight according to the degree of hue difference based on the embodiment shown in FIG. 2 .
  • the method of this embodiment may include:
  • Step 401 Determine the target hue value expected to be retained in the image to be processed.
  • step 401 is similar to step 201, and will not be repeated here.
  • Step 402 Determine the degree of hue difference between the hue values of a plurality of pixels in the image to be processed and the target hue value according to the target hue value.
  • step 402 is similar to step 202 and step 302, and will not be repeated here.
  • Step 403 When the degree of the hue difference between the hue value of each pixel and the target hue value is less than a first threshold, determine that the fusion weight of each pixel is the first weight.
  • In this step, when the degree of hue difference between a pixel's hue value and the target hue value is less than the first threshold, the pixel's hue can be considered close to the target hue value; determining its fusion weight as the first weight therefore retains the saturation of pixels in the image to be processed whose hue is the same as or close to the target hue value.
  • For example, when the fusion weight is a normalized weight, the first weight may be 1.
  • Optionally, when the degree of hue difference between a pixel's hue value and the target hue value is greater than the first threshold, the fusion weight of that pixel is determined to be a second weight, where the second weight is less than the first weight.
  • Specifically, a hue difference greater than the first threshold indicates that the pixel's hue is not close to the target hue value, so determining its fusion weight as the second weight suppresses the saturation of such pixels in the image to be processed.
  • For example, the second weight may be 0, which suppresses the saturation of pixels whose hue is not close to the target hue value down to 0.
  • Alternatively, optionally, when the degree of hue difference between a pixel's hue value and the target hue value is greater than the first threshold and less than a second threshold, the fusion weight of that pixel is determined to be a third weight; when the degree of hue difference is greater than the second threshold, the fusion weight is determined to be the second weight, where the first threshold is less than the second threshold and the third weight is greater than the second weight.
  • Specifically, a hue difference greater than the first threshold and less than the second threshold indicates that the pixel's hue is somewhat dissimilar to the target hue value; determining its fusion weight as the third weight suppresses the saturation of such pixels to a lesser degree.
  • A hue difference greater than the second threshold indicates that the pixel's hue is very different from the target hue value; determining its fusion weight as the second weight suppresses the saturation of such pixels more strongly.
  • For example, the second weight may be 0, which suppresses the saturation of pixels whose hue is very different from the target hue value down to 0.
  • It should be noted that more thresholds can be set according to actual needs, to subdivide the suppression effect.
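  • For illustration, a minimal sketch of the two-threshold variant described above; the specific threshold and weight values below are illustrative assumptions, not values from the application.

    def threshold_fusion_weight(hue_diff: float,
                                t1: float = 20.0, t2: float = 60.0,
                                first: float = 1.0, third: float = 0.5,
                                second: float = 0.0) -> float:
        """Piecewise fusion weight from the degree of hue difference:
        below t1 -> first weight (retain saturation), between t1 and t2 ->
        third weight (partly suppress), above t2 -> second weight (suppress)."""
        if hue_diff < t1:
            return first
        if hue_diff < t2:
            return third
        return second

    print(threshold_fusion_weight(10))   # 1.0
    print(threshold_fusion_weight(40))   # 0.5
    print(threshold_fusion_weight(90))   # 0.0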
  • Step 404 According to the fusion weight of each pixel, the to-be-processed image and the to-be-fused image corresponding to the to-be-processed image are fused to obtain a fused image.
  • step 404 is similar to step 204, and will not be repeated here.
  • In this embodiment, according to the target hue value expected to be retained, the degree of hue difference between the hue value of each of a plurality of pixels in the image to be processed and the target hue value is determined; when a pixel's degree of hue difference is less than the first threshold, its fusion weight is determined to be the first weight; and the image to be processed is fused with the image to be fused according to the fusion weights, to obtain the fused image. In this way the saturation of the pixels of the image to be processed is suppressed to different degrees according to the different levels of hue difference, finally realizing color selection retention of the image to be processed.
  • In some embodiments, a color selection operation input by the user for a target area of the image to be processed is acquired to generate a color selection instruction, and the color selection instruction is used to process the image to be processed so as to control the color of the target area.
  • By processing the image to be processed, the color selection instruction may also retain or change the color of the target area; for example, it may be uniformly replaced with a specified color, such as a color specified by the user or a default color, which is not limited here.
  • the image to be processed may be processed according to the color selection operation to generate a processed image, wherein the color of the target area in the processed image is the target color.
  • the target color may be a color specified by the user, a default color, or a color consistent with the target area in the image to be processed.
  • FIG. 5 is a schematic flowchart of an image processing method provided by another embodiment of the application. This embodiment mainly describes an optional implementation of determining the target hue value expected to be retained in the image to be processed on the basis of the above method embodiment. the way.
  • the method of this embodiment may include:
  • Step 501 Acquire a color selection operation input by the user to generate a color selection instruction, the color selection instruction being used to indicate that the target color in the image to be processed is expected to be retained.
  • the user can select the color desired to be retained in the image to be processed through the color selection operation, that is, the target color.
  • the color selection operation may specifically be a user's click operation on the target color in the color palette. It is convenient for users to operate.
  • Alternatively, the image to be processed includes an image currently displayed in a display interface; the color selection operation includes a click operation input at a target position in the display interface, where the target position is the position of a target pixel in the display interface, and the target pixel is an image pixel whose color is the target color. For example, if the user clicks on the sky in the image to be processed displayed in the display interface, the target color may be the color of that sky position.
  • Further optionally, after the color selection operation input by the user is acquired, the method further includes: marking, in the display interface, the position of the target pixel. Marking the position of the target pixel in the display interface helps the user know which target color has been selected.
  • Step 502 Determine the target hue value according to the color selection operation.
  • the target hue value is the hue value of the target color.
  • the color value of the target color may be converted from an RGB value to an HSL value to obtain the target hue value.
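  • For illustration, a minimal sketch of obtaining the target hue value from a picked RGB color, using Python's standard colorsys module (which returns hue in [0, 1], so it is scaled to [0, 360) here):

    import colorsys

    def target_hue_from_rgb(r: int, g: int, b: int) -> float:
        """Hue value in [0, 360) of a picked 8-bit RGB color."""
        h, _l, _s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
        return h * 360.0

    print(target_hue_from_rgb(0, 0, 255))  # 240.0 -> blue, matching the earlier example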
  • Optionally, before step 501, the following steps B1 and B2 may also be included.
  • Step B1 Acquire a filter selection operation input by the user to generate a filter selection instruction, where the filter selection instruction is used to select a color retention filter.
  • Step B2 According to the filter selection operation, open a color selection entry, where the color selection entry is used to acquire the color selection operation input by the user.
  • Through steps B1 and B2, the user is provided with a filter for achieving the color selection retention effect on the image, namely the color retention filter, which is convenient for the user.
  • Optionally, after step B2, the method may further include: outputting prompt information indicating that a color selection operation may now be input. The specific form of the prompt information is not limited in this application; it may be, for example, voice, text, or vibration. Outputting the prompt information after the color selection entry is opened reminds the user when to input the color selection operation, which improves the user experience.
  • In this embodiment, a color selection operation input by the user is acquired to generate a color selection instruction indicating the target color expected to be retained in the image to be processed, and the target hue value is determined according to that operation. The user can thus select, by inputting a color selection operation, the target hue value expected to be retained in the image to be processed, which makes the target hue value user-selectable and improves the user experience.
  • the image to be processed in the foregoing method embodiment may include a video image in a video.
  • the image to be processed includes the N1-th frame of video image of the video, and N1 is a positive integer.
  • the image to be processed includes the N1-th frame video image of the video, and the effect of selecting and retaining the color of the N1-th frame video image can be achieved.
  • Further optionally, the above method embodiments may also include the following step: determining, according to the degrees of hue difference between the hue values of a plurality of pixels in the N2-th video frame of the video and the target hue value, the fusion weight of each of those pixels, where N2 is a positive integer. In this way, the color selection retention effect of the N2-th video frame can be kept consistent with that of the N1-th video frame.
  • Optionally, N2 may be greater than N1, so that the video frames after the N1-th frame are kept consistent with the N1-th frame in their color selection retention effect.
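  • For illustration, a minimal sketch of keeping later video frames consistent with frame N1 by reusing the target hue value determined there. The helper names pick_target_hue and apply_color_retention are assumptions standing in for the hue-selection and fusion steps described above.

    def process_video(frames, n1_index, pick_target_hue, apply_color_retention,
                      sigma=40.0):
        """Determine the target hue once, on frame N1, then reuse it for frame
        N1 and every later frame so the retention effect stays consistent."""
        target_hue = pick_target_hue(frames[n1_index])
        out = list(frames[:n1_index])                 # frames before N1 left untouched
        for frame in frames[n1_index:]:               # frame N1 and every frame N2 > N1
            out.append(apply_color_retention(frame, target_hue, sigma))
        return out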
  • Optionally, when the image to be processed is an RGB image, the above method embodiments may further include the following step: converting the color values of the plurality of pixels from red-green-blue (RGB) values to hue-saturation-lightness (HSL) values. By converting the color values in this way, the hue value of each of the plurality of pixels can be determined.
  • In view of the above shortcomings of the color selection retention methods in the related art, this application designs a simple and fast implementation of a color-selection-retention filter; with only very simple interaction, it can quickly generate a color-selection-retained image.
  • The method only requires the user to click, with a mouse or touch screen, the color that needs to be retained; the regions of that color are retained, and the other regions are directly processed into black, white and gray.
  • The method can not only process still images quickly, but is also suitable for real-time processing of high-definition video.
  • The method first obtains the RGB value of the color selected by the user according to the clicked location, and converts it from the RGB color space to the HSL color space.
  • The H (Hue) value of that color, i.e. the hue, is used as the mean of a Gaussian distribution; the entire hue circle is offset accordingly, and a lookup table with a length of 360 is generated.
  • The entire image is then converted from the RGB color space to the HSL color space; the hue value H of each pixel is used to look up the table to obtain its hue offset, and the hue offset is used to compute a Gaussian distribution value that serves as the fusion weight α.
  • Using the fusion weight α, the RGB components of the original image to be processed are each fused with the lightness value L (Lightness), which gives a fairly satisfactory result.
  • Apart from the color selection, which requires simple interaction from the user, all other steps can be processed automatically by the computing device. The method is therefore very simple to operate and highly practical; it is well suited to being ported to a manual editor on a mobile terminal, and is convenient for users to use.
  • Moreover, the processing speed of the algorithm is very fast and can meet the needs of real-time processing of high-definition video. As shown in Figure 6, the method may specifically include the following steps:
  • Step 601 color selection.
  • This step is very simple.
  • the user selects the color he wants to keep by clicking on the mouse or touch screen, and the computing device automatically obtains the RGB value of the color according to the interaction.
  • Step 602 Generate a hue lookup table.
  • the RGB value of the color selected in step 601 is converted into an HSL value to obtain the hue value H of the selected color.
  • As shown in Figure 7, the HSL color space can be represented by a cone-like model: hue varies along the outer circumference of the cylinder, with range [0, 360]; saturation varies with the distance from the center of the cross-section, with range [0, 1]; hue and saturation form planar polar coordinates; and lightness varies with the distance of the cross-section from the bottom and top surfaces.
  • Assuming that the saturation of the colors in the various regions of the original image is roughly uniform, its hue circle is a fairly regular circle, as shown in Figure 8.
  • The purpose of the color-selection-retention filter is to make the processed image retain its original saturation at the hue selected by the user, while the saturation of the remaining hue regions is suppressed to 0 or close to 0, with a smooth transition between the maximum saturation and 0, as shown in Figure 9. Specifically, a Gaussian function is a suitable way to model this saturation distribution.
  • The method for converting an RGB value to an HSL value may include the following steps 1 to 6.
  • Step 1 Normalize the R, G and B values to values in [0, 1].
  • Step 2 Find the maximum value maxcolor and the minimum value mincolor among R, G and B: maxcolor = MAX(r, g, b); mincolor = MIN(r, g, b).
  • Step 3 Compute the lightness: L = (maxcolor + mincolor) / 2.
  • Step 4 If the maximum and minimum color values are the same, the color is gray, and S and H are defined as 0.
  • Step 5 If the maximum and minimum color values are different, compute the saturation S according to the lightness L: if L < 0.5, S = (maxcolor - mincolor) / (maxcolor + mincolor); if L >= 0.5, S = (maxcolor - mincolor) / (2.0 - maxcolor - mincolor).
  • Step 6 Compute the hue H: if R = maxcolor, H = (G - B) / (maxcolor - mincolor); if G = maxcolor, H = 2.0 + (B - R) / (maxcolor - mincolor); if B = maxcolor, H = 4.0 + (R - G) / (maxcolor - mincolor); then H = H * 60, and if H < 0, H = H + 360.
  • The hue value range is [0, 360].
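  • For illustration, a minimal sketch that writes steps 1 to 6 above directly as code (added here as a sketch, not the application's own listing):

    def rgb_to_hsl(r: int, g: int, b: int):
        """Convert an 8-bit RGB color to (H, S, L) following steps 1-6 above:
        H in [0, 360), S and L in [0, 1]."""
        r, g, b = r / 255.0, g / 255.0, b / 255.0          # step 1: normalize
        maxcolor, mincolor = max(r, g, b), min(r, g, b)    # step 2
        l = (maxcolor + mincolor) / 2.0                    # step 3: lightness
        if maxcolor == mincolor:                           # step 4: gray
            return 0.0, 0.0, l
        if l < 0.5:                                        # step 5: saturation
            s = (maxcolor - mincolor) / (maxcolor + mincolor)
        else:
            s = (maxcolor - mincolor) / (2.0 - maxcolor - mincolor)
        if maxcolor == r:                                  # step 6: hue
            h = (g - b) / (maxcolor - mincolor)
        elif maxcolor == g:
            h = 2.0 + (b - r) / (maxcolor - mincolor)
        else:
            h = 4.0 + (r - g) / (maxcolor - mincolor)
        h *= 60.0
        if h < 0.0:
            h += 360.0
        return h, s, l

    print(rgb_to_hsl(255, 0, 0))   # (0.0, 1.0, 0.5) -> pure red
    print(rgb_to_hsl(0, 0, 255))   # (240.0, 1.0, 0.5) -> pure blue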
  • In order to use the hue value H of the selected color as the mean of the Gaussian distribution, the entire hue circle can be offset so that the hue value selected by the user is shifted to position 180, generating a lookup table with a length of 360.
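  • For illustration, a minimal sketch of one possible realization of that length-360 lookup table, consistent with the shift relationship described earlier (the application's own listing is given as a formula image and is not reproduced here):

    def build_hue_table(selected_hue: float) -> list:
        """Length-360 table mapping each integer hue value to its offset
        value, with the user-selected hue shifted to position 180."""
        return [int((h - selected_hue + 180) % 360) for h in range(360)]

    hue_tab = build_hue_table(selected_hue=240)   # e.g. the user picked blue
    print(hue_tab[240])   # 180 -> the selected hue lands at the Gaussian mean
    print(hue_tab[60])    # 0   -> the opposite hue lands far from the mean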
  • Step 603 The original image is converted from an RGB image to an HSL image, and the lookup table is queried.
  • Specifically, the original image may be an RGB image, and the color of each pixel in the original image may be converted from the RGB space to the HSL space. The hue value H of each pixel (rounded to an integer) is then used to look up the table to obtain that pixel's hue offset value H', i.e. H' = hue_tab[round(H)].
  • Step 604 Generate a fusion weight ⁇ .
  • the fusion weight ⁇ can be calculated using a Gaussian distribution model. For example, use the following formula to calculate the fusion weight ⁇
  • the variance ⁇ can be used to adjust the range of hue retention.
  • the larger the ⁇ the larger the color retention range, and the more colors close to the selected color will be retained.
  • the smaller the ⁇ the smaller the color retention range, and the more single and purer the selected color retention will be.
  • Step 605 The original image and the grayscale image are merged.
  • Specifically, the RGB image and the grayscale (lightness) image L may be merged according to the fusion weight. For example, fusing the RGB image with the grayscale image L using the computed fusion weight as follows yields the color-selection-retained image, that is, the fused image:
    R = (L × (1 − α) + R × α) × 255;
    G = (L × (1 − α) + G × α) × 255;
    B = (L × (1 − α) + B × α) × 255.
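  • For illustration, a minimal per-pixel sketch of the fusion formulas above (normalized RGB components and lightness in [0, 1], 8-bit output):

    def fuse_pixel(r: float, g: float, b: float, gray_l: float, alpha: float):
        """Fuse one pixel's normalized RGB components with its lightness
        value gray_l according to the fusion weight alpha (step 605)."""
        r_out = (gray_l * (1.0 - alpha) + r * alpha) * 255.0
        g_out = (gray_l * (1.0 - alpha) + g * alpha) * 255.0
        b_out = (gray_l * (1.0 - alpha) + b * alpha) * 255.0
        return int(round(r_out)), int(round(g_out)), int(round(b_out))

    # alpha = 1 keeps the original color; alpha = 0 collapses the pixel to gray.
    print(fuse_pixel(1.0, 0.0, 0.0, gray_l=0.5, alpha=1.0))  # (255, 0, 0)
    print(fuse_pixel(1.0, 0.0, 0.0, gray_l=0.5, alpha=0.0))  # (128, 128, 128)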
  • As can be seen from the formulas in step 604 and step 605, by the nature of the Gaussian distribution, the closer the hue offset value H' of a pixel in some region of the original image is to 180 (that is, the closer that pixel's pre-shift hue value H is to the hue value of the color selected by the user), the larger the fusion weight α given to the image to be processed, and the more of that region's color is retained after fusion. Conversely, the closer H' is to 0 or 360, the farther the pixel's pre-shift hue value H deviates from the hue value of the color selected by the user, the larger the fusion weight (1 − α) given to the grayscale image, and the more of that region's color is discarded after fusion, making the result closer to a black-and-white image.
  • As shown in Figure 10, suppose the user clicks with the mouse on the skirt area of the person in the image, i.e. the mouse position is in the skirt area, and the skirt is red; the color selected by the user is then red, and in the image processed through steps 601 to 605 the red areas are retained while the other areas are converted to grayscale.
  • The method is highly flexible, and the user can freely choose the color to be retained in the image. For example, as shown in Figure 11, suppose the user clicks on the sky area of the image, i.e. the mouse position is in the sky area, and the sky is blue; the color selected by the user is then blue, and in the processed image the blue areas are retained while the other areas are converted to grayscale.
  • It can be seen that, compared with the processing methods in the related art, the method provided in this embodiment offers simpler interaction, more convenient implementation, and greater flexibility.
  • The method can easily be ported to a manual-editor app on a mobile terminal, allowing users to perform color selection retention on images and videos anytime and anywhere.
  • In addition, the processing speed of the method is very fast: on a PC with an Intel i7 at 2.7 GHz and 16 GB of 2133 MHz LPDDR3 memory, without GPU acceleration, processing a 1080x720 image takes only 78 ms; with GPU acceleration, real-time processing of high-definition video can be achieved.
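  • For illustration, a vectorized sketch of steps 602 to 605 over a whole image, assuming NumPy. This is one possible implementation of the described pipeline (this kind of array-level processing is what makes near-real-time speeds plausible), not the application's own code; it takes a float RGB image in [0, 1] of shape (H, W, 3) and returns a float image in [0, 1].

    import numpy as np

    def selective_color_retention(rgb: np.ndarray, selected_hue: float,
                                  sigma: float = 40.0) -> np.ndarray:
        """Retain colors near selected_hue and fade the rest toward gray."""
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        maxc = rgb.max(axis=-1)
        minc = rgb.min(axis=-1)
        light = (maxc + minc) / 2.0                     # lightness L (grayscale image)
        delta = maxc - minc
        # Hue in [0, 360); left at 0 where the pixel is gray (delta == 0).
        hue = np.zeros_like(maxc)
        nz = delta > 0
        rmax = nz & (maxc == r)
        gmax = nz & (maxc == g) & ~rmax
        bmax = nz & ~rmax & ~gmax
        hue[rmax] = (g[rmax] - b[rmax]) / delta[rmax]
        hue[gmax] = 2.0 + (b[gmax] - r[gmax]) / delta[gmax]
        hue[bmax] = 4.0 + (r[bmax] - g[bmax]) / delta[bmax]
        hue = (hue * 60.0) % 360.0
        # Steps 602-604: shift the selected hue to 180 and weight with a Gaussian.
        shifted = (hue - selected_hue + 180.0) % 360.0
        alpha = np.exp(-((shifted - 180.0) ** 2) / (2.0 * sigma ** 2))[..., None]
        # Step 605: fuse the color image with its grayscale (lightness) image.
        return light[..., None] * (1.0 - alpha) + rgb * alpha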
  • FIG. 12 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the application. As shown in FIG. 12, the apparatus 1200 may include a processor 1201 and a memory 1202.
  • the memory 1202 is used to store program codes
  • The processor 1201 calls the program code and, when the program code is executed, is configured to perform the following operations: determining a target hue value expected to be retained in an image to be processed; determining, according to the target hue value, the degree of hue difference between the hue value of each of a plurality of pixels in the image to be processed and the target hue value; determining the fusion weight of each of the plurality of pixels according to those degrees of hue difference; and fusing, according to the fusion weight of each pixel, the to-be-processed image with the to-be-fused image corresponding to the to-be-processed image, to obtain a fused image.
  • In some embodiments, the processor 1201 may also be configured to perform the following operations: acquiring a color selection operation input by the user for a target area of the image to be processed, to generate a color selection instruction used to process the image to be processed so as to control the color of the target area; and processing the image to be processed according to the color selection operation to generate a processed image in which the color of the target area is the target color.
  • the image processing apparatus provided in this embodiment can be used to implement the technical solutions of the foregoing method embodiments, and its implementation principles and technical effects are similar to those of the method embodiments, and will not be repeated here.
  • a person of ordinary skill in the art can understand that all or part of the steps in the foregoing method embodiments can be implemented by a program instructing relevant hardware.
  • The aforementioned program can be stored in a computer-readable storage medium; when the program runs, it performs the steps of the foregoing method embodiments. The storage medium includes media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Abstract

An image processing method and device. The method includes: determining a target hue value expected to be retained in an image to be processed; determining, according to the target hue value, the degree of hue difference between the hue value of each of a plurality of pixels in the image to be processed and the target hue value; determining a fusion weight for each of the plurality of pixels according to those degrees of hue difference; and fusing, according to the fusion weight of each pixel, the image to be processed with an image to be fused corresponding to the image to be processed, to obtain a fused image. The present application reduces the user's manual intervention in realizing color selection retention, simplifies the interaction, shortens the processing time, and is convenient for users to use.

Description

图像处理方法及装置 技术领域
本申请涉及图像处理技术领域,尤其涉及一种图像处理方法及装置。
背景技术
随着互联网技术的发展,视频娱乐和视频内容的消费趋向于大众化,视频内容的作者对视频编辑的需求日益增长。
处理图像例如照片或视频时,如果想要突出图像中的主体,用户可以使用图像处理工具,例如Photoshop,图像处理程序(GNU Image Manipulation Program,GIMP)等,将图像中背景区域的颜色去掉变成黑白,只保留要突出的主体物的颜色,从而实现颜色选择保留。通常,在用户使用图像处理工具实现颜色选择保留时,需要经过新建调整图层、设置调整图层的填充颜色为黑白、在待处理图像所在的原图层中选择需要保留颜色的范围、将原图层与调整图层合并等步骤才能实现。并且,每个步骤都要用户手工介入处理。这个过程交互复杂,处理冗长,存在不便于用户使用的问题,用户需要熟悉图像处理工具才能经过复杂的处理步骤实现颜色选择保留。
发明内容
本申请实施例提供一种图像处理方法及装置,用以解决现有技术中实现颜色选择保留的过程交互复杂,处理冗长,存在不便于用户使用的问题。
第一方面,本申请实施例提供一种图像处理方法,包括:
确定待处理图像中期望保留的目标色相值;
根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度;
根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重;
根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
第二方面,本申请实施例提供一种图像处理装置,包括:处理器和存储器;所述存储器,用于存储程序代码;所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
确定待处理图像中期望保留的目标颜色;
根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度;
根据所述多个像素点的色相值分别与目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重;
根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
第三方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序包含至少一段代码,所述至少一段代码可由计算机执行,以控制所述计算机执行上述第一方面任一项所述的方法。
第四方面,本申请实施例提供一种计算机程序,当所述计算机程序被计算机执行时,用于实现上述第一方面任一项所述的方法。
本申请实施例提供一种图像处理方法及装置,通过根据定待处理图像中期望保留的目标色相值,确定待处理图像中多个像素点的色相值分别与目标色相值的色相差异程度,根据多个像素点的色相值分别与目标色相值的色相差异程度确定多个像素点中各像素点的融合权重,并根据各像素点的融合权重将待处理图像跟与待处理图像相对应的待融合图像进行融合,得到融合后的图像,实现了根据待处理图像中期望保留的目标色相值,自动确定像素点的融合权重并根据融合权重对待处理图像和待融合图像进行融合,可以保留待处理图像中与目标色相值色相相同或接近的像素点的饱和度,可以抑制待处理图像中的其他像素点的饱和度,最终实现待处理图像的颜色选择保留,并且减少了实现颜色选择保留过程中用户的手工介入,简化了交互,缩短了处理时长,便于用户使用。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1A为本申请实施例提供的图像处理方法的应用场景示意图一;
图1B为本申请实施例提供的图像处理方法的应用场景示意图二;
图2为本申请一实施例提供的图像处理方法的流程示意图;
图3为本申请另一实施例提供的图像处理方法的流程示意图;
图4为本申请又一实施例提供的图像处理方法的流程示意图;
图5为本申请又一实施例提供的图像处理方法的流程示意图;
图6为本申请又一实施例提供的图像处理方法的流程示意图;
图7为HSL色彩空间的模型示意图;
图8为本申请实施例提供的色相环的示意图;
图9为本申请实施例提供的保留饱和度的示意图;
图10为本申请实施例提供的用户选择颜色的示意图一;
图11为本申请实施例提供的用户选择颜色的示意图二;
图12为本申请一实施例提供的图像处理装置的结构示意图。
具体实施方式
为使本申请实施例的目的、技术方案和优点更加清楚,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请实施例提供的图像处理方法可以应用于任何需要进行颜色选择保留的图像处理过程中,该图像处理方法具体可以由图像处理装置执行。该图 像处理装置可以为包括图像采集模块(例如,摄像头)的装置,相应的,本申请实施例提供的图像处理方法的应用场景示意图可以如图1A所示,具体的,该图像处理装置的图像采集模块可以采集得到待处理图像,图像处理装置的处理器可以对图像采集模块采集的待处理图像采用本申请实施例提供的图像处理方法进行处理。需要说明的是,图1A仅为示意图,并不对图像处理装置的结构作限定,例如图像采集模块与处理器之间可以连接有图像滤波器,用于对图像采集装置采集到的待处理图像进行滤波处理。
或者,该图像处理装置也可以为不包括图像采集模块的装置,相应的,本申请实施例提供的图像处理方法的应用场景示意图可以如图1B所示,具体的,该图像处理装置的通信接口可以接收其他装置或设备采集的待处理图像,图像处理装置的处理器可以对接收到的待处理图像采用本申请实施例提供的图像处理方法进行处理。需要说明的是,图1B仅为示意图,并不对图像处理装置的结构以及图像处理装置与其他装置或设备之间的连接方式作限定,例如图像处理装置中通信接口可以替换为收发器。
需要说明的是,对于包括该图像处理装置的设备的类型,本申请实施例可以不做限定,该设备例如可以为台式机、一体机、笔记本电脑、掌上电脑、平板电脑、智能手机、带屏遥控器等。
本申请实施例提供的图像处理方法,通过根据待处理图像中期望保留的目标色相值,自动确定像素点的融合权重并根据融合权重将待处理图像跟与待处理图像相对应的待融合图像进行融合,实现待处理图像的颜色选择保留,减少了实现颜色选择保留过程中用户的手工介入,简化了交互,缩短了处理时长,便于用户使用。
下面结合附图,对本申请的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
图2为本申请一实施例提供的图像处理方法的流程示意图,本实施例的执行主体可以为图像处理装置,具体可以为图像处理装置的处理器。如图2所示,本实施例的方法可以包括:
步骤201,确定待处理图像中期望保留的目标色相值。
本步骤中,待处理图像是指像素点存在色相属性的图像,由于黑、白和灰三种颜色不存在色相的属性,因此待处理图像是指图像中包括除黑、白和灰之外的其他颜色的图像。其中,待处理图像可以通过色彩空间表示。色彩 空间例如可以为红(RED,R)绿(GREEN,G)蓝(BLUE,B)色彩空间,对应的待处理图像可以为RGB图像。色彩空间例如可以为色相(Hue,H)饱和度(Saturation,S)明度(Lightness,L)色彩空间,对应的待处理图像可以为HSL图像。其中,RGB图像中的每个像素由R、G和B三要素构成,HSL图像中每个像素由H、S、L三要素表示。可以理解的是,不同色彩空间之间可以进行转换,例如,RGB图像可以转换为HSL图像。可选的,待处理图像可以为彩色图像。
示例性的,可以将用户输入的色相值作为目标色相值。例如,用户可以输入色相值100。或者,
示例性的,可以通过接收用户在色板上所选择的颜色,并将该颜色对应的色相值作为目标色相值。例如,假设用户可以在色相上选择蓝色,并且该蓝色对应的色相值为240,则目标色相值为240。如此,用户能够准确确定期望的目标色相值。
示例性的,可以获取用户对待处理图像的触发操作,其中该触发操作用于表示用户对该原始图像中的某个区域进行了触发,并根据该触发操作确定目标色相值。该触发操作可以是点击、长按等操作,在此不做限定。该触发操作作用的区域可以包括多个像素点,也可以包括一个像素点。获取用户的触发操作后,及在该触发操作作用的区域内确定了目标色相值。可选的,该目标色相值为该触发操作作用的区域内其中一个像素点的色相值。该目标像素值将被保留,作为目标颜色。如此,仅通过对图像的简单操作就可实现对目标色相值的确定,方便操作。
示例性的,目标色相值可以为预设色相值。
示例性的,目标色相值可以通过对待处理图像进行分析处理确定,例如可以为待处理图像中占比最大的色相值。
步骤202,根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度。
本步骤中,由于色相是色彩最基本的特征,可以反映颜色的基本面貌,是区别不同颜色的最主要因素,因此色相差异程度可以用于表征颜色差异程度,例如红色与蓝色的色相差异程度,较红色与橙色的色相差异程度要小。具体的,一个像素点的色相值与目标色相值的色相差异程度越大,可以表示该像素点的颜色与期望保留的颜色的差异越大,一个像素点的色相值与目标 色相值的色相差异程度越小,可以表示该像素点的颜色与期望保留的颜色的差异越小。
需要说明的是,色相差异程度与色相值差异程度的含义并不相同。例如,色相值0和色相值259的色相值差异非常大,而色相值0和色相值259的色相差异程度非常小。
步骤203,根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重。
本步骤中,由于色相差异程度可以用于表征颜色差异程度,因此根据像素点的色相值与期望保留的目标色相值的色相差异程度,可以确定在对待处理图像进行颜色选择保留时,构成像素点的色相的保留程度。其中,保留程度可以通过待处理图像跟与待处理图像相对应的待融合图像进行融合时待处理图像中像素点的融合权重实现。
步骤204,根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
本步骤中,待融合图像是指由黑、白、灰中的一种或多种颜色构成的图像。示例性的,待融合图像可以为灰度图像,或者,黑白图像。可选的,所述待融合图像包括下述中的任意一种:预设图像、所述待处理图像对应的灰度图像、所述待处理图像对应的黑白图像。
可选的,待融合图像中像素点的融合权重可以固定,或者,待融合图像中像素点的融合权重可以与待处理图像中像素点的融合权重负相关。需要说明的是,对待处理图像与待融合图像进行融合时,具体是根据像素点的融合权重,将对应像素点的相同要素的要素值进行融合。对于将待处理图像与待融合图像进行融合的具体方式,本申请不做限定。
由于饱和度是指色彩纯度,饱和度越高,颜色越鲜艳,饱和度越低,颜色越趋近黑白,待融合图像的饱和度为0,因此根据待处理图像的像素点的融合权重,将待处理图像和待融合图像进行融合所得到的融合图像与待处理图像相比,像素点的饱和度会发生变化。并且,由于待处理图像中一个像素点的融合权重可以表征对于该像素点的色相的保留程度,而该像素点的融合权重是根据该像素点的色相值与期望保留的目标色相值之间的色相差异程度确定的,因此可以保留待处理图像中与目标色相值色相相同或接近的像素点的饱和度,可以抑制待处理图像中的其他像素点的饱和度,从而可以实现待处 理图像的颜色选择保留。
本实施例中,通过根据定待处理图像中期望保留的目标色相值,确定待处理图像中多个像素点的色相值分别与目标色相值的色相差异程度,根据多个像素点的色相值分别与目标色相值的色相差异程度确定多个像素点中各像素点的融合权重,并根据各像素点的融合权重将待处理图像跟与待处理图像相对应的待融合图像进行融合,得到融合后的图像,实现了根据待处理图像中期望保留的目标色相值,自动确定像素点的融合权重并根据融合权重对待处理图像和待融合图像进行融合,可以保留待处理图像中与目标色相值色相相同或接近的像素点的饱和度,可以抑制待处理图像中的其他像素点的饱和度,最终实现待处理图像的颜色选择保留,并且减少了实现颜色选择保留过程中用户的手工介入,简化了交互,缩短了处理时长,便于用户使用。
图3为本申请另一实施例提供的图像处理方法的流程示意图,本实施例在图2所示实施例的基础上主要描述了根据色相差异程度确定融合权重的一种可选的实现方式。如图3所示,本实施例的方法可以包括:
步骤301,确定待处理图像中期望保留的目标色相值。
需要说明的是,步骤301与步骤201类似,在此不再赘述。
步骤302,根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度。
本步骤中,可选的,所述多个像素点为所述待处理图像的全部像素点,即将整个待处理图像与待融合图像进行融合。或者,所述多个像素点为所述待处理图像中目标区域中的像素点,即将待处理图像的目标区域与待融合图像进行融合。示例性的,所述目标区域由用户设置。
可选的,步骤302具体可以包括如下步骤A1和步骤A2。
步骤A1,根据所述目标色相值以及预设色相值,对各像素点的色相值进行偏移,获得各所述像素点的偏移后色相值;所述预设色相值为所述目标色相值的偏移后色相值。
步骤A2,将各像素点的偏移后色相值与所述预设色相值的差异程度,作为各所述像素点的色相值与所述目标色相值的色相差异程度。
示例性的,色相值的差异程度具体可以为色相值差值的绝对值。
示例性的,预设色相值为180。
通过步骤A1可以使得一个像素点的偏移后色相值与预设色相值的差异程 度可以表征该像素点的色相值(即,偏移前色相值)与目标色相值的色相差异程度,进一步的,通过步骤A2可以根据一个像素点的偏移后色相值与预设色相值的差异程度得到该像素点的色相值与目标色相值的色相差异程度,可以达到简化计算的效果。
进一步可选的,步骤A1具体可以包括如下步骤A11和步骤A12。
步骤A11,将色相环中所述目标色相值偏移至所述预设色相值,得到偏移前的所述色相环与偏移后的所述色相环之间色相值的偏移关系。
步骤A12,根据所述偏移关系以及各像素点的色相值,确定各所述像素点的偏移后色相值。
其中,偏移前的色相环,即将色相环中目标色相值偏移至预设色相值之前的色相环;偏移后的色相环,即将色相环中目标色相值偏移至预设色相值之后的色相环。偏移关系可以表征偏移前的色相环中的色相值(即偏移前色相值),在偏移后的色相环中的色相值(即,偏移后色相值),即偏移关系可以表征偏移前色相值对应的偏移后色相值。
以目标色相值为80,预设色相值为100为例,则偏移关系可以表征偏移前色相值80,对应的偏移后色相值为100;偏移前色相值90对应的偏移后色相值为110;……,偏移前色相值0对应的偏移后色相值为20;……。
通过步骤A11可以得到偏移前色相值与偏移后色相值的对应关系,通过步骤A12可以根据该对应关系确定各像素点的偏移后色相值,实现了基于查询偏移关系确定像素点的偏移后色相值,达到了简化计算的效果。
步骤303,根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,基于权重策略,确定所述多个像素点中各像素点的融合权重。
本步骤中,所述权重策略包括融合权重与色相差异程度负相关,即色相差异程度越大,融合权重越小,色相差异程度越小,融合权重越大。由于一个像素点的色相值与目标色相值的色相差异程度越大,可以表示该像素点的色相值与期望保留的色相值越远离,一个像素点的色相值与目标色相值的色相差异程度越小,可以表示该像素点的色相值与期望保留的色相值越接近,因此通过融合权重与色相差异程度负相关,可以实现对于待处理图像中与目标色相值色相差异程度越大的像素点饱和度抑制程度越大的效果,最终实现待处理图像的颜色选择保留。
示例性的,融合权重与色相差异程度的关系满足正态分布。通过融合权 重与色相差异程度的关系满足正态分布,可以达到融合后的图像中饱和度过渡平滑的效果。
其中,正态分布的标准差可以调节颜色保留的范围。标准差越大,颜色保留范围越大,更多的与被目标色相值的色相值相近的颜色会被保留。标准差越小,颜色保留范围越小,颜色保留会越单一,更纯净。
示例性的,所述正态分布的标准差为预设值,例如正态分布的标准差可以预设为40;或者,所述正态分布的标准差由用户设置,例如可以设置标准差的范围为30至60,用于可以将正态分布的标准差设置为该范围内的一个值。
步骤304,根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
本步骤中,可选的,在所述待融合图像为所述待处理图像对应的灰度图像时,步骤304之前还可以包括如下步骤:根据所述待处理图像,确定所述待处理图像对应的灰度图像。示例性的,所述根据所述待处理图像,确定所述待处理图像对应的灰度图像具体可以包括:将RGB色彩空间的所述待处理图像处理成YUV色彩空间的待处理图像,并提取所述YUV色彩空间的待处理图像中的Y分量,以生成灰度图像。
可选的,在所述待融合图像为所述待处理图像对应的灰度图像时,步骤304之前还可以包括如下步骤:根据所述待处理图像,确定所述待处理图像对应的黑白图像。
本实施例中,通过根据期望保留的目标色相值确定待处理图像中多个像素点的色相值分别与目标色相值的色相差异程度,根据多个像素点的色相值分别与目标色相值的色相差异程度,基于权重策略确定多个像素点中各像素点的融合权重,并根据各像素点的融合权重将待处理图像跟与待处理图像相对应的待融合图像进行融合,得到融合后的图像,可以实现对于待处理图像中与目标色相值色相差异程度越大的像素点饱和度抑制程度越大的效果,最终实现待处理图像的颜色选择保留。
图4为本申请又一实施例提供的图像处理方法的流程示意图,本实施例在图2所示实施例的基础上主要描述了根据色相差异程度确定融合权重的另一种可选的实现方式。如图4所示,本实施例的方法可以包括:
步骤401,确定待处理图像中期望保留的目标色相值。
需要说明的是,步骤401与步骤201类似,在此不再赘述。
步骤402,根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度。
需要说明的是,步骤402与步骤202、步骤302类似,在此不再赘述。
步骤403,在各像素点的色相值与所述目标色相值的色相差异程度小于第一阈值时,确定各所述像素点的融合权重为第一权重。
本步骤中,一个像素点的色相值与目标色相值的色相差异程度小于第一阈值,可以认为该像素点的色相值与目标色相值相近,进一步的确定该像素点的融合权重为第一权重,可以达到保留待处理图像中与目标色相值色相相同或接近像素点的饱和度的效果。示例性的,在融合权重为归一化权重时,第一权重可以为1。
可选的,在各像素点的色相值与所述目标色相值的色相差异程度大于所述第一阈值时,确定各所述像素点的融合权重为第二权重,所述第二权重小于所述第一权重。
具体的,一个像素点的色相值与目标色相值的色相差异程度大于第一阈值,可以认为该像素点的色相值与目标色相值不相近,进一步的确定该像素点的融合权重为第二权重,可以达到抑制待处理图像中色相值与目标色相值不相近像素点的饱和度的效果。示例性的,第二权重可以为0,可以达到对将待处理图像中色相值与目标色相值不相近的像素点的饱和度抑制为0的效果。
或者,可选的,在各像素点的色相值与所述目标色相值的色相差异程度大于所述第一阈值且小于第二阈值时,确定各所述像素点的融合权重为第三权重;在各像素点的色相值与所述目标色相值的色相差异程度大于第二阈值时,确定各所述像素点的融合权重为第二权重;其中,所述第一阈值小于所述第二阈值,所述第三权重大于所述第二权重。
具体的,一个像素点的色相值与目标色相值的色相差异程度大于第一阈值且小于第二阈值,可以认为该像素点的色相值与目标色相值较不相近,进一步的确定该像素点的融合权重为第三权重,可以达到较小抑制待处理图像中色相值与目标色相值较不相近像素点的饱和度的效果。一个像素点的色相值与目标色相值的色相差异程度大于第二阈值,可以认为该像素点的色相值与目标色相值非常不相近,进一步的确定该像素点的融合权重为第二权重,可以达到较大抑制待处理图像中色相值与目标色相值非常不相近像素点的饱和度的效果。示例性的,第二权重可以为0,可以达到对将待处理图像中色相 值与目标色相值非常不相近的像素点的饱和度抑制为0的效果。
需要说明的是,可以根据实际需求设置更多的阈值,以细分抑制效果。
步骤404,根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
需要说明的是,步骤404与步骤204类似,在此不再赘述。
本实施例中,通过根据期望保留的目标色相值确定待处理图像中多个像素点的色相值分别与目标色相值的色相差异程度,在各像素点的色相值与所述目标色相值的色相差异程度小于第一阈值时,确定各所述像素点的融合权重为第一权重,并根据各像素点的融合权重将待处理图像与待融合图像进行融合,得到融合后的图像,可以实现根据色相差异程度的不同等级,对待处理图像的像素点的饱和度进行不同程度的抑制,最终实现待处理图像的颜色选择保留。
在一些实施例中,获取用户对待处理图像中的目标区域输入的颜色选取操作,以生成颜色选取指令,所述颜色选取指令用于对所述待处理图像进行处理,以控制所述目标区域的颜色。所述颜色选取指令通过对所述待处理图像进行处理,还可以保留或者更改所述目标区域的颜色。例如,同一替换成指定颜色,如用户指定的颜色,或默认颜色,在此不作限定。
在一些实施例中,还可以根据所述颜色选取操作,对所述待处理图像进行处理,以生成处理后的图像,其中,所述处理后的图像中目标区域的颜色为目标颜色。该目标颜色可以是用户指定的颜色、默认颜色、或与所述待处理图像中目标区域一致的颜色。
图5为本申请又一实施例提供的图像处理方法的流程示意图,本实施例在上述方法实施例的基础上主要描述了确定待处理图像中期望保留的目标色相值的一种可选的实现方式。
如图5所示,本实施例的方法可以包括:
步骤501,获取用户输入的颜色选取操作,以生成颜色选取指令,所述颜色选取指令用于指示期望保留所述待处理图像中的目标颜色。
本步骤中,用户可以通过颜色选取操作选取待处理图像中期望保留的颜色,即目标颜色。示例性的,颜色选取操作具体可以为用户在色板中目标颜色上的点击操作。方便用户操作。
或者,所述待处理图像包括显示界面中当前显示的图像;所述颜色选取 操作包括所述显示界面中目标位置输入的点击操作,所述目标位置包括所述目标像素在所述显示界面中的位置,所述目标像素为颜色为所述目标颜色的图像像素。例如,用户点击显示界面中所显示的待处理图像中的天空位置,则目标颜色可以为待处理图像中该天空位置的颜色。
进一步可选的,所述获取用户输入的颜色选取操作之后还包括:在所述显示界面中标识所述目标像素在所述显示界面中的位置。通过在显示界面中标识目标像素在显示界面中的位置,可以达到便于用户获知其所选取的目标颜色的效果。
步骤502,根据所述颜色选取操作,确定所述目标色相值。
本步骤中,目标色相值即为目标颜色的色相值。示例性的,可以将目标颜色的颜色值由RGB值转换为HSL值,得到目标色相值。
可选的,步骤501之前还可以包括如下步骤B1和步骤B2。
步骤B1,获取所述用户输入的滤镜选择操作,以生成滤镜选择指令,所述滤镜选择指令用于选择颜色保留滤镜。
步骤B2,根据所述滤镜选择操作,开启颜色选取入口,所述颜色选取入口用于获取用户输入的颜色选取操作。
通过步骤B1和步骤B2,可以实现向用户提供用来实现图像的颜色选择保留效果的滤镜,即颜色保留滤镜,达到便于用户使用的效果。
可选的,步骤B2之后还可以包括:输出提示信息,所述提示信息用于指示允许输入颜色选取操作。对于提示信息的具体类型,本申请不做限定,例如可以为语音、文字、震动等。通过在开启颜色选取入口之后输出提示信息,可以提醒用户输入颜色选取操作的时机,从而可以提高用户体验。
本实施例中,通过获取用户输入的颜色选取操作,以生成用于指示期望保留待处理图像中的目标颜色的颜色选取指令,并根据颜色选取操作,确定目标色相值,使得用户可以通过输入颜色选取操作选取待处理图像中期望保留的目标色相值,实现了目标色相值的用户可选取性,达到了提高用户体验的效果。
Optionally, the image to be processed in the above method embodiments may include a video image in a video. Exemplarily, the image to be processed includes the N1-th video frame of a video, N1 being a positive integer, so that color-selective retention is achieved on the N1-th frame.
Further optionally, on the basis of the above method embodiments, the method may further include the following step: determining the fusion weight of each of multiple pixels in the N2-th video frame of the video according to the degree of hue difference between the hue values of those pixels and the target hue value, N2 being a positive integer. In this way, the color-selective retention effect on the N2-th frame is consistent with that on the N1-th frame. Optionally, N2 may be greater than N1, so that frames after the N1-th frame receive the same retention effect as the N1-th frame.
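A sketch of reusing one target hue across all frames of a video is given below; `process_frame` is a hypothetical caller-supplied function standing in for the per-frame weighting and fusion described above, not an API defined in this document.

```python
def apply_color_retention_to_video(frames, target_hue, process_frame, sigma=40.0):
    """Reuse one target hue (e.g. picked by the user on frame N1) for every
    frame, so that frame N1 and any later frame N2 get a consistent
    color-selective retention effect. `process_frame(frame, target_hue, sigma)`
    is expected to implement the per-frame weighting and fusion."""
    return [process_frame(frame, target_hue, sigma) for frame in frames]
```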
Optionally, when the image to be processed is an RGB image, the method may further include, on the basis of the above embodiments: converting the color values of the multiple pixels from red-green-blue RGB values into hue-saturation-lightness HSL values, so that the hue value of each pixel can be determined.
In view of the above drawbacks of the color-selective retention methods in the related art, this application designs a simple and fast way to implement a color-selective retention filter, which can quickly generate a color-selective retention image with only very simple interaction. The user only needs to click, with a mouse or on a touch screen, the color to be retained; the regions of that color are kept in color and all other regions are turned directly into grayscale. The method not only processes still images quickly but is also suitable for real-time processing of high-definition video.
The method first obtains the RGB value of the color selected by the user from the clicked position and converts it from the RGB color space into the HSL color space. The H (hue) value of that color is used as the mean of a Gaussian distribution, the whole hue ring is shifted accordingly, and a lookup table of length 360 is generated. The whole image is then converted from the RGB color space into the HSL color space; for each pixel, its hue value H is used to look up the table to obtain a shifted hue, from which a Gaussian value is computed as the fusion weight α. The fusion weight α is then used to blend each RGB component of the original image with the lightness value L (Lightness), which gives a fairly satisfactory result. Apart from the simple interaction of picking the color, every other step can be handled automatically by the computing device. The method is therefore very easy to operate and highly practical, well suited to being ported to a manual editor on a mobile terminal, and fast enough to meet the needs of real-time high-definition video processing. As shown in Figure 6, it may specifically include the following steps.
Step 601: color picking.
This step is very simple: the user clicks, with a mouse or on a touch screen, the color they want to retain, and the computing device automatically obtains the RGB value of that color from the interaction.
Step 602: generate the hue lookup table.
First, the RGB value of the color picked in step 601 is converted into an HSL value to obtain the hue value H of the picked color.
As shown in Figure 7, the HSL color space can be represented as a roughly conical model: hue varies along the outer circumference of the cylinder, with range [0, 360]; saturation varies with the distance from the center of the cross-section, with range [0, 1]; hue and saturation form planar polar coordinates; and lightness varies with the distance from the cross-section to the bottom and top faces.
The higher the saturation, the more vivid the color; the lower the saturation, the closer the color is to black and white. Assuming the saturation of the colors in the various regions of the original image is roughly uniform, its hue ring is a fairly regular circle, as shown in Figure 8. The goal of the color-retention filter is that, in the processed image, the original saturation is kept at the hue selected by the user while the saturation in all other hue regions is suppressed to 0 or close to 0, with a smooth transition between the maximum and 0, as shown in Figure 9. Specifically, a Gaussian function is well suited to modeling this saturation distribution.
The conversion from RGB values to HSL values may include the following steps 1 to 6.
Step 1: normalize the RGB values into [0, 1].
Step 2: find the maximum value maxcolor and the minimum value mincolor among R, G and B:
maxcolor = MAX(r, g, b);
mincolor = MIN(r, g, b);
Step 3: compute the lightness: L = (maxcolor + mincolor) / 2
Step 4: if the maximum and minimum color values are equal, the color is gray, and S and H are defined as 0.
Step 5: if the maximum and minimum color values are not equal, compute the saturation S from the lightness L:
If L < 0.5, S = (maxcolor - mincolor) / (maxcolor + mincolor)
If L >= 0.5, S = (maxcolor - mincolor) / (2.0 - maxcolor - mincolor)
Step 6: compute the hue H:
If R = maxcolor, H = (G - B) / (maxcolor - mincolor)
If G = maxcolor, H = 2.0 + (B - R) / (maxcolor - mincolor)
If B = maxcolor, H = 4.0 + (R - G) / (maxcolor - mincolor)
H = H * 60
If H < 0, H = H + 360
The hue value range is [0, 360].
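Steps 1 to 6 transcribe directly into the following Python sketch for a single pixel; it follows the formulas above and adds nothing beyond them.

```python
def rgb_to_hsl(r: int, g: int, b: int):
    """Convert one RGB pixel (0-255 per channel) to (H, S, L),
    with H in [0, 360] and S, L in [0, 1], following steps 1-6 above."""
    # Step 1: normalize to [0, 1]
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    # Step 2: maximum and minimum
    maxcolor, mincolor = max(r, g, b), min(r, g, b)
    # Step 3: lightness
    l = (maxcolor + mincolor) / 2.0
    # Step 4: gray pixel -> S and H are defined as 0
    if maxcolor == mincolor:
        return 0.0, 0.0, l
    # Step 5: saturation, depending on lightness
    if l < 0.5:
        s = (maxcolor - mincolor) / (maxcolor + mincolor)
    else:
        s = (maxcolor - mincolor) / (2.0 - maxcolor - mincolor)
    # Step 6: hue
    if maxcolor == r:
        h = (g - b) / (maxcolor - mincolor)
    elif maxcolor == g:
        h = 2.0 + (b - r) / (maxcolor - mincolor)
    else:
        h = 4.0 + (r - g) / (maxcolor - mincolor)
    h *= 60.0
    if h < 0:
        h += 360.0
    return h, s, l
```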
To use the hue value H of the picked color as the mean of the Gaussian distribution, the whole hue ring can be shifted so that the hue value selected by the user is moved to position 180, and a lookup table of length 360 is generated. This can be implemented as follows:
(The formula image in the original is not reproduced here. Based on the surrounding description, the shift can be expressed as hue_tab[i] = (i - H + 180) mod 360 for i = 0, 1, ..., 359, where H is the hue value of the picked color.)
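In code, building such a table might look like the sketch below; it is consistent with the described behaviour (the selected hue maps to 180, opposite hues map toward 0 or 360) but is not a reproduction of the original implementation.

```python
def build_hue_table(target_hue: float) -> list:
    """Build a 360-entry lookup table that shifts the hue ring so the
    user-selected hue lands at position 180; hues on the opposite side
    of the ring map toward 0 or 360."""
    h = int(round(target_hue)) % 360
    return [(i - h + 180) % 360 for i in range(360)]

# Example: if the picked hue is 10, hue 10 maps to 180 and hue 190 maps to 0.
hue_tab = build_hue_table(10)
assert hue_tab[10] == 180 and hue_tab[190] == 0
```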
Step 603: convert the original image from an RGB image into an HSL image and look up the table.
Specifically, the original image may be an RGB image, and the color of every pixel of the original image is converted from the RGB space into the HSL space. Each pixel's hue value H (rounded to an integer) is used to look up the table and obtain the shifted hue value H' of that pixel.
H' = hue_tab[round(H)]
Step 604: generate the fusion weight α.
The fusion weight α can be computed with a Gaussian distribution model, for example by the following formula:
(The formula image in the original is not reproduced here. Based on the surrounding description, the Gaussian weight can be expressed as α = exp(-(H' - 180)² / (2σ²)).)
The standard deviation σ can be used to adjust the range of hues retained. The larger σ is, the wider the retained color range and the more colors close to the picked color are kept; the smaller σ is, the narrower the retained range and the purer and more uniform the retained color.
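Steps 603 and 604 can be combined for a whole image as in the sketch below, assuming the per-pixel hue channel has already been computed (for example with the rgb_to_hsl function above).

```python
import numpy as np

def fusion_weight_map(hue_channel: np.ndarray, hue_tab, sigma: float = 40.0) -> np.ndarray:
    """Look up each pixel's shifted hue H' in the 360-entry table, then
    turn it into a fusion weight alpha with a Gaussian centred at 180.
    hue_channel: H x W array of hue values in [0, 360)."""
    tab = np.asarray(hue_tab, dtype=np.float32)
    h_shifted = tab[np.round(hue_channel).astype(int) % 360]
    return np.exp(-((h_shifted - 180.0) ** 2) / (2.0 * sigma ** 2))
```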
Step 605: fuse the original image with the grayscale image.
Specifically, the RGB image can be fused with the grayscale image L according to the fusion weights. For example, fusing the RGB image with the grayscale image L using the computed fusion weights as in the following formulas yields the color-selective retention image, i.e. the fused image:
R = (L × (1 - α) + R × α) × 255;
G = (L × (1 - α) + G × α) × 255;
B = (L × (1 - α) + B × α) × 255;
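A vectorized sketch of this fusion step is shown below; as the ×255 in the formulas suggests, it assumes the RGB channels and the lightness L are normalized to [0, 1] before blending.

```python
import numpy as np

def fuse(rgb: np.ndarray, lightness: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Blend the original RGB image with its lightness (grayscale) image
    using the per-pixel weight alpha, as in the formulas above.
    rgb: H x W x 3 in [0, 1]; lightness and alpha: H x W in [0, 1].
    Pixels with alpha near 1 keep their color; pixels with alpha near 0
    become gray."""
    a = alpha[..., None]       # broadcast over the three channels
    l = lightness[..., None]
    out = (l * (1.0 - a) + rgb * a) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```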
As can be seen from the formulas of steps 604 and 605, by the nature of the Gaussian distribution, the closer the shifted hue value H' of the pixels in an image region of the original image is to 180, i.e. the closer their pre-shift hue value H is to the hue of the color picked by the user, the larger the fusion weight α given to the image to be processed, and the more of that region's color is retained after fusion. Conversely, the closer a region's shifted hue value H' is to 0 or 360, i.e. the further its pre-shift hue H departs from the picked color's hue, the larger the weight (1 - α) given to the grayscale image, the more of that region's color is discarded after fusion, and the closer the result is to a black-and-white image.
As shown in Figure 10, suppose the user clicks the skirt region of the person in the image, i.e. the mouse is positioned over the skirt, and the skirt is red; the picked color is then red, and after processing through steps 601 to 605 the red regions of the image are retained in color while the other regions are converted to grayscale.
The method is very flexible: the user can pick any color in the image to retain. For example, as shown in Figure 11, suppose the user clicks the sky region, i.e. the mouse is positioned over the sky, and the sky is blue; the picked color is then blue, and after steps 601 to 605 the blue regions are retained in color while the other regions are converted to grayscale.
It can be seen that, compared with the processing methods of the related art, the method provided by this embodiment is simpler to interact with, easier to implement, and more flexible. It can conveniently be ported to a manual-editor app on a mobile terminal, allowing users to apply color-selective retention to images and videos anytime, anywhere and as they wish. In addition, the method is fast: on a PC with an Intel i7 2.7 GHz CPU and 16 GB 2133 MHz LPDDR3 memory, without GPU acceleration, processing a 1080x720 image takes only 78 ms; with GPU acceleration, real-time processing of high-definition video can be achieved.
Figure 12 is a schematic structural diagram of an image processing apparatus provided by an embodiment of this application. As shown in Figure 12, the apparatus 1200 may include a processor 1201 and a memory 1202.
The memory 1202 is configured to store program code.
The processor 1201 calls the program code and, when the program code is executed, is configured to perform the following operations:
determining the target hue value desired to be retained in the image to be processed;
determining, according to the target hue value, the degree of hue difference between the hue value of each of multiple pixels in the image to be processed and the target hue value;
determining the fusion weight of each of the multiple pixels according to those degrees of hue difference;
fusing, according to the fusion weight of each pixel, the image to be processed with the to-be-fused image corresponding to the image to be processed, to obtain the fused image.
In some embodiments, the processor 1201 may further be configured to perform the following operations:
obtaining a color-picking operation input by the user on a target region of the image to be processed to generate a color-picking instruction, the color-picking instruction being used to process the image to be processed so as to control the color of the target region;
processing the image to be processed according to the color-picking operation to generate a processed image, in which the color of the target region is the target color.
The image processing apparatus provided by this embodiment can be used to execute the technical solutions of the foregoing method embodiments; its implementation principles and technical effects are similar to those of the method embodiments and are not repeated here.
Those of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be completed by hardware related to program instructions. The aforementioned program can be stored in a computer-readable storage medium; when executed, it performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some or all of their technical features can be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (58)

  1. 一种图像处理方法,其特征在于,包括:
    确定待处理图像中期望保留的目标色相值;
    根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度;
    根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重;
    根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度,包括:
    根据所述目标色相值以及预设色相值,对各像素点的色相值进行偏移,获得各所述像素点的偏移后色相值;所述预设色相值为所述目标色相值的偏移后色相值;
    将各像素点的偏移后色相值与所述预设色相值的差异程度,作为各所述像素点的色相值与所述目标色相值的色相差异程度。
  3. 根据权利要求2所述的方法,其特征在于,所述根据所述目标色相值以及预设色相值,确定各像素点的偏移后色相值,包括:
    将色相环中所述目标色相值偏移至所述预设色相值,得到偏移前的所述色相环与偏移后的所述色相环之间色相值的偏移关系;
    根据所述偏移关系以及各像素点的色相值,确定各所述像素点的偏移后色相值。
  4. 根据权利要求2所述的方法,其特征在于,所述预设色相值为180。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重,包括:
    根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,基于权重策略,确定所述多个像素点中各像素点的融合权重,所述权重策略包括:融合权重与色相差异程度负相关。
  6. 根据权利要求5所述的方法,其特征在于,融合权重与色相差异程度的关系满足正态分布。
  7. 根据权利要求6所述的方法,其特征在于,所述正态分布的标准差为预设值,或者,所述正态分布的标准差由用户设置。
  8. 根据权利要求1-4任一项所述的方法,其特征在于,所述根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重,包括:
    在各像素点的色相值与所述目标色相值的色相差异程度小于第一阈值时,确定各所述像素点的融合权重为第一权重。
  9. 根据权利要求8所述的方法,其特征在于,所述方法还包括:在各像素点的色相值与所述目标色相值的色相差异程度大于所述第一阈值时,确定各所述像素点的融合权重为第二权重,所述第二权重小于所述第一权重。
  10. 根据权利要求8所述的方法,其特征在于,所述方法还包括:在各像素点的色相值与所述目标色相值的色相差异程度大于所述第一阈值且小于第二阈值时,确定各所述像素点的融合权重为第三权重;
    在各像素点的色相值与所述目标色相值的色相差异程度大于第二阈值时,确定各所述像素点的融合权重为第二权重;
    其中,所述第一阈值小于所述第二阈值,所述第三权重大于所述第二权重。
  11. 根据权利要求8-10任一项所述的方法,其特征在于,所述第一权重等于1。
  12. 根据权利要求9或10所述的方法,其特征在于,所述第二权重等于0。
  13. 根据权利要求1-12任一项所述的方法,其特征在于,所述确定待处理图像中期望保留的目标色相值,包括:
    获取用户输入的颜色选取操作,以生成颜色选取指令,所述颜色选取指令用于指示期望保留所述待处理图像中的目标颜色;
    根据所述颜色选取操作,确定所述目标色相值。
  14. 根据权利要求13所述的方法,其特征在于,所述待处理图像包括显示界面中当前显示的图像;
    所述颜色选取操作包括所述显示界面中目标位置输入的点击操作,所述目标位置包括所述目标像素在所述显示界面中的位置,所述目标像素为颜色为所述目标颜色的图像像素。
  15. 根据权利要求14所述的方法,其特征在于,所述获取用户输入的颜色选取操作之后还包括:
    在所述显示界面中标识所述目标像素在所述显示界面中的位置。
  16. 根据权利要求13-15任一项所述的方法,其特征在于,所述获取用户输入的颜色选取操作之前,还包括:
    获取所述用户输入的滤镜选择操作,以生成滤镜选择指令,所述滤镜选择指令用于选择颜色保留滤镜;
    根据所述滤镜选择操作,开启颜色选取入口,所述颜色选取入口用于获取用户输入的颜色选取操作。
  17. 根据权利要求16所述的方法,其特征在于,所述开启颜色选取入口之后,还包括:
    输出提示信息,所述提示信息用于指示允许输入颜色选取操作。
  18. 根据权利要求1-17任一项所述的方法,其特征在于,所述方法还包括:
    将所述多个像素点的颜色值由红绿蓝RGB值转换为色相饱和度明度HSL值。
  19. 根据权利要求1-17任一项所述的方法,其特征在于,所述多个像素点为所述待处理图像的全部像素点,或者,所述多个像素点为所述待处理图像中目标区域中的像素点。
  20. 根据权利要求19所述的方法,其特征在于,所述目标区域由用户设置。
  21. 根据权利要求1-17任一项所述的方法,其特征在于,所述待处理图像包括视频的第N1帧视频图像,N1为正整数。
  22. 根据权利要求21所述的方法,其特征在于,所述方法还包括:
    根据所述视频的第N2帧视频图像中多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述第N2帧视频图像中多个像素点中各像素点的融合权重,N2为正整数。
  23. 根据权利要求22所述的方法,其特征在于,N2大于N1。
  24. 根据权利要求1-17任一项所述的方法,其特征在于,所述待融合图像包括下述中的任意一种:
    预设图像、所述待处理图像对应的灰度图像、所述待处理图像对应的黑白图像。
  25. 根据权利要求24所述的方法,其特征在于,包括:
    根据所述待处理图像,确定所述待处理图像对应的灰度图像;或者,
    根据所述待处理图像,确定所述待处理图像对应的黑白图像。
  26. 根据权利要求25所述的方法,其特征在于,包括:
    将RGB色彩空间的所述待处理图像处理成YUV色彩空间的待处理图像,并提取所述YUV色彩空间的待处理图像中的Y分量,以生成灰度图像。
  27. 根据权利要求1所述的方法,其特征在于,包括:所述待处理图像为彩色图像。
  28. 一种图像处理方法,其特征在于,包括:
    获取用户对待处理图像中的目标区域输入的颜色选取操作,以生成颜色选取指令,所述颜色选取指令用于指示期望保留所述待处理图像中的目标颜色;
    根据所述颜色选取操作,对所述待处理图像进行处理,以生成处理后的图像,其中,所述处理后的图像中保留所述目标颜色。
  29. 一种图像处理装置,其特征在于,包括:处理器和存储器;
    所述存储器,用于存储程序代码;
    所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
    确定待处理图像中期望保留的目标色相值;
    根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度;
    根据所述多个像素点的色相值分别与目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重;
    根据各像素点的融合权重,将所述待处理图像跟与所述待处理图像相对应的待融合图像进行融合,得到融合后的图像。
  30. 根据权利要求29所述的装置,其特征在于,所述处理器用于根据所述目标色相值,确定所述待处理图像中多个像素点的色相值分别与所述目标色相值的色相差异程度,具体包括:
    根据所述目标色相值以及预设色相值,对各像素点的色相值进行偏移,获得各所述像素点的偏移后色相值;所述预设色相值为所述目标色相值的偏移后色相值;
    将各像素点的偏移后色相值与所述预设色相值的差异程度,作为各所述像素点的色相值与所述目标色相值的色相差异程度。
  31. 根据权利要求29所述的装置,其特征在于,所述处理器用于根据所述目标色相值以及预设色相值,确定各像素点的偏移后色相值,具体包括:
    将色相环中所述目标色相值偏移至所述预设色相值,得到偏移前的所述色相环与偏移后的所述色相环之间色相值的偏移关系;
    根据所述偏移关系以及各像素点的色相值,确定各所述像素点的偏移后色相值。
  32. 根据权利要求29所述的装置,其特征在于,所述预设色相值为180。
  33. 根据权利要求28-31任一项所述的装置,其特征在于,所述处理器用于根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重,具体包括:
    根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,基于权重策略,确定所述多个像素点中各像素点的融合权重,所述权重策略包括:融合权重与色相差异程度负相关。
  34. 根据权利要求32所述的装置,其特征在于,融合权重与色相差异程度的关系满足正态分布。
  35. 根据权利要求33所述的装置,其特征在于,所述正态分布的标准差为预设值,或者,所述正态分布的标准差由用户设置。
  36. 根据权利要求28-31任一项所述的装置,其特征在于,所述处理器用于根据所述多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述多个像素点中各像素点的融合权重,具体包括:
    在各像素点的色相值与所述目标色相值的色相差异程度小于第一阈值时,确定各所述像素点的融合权重为第一权重。
  37. 根据权利要求35所述的装置,其特征在于,所述处理器还用于:在各像素点的色相值与所述目标色相值的色相差异程度大于所述第一阈值时,确定各所述像素点的融合权重为第二权重,所述第二权重小于所述第一权重。
  38. 根据权利要求35所述的装置,其特征在于,所述处理器还用于:在各像素点的色相值与所述目标色相值的色相差异程度大于所述第一阈值且小于第二阈值时,确定各所述像素点的融合权重为第三权重;
    在各像素点的色相值与所述目标色相值的色相差异程度大于第二阈值时,确定各所述像素点的融合权重为第二权重;
    其中,所述第一阈值小于所述第二阈值,所述第三权重大于所述第二权重。
  39. 根据权利要求35-37任一项所述的装置,其特征在于,所述第一权重等于1。
  40. 根据权利要求36或37所述的装置,其特征在于,所述第二权重等于0。
  41. 根据权利要求28-39任一项所述的装置,其特征在于,所述处理器用于确定待处理图像中期望保留的目标色相值,具体包括:
    获取用户输入的颜色选取操作,以生成颜色选取指令,所述颜色选取指令用于指示期望保留所述待处理图像中的目标颜色;
    根据所述颜色选取操作,确定所述目标色相值。
  42. 根据权利要求30所述的装置,其特征在于,所述待处理图像包括显示界面中当前显示的图像;
    所述颜色选取操作包括所述显示界面中目标位置输入的点击操作,所述目标位置包括所述目标像素在所述显示界面中的位置,所述目标像素为颜色为所述目标颜色的图像像素。
  43. 根据权利要求41所述的装置,其特征在于,所述处理器还用于:
    在所述显示界面中标识所述目标像素在所述显示界面中的位置。
  44. 根据权利要求40-42任一项所述的装置,其特征在于,所述处理器还用于:
    获取所述用户输入的滤镜选择操作,以生成滤镜选择指令,所述滤镜选择指令用于选择颜色保留滤镜;
    根据所述滤镜选择操作,开启颜色选取入口,所述颜色选取入口用于获取用户输入的颜色选取操作。
  45. 根据权利要求43所述的装置,其特征在于,所述处理器还用于:
    输出提示信息,所述提示信息用于指示允许输入颜色选取操作。
  46. 根据权利要求28-44任一项所述的装置,其特征在于,所述处理器还用于:
    将所述多个像素点的颜色值由红绿蓝RGB值转换为色相饱和度明度HSL值。
  47. 根据权利要求28-44任一项所述的装置,其特征在于,所述多个像素点为所述待处理图像的全部像素点,或者,所述多个像素点为所述待处理图像中目标区域中的像素点。
  48. 根据权利要求46所述的装置,其特征在于,所述目标区域由用户设置。
  49. 根据权利要求28-44任一项所述的装置,其特征在于,所述待处理图像包括视频的第N1帧视频图像,N1为正整数。
  50. 根据权利要求48所述的装置,其特征在于,所述处理器还用于:
    根据所述视频的第N2帧视频图像中多个像素点的色相值分别与所述目标色相值的色相差异程度,确定所述第N2帧视频图像中多个像素点中各像素点的融合权重,N2为正整数。
  51. 根据权利要求49所述的装置,其特征在于,N2大于N1。
  52. 根据权利要求28-44任一项所述的装置,其特征在于,所述待融合图像包括下述中的任意一种:
    预设图像、所述待处理图像对应的灰度图像、所述待处理图像对应的黑白图像。
  53. 根据权利要求50所述的装置,其特征在于,所述处理器还用于:
    根据所述待处理图像,确定所述待处理图像对应的灰度图像;或者,
    根据所述待处理图像,确定所述待处理图像对应的黑白图像。
  54. 根据权利要求52所述的装置,其特征在于,所述处理器具体用于:
    将RGB色彩空间的所述待处理图像处理成YUV色彩空间的待处理图像,并提取所述YUV色彩空间的待处理图像中的Y分量,以生成灰度图像。
  55. 根据权利要求28所述的装置,其特征在于,包括:所述待处理图像为彩色图像。
  56. 一种图像处理装置,其特征在于,包括:处理器和存储器;
    所述存储器,用于存储程序代码;
    所述处理器,调用所述程序代码,当程序代码被执行时,用于执行以下操作:
    获取用户对待处理图像中的目标区域输入的颜色选取操作,以生成颜色选取指令,所述颜色选取指令用于指示期望保留所述待处理图像中的目标颜色;
    根据所述颜色选取操作,对所述待处理图像进行处理,以生成处理后的图像,其中,所述处理后的图像中保留所述目标颜色。
  57. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序包含至少一段代码,所述至少一段代码可由计算机执行,以控制所述计算机执行如权利要求1-27任一项所述的方法。
  58. 一种计算机程序,其特征在于,当所述计算机程序被计算机执行时,用于实现如权利要求1-27任一项所述的方法。
PCT/CN2019/102699 2019-08-27 2019-08-27 图像处理方法及装置 WO2021035505A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980033736.XA CN112204608A (zh) 2019-08-27 2019-08-27 图像处理方法及装置
PCT/CN2019/102699 WO2021035505A1 (zh) 2019-08-27 2019-08-27 图像处理方法及装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/102699 WO2021035505A1 (zh) 2019-08-27 2019-08-27 图像处理方法及装置

Publications (1)

Publication Number Publication Date
WO2021035505A1 true WO2021035505A1 (zh) 2021-03-04

Family

ID=74004560

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/102699 WO2021035505A1 (zh) 2019-08-27 2019-08-27 图像处理方法及装置

Country Status (2)

Country Link
CN (1) CN112204608A (zh)
WO (1) WO2021035505A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112907704B (zh) * 2021-02-04 2024-04-12 浙江大华技术股份有限公司 一种图像融合方法、计算机设备以及装置
CN113191938B (zh) * 2021-04-29 2022-11-15 北京市商汤科技开发有限公司 图像处理方法、装置、电子设备及存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102447913A (zh) * 2011-12-22 2012-05-09 深圳市万兴软件有限公司 一种移色处理方法及系统
CN103824256A (zh) * 2012-11-16 2014-05-28 腾讯科技(深圳)有限公司 一种图像处理方法及装置
CN103618886A (zh) * 2013-12-13 2014-03-05 厦门美图网科技有限公司 一种根据主色调智能脱色的摄像方法
CN106648354A (zh) * 2015-11-02 2017-05-10 奥林巴斯株式会社 图像调整装置
JP2017112543A (ja) * 2015-12-17 2017-06-22 キヤノン株式会社 画像処理装置、画像処理方法、及びプログラム
CN108900766A (zh) * 2018-06-15 2018-11-27 北京华捷艾米科技有限公司 一种全景图像自动增强装置和方法、以及应用该装置的全景相机
CN109741283A (zh) * 2019-01-23 2019-05-10 芜湖明凯医疗器械科技有限公司 一种实现智能滤镜的方法和装置

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763496A (zh) * 2021-03-19 2021-12-07 北京沃东天骏信息技术有限公司 图像着色的方法、装置及计算机可读存储介质
CN113763496B (zh) * 2021-03-19 2024-04-09 北京沃东天骏信息技术有限公司 图像着色的方法、装置及计算机可读存储介质
CN113421340A (zh) * 2021-06-24 2021-09-21 百特利德(大连)科技有限公司 一种点群数据特定目标数据提取的自动建模方法及系统
CN113421340B (zh) * 2021-06-24 2023-12-05 百特利德(大连)科技有限公司 一种点群数据特定目标数据提取的自动建模方法及系统
CN113240760A (zh) * 2021-06-29 2021-08-10 北京市商汤科技开发有限公司 一种图像处理方法、装置、计算机设备和存储介质
CN113240760B (zh) * 2021-06-29 2023-11-24 北京市商汤科技开发有限公司 一种图像处理方法、装置、计算机设备和存储介质
US20230005112A1 (en) * 2021-06-30 2023-01-05 V5 Technologies Co., Ltd. Image matching method
CN116109933A (zh) * 2023-04-13 2023-05-12 山东省土地发展集团有限公司 一种用于废弃矿山生态修复的动态识别方法
CN116452827A (zh) * 2023-06-16 2023-07-18 青岛奥维特智能科技有限公司 基于计算机视觉的油墨印刷表面质量检测方法及系统
CN116452827B (zh) * 2023-06-16 2023-08-15 青岛奥维特智能科技有限公司 基于计算机视觉的油墨印刷表面质量检测方法及系统

Also Published As

Publication number Publication date
CN112204608A (zh) 2021-01-08

Similar Documents

Publication Publication Date Title
WO2021035505A1 (zh) 图像处理方法及装置
CN109639982B (zh) 一种图像降噪方法、装置、存储介质及终端
US11430209B2 (en) Image signal processing method, apparatus, and device
JP4040625B2 (ja) 画像処理装置、プリンタ装置、撮影装置、及びテレビ受像装置
CN104076928B (zh) 一种调整文字显示区域色调的方法
EP3664016B1 (en) Image detection method and apparatus, and terminal
WO2023124722A1 (zh) 图像处理方法、装置、电子设备和计算机可读存储介质
WO2017124909A1 (zh) 拍照装置及方法
CN113132696B (zh) 图像色调映射方法、装置、电子设备和存储介质
US20230079582A1 (en) Image processing method and apparatus, terminal, and storage medium
WO2023016035A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2023016037A1 (zh) 视频处理方法、装置、电子设备和存储介质
WO2022121893A1 (zh) 图像处理方法、装置、计算机设备和存储介质
JP2006013836A (ja) カラー画像のカラー画像データを処理する画像データ処理
US8565523B2 (en) Image content-based color balancing
US11032529B2 (en) Selectively applying color to an image
US20220327749A1 (en) Method and electronic device for processing images
WO2023005850A1 (zh) 图像处理方法及装置、电子设备、存储介质及计算机程序产品
WO2022246945A1 (zh) 一种信息显示方法、装置、ar设备及存储介质
CN113781330A (zh) 图像处理方法、装置及电子系统
CN113409713A (zh) 一种蓝光护眼强度调节方法、装置、介质和设备
CN104090764B (zh) 一种终端
US10887567B2 (en) Camera color image processing
WO2021212977A1 (zh) 图像颜色过滤方法及装置、电子设备、存储介质
WO2023010913A1 (zh) 一种图像处理方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19943586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19943586

Country of ref document: EP

Kind code of ref document: A1