WO2021243955A1 - Dominant color extraction method and apparatus (主色调提取方法及装置) - Google Patents

Dominant color extraction method and apparatus

Info

Publication number: WO2021243955A1
Authority: WIPO (PCT)
Prior art keywords: image, pixel, color value, reference number, pixels
Application number: PCT/CN2020/127558
Other languages: English (en), French (fr)
Inventors: 杨鼎超, 刘易周, 汪洋
Original assignee: 北京达佳互联信息技术有限公司
Application filed by 北京达佳互联信息技术有限公司
Publication of WO2021243955A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image

Definitions

  • the present disclosure relates to the field of computer technology, and in particular to a method for extracting dominant colors.
  • a common image processing operation is extracting the dominant color of an image. Because the pixels of an image differ in color, how to extract the dominant tone of an image has become an urgent problem to solve.
  • the present disclosure provides a method, device, electronic device, and storage medium for extracting dominant tones, which make the extracted dominant tones better match the characteristics of the first image, improve the accuracy of dominant-tone extraction, save processing time, and improve the extraction efficiency of the dominant tone.
  • a method for extracting a dominant tone which is applied to an electronic device, and the method includes:
  • for each pixel in the first image, a first color value of the pixel is obtained in parallel; the first color value is determined based on the distance between the pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of the pixel;
  • the main hue of the first image is determined, and the main hue is the color value of at least one pixel in the second image.
  • obtaining the first color value of each pixel in parallel includes:
  • for any pixel, multiple second color values of the pixel are obtained in parallel; the multiple second color values are the mixed color values of the pixel relative to each other pixel, and one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
  • the acquiring the first image includes:
  • the first image is acquired, and the first image is composed of the first reference number of pixels.
  • the obtaining the first reference number of pixels includes:
  • the first image is created, and the first image includes the color values of the first reference number of pixels.
  • the parallel determination of the color values of the first reference number of pixels includes:
  • the dividing the third image into a first reference number of image regions includes:
  • An image area meeting the first size is divided from the third image.
  • extracting the color value of at least one pixel from the second image includes:
  • the second reference number of pixels is any pixel extracted from each image area in the second reference number of image areas
  • the color value of the second reference number of pixel points is extracted.
  • the dividing the second image into a second reference number of image regions includes:
  • An image area meeting the second size is divided from the second image.
  • the image area of the second reference quantity has a corresponding image area in the first image
  • the determining the main hue of the first image includes:
  • the dominant tone of at least one first image area in the first image is determined; the dominant tone is the color value of a pixel extracted from the second image area that, among the second reference number of image areas, corresponds to the first image area.
  • a main tone extraction device including:
  • the color value obtaining unit is configured to obtain, for each pixel in the first image, the first color value of the pixel in parallel; the first color value is determined based on the distance between the pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of the pixel;
  • a generating unit configured to obtain a second image, the second image being obtained based on the first color value of each pixel
  • the extraction unit is configured to determine the main hue of the first image, where the main hue is the color value of at least one pixel in the second image.
  • the color value obtaining unit includes:
  • the mixing subunit is configured to obtain, for any pixel, multiple second color values of the pixel in parallel; the multiple second color values are the mixed color values of the pixel relative to each other pixel, and one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
  • the determining subunit is configured to obtain a first color value of the pixel point, where the first color value is the sum of the multiple second color values.
  • the image acquisition unit includes:
  • the first division subunit is used to divide the third image into a first reference number of image areas, and each image area has the same size;
  • the processing subunit is configured to obtain the first reference number of pixels in parallel, and the first reference number of pixels are obtained by performing down-sampling processing on each image area in parallel.
  • the processing subunit is used to:
  • the first image is created, and the first image includes the color values of the first reference number of pixels.
  • the processing subunit is used to:
  • the first division subunit is used for:
  • An image area meeting the first size is divided from the third image.
  • the extraction unit includes:
  • a second division subunit configured to divide the second image into a second reference number of image areas, and the second reference number of image areas have the same size
  • the pixel extraction subunit is configured to obtain the second reference number of pixels, where each of those pixels is any one pixel extracted from one of the second reference number of image areas;
  • the color value extraction subunit is used to extract the color value of the second reference number of pixel points.
  • the second division subunit is used for:
  • An image area meeting the second size is divided from the second image.
  • the second reference number of image areas have corresponding image areas in the first image
  • the extraction unit is configured to:
  • the extraction unit is configured to determine the dominant tone of at least one first image area in the first image; the dominant tone is the color value of a pixel extracted from the second image area that, among the second reference number of image areas, corresponds to the first image area.
  • an electronic device including:
  • one or more processors;
  • volatile or non-volatile memory for storing instructions executable by the one or more processors;
  • the one or more processors are configured to execute the dominant-tone extraction method described in the above aspect.
  • a non-transitory computer-readable storage medium: when instructions in the storage medium are executed by a processor of an electronic device, the electronic device can execute the dominant color extraction method.
  • a computer program product: when the instructions in the computer program product are executed by the processor of an electronic device, the electronic device can execute the dominant color extraction method described in the above aspect.
  • the method, device, electronic equipment, and storage medium provided by the embodiments of this application improve the efficiency of image processing; the extracted dominant tones better match the characteristics of the first image, improving the accuracy of dominant-tone extraction, saving processing time, and improving the extraction efficiency of the dominant color.
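  • taken together, the steps summarized above (region-average down-sampling, Gaussian-weighted mixing, per-area color sampling) can be sketched end to end. This is a minimal NumPy sketch, not the patented GPU implementation: the horizontal-band division, the parameter names, and the row-normalized mixing weights are assumptions (the claims describe the first color value as an unnormalized sum of mixed color values).

```python
import numpy as np

def extract_dominant_colors(image, n_regions=5, n_colors=5, k=0.5):
    """Sketch of the pipeline: down-sample by region averages,
    smooth with a Gaussian-weighted mix, then sample one color
    per output region. Parameter names are illustrative."""
    # Step 1: divide the (third) image into n_regions horizontal
    # bands and average each band -> the small "first image".
    bands = np.array_split(image, n_regions, axis=0)
    first = np.stack([b.reshape(-1, 3).mean(axis=0) for b in bands])

    # Step 2: Gaussian-weighted mixing of the small image's pixels
    # (interpolation) -> the "second image". Weights are normalized
    # here so values stay in color range; the claims sum them.
    idx = np.arange(n_regions)
    dist = np.abs(idx[:, None] - idx[None, :]).astype(float)
    w = np.exp(-k * dist ** 2)
    second = (w / w.sum(axis=1, keepdims=True)) @ first

    # Step 3: one sampled color per output region is a dominant tone.
    picks = np.array_split(np.arange(len(second)), n_colors)
    return [second[p[len(p) // 2]] for p in picks]
```

A uniform image yields its own color as every dominant tone, which is a quick sanity check of the pipeline.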
  • Fig. 1 is a flow chart showing a method for extracting a dominant tone according to an exemplary embodiment.
  • Fig. 2 is a flow chart showing a method for extracting a dominant tone according to an exemplary embodiment.
  • Fig. 3 is a schematic diagram showing a method of dividing a third image according to an exemplary embodiment.
  • Fig. 4 is a schematic diagram of obtaining an average value of color feature values of a third image according to an exemplary embodiment.
  • Fig. 5 is a schematic diagram showing a down-sampling process according to an exemplary embodiment.
  • Fig. 6 is a schematic diagram of performing interpolation processing according to an exemplary embodiment.
  • Fig. 7 is a schematic diagram showing a hue extraction according to an exemplary embodiment.
  • Fig. 8 is a schematic structural diagram of a main tone extraction device according to an exemplary embodiment.
  • Fig. 9 is a schematic structural diagram showing another main tone extraction device according to an exemplary embodiment.
  • Fig. 10 is a block diagram showing a terminal according to an exemplary embodiment.
  • first, second, etc. used in the present disclosure can be used herein to describe various concepts, but unless otherwise specified, these concepts are not limited by these terms. These terms are only used to distinguish one concept from another.
  • the first image may be referred to as the second image
  • the second image may be referred to as the first image.
  • the embodiment of the present disclosure provides a method for extracting a dominant tone, which is applied in a variety of scenarios.
  • the method provided by the embodiment of the present disclosure is applied in an image classification scenario: if the terminal needs to classify multiple images, the dominant color of each image can be obtained using the method provided by the embodiment of the present disclosure, and the multiple images can then be classified according to their dominant colors.
  • the method provided by the embodiment of the present disclosure is applied in an image search scenario.
  • the method provided by the embodiment of the present disclosure can be used to obtain the dominant color of each image and to find images whose dominant color matches the dominant color being searched for, yielding the search results.
  • the main tone extraction method provided by the embodiment of the present disclosure is applied to a terminal.
  • the terminal is a variety of types of terminals such as mobile phones, tablet computers, and computers.
  • the terminal includes a GPU (Graphics Processing Unit).
  • the GPU is a processor that performs drawing operations in the terminal; using the GPU to process data in parallel can increase the data processing rate.
  • Fig. 1 is a flow chart showing a method for extracting a dominant tone according to an exemplary embodiment. Referring to Fig. 1, applied to an electronic device, the method includes:
  • the electronic device acquires a first image.
  • the electronic device simultaneously performs the following operation for each pixel in the first image: based on the distance between the reference pixel and the pixel, the color value of the reference pixel, and the color value of the pixel, obtain the mixed color value of the reference pixel, where the reference pixel is any pixel in the first image.
  • the above step 102 can also be expressed as: for each pixel in the first image, obtain the first color value of the pixel in parallel; the first color value is determined based on the distance between the pixel and the multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of the pixel.
  • the electronic device processes the first image based on the mixed color value of each pixel in the first image to generate a second image.
  • the electronic device obtains the second image, and the second image is obtained based on the first color value of each pixel.
  • the electronic device extracts the color value of at least one pixel from the second image, and determines the color value of the at least one pixel as the main hue of the first image.
  • the electronic device determines the main hue of the first image, and the main hue is the color value of at least one pixel in the second image.
  • the method provided by the embodiments of the present disclosure improves the efficiency of image processing; the extracted dominant tone better matches the characteristics of the first image, improving the accuracy of dominant-tone extraction, saving processing time, and improving the efficiency of dominant-tone extraction.
  • the following operations are performed for each pixel at the same time: based on the distance between the reference pixel and the pixel, the color value of the reference pixel, and the color value of the pixel, the mixed color value of the reference pixel is obtained, where the reference pixel is any pixel in the first image, including:
  • the sum of the mixed color value of each pixel relative to the reference pixel is used as the mixed color value of the reference pixel.
  • acquiring the first image includes,
  • the down-sampling process is performed on the image area of the first reference number to obtain the pixel points of the first reference number, and the pixel points of the first reference number are formed into the first image.
  • simultaneously performing down-sampling processing on the image area of the first reference number to obtain the first reference number of pixels, and forming the first image with the first reference number of pixels includes:
  • a first image including the plurality of pixels is created.
  • determining the color value of each image area based on the color value of the pixel points included in each image area at the same time, as the color value of the pixel point corresponding to each image area includes:
  • the average value of the color value of the pixel in each image area is taken as the color value of the pixel corresponding to each image area in the first image.
  • dividing the third image into a first reference number of image regions includes:
  • an image area satisfying the first size is divided from the third image.
  • extracting the color value of at least one pixel from the second image includes:
  • dividing the second image into a second reference number of image regions includes:
  • an image area meeting the second size is divided from the second image.
  • the image area of the second reference quantity has a corresponding image area in the first image, and determining the color value of at least one pixel as the main hue of the first image includes:
  • the color value of each pixel extracted from the second image is used as the dominant color of the target area corresponding to that pixel.
  • the target area corresponding to each pixel is the image area in the first image corresponding to the image area to which that pixel belongs.
  • Fig. 2 is a flow chart showing a method for extracting a dominant tone according to an exemplary embodiment. Referring to Fig. 2, when applied to an electronic device, the method includes:
  • the electronic device divides the third image into a first reference number of image regions, and each image region has the same size.
  • the third image is any image for which the main color needs to be extracted.
  • the third image is a landscape image, a character image, or other types of images, etc.
  • the third image is obtained by shooting, by searching for images posted by other users on a social platform, by searching, or by other methods.
  • a third image is acquired, and the third image is divided into a first reference number of image areas, and the first reference number of image areas have the same size.
  • the third image includes a plurality of pixels, and the plurality of pixels in the third image are evenly distributed; therefore, each of the first reference number of image areas contains the same number of pixels.
  • the first reference quantity is the default quantity of the terminal, or the first reference quantity is set by the user, or the first reference quantity is set in other ways.
  • the first reference number is any integer greater than or equal to 1, for example, the first reference number is 4, 6, 8, and so on.
  • a third image is divided into 5 image regions from top to bottom.
  • At least one of the length or the width of the third image is uniformly divided, so as to divide the third image into a first reference number of image regions.
  • the length of the third image is evenly divided, for example, the third image is divided into 6*1 image regions.
  • the width of the third image is evenly divided, for example, the third image is divided into 1*6 image areas.
  • the length and width of the third image are evenly divided, for example, the third image is divided into 2*8 image regions.
  • the first size of each divided image area is determined based on the size of the third image and the first reference quantity, and image areas meeting the first size are divided from the third image based on the first size.
  • the electronic device determines the first size, and the first size is obtained based on the size of the third image and the first reference quantity; the third image is divided into an image area that meets the first size.
  • the size of the third image is divided equally based on the first reference quantity to obtain the first size.
  • the first size is the size of each image area divided from the third image, so that the third image is divided into uniform image areas, each divided image area having the same size.
  • in the process of dividing the third image, the third image is divided horizontally to obtain the first reference number of regions arranged horizontally, or divided vertically to obtain the first reference number of regions arranged vertically, or divided both horizontally and vertically to obtain the first reference number of regions evenly arranged in the horizontal and vertical directions.
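  • the three division layouts described above (horizontal, vertical, and a grid over both directions) can be sketched as follows; a minimal sketch assuming the image dimensions divide evenly by the requested counts (the function name and signature are illustrative):

```python
import numpy as np

def divide_regions(image, rows, cols):
    """Divide an H x W image into rows*cols equal image areas.
    Assumes H % rows == 0 and W % cols == 0, matching the
    'first size' computed from the image size and region count."""
    h, w = image.shape[0] // rows, image.shape[1] // cols
    return [image[r * h:(r + 1) * h, c * w:(c + 1) * w]
            for r in range(rows) for c in range(cols)]
```

The 6*1, 1*6, and 2*8 divisions in the text correspond to `divide_regions(img, 6, 1)`, `divide_regions(img, 1, 6)`, and `divide_regions(img, 2, 8)`.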
  • the electronic device performs down-sampling processing on the image area of the first reference number at the same time to obtain the pixel points of the first reference number, and the pixel points of the first reference number form the first image.
  • the above-mentioned step 202 can also be expressed as: obtain a first reference number of pixels in parallel, where the first reference number of pixels are obtained by performing down-sampling processing on each image area in parallel.
  • the down-sampling process is used to reduce the number of pixels in the image and create a new image with the reduced number of pixels.
  • the third image includes 100 pixels
  • the third image is down-sampled to obtain a new image including 10 pixels
  • the size of the first image is smaller than the size of the third image.
  • each image area is merged into one pixel at the same time to obtain a first reference number of pixels, and the first reference number of pixels form a first image. That is, the first image is acquired, and the first image is composed of a first reference number of pixels.
  • the 5 image regions are down-sampled to obtain 5 pixels, and these 5 pixels form the first image.
  • the pixel corresponding to each image area is added to the first image to form the first image.
  • the color value of each image area is determined in parallel based on the color values of the pixels included in that image area, as the color value of the pixel corresponding to that image area, and the first image containing the multiple pixels is created based on the determined color values of the multiple pixels. That is, the color values of the first reference number of pixels are determined in parallel, each determined based on the color values of the pixels included in the image area corresponding to that pixel; the first image is then created, and the first image contains the color values of the first reference number of pixels.
  • the third image includes the first reference number of image areas, and each image area corresponds to one pixel in the first image; the first image therefore includes the first reference number of pixels.
  • the color value is used to represent the color of the pixel.
  • the color value may be an RGB (Red Green Blue) value, or the color value may be a pixel value, or the color value may be other types of values, and so on.
  • the average value of the color values of the pixels in each image area is obtained and taken as the color value of the pixel corresponding to that image area in the first image. That is, determining the color values of the first reference number of pixels in parallel includes: obtaining the average value of the color values of the pixels in each image area in parallel; and obtaining the color values of the first reference number of pixels, where each color value is the average of the color values of the pixels in the image area corresponding to that pixel.
  • the color average value is obtained by the formula C_i = (1/S_i) · Σ_{(u,v)∈D_i} C(u,v), where C_i represents the average of the color values of the pixels in the i-th image area, S_i represents the area (pixel count) of the i-th image area, C(u,v) represents the color value of the pixel at coordinates (u,v), and D_i represents the set of pixels in the i-th image area.
  • i is any value greater than or equal to 1
  • u and v are any value.
  • the coordinates (u, v) are determined based on the image coordinate system of the image area. For example, the lower left corner of the third image is used as the origin of the coordinate system, and the coordinates of a pixel are determined based on its position in the third image; alternatively, any coordinate is determined, and the color value of the pixel at that coordinate is obtained.
  • the first image area includes 2 pixels whose color values are (20, 60, 30) and (60, 80, 20); using the above formula, the average of the color values of the pixels in the first image area is (40, 70, 25).
  • the average value of the color values of the pixels in each image area is obtained.
  • the first image includes the color values of 5 pixels.
  • the color values of the 4 image regions, obtained in order from top to bottom, are (20, 40, 20), (30, 20, 30), (50, 50, 60), and (100, 20, 20); the determined color values of the pixels of the first image are therefore (20, 40, 20), (30, 20, 30), (50, 50, 60), and (100, 20, 20).
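  • the averaging step, including the two-pixel example above, can be sketched as follows (a minimal NumPy sketch; function names are illustrative and the regions are passed as plain pixel lists):

```python
import numpy as np

def region_average(pixels):
    """Color value of an image area = mean of its pixels' color
    values (the C_i formula above)."""
    return np.mean(np.asarray(pixels, dtype=float), axis=0)

def downsample(regions):
    """First image: one averaged pixel per image area."""
    return np.stack([region_average(r) for r in regions])

# Two-pixel example from the text:
avg = region_average([(20, 60, 30), (60, 80, 20)])
# avg is (40.0, 70.0, 25.0)
```

Applying `downsample` to the four single-pixel regions listed above reproduces the four color values of the first image unchanged.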
  • the resolution of the third image is M*N
  • if the steps in the related art are used for down-sampling, the time complexity is O(M*N); if steps 201-202 provided by the embodiment of the present disclosure are adopted, the third image is reduced to the first image with a time complexity of O(1), so the processing speed is increased by a factor of M*N and the efficiency of down-sampling the image is improved.
  • the color values of the pixels corresponding to the determined image area are more uniform, which can improve the accuracy of the determined color values of the pixels.
  • the first image is obtained, which improves the efficiency of extracting the main tones of the image and reduces the amount of processed data.
  • steps 201-202 are not performed, but the first image is directly obtained.
  • the first image is any image whose dominant color needs to be extracted.
  • the method of obtaining is similar to the method of obtaining the third image in step 201 above.
  • the first image in the embodiment of the present disclosure is a landscape image, a person image, or other types of images, and so on.
  • the first image is obtained by shooting, by searching for images posted by other users on a social platform, by searching, or by other means.
  • steps 203-205 can be executed subsequently to directly acquire the main tone of the first image.
  • the electronic device simultaneously performs the following operations for each pixel: based on the distance between the reference pixel and the pixel, the color value of the reference pixel, and the color value of the pixel, obtaining the mixed color value of the reference pixel.
  • the above step 203 can also be expressed as: for each pixel in the first image, the first color value of the pixel is obtained in parallel; the first color value is determined based on the distance between the pixel and the multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of the pixel.
  • the reference pixel is any pixel in the first image.
  • the following operations are performed for each pixel at the same time: based on the distance between the reference pixel and the pixel, the color value of the pixel is mixed with the color value of the reference pixel, and the mixed color value is used as the pixel The mixed color value of the point relative to the reference pixel; the sum of the mixed color value of each pixel relative to the reference pixel is used as the mixed color value of the reference pixel.
  • the multiple second color values of the pixel are obtained in parallel; the multiple second color values are the mixed color values of the pixel relative to each other pixel, and one second color value is determined based on the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel; the first color value of the pixel is then obtained, the first color value being the sum of the multiple second color values.
  • the following formula is used to obtain the mixed color value of each pixel with respect to the reference pixel: y = A · e^(−k·x²), where y is the mixed color value of the pixel with respect to the reference pixel, A is the color value of the pixel, k is a fixed value, and x is the coordinate value of the pixel relative to the reference pixel (their distance).
  • the operation performed in step 203 in the embodiment of the present disclosure is a process of performing interpolation on the image; that is, "for each pixel, perform the following operation at the same time: based on the distance between the reference pixel and the pixel, the color value of the reference pixel, and the color value of the pixel, obtain the mixed color value of the reference pixel" amounts to interpolating the color value of each pixel.
  • a second image is obtained after the interpolation processing.
  • the color value of each pixel in the second image is smoother, which can improve the accuracy of the main color extracted subsequently.
  • processing each pixel in the first image at the same time can save processing time and improve processing efficiency.
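  • a minimal sketch of this per-pixel mixing, assuming the Gaussian weight form y = A·e^(−k·x²) (the source names the symbols y, A, k, and x but the formula image itself was not preserved, so the exact form is an assumption):

```python
import numpy as np

def mixed_color_values(colors, coords, k=0.5):
    """For every reference pixel, mix every pixel's color value A
    with weight exp(-k * d^2), d being the distance between the
    two pixels, then sum the mixed values to get each reference
    pixel's 'first color value'."""
    colors = np.asarray(colors, dtype=float)
    coords = np.asarray(coords, dtype=float)
    # Squared pairwise distances between all pixel coordinates.
    d2 = np.sum((coords[:, None, :] - coords[None, :, :]) ** 2, axis=-1)
    w = np.exp(-k * d2)      # Gaussian weight of each pixel pair
    return w @ colors        # row i = sum of mixed values for pixel i
```

As in the claims, the result is an unnormalized sum; in practice dividing each row of `w` by its sum keeps the values in the normal color range.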
  • steps 202-203 in the embodiments of the present disclosure use the GPU to perform down-sampling processing on the first reference number of image areas in parallel to obtain the first reference number of pixels.
  • the first image is formed, and then based on the distance between the reference pixel and the pixel, the color value of the reference pixel, and the color value of the pixel, the mixed color value of the reference pixel is obtained in parallel.
  • the GPU is used to perform processing in parallel to achieve the effect of simultaneously performing down-sampling for multiple image regions in step 202 and the effect of simultaneously obtaining the color values of the reference pixels after mixing in step 203.
  • without parallel processing, the processing time complexity is O(M*N*C);
  • with the GPU processing in parallel, the processing time complexity is O(C).
  • the electronic device processes the first image based on the mixed color value of each pixel in the first image to generate a second image.
  • the embodiment of the present disclosure describes the interpolation processing of the first image using only the Gaussian function as an example.
  • a linear interpolation method is used to perform interpolation processing on the first image to obtain an interpolated second image.
  • the electronic device extracts the color value of at least one pixel from the second image, and determines the color value of the at least one pixel as the main hue of the first image.
  • the second image includes a plurality of pixels, and when determining the main hue of the first image, the color value of at least one pixel is extracted from the second image as the main hue of the first image.
  • the second image is divided into a second reference number of image areas; any one pixel is extracted from each of the second reference number of image areas to obtain the second reference number of pixels, and the color values of the second reference number of pixels are determined as the dominant tone of the first image.
  • the second reference number of image areas have the same size. That is, the second image is divided into the second reference number of image areas of the same size; the second reference number of pixels are obtained, each being any one pixel extracted from one of the second reference number of image areas; and the color values of the second reference number of pixels are extracted.
  • since the second image is an image obtained by processing the first image, the second image is divided into a second reference number of image areas, and each such image area can represent the dominant tone of the corresponding area in the first image; any pixel is then extracted from each of the second reference number of image areas to obtain the second reference number of pixels, the color values of the second reference number of pixels are extracted, and these color values are determined as the dominant tone of the first image.
  • the center pixel of each image area is extracted to obtain the second reference number of pixels, and the color values of the second reference number of pixels are determined as the dominant tone of the first image.
  • the size of the image area of the second reference quantity is the same.
  • the second reference quantity is the default value of the terminal, or the second reference quantity is set by the user, or it can also be set in other ways.
  • the second reference number can be 5, 6, 7, or other numerical values.
  • the center pixel is the pixel located in the center of the image area.
  • the fourth pixel is the center pixel of the image area.
  • the second image is divided into 5 image areas from top to bottom, all of the same size; the center pixel of each image area is then determined according to the areas' size, and the colors corresponding to the color values of the 5 center pixels are the dominant colors of the first image.
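  • the center-pixel extraction described above can be sketched as follows, assuming top-to-bottom division into equal horizontal strips (the function name is illustrative):

```python
import numpy as np

def dominant_tones(second_image, n):
    """Divide the second image into n equal horizontal areas and
    take the color value of each area's center pixel."""
    strips = np.array_split(second_image, n, axis=0)
    return [s[s.shape[0] // 2, s.shape[1] // 2] for s in strips]
```

For n = 5 this returns the 5 center-pixel color values that the text identifies as the dominant colors of the first image.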
  • the second size of the divided image area is determined based on the size of the second image and the second reference number, and based on the second size, the image area that meets the second size is divided from the second image.
  • This embodiment can also be expressed as: determining the second size, which is obtained based on the size of the second image and the second reference quantity; and dividing the image area that meets the second size from the second image.
  • the size of the second image is divided equally based on the second reference quantity to obtain the second size, that is, the size of each image area divided from the second image. The second image is thus divided into uniform image areas of equal size, so dividing the second image by the second size yields a second reference number of image areas.
  • each of the second reference number of image areas in the second image has a corresponding image area in the first image. Therefore, the color value of each pixel extracted from the second image is used as the main color of the target area corresponding to that pixel, where the target area is the image area in the first image that corresponds to the image area to which the pixel belongs. In other words, the main hue of at least one first image area in the first image is determined, the main hue being the color value of any pixel extracted from the second image area, and the second image area being the image area, among the second reference number of image areas, that corresponds to the first image area.
  • steps 201-202 are performed to perform down-sampling processing on the third image to obtain the first image
  • steps 203-204 are performed to perform interpolation processing on the first image to obtain the second image.
  • step 205 is executed to extract 5 pixels from the second image, and the color values of these 5 pixels are the main hue of the first image.
  • the first image is an image after down-sampling processing of the third image
  • the determined main color of the first image may also be used as the main color of the third image.
  • if steps 201-202 are not executed, the main color of the first image is obtained directly.
  • step 205 in the embodiment of the present disclosure simultaneously extracts the pixel value of at least one pixel from the second image.
  • step 205 in the embodiment of the present disclosure extracts the color value of at least one pixel in the second image in parallel through the GPU, and then uses the color corresponding to the color value of the at least one pixel as the main color tone of the first image.
  • in the related art, the target image is obtained by the CPU, each pixel in the target image is traversed, the color feature value of each pixel is extracted, the color feature value shared by the most pixels is then obtained based on the color feature values of all pixels, and the color corresponding to that color feature value is used as the main hue of the target image.
  • the above method needs to traverse each pixel in the target image, which requires a long processing time and low efficiency in extracting the dominant tone.
  • the method provided by the embodiments of the present application improves the efficiency of image processing; the extracted main tones better match the characteristics of the first image, the accuracy of main tone extraction is improved, processing time is saved, and the efficiency of main tone extraction is improved.
  • the color values of the pixels corresponding to the determined image area are more uniform, which improves the accuracy of the color values of the determined pixels.
  • the amount of processed data is reduced, and the efficiency of extracting the main tone of the image is improved.
  • Fig. 8 is a schematic structural diagram of a main tone extraction device according to an exemplary embodiment. Referring to Figure 8, the device includes:
  • the image acquisition unit 801 is configured to acquire a first image
  • the color value obtaining unit 802 is configured to obtain, in parallel for each pixel in the first image, the first color value of each pixel, the first color value being determined based on the distance between each pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of each pixel;
  • the generating unit 803 is configured to obtain a second image, which is obtained based on the first color value of each pixel;
  • the extraction unit 804 is configured to determine the main hue of the first image, where the main hue is the color value of at least one pixel in the second image.
  • the color value obtaining unit 802 includes:
  • the mixing subunit 8021 is used to obtain, in parallel for any pixel, multiple second color values of the pixel, the multiple second color values being the mixed color values of the pixel relative to each other pixel, where one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
  • the determining subunit 8022 is used to obtain the first color value of the pixel, the first color value being the sum of the multiple second color values.
  • the image acquisition unit 801 includes:
  • the first division subunit 8011 is configured to divide the third image into a first reference number of image regions, each of which has the same size;
  • the processing subunit 8012 is configured to obtain a first reference number of pixels in parallel, and the first reference number of pixels are obtained by down-sampling each image area in parallel.
  • the processing subunit 8012 is configured to:
  • a first image including the plurality of pixels is created.
  • processing subunit 8012 is used to:
  • the first image contains the color values of the first reference number of pixels.
  • processing subunit 8012 is used to:
  • the color value of the first reference number of pixel points is acquired, and the color value is the average value of the color value of the pixel points in the image area corresponding to the pixel point.
  • the first division subunit 8011 is used to:
  • the third image is divided into an image area that meets the first size.
  • the extraction unit 804 includes:
  • the second division subunit 8041 is configured to divide the second image into a second reference number of image areas, and the second reference number of image areas have the same size;
  • the pixel extraction subunit 8042 is configured to acquire a second reference number of pixels, where the second reference number of pixels is any pixel extracted from each image area in the second reference number of image areas;
  • the color value extraction subunit 8043 is used to extract the color value of the second reference number of pixel points.
  • the second division subunit 8041 is used to:
  • the second image is divided into an image area that meets the second size.
  • the second reference number of image areas have corresponding image areas in the first image
  • the extracting unit 804 is configured to:
  • the main hue is the color value of any pixel extracted from the second image area, and the second image area is the image area, among the second reference number of image areas, that corresponds to the first image area.
  • Fig. 10 is a block diagram showing an electronic device, such as a terminal, according to an exemplary embodiment.
  • the terminal 1000 is a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, or a desktop computer.
  • the terminal 1000 may also be called user equipment, portable terminal, laptop terminal, desktop terminal, or other names.
  • the terminal 1000 includes: one or more processors 1001 and one or more memories 1002.
  • the processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1001 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1001 also includes a main processor and a co-processor.
  • the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the co-processor is a low-power processor used to process data in the standby state.
  • the processor 1001 is integrated with a GPU (Graphics Processing Unit), which is used to render and draw content that needs to be displayed on the display screen.
  • the processor 1001 further includes an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1002 includes one or more computer-readable storage media.
  • the computer-readable storage medium is non-transitory.
  • the memory 1002 also includes volatile memory or non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1002 is used to store at least one instruction, and the at least one instruction is used by the processor 1001 to implement the main tone provided by the method embodiments of the present disclosure. method of extraction.
  • the terminal 1000 further includes: a peripheral device interface 1003 and at least one peripheral device.
  • the processor 1001, the memory 1002, and the peripheral device interface 1003 are connected by a bus or signal line.
  • Each peripheral device is connected to the peripheral device interface 1003 through a bus, a signal line or a circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1004, a display screen 1005, a camera component 1006, an audio circuit 1007, a positioning component 1008, or a power supply 1009.
  • the peripheral device interface 1003 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1001 and the memory 1002.
  • the processor 1001, the memory 1002, and the peripheral device interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral device interface 1003 are implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1004 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1004 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1004 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on.
  • the radio frequency circuit 1004 communicates with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks and/or WiFi (Wireless Fidelity, wireless fidelity) networks.
  • the radio frequency circuit 1004 further includes a circuit related to NFC (Near Field Communication), which is not limited in the present disclosure.
  • the display screen 1005 is used to display UI (User Interface).
  • the UI includes graphics, text, icons, videos, and any combination of them.
  • the display screen 1005 also has the ability to collect touch signals on or above the surface of the display screen 1005.
  • the touch signal is input to the processor 1001 as a control signal for processing.
  • the display screen 1005 is also used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • in some embodiments there is one display screen 1005, arranged on the front panel of the terminal 1000; in other embodiments there are at least two display screens 1005, respectively arranged on different surfaces of the terminal 1000 or in a folded design; in still other embodiments, the display screen 1005 is a flexible display screen disposed on a curved or folding surface of the terminal 1000. Furthermore, the display screen 1005 may also be set to a non-rectangular irregular pattern, that is, a special-shaped screen.
  • the display screen 1005 is made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera assembly 1006 is used to collect images or videos.
  • the camera assembly 1006 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera, or a telephoto camera, so as to fuse the main camera and the depth-of-field camera to realize the background blur function, or fuse the main camera and the wide-angle camera to realize panoramic shooting, VR (Virtual Reality) shooting functions, or other fusion shooting functions.
  • the camera assembly 1006 also includes a flash.
  • the flash is a single-color temperature flash, or a dual-color temperature flash. Dual color temperature flash refers to a combination of warm light flash and cold light flash, used for light compensation under different color temperatures.
  • the audio circuit 1007 includes a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1001 for processing, or input them to the radio frequency circuit 1004 to implement voice communication. For the purpose of stereo collection or noise reduction, there may be multiple microphones, respectively set in different parts of the terminal 1000.
  • the microphone can also be an array microphone or an omnidirectional collection microphone.
  • the speaker is used to convert the electrical signal from the processor 1001 or the radio frequency circuit 1004 into sound waves.
  • the speakers are traditional thin-film speakers or piezoelectric ceramic speakers.
  • the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into human audible sound waves, but also convert electrical signals into human inaudible sound waves for purposes such as distance measurement.
  • the audio circuit 1007 also includes a headphone jack.
  • the positioning component 1008 is used to locate the current geographic location of the terminal 1000 to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1008 is a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1009 is used to supply power to various components in the terminal 1000.
  • the power source 1009 is alternating current, direct current, disposable batteries or rechargeable batteries. If the power source 1009 includes a rechargeable battery, the rechargeable battery supports wired charging or wireless charging. The rechargeable battery is also used to support fast charging technology.
  • the terminal 1000 further includes one or more sensors 1011.
  • the one or more sensors 1011 include, but are not limited to: an acceleration sensor 1011, a gyroscope sensor 1012, a pressure sensor 1013, a fingerprint sensor 1014, an optical sensor 1015, and a proximity sensor 1016.
  • the acceleration sensor 1011 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1000. For example, the acceleration sensor 1011 is used to detect the components of gravitational acceleration on three coordinate axes.
  • the processor 1001 may control the display screen 1005 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1011.
  • the acceleration sensor 1011 is also used for the collection of game or user motion data.
  • the gyroscope sensor 1012 detects the body direction and rotation angle of the terminal 1000, and the gyroscope sensor 1012 and the acceleration sensor 1011 cooperate to collect the user's 3D actions on the terminal 1000.
  • the processor 1001 implements the following functions according to the data collected by the gyroscope sensor 1012: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1013 is arranged on the side frame of the terminal 1000 and/or the lower layer of the display screen 1005. If the pressure sensor 1013 is arranged on the side frame of the terminal 1000, the user's holding signal of the terminal 1000 is detected, and the processor 1001 performs left and right hand recognition or quick operation according to the holding signal collected by the pressure sensor 1013. If the pressure sensor 1013 is arranged on the lower layer of the display screen 1005, the processor 1001 operates according to the user's pressure on the display screen 1005 to control the operability controls on the UI interface.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, or a menu control.
  • the fingerprint sensor 1014 is used to collect the user's fingerprint.
  • the processor 1001 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user's identity according to the collected fingerprint.
  • the processor 1001 authorizes the user to have related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings.
  • the fingerprint sensor 1014 may be provided on the front, back or side of the terminal 1000. When a physical button or manufacturer logo is provided on the terminal 1000, the fingerprint sensor 1014 can be integrated with the physical button or manufacturer logo.
  • the optical sensor 1015 is used to collect the ambient light intensity.
  • the processor 1001 controls the display brightness of the display screen 1005 according to the ambient light intensity collected by the optical sensor 1015. If the ambient light intensity is high, increase the display brightness of the display screen 1005; if the ambient light intensity is low, decrease the display brightness of the display screen 1005.
  • the processor 1001 also dynamically adjusts the shooting parameters of the camera assembly 1006 according to the ambient light intensity collected by the optical sensor 1015.
  • the proximity sensor 1016, also called a distance sensor, is arranged on the front panel of the terminal 1000.
  • the proximity sensor 1016 is used to collect the distance between the user and the front of the terminal 1000. In one embodiment, if the proximity sensor 1016 detects that the distance between the user and the front of the terminal 1000 gradually decreases, the processor 1001 controls the display screen 1005 to switch from the bright-screen state to the off-screen state; if the proximity sensor 1016 detects that the distance between the user and the front of the terminal 1000 gradually increases, the processor 1001 controls the display screen 1005 to switch from the off-screen state to the bright-screen state.
  • the structure shown in FIG. 10 does not constitute a limitation on the terminal 1000, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
  • an electronic device is provided, including: one or more processors; and a volatile or non-volatile memory for storing commands executable by the one or more processors; wherein the one or more processors are configured to perform the following steps:
  • the first color value of each pixel is obtained in parallel.
  • the first color value is determined based on the distance between each pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of each pixel;
  • the second image is obtained based on the first color value of each pixel
  • the main hue of the first image is determined, and the main hue is the color value of at least one pixel in the second image.
  • one or more processors are configured to perform the following steps:
  • multiple second color values of the pixel are obtained in parallel, the multiple second color values being the mixed color values of the pixel relative to each other pixel, where one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
  • the first color value of the pixel is acquired, and the first color value is the sum of multiple second color values.
  • one or more processors are configured to perform the following steps:
  • the first image is composed of a first reference number of pixels.
  • one or more processors are configured to perform the following steps:
  • the first image contains the color values of the first reference number of pixels.
  • one or more processors are configured to perform the following steps:
  • the color value of the first reference number of pixel points is acquired, and the color value is the average value of the color value of the pixel points in the image area corresponding to the pixel point.
  • one or more processors are configured to perform the following steps:
  • the third image is divided into an image area that meets the first size.
  • one or more processors are configured to perform the following steps:
  • one or more processors are configured to perform the following steps:
  • the second image is divided into an image area that meets the second size.
  • the second reference number of image regions have corresponding image regions in the first image
  • the one or more processors are configured to perform the following steps:
  • the main hue is the color value of any pixel extracted from the second image area, and the second image area is the image area, among the second reference number of image areas, that corresponds to the first image area.
  • a non-transitory computer-readable storage medium is also provided.
  • the electronic device can execute the following steps:
  • the first color value of each pixel is obtained in parallel.
  • the first color value is determined based on the distance between each pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of each pixel;
  • the second image is obtained based on the first color value of each pixel
  • the main hue of the first image is determined, and the main hue is the color value of at least one pixel in the second image.
  • the electronic device is enabled to perform the following steps:
  • multiple second color values of the pixel are obtained in parallel, the multiple second color values being the mixed color values of the pixel relative to each other pixel, where one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
  • the first color value of the pixel is acquired, and the first color value is the sum of multiple second color values.
  • the electronic device is enabled to perform the following steps:
  • the first image is composed of a first reference number of pixels.
  • the electronic device is enabled to perform the following steps:
  • the first image contains the color values of the first reference number of pixels.
  • the electronic device is enabled to perform the following steps:
  • the color value of the first reference number of pixel points is acquired, and the color value is the average value of the color value of the pixel points in the image area corresponding to the pixel point.
  • the electronic device is enabled to perform the following steps:
  • the third image is divided into an image area that meets the first size.
  • the electronic device is enabled to perform the following steps:
  • the electronic device is enabled to perform the following steps:
  • the second image is divided into an image area that meets the second size.
  • the second reference number of image areas have corresponding image areas in the first image, so that the electronic device can perform the following steps:
  • the main hue is the color value of any pixel extracted from the second image area, and the second image area is the image area, among the second reference number of image areas, that corresponds to the first image area.
  • a computer program product is also provided.
  • the electronic device can execute the following steps:
  • the first color value of each pixel is obtained in parallel.
  • the first color value is determined based on the distance between each pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of each pixel;
  • the second image is obtained based on the first color value of each pixel
  • the main hue of the first image is determined, and the main hue is the color value of at least one pixel in the second image.
  • the electronic device is enabled to perform the following steps:
  • multiple second color values of the pixel are obtained in parallel, the multiple second color values being the mixed color values of the pixel relative to each other pixel, where one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
  • the first color value of the pixel is acquired, and the first color value is the sum of multiple second color values.
  • the electronic device is enabled to perform the following steps:
  • the first image is composed of a first reference number of pixels.
  • the electronic device is enabled to perform the following steps:
  • the first image contains the color values of the first reference number of pixels.
  • the electronic device is enabled to perform the following steps:
  • the color value of the first reference number of pixel points is acquired, and the color value is the average value of the color value of the pixel points in the image area corresponding to the pixel point.
  • the electronic device is enabled to perform the following steps:
  • the third image is divided into an image area that meets the first size.
  • the electronic device is enabled to perform the following steps:
  • the electronic device is enabled to perform the following steps:
  • the second image is divided into an image area that meets the second size.
  • the second reference number of image areas have corresponding image areas in the first image, so that the electronic device can perform the following steps:
  • the main hue is the color value of any pixel extracted from the second image area, and the second image area is the image area, among the second reference number of image areas, that corresponds to the first image area.

Abstract

A dominant color extraction method, the method comprising: performing the following operation for each pixel simultaneously: obtaining the mixed color value of any reference pixel in a first image according to the distance between the reference pixel and the pixel, the color value of the reference pixel, and the color value of the pixel, which can improve the efficiency of processing the image (102); processing the first image according to the mixed color value of each pixel in the first image to generate a second image, in which the color value of each pixel is more uniform (103); and extracting the color value of at least one pixel from the second image and determining the color value of the at least one pixel as the dominant color of the first image, so that the extracted dominant color better matches the characteristics of the first image (104). The method improves the accuracy of dominant color extraction, saves processing time, and improves the efficiency of dominant color extraction.

Description

Dominant color extraction method and device
This disclosure claims priority to the Chinese patent application filed on June 1, 2020 with application number 202010485775.8 and invention title "Dominant color extraction method, device, electronic device, and storage medium", the entire content of which is incorporated herein by reference.
Technical Field
The present disclosure relates to the field of computer technology, and in particular to a dominant color extraction method.
Background
With the rapid development of computer technology, there are more and more image processing methods; a commonly used one is extracting the dominant color of an image. Since the pixels of an image differ in color, how to extract the dominant color of an image has become an urgent problem to be solved.
Summary
The present disclosure provides a dominant color extraction method, device, electronic device, and storage medium, which can make the extracted dominant color better match the characteristics of the first image, improve the accuracy of dominant color extraction, save processing time, and improve the efficiency of dominant color extraction.
According to one aspect of the embodiments of the present disclosure, a dominant color extraction method is provided, applied to an electronic device, the method including:
acquiring a first image;
for each pixel in the first image, obtaining the first color value of each pixel in parallel, the first color value being determined based on the distance between each pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of each pixel;
acquiring a second image, the second image being obtained based on the first color value of each pixel;
determining the dominant color of the first image, the dominant color being the color value of at least one pixel in the second image.
In some embodiments, obtaining the first color value of each pixel in parallel for each pixel in the first image includes:
for any pixel, obtaining multiple second color values of the pixel in parallel, the multiple second color values being the mixed color values of the pixel relative to each other pixel, where one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
obtaining the first color value of the pixel, the first color value being the sum of the multiple second color values.
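The mixing just described can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the disclosure states only that each second color value depends on the inter-pixel distance and the two original color values, so the Gaussian distance weight, the grayscale colors, and the final averaging (used to keep the combined value within the color range) are all assumptions.

```python
import math

def second_color_value(p, q, sigma=2.0):
    """Mixed color value of pixel p relative to another pixel q.
    Pixels are (x, y, gray) tuples; the Gaussian weight is an assumed choice."""
    (x0, y0, c0), (x1, y1, c1) = p, q
    d = math.hypot(x1 - x0, y1 - y0)
    w = math.exp(-(d * d) / (2 * sigma * sigma))  # closer pixels mix in more
    return w * c1 + (1 - w) * c0

def first_color_value(pixels, index):
    """Combine the pixel's second color values relative to every other pixel;
    averaged here (rather than summed) so the result stays in color range."""
    values = [second_color_value(pixels[index], q)
              for j, q in enumerate(pixels) if j != index]
    return sum(values) / len(values)

# a pixel surrounded by identical colors keeps its color after mixing
uniform = [(0, 0, 50.0), (1, 0, 50.0), (0, 1, 50.0)]
mixed = first_color_value(uniform, 0)
```

In an actual embodiment each pixel's first color value would be computed in parallel, for example in a GPU fragment shader, rather than in this sequential loop.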
In some embodiments, acquiring the first image includes:
dividing a third image into a first reference number of image areas, each image area having the same size;
obtaining the first reference number of pixels in parallel, the first reference number of pixels being obtained by down-sampling each image area in parallel;
acquiring the first image, the first image being composed of the first reference number of pixels.
In some embodiments, obtaining the first reference number of pixels includes:
determining the color values of the first reference number of pixels in parallel, the color values being determined based on the color values of the pixels included in the image area corresponding to each pixel;
creating the first image, the first image containing the color values of the first reference number of pixels.
In some embodiments, determining the color values of the first reference number of pixels in parallel includes:
obtaining, in parallel, the average of the color values of the pixels in each image area;
obtaining the color values of the first reference number of pixels, the color value being the average of the color values of the pixels in the image area corresponding to the pixel.
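As a rough sketch of the two embodiments above, the following divides an image into equal-height horizontal strips (one possible equal-size division) and uses each strip's average color as the color of the corresponding down-sampled pixel. The strip layout, grayscale values, and function name are illustrative assumptions, and the per-strip averages would be computed in parallel in an actual embodiment.

```python
def downsample_by_average(pixels, width, height, first_reference_number):
    """Divide an image (row-major list of grayscale values) into equal-height
    strips and return one pixel per strip: the average color of that strip."""
    strip_height = height // first_reference_number
    result = []
    for i in range(first_reference_number):
        # all pixel values belonging to strip i
        strip = pixels[i * strip_height * width:(i + 1) * strip_height * width]
        result.append(sum(strip) / len(strip))  # average color value of the area
    return result

# toy 2x4 grayscale image, down-sampled into two strips of two rows each
img = [10, 20,
       30, 40,
       100, 110,
       120, 130]
small = downsample_by_average(img, width=2, height=4, first_reference_number=2)
```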
In some embodiments, dividing the third image into the first reference number of image areas includes:
determining a first size, the first size being obtained based on the size of the third image and the first reference number;
dividing image areas that satisfy the first size from the third image.
In some embodiments, extracting the color value of at least one pixel from the second image includes:
dividing the second image into a second reference number of image areas, the second reference number of image areas having the same size;
obtaining the second reference number of pixels, the second reference number of pixels being any one pixel extracted from each of the second reference number of image areas;
extracting the color values of the second reference number of pixels.
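The extraction just described can be sketched as follows, again assuming equal-height horizontal strips and taking each strip's center pixel as the "any one pixel"; the function name and toy image are illustrative only, not part of the disclosed method.

```python
def extract_region_centers(pixels, width, height, second_reference_number):
    """Divide an image (row-major list of RGB tuples) into equal-height
    horizontal strips and return the color value of each strip's center pixel."""
    strip_height = height // second_reference_number  # second size: equal division
    dominant = []
    for i in range(second_reference_number):
        cy = i * strip_height + strip_height // 2  # center row of strip i
        cx = width // 2                            # center column
        dominant.append(pixels[cy * width + cx])
    return dominant

# toy 2x4 image: a reddish top half and a bluish bottom half, two strips
img = [(255, 0, 0), (250, 5, 5),
       (254, 1, 1), (251, 4, 4),
       (0, 0, 255), (5, 5, 250),
       (1, 1, 254), (4, 4, 251)]
colors = extract_region_centers(img, width=2, height=4, second_reference_number=2)
```

Each returned color value serves as the dominant color of the corresponding area of the first image.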
In some embodiments, dividing the second image into the second reference number of image areas includes:
determining a second size, the second size being obtained based on the size of the second image and the second reference number;
dividing image areas that satisfy the second size from the second image.
In some embodiments, the second reference number of image areas have corresponding image areas in the first image, and determining the dominant color of the first image includes:
determining the dominant color of at least one first image area in the first image, the dominant color being the color value of any one pixel extracted from a second image area, the second image area being the image area, among the second reference number of image areas, that corresponds to the first image area.
According to another aspect of the embodiments of the present disclosure, a dominant color extraction device is provided, the device including:
an image acquisition unit, configured to acquire a first image;
a color value obtaining unit, configured to obtain, in parallel for each pixel in the first image, the first color value of each pixel, the first color value being determined based on the distance between each pixel and multiple other pixels in the first image, the original color values of the multiple other pixels, and the original color value of each pixel;
a generating unit, configured to acquire a second image, the second image being obtained based on the first color value of each pixel;
an extraction unit, configured to determine the dominant color of the first image, the dominant color being the color value of at least one pixel in the second image.
In some embodiments, the color value obtaining unit includes:
a mixing subunit, configured to obtain, in parallel for any pixel, multiple second color values of the pixel, the multiple second color values being the mixed color values of the pixel relative to each other pixel, where one second color value is determined according to the distance between the pixel and one other pixel, the color value of the pixel, and the color value of that other pixel;
a determining subunit, configured to obtain the first color value of the pixel, the first color value being the sum of the multiple second color values.
In some embodiments, the image acquisition unit includes:
a first division subunit, configured to divide a third image into a first reference number of image areas, each image area having the same size;
a processing subunit, configured to obtain the first reference number of pixels in parallel, the first reference number of pixels being obtained by down-sampling each image area in parallel.
In some embodiments, the processing subunit is configured to:
determine the color values of the first reference number of pixels in parallel, the color values being determined based on the color values of the pixels included in the image area corresponding to each pixel;
create the first image, the first image containing the color values of the first reference number of pixels.
In some embodiments, the processing subunit is configured to:
obtain, in parallel, the average of the color values of the pixels in each image area;
obtain the color values of the first reference number of pixels, the color value being the average of the color values of the pixels in the image area corresponding to the pixel.
In some embodiments, the first division subunit is configured to:
determine a first size, the first size being obtained based on the size of the third image and the first reference number;
divide image areas that satisfy the first size from the third image.
In some embodiments, the extraction unit includes:
a second division subunit, configured to divide the second image into a second reference number of image areas, the second reference number of image areas having the same size;
a pixel extraction subunit, configured to obtain the second reference number of pixels, the second reference number of pixels being any one pixel extracted from each of the second reference number of image areas;
a color value extraction subunit, configured to extract the color values of the second reference number of pixels.
In some embodiments, the second division subunit is configured to:
determine a second size, the second size being obtained based on the size of the second image and the second reference number;
divide image areas that satisfy the second size from the second image.
In some embodiments, the second reference number of image areas have corresponding image areas in the first image, and the extraction unit is configured to:
determine the dominant color of at least one first image area in the first image, the dominant color being the color value of any one pixel extracted from a second image area, the second image area being the image area, among the second reference number of image areas, that corresponds to the first image area.
According to another aspect of the embodiments of the present disclosure, an electronic device is provided, the electronic device including:
one or more processors;
a volatile or non-volatile memory for storing commands executable by the one or more processors;
wherein the one or more processors are configured to perform the dominant color extraction method described in the above aspect.
According to another aspect of the embodiments of the present disclosure, a non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by the processor of an electronic device, the electronic device is enabled to perform the dominant color extraction method described in the above aspect.
According to another aspect of the embodiments of the present disclosure, a computer program product is provided; when the instructions in the computer program product are executed by the processor of an electronic device, the electronic device is enabled to perform the dominant color extraction method described in the above aspect.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
The method, device, electronic device, and storage medium provided by the embodiments of the present application improve the efficiency of image processing; the extracted dominant color better matches the characteristics of the first image, the accuracy of dominant color extraction is improved, processing time is saved, and the efficiency of dominant color extraction is improved.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings here are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and together with the specification serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of a dominant color extraction method according to an exemplary embodiment.
Fig. 2 is a flowchart of a dominant color extraction method according to an exemplary embodiment.
Fig. 3 is a schematic diagram of dividing a third image according to an exemplary embodiment.
Fig. 4 is a schematic diagram of obtaining the average of the color feature values of a third image according to an exemplary embodiment.
Fig. 5 is a schematic diagram of down-sampling processing according to an exemplary embodiment.
Fig. 6 is a schematic diagram of interpolation processing according to an exemplary embodiment.
Fig. 7 is a schematic diagram of color tone extraction according to an exemplary embodiment.
Fig. 8 is a schematic structural diagram of a dominant color extraction device according to an exemplary embodiment.
Fig. 9 is a schematic structural diagram of another dominant color extraction device according to an exemplary embodiment.
Fig. 10 is a block diagram of a terminal according to an exemplary embodiment.
具体实施方式
这里将详细地对示例性实施例进行说明，其示例表示在附图中。下面的描述涉及附图时，除非另有表示，不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本公开相一致的所有实施方式。相反，它们仅是与如所附权利要求书中所详述的、本公开的一些方面相一致的设备和方法的例子。
可以理解,本公开所使用的术语“第一”、“第二”等可在本文中用于描述各种概念,但除非特别说明,这些概念不受这些术语限制。这些术语仅用于将一个概念与另一个概念区分。举例来说,在不脱离本申请的范围的情况下,可以将第一图像称为第二图像,将第二图像称为第一图像。
本公开实施例提供了一种主色调提取方法,应用于多种场景下。
例如,本公开实施例提供的方法,应用在图像分类场景下,如果终端需要对多个图像进行分类,采用本公开实施例提供的方法,即可获取每个图像的主色调,再按照每个图像的主色调,对多个图像进行分类。
又如,本公开实施例提供的方法,应用在图像搜索场景下,当终端需要搜索图像时,采用本公开实施例提供的方法,即可获取每个图像的主色调,查找与待搜索的主色调相同的主色调的图像,得到搜索结果。
本公开实施例提供的主色调提取方法应用于终端中。其中,该终端为手机、平板电脑、计算机等多种类型的终端。
在一些实施例中,该终端中包括GPU(Graphics Processing Unit,图形处理器),该GPU为终端中进行绘图运算的处理器,采用该GPU并行地处理数据,能够提高处理数据的速率。
图1是根据一示例性实施例示出的一种主色调提取方法的流程图,参见图1,应用于电子设备中,该方法包括:
101、电子设备获取第一图像。
102、电子设备对于第一图像中的每个像素点同时执行以下操作:基于参考像素点与像素点的距离、参考像素点的颜色值和像素点的颜色值,获取参考像素点混合后的颜色值,参考像素点为第一图像中的任一像素点。
由于参考像素点为第一图像中的任一像素点，因此，上述步骤102还能够表示为：对于第一图像中的每个像素点，并行获取每个像素点的第一颜色值，第一颜色值基于每个像素点与第一图像中多个其他像素点的距离、多个其他像素点的原颜色值和每个像素点的原颜色值确定。
103、电子设备基于第一图像中每个像素点混合后的颜色值,对第一图像进行处理,生成第二图像。
也就是说,电子设备获取第二图像,第二图像基于每个像素点的第一颜色值得到。
104、电子设备从第二图像中提取至少一个像素点的颜色值,将至少一个像素点的颜色值确定为第一图像的主色调。
也就是说,电子设备确定第一图像的主色调,该主色调为第二图像中至少一个像素点的颜色值。
本公开实施例提供的方法,提高了对图像进行处理的效率,提取的主色调更符合第一图像的特征,提高了主色调提取的准确率,并且节省了处理时间,提高了主色调的提取效率。
在一些实施例中,对于每个像素点同时执行以下操作:基于参考像素点与像素点的距离、参考像素点的颜色值和像素点的颜色值,获取参考像素点混合后的颜色值,参考像素点为第一图像中的任一像素点,包括:
对于每个像素点同时执行以下操作:基于参考像素点与像素点的距离,将像素点的颜色值与参考像素点的颜色值进行混合,将混合后的颜色值作为像素点相对于参考像素点的混合颜色值;
将每个像素点相对于参考像素点的混合颜色值之和,作为参考像素点混合后的颜色值。
在一些实施例中,获取第一图像,包括,
将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同;
同时对第一参考数量的图像区域进行降采样处理,得到第一参考数量的像素点,将第一参考数量的像素点构成第一图像。
在一些实施例中,同时对第一参考数量的图像区域进行降采样处理,得到第一参考数量的像素点,将第一参考数量的像素点构成第一图像,包括:
同时基于每个图像区域中包括的像素点的颜色值,确定每个图像区域的颜色值,作为每个图像区域对应的像素点的颜色值;
基于确定的多个像素点的颜色值,创建包含多个像素点的第一图像。
在一些实施例中,同时基于每个图像区域中包括的像素点的颜色值,确定每个图像区域的颜色值,作为每个图像区域对应的像素点的颜色值,包括:
同时获取每个图像区域中的像素点的颜色值的平均值;
将每个图像区域中的像素点的颜色值的平均值作为第一图像中每个图像区域对应的像素点的颜色值。
在一些实施例中,将第三图像划分为第一参考数量的图像区域,包括:
基于第三图像的尺寸和第一参考数量,确定所划分图像区域的第一尺寸;
基于第一尺寸,从第三图像中划分满足第一尺寸的图像区域。
在一些实施例中,从第二图像中提取至少一个像素点的颜色值,包括:
将第二图像划分为第二参考数量的图像区域,第二参考数量的图像区域的尺寸相同;
从第二参考数量的图像区域中每个图像区域中提取任一个像素点,得到第二参考数量的像素点;
提取第二参考数量的像素点的颜色值。
在一些实施例中,将第二图像划分为第二参考数量的图像区域,包括:
基于第二图像的尺寸和第二参考数量,确定所划分图像区域的第二尺寸;
基于第二尺寸,从第二图像中划分满足第二尺寸的图像区域。
在一些实施例中,第二参考数量的图像区域在第一图像中具有对应的图像区域,将至少一个像素点的颜色值确定为第一图像的主色调,包括:
将第二图像中提取的每个像素点的颜色值,分别作为每个像素点所对应的目标区域的主色调,每个像素点所对应的目标区域为每个像素点所属图像区域在第一图像中对应的图像区域。
图2是根据一示例性实施例示出的一种主色调提取方法的流程图,参见图2,应用于电子设备中,该方法包括:
201、电子设备将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同。
其中,该第三图像为需要提取主色调的任一图像。该第三图像为风景图像、人物图像或者其他类型的图像等等。在一些实施例中,该第三图像通过拍摄得到、通过在社交平台上查找其他用户发布的图像得到、通过检索得到,或者采用其他方式得到。
获取第三图像,将该第三图像划分为第一参考数量的图像区域,且该第一参考数量的图像区域的尺寸相同。另外,第三图像中包括多个像素点,且第三图像中的多个像素点均匀分布,因此,第一参考数量的图像区域中的像素点的数量也相同。
其中,该第一参考数量是终端默认的数量,或者第一参考数量由用户设置,或者采用其他方式设置第一参考数量。在一些实施例中,第一参考数量为大于等于1的任一整数,例如,该第一参考数量为4、6、8等。
例如,如图3所示,将一个第三图像划分为由上到下的5个图像区域。
在一些实施例中,对该第三图像的长度或者宽度中的至少一项进行均匀地划分,从而将该第三图像划分为第一参考数量的图像区域。在一些实施例中,对第三图像的长度进行均匀地划分,例如,将第三图像划分为6*1的图像区域。在一些实施例中,对第三图像的宽度进行均匀地划分,例如,将第三图像划分为1*6的图像区域。在一些实施例中,对第三图像的长度和宽度进行均匀地划分,例如,将第三图像划分为2*8的图像区域。
在一些实施例中,对第三图像进行划分的过程中,基于第三图像的尺寸和第一参考数量,确定所划分图像区域的第一尺寸,基于第一尺寸,从第三图像中划分满足第一尺寸的图像区域。换而言之,电子设备确定第一尺寸,第一尺寸基于第三图像的尺寸和第一参考数量得到;从第三图像中划分满足第一尺寸的图像区域。
对第三图像进行划分的过程中,在确定了第三图像的尺寸和第一参考数量后,基于该第一参考数量对该第三图像的尺寸进行平均划分,得到第一尺寸,该第一尺寸即为从第三图像中划分出的每个图像区域的尺寸,以便将第三图像划分为均匀的图像区域,并且划分后的每个图像区域的尺寸均相同,因此按照第一尺寸对第三图像进行划分,能够得到第一参考数量的图像区域。
在一些实施例中,在对第三图像进行划分的过程中,按照横向的方式对第三图像进行划分,以得到横向排列的第一参考数量的区域,或者,按照纵向的方式对第三图像进行划分,以得到纵向排列的第一参考数量的区域,或者按照包括横向和纵向两种方式对第三图像进行划分,得到按照横向和纵向均匀排列的第一参考数量的区域。
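作为上述划分方式的一个示意，下面给出一个简化的Python片段（非本公开的正式实现，函数名与参数均为示例），演示基于第三图像的高度和第一参考数量确定第一尺寸，并按行方向将图像均匀划分为尺寸相同的图像区域：

```python
import numpy as np

def split_rows(image, num_regions):
    """按行方向将图像均匀划分为 num_regions 个尺寸相同的图像区域。

    此处假设图像高度能被 num_regions 整除；实际实现中可能需要处理余数。
    """
    h = image.shape[0]
    region_h = h // num_regions  # 第一尺寸：每个图像区域的高度
    return [image[i * region_h:(i + 1) * region_h] for i in range(num_regions)]

img = np.zeros((10, 4, 3))    # 一幅 10x4 的 RGB 图像
regions = split_rows(img, 5)  # 划分为 5 个 2x4 的图像区域
```

按列方向或按行列两个方向同时划分时，只需对宽度做同样的均分处理。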
202、电子设备同时对第一参考数量的图像区域进行降采样处理,得到第一参考数量的像素点,将第一参考数量的像素点构成第一图像。
由于本公开实施例中电子设备通过GPU并行地处理数据,因此,上述步骤202还可表示为:并行获取第一参考数量的像素点,第一参考数量的像素点通过并行对每个图像区域进行降采样处理得到。
其中，降采样处理用于降低图像的像素点的数量，创建一个降低像素点数量后的新图像。例如，第三图像中包括100个像素点，对该第三图像进行降采样处理，得到一个包括10个像素点的新图像，该新图像（即第一图像）的尺寸小于第三图像的尺寸。
对第一参考数量的图像区域进行降采样处理时,同时将每个图像区域融合为一个像素点,得到第一参考数量的像素点,将该第一参考数量的像素点构成第一图像。也即是,获取第一图像,第一图像由第一参考数量的像素点构成。
例如,将第三图像划分为5个图像区域后,对这5个图像区域进行降采样处理,得到5个像素点,将这5个像素点构成第一图像。
其中,在将第一参考数量的像素点构成第一图像时,按照每个图像区域在第三图像中的位置,将每个图像区域对应的像素点添加在第一图像中,以构成第一图像。
在一些实施例中,同时基于每个图像区域中包括的像素点的颜色值,确定每个图像区域的颜色值,作为每个图像区域对应的像素点的颜色值,基于确定的多个像素点的颜色值,创建包含该多个像素点的第一图像。也即是,并行确定第一参考数量的像素点的颜色值,颜色值基于像素点对应的图像区域中包括的像素点的颜色值确定;创建第一图像,第一图像包含第一参考数量的像素点的颜色值。
由于第三图像中包括第一参考数量的图像区域,每个图像区域对应于第一图像中的一个像素点,则该第一图像中包括第一参考数量的像素点。
其中,该颜色值用于表示像素点的颜色。该颜色值可以为RGB(Red Green Blue,红绿蓝)值,或者该颜色值为像素值,或者该颜色值为其他类型的值等等。
在一些实施例中,获取每个图像区域中的像素点的颜色值的平均值,将每个图像区域中的像素点的颜色值的平均值作为第一图像中每个图像区域对应的像素点的颜色值。也即是,并行确定第一参考数量的像素点的颜色值,包括:并行获取每个图像区域中的像素点的颜色值的平均值;获取第一参考数量的像素点的颜色值,颜色值为与像素点对应的图像区域中像素点的颜色值的平均值。
其中，采用下述公式，获取每个图像区域中的像素点的颜色值的平均值：

C_i = (1/S_i)·Σ_{(u,v)∈D_i} C_(u,v)

其中，C_i用于表示第i个图像区域的像素点的颜色值的平均值，S_i用于表示第i个图像区域的面积（即该图像区域中像素点的数量），C_(u,v)用于表示坐标(u,v)处的像素点的颜色值，D_i用于表示第i个图像区域中像素点的集合。其中，i为大于等于1的任一数值，u和v为任一数值。坐标(u,v)基于图像区域的图像坐标系确定。例如，将第三图像的左下角作为坐标系的原点，基于像素点在第三图像中的位置，确定该像素点的坐标；或者，确定任一坐标，获取该任一坐标处像素点的颜色值。
例如,对于第一个图像区域,该第一个图像区域中包括2个像素点,且这2个像素点的颜色值分别为(20,60,30)、(60,80,20),则采用上述公式,获取的该第一个图像区域中的像素点的颜色值的平均值为(40,70,25)。
例如,如图4所示,将第三图像由上到下划分为5个图像区域后,获取每个图像区域的像素点的颜色值的平均值,第一图像中包括5个像素点的颜色值。
例如,当将第三图像按照从上到下的顺序划分为4个图像区域后,按照从上到下的顺序,获取到的4个图像区域的颜色值分别为(20,40,20)、(30,20,30)、(50,50,60)、(100,20,20),则确定的第一图像的每个像素点的颜色值分别为(20,40,20)、(30,20,30)、(50,50,60)、(100,20,20)。
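上述"每个图像区域取颜色值平均值、融合为一个像素点"的降采样步骤，可以用如下示意性Python片段表达（仅为说明计算过程的草图，非本公开的正式实现）：

```python
import numpy as np

def downsample_by_mean(image, num_regions):
    """将按行划分的每个图像区域融合为一个像素点，
    该像素点的颜色值为区域内所有像素点颜色值的平均值，
    得到由第一参考数量的像素点构成的第一图像。"""
    h = image.shape[0]
    region_h = h // num_regions  # 假设高度可被整除
    return np.stack([
        image[i * region_h:(i + 1) * region_h].reshape(-1, 3).mean(axis=0)
        for i in range(num_regions)
    ])

# 与正文示例一致：区域内两个像素 (20,60,30) 与 (60,80,20) 的平均值为 (40,70,25)
region = np.array([[[20, 60, 30]], [[60, 80, 20]]], dtype=float)  # 一个 2x1 的区域
first_image = downsample_by_mean(region, 1)
```

在GPU上，各图像区域的平均值计算互不依赖，因此可并行执行，对应正文所述O(1)的时间复杂度。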
另外,若第三图像的分辨率为M*N,采用相关技术中的步骤进行降采样处理的话,时间复杂度为O(M*N),而采用本公开实施例提供的步骤201-202,将第三图像降为第一图像,时间复杂度为O(1),相对于相关技术中的步骤处理速度提高了M*N倍,提高了对图像进行降采样处理的效率。
本公开实施例,基于每个图像区域的像素点的颜色值的平均值,确定的图像区域对应的像素点的颜色值更均匀,能够提高确定的像素点的颜色值的准确性。并且,通过对第三图像进行降采样处理,得到第一图像,提高了对图像进行主色调提取的效率,减少处理的数据量。
需要说明的是,本公开实施例仅是以对第三图像进行降采样处理,得到第一图像为例进行说明。在另一实施例中,不执行步骤201-202,而是直接获取第一图像,第一图像为需要提取主色调的任一图像,获取方式与上述步骤201中获取第三图像的方式类似。
在一些实施例中,若不执行步骤201-202,本公开实施例中的第一图像为风景图像、人物图像或者其他类型的图像等等。另外,该第一图像通过拍摄得到、通过在社交平台上查找其他用户发布的图像得到、通过检索得到,或者采用其他方式得到。
在获取到第一图像后,后续即可执行步骤203-205,直接获取第一图像的主色调。
203、电子设备对于每个像素点同时执行以下操作:基于参考像素点与像素点的距离、参考像素点的颜色值和像素点的颜色值,获取参考像素点混合后的颜色值。
由于电子设备通过GPU并行对数据进行处理，因此，上述步骤203还可表示为：对于第一图像中的每个像素点，并行获取每个像素点的第一颜色值，第一颜色值基于每个像素点与第一图像中多个其他像素点的距离、多个其他像素点的原颜色值和每个像素点的原颜色值确定。
其中,参考像素点为第一图像中的任一像素点。
在对每个像素点进行处理时,将该第一图像中的任一像素点作为参考像素点,则基于参考像素点的坐标与像素点的坐标,确定参考像素点和像素点的距离,再基于获取的距离、参考像素点的颜色值以及像素点的颜色值,获取参考像素点混合后的颜色值。
在一些实施例中,对于每个像素点同时执行以下操作:基于参考像素点与像素点的距离,将像素点的颜色值与参考像素点的颜色值进行混合,将混合后的颜色值作为像素点相对于参考像素点的混合颜色值;将每个像素点相对于参考像素点的混合颜色值之和,作为参考像素点混合后的颜色值。也即是,对于任一像素点,并行获取像素点的多个第二颜色值,多个第二颜色值为像素点相对于每个其他像素点的混合颜色值,一个第二颜色值根据像素点与一个其他像素点之间的距离、像素点的颜色值和一个其他像素点的颜色值确定;获取像素点的第一颜色值,第一颜色值为多个第二颜色值之和。
在一些实施例中，采用下述公式，获取每个像素点相对于参考像素点的混合颜色值：

y = A·e^(-k·x²)

其中，y为像素点相对于参考像素点的混合颜色值，A为像素点的颜色值，k为固定数值，x为像素点与参考像素点之间的距离（基于两者的坐标值确定）。
另外,需要说明的是,本公开实施例中的步骤203所执行的步骤即是对图像执行插值处理的过程,也就是说,“对于每个像素点同时执行以下操作:基于参考像素点与像素点的距离、参考像素点的颜色值和像素点的颜色值,获取参考像素点混合后的颜色值”即是对每个像素点的颜色值进行插值处理。基于插值处理后每个像素点的颜色值,得到插值处理后的第二图像,该第二图像中的每个像素点的颜色值更加平滑,能够提高后续提取的主色调的准确率。
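上述插值处理可以用如下示意性Python片段说明（权重形式按正文的高斯函数假设为e^(-k·x²)，k的取值与归一化处理均为示例，非本公开的正式实现）：

```python
import numpy as np

def gaussian_blend(colors, positions, k=0.5):
    """对每个参考像素点，获取其相对于每个其他像素点的混合颜色值（第二颜色值），
    并将这些混合颜色值求和得到第一颜色值。

    colors: (N, 3) 各像素点的原颜色值；positions: (N,) 各像素点的一维坐标。
    注意：此处按正文直接对第二颜色值求和；实际实现中通常还会对权重归一化，
    以保持颜色值的量级。
    """
    d2 = (positions[:, None] - positions[None, :]) ** 2  # 像素点两两距离的平方
    w = np.exp(-k * d2)                                  # 高斯权重 e^{-k·x^2}
    return w @ colors                                    # 各第二颜色值之和

colors = np.array([[255., 0., 0.], [0., 0., 255.]])
blended = gaussian_blend(colors, np.array([0., 1.]))
```

由于每个像素点的第一颜色值只依赖原图的颜色值，各像素点的计算互不依赖，适合在GPU上并行执行。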
在对第一图像进行处理时,同时对该第一图像中的每个像素点进行处理,能够节省处理时长,提高处理效率。
在一些实施例中,本公开实施例中的步骤202-203采用GPU并行地对第一参考数量的图像区域进行降采样处理,得到第一参考数量的像素点,将第一参考数量的像素点构成第一图像,然后并行地基于参考像素点与像素点的距离、参考像素点的颜色值和像素点的颜色值,获取参考像素点混合后的颜色值。采用GPU并行地进行处理,以实现步骤202中同时针对多个图像区域进行降采样的效果,以及在步骤203中同时获取参考像素点混合后的颜色值的效果。
另外,如果第一图像的分辨率为M*N,在采用卷积核对图像进行处理时,如果采用的卷积核大小为C,则在相关技术中,处理的时间复杂度为O(M*N*C),而本公开实施例提供的步骤203,通过同时对像素点进行运算,则处理的时间复杂度为O(C),相对于相关技术中的步骤处理速度提高了M*N倍,提高了对图像进行处理的效率。
204、电子设备基于第一图像中每个像素点混合后的颜色值,对第一图像进行处理,生成第二图像。
在获取到第一图像中每个像素点混合后的颜色值后,继续对该第一图像进行处理,将第一图像中每个像素点的颜色值确定为混合后的颜色值,生成第一图像对应的第二图像。
需要说明的是，本公开实施例仅是以采用高斯函数y = A·e^(-k·x²)对第一图像进行插值处理为例进行说明。在另一实施例中，采用线性插值的方法，对第一图像进行插值处理，得到插值处理后的第二图像。
205、电子设备从第二图像中提取至少一个像素点的颜色值,将至少一个像素点的颜色值确定为第一图像的主色调。
其中,该第二图像中包括多个像素点,在确定第一图像的主色调时,从该第二图像中提取至少一个像素点的颜色值,作为第一图像的主色调。
在一些实施例中,将第二图像划分为第二参考数量的图像区域,从第二参考数量的图像区域中每个图像区域中提取任一个像素点,得到第二参考数量的像素点,提取第二参考数量的像素点的颜色值,确定为第一图像的主色调。其中,第二参考数量的图像区域的尺寸相同。也就是说,将第二图像划分为第二参考数量的图像区域,第二参考数量的图像区域的尺寸相同;获取第二参考数量的像素点,第二参考数量的像素点为从第二参考数量的图像区域中每个图像区域中提取的任一个像素点;提取第二参考数量的像素点的颜色值。
其中,由于第一图像中不同区域的主色调可能不同,且第二图像为对第一图像进行处理得到的图像,为了提高确定的主色调的准确性,将第二图像划分为第二参考数量的图像区域,该第二参考数量的图像区域能够代表第一图像中对应区域的主色调,则从第二参考数量的图像区域中每个图像区域中提取任一个像素点,得到第二参考数量的像素点,提取第二参考数量的像素点的颜色值,确定为第一图像的主色调。
在一些实施例中,在从第二图像的第二参考数量的图像区域中提取像素点时,提取每个图像区域的中心像素点,得到第二参考数量的像素点,提取第二参考数量的像素点的颜色值,确定为第一图像的主色调。
其中，该第二参考数量的图像区域的尺寸相同。另外，该第二参考数量为终端默认的数值，或者第二参考数量由用户设置，或者还可以采用其他方式设置。例如，该第二参考数量可以为5、6、7或者其他数值。该中心像素点为位于图像区域的中心的像素点。
例如,该图像区域包括的像素点数量为7,且这7个像素点从上到下排布,则第4个像素点为该图像区域的中心像素点。
另外,将第二图像从上到下划分为5个图像区域,且这5个图像区域的尺寸相同,再按照5个图像区域的尺寸,确定每个图像区域的中心像素点,则确定的5个中心像素点的颜色值对应的颜色即为第一图像的主色调。
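提取每个图像区域中心像素点的过程可以用如下示意性Python片段表达（按行划分、取区域中间一行的中心像素，均为示例性假设，非本公开的正式实现）：

```python
import numpy as np

def extract_dominant_colors(second_image, num_regions):
    """将第二图像按行划分为 num_regions 个尺寸相同的图像区域，
    取每个图像区域的中心像素点的颜色值，作为第一图像中对应区域的主色调。"""
    h = second_image.shape[0]
    region_h = h // num_regions                  # 第二尺寸：每个图像区域的高度
    centers = [i * region_h + region_h // 2 for i in range(num_regions)]
    return second_image[centers, second_image.shape[1] // 2]

# 一幅 10x1 的图像，颜色值依次递增，便于检验取到的中心像素
img = np.arange(10 * 1 * 3, dtype=float).reshape(10, 1, 3)
tones = extract_dominant_colors(img, 5)  # 5 个图像区域各取一个中心像素点
```

当区域高度为偶数时，"中心"像素的取法存在多种约定，此处以下取整为例。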
在一些实施例中,基于第二图像的尺寸和第二参考数量,确定所划分图像区域的第二尺寸,基于第二尺寸,从第二图像中划分满足第二尺寸的图像区域。该实施例还可表示为:确定第二尺寸,第二尺寸基于第二图像的尺寸和第二参考数量得到;从第二图像中划分满足第二尺寸的图像区域。
对第二图像进行划分的过程中,确定了第二图像的尺寸和第二参考数量后,基于该第二参考数量对该第二图像的尺寸进行平均划分,得到第二尺寸,该第二尺寸即为从第二图像中划分出的每个图像区域的尺寸,以便将第二图像划分为均匀的图像区域,并且划分后的每个图像区域的尺寸均相同,因此按照第二尺寸对第二图像进行划分,得到第二参考数量的图像区域。
在一些实施例中,第二图像中的第二参考数量的图像区域,在第一图像中具有对应的图像区域,因此将第二图像中提取的每个像素点的颜色值,分别作为每个像素点所对应的目标区域的主色调,每个像素点所对应的目标区域为每个像素点所属图像区域在第一图像中对应的图像区域。也就是说,确定第一图像中至少一个第一图像区域的主色调,主色调为从第二图像区域中提取的任一个像素点的颜色值,第二图像区域为第二参考数量的图像区域与第一图像区域对应的图像区域。
另外,通过举例的方式,以说明本公开实施例提供的提取图像的主色调的方法。例如,如图5所示,执行步骤201-202,将该第三图像进行降采样处理得到第一图像,再如图6所示,执行步骤203-204,对第一图像进行插值处理得到第二图像,再如图7所示,执行步骤205,从第二图像中提取5个像素点,且这5个像素点的颜色值即为第一图像的主色调。
需要说明的是,本公开实施例中第一图像为第三图像进行降采样处理后的图像,则确定的第一图像的主色调,也可以作为第三图像的主色调。而在另一实施例中,如果不执行步骤201-202,则直接就获取了第一图像的主色调。
在一些实施例中,本公开实施例中的步骤205同时从第二图像中提取至少一个像素点的像素值。
在一些实施例中,本公开实施例中的步骤205通过GPU并行地在该第二图像中提取至少一个像素点的颜色值,再将至少一个像素点的颜色值对应的颜色作为第一图像的主色调。
相关技术中,通过CPU获取目标图像,遍历该目标图像中的每个像素点,提取每个像素点的颜色特征值,再基于每个像素点的颜色特征值,获取像素点最多的颜色特征值,将该颜色特征值对应的颜色作为该目标图像的主色调。但是,上述方法需要遍历目标图像中的每个像素点,处理时间长,提取主色调的效率低。
本申请实施例提供的方法,提高了对图像进行处理的效率,提取的主色调更符合第一图像的特征,提高了主色调提取的准确率,并且节省了处理时间,提高了主色调的提取效率。
并且,确定的图像区域对应的像素点的颜色值更均匀,提高了确定的像素点的颜色值的准确性。并且,减少了处理的数据量,提高了对图像进行主色调提取的效率。
图8是根据一示例性实施例示出的一种主色调提取装置的结构示意图。参见图8,该装置包括:
图像获取单元801,用于获取第一图像;
颜色值获取单元802,用于对于第一图像中的每个像素点,并行获取每个像素点的第一颜色值,第一颜色值基于每个像素点与第一图像中多个其他像素点的距离、多个其他像素点的原颜色值和每个像素点的原颜色值确定;
生成单元803,用于获取第二图像,第二图像基于每个像素点的第一颜色值得到;
提取单元804,用于确定第一图像的主色调,主色调为第二图像中至少一个像素点的颜色值。
在一些实施例中,参见图9,颜色值获取单元802,包括:
混合子单元8021,用于对于任一像素点,并行获取像素点的多个第二颜色值,多个第二颜色值为像素点相对于每个其他像素点的混合颜色值,一个第二颜色值根据像素点与一个其他像素点之间的距离、像素点的颜色值和一个其他像素点的颜色值确定;
确定子单元8022,用于获取像素点的第一颜色值,第一颜色值为多个第二颜色值之和。
在一些实施例中,参见图9,图像获取单元801,包括,
第一划分子单元8011,用于将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同;
处理子单元8012,用于并行获取第一参考数量的像素点,第一参考数量的像素点通过并行对每个图像区域进行降采样处理得到。
在一些实施例中,参见图9,处理子单元8012,用于:
同时基于每个图像区域中包括的像素点的颜色值,确定每个图像区域的颜色值,作为每个图像区域对应的像素点的颜色值;
基于确定的多个像素点的颜色值,创建包含多个像素点的第一图像。
在一些实施例中,处理子单元8012,用于:
并行确定第一参考数量的像素点的颜色值,颜色值基于像素点对应的图像区域中包括的像素点的颜色值确定;
创建第一图像,第一图像包含第一参考数量的像素点的颜色值。
在一些实施例中,处理子单元8012,用于:
并行获取每个图像区域中的像素点的颜色值的平均值;
获取第一参考数量的像素点的颜色值,颜色值为与像素点对应的图像区域中像素点的颜色值的平均值。
在一些实施例中,第一划分子单元8011,用于:
确定第一尺寸,第一尺寸基于第三图像的尺寸和第一参考数量得到;
从第三图像中划分满足第一尺寸的图像区域。
在一些实施例中,参见图9,提取单元804,包括:
第二划分子单元8041,用于将第二图像划分为第二参考数量的图像区域,第二参考数量的图像区域的尺寸相同;
像素点提取子单元8042,用于获取第二参考数量的像素点,第二参考数量的像素点为从第二参考数量的图像区域中每个图像区域中提取的任一个像素点;
颜色值提取子单元8043,用于提取第二参考数量的像素点的颜色值。
在一些实施例中,第二划分子单元8041,用于:
确定第二尺寸,第二尺寸基于第二图像的尺寸和第二参考数量得到;
从第二图像中划分满足第二尺寸的图像区域。
在一些实施例中，第二参考数量的图像区域在第一图像中具有对应的图像区域，提取单元804，用于：
确定第一图像中至少一个第一图像区域的主色调,主色调为从第二图像区域中提取的任一个像素点的颜色值,第二图像区域为第二参考数量的图像区域与第一图像区域对应的图像区域。
关于上述实施例中的装置,其中各个模块执行操作的方式已经在有关该方法的实施例中进行了描述,此处将不做阐述说明。
图10是根据一示例性实施例示出的一种电子设备，如终端的框图。在一些实施例中，该终端1000是便携式移动终端，比如：智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。终端1000还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
终端1000包括有:一个或多个处理器1001和一个或多个存储器1002。
处理器1001可以包括一个或多个处理核心，比如4核心处理器、8核心处理器等。处理器1001可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。处理器1001还包括主处理器和协处理器，主处理器是用于对在唤醒状态下的数据进行处理的处理器，也称CPU(Central Processing Unit,中央处理器)；协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中，处理器1001集成有GPU(Graphics Processing Unit,图形处理器)，GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中，处理器1001还包括AI(Artificial Intelligence,人工智能)处理器，该AI处理器用于处理有关机器学习的计算操作。
存储器1002包括一个或多个计算机可读存储介质。在一些实施例中，该计算机可读存储介质是非暂态的。存储器1002还包括易失性存储器或非易失性存储器，比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中，存储器1002中的非暂态的计算机可读存储介质用于存储至少一个指令，该至少一个指令用于被处理器1001所执行以实现本公开中方法实施例提供的主色调提取方法。
在一些实施例中，终端1000还包括有：外围设备接口1003和至少一个外围设备。处理器1001、存储器1002和外围设备接口1003之间通过总线或信号线相连。各个外围设备通过总线、信号线或电路板与外围设备接口1003相连。在一些实施例中，外围设备包括：射频电路1004、显示屏1005、摄像头组件1006、音频电路1007、定位组件1008或者电源1009中的至少一种。
外围设备接口1003可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1001和存储器1002。在一些实施例中,处理器1001、存储器1002和外围设备接口1003被集成在同一芯片或电路板上;在一些其他实施例中,处理器1001、存储器1002和外围设备接口1003中的任意一个或两个在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1004用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1004通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1004将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。在一些实施例中,射频电路1004包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。射频电路1004通过至少一种无线通信协议来与其它终端进行通信。该无线通信协议包括但不限于:城域网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1004还包括NFC(Near Field Communication,近距离无线通信)有关的电路,本公开对此不加以限定。
显示屏1005用于显示UI(User Interface,用户界面)。该UI包括图形、文本、图标、视频及其它们的任意组合。当显示屏1005是触摸显示屏时，显示屏1005还具有采集在显示屏1005的表面或表面上方的触摸信号的能力。该触摸信号作为控制信号输入至处理器1001进行处理。此时，显示屏1005还用于提供虚拟按钮和/或虚拟键盘，也称软按钮和/或软键盘。在一些实施例中，显示屏1005为一个，设置在终端1000的前面板；在另一些实施例中，显示屏1005为至少两个，分别设置在终端1000的不同表面或呈折叠设计；在另一些实施例中，显示屏1005是柔性显示屏，设置在终端1000的弯曲表面上或折叠面上。甚至，显示屏1005还设置成非矩形的不规则图形，也即异形屏。显示屏1005采用LCD(Liquid Crystal Display,液晶显示屏)、OLED(Organic Light-Emitting Diode,有机发光二极管)等材质制备。
摄像头组件1006用于采集图像或视频。在一些实施例中，摄像头组件1006包括前置摄像头和后置摄像头。前置摄像头设置在终端的前面板，后置摄像头设置在终端的背面。在一些实施例中，后置摄像头为至少两个，分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种，以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其它融合拍摄功能。在一些实施例中，摄像头组件1006还包括闪光灯。闪光灯是单色温闪光灯，或者是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合，用于不同色温下的光线补偿。
音频电路1007包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1001进行处理,或者输入至射频电路1004以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在终端1000的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1001或射频电路1004的电信号转换为声波。扬声器是传统的薄膜扬声器,或者是压电陶瓷扬声器。若扬声器是压电陶瓷扬声器,不仅能够将电信号转换为人类可听见的声波,还能够将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1007还包括耳机插孔。
定位组件1008用于定位终端1000的当前地理位置，以实现导航或LBS(Location Based Service,基于位置的服务)。定位组件1008是基于美国的GPS(Global Positioning System,全球定位系统)、中国的北斗系统、俄罗斯的格洛纳斯系统或欧盟的伽利略系统的定位组件。
电源1009用于为终端1000中的各个组件进行供电。电源1009是交流电、直流电、一次性电池或可充电电池。若电源1009包括可充电电池,该可充电电池支持有线充电或无线充电。该可充电电池还用于支持快充技术。
在一些实施例中，终端1000还包括有一个或多个传感器1010。该一个或多个传感器1010包括但不限于：加速度传感器1011、陀螺仪传感器1012、压力传感器1013、指纹传感器1014、光学传感器1015以及接近传感器1016。
加速度传感器1011可以检测以终端1000建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1011用于检测重力加速度在三个坐标轴上的分量。处理器1001可以根据加速度传感器1011采集的重力加速度信号,控制显示屏1005以横向视图或纵向视图进行用户界面的显示。加速度传感器1011还用于游戏或者用户的运动数据的采集。
陀螺仪传感器1012检测终端1000的机体方向及转动角度，陀螺仪传感器1012与加速度传感器1011协同采集用户对终端1000的3D动作。处理器1001根据陀螺仪传感器1012采集的数据，实现如下功能：动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器1013设置在终端1000的侧边框和/或显示屏1005的下层。若压力传感器1013设置在终端1000的侧边框,检测用户对终端1000的握持信号,由处理器1001根据压力传感器1013采集的握持信号进行左右手识别或快捷操作。若压力传感器1013设置在显示屏1005的下层,由处理器1001根据用户对显示屏1005的压力操作,实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件或者菜单控件中的至少一种。
指纹传感器1014用于采集用户的指纹,由处理器1001根据指纹传感器1014采集到的指纹识别用户的身份,或者,由指纹传感器1014根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时,由处理器1001授权该用户具有相关的敏感操作,该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器1014可以被设置终端1000的正面、背面或侧面。当终端1000上设置有物理按键或厂商Logo时,指纹传感器1014可以与物理按键或厂商标志集成在一起。
光学传感器1015用于采集环境光强度。在一个实施例中,处理器1001根据光学传感器1015采集的环境光强度,控制显示屏1005的显示亮度。若环境光强度较高,调高显示屏1005的显示亮度;若环境光强度较低,调低显示屏1005的显示亮度。在另一个实施例中,处理器1001还根据光学传感器1015采集的环境光强度,动态调整摄像头组件1006的拍摄参数。
接近传感器1016,也称距离传感器,设置在终端1000的前面板。接近传感器1016用于采集用户与终端1000的正面之间的距离。在一个实施例中,若接近传感器1016检测到用户与终端1000的正面之间的距离逐渐变小,由处理器1001控制显示屏1005从亮屏状态切换为息屏状态;若接近传感器1016检测到用户与终端1000的正面之间的距离逐渐变大,由处理器1001控制显示屏1005从息屏状态切换为亮屏状态。
本领域技术人员可以理解,图10中示出的结构并不构成对终端1000的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
在示例性实施例中,还提供了一种电子设备,该电子设备包括:一个或多个处理器;用于存储一个或多个处理器可执行命令的易失性或非易失性存储器;其中,一个或多个处理器被配置为执行以下步骤:
获取第一图像;
对于第一图像中的每个像素点,并行获取每个像素点的第一颜色值,第一颜色值基于每个像素点与第一图像中多个其他像素点的距离、多个其他像素点的原颜色值和每个像素点的原颜色值确定;
获取第二图像,第二图像基于每个像素点的第一颜色值得到;
确定第一图像的主色调,主色调为第二图像中至少一个像素点的颜色值。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
对于任一像素点,并行获取像素点的多个第二颜色值,多个第二颜色值为像素点相对于每个其他像素点的混合颜色值,一个第二颜色值根据像素点与一个其他像素点之间的距离、像素点的颜色值和一个其他像素点的颜色值确定;
获取像素点的第一颜色值,第一颜色值为多个第二颜色值之和。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同;
并行获取第一参考数量的像素点,第一参考数量的像素点通过并行对每个图像区域进行降采样处理得到;
获取第一图像,第一图像由第一参考数量的像素点构成。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
并行确定第一参考数量的像素点的颜色值,颜色值基于像素点对应的图像区域中包括的像素点的颜色值确定;
创建第一图像,第一图像包含第一参考数量的像素点的颜色值。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
并行获取每个图像区域中的像素点的颜色值的平均值;
获取第一参考数量的像素点的颜色值,颜色值为与像素点对应的图像区域中像素点的颜色值的平均值。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
确定第一尺寸,第一尺寸基于第三图像的尺寸和第一参考数量得到;
从第三图像中划分满足第一尺寸的图像区域。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
将第二图像划分为第二参考数量的图像区域,第二参考数量的图像区域的尺寸相同;
获取第二参考数量的像素点，第二参考数量的像素点为从第二参考数量的图像区域中每个图像区域中提取的任一个像素点；
提取第二参考数量的像素点的颜色值。
在一些实施例中,一个或多个处理器被配置为执行以下步骤:
确定第二尺寸,第二尺寸基于第二图像的尺寸和第二参考数量得到;
从第二图像中划分满足第二尺寸的图像区域。
在一些实施例中,第二参考数量的图像区域在第一图像中具有对应的图像区域,一个或多个处理器被配置为执行以下步骤:
确定第一图像中至少一个第一图像区域的主色调,主色调为从第二图像区域中提取的任一个像素点的颜色值,第二图像区域为第二参考数量的图像区域与第一图像区域对应的图像区域。
在示例性实施例中,还提供了一种非临时性计算机可读存储介质,当存储介质中的指令由电子设备的处理器执行时,使得电子设备能够执行以下步骤:
获取第一图像;
对于第一图像中的每个像素点,并行获取每个像素点的第一颜色值,第一颜色值基于每个像素点与第一图像中多个其他像素点的距离、多个其他像素点的原颜色值和每个像素点的原颜色值确定;
获取第二图像,第二图像基于每个像素点的第一颜色值得到;
确定第一图像的主色调,主色调为第二图像中至少一个像素点的颜色值。
在一些实施例中,使得电子设备能够执行以下步骤:
对于任一像素点,并行获取像素点的多个第二颜色值,多个第二颜色值为像素点相对于每个其他像素点的混合颜色值,一个第二颜色值根据像素点与一个其他像素点之间的距离、像素点的颜色值和一个其他像素点的颜色值确定;
获取像素点的第一颜色值,第一颜色值为多个第二颜色值之和。
在一些实施例中,使得电子设备能够执行以下步骤:
将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同;
并行获取第一参考数量的像素点,第一参考数量的像素点通过并行对每个图像区域进行降采样处理得到;
获取第一图像,第一图像由第一参考数量的像素点构成。
在一些实施例中,使得电子设备能够执行以下步骤:
并行确定第一参考数量的像素点的颜色值，颜色值基于像素点对应的图像区域中包括的像素点的颜色值确定；
创建第一图像,第一图像包含第一参考数量的像素点的颜色值。
在一些实施例中,使得电子设备能够执行以下步骤:
并行获取每个图像区域中的像素点的颜色值的平均值;
获取第一参考数量的像素点的颜色值,颜色值为与像素点对应的图像区域中像素点的颜色值的平均值。
在一些实施例中,使得电子设备能够执行以下步骤:
确定第一尺寸,第一尺寸基于第三图像的尺寸和第一参考数量得到;
从第三图像中划分满足第一尺寸的图像区域。
在一些实施例中,使得电子设备能够执行以下步骤:
将第二图像划分为第二参考数量的图像区域,第二参考数量的图像区域的尺寸相同;
获取第二参考数量的像素点,第二参考数量的像素点为从第二参考数量的图像区域中每个图像区域中提取的任一个像素点;
提取第二参考数量的像素点的颜色值。
在一些实施例中,使得电子设备能够执行以下步骤:
确定第二尺寸,第二尺寸基于第二图像的尺寸和第二参考数量得到;
从第二图像中划分满足第二尺寸的图像区域。
在一些实施例中,第二参考数量的图像区域在第一图像中具有对应的图像区域,使得电子设备能够执行以下步骤:
确定第一图像中至少一个第一图像区域的主色调,主色调为从第二图像区域中提取的任一个像素点的颜色值,第二图像区域为第二参考数量的图像区域与第一图像区域对应的图像区域。
在示例性实施例中,还提供了一种计算机程序产品,当计算机程序产品中的指令由电子设备的处理器执行时,使得电子设备能够执行以下步骤:
获取第一图像;
对于第一图像中的每个像素点,并行获取每个像素点的第一颜色值,第一颜色值基于每个像素点与第一图像中多个其他像素点的距离、多个其他像素点的原颜色值和每个像素点的原颜色值确定;
获取第二图像,第二图像基于每个像素点的第一颜色值得到;
确定第一图像的主色调,主色调为第二图像中至少一个像素点的颜色值。
在一些实施例中,使得电子设备能够执行以下步骤:
对于任一像素点,并行获取像素点的多个第二颜色值,多个第二颜色值为像素点相对于每个其他像素点的混合颜色值,一个第二颜色值根据像素点与一个其他像素点之间的距离、像素点的颜色值和一个其他像素点的颜色值确定;
获取像素点的第一颜色值,第一颜色值为多个第二颜色值之和。
在一些实施例中,使得电子设备能够执行以下步骤:
将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同;
并行获取第一参考数量的像素点,第一参考数量的像素点通过并行对每个图像区域进行降采样处理得到;
获取第一图像,第一图像由第一参考数量的像素点构成。
在一些实施例中,使得电子设备能够执行以下步骤:
并行确定第一参考数量的像素点的颜色值,颜色值基于像素点对应的图像区域中包括的像素点的颜色值确定;
创建第一图像,第一图像包含第一参考数量的像素点的颜色值。
在一些实施例中,使得电子设备能够执行以下步骤:
并行获取每个图像区域中的像素点的颜色值的平均值;
获取第一参考数量的像素点的颜色值,颜色值为与像素点对应的图像区域中像素点的颜色值的平均值。
在一些实施例中,使得电子设备能够执行以下步骤:
确定第一尺寸,第一尺寸基于第三图像的尺寸和第一参考数量得到;
从第三图像中划分满足第一尺寸的图像区域。
在一些实施例中,使得电子设备能够执行以下步骤:
将第二图像划分为第二参考数量的图像区域,第二参考数量的图像区域的尺寸相同;
获取第二参考数量的像素点,第二参考数量的像素点为从第二参考数量的图像区域中每个图像区域中提取的任一个像素点;
提取第二参考数量的像素点的颜色值。
在一些实施例中,使得电子设备能够执行以下步骤:
确定第二尺寸,第二尺寸基于第二图像的尺寸和第二参考数量得到;
从第二图像中划分满足第二尺寸的图像区域。
在一些实施例中，第二参考数量的图像区域在第一图像中具有对应的图像区域，使得电子设备能够执行以下步骤：
确定第一图像中至少一个第一图像区域的主色调,主色调为从第二图像区域中提取的任一个像素点的颜色值,第二图像区域为第二参考数量的图像区域与第一图像区域对应的图像区域。
本领域技术人员在考虑说明书及实践这里的公开后,将容易想到本公开的其它实施方案。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由下面的权利要求指出。
应当理解的是,本公开并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本公开的范围仅由所附的权利要求来限制。

Claims (12)

  1. 一种主色调提取方法,其特征在于,应用于电子设备,所述方法包括:
    获取第一图像;
    对于所述第一图像中的每个像素点,并行获取每个所述像素点的第一颜色值,所述第一颜色值基于每个所述像素点与所述第一图像中多个其他像素点的距离、所述多个其他像素点的原颜色值和每个所述像素点的原颜色值确定;
    获取第二图像,所述第二图像基于每个所述像素点的第一颜色值得到;
    确定所述第一图像的主色调,所述主色调为所述第二图像中至少一个像素点的颜色值。
  2. 根据权利要求1所述的方法,其特征在于,所述对于所述第一图像中的每个像素点,并行获取每个所述像素点的第一颜色值,包括:
    对于任一像素点,并行获取所述像素点的多个第二颜色值,所述多个第二颜色值为所述像素点相对于每个其他像素点的混合颜色值,一个所述第二颜色值根据所述像素点与一个所述其他像素点之间的距离、所述像素点的颜色值和一个所述其他像素点的颜色值确定;
    获取所述像素点的第一颜色值,所述第一颜色值为所述多个第二颜色值之和。
  3. 根据权利要求1所述的方法,其特征在于,所述获取第一图像,包括,
    将第三图像划分为第一参考数量的图像区域,每个图像区域的尺寸相同;
    并行获取所述第一参考数量的像素点,所述第一参考数量的像素点通过并行对所述每个图像区域进行降采样处理得到;
    获取所述第一图像,所述第一图像由所述第一参考数量的像素点构成。
  4. 根据权利要求3所述的方法,其特征在于,所述获取所述第一参考数量的像素点,包括:
    并行确定所述第一参考数量的像素点的颜色值,所述颜色值基于所述像素点对应的图像区域中包括的像素点的颜色值确定;
    创建所述第一图像,所述第一图像包含所述第一参考数量的像素点的颜色值。
  5. 根据权利要求4所述的方法,其特征在于,所述并行确定所述第一参考数量的像素点的颜色值,包括:
    并行获取每个所述图像区域中的像素点的颜色值的平均值;
    获取所述第一参考数量的像素点的颜色值,所述颜色值为与所述像素点对应的图像区域中像素点的颜色值的平均值。
  6. 根据权利要求3所述的方法,其特征在于,所述将第三图像划分为第一参考数量的图像区域,包括:
    确定第一尺寸,所述第一尺寸基于所述第三图像的尺寸和所述第一参考数量得到;
    从所述第三图像中划分满足所述第一尺寸的图像区域。
  7. 根据权利要求1所述的方法,其特征在于,所述从所述第二图像中提取至少一个像素点的颜色值,包括:
    将所述第二图像划分为第二参考数量的图像区域,所述第二参考数量的图像区域的尺寸相同;
    获取所述第二参考数量的像素点,所述第二参考数量的像素点为从所述第二参考数量的图像区域中每个图像区域中提取的任一个像素点;
    提取所述第二参考数量的像素点的颜色值。
  8. 根据权利要求7所述的方法,其特征在于,所述将所述第二图像划分为第二参考数量的图像区域,包括:
    确定第二尺寸,所述第二尺寸基于所述第二图像的尺寸和所述第二参考数量得到;
    从所述第二图像中划分满足所述第二尺寸的图像区域。
  9. 根据权利要求7所述的方法,其特征在于,所述第二参考数量的图像区域在所述第一图像中具有对应的图像区域,所述确定所述第一图像的主色调,包括:
    确定所述第一图像中至少一个第一图像区域的主色调,所述主色调为从第二图像区域中提取的任一个像素点的颜色值,所述第二图像区域为所述第二参考数量的图像区域与所述第一图像区域对应的图像区域。
  10. 一种主色调提取装置,其特征在于,所述装置包括:
    图像获取单元,用于获取第一图像;
    颜色值获取单元,用于对于所述第一图像中的每个像素点,并行获取每个所述像素点的第一颜色值,所述第一颜色值基于每个所述像素点与所述第一图像中多个其他像素点的距离、所述多个其他像素点的原颜色值和每个所述像素点的原颜色值确定;
    生成单元,用于获取第二图像,所述第二图像基于每个所述像素点的第一颜色值得到;
    提取单元,用于确定所述第一图像的主色调,所述主色调为所述第二图像中至少一个像素点的颜色值。
  11. 一种电子设备,其特征在于,所述电子设备包括:
    一个或多个处理器;
    用于存储所述一个或多个处理器可执行命令的易失性或非易失性存储器;
    其中,所述一个或多个处理器被配置为执行如权利要求1-9任一项权利要求所述的主色调提取方法。
  12. 一种非临时性计算机可读存储介质,其特征在于,当所述存储介质中的指令由电子设备的处理器执行时,使得所述电子设备能够执行如权利要求1-9任一项权利要求所述的主色调提取方法。
PCT/CN2020/127558 2020-06-01 2020-11-09 主色调提取方法及装置 WO2021243955A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010485775.8A CN113763486B (zh) 2020-06-01 2020-06-01 主色调提取方法、装置、电子设备及存储介质
CN202010485775.8 2020-06-01

Publications (1)

Publication Number Publication Date
WO2021243955A1 true WO2021243955A1 (zh) 2021-12-09

Family

ID=78782666

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/127558 WO2021243955A1 (zh) 2020-06-01 2020-11-09 主色调提取方法及装置

Country Status (2)

Country Link
CN (1) CN113763486B (zh)
WO (1) WO2021243955A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020090133A1 (en) * 2000-11-13 2002-07-11 Kim Sang-Kyun Method and apparatus for measuring color-texture distance, and method and apparatus for sectioning image into plurality of regions using measured color-texture distance
CN1977542A (zh) * 2004-06-30 2007-06-06 皇家飞利浦电子股份有限公司 利用感知规律提取主色以产生来自视频内容的环境光
CN102523367A (zh) * 2011-12-29 2012-06-27 北京创想空间商务通信服务有限公司 基于多调色板的实时图像压缩和还原方法
CN103761303A (zh) * 2014-01-22 2014-04-30 广东欧珀移动通信有限公司 一种图片的排列显示方法及装置
EP2806403A1 (en) * 2013-05-23 2014-11-26 Thomson Licensing Method and device for processing a picture
CN106898026A (zh) * 2017-03-15 2017-06-27 腾讯科技(深圳)有限公司 一种图片的主色调提取方法和装置
CN109472832A (zh) * 2018-10-15 2019-03-15 广东智媒云图科技股份有限公司 一种配色方案生成方法、装置及智能机器人

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104461485B (zh) * 2013-09-17 2019-03-15 腾讯科技(深圳)有限公司 一种窗体着色的方法及用户设备
CN105989799B (zh) * 2015-02-12 2018-07-20 西安诺瓦电子科技有限公司 图像处理方法及图像处理装置
CN106780634B (zh) * 2016-12-27 2019-06-18 努比亚技术有限公司 图片主色调提取方法及装置
CN110825968B (zh) * 2019-11-04 2024-02-13 腾讯科技(深圳)有限公司 信息推送方法、装置、存储介质和计算机设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020090133A1 (en) * 2000-11-13 2002-07-11 Kim Sang-Kyun Method and apparatus for measuring color-texture distance, and method and apparatus for sectioning image into plurality of regions using measured color-texture distance
CN1977542A (zh) * 2004-06-30 2007-06-06 皇家飞利浦电子股份有限公司 利用感知规律提取主色以产生来自视频内容的环境光
CN102523367A (zh) * 2011-12-29 2012-06-27 北京创想空间商务通信服务有限公司 基于多调色板的实时图像压缩和还原方法
EP2806403A1 (en) * 2013-05-23 2014-11-26 Thomson Licensing Method and device for processing a picture
CN103761303A (zh) * 2014-01-22 2014-04-30 广东欧珀移动通信有限公司 一种图片的排列显示方法及装置
CN106898026A (zh) * 2017-03-15 2017-06-27 腾讯科技(深圳)有限公司 一种图片的主色调提取方法和装置
CN109472832A (zh) * 2018-10-15 2019-03-15 广东智媒云图科技股份有限公司 一种配色方案生成方法、装置及智能机器人

Also Published As

Publication number Publication date
CN113763486A (zh) 2021-12-07
CN113763486B (zh) 2024-03-01

Similar Documents

Publication Publication Date Title
KR102635373B1 (ko) 이미지 처리 방법 및 장치, 단말 및 컴퓨터 판독 가능 저장 매체
WO2021008456A1 (zh) 图像处理方法、装置、电子设备及存储介质
WO2020221012A1 (zh) 图像特征点的运动信息确定方法、任务执行方法和设备
CN111127509B (zh) 目标跟踪方法、装置和计算机可读存储介质
WO2022134632A1 (zh) 作品处理方法及装置
WO2020108041A1 (zh) 耳部关键点检测方法、装置及存储介质
US11386586B2 (en) Method and electronic device for adding virtual item
CN112581358B (zh) 图像处理模型的训练方法、图像处理方法及装置
CN111754386B (zh) 图像区域屏蔽方法、装置、设备及存储介质
CN112565806B (zh) 虚拟礼物赠送方法、装置、计算机设备及介质
WO2023142915A1 (zh) 图像处理方法、装置、设备及存储介质
CN111105474B (zh) 字体绘制方法、装置、计算机设备及计算机可读存储介质
CN108664300B (zh) 一种画中画模式下的应用界面显示方法及装置
CN110992268B (zh) 背景设置方法、装置、终端及存储介质
WO2020125739A1 (zh) 图像修复方法、装置、设备及存储介质
CN111860064B (zh) 基于视频的目标检测方法、装置、设备及存储介质
WO2021218926A1 (zh) 图像显示方法、装置和计算机设备
WO2021243955A1 (zh) 主色调提取方法及装置
CN111488895B (zh) 对抗数据生成方法、装置、设备及存储介质
CN108881739B (zh) 图像生成方法、装置、终端及存储介质
CN108881715B (zh) 拍摄模式的启用方法、装置、终端及存储介质
CN110660031B (zh) 图像锐化方法及装置、存储介质
CN111381765B (zh) 文本框的显示方法、装置、计算机设备及存储介质
CN111091512B (zh) 图像处理方法及装置、计算机可读存储介质
CN111953767B (zh) 内容分享的方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20938860

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.03.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20938860

Country of ref document: EP

Kind code of ref document: A1