CN110555443B - Color classification method, device and storage medium

Publication number
CN110555443B
Authority
CN
China
Prior art keywords
color gamut, point, points, color, pixel
Legal status
Active
Application number
CN201810557323.9A
Other languages
Chinese (zh)
Other versions
CN110555443A (en)
Inventor
田仁富
刘刚
曾峰
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201810557323.9A
Publication of CN110555443A
Application granted
Publication of CN110555443B


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/23 - Clustering techniques
    • G06F 18/232 - Non-hierarchical techniques
    • G06F 18/2321 - Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213 - Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/40 - Extraction of image or video features
    • G06V 10/56 - Extraction of image or video features relating to colour


Abstract

The invention discloses a color classification method, a color classification device and a storage medium, and belongs to the technical field of image processing. The method comprises the following steps: determining a plurality of color gamut points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and classifying the plurality of color gamut points through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets. After the N color gamut point sets are sorted in descending order of the color temperatures of the N central color gamut points, the color categories existing in the initial image can be determined according to the boundary value between any two adjacent color gamut point sets, so that the initial image can be processed according to the determined color categories.

Description

Color classification method, device and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a color classification method and apparatus, and a storage medium.
Background
Since an image may contain many color classes, it is generally necessary to classify the image by color, that is, to determine which color classes are included in the image, so that the image can be processed according to the determined color classes.
Disclosure of Invention
The embodiment of the invention provides a color classification method, a color classification device and a storage medium, which can be used for determining color classes included in an image. The technical scheme is as follows:
in a first aspect, a method for color classification is provided, the method comprising:
determining a plurality of color gamut points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image;
determining the color temperature of each color gamut point according to the coordinates of each color gamut point in the plurality of color gamut points and the color temperature of each standard color gamut point in the plurality of standard color gamut points marked in the color gamut coordinate system;
classifying the plurality of color gamut points through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets, wherein each color gamut point set comprises at least one color gamut point, and N is a positive integer greater than or equal to 2;
sorting the N color gamut point sets in descending order of the color temperatures of N central color gamut points, and determining a boundary value between any two adjacent color gamut point sets after sorting to obtain N-1 boundary values, wherein the N central color gamut points are the color gamut points located at the center positions of the areas enclosed by the color gamut points included in each of the N color gamut point sets;
and if the N-1 boundary values are all larger than a threshold value, determining that one color class exists in the initial image, and if any boundary value of the N-1 boundary values is smaller than or equal to the threshold value, determining that at least two color classes exist in the initial image.
Optionally, the classifying the plurality of color gamut points according to the color temperature of each color gamut point in the plurality of color gamut points through a clustering algorithm to obtain N sets of color gamut points includes:
determining a first color gamut point and a second color gamut point, wherein the first color gamut point is a color gamut point with the largest color temperature in the plurality of color gamut points, and the second color gamut point is a color gamut point with the smallest color temperature in the plurality of color gamut points;
when the N is 2, clustering the plurality of color gamut points through a K-means clustering algorithm according to the first color gamut point and the second color gamut point to obtain N color gamut point sets;
when the N is larger than or equal to 3, determining N-2 coordinate points according to the coordinates of the first color gamut point and the coordinates of the second color gamut point, wherein the N-2 coordinate points refer to coordinate points which divide a connecting line between the first color gamut point and the second color gamut point into N-1 equal parts, selecting a color gamut point which is closest to each coordinate point in the N-2 coordinate points from the plurality of color gamut points to obtain N-2 third color gamut points, and clustering the plurality of color gamut points according to the first color gamut point, the second color gamut point and the N-2 third color gamut points through a K-means clustering algorithm to obtain N color gamut point sets.
Optionally, the determining a boundary value between any two adjacent color gamut point sets after sorting includes:
determining a distance between the color gamut point at the center position of a region enclosed by the color gamut points included in a first color gamut point set and the color gamut point at the center position of a region enclosed by the color gamut points included in a second color gamut point set to obtain a first distance, wherein the first color gamut point set and the second color gamut point set are any two adjacent color gamut point sets after sorting;
combining any color gamut point in the first color gamut point set with any color gamut point in the second color gamut point set to obtain a plurality of color gamut point pairs;
determining a distance between two color gamut points included in each of the plurality of color gamut point pairs to obtain a plurality of distances corresponding to the plurality of color gamut point pairs one to one;
determining a ratio between the first distance and a second distance as the boundary value between the first color gamut point set and the second color gamut point set, the second distance being the smallest distance among the plurality of distances.
Optionally, after determining that a color class exists in the initial image, the method further includes:
determining the color temperature weight of each color gamut point in the plurality of color gamut points, and determining the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points;
according to the color temperature weight of each pixel point in the initial image, respectively carrying out weighted summation on the R pixel values, the G pixel values and the B pixel values of all the pixel points in the initial image to obtain the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image;
and performing white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
Optionally, after determining that at least two color categories exist in the initial image, the method further includes:
if the boundary value between the first two color gamut point sets after sorting is smaller than or equal to the threshold value, selecting P color gamut point sets from the N color gamut point sets, wherein the P color gamut point sets do not comprise the first color gamut point set after sorting, determining the pixel points corresponding to the color gamut points in the P color gamut point sets in the initial image, and determining an R pixel sum value, a G pixel sum value and a B pixel sum value of the initial image according to the determined pixel points, P being a positive integer greater than or equal to 1;
if the boundary value between the first two color gamut point sets after sorting is larger than the threshold value, determining the color temperature weight of each color gamut point in the plurality of color gamut points, determining the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points, and respectively performing weighted summation on the R pixel values, the G pixel values and the B pixel values of all the pixel points in the initial image according to the color temperature weight of each pixel point in the initial image to obtain an R pixel sum value, a G pixel sum value and a B pixel sum value of the initial image;
and performing white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
Optionally, the determining a color temperature weight of each of the plurality of color gamut points includes:
selecting a maximum color temperature and a minimum color temperature from the color temperatures of the plurality of color gamut points;
determining the color temperature weight of each color gamut point in the plurality of color gamut points through a preset formula;
the preset formula is as follows: WT = W × (T - Tmax) / (Tmin - Tmax);
WT is a color temperature weight of any one of the plurality of color gamut points, W is a preset weight value, T is a color temperature of any one of the plurality of color gamut points, Tmax is the maximum color temperature, and Tmin is the minimum color temperature.
Optionally, the determining, according to the R pixel value, the G pixel value, and the B pixel value of each pixel point in the initial image, a plurality of color gamut points in the color gamut coordinate system includes:
dividing the initial image into a plurality of image areas;
determining an R pixel value, a G pixel value and a B pixel value of each image area, wherein the R pixel value, the G pixel value and the B pixel value of each image area are respectively the average values of the R pixel value, the G pixel value and the B pixel value of all pixel points included in each image area;
and determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each image area, and taking the plurality of coordinate points as the plurality of color gamut points, wherein each coordinate point corresponds to one image area.
Optionally, the determining, according to the R pixel value, the G pixel value, and the B pixel value of each pixel point in the initial image, a plurality of color gamut points in the color gamut coordinate system includes:
and determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and taking the plurality of coordinate points as the plurality of color gamut points, wherein each coordinate point corresponds to one pixel point.
Optionally, the color gamut coordinate system includes a plurality of regions of interest with equal areas, and the plurality of regions of interest are preset by a user;
after determining the plurality of coordinate points in the color gamut coordinate system, the method further includes:
screening M interested areas from the plurality of interested areas, wherein each screened interested area comprises at least one coordinate point, and M is a positive integer greater than or equal to 2;
and determining M interesting coordinate points according to the M interesting regions, taking the M interesting coordinate points as the plurality of color gamut points, wherein each interesting coordinate point corresponds to one interesting region, and each interesting coordinate point is a coordinate point at the center position of the corresponding interesting region.
In a second aspect, there is provided a color sorting apparatus, the apparatus comprising:
the first determining module is used for determining a plurality of color gamut points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image;
a second determining module, configured to determine a color temperature of each color gamut point according to the coordinates of each color gamut point in the plurality of color gamut points and the color temperature of each standard color gamut point in the plurality of standard color gamut points marked in the color gamut coordinate system;
the classification module is used for classifying the plurality of color gamut points through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets, each color gamut point set comprises at least one color gamut point, and N is a positive integer greater than or equal to 2;
a third determining module, configured to sort the N color gamut point sets in descending order of the color temperatures of the N central color gamut points, and determine a boundary value between any two adjacent color gamut point sets after sorting to obtain N-1 boundary values, where the N central color gamut points are the color gamut points located at the center positions of the areas enclosed by the color gamut points included in each of the N color gamut point sets;
a fourth determining module, configured to determine that one color category exists in the initial image if the N-1 boundary values are all greater than the threshold value, and determine that at least two color categories exist in the initial image if any boundary value of the N-1 boundary values is less than or equal to the threshold value.
Optionally, the classification module includes:
a first determining unit, configured to determine a first color gamut point and a second color gamut point, where the first color gamut point is a color gamut point with a largest color temperature among the plurality of color gamut points, and the second color gamut point is a color gamut point with a smallest color temperature among the plurality of color gamut points;
the first clustering unit is used for clustering the plurality of color gamut points through a K-means clustering algorithm according to the first color gamut point and the second color gamut point to obtain N color gamut point sets when N is 2;
and the second clustering unit is used for determining N-2 coordinate points according to the coordinates of the first color gamut point and the coordinates of the second color gamut point when N is greater than or equal to 3, wherein the N-2 coordinate points refer to the coordinate points which divide a connecting line between the first color gamut point and the second color gamut point into N-1 equal parts, selecting the color gamut point which is closest to each coordinate point in the N-2 coordinate points from the plurality of color gamut points to obtain N-2 third color gamut points, and clustering the plurality of color gamut points according to the first color gamut point, the second color gamut point and the N-2 third color gamut points by using the K-means clustering algorithm to obtain the N color gamut point sets.
Optionally, the third determining module is specifically configured to:
determining a distance between a color gamut point at a center position of a region surrounded by color gamut points included in a first color gamut point set and a color gamut point at a center position of a region surrounded by color gamut points included in a second color gamut point set to obtain a first distance, wherein the first color gamut point set and the second color gamut point set are any two adjacent color gamut point sets after being sorted;
combining any color gamut point in the first color gamut point set with any color gamut point in the second color gamut point set to obtain a plurality of color gamut point pairs;
determining a distance between two color gamut points included in each of the plurality of color gamut point pairs to obtain a plurality of distances corresponding to the plurality of color gamut point pairs one to one;
determining a ratio between the first distance and a second distance as the boundary value between the first color gamut point set and the second color gamut point set, the second distance being the smallest distance among the plurality of distances.
Optionally, the apparatus further comprises:
a fifth determining module, configured to determine a color temperature weight of each color gamut point in the plurality of color gamut points, and determine a color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points;
the weighted summation module is used for respectively carrying out weighted summation on the R pixel values, the G pixel values and the B pixel values of all the pixel points in the initial image according to the color temperature weight of each pixel point in the initial image to obtain the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image;
and the white balance correction module is used for carrying out white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
Optionally, the apparatus further comprises:
a sixth determining module, configured to select P color gamut point sets from the N color gamut point sets if the boundary value between the first two color gamut point sets after sorting is smaller than or equal to the threshold value, where the P color gamut point sets do not include the first color gamut point set after sorting, determine the pixel points in the initial image corresponding to the color gamut points in the P color gamut point sets, and determine an R pixel sum value, a G pixel sum value and a B pixel sum value of the initial image according to the determined pixel points, where P is a positive integer greater than or equal to 1;
a fifth determining module, configured to determine the color temperature weight of each color gamut point in the plurality of color gamut points if the boundary value between the first two color gamut point sets after sorting is greater than the threshold value, and determine the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points;
the weighted summation module is used for respectively carrying out weighted summation on the R pixel values, the G pixel values and the B pixel values of all the pixel points in the initial image according to the color temperature weight of each pixel point in the initial image to obtain the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image;
and the white balance correction module is used for carrying out white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
Optionally, the fifth determining module is specifically configured to:
selecting a maximum color temperature and a minimum color temperature from the color temperatures of the plurality of color gamut points;
determining the color temperature weight of each color gamut point in the plurality of color gamut points through a preset formula;
the preset formula is as follows: WT = W × (T - Tmax) / (Tmin - Tmax);
WT is a color temperature weight of any one of the plurality of color gamut points, W is a preset weight value, T is a color temperature of any one of the plurality of color gamut points, Tmax is the maximum color temperature, and Tmin is the minimum color temperature.
Optionally, the first determining module is specifically configured to:
dividing the initial image into a plurality of image areas;
determining an R pixel value, a G pixel value and a B pixel value of each image area, wherein the R pixel value, the G pixel value and the B pixel value of each image area are respectively the average values of the R pixel value, the G pixel value and the B pixel value of all pixel points included in each image area;
and determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each image area, and taking the plurality of coordinate points as the plurality of color gamut points, wherein each coordinate point corresponds to one image area.
Optionally, the first determining module is specifically configured to:
and determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and taking the plurality of coordinate points as the plurality of color gamut points, wherein each coordinate point corresponds to one pixel point.
Optionally, the color gamut coordinate system includes a plurality of regions of interest with equal areas, and the plurality of regions of interest are preset by a user;
the first determining module is further specifically configured to:
screening M interested areas from the plurality of interested areas, wherein each screened interested area comprises at least one coordinate point, and M is a positive integer greater than or equal to 2;
and determining M interesting coordinate points according to the M interesting regions, taking the M interesting coordinate points as the plurality of color gamut points, wherein each interesting coordinate point corresponds to one interesting region, and each interesting coordinate point is a coordinate point at the center position of the corresponding interesting region.
In a third aspect, there is provided a color classification apparatus, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of any of the methods of the first aspect described above.
In a fourth aspect, a computer-readable storage medium is provided, having instructions stored thereon, which when executed by a processor, implement the steps of any of the methods of the first aspect described above.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the steps of any of the methods of the first aspect described above.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
in the embodiment of the invention, a plurality of color gamut points in a color gamut coordinate system are determined according to the R pixel value, the G pixel value and the B pixel value of each pixel point in an initial image, and the plurality of color gamut points are classified through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets. After the N color gamut point sets are sorted in descending order of the color temperatures of the N central color gamut points, the color categories existing in the initial image can be determined according to the boundary value between any two adjacent color gamut point sets, so that the initial image can be processed according to the determined color categories.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
FIG. 1 is a flow chart of a color classification method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a color sorting apparatus according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 is a flow chart of a color classification method according to an embodiment of the present invention. As shown in Fig. 1, the color classification method includes the following steps:
step 101: and determining a plurality of color gamut points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image.
In the embodiment of the present invention, the step 101 specifically has the following three implementation manners:
(1) in a first implementation manner, an initial image is divided into a plurality of image areas, an R pixel value, a G pixel value, and a B pixel value of each image area are determined, the R pixel value, the G pixel value, and the B pixel value of each image area are respectively an average value of the R pixel value, the G pixel value, and the B pixel value of all pixel points included in each image area, a plurality of coordinate points in a color gamut coordinate system are determined according to the R pixel value, the G pixel value, and the B pixel value of each image area, and the plurality of coordinate points are used as a plurality of color gamut points, each coordinate point corresponding to one image area.
In a first implementation manner, in order to reduce the data processing amount in the color classification process and improve the speed of performing color classification on the initial image, the pixel points in the initial image may be aggregated according to the above manner, and then the multiple color gamut points are determined by the pixel points after the aggregation. The aggregation processing refers to processing a certain number of pixel points into one pixel point.
The implementation manner of determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each image region is as follows: and respectively determining a first ratio and a second ratio corresponding to each image area, wherein the first ratio refers to the ratio between the R pixel value and the G pixel value of the corresponding image area, the second ratio refers to the ratio between the B pixel value and the G pixel value of the corresponding image area, the first ratio and the second ratio corresponding to each image area are respectively used as a vertical coordinate and a horizontal coordinate, and a plurality of coordinate points in the color gamut coordinate system are determined according to the vertical coordinate and the horizontal coordinate corresponding to each image area.
For example, assuming that the pixel size of the initial image is 256 × 256, the initial image is divided into 32 × 32 image regions, and at this time, the number of pixels included in each image region is 8 × 8. For any image area, the R pixel value, the G pixel value and the B pixel value of each pixel point in 64 pixel points included in the image area are determined. And determining the average value of the R pixel values of the 64 pixel points to obtain the R pixel value of the image area. And determining the average value of the G pixel values of the 64 pixel points to obtain the G pixel value of the image area. And determining the average value of the B pixel values of the 64 pixel points to obtain the B pixel value of the image area.
After the R pixel value, the G pixel value, and the B pixel value of each of the 32 × 32 image areas are determined, a first ratio and a second ratio corresponding to each image area are determined, so as to determine a coordinate point corresponding to each image area in the color gamut coordinate system according to the first ratio and the second ratio of each image area. That is, each image region has a corresponding coordinate point in the color gamut coordinate system, and the ordinate and the abscissa of the coordinate point corresponding to each image region are the first ratio and the second ratio of the image region, respectively. The determined coordinate points are then taken as a plurality of gamut points.
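For illustration, the following sketch shows one possible realization of this first implementation manner in Python with NumPy: the initial image is divided into regions, each region is averaged, and the region averages are mapped to (B/G, R/G) coordinates. The function name to_gamut_points_by_region and the default of 32 regions per side are assumptions for illustration only, not part of the embodiment.

```python
# Illustrative sketch of the first implementation manner (block averaging plus
# mapping to the color gamut coordinate system). Names are hypothetical.
import numpy as np

def to_gamut_points_by_region(image: np.ndarray, regions_per_side: int = 32) -> np.ndarray:
    """image: H x W x 3 array of R, G, B values; returns one (x, y) gamut point per
    image region, where x = B/G and y = R/G of the region's average pixel values."""
    h, w, _ = image.shape
    rh, rw = h // regions_per_side, w // regions_per_side
    points = []
    for i in range(regions_per_side):
        for j in range(regions_per_side):
            block = image[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw].astype(np.float64)
            r, g, b = block[..., 0].mean(), block[..., 1].mean(), block[..., 2].mean()
            g = max(g, 1e-6)               # avoid division by zero
            points.append((b / g, r / g))  # abscissa = B/G, ordinate = R/G
    return np.array(points)
```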
(2) In a second implementation manner, a plurality of coordinate points in a color gamut coordinate system are determined according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and the plurality of coordinate points are used as a plurality of color gamut points, wherein each coordinate point corresponds to one pixel point.
In the second implementation manner, the pixel points in the initial image are not aggregated, and each pixel point in the initial image is directly mapped into the color gamut coordinate system. That is, for any pixel point, the ratio between the R pixel value and the G pixel value of the pixel point is used as the ordinate, and the ratio between the B pixel value and the G pixel value of the pixel point is used as the abscissa, so as to obtain the coordinates of the coordinate point corresponding to the pixel point in the color gamut coordinate system. At this time, each pixel point in the initial image corresponds to one coordinate point in the color gamut coordinate system.
(3) In a third implementation manner, a plurality of coordinate points in the color gamut coordinate system are determined according to the first implementation manner or the second implementation manner, then M regions of interest are screened out from the plurality of regions of interest, each screened region of interest includes at least one coordinate point, M is a positive integer greater than or equal to 2, M coordinate points of interest are determined according to the M regions of interest, the M coordinate points of interest are used as a plurality of color gamut points, each coordinate point of interest corresponds to one region of interest, and each coordinate point of interest refers to a coordinate point located at a center position of the corresponding region of interest.
In the embodiment of the present invention, in order to further reduce the data processing amount, the user may set regions of interest in the color gamut coordinate system in advance, so as to filter the plurality of color gamut points. This is equivalent to further performing aggregation processing on the plurality of color gamut points, and the aggregation processing is as follows: the regions of interest that include at least one coordinate point are counted, and the coordinate points falling into the same region of interest are represented by the coordinate point at the center position of that region of interest.
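A minimal sketch of this aggregation is given below. It assumes the user-preset regions of interest form a uniform grid of equal square cells over the color gamut coordinate system; the cell size and the function name aggregate_by_roi are illustrative assumptions, since the embodiment only requires equal-area regions preset by the user.

```python
# Illustrative sketch of region-of-interest aggregation over the gamut plane.
import numpy as np

def aggregate_by_roi(points: np.ndarray, cell: float = 0.05) -> np.ndarray:
    """points: array of (x, y) coordinate points; returns one gamut point per
    occupied region of interest, namely the center of that region."""
    cells = np.floor(points / cell).astype(int)   # grid index of each coordinate point
    occupied = np.unique(cells, axis=0)           # regions containing at least one point
    return (occupied + 0.5) * cell                # center coordinates of those regions
```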
Step 102: and determining the color temperature of each color gamut point according to the coordinates of each color gamut point in the plurality of color gamut points and the color temperature of each standard color gamut point in the plurality of standard color gamut points marked in the color gamut coordinate system.
For any color gamut point in the color gamut coordinate system, the color gamut point may represent a color, and for convenience of description, the color temperature of the color represented by the color gamut point is referred to as the color temperature of the color gamut point, and the color temperature is used to identify the corresponding color. Specifically, a plurality of standard color gamut points, each of which has a known color temperature, are marked in the color gamut coordinate system in advance. When the color temperature of a certain color gamut point in the color gamut coordinate system needs to be determined, the relative position relationship between the color gamut point and the plurality of standard color gamut points can be determined according to the coordinates of the color gamut point and the coordinates of the plurality of standard color gamut points marked in advance, and the color temperature of the color gamut point can be determined according to the determined relative position relationship and the color temperature of the standard color gamut points. That is, for any color gamut point in the color gamut coordinate system, when the coordinates of the color gamut point are known, the color temperature of the color gamut point can be determined in the above manner.
It should be noted that, the connecting line between the standard color gamut points marked in the color gamut coordinate system in advance is usually a smooth curve. If a certain color gamut point in the color gamut coordinate system happens to fall on the curve, the color temperature of the color gamut point determined at this time is usually called the true color temperature. If the color gamut point does not fall on the curve, the determined color temperature of the color gamut point is generally referred to as correlated color temperature. In the embodiment of the present invention, for convenience of explanation, the color temperature of the determined color gamut point is collectively referred to as a color temperature regardless of whether the color gamut point falls on the curve, and it is not distinguished whether it is a true color temperature or a correlated color temperature.
In black body radiation, the color of the light radiated by the black body varies with the temperature. For example, as the temperature increases, the color of the light radiated by the black body shows a gradual process of red → orange red → yellow → white → blue-white. Thus, the color temperature of a color means: when a certain color is the same as the color of the light radiated by the black body at a certain temperature, the temperature of the black body is referred to as the color temperature of that color.
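The embodiment does not prescribe how the relative position relationship between a color gamut point and the standard color gamut points is converted into a color temperature. As one plausible reading only, the sketch below interpolates between the two nearest standard color gamut points; the helper name color_temperature and the inverse-distance weighting are assumptions for illustration, not the claimed method.

```python
# Hypothetical color temperature estimate from pre-marked standard gamut points.
import numpy as np

def color_temperature(point: np.ndarray,
                      std_points: np.ndarray,
                      std_temps: np.ndarray) -> float:
    """point: (x, y); std_points: K x 2 coordinates of standard color gamut points;
    std_temps: K known color temperatures of those standard points."""
    d = np.linalg.norm(std_points - point, axis=1)
    i, j = np.argsort(d)[:2]                 # two nearest standard gamut points
    w = d[j] / (d[i] + d[j] + 1e-12)         # closer point gets the larger weight
    return w * std_temps[i] + (1.0 - w) * std_temps[j]
```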
Step 103: classifying the plurality of color gamut points through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets, wherein each color gamut point set comprises at least one color gamut point, and N is a positive integer greater than or equal to 2.
In the embodiment of the invention, a plurality of color gamut points are classified by a K-means (K-means) clustering algorithm. It should be noted that the K-means clustering algorithm is a hard clustering algorithm, and clusters each object according to the distance from each object to a centroid point, and the number of the clustered sets is consistent with the number of the preset centroid points. That is, in K-means clustering, the number of sets after clustering needs to be set first, then an initial centroid point is determined according to the set number, and each object is clustered according to the determined initial centroid point.
Specifically, in the embodiment of the present invention, the color gamut points are classified by the K-means clustering algorithm in the following two scenarios:
in the first scenario, when N is 2, a first color gamut point and a second color gamut point are determined, the first color gamut point is a color gamut point with the largest color temperature among the plurality of color gamut points, the second color gamut point is a color gamut point with the smallest color temperature among the plurality of color gamut points, and the plurality of color gamut points are clustered by a K-means clustering algorithm according to the first color gamut point and the second color gamut point to obtain two sets of color gamut points.
The method for clustering the plurality of color gamut points through the K-means clustering algorithm according to the first color gamut point and the second color gamut point comprises the following steps: for any one of the plurality of color gamut points, the distance between the color gamut point and the first color gamut point and the distance between the color gamut point and the second color gamut point are determined, and according to the two determined distances, the color gamut point is added to the set represented by whichever of the first color gamut point and the second color gamut point is closer. When the above processing is completed for all of the plurality of color gamut points, the first clustering of the plurality of color gamut points is achieved and two sets are obtained. For convenience of description, the set represented by the first color gamut point is referred to as the first set, and the set represented by the second color gamut point is referred to as the second set.
After the first clustering, since the first color gamut point is most likely not at the center position of the region enclosed by the color gamut points included in the first set, and the second color gamut point is also most likely not at the center position of the region enclosed by the color gamut points included in the second set, the centroid points of the first set and the second set need to be re-determined to obtain two centroid points, each of which is the color gamut point at the center position of the region enclosed by the color gamut points included in the corresponding set. The plurality of color gamut points are then clustered a second time according to the two determined centroid points, and this process is repeated until the two centroid points determined from the resulting sets are the same as the two centroid points used for that round of clustering, so that 2 color gamut point sets are obtained.
In the second scenario, when N is greater than or equal to 3, N-2 coordinate points are determined according to the coordinates of the first color gamut point and the coordinates of the second color gamut point, the N-2 coordinate points refer to coordinate points which divide a connecting line between the first color gamut point and the second color gamut point into N-1 equal parts, the color gamut point closest to each coordinate point in the N-2 coordinate points is selected from the plurality of color gamut points to obtain N-2 third color gamut points, and the plurality of color gamut points are clustered through the K-means clustering algorithm according to the first color gamut point, the second color gamut point and the N-2 third color gamut points to obtain N color gamut point sets.
The implementation manner of clustering the plurality of color gamut points by the K-means clustering algorithm according to the first color gamut point, the second color gamut point and the N-2 third color gamut points is basically the same as that in the first scenario, and is not described in detail here.
It should be noted that when N is too large, the speed of the K-means clustering algorithm is seriously affected. In addition, in the embodiment of the present invention, after the color category of the initial image is determined, white balance correction may be performed on the initial image according to the determined color category. Moreover, experiments have shown that a large monochromatic area occupying 1/3 or less of the shooting area has little effect on the white balance correction result, so in practical applications N can be set to 3. That is, the K-means clustering algorithm is used to cluster the plurality of color gamut points to obtain three color gamut point sets.
For example, when N is 3, after the first gamut point and the second gamut point are determined, 1 coordinate point needs to be determined, where the coordinate point is a coordinate point obtained by dividing a connection line between the first gamut point and the second gamut point into 2 equal parts. That is, the coordinate point refers to a coordinate point at a midpoint position of a line connecting the first gamut point and the second gamut point. And then, selecting a color gamut point closest to the coordinate point from the plurality of color gamut points to obtain a third color gamut point, and clustering the plurality of color gamut points through a K-means clustering algorithm to obtain 3 color gamut point sets.
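A sketch of this clustering step is shown below, assuming the scikit-learn KMeans implementation is available; the library performs the repeated centroid updates described above, and the initial centroids follow the selection rule of the two scenarios (the highest- and lowest-color-temperature points plus the N-2 points nearest to the equal-division points of their connecting line). The function name cluster_gamut_points is illustrative.

```python
# Sketch of the clustering of step 103 with explicit initial centroids.
import numpy as np
from sklearn.cluster import KMeans

def cluster_gamut_points(points: np.ndarray, temps: np.ndarray, n: int = 3):
    p_hi = points[np.argmax(temps)]            # first gamut point (largest color temperature)
    p_lo = points[np.argmin(temps)]            # second gamut point (smallest color temperature)
    inits = [p_hi, p_lo]
    for k in range(1, n - 1):                  # N-2 equal-division points on the connecting line
        target = p_hi + (p_lo - p_hi) * k / (n - 1)
        inits.append(points[np.argmin(np.linalg.norm(points - target, axis=1))])
    km = KMeans(n_clusters=n, init=np.array(inits), n_init=1).fit(points)
    return km.labels_, km.cluster_centers_     # set index of each point, final centroids
```

Passing the initial centroids explicitly (with n_init=1) mirrors the text above, where the initial centroid points are fixed rather than chosen at random.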
Step 104: the N color gamut point sets are sorted in descending order of the color temperatures of the N central color gamut points, and the boundary value between any two adjacent color gamut point sets after sorting is determined to obtain N-1 boundary values, where the N central color gamut points are the color gamut points located at the center positions of the areas enclosed by the color gamut points included in each of the N color gamut point sets.
After the N color gamut point sets are determined, if the degree of coincidence between the regions enclosed by some color gamut point sets is relatively large, these color gamut point sets may correspond to one color class. Thus, after the N color gamut point sets are determined, the degree of coincidence between the different color gamut point sets can be determined through step 104. The boundary value is used to characterize the degree of coincidence between two color gamut point sets: a larger boundary value indicates a less pronounced boundary between the two color gamut point sets, and a smaller boundary value indicates a more pronounced boundary between the two color gamut point sets.
Specifically, the boundary value between any two adjacent color gamut point sets after sorting is determined as follows: the distance between the color gamut point at the center position of the region enclosed by the color gamut points included in the first color gamut point set and the color gamut point at the center position of the region enclosed by the color gamut points included in the second color gamut point set is determined to obtain a first distance, where the first color gamut point set and the second color gamut point set are any two adjacent color gamut point sets after sorting; any color gamut point in the first color gamut point set is combined with any color gamut point in the second color gamut point set to obtain a plurality of color gamut point pairs; the distance between the two color gamut points included in each of the plurality of color gamut point pairs is determined to obtain a plurality of distances corresponding to the plurality of color gamut point pairs one to one; and the ratio of the first distance to the second distance is determined as the boundary value between the first color gamut point set and the second color gamut point set, the second distance being the minimum distance among the plurality of distances.
For example, if the first distance is L1 and the second distance is L2, the boundary value between the first color gamut point set and the second color gamut point set is L1/L2.
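The following sketch computes this boundary value for two adjacent color gamut point sets. It approximates the central color gamut point of each set by the mean of that set's points, which is an assumption for illustration; the function name boundary_value is likewise illustrative.

```python
# Sketch of the boundary value: centroid-to-centroid distance divided by the
# smallest distance between any pair of points drawn one from each set.
import numpy as np

def boundary_value(set_a: np.ndarray, set_b: np.ndarray) -> float:
    """set_a, set_b: arrays of (x, y) gamut points of two adjacent sets after sorting."""
    center_a, center_b = set_a.mean(axis=0), set_b.mean(axis=0)
    first_distance = np.linalg.norm(center_a - center_b)              # L1 in the example
    pair_dists = np.linalg.norm(set_a[:, None, :] - set_b[None, :, :], axis=2)
    second_distance = pair_dists.min()                                # L2 in the example
    return first_distance / max(second_distance, 1e-12)
```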
Step 105: if the N-1 boundary values are all larger than the threshold value, it is determined that one color category exists in the initial image; if any boundary value among the N-1 boundary values is smaller than or equal to the threshold value, it is determined that at least two color categories exist in the initial image.
If all of the N-1 boundary values are greater than the threshold value, which indicates that the degree of coincidence between the N color gamut point sets is high, it can be determined that one color class exists in the initial image. If any boundary value among the N-1 boundary values is smaller than or equal to the threshold value, which indicates that the degree of coincidence between certain color gamut point sets is not large, at least two color categories exist in the initial image.
In an embodiment of the present invention, after the color class of the initial image is determined, white balance correction may be performed on the initial image. For the sake of convenience of the following description, the white balance correction is explained here:
when a target object is photographed under a light source, since light emitted from the light source may have a color, there may be a deviation between the color of the photographed target object and the color of the target object itself, for example, when white paper is photographed under a yellow light source, the photographed color may be yellow. Therefore, in actual shooting, it is generally necessary to perform white balance correction on an initial image shot by a camera for a target object so that the color of the target object in the image after correction is as consistent as possible with the color of the target object itself.
In the related art, when a target object is photographed under a standard light source, the color of the photographed target object substantially matches the color of the target object itself, and when the shooting area includes a large number of color types, the sum of the R pixel values, the sum of the G pixel values and the sum of the B pixel values of all the pixel points in an image photographed for the shooting area under the standard light source are substantially the same. The white balance correction of the initial image is thus implemented as follows: the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image are determined, and the sum of the R pixel values, the sum of the G pixel values and the sum of the B pixel values of all the pixel points in the initial image are respectively counted and denoted as sumR, sumG and sumB. Based on these three sums, a first gain sumG/sumR, a second gain sumG/sumG (which equals 1) and a third gain sumG/sumB are determined. For any pixel point in the initial image, the R pixel value, the G pixel value and the B pixel value of the pixel point are multiplied by the determined first gain, second gain and third gain respectively to obtain the corrected R pixel value, G pixel value and B pixel value. For example, assuming that the RGB pixel values of a pixel point in the initial image are (r, g, b) and the RGB pixel values after white balance correction are (r', g', b'), then r' = r × sumG/sumR, g' = g × 1, and b' = b × sumG/sumB.
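A minimal sketch of this related-art correction (a gray-world style correction) is given below, assuming the initial image is an H × W × 3 NumPy array of R, G, B values; the function name gray_world_white_balance is illustrative.

```python
# Sketch of the related-art correction: sum each channel over the whole image,
# then scale R and B so that the three channel sums match the G sum.
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    img = image.astype(np.float64)
    sum_r, sum_g, sum_b = img[..., 0].sum(), img[..., 1].sum(), img[..., 2].sum()
    gains = np.array([sum_g / sum_r, 1.0, sum_g / sum_b])   # first, second, third gains
    return img * gains                                       # per-pixel channel scaling
```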
With the related-art method above, the sum of the R pixel values, the sum of the G pixel values and the sum of the B pixel values of all the pixel points in the corrected image are kept consistent, thereby achieving white balance correction of the initial image. However, this white balance correction method assumes that the shooting area contains a large number of color types; therefore, when a large monochromatic area exists in the shooting area, the assumption underlying the method is not satisfied, and if white balance correction is still performed in this way, the corrected image will have a large color cast. The large monochromatic area may be, for example, a large blue area or a large yellow area.
Therefore, in the embodiment of the present invention, white balance correction may be performed on the initial image according to the determined color category. That is, the R pixel sum value, the G pixel sum value and the B pixel sum value used for white balance correction are determined according to the color classification of the initial image, instead of directly accumulating the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, which avoids a large color cast in the corrected image. Specifically, the initial image may be subjected to white balance correction through the following step 106.
Step 106: and performing white balance correction on the initial image according to the determined color category.
As known from step 105, there may be one color category or at least two color categories in the initial image, and therefore, step 106 may be specifically implemented by the following steps 1061 and 1062:
step 1061: when one color class exists in the initial image, the white balance correction is carried out on the initial image by a color temperature weighting method.
Specifically, determining the color temperature weight of each color gamut point in a plurality of color gamut points, and determining the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points; respectively carrying out weighted summation on R pixel values, G pixel values and B pixel values of all pixel points in the initial image according to the color temperature weight of each pixel point in the initial image to obtain an R pixel summation value, a G pixel summation value and a B pixel summation value of the initial image; and carrying out white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
When one color class exists in the initial image, it indicates that a large monochromatic area exists in the initial image. In this case, if white balance correction is performed on the initial image according to the white balance correction method in the related art, the corrected image will have a large color cast, whereas performing white balance correction according to the color temperature weighting method can prevent the corrected image from having a large color cast.
The implementation manner of determining the color temperature weight of each color gamut point in the plurality of color gamut points is as follows: the color temperature weight of each color gamut point in the plurality of color gamut points is determined through a preset formula.
The preset formula is as follows: WT = W × (T - Tmax) / (Tmin - Tmax);
WT is a color temperature weight of any one of the plurality of color gamut points, W is a preset weight value, T is a color temperature of any one of the plurality of color gamut points, Tmax is a maximum color temperature, and Tmin is a minimum color temperature.
In addition, because each color gamut point has a corresponding pixel point in the initial image, the color temperature weight of each pixel point in the initial image is determined according to the color temperature weight of each color gamut point in the plurality of color gamut points, that is, the color temperature weight of a certain color gamut point is used as the color temperature weight of the pixel point corresponding to the color gamut point in the initial image, so as to obtain the color temperature weight of each pixel point in the initial image.
When the plurality of color gamut points are determined in step 101 through the first implementation manner, for any color gamut point, the pixel points corresponding to the color gamut point refer to all the pixel points included in the image area corresponding to the color gamut point in the initial image. When the plurality of color gamut points are determined in step 101 through the third implementation manner based on the first implementation manner, for any color gamut point, the pixel points corresponding to the color gamut point refer to all the pixel points in the image areas corresponding to the coordinate points included in the region of interest where the color gamut point is located. For example, if the region of interest where the color gamut point is located includes 3 coordinate points, the pixel points corresponding to the color gamut point are all the pixel points included in the 3 image areas corresponding to the 3 coordinate points one to one.
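A sketch of the color temperature weighting of step 1061 is given below. It assumes the color temperature of each color gamut point has already been propagated to its corresponding pixel points (array pixel_temps); that mapping depends on which implementation of step 101 was used, and the function name weighted_white_balance is illustrative.

```python
# Sketch of step 1061: per-pixel color temperature weights via the preset formula,
# weighted channel sums, then the same gain-based correction as above.
import numpy as np

def weighted_white_balance(image: np.ndarray, pixel_temps: np.ndarray, w: float = 1.0) -> np.ndarray:
    img = image.astype(np.float64)
    t_max, t_min = pixel_temps.max(), pixel_temps.min()
    if t_max == t_min:
        wt = np.full(pixel_temps.shape, w, dtype=np.float64)  # degenerate case: all temperatures equal
    else:
        wt = w * (pixel_temps - t_max) / (t_min - t_max)       # preset formula WT
    sum_r = (wt * img[..., 0]).sum()                           # weighted R pixel sum value
    sum_g = (wt * img[..., 1]).sum()                           # weighted G pixel sum value
    sum_b = (wt * img[..., 2]).sum()                           # weighted B pixel sum value
    gains = np.array([sum_g / sum_r, 1.0, sum_g / sum_b])
    return img * gains
```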
Step 1062: when at least two color classes exist in the initial image, white balance correction is performed on the initial image according to the boundary value between the color gamut point sets.
Specifically, the white balance correction of the initial image according to the boundary value between the sets of gamut points has the following two cases:
case 1: if the threshold value between the first two color gamut point sets after sorting is smaller than or equal to the threshold value, P color gamut point sets are selected from the N color gamut point sets, the P color gamut point sets do not comprise the first color gamut point set after sorting, pixel points corresponding to the color gamut points in the P color gamut point sets in the initial image are determined, R pixel summation values, G pixel summation values and B pixel summation values in the initial image are determined according to the determined pixel points, P is a positive integer larger than or equal to 1, and white balance correction is performed on each pixel point in the initial image according to the R pixel summation values, the G pixel summation values and the B pixel summation values of the initial image.
For the sake of convenience in the following, the top-ranked set of gamut points is referred to as the set of gamut points of higher color temperature, and the bottom-ranked set of gamut points is referred to as the set of gamut points of lower color temperature.
If the boundary value between the first two sorted color gamut point sets is less than or equal to the threshold, the boundary between the color gamut point set with the higher color temperature and the color gamut point set with the lower color temperature is obvious, that is, the color classes in the initial image are clearly separated. In this case, since the P color gamut point sets do not include the first sorted color gamut point set, the pixel points corresponding to the color gamut points in the set with the higher color temperature are effectively removed from the initial image, and the R pixel summation value, the G pixel summation value and the B pixel summation value are calculated from the remaining pixel points.
Determining the R pixel summation value, the G pixel summation value and the B pixel summation value according to the determined pixel points means summing the R pixel values of all the determined pixel points to obtain the R pixel summation value, summing their G pixel values to obtain the G pixel summation value, and summing their B pixel values to obtain the B pixel summation value.
Further, selecting P color gamut point sets from the N color gamut point sets means selecting P sets from among the second to the last sorted color gamut point sets. For example, the middle-ranked sets among the second to the last sorted sets may be chosen as the P color gamut point sets. Alternatively, all of the sets from the second to the last sorted color gamut point set may be directly taken as the P color gamut point sets; the embodiment of the present invention is not specifically limited in this respect. A minimal sketch of this case-1 selection and summation is given below.
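The sketch below is one possible reading of case 1, not an authoritative implementation: it assumes the sorted sets are available as lists of gamut-point indices and that a per-pixel index map links each pixel to its gamut point; all names are illustrative.

```python
import numpy as np

def case1_rgb_sums(image, pixel_gamut_index, sorted_sets, selected_set_ids):
    """Sum R, G and B over the pixels whose color gamut point falls in the P selected sets.

    image:             H x W x 3 array of (R, G, B) pixel values.
    pixel_gamut_index: H x W integer array mapping each pixel to a color gamut point.
    sorted_sets:       list of arrays of gamut-point indices, sorted by descending
                       color temperature of their central gamut points.
    selected_set_ids:  indices (all >= 1) of the P selected sets; index 0, the first
                       sorted set, is excluded by construction.
    """
    selected_points = np.concatenate([sorted_sets[i] for i in selected_set_ids])
    mask = np.isin(pixel_gamut_index, selected_points)
    image = np.asarray(image, dtype=np.float64)
    return (image[..., 0][mask].sum(),
            image[..., 1][mask].sum(),
            image[..., 2][mask].sum())
```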
Case 2: if the boundary value between the first two sorted color gamut point sets is greater than the threshold, the color temperature weight of each color gamut point in the plurality of color gamut points is determined, the color temperature weight of each pixel point in the initial image is determined according to the color temperature weights of the color gamut points, and the R pixel values, G pixel values and B pixel values of all the pixel points in the initial image are weighted and summed according to the color temperature weight of each pixel point to obtain the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image.
If the boundary value between the first two sorted color gamut point sets is greater than the threshold, the boundary between the color gamut point set with the higher color temperature and the color gamut point set with the lower color temperature is not obvious. In that situation a large monochromatic area may exist in the initial image, and the R pixel summation value, the G pixel summation value and the B pixel summation value need to be determined by the color temperature weighting method before white balance correction is performed.
The R pixel summation value, the G pixel summation value and the B pixel summation value are determined by the color temperature weighting method in the same way as in step 1061, and the details are not repeated here.
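The case-2 weighted summation can be sketched as follows, reusing the per-pixel weights from the earlier sketch under step 1061; this is an illustrative reading of the color temperature weighting method, not a definitive implementation.

```python
import numpy as np

def case2_weighted_rgb_sums(image, per_pixel_weight):
    """Color-temperature-weighted R, G and B summation values over all pixels.

    image:            H x W x 3 array of (R, G, B) pixel values.
    per_pixel_weight: H x W array of color temperature weights, e.g. the output
                      of the pixel_weights sketch given under step 1061.
    """
    image = np.asarray(image, dtype=np.float64)
    w = np.asarray(per_pixel_weight, dtype=np.float64)
    return ((w * image[..., 0]).sum(),
            (w * image[..., 1]).sum(),
            (w * image[..., 2]).sum())
```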
For example, suppose N is 3 and it is determined through steps 101 to 105 that at least two color categories exist in the initial image. For convenience of description, the three sorted color gamut point sets are referred to as color gamut point set C, color gamut point set D and color gamut point set E. White balance correction is then performed in one of two cases:
(1) If the boundary value between color gamut point set C and color gamut point set D is less than or equal to the threshold, P color gamut point sets can be selected from color gamut point set D and color gamut point set E, the pixel points in the initial image corresponding to the color gamut points in the selected sets are determined, and the R pixel summation value, the G pixel summation value and the B pixel summation value are determined according to the determined pixel points, so that white balance correction can be performed.
The P color gamut point sets are selected from color gamut point set D and color gamut point set E as follows: if the boundary value between color gamut point set D and color gamut point set E is less than or equal to the threshold, color gamut point set D is determined as the P selected color gamut point sets (P = 1); if the boundary value between color gamut point set D and color gamut point set E is greater than the threshold, the two sets overlap to a high degree, and both color gamut point set D and color gamut point set E are determined as the P selected color gamut point sets (P = 2).
(2) If the boundary value between color gamut point set C and color gamut point set D is greater than the threshold, the color gamut point set with the higher color temperature and the color gamut point set with the lower color temperature overlap to a high degree, and white balance correction is performed on the initial image by the color temperature weighting method.
In addition, in either case 1 or case 2, white balance correction is performed on each pixel point in the initial image according to the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image as follows: a first gain, a second gain and a third gain are determined, where the first gain is the ratio of the G pixel summation value to the R pixel summation value, the second gain is 1, and the third gain is the ratio of the G pixel summation value to the B pixel summation value; white balance correction is then performed on each pixel point in the initial image according to the first gain, the second gain and the third gain.
White balance correction is performed on each pixel point in the initial image according to the first gain, the second gain and the third gain as follows: for any pixel point in the initial image, the R pixel value of the pixel point is multiplied by the first gain to obtain the corrected R pixel value, the G pixel value is multiplied by the second gain to obtain the corrected G pixel value, and the B pixel value is multiplied by the third gain to obtain the corrected B pixel value.
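A minimal sketch of the gain computation and correction described above; the clipping to an assumed 8-bit value range is an illustrative choice and is not stated in the document.

```python
import numpy as np

def apply_white_balance(image, r_sum, g_sum, b_sum):
    """Gain-based white balance: G/R on the R channel, 1 on G, G/B on the B channel."""
    gains = np.array([g_sum / r_sum, 1.0, g_sum / b_sum])
    corrected = np.asarray(image, dtype=np.float64) * gains  # broadcasts over the channel axis
    return np.clip(corrected, 0.0, 255.0)  # 8-bit clipping is an illustrative choice
```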
In the embodiment of the present invention, the color categories present in the initial image are first determined according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and white balance correction is then performed on each pixel point in the initial image according to the number of determined color categories. That is, the R pixel summation value, the G pixel summation value and the B pixel summation value used for white balance correction are determined according to the number of color categories contained in the initial image, rather than being computed directly from the R, G and B pixel values of every pixel point, which avoids a large color deviation in the corrected image.
An embodiment of the present invention provides a color classification apparatus, as shown in fig. 2, the apparatus 200 includes a first determining module 201, a second determining module 202, a classifying module 203, a third determining module 204, and a fourth determining module 205:
a first determining module 201, configured to determine a plurality of color gamut points in a color gamut coordinate system according to an R pixel value, a G pixel value, and a B pixel value of each pixel point in an initial image;
a second determining module 202, configured to determine a color temperature of each color gamut point according to the coordinates of each color gamut point in the plurality of color gamut points and the color temperature of each standard color gamut point in the plurality of standard color gamut points marked in the color gamut coordinate system;
the classification module 203 is configured to classify the plurality of color gamut points through a clustering algorithm according to a color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets, where each color gamut point set includes at least one color gamut point, and N is a positive integer greater than or equal to 2;
a third determining module 204, configured to sort the N color gamut point sets according to a descending order of the color temperatures of N central color gamut points, and determine a boundary value between any two adjacent sorted color gamut point sets to obtain N-1 boundary values, where the N central color gamut points are the color gamut points located at the center positions of the areas surrounded by the color gamut points included in each of the N color gamut point sets;
a fourth determining module 205, configured to determine that one color category exists in the initial image if all of the N-1 boundary values are greater than the threshold, and determine that at least two color categories exist in the initial image if any of the N-1 boundary values is less than or equal to the threshold.
Optionally, the classification module 203 includes:
the first determining unit is used for determining a first color gamut point and a second color gamut point, wherein the first color gamut point is a color gamut point with the largest color temperature in the plurality of color gamut points, and the second color gamut point is a color gamut point with the smallest color temperature in the plurality of color gamut points;
the first clustering unit is used for clustering the plurality of color gamut points through a K-means clustering algorithm according to the first color gamut point and the second color gamut point when N is 2 to obtain N color gamut point sets;
and the second clustering unit is used for: when N is greater than or equal to 3, determining N-2 coordinate points according to the coordinates of the first color gamut point and the coordinates of the second color gamut point, where the N-2 coordinate points are the coordinate points that divide the connecting line between the first color gamut point and the second color gamut point into N-1 equal parts; selecting, from the plurality of color gamut points, the color gamut point closest to each of the N-2 coordinate points to obtain N-2 third color gamut points; and clustering the plurality of color gamut points by the K-means clustering algorithm according to the first color gamut point, the second color gamut point and the N-2 third color gamut points to obtain the N color gamut point sets, as sketched below.
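Assuming the clustering is carried out in the color gamut coordinate plane (one possible reading of the description), the selection of initial cluster centers could look like the sketch below; for N = 2 the loop body is skipped and only the first and second color gamut points are returned. The resulting centers could then be handed to any K-means implementation as its initial centers (for example scikit-learn's KMeans with init set to this array); the choice of library is an assumption, not part of the description.

```python
import numpy as np

def initial_cluster_centers(coords, temps, n_sets):
    """Initial K-means centers: the hottest gamut point, the coldest gamut point, and
    the gamut points nearest to the N-2 points dividing the segment between them into
    N-1 equal parts.

    coords: K x 2 array of color gamut point coordinates.
    temps:  length-K array of color gamut point color temperatures.
    """
    first = coords[np.argmax(temps)]    # color gamut point with the largest color temperature
    second = coords[np.argmin(temps)]   # color gamut point with the smallest color temperature
    centers = [first]
    for k in range(1, n_sets - 1):      # the N-2 interior division points
        division_point = first + (second - first) * k / (n_sets - 1)
        nearest = coords[np.argmin(np.linalg.norm(coords - division_point, axis=1))]
        centers.append(nearest)
    centers.append(second)
    return np.array(centers)
```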
Optionally, the third determining module 204 is specifically configured to:
determining the distance between the color gamut point at the center position of the area surrounded by the color gamut points included in the first color gamut point set and the color gamut point at the center position of the area surrounded by the color gamut points included in the second color gamut point set to obtain a first distance, where the first color gamut point set and the second color gamut point set are any two adjacent color gamut point sets after the sorting;
combining any color gamut point in the first color gamut point set with any color gamut point in the second color gamut point set to obtain a plurality of color gamut point pairs;
determining the distance between two color gamut points included in each of the plurality of color gamut point pairs to obtain a plurality of distances corresponding to the plurality of color gamut point pairs one by one;
and determining the ratio of the first distance to a second distance as a boundary value between the first color gamut point set and the second color gamut point set, wherein the second distance is the minimum distance in the plurality of distances.
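A sketch of the boundary value between two adjacent sorted sets, as described by the third determining module above; the 2-D coordinate assumption and the variable names are illustrative.

```python
import numpy as np

def boundary_value(set_a_coords, set_b_coords, center_a, center_b):
    """Boundary value between two adjacent sorted sets: the distance between the two
    central color gamut points divided by the smallest distance over all cross-set pairs.
    """
    first_distance = np.linalg.norm(np.asarray(center_a, dtype=np.float64)
                                    - np.asarray(center_b, dtype=np.float64))
    set_a = np.asarray(set_a_coords, dtype=np.float64)
    set_b = np.asarray(set_b_coords, dtype=np.float64)
    # distance between every gamut point of the first set and every gamut point of the second set
    pair_distances = np.linalg.norm(set_a[:, None, :] - set_b[None, :, :], axis=-1)
    return first_distance / pair_distances.min()
```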
Optionally, the apparatus 200 further comprises:
the fifth determining module is used for determining the color temperature weight of each color gamut point in the plurality of color gamut points and determining the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points;
the weighted summation module is used for respectively carrying out weighted summation on the R pixel value, the G pixel value and the B pixel value of all the pixel points in the initial image according to the color temperature weight of each pixel point in the initial image to obtain an R pixel summation value, a G pixel summation value and a B pixel summation value of the initial image;
and the white balance correction module is used for carrying out white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
Optionally, the apparatus 200 further comprises:
a sixth determining module, configured to select P color gamut point sets from the N color gamut point sets if the boundary value between the first two sorted color gamut point sets is less than or equal to the threshold, where the P color gamut point sets do not include the first sorted color gamut point set and P is a positive integer greater than or equal to 1, determine the pixel points in the initial image corresponding to the color gamut points in the P color gamut point sets, and determine the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image according to the determined pixel points;
a fifth determining module, configured to determine the color temperature weight of each color gamut point in the plurality of color gamut points if the boundary value between the first two sorted color gamut point sets is greater than the threshold, and determine the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points;
the weighted summation module is used for respectively carrying out weighted summation on the R pixel value, the G pixel value and the B pixel value of all the pixel points in the initial image according to the color temperature weight of each pixel point in the initial image to obtain an R pixel summation value, a G pixel summation value and a B pixel summation value of the initial image;
and the white balance correction module is used for carrying out white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
Optionally, the fifth determining module is specifically configured to:
selecting a maximum color temperature and a minimum color temperature from color temperatures of a plurality of color gamut points;
determining the color temperature weight of each color gamut point in the plurality of color gamut points through a preset formula;
the preset formula is as follows: WT = W × (T - Tmax) / (Tmin - Tmax);
WT is the color temperature weight of any one of the plurality of color gamut points, W is a preset weight value, T is the color temperature of that color gamut point, Tmax is the maximum color temperature, and Tmin is the minimum color temperature.
Optionally, the first determining module 201 is specifically configured to:
dividing an initial image into a plurality of image areas;
determining an R pixel value, a G pixel value and a B pixel value of each image area, wherein the R pixel value, the G pixel value and the B pixel value of each image area are respectively the average values of the R pixel value, the G pixel value and the B pixel value of all pixel points included in each image area;
and determining a plurality of coordinate points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each image area, and taking the plurality of coordinate points as a plurality of color gamut points, wherein each coordinate point corresponds to one image area.
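A minimal sketch of the image-area averaging performed by the first determining module under this implementation; how the per-area mean (R, G, B) values are then mapped into the color gamut coordinate system is defined elsewhere in the document and is deliberately left out here.

```python
import numpy as np

def region_mean_rgb(image, rows, cols):
    """Divide the image into rows x cols image areas and average R, G and B per area.

    Returns a (rows * cols) x 3 array of mean (R, G, B) values, one row per image
    area; turning these means into color gamut coordinate points follows the mapping
    defined earlier in the document and is not reproduced here.
    """
    image = np.asarray(image, dtype=np.float64)
    h, w, _ = image.shape
    means = []
    for r in range(rows):
        for c in range(cols):
            block = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            means.append(block.reshape(-1, 3).mean(axis=0))
    return np.array(means)
```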
Optionally, the first determining module 201 is specifically configured to:
determining a plurality of coordinate points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and taking the plurality of coordinate points as a plurality of color gamut points, wherein each coordinate point corresponds to one pixel point.
Optionally, the color gamut coordinate system includes a plurality of regions of interest with equal areas, and the plurality of regions of interest are preset by a user;
the first determining module 201 is further specifically configured to:
screening M interested areas from the plurality of interested areas, wherein each screened interested area comprises at least one coordinate point, and M is a positive integer greater than or equal to 2;
determining M interesting coordinate points according to the M interesting regions, taking the M interesting coordinate points as a plurality of color gamut points, wherein each interesting coordinate point corresponds to one interesting region, and each interesting coordinate point is a coordinate point at the center position of the corresponding interesting region.
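The screening of regions of interest could be sketched as below, under the illustrative assumption that the regions of interest are axis-aligned rectangles of equal area in the color gamut coordinate system; the names and rectangle representation are not taken from the document.

```python
import numpy as np

def screen_rois(coordinate_points, rois):
    """Keep the regions of interest containing at least one coordinate point and
    return the center of each kept region as a color gamut point.

    coordinate_points: K x 2 array of coordinate points in the gamut coordinate system.
    rois: iterable of (x_min, y_min, x_max, y_max) rectangles of equal area.
    """
    pts = np.asarray(coordinate_points, dtype=np.float64)
    gamut_points = []
    for x_min, y_min, x_max, y_max in rois:
        inside = ((pts[:, 0] >= x_min) & (pts[:, 0] <= x_max) &
                  (pts[:, 1] >= y_min) & (pts[:, 1] <= y_max))
        if inside.any():  # a screened region must contain at least one coordinate point
            gamut_points.append(((x_min + x_max) / 2.0, (y_min + y_max) / 2.0))
    return np.array(gamut_points)
```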
In the embodiment of the present invention, the color categories present in the initial image are first determined according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and white balance correction is then performed on each pixel point in the initial image according to the number of determined color categories. That is, the R pixel summation value, the G pixel summation value and the B pixel summation value used for white balance correction are determined according to the number of color categories contained in the initial image, rather than being computed directly from the R, G and B pixel values of every pixel point, which avoids a large color deviation in the corrected image.
It should be noted that, when the color classification apparatus provided in the above embodiment performs color classification, only the division into the functional modules described above is illustrated; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the color classification apparatus and the color classification method provided by the above embodiments belong to the same concept, and their specific implementation processes are described in detail in the method embodiments and are not repeated here.
Fig. 3 is a block diagram of a terminal 300 according to an embodiment of the present invention. The terminal 300 may be a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The terminal 300 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Generally, the terminal 300 includes: a processor 301 and a memory 302.
The processor 301 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 301 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 301 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 302 may include one or more computer-readable storage media, which may be non-transitory. Memory 302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 302 is used to store at least one instruction for execution by processor 301 to implement a color classification method provided by embodiments of the present invention.
In some embodiments, the terminal 300 may further include: a peripheral interface 303 and at least one peripheral. The processor 301, memory 302 and peripheral interface 303 may be connected by a bus or signal lines. Each peripheral may be connected to the peripheral interface 303 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 304, touch display screen 305, camera 306, audio circuitry 307, positioning components 308, and power supply 309.
The peripheral interface 303 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 301 and the memory 302. In some embodiments, processor 301, memory 302, and peripheral interface 303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 301, the memory 302 and the peripheral interface 303 may be implemented on a separate chip or circuit board, which is not limited by the embodiment.
The Radio Frequency circuit 304 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 304 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 304 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 304 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 305 is a touch display screen, the display screen 305 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 301 as a control signal for processing. At this point, the display screen 305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 305, provided on the front panel of the terminal 300; in other embodiments, there may be at least two display screens 305, respectively disposed on different surfaces of the terminal 300 or in a folded design; in still other embodiments, the display screen 305 may be a flexible display disposed on a curved surface or a folded surface of the terminal 300. The display screen 305 may even be arranged as a non-rectangular irregular figure, namely an irregularly-shaped screen. The display screen 305 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 306 is used to capture images or video. Optionally, the camera assembly 306 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, the camera assembly 306 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 307 may include a microphone and a speaker. The microphone is used for collecting sound waves of the user and the environment, converting the sound waves into electrical signals, and inputting them to the processor 301 for processing or to the radio frequency circuit 304 to realize voice communication. For stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 300. The microphone may also be an array microphone or an omnidirectional acquisition microphone. The speaker is used to convert electrical signals from the processor 301 or the radio frequency circuit 304 into sound waves. The speaker may be a traditional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electrical signal into sound waves audible to a human being, or into sound waves inaudible to a human being for purposes such as distance measurement. In some embodiments, the audio circuit 307 may also include a headphone jack.
The positioning component 308 is used to locate the current geographic location of the terminal 300 to implement navigation or LBS (Location Based Service). The positioning component 308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 309 is used to supply power to the various components in the terminal 300. The power source 309 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 309 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 300 also includes one or more sensors 310. The one or more sensors 310 include, but are not limited to: acceleration sensor 311, gyro sensor 312, pressure sensor 313, fingerprint sensor 314, optical sensor 315, and proximity sensor 316.
The acceleration sensor 311 may detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the terminal 300. For example, the acceleration sensor 311 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 301 may control the touch display screen 305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 311. The acceleration sensor 311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 312 may detect a body direction and a rotation angle of the terminal 300, and the gyro sensor 312 may acquire a 3D motion of the user on the terminal 300 in cooperation with the acceleration sensor 311. The processor 301 may implement the following functions according to the data collected by the gyro sensor 312: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 313 may be disposed on a side bezel of the terminal 300 and/or an underlying layer of the touch display screen 305. When the pressure sensor 313 is disposed on the side frame of the terminal 300, the holding signal of the user to the terminal 300 can be detected, and the processor 301 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 313. When the pressure sensor 313 is disposed at the lower layer of the touch display screen 305, the processor 301 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 314 is used for collecting a fingerprint of the user, and the processor 301 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 314, or the fingerprint sensor 314 identifies the identity of the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, processor 301 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 314 may be disposed on the front, back, or side of the terminal 300. When a physical button or a vendor Logo is provided on the terminal 300, the fingerprint sensor 314 may be integrated with the physical button or the vendor Logo.
The optical sensor 315 is used to collect the ambient light intensity. In one embodiment, the processor 301 may control the display brightness of the touch screen display 305 based on the ambient light intensity collected by the optical sensor 315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 305 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 305 is turned down. In another embodiment, the processor 301 may also dynamically adjust the shooting parameters of the camera head assembly 306 according to the ambient light intensity collected by the optical sensor 315.
The proximity sensor 316, also known as a distance sensor, is typically provided on the front panel of the terminal 300. The proximity sensor 316 is used to collect the distance between the user and the front surface of the terminal 300. In one embodiment, when the proximity sensor 316 detects that the distance between the user and the front surface of the terminal 300 gradually decreases, the processor 301 controls the touch display screen 305 to switch from the bright-screen state to the screen-off state; when the proximity sensor 316 detects that the distance between the user and the front surface of the terminal 300 gradually increases, the processor 301 controls the touch display screen 305 to switch from the screen-off state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 3 does not limit the terminal 300, which may include more or fewer components than shown, combine some components, or adopt a different arrangement of components.
Embodiments of the present invention also provide a non-transitory computer-readable storage medium, where instructions in the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the color classification method provided in the foregoing embodiments.
Embodiments of the present invention also provide a computer program product containing instructions, which when run on a computer, cause the computer to execute the color classification method provided by the above embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A method of color classification, the method comprising:
determining a plurality of color gamut points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image;
determining the color temperature of each color gamut point according to the coordinates of each color gamut point in the plurality of color gamut points and the color temperature of each standard color gamut point in the plurality of standard color gamut points marked in the color gamut coordinate system;
classifying the plurality of color gamut points through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets, wherein each color gamut point set comprises at least one color gamut point, and N is a positive integer greater than or equal to 2;
sorting the N color gamut point sets according to a descending order of the color temperatures of N central color gamut points, and determining a boundary value between any two adjacent sorted color gamut point sets to obtain N-1 boundary values, wherein the N central color gamut points are the color gamut points located at the center positions of the areas surrounded by the color gamut points included in each of the N color gamut point sets;
and if all of the N-1 boundary values are greater than the threshold, determining that one color class exists in the initial image, and if any of the N-1 boundary values is less than or equal to the threshold, determining that at least two color classes exist in the initial image.
2. The method of claim 1, wherein the classifying the plurality of gamut points by a clustering algorithm based on the color temperature of each of the plurality of gamut points to obtain N sets of gamut points comprises:
determining a first color gamut point and a second color gamut point, wherein the first color gamut point is a color gamut point with the largest color temperature in the plurality of color gamut points, and the second color gamut point is a color gamut point with the smallest color temperature in the plurality of color gamut points;
when the N is 2, clustering the plurality of color gamut points through a K-means clustering algorithm according to the first color gamut point and the second color gamut point to obtain N color gamut point sets;
when the N is larger than or equal to 3, determining N-2 coordinate points according to the coordinates of the first color gamut point and the coordinates of the second color gamut point, wherein the N-2 coordinate points refer to the coordinate points which divide a connecting line between the first color gamut point and the second color gamut point into N-1 equal parts, selecting the color gamut point which is closest to each coordinate point in the N-2 coordinate points from the color gamut points to obtain N-2 third color gamut points, and clustering the color gamut points according to the first color gamut point, the second color gamut point and the N-2 third color gamut points by using the K-means clustering algorithm to obtain the N color gamut point sets.
3. The method of claim 1, wherein the determining a boundary value between any two adjacent sorted color gamut point sets comprises:
determining a distance between a color gamut point at the center position of a region surrounded by color gamut points included in a first color gamut point set and a color gamut point at the center position of a region surrounded by color gamut points included in a second color gamut point set to obtain a first distance, wherein the first color gamut point set and the second color gamut point set are two color gamut point sets which are arbitrarily adjacent after being sorted;
combining any color gamut point in the first color gamut point set with any color gamut point in the second color gamut point set to obtain a plurality of color gamut point pairs;
determining a distance between two color gamut points included in each of the plurality of color gamut point pairs to obtain a plurality of distances corresponding to the plurality of color gamut point pairs one to one;
determining a ratio between the first distance and a second distance as the boundary value between the first color gamut point set and the second color gamut point set, the second distance being the smallest distance of the plurality of distances.
4. The method as claimed in any one of claims 1 to 3, wherein after the determining that one color class exists in the initial image, the method further comprises:
determining the color temperature weight of each color gamut point in the plurality of color gamut points, and determining the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points;
according to the color temperature weight of each pixel point in the initial image, respectively carrying out weighted summation on the R pixel values, the G pixel values and the B pixel values of all the pixel points in the initial image to obtain the R pixel summation value, the G pixel summation value and the B pixel summation value of the initial image;
and performing white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
5. The method as claimed in any one of claims 1 to 3, wherein after the determining that at least two color classes exist in the initial image, the method further comprises:
if the boundary value between the first two sorted color gamut point sets is less than or equal to the threshold, selecting P color gamut point sets from the N color gamut point sets, wherein the P color gamut point sets do not include the first sorted color gamut point set, determining pixel points in the initial image corresponding to the color gamut points in the P color gamut point sets, and determining an R pixel summation value, a G pixel summation value and a B pixel summation value of the initial image according to the determined pixel points, P being a positive integer greater than or equal to 1;
if the boundary value between the first two sorted color gamut point sets is greater than the threshold, determining the color temperature weight of each color gamut point in the plurality of color gamut points, determining the color temperature weight of each pixel point in the initial image according to the color temperature weight of each color gamut point in the plurality of color gamut points, and performing weighted summation on the R pixel values, the G pixel values and the B pixel values of all the pixel points in the initial image respectively according to the color temperature weight of each pixel point in the initial image to obtain an R pixel summation value, a G pixel summation value and a B pixel summation value of the initial image;
and carrying out white balance correction on each pixel point in the initial image according to the R pixel sum value, the G pixel sum value and the B pixel sum value of the initial image.
6. The method of claim 4 or 5, wherein determining the color temperature weight for each of the plurality of color gamut points comprises:
selecting a maximum color temperature and a minimum color temperature from the color temperatures of the plurality of color gamut points;
determining the color temperature weight of each color gamut point in the plurality of color gamut points through a preset formula;
the preset formula is as follows: WT = W × (T - Tmax) / (Tmin - Tmax);
WT is the color temperature weight of any one of the plurality of color gamut points, W is a preset weight value, T is the color temperature of that color gamut point, Tmax is the maximum color temperature, and Tmin is the minimum color temperature.
7. The method of claim 1, wherein determining a plurality of gamut points in a gamut coordinate system based on R, G, and B pixel values of each pixel point in the initial image comprises:
dividing the initial image into a plurality of image areas;
determining an R pixel value, a G pixel value and a B pixel value of each image area, wherein the R pixel value, the G pixel value and the B pixel value of each image area are respectively the average values of the R pixel value, the G pixel value and the B pixel value of all pixel points included in each image area;
and determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each image area, and taking the plurality of coordinate points as the plurality of color gamut points, wherein each coordinate point corresponds to one image area.
8. The method of claim 1, wherein determining a plurality of gamut points in a gamut coordinate system based on R, G, and B pixel values of each pixel point in the initial image comprises:
and determining a plurality of coordinate points in the color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image, and taking the plurality of coordinate points as the plurality of color gamut points, wherein each coordinate point corresponds to one pixel point.
9. The method according to claim 7 or claim 8, wherein the color gamut coordinate system comprises a plurality of regions of interest with equal areas, and the plurality of regions of interest are preset by a user;
after determining the plurality of coordinate points in the color gamut coordinate system, the method further includes:
screening M interested areas from the plurality of interested areas, wherein each screened interested area comprises at least one coordinate point, and M is a positive integer greater than or equal to 2;
and determining M interesting coordinate points according to the M interesting regions, taking the M interesting coordinate points as the plurality of color gamut points, wherein each interesting coordinate point corresponds to one interesting region, and each interesting coordinate point is a coordinate point at the center position of the corresponding interesting region.
10. A color classification apparatus, the apparatus comprising:
the first determining module is used for determining a plurality of color gamut points in a color gamut coordinate system according to the R pixel value, the G pixel value and the B pixel value of each pixel point in the initial image;
a second determining module, configured to determine a color temperature of each color gamut point according to the coordinates of each color gamut point in the plurality of color gamut points and the color temperature of each standard color gamut point in the plurality of standard color gamut points marked in the color gamut coordinate system;
the classification module is used for classifying the plurality of color gamut points through a clustering algorithm according to the color temperature of each color gamut point in the plurality of color gamut points to obtain N color gamut point sets, each color gamut point set comprises at least one color gamut point, and N is a positive integer greater than or equal to 2;
a third determining module, configured to sort the N color gamut point sets according to a descending order of the color temperatures of N central color gamut points, and determine a boundary value between any two adjacent sorted color gamut point sets to obtain N-1 boundary values, wherein the N central color gamut points are the color gamut points located at the center positions of the areas surrounded by the color gamut points included in each of the N color gamut point sets;
a fourth determining module, configured to determine that one color category exists in the initial image if all of the N-1 boundary values are greater than the threshold, and determine that at least two color categories exist in the initial image if any of the N-1 boundary values is less than or equal to the threshold.
CN201810557323.9A 2018-06-01 2018-06-01 Color classification method, device and storage medium Active CN110555443B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810557323.9A CN110555443B (en) 2018-06-01 2018-06-01 Color classification method, device and storage medium

Publications (2)

Publication Number Publication Date
CN110555443A CN110555443A (en) 2019-12-10
CN110555443B true CN110555443B (en) 2022-05-20

Family

ID=68734909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810557323.9A Active CN110555443B (en) 2018-06-01 2018-06-01 Color classification method, device and storage medium

Country Status (1)

Country Link
CN (1) CN110555443B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113079319B (en) * 2021-04-07 2022-10-14 杭州涂鸦信息技术有限公司 Image adjusting method and related equipment thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103929632A (en) * 2014-04-15 2014-07-16 浙江宇视科技有限公司 Automatic white balance correcting method and device
CN104935903A (en) * 2014-03-18 2015-09-23 韩华泰科株式会社 White balance correcting apparatus and white balance correcting method
CN105554489A (en) * 2015-07-10 2016-05-04 宇龙计算机通信科技(深圳)有限公司 White balance adjusting method and shooting terminal
CN105578165A (en) * 2015-12-30 2016-05-11 浙江大华技术股份有限公司 Method and device for processing white balance of image, and vidicon

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7184080B2 (en) * 2001-06-25 2007-02-27 Texas Instruments Incorporated Automatic white balancing via illuminant scoring

Also Published As

Publication number Publication date
CN110555443A (en) 2019-12-10

Similar Documents

Publication Publication Date Title
CN108304265B (en) Memory management method, device and storage medium
US11386586B2 (en) Method and electronic device for adding virtual item
CN109840584B (en) Image data classification method and device based on convolutional neural network model
CN110839128B (en) Photographing behavior detection method and device and storage medium
US20240193831A1 (en) Method, apparatus, and device for processing image, and storage medium
CN111078521A (en) Abnormal event analysis method, device, equipment, system and storage medium
CN111144365A (en) Living body detection method, living body detection device, computer equipment and storage medium
CN110705614A (en) Model training method and device, electronic equipment and storage medium
CN110619614A (en) Image processing method and device, computer equipment and storage medium
CN111857793A (en) Network model training method, device, equipment and storage medium
CN110647881A (en) Method, device, equipment and storage medium for determining card type corresponding to image
CN111754386A (en) Image area shielding method, device, equipment and storage medium
CN112989198B (en) Push content determination method, device, equipment and computer-readable storage medium
CN110728167A (en) Text detection method and device and computer readable storage medium
CN111931712A (en) Face recognition method and device, snapshot machine and system
CN112184581A (en) Image processing method, image processing apparatus, computer device, and medium
CN110555443B (en) Color classification method, device and storage medium
CN112184802A (en) Calibration frame adjusting method and device and storage medium
CN113343709B (en) Method for training intention recognition model, method, device and equipment for intention recognition
CN115798417A (en) Backlight brightness determination method, device, equipment and computer readable storage medium
CN112907939B (en) Traffic control subarea dividing method and device
CN112560903A (en) Method, device and equipment for determining image aesthetic information and storage medium
CN112990424A (en) Method and device for training neural network model
CN111258673A (en) Fast application display method and terminal equipment
CN111369434A (en) Method, device and equipment for generating cover of spliced video and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant