US10070110B2 - Automatic white balance - Google Patents

Automatic white balance

Info

Publication number
US10070110B2
US10070110B2
Authority
US
United States
Prior art keywords
statistical
points
cluster
target cluster
white
Prior art date
Legal status
Active
Application number
US15/368,133
Other languages
English (en)
Other versions
US20170195648A1 (en)
Inventor
De Zhang
Duoming Chen
Current Assignee
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Assigned to ZHEJIANG UNIVIEW TECHNOLOGIES CO., LTD reassignment ZHEJIANG UNIVIEW TECHNOLOGIES CO., LTD ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHEN, Duoming, ZHANG, De
Publication of US20170195648A1 publication Critical patent/US20170195648A1/en
Application granted granted Critical
Publication of US10070110B2 publication Critical patent/US10070110B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46Colour picture communication systems
    • H04N1/56Processing of colour picture signals
    • H04N1/60Colour correction or control
    • H04N1/6077Colour balance, e.g. colour cast correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06K9/6218
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control
    • H04N9/735

Definitions

  • the present disclosure generally relates to the field of image processing technologies.
  • Light visible to human eyes may be regarded as a superposition of light spectra of seven colors, and human visual perception of the same color is basically consistent under different illuminations.
  • a white object may look white in the morning light as the sun rises, and it may still look white under faint light at night. This is because human brains have adapted to the color rendition of an object under different lighting conditions.
  • a front end device of a video monitoring system, such as an analog camera, a network camera or a digital camera, may not have the adaptability of human eyes. Since light of different color temperatures may have different color components, for the same object, the colors of an image captured by the front end device may differ under illumination by light of different color temperatures. Specifically, the front end device may capture an image of an object, which includes three color components, namely Component R (red), Component G (green) and Component B (blue). However, the colors of the image finally obtained may not appear consistent with how the object actually looks, due to the impacts of the color of the object itself, the color temperature of the ambient light, and the photosensitive properties of optical filters and sensors on the various color components, etc. To ensure that the colors of the image captured by the front end device are as close as possible to human visual perception, image processing such as white balance may be performed on the captured image.
  • One aspect of the present disclosure provides a method for automatic white balance, including:
  • obtaining, by a front end device, a white region for performing automatic white balance; clustering, by the front end device, statistical points within the white region of a to-be-detected image to obtain at least one cluster; determining, by the front end device, the cluster as a target cluster; and determining statistical points falling within the target cluster as white points.
  • the white region may include a plurality of sub-regions, and clustering the statistical points within the white region of the to-be-detected image may include:
  • the determining a white balance gain of the to-be-detected image based on the target cluster may include:
  • determining, by the front end device, the white balance gain of the to-be-detected image with the weight value and the tricolor information of the statistical points within the target cluster.
  • the allocating a weight value for the statistical points within the target cluster according to the clustering feature may include:
  • the weighting parameter may include one or more of the followings:
  • the allocating weight values for the statistical points within the target clusters according to a weighting parameter may include:
  • the tricolor information of the statistical point may include: a response value of Component G, a response value of Component R and a response value of Component B of the statistical point.
  • the determining the white balance gain of the to-be-detected image with the weight value and the tricolor information of the statistical points within the target cluster may include:
  • CrAvg = Σ Cr(i)·W(i) / Σ W(i); CbAvg = Σ Cb(i)·W(i) / Σ W(i);
  • i denotes an i-th statistical point within a cluster,
  • W(i) denotes the weight value of the i-th statistical point,
  • Cr(i) denotes a value obtained by dividing the response value of Component G of the i-th statistical point by the response value of Component R of the i-th statistical point, and
  • Cb(i) denotes a value obtained by dividing the response value of Component G of the i-th statistical point by the response value of Component B of the i-th statistical point.
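  • The weighted-average gain computation described above may be sketched in Python as follows (an illustrative sketch, not part of the patent; the function name and data layout are assumptions):

```python
def white_balance_gains(points, weights):
    """Compute CrAvg and CbAvg from target-cluster statistical points.

    points:  list of (R, G, B) response values of statistical points
    weights: list of weight values W(i) allocated to those points
    Cr(i) = G/R and Cb(i) = G/B, as defined in the text above.
    """
    total_w = sum(weights)
    cr_avg = sum((g / r) * w for (r, g, b), w in zip(points, weights)) / total_w
    cb_avg = sum((g / b) * w for (r, g, b), w in zip(points, weights)) / total_w
    return cr_avg, cb_avg
```

The returned pair may then serve as the R-channel and B-channel gains of the to-be-detected image.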
  • Another aspect of the present disclosure may provide an apparatus for automatic white balance, including a processor and a non-transitory storage medium storing machine-executable instructions corresponding to a control logic for automatic white balance, where by executing the machine-executable instructions the processor may be caused to:
  • the white region may include a plurality of sub-regions, and when clustering the statistical points within the white region of a to-be-detected image, the machine-executable instructions further cause the processor to:
  • cluster the statistical points by using a density-based spatial clustering algorithm which takes a center of each of the sub-regions as a clustering object and takes the number of statistical points within each of the sub-regions as a density of the clustering object.
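  • One possible reading of such a density-based clustering over sub-region centers may be sketched as follows (a simplified, DBSCAN-like illustration; the patent does not fix a concrete algorithm, and all names and thresholds here are assumptions):

```python
def density_cluster(centers, counts, eps, min_density):
    """Simplified density-based clustering over white sub-region centers.

    centers:     list of (x, y) centers of white sub-regions
    counts:      number of statistical points in each sub-region
                 (used as the density of the clustering object)
    eps:         neighborhood radius
    min_density: minimum summed count for a center to be a core object
    Returns a list of clusters, each a sorted list of center indices.
    """
    def neighbors(i):
        cx, cy = centers[i]
        return [j for j, (x, y) in enumerate(centers)
                if (x - cx) ** 2 + (y - cy) ** 2 <= eps ** 2]

    visited, clusters = set(), []
    for i in range(len(centers)):
        if i in visited or counts[i] == 0:
            continue
        if sum(counts[j] for j in neighbors(i)) < min_density:
            continue  # not a core object; may be noise (an interfering point)
        cluster, frontier = [], [i]
        while frontier:  # expand the cluster from this core object
            j = frontier.pop()
            if j in visited:
                continue
            visited.add(j)
            cluster.append(j)
            nj = neighbors(j)
            if sum(counts[k] for k in nj) >= min_density:
                frontier.extend(nj)
        clusters.append(sorted(cluster))
    return clusters
```

Isolated low-density centers are never expanded, which mirrors how interfering points may be separated from white points.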
  • when determining the white balance gain of the to-be-detected image based on the target cluster, the machine-executable instructions further cause the processor to:
  • the machine-executable instructions further cause the processor to:
  • the weighting parameter may include one or more of the followings:
  • the machine-executable instructions further cause the processor to:
  • the tricolor information of the statistical point may include: a response value of Component G, a response value of Component R and a response value of Component B of the statistical point.
  • the machine-executable instructions further cause the processor to:
  • CrAvg = Σ Cr(i)·W(i) / Σ W(i); CbAvg = Σ Cb(i)·W(i) / Σ W(i);
  • i may denote an i-th statistical point within a cluster,
  • W(i) may denote the weight value of the i-th statistical point,
  • Cr(i) may denote a value obtained by dividing the response value of Component G of the i-th statistical point by the response value of Component R of the i-th statistical point, and
  • Cb(i) may denote a value obtained by dividing the response value of Component G of the i-th statistical point by the response value of Component B of the i-th statistical point.
  • interfering points may be effectively separated from white points and the interfering points may be determined as non-white points.
  • the interfering points falling within the white region may be eliminated.
  • an adverse effect of the interfering points on the white balance may be eliminated.
  • an adverse effect of local colorized sceneries and multisource light supplement, moving objects, and mixed color temperatures or the like on the white balance may be eliminated.
  • the automatic white balance may be applicable to complex scenes where the local colorized sceneries and the multisource light supplement, and moving colorized objects and mixed color temperatures or the like are present. In this way, a precision and stability in automatic white balance processing in complex scenes may be improved.
  • FIG. 1A is a flowchart of a method for automatic white balance according to one example of the present disclosure.
  • FIG. 1B is a flowchart of a method for automatic white balance according to one example of the present disclosure.
  • FIGS. 2A-2G are schematic diagrams of a blackbody locus curve according to one example of the present disclosure.
  • FIGS. 3A-3C are schematic diagrams illustrating clustering of statistical points according to one example of the present disclosure.
  • FIG. 4 is a hardware structure diagram of a front end device according to one example of the present disclosure.
  • FIG. 5 is a block diagram of functional modules of a control logic for automatic white balance according to one example of the present disclosure.
  • White balance may be popularly understood as follows: when a white object is displayed as an image, the displayed image is made to still look white.
  • a process for adjusting white balance may be referred to as a white balance adjustment, and automatic white balance (AWB) may be one of the important manners for white balance adjustment.
  • the AWB generally may be configured by default in a front end device, and by adjusting the tricolor components (red, green and blue) of the image, the AWB may allow an originally white part to still be displayed as white.
  • one white region may be predetermined, statistical points falling within the white region may be taken as white points, and statistical points outside the white region may be taken as non-white points. Based on such a manner, if a statistical point is determined to be a white point, an actual color of an object corresponding to the statistical point may be considered to be white.
  • the examples of the present disclosure propose a method for automatic white balance (AWB), which may be applied to a front end device of a video monitoring system, such as an analog camera, a network camera or a digital camera.
  • the method may be applied to an image sensor of the network camera.
  • the method for automatic white balance specifically may include the following blocks.
  • At block 101, a white region for performing automatic white balance may be obtained.
  • At block 102, at least one cluster may be obtained by clustering statistical points within the white region of a to-be-detected image.
  • At block 103, the cluster may be determined as a target cluster, and statistical points falling within the target cluster may be determined as white points.
  • At block 104, a white balance gain of the to-be-detected image may be determined based on the target cluster.
  • interfering points may be effectively separated from white points by clustering statistical points falling within the white region, thereby eliminating an adverse effect of the interfering points on the white balance. Implementations of each block in FIG. 1A may be specifically described with reference to FIG. 1B .
  • the white region for performing automatic white balance may be obtained.
  • a white region may be drawn in a coordinate system, and a statistical point may be considered to be a white point if the statistical point falls within the white region.
  • the coordinate system may be a G/R-G/B coordinate system or an r-b coordinate system, and hereinafter, the r-b coordinate system may be taken as an example, where meanings of the r and the b will be explained in the following.
  • all white point responses of a light source conforming to a blackbody radiation curve may be manifested as basically falling on one curve (namely, the blackbody locus curve) in the r-b coordinate system, and the two ends of the curve may respectively represent a region where the color temperature is higher and a region where the color temperature is lower. Based on this, a statistical point having a smaller distance from the curve may be considered to be a white point. Since neither the position nor the shape of the curve is fixed, the curve cannot be represented by an effective mathematical expression, and thus it is difficult to draw a white region by using the distance from statistical points to the curve.
  • a plurality of white sub-regions may be drawn along a color temperature variation direction, wherein each white sub-region may be a polygon.
  • Assume that a white sub-region is a quadrilateral in the subsequent description; a plurality of white sub-regions may constitute the white region.
  • a corresponding white region may be drawn.
  • a white sub-region shaped like a quadrilateral may be drawn by using two pairs of straight lines whose slopes are respectively positive and negative. If a value of one statistical point falls within the quadrilateral, it is considered that the statistical point falls within the white sub-region, namely falls within the white region.
  • a circumscribed quadrilateral corresponding to the white sub-region may also be drawn, the circumscribed quadrilateral may consist of two horizontal straight lines and two perpendicular straight lines.
  • a statistical point falls between the two horizontal straight lines and falls between the two perpendicular straight lines, namely when the statistical point falls within the circumscribed quadrilateral, it may be determined whether the statistical point falls within the quadrilateral of the white sub-region.
  • a statistical point does not fall between the two horizontal straight lines and/or does not fall between the two perpendicular straight lines, namely when the statistical point does not fall within the circumscribed quadrilateral, it may be directly determined that the statistical point does not fall within the quadrilateral of the white sub-region.
  • One feasible manner for obtaining the white region of automatic white balance may include but is not limited to following treatments.
  • Treatment 1: N standard grey scale images having different color temperatures, under a light source conforming to blackbody radiation characteristics, may be captured, and the tricolor information of each standard grey scale image may be obtained.
  • a value of the N may be a positive integer greater than or equal to 3.
  • other types of images may also be captured. For example, images captured in an outdoor lighting environment may be obtained, which will not be described here.
  • standard grey scale images under a light source conforming to the blackbody radiation characteristics may be taken as examples.
  • N standard grey scale images (namely, images outputted from an image sensor) of a gray object having different color temperatures, under a light source conforming to the blackbody radiation characteristics, may be captured, so that a corresponding blackbody locus curve may be fitted by using the tricolor information of the N captured standard grey scale images.
  • a unit of color temperature (Tc) is Kelvin (K). The higher the color temperature, the more short-wave components there are and the more the images incline to blue-green; the lower the color temperature, the more long-wave components there are and the more the images incline to red-yellow.
  • the color temperature may only indicate spectral components of the light source but may not indicate a luminous intensity.
  • the color temperature of the light source may be determined by comparing its color with a theoretical thermal blackbody radiation.
  • a Kelvin temperature obtained when a thermal blackbody radiator is matched with the color of the light source may be the color temperature of the light source, which is associated with the Planck's law.
  • the tricolor information may be denoted by chromaticity coordinates, which may be respectively denoted by r, g and b.
  • the tricolor information may be denoted by any two of an r value, a b value and a g value.
  • the tricolor information of a standard grey scale image may be obtained in the following way.
  • Statistical average values respectively for the R component, the B component and the G component of the standard grey scale image may be determined.
  • the r value may be calculated as the average value for the R component/(the average value for the R component+the average value for the G component+the average value for the B component), namely r = Ravg/(Ravg+Gavg+Bavg);
  • the b value may be calculated as the average value for the B component/(the average value for the R component+the average value for the G component+the average value for the B component), namely b = Bavg/(Ravg+Gavg+Bavg);
  • the g value may be calculated as the average value for the G component/(the average value for the R component+the average value for the G component+the average value for the B component), namely g = Gavg/(Ravg+Gavg+Bavg).
  • the r value and the b value may be determined as the tricolor information, or the r value and the g value may be determined as the tricolor information, or the b value and the g value may be determined as the tricolor information. It is to be noticed that the manner for denoting the tricolor information and the manner for obtaining the tricolor information as set forth herein are merely for exemplary purposes, and are not restrictive.
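  • The chromaticity computation above may be sketched as follows (an illustrative sketch, not from the patent; the function name is an assumption):

```python
def chromaticity(r_avg, g_avg, b_avg):
    """Convert component averages of a standard grey scale image to
    chromaticity coordinates (r, g, b): each component average divided
    by the sum of the three component averages."""
    s = r_avg + g_avg + b_avg
    return r_avg / s, g_avg / s, b_avg / s
```

Any two of the three returned coordinates may then be taken as the tricolor information (the third is redundant, since r + g + b = 1).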
  • the treatment of the tricolor information being the r value and the b value, the treatment of the tricolor information being the r value and the g value, and the treatment of the tricolor information being the b value and the g value are the same.
  • Take the tricolor information being the r value and the b value as an example in the following.
  • It is assumed that the r value is 0.6 and the b value is 0.2 for a standard grey scale image whose color temperature (Tc) is 3,000 K; that the r value is 0.5 and the b value is 0.25 for a standard grey scale image whose color temperature (Tc) is 4,000 K; and that the r value is 0.4 and the b value is 0.3 for a standard grey scale image whose color temperature (Tc) is 5,000 K.
  • It is assumed that the r value is 0.3 and the b value is 0.4 for a standard grey scale image whose color temperature (Tc) is 6,000 K; and that the r value is 0.2 and the b value is 0.7 for a standard grey scale image whose color temperature (Tc) is 7,000 K.
  • Treatment 2: a blackbody locus curve may be fitted by using the N different color temperatures and the tricolor information corresponding to each color temperature (namely, the tricolor information of the standard grey scale images corresponding to the N color temperatures).
  • Points at the middle section of the blackbody locus curve may indicate positions of colorimetric values where human eyes may sense white light, wherein the high color temperature section is cold white and the low color temperature section is warm white. Therefore, points falling within a certain distance from the blackbody locus curve may be taken as white points. Based on this principle, in the examples of the present disclosure, the blackbody locus curve may need to be fitted so as to perform automatic white balance.
  • the process of fitting the blackbody locus curve by using N different color temperatures and the tricolor information corresponding to each color temperature specifically may include but is not limited to: selecting two pieces of information from the tricolor information as an abscissa and an ordinate respectively.
  • N sampling points may be obtained by drawing one sampling point at a coordinate position corresponding to each piece of tricolor information, wherein each sampling point may represent a color temperature corresponding to the tricolor information.
  • one blackbody locus curve may be fitted out by using the N sampling points.
  • the r value in the tricolor information may be taken as an abscissa and the b value may be taken as an ordinate, or the b value in the tricolor information may be taken as an abscissa and the r value may be taken as an ordinate.
  • FIG. 2A is a schematic diagram of a blackbody locus curve fitted out.
  • the r value is taken as an abscissa and the b value is taken as an ordinate.
  • the blackbody locus curve as shown in FIG. 2A may be a 2-dimensional curve representing a spatial relationship between the r and the b.
  • the blackbody locus curve as shown in FIG. 2B may be a 3-dimensional curve representing a spatial relationship among the r, the b and the Tc. Reference may be made by taking the blackbody locus curve as shown in FIG. 2A as an example in subsequent description.
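  • Using the illustrative (r, b) values assumed above, one simple way to represent the fitted blackbody locus curve is a piecewise-linear interpolation through the sampling points (a stand-in sketch; the patent does not prescribe a fitting method, and `fit_locus` is a hypothetical name):

```python
def fit_locus(samples):
    """Piecewise-linear approximation of the blackbody locus curve in the
    r-b plane, built from (r, b) sampling points sorted by ascending r.
    Returns a function b = locus(r)."""
    pts = sorted(samples)
    def locus(r):
        if r <= pts[0][0]:
            return pts[0][1]
        for (r0, b0), (r1, b1) in zip(pts, pts[1:]):
            if r <= r1:  # interpolate linearly on this segment
                t = (r - r0) / (r1 - r0)
                return b0 + t * (b1 - b0)
        return pts[-1][1]
    return locus

# The illustrative sampling points assumed earlier (3,000 K to 7,000 K):
samples = [(0.2, 0.7), (0.3, 0.4), (0.4, 0.3), (0.5, 0.25), (0.6, 0.2)]
locus = fit_locus(samples)
```

The resulting `locus` function passes through every sampling point and can be evaluated at intermediate r values.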
  • Treatment 3: M calibration points may be selected from the blackbody locus curve, wherein M is greater than or equal to N.
  • the process of selecting M calibration points from the blackbody locus curve specifically may include but is not limited to the following manners. Manner I: each sampling point among the N sampling points may be determined as a calibration point. Manner II: for any two adjacent sampling points on the blackbody locus curve, it may be determined whether the distance between the two sampling points along the blackbody locus curve is greater than a preset distance; if the determination result is yes, a new sampling point may be interposed between the two sampling points, so that after the interposing, the distance between any two adjacent sampling points along the blackbody locus curve is not greater than the preset distance. All sampling points on the blackbody locus curve may then be determined as calibration points. In addition, if it is determined that the distance between the two sampling points along the blackbody locus curve is not greater than the preset distance, no new sampling point may be interposed between the two sampling points, or a new sampling point may still be interposed between the two sampling points.
  • the five sampling points as shown in FIG. 2A may be directly determined as calibration points.
  • the value of the preset distance max_dist may be arbitrarily set according to practical experiences.
  • the distance between two sampling points may refer to the distance between the two sampling points along the blackbody locus curve. Therefore, the preset distance max_dist also may refer to a distance along the blackbody locus curve.
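  • The interposing of Manner II may be sketched on a polyline approximation of the curve as follows (an illustrative sketch, not from the patent; `densify` is a hypothetical name, and straight-line segment length stands in for the distance along the curve):

```python
import math

def densify(points, max_dist):
    """Insert extra sampling points on a polyline approximation of the
    blackbody locus curve so that no two adjacent points are farther
    apart than max_dist."""
    out = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d = math.hypot(x1 - x0, y1 - y0)
        n = max(1, math.ceil(d / max_dist))  # number of sub-segments
        for k in range(1, n + 1):
            t = k / n
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out
```

All original sampling points are kept; only the gaps larger than max_dist receive new, evenly spaced points.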
  • Treatment 4: for each calibration point, a white sub-region which the calibration point belongs to may be obtained, and a region including the white sub-regions for all calibration points may be the white region, namely, the white region for performing automatic white balance obtained at block 101.
  • for the two calibration points at the two ends of the blackbody locus curve, the white sub-regions which the two calibration points belong to may not be obtained.
  • the processes of obtaining a white sub-region which each calibration point belongs to may be the same, and thus reference may be made by taking one calibration point as an example.
  • the process of obtaining a white sub-region which a calibration point belongs to specifically may include but is not limited to following manners. First, a first calibration point and a second calibration point adjacent to the calibration point on the blackbody locus curve may be obtained. Then a first midperpendicular corresponding to a first line segment and a second midperpendicular corresponding to a second line segment may be obtained, wherein the first line segment may connect the calibration point with the first calibration point, and the second line segment may connect the calibration point with the second calibration point.
  • a first location point and a second location point may be obtained by taking an intersection point of the first line segment and the first midperpendicular as an initial position and respectively extending by a first preset length from the initial position toward two directions of the first midperpendicular.
  • a third location point and a fourth location point may be obtained by taking an intersection point of the second line segment and the second midperpendicular as an initial position and respectively extending by a second preset length from the initial position toward two directions of the second midperpendicular.
  • a quadrilateral formed by the first location point, the second location point, the third location point and the fourth location point may be determined as a white sub-region which the calibration point belongs to.
  • the value of the first preset length and the value of the second preset length may be arbitrarily set according to practical experiences, and the first preset length may be identical to or different from the second preset length.
  • a first line segment may be obtained by connecting the calibration point Pi(r i , b i ) with the first calibration point P i ⁇ 1 (r i ⁇ 1 , b i ⁇ 1 ), and a second line segment may be obtained by connecting the calibration point Pi(r i , b i ) with the second calibration point P i+1 (r i+1 , b i+1 ). Further, a first midperpendicular corresponding to the first line segment and a second midperpendicular corresponding to the second line segment may be obtained.
  • a first location point (r_min) and a second location point (b_max) may be obtained by taking an intersection point of the first line segment and the first midperpendicular as an initial position and respectively extending by a first preset length (such as width_thr) from the initial position toward two directions of the first midperpendicular.
  • a third location point (b_min) and a fourth location point (r_max) may be obtained by taking an intersection point of the second line segment and the second midperpendicular as an initial position and respectively extending by the first preset length (such as width_thr) from the initial position toward two directions of the second midperpendicular.
  • the above four location points may form a quadrilateral, which may be a white sub-region corresponding to the calibration point P i (r i , b i ).
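  • The construction of the four location points from the two midperpendiculars may be sketched as follows (an illustrative sketch, not from the patent; `white_sub_region` and `half_points` are hypothetical names, and a single width_thr is used for both midperpendiculars as in the example above):

```python
import math

def white_sub_region(p_prev, p, p_next, width_thr):
    """Corners of the quadrilateral white sub-region for calibration
    point p, built from the midperpendiculars of the segments joining p
    with its two neighbouring calibration points on the locus curve."""
    def half_points(a, b):
        mx, my = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2   # segment midpoint
        dx, dy = b[0] - a[0], b[1] - a[1]
        norm = math.hypot(dx, dy)
        ux, uy = -dy / norm, dx / norm                  # unit perpendicular
        return ((mx + width_thr * ux, my + width_thr * uy),
                (mx - width_thr * ux, my - width_thr * uy))
    p1, p2 = half_points(p_prev, p)   # two points on the first midperpendicular
    p3, p4 = half_points(p, p_next)   # two points on the second midperpendicular
    return p1, p2, p3, p4
```

For collinear calibration points the result is a rectangle centered on p; in general the four points enclose a quadrilateral straddling the curve.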
  • FIG. 2E is a schematic diagram of a white region including a plurality of white sub-regions, namely, the white region including a plurality of white sub-regions may be obtained by using N standard grey scale images having different color temperatures.
  • the white region including a plurality of white sub-regions
  • the r value and the b value of the statistical point P may be determined, then the statistical point P may be indicated as P(r p , b p ).
  • the manner for determining the r value and the b value may refer to the above block, which is not repeated any more herein.
  • all white sub-regions may be traversed successively, and it may be determined whether the statistical point P(r_p, b_p) falls within the currently-traversed white sub-region. If the determination result is yes, the statistical point may be determined to fall within the white region, and no more white sub-regions may be traversed. If all the white sub-regions are traversed and the statistical point does not fall within any white sub-region, it may be determined that the statistical point P(r_p, b_p) does not fall within the white region.
  • the process of determining whether a statistical point falls within a white sub-region specifically may include but is not limited to following manners.
  • a first slope and a first intercept of a third line segment connecting the first location point with the second location point may be obtained, and a second slope and a second intercept of a fourth line segment connecting the third location point with the fourth location point may be obtained.
  • a third slope and a third intercept of a fifth line segment connecting the first location point with the third location point may be obtained, and a fourth slope and a fourth intercept of a sixth line segment connecting the second location point with the fourth location point may be obtained. Then it may be determined whether the statistical point falls between the fifth line segment and the sixth line segment by using the third slope, the third intercept, the fourth slope and the fourth intercept. When the statistical point falls between the third line segment and the fourth line segment and between the fifth line segment and the sixth line segment, it may be determined that the statistical point falls within the white sub-region enclosed by the four location points. Otherwise, it may be determined that the statistical point does not fall within the white sub-region enclosed by the four location points.
  • the above is merely one feasible manner for determining whether a statistical point falls within a white sub-region, and in practical application, it may be determined whether a statistical point falls within a white sub-region by other means, which is not repeated any more herein.
  • the statistical point P(r p , b p ) falls within the white sub-region corresponding to the calibration point P i (r i , b i ).
  • for the third line segment connecting the first location point (r_min) with the second location point (b_max), the first slope (k_A_2) and the first intercept (b_A_2) of the third line segment may also be obtained.
  • for the fourth line segment connecting the third location point (b_min) with the fourth location point (r_max), the second slope (k_A_1) and the second intercept (b_A_1) of the fourth line segment may also be obtained.
  • for the fifth line segment connecting the first location point (r_min) with the third location point (b_min), the third slope (k_B_2) and the third intercept (b_B_2) of the fifth line segment may also be obtained.
  • for the sixth line segment connecting the second location point (b_max) with the fourth location point (r_max), the fourth slope (k_B_1) and the fourth intercept (b_B_1) of the sixth line segment may also be obtained.
  • the first slope (k_A_ 2 ) and the first intercept (b_A_ 2 ), the second slope (k_A_ 1 ) and the second intercept (b_A_ 1 ), the third slope (k_B_ 2 ) and the third intercept (b_B_ 2 ), and the fourth slope (k_B_ 1 ) and the fourth intercept (b_B_ 1 ) may be stored on the front end device, and the stored information may be directly used in a subsequent process.
  • the first slope (k_A_2) and the first intercept (b_A_2), the second slope (k_A_1) and the second intercept (b_A_1), the third slope (k_B_2) and the third intercept (b_B_2), and the fourth slope (k_B_1) and the fourth intercept (b_B_1) may be calculated out; the calculation may be performed in many ways and is not described herein.
  • the statistical point P(r_p, b_p) falls between the third line segment and the fourth line segment. Otherwise, it may be determined that the statistical point P(r_p, b_p) does not fall between the third line segment and the fourth line segment.
  • the statistical point P(r_p, b_p) falls between the fifth line segment and the sixth line segment. Otherwise, it may be determined that the statistical point P(r_p, b_p) does not fall between the fifth line segment and the sixth line segment.
  • the first intercept (b_A_2), the second intercept (b_A_1), the third intercept (b_B_2) and the fourth intercept (b_B_1) used here may be vertical intercepts and/or horizontal intercepts. In this application scenario, other available equations are not described in detail.
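The slope-and-intercept comparisons described above can be sketched as follows. This is a hedged illustration, not the patented implementation: it assumes vertical intercepts and non-vertical line segments, and the function names are hypothetical.

```python
def side(point, slope, intercept):
    """Signed vertical offset of `point` from the line y = slope * x + intercept."""
    x, y = point
    return y - (slope * x + intercept)

def between(point, line_a, line_b):
    """True if `point` lies between two lines, each given as (slope, intercept);
    points lying exactly on either line count as inside."""
    return side(point, *line_a) * side(point, *line_b) <= 0

def in_white_sub_region(point, third_fourth, fifth_sixth):
    """`third_fourth` holds the (slope, intercept) pairs of the third and fourth
    line segments, `fifth_sixth` those of the fifth and sixth; the point falls
    within the white sub-region when it lies between both pairs."""
    return between(point, *third_fourth) and between(point, *fifth_sixth)

# A diamond-shaped sub-region bounded by y = x, y = x + 1, y = -x, y = -x + 1:
print(in_white_sub_region((0.25, 0.5), ((1.0, 0.0), (1.0, 1.0)),
                          ((-1.0, 0.0), (-1.0, 1.0))))   # inside -> True
print(in_white_sub_region((2.0, 2.0), ((1.0, 0.0), (1.0, 1.0)),
                          ((-1.0, 0.0), (-1.0, 1.0))))   # outside -> False
```

The sign test avoids having to know which of the two lines is "above" the point: the point is between them exactly when its signed offsets from the two lines differ in sign.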
  • a circumscribed quadrilateral of the white sub-region may be formed by two horizontal straight lines and two vertical straight lines.
  • the circumscribed quadrilateral may be a rectangle, and the white sub-region may be located within the circumscribed quadrilateral.
  • the two horizontal straight lines may be straight lines parallel to an X-axis of a coordinate system for the white sub-region, and the two vertical straight lines may be straight lines perpendicular to the X-axis of the coordinate system for the white sub-region.
  • FIG. 2G is a schematic diagram of the circumscribed quadrilateral of the white sub-region.
  • it may be preliminarily determined, by way of comparison, whether the statistical point P(r_p, b_p) falls within the circumscribed quadrilateral of the white sub-region. If the statistical point P(r_p, b_p) does not fall within the circumscribed quadrilateral, it may be determined that the statistical point does not fall within the white sub-region. If the statistical point P(r_p, b_p) falls within the circumscribed quadrilateral, the determination of whether the statistical point falls within the white sub-region may continue.
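This coarse-to-fine check can be sketched as below; the rectangle bounds, the function names, and the way the full test is passed in are all illustrative assumptions.

```python
def in_circumscribed_rect(point, r_min, r_max, b_min, b_max):
    """Cheap preliminary test: is the point inside the axis-aligned rectangle
    circumscribing the white sub-region?"""
    r, b = point
    return r_min <= r <= r_max and b_min <= b <= b_max

def classify(point, rect, full_test):
    """Run the more expensive slope/intercept sub-region test only when the
    point survives the rectangle check."""
    if not in_circumscribed_rect(point, *rect):
        return False          # definitely outside the white sub-region
    return full_test(point)   # refine with the full quadrilateral test

# Unit-square bounding rectangle; the full test is stubbed out here:
print(classify((0.5, 0.5), (0.0, 1.0, 0.0, 1.0), lambda p: True))   # True
print(classify((2.0, 2.0), (0.0, 1.0, 0.0, 1.0), lambda p: True))   # False
```

The rectangle check is four comparisons, so most statistical points can be rejected without evaluating any line equations.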
  • a statistical point falling within the white region may not be directly determined as a white point; instead, all the statistical points falling within the white region may be analyzed in subsequent blocks, and the interfering points among them may be determined as non-white points.
  • one or more clusters may be generated by clustering all statistical points falling within the white region.
  • each white sub-region may be equally divided into a plurality of sub-regions, wherein each sub-region may be a polygon; by way of example, each sub-region is taken to be a quadrilateral, as shown in FIG. 3A. Based on this, the sub-region to which the statistical point belongs may be determined, and a specific determination manner may be similar to the manner for determining the white sub-region to which the statistical point belongs.
  • each sub-region among all white sub-regions may be traversed successively, and it may be determined whether the statistical point P(r_p, b_p) falls within the quadrilateral of the currently-traversed sub-region. If the determination result is yes, no further sub-regions need to be traversed, and it may be determined that the statistical point P(r_p, b_p) falls within the currently-traversed sub-region.
  • it may be determined whether a statistical point P(r_p, b_p) falls within a sub-region based on the four slopes and four intercepts respectively corresponding to the four sides of the quadrilateral of the sub-region; a specific determination manner may refer to the foregoing manner for determining whether the statistical point falls within the white sub-region, which is not repeated herein.
  • the process of generating one or more clusters by clustering all statistical points within the white regions specifically may include but is not limited to the following manners.
  • the number of statistical points within each sub-region may be determined.
  • by using a density-based spatial clustering algorithm which takes the center of each sub-region as a clustering object and the number of statistical points within the sub-region as the density of that clustering object, one or more clusters may be generated from the sub-regions.
  • a sub-region to which the statistical point belongs may be determined, a cluster to which the sub-region belongs may be determined, and the statistical point may be classified into the cluster.
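The three steps above can be sketched with a simple grid flood fill standing in for the density-based clustering algorithm (DBSCAN or similar). The (row, col) sub-region indexing, the 4-neighbour adjacency rule, and the density threshold are illustrative assumptions.

```python
from collections import deque

def cluster_sub_regions(counts, min_density=1):
    """counts maps a sub-region's (row, col) grid index to the number of
    statistical points it contains.  Each dense sub-region is a clustering
    object whose density is its point count; adjacent dense sub-regions are
    merged into one cluster, mirroring the density-based clustering step."""
    dense = {cell for cell, n in counts.items() if n >= min_density}
    labels = {}
    next_label = 0
    for start in sorted(dense):
        if start in labels:
            continue
        labels[start] = next_label
        queue = deque([start])
        while queue:                      # flood fill over 4-neighbours
            r, c = queue.popleft()
            for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if nb in dense and nb not in labels:
                    labels[nb] = next_label
                    queue.append(nb)
        next_label += 1
    return labels

# Two dense patches separated on the grid yield two clusters:
counts = {(0, 0): 3, (0, 1): 2, (2, 2): 4}
labels = cluster_sub_regions(counts)
print(labels[(0, 0)] == labels[(0, 1)])   # True: adjacent, same cluster
print(labels[(0, 0)] == labels[(2, 2)])   # False: separate cluster
```

A statistical point is then classified into the cluster whose label its sub-region carries, so clustering cost depends on the number of sub-regions rather than the number of points.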
  • FIG. 3B is a schematic diagram of sub-regions determined for all statistical points.
  • FIG. 3C is a schematic diagram illustrating the number of statistical points within each sub-region determined in FIG. 3B .
  • Spatial clustering algorithms may include a partition-based clustering algorithm, a hierarchical-based clustering algorithm, a density-based clustering algorithm, a grid-based clustering algorithm, and a model-based clustering algorithm, etc.
  • all statistical points falling within the white region may be clustered by using the density-based spatial clustering algorithm, and one or more clusters may be obtained.
  • the density-based spatial clustering algorithm is mainly characterized by using region density as the basis for partitional clustering: a data space region may be added into a cluster of similar regions as long as the density of the data space region exceeds a predefined threshold.
  • the density-based spatial clustering algorithm may be a Density-based Spatial Clustering of Applications with Noise (DBSCAN) algorithm, an Ordering Points To Identify the Clustering Structure (OPTICS) algorithm, and a Density-based Clustering (DENCLUE) algorithm, etc.
  • DBSCAN: Density-based Spatial Clustering of Applications with Noise
  • OPTICS: Ordering Points To Identify the Clustering Structure
  • DENCLUE: Density-based Clustering
  • one or more clusters may be generated from the sub-regions. As shown in FIG.
  • the sub-region 5 , the sub-region 6 , the sub-region 7 and the sub-region 8 which are adjacent may be taken as a cluster 1
  • the sub-region 9 and the sub-region 10 which are adjacent may be taken as a cluster 2 .
  • the sub-region 1 may form a cluster 3
  • the sub-region 2 may form a cluster 4
  • the sub-region 3 and the sub-region 4 may form a cluster 5 .
  • a clustering analysis may be carried out for each sub-region instead of each statistical point, which may reduce the computational complexity. For example, if there are 1,024 statistical points and 256 sub-regions, and the time complexity of the clustering analysis is O(N^2), then performing the clustering analysis for each statistical point gives a time complexity of O(1024^2), whereas performing it for each sub-region gives O(256^2). Hence, the computational complexity may be reduced when the clustering analysis is performed for each sub-region.
  • when the number of statistical points within a cluster is greater than a preset threshold, the cluster may be determined as a target cluster and the statistical points falling within the target cluster may be determined as white points. When the number of statistical points within the cluster is not greater than the preset threshold, the statistical points falling within the cluster may be determined as non-white points (interfering points).
  • if the number of statistical points within the cluster 1 is greater than the preset threshold, the cluster 1 may be determined as a target cluster and all the statistical points within the target cluster may be determined as white points. If the number of statistical points within the cluster 2 is greater than the preset threshold, the cluster 2 may be determined as a target cluster and all the statistical points within the target cluster may be determined as white points.
  • if the number of statistical points within the cluster 3 is not greater than the preset threshold, all the statistical points within the cluster 3 may be determined as non-white points; if the number of statistical points within the cluster 4 is not greater than the preset threshold, all the statistical points within the cluster 4 may be determined as non-white points; if the number of statistical points within the cluster 5 is not greater than the preset threshold, all the statistical points within the cluster 5 may be determined as non-white points.
  • the preset threshold may be arbitrarily set according to practical experiences, for example, may be set as 5.
  • interfering points may be effectively separated from white points and the interfering points may be determined as non-white points. In this way, the interfering points falling within the white region may be eliminated.
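The threshold filtering described above can be sketched minimally as follows; the cluster representation and the function name are hypothetical, and the threshold of 5 follows the example in the text.

```python
def split_white_and_interfering(clusters, threshold=5):
    """clusters maps a cluster id to the list of statistical points it holds.
    Clusters with more than `threshold` points become target clusters whose
    points are white points; points in all other clusters are treated as
    interfering (non-white) points."""
    white, interfering = [], []
    for points in clusters.values():
        (white if len(points) > threshold else interfering).extend(points)
    return white, interfering

# Cluster 1 has 8 points (target cluster); cluster 3 has only 2 (interference):
clusters = {1: [(10, 20)] * 8, 3: [(90, 90)] * 2}
white, interfering = split_white_and_interfering(clusters)
print(len(white), len(interfering))   # 8 2
```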
  • a white balance gain of the to-be-detected image may be calculated based on the target cluster. Afterwards, the to-be-detected image may be corrected by using the white balance gain; a specific correction process is not described further herein.
  • interfering points may be effectively separated from white points, and interfering points falling within the white region may be eliminated, namely, clusters including interfering points also may be eliminated.
  • the process of calculating the white balance gain of the to-be-detected image based on the target cluster specifically may include but is not limited to the following manners.
  • a clustering feature of the target cluster and tricolor information of statistical points falling within the target cluster may be obtained, and a weight value may be allocated for a statistical point falling within the target cluster by using the clustering feature of the target cluster.
  • the white balance gain of the to-be-detected image may be calculated by using the weight values and the tricolor information of the statistical points falling within the target cluster.
  • the process of allocating a weight value for a statistical point falling within the target cluster by using the clustering feature of the target cluster specifically may include but is not limited to the following manners.
  • Manner I: when the clustering feature indicates there is only one target cluster within the white region, an identical weight value may be allocated for each statistical point within the target cluster.
  • Manner II: when the clustering feature indicates there are at least two target clusters within the white region, weight values may be allocated for the statistical points within the target clusters by using a weighting parameter, such that the statistical points within the same target cluster may have the same weight value and the statistical points within different target clusters may have different weight values.
  • the weighting parameter specifically may include but is not limited to one or more of the following: the number of statistical points within a cluster, the color temperature corresponding to the center position of the cluster, and the distance relation between the center position of the cluster and a blackbody locus curve.
  • a different weight value may be allocated for the statistical points within each target cluster according to a weighting parameter. For example, the larger the number of statistical points within a target cluster is, the larger the weight value allocated for the statistical points within the target cluster is. The closer the color temperature corresponding to the center position of the target cluster is to a common color temperature section of a light source, the larger the weight value allocated for the statistical points within the target cluster is. The shorter the distance between the center position of the target cluster and the blackbody locus curve is, the larger the weight value allocated for the statistical points within the target cluster is.
  • the color temperature being close to a common color temperature section of a light source may be understood as follows: if the color temperature of the light sources used for white balance calibration ranges from 2,000 K to 10,000 K whereas in practice the common color temperature of the light source ranges from 4,000 K to 6,000 K, then a color temperature within the section from 4,000 K to 6,000 K is considered close to the common color temperature section of the light source.
  • the weight value a allocated for all the statistical points within the cluster 1 may be greater than the weight value b allocated for all the statistical points within the cluster 2 .
  • when the weighting parameter corresponding to a cluster simultaneously includes the number of statistical points within the cluster, the color temperature corresponding to the center position of the cluster, and the distance relation between the center position of the cluster and the blackbody locus curve, one feasible implementation may be as below.
  • the number of the statistical points within the cluster 1 is numA
  • the number of the statistical points within the cluster 2 is numB
  • a difference value between the color temperature corresponding to the center position of the cluster 1 and a mid-value of the common color temperature section of the light source is T 1
  • a difference value between the color temperature corresponding to the center position of the cluster 2 and the mid-value of the common color temperature section of the light source is T 2
  • the distance between the center position of the cluster 1 and the blackbody locus curve is D 1
  • the distance between the center position of the cluster 2 and the blackbody locus curve is D 2 .
  • Coefficient 1, Coefficient 2 and Coefficient 3 may be numerical values configured according to practical experience.
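One hedged way to combine the three weighting parameters for a cluster is sketched below. The linear form and the default coefficient values are assumptions for illustration, not the patented formula; only the inputs (point count, temperature difference T, locus distance D, and the three coefficients) come from the text.

```python
def cluster_weight(num_points, temp_diff, locus_dist,
                   coefficient1=1.0, coefficient2=0.5, coefficient3=0.5):
    """Illustrative score for a cluster: more statistical points raise the
    weight, while a larger color temperature difference T from the common
    color temperature section and a larger distance D to the blackbody locus
    curve lower it.  Coefficients are tuned from practical experience."""
    return (coefficient1 * num_points
            - coefficient2 * temp_diff
            - coefficient3 * locus_dist)

# Cluster 1 (numA=10, T1=2, D1=1) outweighs cluster 2 (numB=5, T2=4, D2=3):
print(cluster_weight(10, 2, 1) > cluster_weight(5, 4, 3))   # True
```

Any monotone combination with the same signs (increasing in point count, decreasing in T and D) would satisfy the ordering rules stated above.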
  • the tricolor information of a statistical point specifically may include: a response value of Component G, a response value of Component R and a response value of Component B of the statistical point.
  • the process of calculating the white balance gain of the to-be-detected image by using the weight values and the tricolor information of statistical points within the target cluster specifically may include but is not limited to:
  • i denotes the i-th statistical point within a cluster, where i ranges from 1 to N and N is the maximum number of statistical points.
  • W(i) denotes the weight value of the i-th statistical point within the cluster.
  • the value assigned to the G channel gain CgAvg generally may be 1.0.
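The weighted-average gains defined here can be sketched as follows. The (R, G, B) tuple representation and the function name are assumptions; Cr(i) = G/R, Cb(i) = G/B, and the CgAvg assignment of 1.0 follow the text.

```python
def white_balance_gains(points, weights):
    """points holds the (R, G, B) response values of the statistical points in
    the target clusters; weights holds the per-point W(i).
    CrAvg = sum(Cr(i) * W(i)) / sum(W(i)) with Cr(i) = G / R,
    CbAvg = sum(Cb(i) * W(i)) / sum(W(i)) with Cb(i) = G / B,
    and the G channel gain CgAvg is simply assigned (generally 1.0)."""
    total = sum(weights)
    cr_avg = sum((g / r) * w for (r, g, b), w in zip(points, weights)) / total
    cb_avg = sum((g / b) * w for (r, g, b), w in zip(points, weights)) / total
    cg_avg = 1.0
    return cr_avg, cg_avg, cb_avg

# Two equally weighted white points:
points = [(100.0, 200.0, 50.0), (200.0, 200.0, 100.0)]
print(white_balance_gains(points, [1.0, 1.0]))   # (1.5, 1.0, 3.0)
```

Because interfering clusters were already discarded, only white points (with their cluster-derived weights) contribute to the gains.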
  • interfering points: white points under a weak light source at different color temperatures, white points with a larger color temperature difference, or other colorized points
  • the interfering points may be effectively separated from white points and the interfering points may be determined as non-white points. In this way, the interfering points falling within the white region may be eliminated.
  • an adverse effect of the interfering points on the white balance may be eliminated, including the adverse effects of local colorized sceneries, multisource light supplement, moving colorized objects, and mixed color temperatures or the like, so that the automatic white balance may be applicable to complex scenes where local colorized sceneries, multisource light supplement, moving colorized objects, or mixed color temperatures are present. In this way, the precision and stability of automatic white balance processing in complex scenes may be improved.
  • examples of the present disclosure provide an apparatus for automatic white balance, which may be applied to a front end device.
  • the apparatus for automatic white balance may be implemented by means of software, or may be implemented by means of hardware or a combination of software and hardware.
  • Taking software implementation as an example, as an apparatus in a logical sense, the apparatus is formed by a processor of the front end device reading machine-executable instructions from a nonvolatile storage medium.
  • FIG. 4 is a hardware structure diagram of a front end device in which the apparatus for automatic white balance is located.
  • the front end device may further include other hardware, such as a forwarding chip in charge of packet processing, a network interface, a memory, and so on. From the perspective of hardware structure, the front end device may also be a distributed device and may include a plurality of interface cards to support extension of packet processing.
  • FIG. 5 illustrates a block diagram of functional modules of the control logic for automatic white balance according to the present disclosure.
  • the functional modules of the control logic for automatic white balance may specifically include:
  • an obtaining module 11 configured to obtain a white region used for performing automatic white balance;
  • a clustering module 12 configured to generate at least one cluster by clustering statistical points within the white region of a to-be-detected image;
  • a determining module 13 configured to determine, in case that the number of statistical points within one cluster is greater than a preset threshold, the cluster as a target cluster and determine the statistical points falling within the target cluster as white points;
  • a calculating module 14 configured to calculate a white balance gain of the to-be-detected image based on the target cluster.
  • the white region may include a plurality of sub-regions.
  • the clustering module 12 may determine the number of statistical points within each sub-region when clustering the statistical points of the to-be-detected image falling within the white region.
  • the clustering module 12 may cluster the plurality of sub-regions to generate at least one cluster by using a density-based spatial clustering algorithm which takes a center of the sub-region as a clustering object and the number of statistical points within the sub-region as a density of the clustering object.
  • the calculating module 14 may obtain a clustering feature of the target cluster and tricolor information of statistical points falling within the target cluster, allocate a weight value for the statistical points falling within the target cluster by using the clustering feature, and calculate the white balance gain of the to-be-detected image by using the weight value and the tricolor information of the statistical points falling within the target cluster.
  • the calculating module 14 may allocate an identical weight value for the statistical points within the target cluster when the clustering feature indicates there is only one target cluster in the white region, or allocate weight values for the statistical points within the target clusters by using a weighting parameter when the clustering feature indicates there are at least two target clusters in the white region, such that the statistical points within the same target cluster may have the same weight value, and the statistical points within different target clusters may have different weight values.
  • the weighting parameter includes one or more of the following: the number of statistical points within a cluster, a color temperature corresponding to a center position of the cluster, and a distance relation between the center position of the cluster and a blackbody locus curve.
  • a larger weight value may be allocated for the statistical points within a target cluster as the number of statistical points within the target cluster becomes larger, as the color temperature corresponding to the center position of the target cluster becomes closer to the common color temperature section of the light source, or as the center position of the target cluster becomes nearer to the blackbody locus curve.
  • the tricolor information of the statistical points may include: a response value of Component G, a response value of Component R and a response value of Component B of the statistical points.
  • the calculating module 14 may calculate an R channel gain CrAvg of the to-be-detected image by using the formula CrAvg = Σ Cr(i)·W(i) / Σ W(i);
  • the calculating module 14 may calculate a B channel gain CbAvg of the to-be-detected image by using the formula CbAvg = Σ Cb(i)·W(i) / Σ W(i), and assign a numerical value to a G channel gain CgAvg of the to-be-detected image;
  • i may denote the i-th statistical point within a cluster
  • W(i) may denote the weight value of the i-th statistical point
  • Cr(i) may denote the value obtained by dividing the response value of Component G of the i-th statistical point by the response value of Component R of the i-th statistical point
  • Cb(i) may denote the value obtained by dividing the response value of Component G of the i-th statistical point by the response value of Component B of the i-th statistical point.
  • modules of the apparatus in the present disclosure may be integrated as a whole, or may be detachably deployed.
  • the foregoing modules may be merged into one module, or may be further divided into a plurality of submodules.
  • modules in the apparatus in the example may be distributed according to description of the example, or may be correspondingly changed and positioned in one or more apparatuses different from this example.
  • Serial numbers in the foregoing examples of the present disclosure are merely for the purpose of descriptions and do not represent advantages and disadvantages of the examples.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Processing Of Color Television Signals (AREA)
  • Color Television Image Signal Generators (AREA)
US15/368,133 2015-12-30 2016-12-02 Automatic white balance Active US10070110B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201511024504.8A CN105430367B (zh) 2015-12-30 2015-12-30 Automatic white balance method and apparatus
CN201511024504.8 2015-12-30
CN201511024504 2015-12-30

Publications (2)

Publication Number Publication Date
US20170195648A1 US20170195648A1 (en) 2017-07-06
US10070110B2 true US10070110B2 (en) 2018-09-04

Family

ID=55508265

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/368,133 Active US10070110B2 (en) 2015-12-30 2016-12-02 Automatic white balance

Country Status (6)

Country Link
US (1) US10070110B2 (zh)
EP (1) EP3188481B1 (zh)
CN (1) CN105430367B (zh)
ES (1) ES2763024T3 (zh)
HU (1) HUE048668T2 (zh)
PL (1) PL3188481T3 (zh)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108024106B (zh) * 2016-11-04 2019-08-23 上海富瀚微电子股份有限公司 Color correction apparatus and method supporting RGBIR and RGBW formats
CN108632582B (zh) * 2017-03-17 2019-12-27 比亚迪股份有限公司 Image white balance adjustment method and apparatus
CN107483906B (zh) * 2017-07-25 2019-03-19 Oppo广东移动通信有限公司 Image white balance processing method, apparatus and terminal device
CN108156379A (zh) * 2017-12-28 2018-06-12 努比亚技术有限公司 Automatic white balance optimization method, terminal and computer-readable storage medium
CN108093234B (zh) * 2017-12-29 2019-10-29 努比亚技术有限公司 Image processing method, terminal and storage medium
CN108377372B (zh) * 2018-03-13 2019-10-29 普联技术有限公司 White balance processing method, apparatus, terminal device and storage medium
CN108540787B (zh) * 2018-03-16 2019-09-17 浙江大华技术股份有限公司 Method, apparatus, device and storage medium for determining a white balance point region
CN108337496B (zh) * 2018-04-25 2020-01-31 普联技术有限公司 White balance processing method, processing apparatus, processing device and storage medium
WO2020000262A1 (zh) * 2018-06-27 2020-01-02 华为技术有限公司 Light source estimation method, image processing method and related products
CN110876049B (zh) * 2018-08-29 2021-12-28 浙江宇视科技有限公司 Image white balance processing method and apparatus
CN110388987B (zh) * 2019-07-24 2020-11-10 深圳市华星光电技术有限公司 Method for obtaining the color temperature of an image
CN112788322B (zh) * 2019-11-07 2023-04-07 浙江宇视科技有限公司 Adaptive white balance processing method, apparatus, medium and electronic device
CN113055665B (zh) * 2019-12-27 2023-04-07 Oppo广东移动通信有限公司 Image processing method, terminal and storage medium
CN111210764B (zh) * 2020-01-21 2021-01-12 卡莱特(深圳)云科技有限公司 LED screen correction method and correction apparatus
CN114071107B (zh) * 2020-08-10 2023-10-31 合肥君正科技有限公司 Automatic white balance method and apparatus based on fused clustering analysis and color temperature curves
CN113068016B (zh) * 2021-04-02 2023-03-24 杭州涂鸦信息技术有限公司 White balance correction method, apparatus and computer device
CN113676716B (zh) * 2021-08-23 2022-10-14 深圳创维-Rgb电子有限公司 White balance control method, apparatus, terminal device and storage medium
CN114222105B (zh) * 2021-12-16 2024-05-03 苏州科达科技股份有限公司 White balance adjustment method and system, white balance terminal and storage medium
CN114866754B (zh) * 2022-04-27 2023-12-19 北京奕斯伟计算技术股份有限公司 Automatic white balance method and apparatus, computer-readable storage medium and electronic device
CN114866755B (zh) * 2022-05-17 2023-12-19 北京奕斯伟计算技术股份有限公司 Automatic white balance method and apparatus, computer storage medium and electronic device
CN115334295B (zh) * 2022-08-10 2024-05-03 杭州联吉技术有限公司 Image white balance processing method and electronic device
CN115379186B (zh) * 2022-08-19 2023-11-03 福州鑫图光电有限公司 Image automatic white balance method and terminal

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080101690A1 (en) 2006-10-26 2008-05-01 De Dzwo Hsu Automatic White Balance Statistics Collection
US20090097745A1 (en) 2007-10-11 2009-04-16 Korea Advanced Institute Of Science And Technology Method of performing robust auto white balance
US20090102944A1 (en) 2007-10-22 2009-04-23 Nishizawa Yuka Color signal processing circuit, image pickup apparatus, and color signal processing method
CN103297789A (zh) 2013-05-20 2013-09-11 周宇 White balance correction method and apparatus
CN103929632A (zh) 2014-04-15 2014-07-16 浙江宇视科技有限公司 Automatic white balance correction method and apparatus
US8854709B1 (en) 2013-05-08 2014-10-07 Omnivision Technologies, Inc. Automatic white balance based on dynamic mapping
CN104618702A (zh) 2014-12-31 2015-05-13 湖南国科微电子有限公司 Automatic white balance method for a digital camera device based on the white patch assumption
CN104702941A (zh) 2013-12-09 2015-06-10 展讯通信(上海)有限公司 White point region representation and determination method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
European Patent Office, Extended European Search Report Issued in Application No. 16203411.0, May 22, 2017, Germany, 7 pages.

Also Published As

Publication number Publication date
PL3188481T3 (pl) 2020-06-15
CN105430367A (zh) 2016-03-23
EP3188481A1 (en) 2017-07-05
ES2763024T3 (es) 2020-05-26
EP3188481B1 (en) 2019-11-06
CN105430367B (zh) 2017-11-03
HUE048668T2 (hu) 2020-08-28
US20170195648A1 (en) 2017-07-06

Similar Documents

Publication Publication Date Title
US10070110B2 (en) Automatic white balance
EP3542347B1 (en) Fast fourier color constancy
CN109361910B (zh) 自适应白平衡校正方法及装置
US20200396434A1 (en) Image White Balance Processing System and Method
CN110660088A (zh) 一种图像处理的方法和设备
JP2021507637A (ja) マルチカメラ画像処理
US10055824B2 (en) Image correction device, image correction method and storage medium
CN105472365B (zh) 一种自动白平衡的方法和装置
US20150131902A1 (en) Digital Image Analysis
WO2022257396A1 (zh) 图像中的色边像素点的确定方法、确定装置和计算机设备
CN102867295A (zh) 一种彩色图像颜色校正方法
CN112580433A (zh) 一种活体检测的方法及设备
Kawakami et al. Consistent surface color for texturing large objects in outdoor scenes
CN114866754B (zh) 自动白平衡方法、装置及计算机可读存储介质和电子设备
CN107872663A (zh) 图像处理方法及装置、计算机可读存储介质和计算机设备
KR20120069539A (ko) 광원 추정 장치 및 광원 추정 방법
CN115082328A (zh) 用于图像校正的方法和设备
US20110033085A1 (en) Image processing apparatus and image processing method
US20240212101A1 (en) Image fusion method, electronic device, unmanned aerial vehicle and storage medium
US10991130B2 (en) Systems and methods for implementing a sensor based real time tracking system
JP6573798B2 (ja) 画像処理装置及び画像処理方法
KR100886323B1 (ko) 컬러 히스토그램을 이용한 실시간 물체 추적 방법 및 장치
Ebner et al. On determining the color of the illuminant using the dichromatic reflection model
CN114071108B (zh) 图像处理方法、装置、电子设备和计算机可读存储介质
KR100230446B1 (ko) 칼라 화상으로부터 조명의 칼라를 결정하는 방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: ZHEJIANG UNIVIEW TECHNOLOGIES CO., LTD, CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHANG, DE;CHEN, DUOMING;REEL/FRAME:040540/0084

Effective date: 20161118

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4