EP1339239A2 - White Balance Correction - Google Patents

White Balance Correction

Info

Publication number
EP1339239A2
EP1339239A2 (Application EP03250997A)
Authority
EP
European Patent Office
Prior art keywords
condition
white
image
color temperature
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP03250997A
Other languages
English (en)
French (fr)
Other versions
EP1339239B1 (de)
EP1339239A3 (de)
Inventor
Eiichiro Ikeda (Canon Kabushiki Kaisha)
Takaaki Fukui (Canon Kabushiki Kaisha)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Publication of EP1339239A2
Publication of EP1339239A3
Application granted
Publication of EP1339239B1
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control

Definitions

  • This invention relates to a white balance correction apparatus and method. More particularly, the invention relates to a white balance correction apparatus and method used in a device such as a digital camera or digital video camera.
  • Fig. 13 is a block diagram illustrating the structure of a single CCD type digital camera as one example of a conventional image sensing apparatus.
  • A solid-state image sensing device 1 such as a CCD has its surface covered by an RGB color filter in, e.g., a Bayer-type array, which enables color image sensing.
  • The optical image of an object that impinges upon the image sensing device 1 via a lens (not shown) is converted to an electric signal by the image sensing device 1.
  • The signal is processed by a CDS/AGC circuit 2, after which it is converted to a digital signal by an A/D converter circuit 3, pixel by pixel, in successive fashion.
  • The digital signal output from the A/D converter circuit 3 has its white gain adjusted by a white balance circuit 4, and the resulting signal is sent to a luminance notch circuit 12.
  • The luminance notch circuit 12 uses a vertical low-pass filter (VLPF) to reduce the signal gain of frequencies in the vicinity of the Nyquist frequency in the vertical direction; a horizontal low-pass filter (HLPF), a horizontal band-pass filter (HBPF) and a vertical band-pass filter (VBPF) are used for the corresponding horizontal-direction and band-pass processing.
  • The amplitude is subsequently adjusted by PP (aperture peak) gain circuits 14 and 17 in the horizontal and vertical directions, and low-amplitude components are then cut, thereby eliminating noise, by base clipping (BC) circuits 15 and 18.
  • The horizontal and vertical components are subsequently added by an adder 19, main gain is applied by an APC (aperture control) main gain circuit 20, and the resultant signal is added to a baseband signal by an adder 21.
  • A gamma conversion circuit 22 then performs a gamma conversion, and a luminance correction (YCOMP) circuit 23 executes a luminance-signal level correction based upon color.
  • A color interpolation circuit 5 executes an interpolation with regard to all pixels in such a manner that all color pixel values will be present, and a color conversion matrix (MTX) circuit 6 converts each of the color signals to a luminance signal (Y) and color difference signals (Cr, Cb). The color-difference gain of low- and high-luminance regions is then suppressed by a chroma suppression (CSUP) circuit 7, and the band is limited by a chroma low-pass filter (CLPF) circuit 8. The band-limited chroma signal is converted to an RGB signal and is simultaneously subjected to a gamma conversion by a gamma conversion circuit 9.
  • The RGB signal resulting from the gamma conversion is again converted to Y, Cr, Cb signals, the gain is adjusted again by a chroma gain knee (CGain Knee) circuit 10, and a linear clip matrix (LCMTX) circuit 11 makes a minor correction of hue and corrects shifts in hue caused by individual differences between image sensing devices.
  • The color evaluation values Cx, Cy of each block, calculated in accordance with Equations (1), are compared with a previously set white detection region (described later). If the evaluation values fall within the white detection region, the block is assumed to be white, and the summation values (SumR, SumG, SumB) of the respective color pixels of the blocks assumed to be white are calculated.
  • The white balance circuit 4 performs the white balance correction using the white balance gains thus obtained.
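As a rough illustration of the block-based evaluation just described, the sketch below computes per-block color evaluation values, tests them against an abstract white detection region, and derives white balance gains from the summed values of the blocks judged white. Equations (1) are not reproduced in this excerpt, so the Cx/Cy and luminance formulas used here are assumptions chosen for illustration, as are the function names; the region test is passed in as a callable.

```python
from typing import Callable, Iterable, Tuple

Block = Tuple[float, float, float]           # mean (R, G, B) of one evaluation block
RegionTest = Callable[[float, float], bool]  # True if (Cx, Cy) falls in the white detection region

def color_evaluation(block: Block) -> Tuple[float, float]:
    """Assumed form of Equations (1): Cx tracks the R-B balance, Cy the amount of green."""
    r, g, b = block
    y = (r + 2.0 * g + b) / 4.0              # luminance estimate (assumption)
    return (r - b) / y, (r + b - 2.0 * g) / y

def white_balance_gains(blocks: Iterable[Block], in_white_region: RegionTest) -> Tuple[float, float, float]:
    """Sum the values of blocks judged white and derive per-channel gains from SumR, SumG, SumB."""
    sum_r = sum_g = sum_b = 0.0
    for block in blocks:
        cx, cy = color_evaluation(block)
        if in_white_region(cx, cy):          # block assumed to be white
            r, g, b = block
            sum_r, sum_g, sum_b = sum_r + r, sum_g + g, sum_b + b
    if sum_r == 0.0 or sum_b == 0.0:
        return 1.0, 1.0, 1.0                 # no white block found: leave the gains neutral
    # Normalize so that the summed "white" area becomes achromatic, taking G as the reference.
    return sum_g / sum_r, 1.0, sum_g / sum_b
```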
  • Figs. 15A and 15B are graphs illustrating a white detection region 101.
  • A white object such as a standard white sheet (not shown) is sensed from high to low color temperatures using light sources at arbitrary color temperature intervals, and the color evaluation values Cx, Cy are calculated based upon Equations (1) using the signal values obtained from the image sensing device 1.
  • Cx and Cy obtained with regard to each of the light sources are plotted along the X axis and Y axis, respectively, and the plotted points are connected by straight lines. Alternatively, the plotted points are approximated using a plurality of straight lines.
  • In this way, a white detection axis 102 from high to low color temperatures is produced.
  • The white detection axis 102 is provided with some width along the direction of the Y axis. This region is defined as the white detection region 101.
  • The conventional white balance detection apparatus has certain drawbacks, however. For example, consider a case where a close-up of a human face is taken in the presence of a light source such as sunlight having a high color temperature. Though the color evaluation values of a white subject in the presence of sunlight are distributed as indicated by area 103 in Fig. 15A, the color evaluation values of the human complexion are distributed as indicated by area 105 in Fig. 15A. These values are distributed in an area substantially the same as that of the color evaluation values of the color white photographed in the presence of a light source having a low color temperature, such as white tungsten (the area indicated at 104 in Fig. 15A).
  • Similarly, the color evaluation values are distributed as indicated by area 106 in Fig. 15B in a case where an area (on the order of 7000 K) in which the blue of the sky has grown pale, as at the horizon or at the boundaries of clouds, is included in a scene in, say, a scenery mode.
  • In this case the color temperature of the scene is judged to be higher than it actually is (the color temperature under clear skies is on the order of 5500 K) and the pale blue of the sky is corrected to white. This represents a judgment error.
  • The present invention has been made in consideration of the above situation, and an aspect thereof is to reduce erroneous decisions regarding white evaluation and thereby perform a better white balance correction.
  • According to one aspect, there is provided a white balance correction apparatus characterised by comprising: a dividing unit adapted to divide an image into a plurality of areas; a white evaluation unit adapted to determine whether the image data within each divided area is indicative of the color white according to the position of the divided area in the image; and a white balance correction unit adapted to perform a white balance correction based upon the image data determined to be indicative of the color white.
  • According to another aspect, there is provided a white balance correction method characterised by comprising: dividing an image into a plurality of areas; determining whether the image data within each divided area is indicative of the color white according to the position of the divided area in the image; and performing a white balance correction based upon the image data determined to be indicative of the color white.
  • Fig. 1 is a block diagram illustrating the structure of a white balance correction apparatus according to a first embodiment of the present invention.
  • The white balance correction apparatus according to this embodiment is capable of being used instead of the white balance circuit 4 shown in Fig. 13, by way of example.
  • The components of the image sensing apparatus with the exception of the white balance circuit 4 will be described below with reference to Fig. 13.
  • The white balance correction apparatus includes a mode determination unit 30 for determining the operating mode of the image sensing apparatus (modes such as an automatic mode, a portrait mode for sensing a person or persons, a scenery mode for sensing scenery, and a manual mode allowing the user to set a white balance correction value); an evaluation block dividing unit 31 for dividing the output signal of the image sensing device 1 into a plurality of evaluation blocks of the kind shown in Fig. 14; a white detection region storage unit 32 for storing a white detection region serving as a reference (referred to as the "reference white detection region" below); a white detection region varying unit 33 for changing the white detection region, when appropriate, using limit values; a pattern storage unit 34 for storing, classified by mode, patterns which are combinations of a position on the screen and a white detection region, as changed by the white detection region varying unit 33, used in order to subject the evaluation block at that position to white evaluation; a white evaluation unit 35 for determining whether each evaluation block obtained by division by the evaluation block dividing unit 31 is white or not; a WB coefficient calculation unit 36 for calculating white balance (WB) coefficients, which are used in the WB correction, from the signal values of evaluation blocks judged to be white by the white evaluation unit 35; a WB coefficient storage unit 37 for storing the WB coefficients obtained by the WB coefficient calculation unit 36; and a WB correction unit 38 for applying a WB correction to the output signal of the image sensing device 1 using the WB coefficients stored in the WB coefficient storage unit 37.
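The following minimal sketch mirrors the units enumerated above as a single class, purely to make the structure explicit; the unit names come from the description, while the interfaces, signatures and wiring are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class WhiteBalanceCorrectionApparatus:
    """Structural sketch of the apparatus of Fig. 1; types and interfaces are assumptions."""
    mode: str = "auto"                                                   # mode determination unit 30
    reference_region: object = None                                      # white detection region storage unit 32
    patterns: Dict[str, List[List[int]]] = field(default_factory=dict)   # pattern storage unit 34
    wb_coefficients: Tuple[float, float, float] = (1.0, 1.0, 1.0)        # WB coefficient storage unit 37

    def divide_into_blocks(self, image):
        """Evaluation block dividing unit 31: split the sensor output into evaluation blocks."""
        raise NotImplementedError

    def vary_region(self, limits):
        """White detection region varying unit 33: derive a limited region from the reference region."""
        raise NotImplementedError

    def evaluate_white(self, blocks):
        """White evaluation unit 35: keep the blocks whose Cx, Cy fall in the region for their position."""
        raise NotImplementedError

    def calculate_coefficients(self, white_blocks):
        """WB coefficient calculation unit 36: derive coefficients and store them (unit 37)."""
        raise NotImplementedError

    def correct(self, image):
        """WB correction unit 38: apply the stored coefficients to the sensor output."""
        raise NotImplementedError
```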
  • The reference white detection region stored in the white detection region storage unit 32 will now be described. The description relates to a case where a primary-color filter is used on the image sensing device 1.
  • Reference numerals 201 and 202 denote a white detection region and a white detection axis, respectively.
  • A white object such as a standard white sheet (not shown) is sensed from high to low color temperatures using light sources at arbitrary color temperature intervals, and the color evaluation values Cx, Cy are calculated based upon Equations (1) using the signal values obtained from the image sensing device 1.
  • Cx and Cy obtained with regard to each of the light sources are plotted along the X axis and Y axis, respectively, and the plotted points are connected by straight lines, or the plotted points are approximated by a plurality of straight lines.
  • In this way, the white detection axis 202 from high to low color temperatures is produced.
  • The X axis corresponds to the color temperature of the light source and the Y axis corresponds to the amount of correction in the green direction (that is, the color-temperature direction of luminance and the color-temperature direction of fluorescent light).
  • The white detection axis is provided with some width along the direction of the Y axis. This region is defined as the white detection region 201. The data of the white detection region thus defined is stored in the white detection region storage unit 32 when the white balance correction apparatus is manufactured or shipped.
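One possible representation of such a stored reference region is sketched below: the calibration measurements define the white detection axis as a polyline in the (Cx, Cy) plane, and the region is the band obtained by widening that axis along Cy. The class name, the widening parameter and the linear-interpolation membership test are assumptions made for illustration; the text above specifies the construction only in terms of Figs. 2A and 2B.

```python
from bisect import bisect_right
from typing import List, Tuple

class WhiteDetectionRegion:
    """Band of width 2*delta_cy around a piecewise-linear white detection axis."""

    def __init__(self, axis_points: List[Tuple[float, float]], delta_cy: float):
        # axis_points: (Cx, Cy) measured on a white object under light sources of
        # different color temperatures; sorted by Cx so the axis can be interpolated.
        self.points = sorted(axis_points)
        self.delta_cy = delta_cy

    def _axis_cy(self, cx: float) -> float:
        """Cy of the white detection axis at the given Cx (linear interpolation)."""
        xs = [p[0] for p in self.points]
        i = bisect_right(xs, cx)
        if i == 0:
            return self.points[0][1]
        if i == len(self.points):
            return self.points[-1][1]
        (x0, y0), (x1, y1) = self.points[i - 1], self.points[i]
        if x1 == x0:
            return y0
        t = (cx - x0) / (x1 - x0)
        return y0 + t * (y1 - y0)

    def contains(self, cx: float, cy: float) -> bool:
        """True if (Cx, Cy) lies inside the widened band around the axis."""
        if not (self.points[0][0] <= cx <= self.points[-1][0]):
            return False
        return abs(cy - self._axis_cy(cx)) <= self.delta_cy
```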
  • The evaluation block dividing unit 31 divides the output signal of the image sensing device 1 into a plurality of evaluation blocks of the kind shown in Fig. 14, and the white evaluation unit 35 calculates the color evaluation values Cx, Cy block by block using Equations (1) and determines that a block is white if the calculated color evaluation values Cx, Cy fall within the white detection region 201.
  • The size of the white detection region 201 (the limits along the direction of color temperature in the example shown in Figs. 2A and 2B) can be changed by the white detection region varying unit 33 in accordance with the relative position of the evaluation block on the screen.
  • In Fig. 2A, a white detection region 203 is obtained by setting white detection limits Ll1, Lh1 with respect to the white detection region 201, thereby limiting the range of Cx to Ll1 to Lh1.
  • Similarly, a white detection region 204 in Fig. 2B is obtained by setting white detection limits Ll2, Lh2 with respect to the white detection region 201, thereby limiting the range of Cx to Ll2 to Lh2 so as to cut the low-color-temperature region from the white detection region 203.
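A sketch of how the white detection region varying unit could apply such limits: the reference region is kept as-is and its Cx range is simply clipped to an interval. The concrete limit values Ll1, Lh1, Ll2, Lh2 are not given in this text, so they remain parameters; the helper name is an assumption.

```python
from typing import Callable

RegionTest = Callable[[float, float], bool]

def limit_cx_range(base_region: RegionTest, lo: float, hi: float) -> RegionTest:
    """Return a region test equal to base_region clipped to lo <= Cx <= hi."""
    def limited(cx: float, cy: float) -> bool:
        return lo <= cx <= hi and base_region(cx, cy)
    return limited

# Hypothetical usage, with the reference region 201 represented as in the earlier sketch:
#   region_203 = limit_cx_range(region_201.contains, Ll1, Lh1)   # for peripheral blocks
#   region_204 = limit_cx_range(region_201.contains, Ll2, Lh2)   # for central blocks (low color temperatures cut)
```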
  • The white evaluation unit 35 subjects the evaluation blocks to a white evaluation, and the WB coefficient calculation unit 36 calculates the WB coefficients (white balance gains) from the summed pixel values of the evaluation blocks determined to be white and stores these coefficients in the WB coefficient storage unit 37.
  • The WB correction unit 38 subjects the input image to a WB correction using the coefficients stored in the WB coefficient storage unit 37.
  • Fig. 3A illustrates an example of a pattern stored in the pattern storage unit 34 for the automatic mode, and Fig. 3B illustrates an example of a pattern stored in the pattern storage unit 34 for the portrait mode.
  • Each pattern indicates a combination of the position of each evaluation block and the size of the changed white detection region used when subjecting the evaluation block at that position to white evaluation. Although the patterns are stored beforehand at the time of manufacture or shipping of the white balance correction apparatus, it may be so arranged that the user can alter the region settings.
  • The mode determination unit 30 determines whether the automatic mode or the portrait mode has been set. If the automatic mode has been set, control proceeds to step S12, at which the white evaluation unit 35 acquires the region data of the pattern shown in Fig. 3A from the pattern storage unit 34. If the portrait mode has been set, control proceeds to step S13, at which the white evaluation unit 35 acquires the region data of the pattern shown in Fig. 3B from the pattern storage unit 34.
  • At step S14, the white evaluation unit 35 determines whether each evaluation block is in region (1) or region (2). If the block is in region (1) ("YES" at step S14), control proceeds to step S15, at which the color evaluation values of the evaluation block are compared with the white detection region 203 shown in Fig. 2A, as limited by the white detection region varying unit 33.
  • If the block is in region (2) ("NO" at step S14), control proceeds to step S16, at which the color evaluation values of the evaluation block are compared with the white detection region 204 shown in Fig. 2B, as limited by the white detection region varying unit 33.
  • In other words, the limit on the side of low color temperature is set higher for the central area than for the periphery of the screen, thereby limiting the white detection region 204 in such a manner that a complexion tone will not be evaluated incorrectly as being white.
  • If the color evaluation values of an evaluation block are found to fall within the white detection region 203 or 204 at step S15 or S16, control proceeds to step S17, at which the white evaluation unit 35 decides that this evaluation block is white. Otherwise, control proceeds to step S18, at which the white evaluation unit 35 decides that this evaluation block is not white.
  • An evaluation block thus determined to be white has its pixel values summed in order to calculate the white balance gains (WB coefficients), as described above.
  • At step S19, it is determined whether the white/not-white decision has been rendered with regard to all evaluation blocks. Steps S14 to S18 are repeated until all of the evaluation blocks have been evaluated.
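A compact sketch of the flow of Fig. 4 (steps S12 through S19): a pattern is selected for the current mode, and every evaluation block is then judged against the white detection region assigned to its position. Encoding the pattern as a grid of region identifiers, and the function and parameter names, are assumptions made for illustration.

```python
from typing import Callable, Dict, List, Tuple

RegionTest = Callable[[float, float], bool]

def detect_white_blocks(
    mode: str,
    patterns: Dict[str, List[List[int]]],        # pattern storage unit 34: grid of region ids per mode
    regions: Dict[int, RegionTest],               # e.g. {1: region_203, 2: region_204}
    block_cxcy: List[List[Tuple[float, float]]],  # (Cx, Cy) of every evaluation block
) -> List[Tuple[int, int]]:
    """Return the (row, col) indices of the evaluation blocks judged to be white."""
    pattern = patterns[mode]                      # S12/S13: acquire the pattern for the current mode
    white_blocks: List[Tuple[int, int]] = []
    for row, row_ids in enumerate(pattern):       # S19: repeat until every block has been judged
        for col, region_id in enumerate(row_ids):
            cx, cy = block_cxcy[row][col]
            if regions[region_id](cx, cy):        # S14-S16: compare with the region assigned to this position
                white_blocks.append((row, col))   # S17: the block is judged white
            # S18: otherwise the block is judged not white and contributes nothing
    return white_blocks
```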
  • Erroneous decisions regarding white evaluation can be reduced by using a white detection region that differs depending upon position on the screen. As a result, it is possible to perform a better white balance correction.
  • A white balance correction of higher precision can be achieved by performing a determination regarding the absence or presence of complexion before the operation indicated at step S12 in Fig. 4, and performing the operations from step S12 onward if complexion is judged to be present. This operation will be described with reference to the flowchart of Fig. 5.
  • Evaluation blocks in which the image data is determined to be white are detected at step S21 using the same white detection region (a region delimited by the white detection limits, or a region in which the color-temperature region of complexion is contained in a white detection region that is not limited; e.g., either the white detection region 201 or 203) with regard to all evaluation blocks in the central portion of the screen [region (2) in Fig. 3A] and the peripheral portion [region (1) in Fig. 3A].
  • A light-source color temperature CtAround is obtained at step S22 from data that is the result of summing and averaging the image data of the white evaluation blocks in the peripheral portion of the screen, and a light-source color temperature CtCenter is obtained at step S23 from data that is the result of summing and averaging the image data of the white evaluation blocks in the central portion of the screen. It should be noted that the order of the processing of steps S22 and S23 may be reversed, or the processing of steps S22 and S23 may be executed in parallel.
  • CtAround and CtCenter are compared at step S24. If the color temperature CtCenter obtained from the central portion of the screen is less than the color temperature CtAround obtained from the peripheral portion of the screen, it is judged at step S25 that there is a high likelihood that complexion is present in the central portion of the screen. In other words, if CtCenter < CtAround holds, the central portion of the screen is judged to contain complexion, and the light-source color temperature is calculated at step S26 by performing the white evaluation in the automatic mode shown in Figs. 2A to 4B.
  • On the other hand, if the color temperature CtCenter obtained from the central portion of the screen is substantially equal to or greater than the color temperature CtAround obtained from the peripheral portion of the screen, it is judged at step S27 that there is a high likelihood that complexion is not present in the central portion of the screen.
  • That is, if CtCenter ≥ CtAround holds, it is judged that complexion is not present; all evaluation blocks are then compared with a common white detection region (step S28), white evaluation blocks are detected, and the light-source color temperature thus obtained is adopted.
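The pre-check of Fig. 5 can be sketched as follows. How a light-source color temperature is derived from the summed and averaged white-block data is not detailed in this excerpt, so the estimator is passed in as an abstract callable; only the comparison CtCenter < CtAround follows directly from the description.

```python
from typing import Callable, List, Tuple

Block = Tuple[float, float, float]   # averaged (R, G, B) of one white evaluation block

def average(blocks: List[Block]) -> Block:
    """Sum and average the image data of the given white evaluation blocks."""
    n = max(len(blocks), 1)
    return (sum(b[0] for b in blocks) / n,
            sum(b[1] for b in blocks) / n,
            sum(b[2] for b in blocks) / n)

def likely_contains_complexion(
    center_white_blocks: List[Block],
    peripheral_white_blocks: List[Block],
    estimate_color_temperature: Callable[[Block], float],   # abstract: averaged RGB -> color temperature
) -> bool:
    """True if CtCenter < CtAround, i.e. the center of the screen probably shows skin (steps S22-S25)."""
    ct_around = estimate_color_temperature(average(peripheral_white_blocks))  # S22
    ct_center = estimate_color_temperature(average(center_white_blocks))      # S23
    return ct_center < ct_around                                              # S24/S25
```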
  • Figs. 6A and 6B illustrate pattern setting examples for suppressing erroneous decisions that blue sky is the color white.
  • Fig. 6A illustrates an example of a pattern in the case of the automatic mode, and Fig. 6B illustrates an example of a pattern in the case of the scenery mode.
  • Whether an evaluation block is white or not is determined by comparing the evaluation blocks in regions (1) and (2) with white detection regions limited using different white detection limits.
  • A white detection limit Lh4 on the high-color-temperature side, which limits the white detection region used for judging evaluation blocks at the upper portion of the screen, is set closer to the low-color-temperature side than a white detection limit Lh3 on the high-color-temperature side, shown in Fig. 7A, which is used for judging evaluation blocks at the lower portion of the screen, thereby arranging it so that pale blue will not be judged erroneously as white.
  • The brightness Bv of the subject may also be detected from the captured image data, and white detection patterns of the kind shown in Figs. 6A and 6B may be changed in accordance with this brightness.
  • If Bv is greater than a preset value Bv2, the probability that the image was captured outdoors is high (and the percentage of the area occupied by the sky is large). In this case, the proportion of evaluation blocks (2) in the upper part of the screen, in which the white detection range is limited by the white detection region 209 shown in Fig. 10A, is enlarged, as shown in Fig. 16B.
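A minimal sketch of this brightness-dependent switching, assuming the two candidate patterns and the threshold Bv2 are supplied from elsewhere (none of their concrete values are given in this text):

```python
from typing import List

Pattern = List[List[int]]   # grid of region ids, as in the sketches above

def select_sky_pattern(bv: float, bv2: float,
                       normal_pattern: Pattern,
                       enlarged_sky_pattern: Pattern) -> Pattern:
    """Choose the evaluation-block pattern according to the subject brightness Bv."""
    # Bv > Bv2 suggests an outdoor scene with a large sky area, so the pattern with the
    # enlarged upper region (2) is used; otherwise the normal pattern is kept.
    return enlarged_sky_pattern if bv > bv2 else normal_pattern
```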
  • Fig. 8 illustrates an example of pattern setting for suppressing erroneous decisions relating to both complexion and the sky. Excellent results are obtained if, by way of example, appropriately limited white detection regions are set for the white evaluation of the evaluation blocks in region (1) of Fig. 8.
  • The pattern used can be changed depending upon the image sensing mode in a manner similar to that of the first embodiment.
  • Fig. 11 is a block diagram illustrating the general structure of a white balance correction apparatus according to a second embodiment of the present invention. This embodiment differs from that of Fig. 1 in that the mode determination unit 30 in Fig. 1 is replaced by a vertical/horizontal orientation determination unit 40 in Fig. 11 for discriminating between horizontal and vertical orientations at the time of image sensing. Components in Fig. 11 identical with those shown in Fig. 1 are designated by like reference numerals, and a description of them is omitted.
  • The characterizing feature of the second embodiment is that the vertical/horizontal orientation determination unit 40 discriminates between horizontal and vertical orientations (rotation clockwise by 90° and rotation counter-clockwise by 90°) based upon the output of a gravity sensor (not shown), making it possible to change the setting of the white detection limits depending upon the state of photography.
  • Figs. 12A to 12C are diagrams illustrating pattern setting examples.
  • Fig. 12A illustrates an example of a pattern in a case where the horizontal orientation has been determined by the vertical/horizontal orientation determination unit 40, Fig. 12B illustrates an example of a pattern in a case where rotation counter-clockwise by 90° has been determined, and Fig. 12C illustrates an example of a pattern in a case where rotation clockwise by 90° has been determined.
  • Point A in Figs. 12A to 12C indicates the same corner and is shown in order to facilitate an understanding of direction of rotation.
  • The evaluation blocks in regions (1) to (4) of Figs. 12A to 12C are judged using, e.g., the white detection regions 207 to 210 shown in Figs. 9A, 9B, 10A and 10B.
  • In this way, the white evaluation of each evaluation block can be performed appropriately irrespective of the camera orientation (horizontal or vertical) at the time of image sensing.
  • The patterns of Figs. 12A to 12C are only one example. It may be so arranged that the pattern settings shown in Figs. 3A, 3B and Figs. 6A, 6B or other settings are used, and the settings may be changed depending upon the image sensing mode.
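The idea of the second embodiment can be sketched as rotating the region-assignment pattern to match the camera orientation reported by the gravity sensor, so that, for example, the regions intended for the sky remain at the top of the scene. The orientation labels and the pairing of camera rotation with pattern rotation below are assumptions made for illustration.

```python
from typing import List

Pattern = List[List[int]]   # grid of region ids, as in the sketches above

def orient_pattern(pattern: Pattern, orientation: str) -> Pattern:
    """Return the region-assignment pattern aligned with the detected camera orientation."""
    if orientation == "horizontal":
        return [row[:] for row in pattern]
    if orientation == "ccw90":   # camera rotated counter-clockwise by 90 degrees
        # Rotate the pattern clockwise by 90 degrees to compensate.
        return [list(col) for col in zip(*pattern[::-1])]
    if orientation == "cw90":    # camera rotated clockwise by 90 degrees
        # Rotate the pattern counter-clockwise by 90 degrees to compensate.
        return [list(col) for col in zip(*pattern)][::-1]
    raise ValueError(f"unknown orientation: {orientation}")
```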
  • In the embodiments described above, the white detection region is limited by the white detection region varying unit 33. Alternatively, it may be so arranged that a plurality of white detection regions are stored in advance and one of the stored white detection regions is used in dependence upon the position of the evaluation block.
  • Further, the present invention may serve as an element that constitutes such an apparatus.
  • The present invention can be implemented by providing a storage medium storing program codes for performing the aforesaid processes to a computer system or apparatus (e.g., a personal computer), having a CPU or MPU of the computer system or apparatus read the program codes from the storage medium, and then executing the program.
  • In this case, the program codes read from the storage medium realize the functions according to the embodiments, and the storage medium storing the program codes constitutes the invention.
  • Examples of the storage medium for storing and providing the program codes are a floppy disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a magnetic tape, a non-volatile memory card, and a ROM.
  • A computer network, such as a LAN (local area network) or a WAN (wide area network), can also be used to provide the program codes.
  • The present invention also includes a case where an OS (operating system) or the like running on the computer performs part or all of the processes in accordance with the designations of the program codes and thereby realizes the functions according to the above embodiments.
  • The present invention further includes a case where, after the program codes read from the storage medium are written to a function expansion card inserted into the computer or to a memory provided in a function expansion unit connected to the computer, a CPU or the like contained in the function expansion card or unit performs part or all of the processing in accordance with the designations of the program codes and thereby realizes the functions of the above embodiments.
  • The storage medium stores program codes corresponding to the flowcharts shown in Fig. 4 and/or Fig. 5 described in the embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Image Signal Generators (AREA)
  • Processing Of Color Television Signals (AREA)
EP03250997A 2002-02-20 2003-02-19 White balance correction Expired - Lifetime EP1339239B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2002043326 2002-02-20
JP2002043326A JP3513506B2 (ja) 2002-02-20 2002-02-20 ホワイトバランス補正装置およびそれを搭載した撮像装置、及びホワイトバランス補正方法

Publications (3)

Publication Number Publication Date
EP1339239A2 true EP1339239A2 (de) 2003-08-27
EP1339239A3 EP1339239A3 (de) 2004-12-29
EP1339239B1 EP1339239B1 (de) 2007-08-08

Family

ID=27655262

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03250997A Expired - Lifetime EP1339239B1 (de) 2002-02-20 2003-02-19 Korrektur des Weissabgleichs

Country Status (5)

Country Link
US (3) US7362356B2 (de)
EP (1) EP1339239B1 (de)
JP (1) JP3513506B2 (de)
CN (1) CN100423587C (de)
DE (1) DE60315368T2 (de)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1648178A3 (de) * 2004-10-15 2008-07-09 Genesis Microchip, Inc. Strecken von Blau mittels eingeschränkte Farbpalette und weiche Übergangsgrenzschicht
EP1737247A3 (de) * 2005-06-20 2009-04-01 Canon Kabushiki Kaisha Bildaufnahmevorrichtung und Bildverarbeitungsverfahren
US7671900B2 (en) 2005-01-07 2010-03-02 Canon Kabushiki Kaisha Imaging apparatus and its control method displaying an electronic viewfinder screen and a frame within the screen including color information
CN103250418A (zh) * 2010-11-30 2013-08-14 富士胶片株式会社 图像处理设备、成像设备、图像处理方法、及白平衡调整方法

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3746432B2 (ja) * 2001-03-01 2006-02-15 株式会社牧野フライス製作所 接合面の加工方法及び装置
JP3513506B2 (ja) * 2002-02-20 2004-03-31 キヤノン株式会社 ホワイトバランス補正装置およびそれを搭載した撮像装置、及びホワイトバランス補正方法
JP2005173269A (ja) * 2003-12-11 2005-06-30 Canon Inc 光学機器
JP2005176271A (ja) 2003-12-15 2005-06-30 Canon Inc 撮像方法及びその装置
JP4262632B2 (ja) * 2004-04-19 2009-05-13 オリンパス株式会社 受信装置
JP4754227B2 (ja) * 2005-01-31 2011-08-24 イーストマン コダック カンパニー オートホワイトバランス装置及びホワイトバランス調整方法
JP4812073B2 (ja) 2005-01-31 2011-11-09 キヤノン株式会社 画像撮像装置、画像撮像方法、プログラムおよび記録媒体
JP4533168B2 (ja) 2005-01-31 2010-09-01 キヤノン株式会社 撮像装置及びその制御方法
JP4329125B2 (ja) * 2005-02-09 2009-09-09 富士フイルム株式会社 ホワイトバランス制御方法、ホワイトバランス制御装置及び撮像装置
JP2006262451A (ja) 2005-02-18 2006-09-28 Canon Inc 画像記録装置及び方法
JP4849818B2 (ja) * 2005-04-14 2012-01-11 イーストマン コダック カンパニー ホワイトバランス調整装置及び色識別装置
JP4662356B2 (ja) 2005-10-21 2011-03-30 キヤノン株式会社 撮像装置及びその制御方法、及びその制御プログラム、制御プログラムを格納した記憶媒体
US8107762B2 (en) 2006-03-17 2012-01-31 Qualcomm Incorporated Systems, methods, and apparatus for exposure control
KR101010988B1 (ko) * 2006-04-25 2011-01-26 캐논 가부시끼가이샤 촬상장치 및 그 제어 방법
US7974487B2 (en) * 2007-04-17 2011-07-05 Kabushiki Kaisha Toshiba System and method for image white balance adjustment
JP5250996B2 (ja) * 2007-04-25 2013-07-31 株式会社ニコン ホワイトバランス調整装置、撮像装置、およびホワイトバランス調整プログラム
JP5064947B2 (ja) * 2007-09-11 2012-10-31 キヤノン株式会社 画像処理装置及び方法、及び撮像装置
US8922672B2 (en) 2008-01-03 2014-12-30 Apple Inc. Illumination systems and methods for imagers
US7990431B2 (en) * 2008-03-14 2011-08-02 Asia Optical Co., Inc. Calculation method for the correction of white balance
JP4726251B2 (ja) * 2008-09-18 2011-07-20 キヤノン株式会社 撮像装置及び画像処理方法
JP5045731B2 (ja) 2009-11-04 2012-10-10 カシオ計算機株式会社 撮像装置、ホワイトバランス設定方法、及び、プログラム
US8605167B2 (en) * 2010-09-01 2013-12-10 Apple Inc. Flexible color space selection for auto-white balance processing
JP5808142B2 (ja) * 2011-05-02 2015-11-10 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム
JP5769524B2 (ja) * 2011-07-05 2015-08-26 キヤノン株式会社 撮像装置、撮像装置の制御方法、及びコンピュータプログラム
JP2013143593A (ja) * 2012-01-06 2013-07-22 Canon Inc 撮像装置、その制御方法およびプログラム
JP6083974B2 (ja) * 2012-04-24 2017-02-22 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム
JP6270423B2 (ja) 2013-11-14 2018-01-31 キヤノン株式会社 画像処理装置およびその制御方法
JP6234191B2 (ja) * 2013-11-28 2017-11-22 オリンパス株式会社 マルチエリアホワイトバランス制御装置、マルチエリアホワイトバランス制御方法、マルチエリアホワイトバランス制御プログラム、マルチエリアホワイトバランス制御プログラムを記録したコンピュータ、マルチエリアホワイトバランス画像処理装置、マルチエリアホワイトバランス画像処理方法、マルチエリアホワイトバランス画像処理プログラム、マルチエリアホワイトバランス画像処理プログラムを記録したコンピュータ及びマルチエリアホワイトバランス画像処理装置を備えた撮像装置
JP5957705B2 (ja) * 2013-12-20 2016-07-27 パナソニックIpマネジメント株式会社 撮像装置
WO2015151747A1 (ja) * 2014-03-31 2015-10-08 富士フイルム株式会社 画像処理装置、撮像装置、画像処理方法及びプログラム
CN105981379B (zh) * 2014-09-17 2018-09-25 深圳市大疆创新科技有限公司 自动白平衡系统及方法
JP6331966B2 (ja) * 2014-10-24 2018-05-30 株式会社Jvcケンウッド 撮像装置及び撮像装置の調整方法
JP6521776B2 (ja) 2015-07-13 2019-05-29 オリンパス株式会社 画像処理装置、画像処理方法
JP2017175349A (ja) * 2016-03-23 2017-09-28 キヤノン株式会社 撮像装置、その制御方法及びプログラム
CN106851121B (zh) 2017-01-05 2019-07-05 Oppo广东移动通信有限公司 控制方法及控制装置
CN108551576B (zh) * 2018-03-07 2019-12-20 浙江大华技术股份有限公司 一种白平衡方法及装置
CN108377372B (zh) * 2018-03-13 2019-10-29 普联技术有限公司 一种白平衡处理方法、装置、终端设备和存储介质
CN110827364B (zh) * 2018-08-07 2023-01-13 阿里巴巴(中国)有限公司 一种检测绿屏图像的方法及装置
CN109451292B (zh) * 2018-12-15 2020-03-24 深圳市华星光电半导体显示技术有限公司 图像色温校正方法及装置
CN113784104B (zh) * 2021-08-18 2024-07-16 杭州涂鸦信息技术有限公司 一种白平衡处理方法及相关装置

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5264921A (en) * 1987-11-30 1993-11-23 Canon Kabushiki Kaisha White balance including means for preventing colored object from disturbing the balance
US5530474A (en) * 1991-09-05 1996-06-25 Canon Kabushiki Kaisha White balance correction device with correction signal limiting device
JPH0955949A (ja) * 1995-08-11 1997-02-25 Minolta Co Ltd 撮像装置
US6788339B1 (en) * 1997-12-25 2004-09-07 Canon Kabushiki Kaisha Image pickup apparatus
US6603885B1 (en) * 1998-04-30 2003-08-05 Fuji Photo Film Co., Ltd. Image processing method and apparatus
US6906744B1 (en) * 1999-09-28 2005-06-14 Nikon Corporation Electronic camera
KR100311482B1 (ko) * 1999-10-21 2001-10-18 구자홍 보간 영상의 화질 개선을 위한 필터링 제어방법
US6798449B2 (en) * 2001-01-18 2004-09-28 Kinpo Electronics, Inc. Automatic white-balance correction for digital camera
JP3513506B2 (ja) * 2002-02-20 2004-03-31 キヤノン株式会社 ホワイトバランス補正装置およびそれを搭載した撮像装置、及びホワイトバランス補正方法
US7148921B2 (en) * 2002-03-06 2006-12-12 Canon Kabushiki Kaisha White balance adjustment method, image sensing apparatus, program, and storage medium
US7492967B2 (en) * 2003-09-24 2009-02-17 Kabushiki Kaisha Toshiba Super-resolution processor and medical diagnostic imaging apparatus
US20080166114A1 (en) * 2007-01-09 2008-07-10 Sony Ericsson Mobile Communications Ab Image deblurring system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05292533A (ja) * 1992-04-10 1993-11-05 Matsushita Electric Ind Co Ltd オートホワイトバランス装置
EP0738085A2 (de) * 1995-04-13 1996-10-16 Eastman Kodak Company Verfahren und Vorrichtung zum automatischen Weissabgleich
JPH11262029A (ja) * 1997-12-25 1999-09-24 Canon Inc 撮像装置および信号処理装置
JP2000092509A (ja) * 1998-09-11 2000-03-31 Eastman Kodak Japan Ltd オートホワイトバランス装置

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 0180, no. 88 (E-1507), 14 February 1994 (1994-02-14) & JP 5 292533 A (MATSUSHITA ELECTRIC IND CO LTD), 5 November 1993 (1993-11-05) *
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 14, 22 December 1999 (1999-12-22) & JP 11 262029 A (CANON INC), 24 September 1999 (1999-09-24) -& US 6 788 339 B1 (IKEDA EIICHIRO) 7 September 2004 (2004-09-07) *
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 06, 22 September 2000 (2000-09-22) & JP 2000 092509 A (EASTMAN KODAK JAPAN LTD), 31 March 2000 (2000-03-31) -& US 6 727 942 B1 (MIYANO TOSHIKI) 27 April 2004 (2004-04-27) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1648178A3 (de) * 2004-10-15 2008-07-09 Genesis Microchip, Inc. Strecken von Blau mittels eingeschränkte Farbpalette und weiche Übergangsgrenzschicht
US7460181B2 (en) 2004-10-15 2008-12-02 Genesis Microchip Inc. Blue stretch using restricted color range and soft transition boundary
US7834936B2 (en) 2004-10-15 2010-11-16 Sheena Luu Blue stretch using restricted color range and soft transition boundary
US7671900B2 (en) 2005-01-07 2010-03-02 Canon Kabushiki Kaisha Imaging apparatus and its control method displaying an electronic viewfinder screen and a frame within the screen including color information
EP1737247A3 (de) * 2005-06-20 2009-04-01 Canon Kabushiki Kaisha Bildaufnahmevorrichtung und Bildverarbeitungsverfahren
EP2107814A1 (de) * 2005-06-20 2009-10-07 Canon Kabushiki Kaisha Bildaufnahmevorrichtung und Bildverarbeitungsverfahren
US8013906B2 (en) 2005-06-20 2011-09-06 Canon Kabushiki Kaisha Image sensing apparatus and image processing method
CN103250418A (zh) * 2010-11-30 2013-08-14 富士胶片株式会社 图像处理设备、成像设备、图像处理方法、及白平衡调整方法
CN103250418B (zh) * 2010-11-30 2014-10-15 富士胶片株式会社 图像处理设备、成像设备、图像处理方法、及白平衡调整方法

Also Published As

Publication number Publication date
EP1339239B1 (de) 2007-08-08
EP1339239A3 (de) 2004-12-29
CN100423587C (zh) 2008-10-01
JP2003244723A (ja) 2003-08-29
DE60315368D1 (de) 2007-09-20
US7932931B2 (en) 2011-04-26
US20030156206A1 (en) 2003-08-21
DE60315368T2 (de) 2008-05-08
US20110176030A1 (en) 2011-07-21
JP3513506B2 (ja) 2004-03-31
US8508616B2 (en) 2013-08-13
CN1443009A (zh) 2003-09-17
US7362356B2 (en) 2008-04-22
US20080100722A1 (en) 2008-05-01

Similar Documents

Publication Publication Date Title
EP1339239B1 (de) White balance correction
US9462247B2 (en) Image processing and white balance controlling apparatus
US8089525B2 (en) White balance control device and white balance control method
EP1774797B1 (de) Verfahren und vorrichtung zum automatischen weissabgleich
US7738699B2 (en) Image processing apparatus
US8743233B2 (en) Sensitivity-settable image capture apparatus
US9438875B2 (en) Image processing apparatus, image processing method, and program
US8624995B2 (en) Automatic white balancing method, medium, and system
US7030913B2 (en) White balance control apparatus and method, and image pickup apparatus
US6853748B2 (en) Signal processing apparatus and method for reducing generation of false color by adaptive luminance interpolation
USRE46232E1 (en) Image processing method and apparatus for processing an image by using a face detection result
US20020071041A1 (en) Enhanced resolution mode using color image capture device
US20030169348A1 (en) White balance adjustment method, image sensing apparatus, program, and storage medium
JP3330905B2 (ja) ビデオカメラのホワイトバランスの補正方法
US20030067548A1 (en) Image processing method, image pickup apparatus and program
US6788339B1 (en) Image pickup apparatus
JP3848274B2 (ja) ホワイトバランス調整方法及び撮像装置及びプログラム及び記憶媒体
US8115834B2 (en) Image processing device, image processing program and image processing method
JP4960597B2 (ja) ホワイトバランス補正装置及び方法、及び撮像装置
JPH05110936A (ja) ビデオカメラ

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO

RIC1 Information provided on ipc code assigned before grant

Ipc: 7H 04N 9/64 B

Ipc: 7H 04N 9/73 A

17P Request for examination filed

Effective date: 20050511

AKX Designation fees paid

Designated state(s): DE FR GB

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

RIN1 Information on inventor provided before grant (corrected)

Inventor name: FUKUI, TAKAAKI,CANON KABUSHIKI KAISHA

Inventor name: IKEDA, EIICHIRO,CANON KABUSHIKI KAISHA

REF Corresponds to:

Ref document number: 60315368

Country of ref document: DE

Date of ref document: 20070920

Kind code of ref document: P

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20080509

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20160229

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20160225

Year of fee payment: 14

Ref country code: GB

Payment date: 20160224

Year of fee payment: 14

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 60315368

Country of ref document: DE

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20170219

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20171031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170228

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170901

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170219