CN106504200B - Image illumination compensation method and system based on hue offset estimation and point-by-point hue mapping - Google Patents
- Publication number
- CN106504200B CN106504200B CN201610824126.XA CN201610824126A CN106504200B CN 106504200 B CN106504200 B CN 106504200B CN 201610824126 A CN201610824126 A CN 201610824126A CN 106504200 B CN106504200 B CN 106504200B
- Authority
- CN
- China
- Prior art keywords
- point
- image
- hue
- channel
- calibration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
An image illumination compensation method based on hue shift estimation and point-by-point hue mapping extracts, in an off-line stage, the hue shifts and hue mapping functions of the three primary color channels from a calibration chart containing a matrix arrangement of squares, and, in an on-line stage, maps the image to be processed point by point according to the obtained hue mapping functions, thereby realizing image illumination compensation. Compared with the existing gamma correction method, the present method is simpler and more efficient, and the calibration result can be obtained and applied quickly.
Description
Technical Field
The invention relates to a technology in the field of image processing, in particular to an image illumination compensation method and system based on hue shift estimation and point-by-point hue mapping.
Background
Image illumination compensation is an indispensable part of image processing. The position of the light source, the surrounding environment, and other factors during image acquisition cause a certain amount of uneven illumination and have a considerable impact on picture quality, so the picture needs to be optimized by image processing to compensate for the illumination. Existing techniques include the color constancy method, the target-region color similarity method under motion estimation, the color space conversion method, the histogram equalization method, and gamma correction together with its improved variants.
In the paper "Learning-based image quality improvement for video conferencing", presented by Zicheng Liu et al. at the ICME 2007 conference, a learning-based technique for improving the subjective quality of images is proposed. The method eliminates the influence of illumination on subjective picture quality to a certain extent, but it first requires a large number of pictures that look high-quality to human eyes to be selected for training, which makes it complex and difficult to implement.
In addition, the color space conversion method cannot solve the problem of correlation among color channels; the histogram equalization method tends to form discontinuous patches in the image.
Disclosure of Invention
To address the above defects in the prior art, the invention provides an image illumination compensation method and system based on hue shift estimation and point-by-point hue mapping, in which the hue shift is estimated from a calibration capture and a hue mapping function is then established for each pixel to perform illumination compensation on the image.
The invention is realized by the following technical scheme:
The invention relates to an image illumination compensation method based on hue shift estimation and point-by-point hue mapping which, in an off-line stage, extracts the hue shifts and hue mapping functions of the three primary color channels from a calibration chart containing a matrix arrangement of squares, and, in an on-line stage, maps the image to be processed point by point according to the obtained hue mapping functions, thereby realizing image illumination compensation.
The calibration chart preferably has a white background and contains (2N + 1) × (2N + 1) solid-color filled squares, N being a natural number.
The extraction comprises: photographing the calibration chart, detecting the square areas in the captured image, calculating the average color within each square area, and constructing curved surfaces to obtain the deviation color distribution maps; and, taking the square at the center of the calibration chart (the middle position in both the horizontal and vertical directions) as the reference, establishing a tone mapping function of each of the three primary color channels for every pixel of the captured image.
The square areas are detected by performing edge detection with a Prewitt operator and then applying morphological dilation and erosion; the (2N + 1) × (2N + 1) largest connected regions are taken, each connected region being one square area.
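As an illustration of this step, the following Python sketch (assuming OpenCV and NumPy; the edge threshold, kernel size, and the rule of skipping the largest component as the white background are illustrative assumptions, not values from the patent) detects the square regions:

```python
import cv2
import numpy as np

def detect_square_regions(gray, num_squares):
    """Detect the filled squares of the calibration chart capture.

    Prewitt edge detection, morphological dilation and erosion, then the
    largest connected regions are kept, as described above.
    """
    # Prewitt kernels (OpenCV has no built-in Prewitt operator).
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=np.float32)
    ky = kx.T
    gx = cv2.filter2D(gray.astype(np.float32), -1, kx)
    gy = cv2.filter2D(gray.astype(np.float32), -1, ky)
    edges = (np.hypot(gx, gy) > 50).astype(np.uint8)     # assumed threshold

    # Dilation followed by erosion closes gaps in the square boundaries.
    kernel = np.ones((5, 5), np.uint8)                   # assumed kernel size
    closed = cv2.erode(cv2.dilate(edges, kernel), kernel)

    # Connected components of the non-edge area; label 0 is the edge set.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(1 - closed)
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA])[::-1] + 1
    # The single largest component is normally the white chart background;
    # the next num_squares components are taken as the square interiors.
    return [labels == i for i in order[1:1 + num_squares]]
```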
The deviation color distribution maps are obtained as follows: in a Z-X-Y three-dimensional orthogonal coordinate system, where X-Y are the two-dimensional pixel coordinates and Z is the average value of the R, G, or B channel of a square area, linear interpolation over the per-square averages of the RGB (red, green, blue) channels yields one curved surface per channel, i.e., the three deviation color distribution maps.
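A corresponding sketch of the interpolation step, assuming SciPy's griddata for the linear interpolation; the nearest-neighbor fill outside the convex hull of the square centers is an added assumption, not part of the description:

```python
import numpy as np
from scipy.interpolate import griddata

def deviation_color_maps(image, square_masks):
    """Build the three deviation color distribution maps (a sketch).

    `image` is assumed to be an H x W x 3 RGB capture of the calibration
    chart and `square_masks` the boolean masks returned by the detection
    step. Each square contributes one sample (its centroid, its mean R/G/B);
    the samples are linearly interpolated over the whole X-Y plane to give
    one Z surface per channel.
    """
    h, w = image.shape[:2]
    centers, means = [], []
    for mask in square_masks:
        ys, xs = np.nonzero(mask)
        centers.append((xs.mean(), ys.mean()))      # X-Y sample position
        means.append(image[mask].mean(axis=0))      # mean R, G, B in the square
    centers, means = np.array(centers), np.array(means)

    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    maps = []
    for c in range(3):                              # one surface per channel
        surf = griddata(centers, means[:, c], (grid_x, grid_y), method='linear')
        # Linear interpolation is undefined outside the convex hull of the
        # sample points; fall back to the nearest sample there.
        near = griddata(centers, means[:, c], (grid_x, grid_y), method='nearest')
        maps.append(np.where(np.isnan(surf), near, surf))
    return maps                                     # three H x W surfaces
```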
The tone mapping function is that: for any pixel point in the image, obtaining the intensity value of the pixel point corresponding to the channel according to the deviation color distribution diagram, and performing linear fitting on a target line segment; in the fitting process, when Y corresponding to X = m (m < 255) reaches 255, it is set that Y corresponding to X is 255 when X is greater than m and smaller than 255.
The target line segment starts at the origin (0, 0); the abscissa of its end point is the intensity value of that channel in the deviation color distribution map, and its ordinate is the average intensity of that channel over all pixels of the square area located at the center of the calibration chart.
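The mapping itself reduces to a clamped line through the origin; a minimal sketch, with the guard against zero deviation added as an assumption:

```python
import numpy as np

def tone_map(value, deviation, target):
    """Point-wise tone mapping: a line through the origin, clamped at 255.

    The segment runs from (0, 0) to (deviation, target), so an intensity equal
    to the deviation-map value is pulled to the target; once the output reaches
    255 it stays at 255. `deviation` may be a scalar or the whole deviation
    map, in which case the slope differs per pixel.
    """
    slope = target / np.maximum(deviation, 1e-6)    # avoid division by zero
    return np.clip(slope * np.asarray(value, dtype=np.float64), 0.0, 255.0)

# Point-by-point use for one channel of an image (sketch):
# mapped_r = tone_map(image[..., 0], deviation_map_r, target_r)
```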
The point-by-point mapping of the image to be processed means: each pixel of the image is mapped with its tone mapping function, yielding a tone-compensated result in which the illumination influence is eliminated.
The invention also relates to a system for implementing the above method, comprising a calibration module, a fitting module, and a processing module, wherein: the calibration module receives a captured image of the calibration chart and outputs the pixel information extracted from it to the fitting module; the fitting module computes the tone mapping functions from the pixel information and outputs them to the processing module; and the processing module performs real-time illumination compensation on the on-line input image to be processed according to the tone mapping functions and outputs the compensated image.
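One possible arrangement of the three modules, reusing the helper sketches above (detect_square_regions, deviation_color_maps); choosing the center square as the one whose centroid lies closest to the image center is an added assumption:

```python
import cv2
import numpy as np

class IlluminationCompensationSystem:
    """Sketch of the calibration / fitting / processing module pipeline."""

    def calibrate(self, chart_rgb, squares_per_side):
        # Calibration module: extract pixel information from the chart capture.
        gray = cv2.cvtColor(chart_rgb, cv2.COLOR_RGB2GRAY)
        self.masks = detect_square_regions(gray, squares_per_side ** 2)
        self.chart = chart_rgb

    def fit(self):
        # Fitting module: deviation maps plus per-pixel mapping slopes.
        self.dev_maps = deviation_color_maps(self.chart, self.masks)
        h, w = self.chart.shape[:2]
        center_mask = min(self.masks, key=lambda m: np.linalg.norm(
            np.argwhere(m).mean(axis=0) - np.array([h / 2.0, w / 2.0])))
        targets = self.chart[center_mask].mean(axis=0)   # mean R, G, B of center square
        self.slopes = [targets[c] / np.maximum(self.dev_maps[c], 1e-6)
                       for c in range(3)]

    def process(self, image_rgb):
        # Processing module: per-pixel, per-channel tone mapping of the input.
        out = np.empty_like(image_rgb)
        for c in range(3):
            mapped = np.clip(self.slopes[c] * image_rgb[..., c], 0, 255)
            out[..., c] = mapped.astype(image_rgb.dtype)
        return out
```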
Drawings
FIG. 1 is a calibration chart of the present invention;
FIG. 2 is a schematic diagram illustrating the effects of the embodiment.
Detailed Description
The embodiment comprises the following steps:
Off-line preparation stage:
In the first step, a calibration chart containing 5 × 5 = 25 yellow filled squares, as shown in FIG. 1, is prepared.
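A sketch of how such a chart could be generated programmatically; the square size, gap, and the exact yellow are illustrative assumptions, not values prescribed by the embodiment:

```python
import numpy as np
import cv2

def make_calibration_chart(n=2, square=80, gap=40, color=(0, 255, 255)):
    """Draw a (2n+1) x (2n+1) grid of solid squares on a white background."""
    k = 2 * n + 1                                           # 5 squares per side for n = 2
    side = k * square + (k + 1) * gap
    chart = np.full((side, side, 3), 255, dtype=np.uint8)   # white background
    for row in range(k):
        for col in range(k):
            y = gap + row * (square + gap)
            x = gap + col * (square + gap)
            chart[y:y + square, x:x + square] = color       # fill one square (yellow in BGR)
    return chart

# cv2.imwrite('calibration_chart.png', make_calibration_chart())  # for printing
```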
In the second step, the calibration chart is photographed; the square areas are detected in the captured image by edge detection with a Prewitt operator, and morphological dilation and erosion are then applied, yielding the 25 largest connected regions, i.e., the 25 square regions.
The morphological filtering consists of dilation and erosion operations applied to the image.
In the third step, the average intensity of each RGB channel within each square area is calculated and used as the intensity value of all pixels in that square; linear interpolation over these discrete samples then yields three curved surfaces, from which the R, G, and B intensity values of every pixel are obtained.
For each curved surface, the Z axis is the intensity value of the R, G, or B channel, and X and Y are the pixel coordinates.
In the fourth step, taking the average values of the three RGB channels in the central square area as the target values, a tone mapping function of each RGB channel is established for every pixel in the calibration chart.
The tone mapping function is that: performing linear fitting on the target line segment in a two-dimensional orthogonal coordinate system; in the fitting process, when Y corresponding to X = m (m < 255) reaches 255, it is set that Y corresponding to X is 255 when X is larger than m and smaller than 255.
The starting point of the target line segment is the origin (0, 0), the abscissa of the end point is the intensity value of the channel of the color shift distribution diagram, and the ordinate is the average value of the intensity of the channel corresponding to all pixels in the square area located in the middle of the calibration diagram. When the intensity value of the pixel in any channel of R, G or B is 100 and the target value of the channel is 200, the obtained tone mapping function is as shown in fig. 2.
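To make the example concrete: the fitted segment runs from (0, 0) to (100, 200), so the mapping is Y = min(255, (200/100)·X) = min(255, 2X); Y reaches 255 at X = 127.5, and every integer input intensity X ≥ 128 is therefore mapped to 255.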
On-line detection stage:
and fifthly, taking the intensity value of each channel of each pixel of the image to be detected as X, using the tone mapping function obtained in the fourth step, and taking the function value Y corresponding to X as the intensity value of the channel of the pixel point of the processed image, and finally obtaining the tone compensation result for eliminating the illumination influence.
The illumination condition of the image to be detected when the image to be detected is shot is completely the same as the shooting condition of the calibration graph.
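A hypothetical end-to-end run of the sketches above; the file names are placeholders and the BGR-to-RGB conversions assume OpenCV's default channel order:

```python
import cv2

# Off-line stage: calibrate and fit from a capture of the 5 x 5 chart.
system = IlluminationCompensationSystem()
chart = cv2.cvtColor(cv2.imread('chart_capture.png'), cv2.COLOR_BGR2RGB)   # hypothetical file
system.calibrate(chart, squares_per_side=5)
system.fit()

# On-line stage: compensate an image shot under the same illumination.
test = cv2.cvtColor(cv2.imread('test_image.png'), cv2.COLOR_BGR2RGB)       # hypothetical file
result = system.process(test)
cv2.imwrite('compensated.png', cv2.cvtColor(result, cv2.COLOR_RGB2BGR))
```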
The foregoing embodiments may be modified in many different ways by those skilled in the art without departing from the spirit and scope of the invention, which is defined by the appended claims; all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Claims (6)
1. An image illumination compensation method based on hue shift estimation and point-by-point hue mapping, characterized in that, in an off-line stage, the hue shifts and hue mapping functions of the three primary color channels are extracted from a calibration chart containing a matrix arrangement of squares, and, in an on-line stage, the image to be processed is mapped point by point according to the obtained hue mapping functions, thereby realizing image illumination compensation;
the extraction comprises: photographing the calibration chart, detecting the square areas in the captured image, calculating the average color within each square area, and constructing curved surfaces to obtain the deviation color distribution maps; and, taking the square at the center of the calibration chart (the middle position in both the horizontal and vertical directions) as the reference, establishing a tone mapping function of each of the three primary color channels for every pixel of the captured image;
the deviation color distribution maps are obtained as follows: in a Z-X-Y three-dimensional orthogonal coordinate system, where X-Y are the two-dimensional pixel coordinates and Z is the average value of the R, G, or B channel of a square area, linear interpolation over the per-square averages of the RGB channels yields one curved surface per channel, i.e., the three deviation color distribution maps;
the tone mapping function is obtained as follows: for any pixel in the image, the intensity value of that pixel in the corresponding channel is read from the deviation color distribution map, and a linear fit is performed on a target line segment to obtain the tone mapping function;
the target line segment starts at the origin (0, 0); the abscissa of its end point is the intensity value of that channel in the deviation color distribution map, and its ordinate is the average intensity of that channel over all pixels of the square area located at the center of the calibration chart.
2. The image illumination compensation method of claim 1, wherein the calibration chart has a white background and contains (2N + 1) × (2N + 1) squares filled with solid colors, N being a natural number.
3. The image illumination compensation method of claim 1, wherein the square areas are detected by performing edge detection with a Prewitt operator and then applying morphological dilation and erosion, yielding the (2N + 1) × (2N + 1) largest connected regions, each connected region being one square area.
4. The image illumination compensation method of claim 1, wherein, during the fitting, if Y reaches 255 at X = m (m < 255), then Y is set to 255 for all X between m and 255.
5. The image illumination compensation method of claim 1, wherein the point-by-point mapping of the image to be processed means: each pixel of the image is mapped with its tone mapping function, yielding a tone-compensated result in which the illumination influence is eliminated.
6. A system for implementing the method of any one of claims 1-5, comprising a calibration module, a fitting module, and a processing module, wherein: the calibration module receives a captured image of the calibration chart and outputs the pixel information extracted from it to the fitting module; the fitting module computes the tone mapping functions from the pixel information and outputs them to the processing module; and the processing module performs real-time illumination compensation on the on-line input image to be processed according to the tone mapping functions and outputs the compensated image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610824126.XA CN106504200B (en) | 2016-09-14 | 2016-09-14 | Image illumination compensation method and system based on hue offset estimation and point-by-point hue mapping |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610824126.XA CN106504200B (en) | 2016-09-14 | 2016-09-14 | Image illumination compensation method and system based on hue offset estimation and point-by-point hue mapping |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106504200A CN106504200A (en) | 2017-03-15 |
CN106504200B true CN106504200B (en) | 2022-12-09 |
Family
ID=58290444
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610824126.XA Active CN106504200B (en) | 2016-09-14 | 2016-09-14 | Image illumination compensation method and system based on hue offset estimation and point-by-point hue mapping |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106504200B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108205798A (en) * | 2017-03-22 | 2018-06-26 | Harbin University of Science and Technology | Robust microscope image illumination compensation method |
CN114071108B (en) * | 2021-11-04 | 2024-02-23 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method, apparatus, electronic device, and computer-readable storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1479529A (en) * | 2002-07-26 | 2004-03-03 | Samsung Electronics Co., Ltd. | Device and method of colour compensation |
US7336277B1 (en) * | 2003-04-17 | 2008-02-26 | Nvidia Corporation | Per-pixel output luminosity compensation |
CN102289360A (en) * | 2011-08-25 | 2011-12-21 | 浙江大学 | Self-adaptive projection color compensation method |
CN104486603A (en) * | 2014-11-20 | 2015-04-01 | 北京理工大学 | Multi-projection color correcting method based on HDR (high dynamic range) imaging |
CN104539921A (en) * | 2014-11-26 | 2015-04-22 | 北京理工大学 | Illumination compensation method based on multi-projector system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7522781B2 (en) * | 2005-02-11 | 2009-04-21 | Samsung Electronics Co., Ltd. | Method and apparatus for image processing based on a mapping function |
US7639893B2 (en) * | 2006-05-17 | 2009-12-29 | Xerox Corporation | Histogram adjustment for high dynamic range image mapping |
KR101917404B1 (en) * | 2011-03-04 | 2019-01-24 | LBT Innovations Limited | Colour calibration method for an image capture device |
- 2016
- 2016-09-14 CN CN201610824126.XA patent/CN106504200B/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1479529A (en) * | 2002-07-26 | 2004-03-03 | Samsung Electronics Co., Ltd. | Device and method of colour compensation |
US7336277B1 (en) * | 2003-04-17 | 2008-02-26 | Nvidia Corporation | Per-pixel output luminosity compensation |
CN102289360A (en) * | 2011-08-25 | 2011-12-21 | 浙江大学 | Self-adaptive projection color compensation method |
CN104486603A (en) * | 2014-11-20 | 2015-04-01 | 北京理工大学 | Multi-projection color correcting method based on HDR (high dynamic range) imaging |
CN104539921A (en) * | 2014-11-26 | 2015-04-22 | 北京理工大学 | Illumination compensation method based on multi-projector system |
Non-Patent Citations (3)
Title |
---|
HDR image synthesis and color adjustment algorithm in RGB space; Yao Hongtao et al.; Journal of Changchun University of Science and Technology (Natural Science Edition); 2015-10-15 (No. 05); pp. 149-153 *
High dynamic range color microscopic image synthesis based on HSV space; Li Shuangjian et al.; Journal of Sichuan University (Natural Science Edition); 2013-01-28 (No. 01); pp. 47-51 *
Color transfer algorithm based on improved linear template mapping; Fang Liyang et al.; Journal of Zhejiang University (Engineering Science); 2016-06-30 (No. 06); pp. 100-107, 115 *
Also Published As
Publication number | Publication date |
---|---|
CN106504200A (en) | 2017-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108760767B (en) | Large-size liquid crystal display defect detection method based on machine vision | |
CN108492776B (en) | Intelligent external optical compensation method for AMOLED screen brightness unevenness | |
CN111489694B (en) | Method and system for carrying out external optical compensation on camera screen under AMOLED screen | |
CN104882098A (en) | Image correction method based on LED splicing display screen and image sensor | |
CN108305232B (en) | A kind of single frames high dynamic range images generation method | |
CN111107330B (en) | Color cast correction method for Lab space | |
CN105245785A (en) | Brightness balance adjustment method of vehicle panoramic camera | |
CN110910319B (en) | Operation video real-time defogging enhancement method based on atmospheric scattering model | |
CN106504200B (en) | Image illumination compensation method and system based on hue offset estimation and point-by-point hue mapping | |
CN109151431B (en) | Image color cast compensation method and device and display equipment | |
CN111541886A (en) | Vision enhancement system applied to muddy underwater | |
CN113792564A (en) | Indoor positioning method based on invisible projection two-dimensional code | |
JP7030425B2 (en) | Image processing device, image processing method, program | |
CN106408617B (en) | Interactive single image material obtaining system and method based on YUV color space | |
KR101279576B1 (en) | Method for generating panorama image within digital image processing apparatus | |
CN110290313B (en) | Method for guiding automatic focusing equipment to be out of focus | |
EP4090006A2 (en) | Image signal processing based on virtual superimposition | |
WO2015154526A1 (en) | Color restoration method and apparatus for low-illumination-level video surveillance images | |
JP5280940B2 (en) | Specific color detection circuit | |
CN106817542B (en) | imaging method and imaging device of microlens array | |
KR101358270B1 (en) | Method and apparatus for controlling humanoid robot arm | |
KR102470242B1 (en) | Image processing device, image processing method and program | |
CN111192333B (en) | Image display method, image display device, and computer storage medium | |
CN114663299A (en) | Training method and device suitable for image defogging model of underground coal mine | |
JP2006133055A (en) | Unevenness defect detection method and device, spatial filter, unevenness defect inspection system, and program for unevenness defect detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |