CN103617601B - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN103617601B
Authority
CN
China
Prior art keywords
target object
projection
object area
value
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310632201.9A
Other languages
Chinese (zh)
Other versions
CN103617601A (en)
Inventor
唐春益
简培云
王轼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Super Technology Co Ltd
Original Assignee
深圳超多维光电子有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳超多维光电子有限公司
Priority to CN201310632201.9A
Publication of CN103617601A
Application granted
Publication of CN103617601B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an image processing method and device. The method comprises the following steps: an image to be processed containing a target object is acquired; a target-object area in the image to be processed is determined; projection of the target-object area is performed in different directions to obtain a projection result; and illumination compensation is performed on the target-object area according to the projection result. The technical scheme of the image processing method and device improves the uniformity of compensation for the target-object area in the image.

Description

Image processing method and device
Technical field
The present invention relates to the field of image processing, and in particular to an image processing method and device.
Background technology
In light-metering judgment methods for a face region, which arise mainly in video-based face tracking, the environment changes continually, so that uniform and non-uniform illumination appear alternately in the face region. If illumination compensation is applied without distinguishing between uniform and non-uniform illumination of the face region, face images captured under uniform, normal illumination become distorted after compensation, and face-tracking accuracy decreases.
Current illumination compensation methods all assume that the current lighting environment requires illumination compensation, and directly compensate the corresponding face region. However, after these methods perform illumination compensation on an image captured under a normal lighting environment, the compensation result is often unsatisfactory; for example, discontinuous grayscale transitions appear in the image.
Summary of the invention
The technical problem to be solved by the present invention is to provide an image processing method and device that can judge in advance whether the face region of an image is in a uniform-illumination state, enable illumination compensation according to the result of that judgment, and thereby improve the uniformity of compensation for the face region in the image.
To solve the above technical problem, an embodiment of the invention provides an image processing method, including:
acquiring an image to be processed containing a target object;
determining the target-object area in the image to be processed;
performing projection of the target-object area in different directions to obtain a projection result;
performing illumination compensation on the target-object area according to the projection result.
Wherein the step of performing projection of the target-object area in different directions to obtain a projection result includes:
performing projection of the target-object area in the x direction to obtain a first gray-value projection, and processing the first gray-value projection to obtain the projection result; or
performing projection of the target-object area in the y direction to obtain a second gray-value projection, and processing the second gray-value projection to obtain the projection result.
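The x- and y-direction gray-value projections described above can be sketched in a few lines of NumPy (a minimal illustration under assumed conventions, not the patented implementation; the array layout and the synthetic side-lit region are assumptions):

```python
import numpy as np

def gray_projections(region):
    """Mean-gray projections of an (H, W) grayscale region.

    Averaging each column (collapsing the y axis) gives the profile along
    the x direction; averaging each row gives the profile along y.
    """
    region = np.asarray(region, dtype=np.float64)
    proj_x = region.mean(axis=0)  # first gray-value projection (x direction)
    proj_y = region.mean(axis=1)  # second gray-value projection (y direction)
    return proj_x, proj_y

# A region lit from the left: columns darken toward the right.
region = np.tile(np.linspace(200.0, 50.0, 8), (6, 1))
proj_x, proj_y = gray_projections(region)
print(proj_x[0] > proj_x[-1])                # True: x profile exposes the side light
print(bool(np.allclose(proj_y, proj_y[0])))  # True: y profile stays flat
```

As the description later explains, the projection orthogonal to the light-dark dividing line (here the x-direction profile) is the one with discriminating power.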
Wherein the step of performing illumination compensation on the target-object area according to the projection result includes:
determining the statistical mean of the target-object area according to the projection result;
if the statistical mean is less than a first preset threshold, performing illumination compensation on the target-object area.
Wherein the step of performing illumination compensation on the target-object area according to the projection result includes:
determining the statistical mean of the target-object area according to the projection result;
if the statistical mean is greater than or equal to the first preset threshold, determining the highest brightness value and the lowest brightness value of the target-object area according to the projection result, and obtaining the span difference between the highest and lowest brightness values;
if the span difference is greater than a second preset threshold, performing illumination compensation on the target-object area.
Wherein the step of determining the highest brightness value and the lowest brightness value of the target-object area according to the projection result includes:
determining the highest brightness value of the target-object area according to Vmax = max(yi);
determining the lowest brightness value of the target-object area according to Vmin = min(yi);
where Vmax is the highest brightness value, Vmin is the lowest brightness value, and yi is the gray value in the y-axis direction.
The step of obtaining the span difference between the highest and lowest brightness values includes:
obtaining the span difference according to the formula G = Vmax - Vmin, where G is the span difference.
Wherein the step of performing illumination compensation on the target-object area according to the projection result includes:
determining the highest brightness value and the lowest brightness value of the target-object area according to the projection result, and obtaining the span difference between the highest and lowest brightness values;
if the span difference is greater than a second preset threshold, dividing the image to be processed into a high-brightness region, a uniform-brightness region, and a low-brightness region according to the projection result;
obtaining the percentage of the image to be processed occupied by the uniform-brightness region, and, if the percentage is less than a third preset threshold, performing illumination compensation on the target-object area.
Wherein the step of dividing the image to be processed into a high-brightness region, a uniform-brightness region, and a low-brightness region according to the projection result includes:
determining the uniform-brightness region by the formula 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin);
determining the region where yi ≤ 0.3(Vmean - Vmin) to be the low-brightness region;
determining the region where yi ≥ (Vmax - Vmean) to be the high-brightness region;
where Vmax is the highest brightness value of the target-object area, Vmin is the lowest brightness value of the target-object area, Vmean is the statistical mean, and yi is the gray value in the y-axis direction.
Wherein the step of obtaining the percentage of the image to be processed occupied by the uniform-brightness region and, if the percentage is less than the third preset threshold, performing illumination compensation on the target-object area includes:
if 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin) holds, letting n1 = n1 + 1, and obtaining the percentage of the image occupied by the uniform-brightness region by the formula p = n1/n;
where n1 is the number of pixel values within the specified gray range; when n1/n < a, performing illumination compensation on the target-object area.
Wherein the range of a is: 0.4 ≤ a ≤ 0.7.
Wherein the step of performing illumination compensation on the target-object area includes:
obtaining image data I(x, y) of the target-object area of the image to be processed;
obtaining the reflection component R(x, y) of the object in the target-object area of the image to be processed;
blurring I(x, y) to obtain the illumination component L(x, y) of the ambient light on the object;
compensating the target-object area according to I(x, y), R(x, y), and L(x, y) to obtain the original image of the target object.
In another aspect, embodiments of the invention also provide an image processing apparatus, including:
an acquisition module, configured to acquire an image to be processed containing a target object;
a first determining module, configured to determine the target-object area in the image to be processed;
a second determining module, configured to perform projection of the target-object area in different directions to obtain a projection result;
a compensation module, configured to perform illumination compensation on the target-object area according to the projection result.
Wherein the second determining module includes:
a first projection submodule, configured to perform projection of the target-object area in the x direction to obtain a first gray-value projection, and to process the first gray-value projection to obtain the projection result;
a second projection submodule, configured to perform projection of the target-object area in the y direction to obtain a second gray-value projection, and to process the second gray-value projection to obtain the projection result.
Wherein the compensation module is specifically configured to: determine the statistical mean of the target-object area according to the projection result; and, if the statistical mean is less than a first preset threshold, perform illumination compensation on the target-object area.
Wherein the compensation module is specifically configured to: determine the statistical mean of the target-object area according to the projection result; if the statistical mean is greater than or equal to the first preset threshold, determine the highest brightness value and the lowest brightness value of the target-object area according to the projection result and obtain the span difference between the highest and lowest brightness values; and, if the span difference is greater than a second preset threshold, perform illumination compensation on the target-object area.
Wherein the compensation module is specifically configured to: determine the highest brightness value and the lowest brightness value of the target-object area according to the projection result, and obtain the span difference between the highest and lowest brightness values; if the span difference is greater than the second preset threshold, divide the image to be processed into a high-brightness region, a uniform-brightness region, and a low-brightness region according to the projection result; obtain the percentage of the image to be processed occupied by the uniform-brightness region; and, if the percentage is less than a third preset threshold, perform illumination compensation on the target-object area.
The above technical scheme of the present invention has the following technical effects:
The above embodiments of the invention perform projection on the target area and perform illumination compensation on the target-object area according to the projection result. Because the compensation treatment is differentiated under different projection conditions, the problem of low image quality caused by performing illumination compensation under all illumination conditions is avoided; the above embodiments improve the uniformity of compensation for the target-object area in the image and thus improve image quality.
Brief description of the drawings
Fig. 1 is a schematic flow diagram of an image processing method according to an embodiment of the invention;
Fig. 2 is a schematic diagram of a specific flow of the image processing method according to an embodiment of the invention;
Fig. 3A shows the image 1 to be processed described in Fig. 1 or Fig. 2;
Fig. 3B is the projection histogram onto the x-axis corresponding to the image to be processed shown in Fig. 3A;
Fig. 3C is the projection histogram onto the y-axis corresponding to the image to be processed shown in Fig. 3A;
Fig. 4A is a schematic diagram of the image 1 to be processed described in Fig. 1 or Fig. 2 when its brightness is relatively low;
Fig. 4B is the projection histogram onto the x-axis corresponding to the image shown in Fig. 4A;
Fig. 4C is the projection histogram onto the y-axis corresponding to the image shown in Fig. 4A;
Fig. 5A is a schematic diagram of the image 1 to be processed described in Fig. 1 or Fig. 2 when illuminated by a light source from the left;
Fig. 5B is the projection histogram onto the x-axis corresponding to the image shown in Fig. 5A;
Fig. 5C is the projection histogram onto the y-axis corresponding to the image shown in Fig. 5A;
Fig. 6A is a schematic diagram of the image 1 to be processed described in Fig. 1 or Fig. 2 when illuminated by a light source from the right;
Fig. 6B is the projection histogram onto the x-axis corresponding to the image shown in Fig. 6A;
Fig. 6C is the projection histogram onto the y-axis corresponding to the image shown in Fig. 6A;
Fig. 7A is a schematic diagram of the image 1 to be processed described in Fig. 1 or Fig. 2 when illuminated by a light source from below;
Fig. 7B is the projection histogram onto the x-axis corresponding to the image shown in Fig. 7A;
Fig. 7C is the projection histogram onto the y-axis corresponding to the image shown in Fig. 7A;
Fig. 8 is a schematic structural diagram of the image processing apparatus of the invention.
Detailed description of embodiments
To make the technical problem to be solved by the present invention, the technical scheme, and the advantages clearer, a detailed description is given below in conjunction with the accompanying drawings and specific embodiments.
As shown in Fig. 1, an image processing method according to an embodiment of the invention includes:
Step 11: acquiring an image to be processed containing a target object;
Step 12: determining the target-object area in the image to be processed;
Step 13: performing projection of the target-object area in different directions to obtain a projection result;
Step 14: performing illumination compensation on the target-object area according to the projection result.
The above embodiment of the invention performs projection on the target area and performs illumination compensation on the target-object area according to the projection result. Because the compensation treatment is differentiated under different projection conditions, the problem of low image quality caused by performing illumination compensation under all illumination conditions is avoided; the embodiment improves the uniformity of compensation for the target-object area in the image and thus improves image quality.
Because of environmental and hardware limitations, captured images generally contain more or less noise. To reduce the influence of noise points on the statistics, the image needs a preprocessing step; as shown in Fig. 2, after step 11 the method may further include a step of filtering preprocessing of the image to be processed.
Specifically, a filter may be used to filter the entire image; in this embodiment a Gaussian filter is used for the filtering preprocessing.
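A Gaussian pre-filter of the kind mentioned here can be sketched as a separable convolution (a minimal NumPy sketch with zero padding at the borders; the kernel radius of 3*sigma and the spike test image are assumptions):

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def gaussian_blur(img, sigma=1.0):
    """Separable Gaussian smoothing used as the denoising pre-filter."""
    img = np.asarray(img, dtype=np.float64)
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    # filter along rows, then along columns; mode="same" keeps the size
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

noisy = np.full((16, 16), 128.0)
noisy[8, 8] = 255.0  # a single noise spike
smoothed = gaussian_blur(noisy, sigma=1.0)
print(smoothed[8, 8] < noisy[8, 8])  # True: the spike is attenuated
```

In practice a library routine such as OpenCV's GaussianBlur would be used; the point here is only that smoothing suppresses noise spikes before the projection statistics are computed.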
After the preprocessing step has been applied to the image, in step 12 of determining the target-object area in the image to be processed, the target-object area may be determined by various methods. If the target-object area is a face region, there are many face-region extraction methods; embodiments of the invention may select a relatively stable one, for example face detection using haar features with an AdaBoost classifier, which can detect most of the faces appearing in an image.
In step 13, when projecting the target-object area, projecting histograms onto mutually perpendicular axes yields different results depending on the direction of the light source. When the light-source direction is consistent with the projection direction, the resulting projection histogram shows greater variation, which makes it easy to judge whether the region is in a side-light state; conversely, if the light-source direction is orthogonal to the projection axis, the resulting projection histogram shows less variation, and it is harder to distinguish whether the object is in a side-light state.
Specifically, taking a face as the target object, the image shown in Fig. 3A is the original face region; the gray-mean projection of this image onto the x-axis along the y-axis direction is shown in Fig. 3B, and the gray-mean projection onto the y-axis along the x-axis direction is shown in Fig. 3C.
Specifically, in Fig. 3A the light source is to the left of the face and its direction is basically consistent with the x-axis direction. From the gray-mean projection onto the x-axis along the y-axis direction, the highest and lowest brightness values can be obtained, and their difference is large; the gray-mean projection therefore varies greatly and is easy to distinguish. In contrast, the gray-mean projection onto the y-axis varies little, and its discrimination is poor.
The cause of this situation is the particularity of the structure of the face. When the light-source direction is basically consistent with the image x-axis, because the nose area of the T-zone of the face is higher than the other parts of the face, the side of the face facing away from the light source is darker than the side facing it; this produces non-uniform illumination and forms a fairly obvious light-dark dividing line which, as shown in Fig. 3A, is almost parallel to the image y-axis.
Therefore, projecting onto the axis orthogonal to the light-dark dividing line yields the histogram with the greatest discrimination; in Fig. 3A this orthogonal axis can be taken to be the x-axis. Because the light-dark dividing line of a face is difficult to obtain exactly, the axis orthogonal to the light-source direction is used as an approximation of the dividing line. The angle between the projection axis of the gray-mean projection and the image x-axis is called the histogram projection angle. Projecting along the dividing line onto its orthogonal axis gives the maximum-discrimination projection, as shown in Fig. 3B; projecting onto the dividing-line axis itself gives the minimum-discrimination gray-mean projection, as shown in Fig. 3C.
Therefore, as long as the projection histogram for some axis fails to reach a certain discrimination, illumination compensation is performed on the face image. As shown in Fig. 3C, the required discrimination is not reached, so illumination compensation needs to be performed on Fig. 3A.
Of course, to increase the stability of the illumination-compensation judgment condition, the maximum-discrimination projection in the face image needs to be found; as long as the maximum-discrimination projection satisfies the image-compensation condition, illumination compensation is considered necessary. Because a face image is symmetric, only projection axes in the range of 0 to 180 degrees need be computed; and since the light source does not shine strictly from a single direction, experiments show that computing the projection onto the image x-axis direction (as in Fig. 3B) and onto the y-axis direction (as in Fig. 3C) and evaluating the proportion occupied by the uniform part of the corresponding gray-mean projection suffices for the judgment: when either projection fails to satisfy this proportion, illumination compensation is performed.
More broadly, when performing illumination compensation on a captured image: if most of the area of the target object is colored within the same color family, and the captured target-object image is sufficiently detailed, the present invention can be applied to judge the illumination of that target object. Those skilled in the art can proceed in the same way; the embodiments of the invention will not repeat the details.
In a specific embodiment of the invention, step 13 may specifically include:
performing projection of the target-object area in the x direction to obtain a first gray-value projection, and processing the first gray-value projection to obtain the projection result; the curve in Fig. 3B represents the first gray-value projection; or
performing projection of the target-object area in the y direction to obtain a second gray-value projection, and processing the second gray-value projection to obtain the projection result; the curve in Fig. 3C represents the second gray-value projection.
In a specific embodiment of the invention, step 14 may specifically include:
Step 141: determining the statistical mean of the target-object area according to the projection result;
Step 142: if the statistical mean is less than a first preset threshold, performing illumination compensation on the target-object area.
In still another embodiment of the invention, step 14 may also specifically include:
Step 143: determining the statistical mean of the target-object area according to the projection result;
Step 144: if the statistical mean is greater than or equal to the first preset threshold, determining the highest brightness value and the lowest brightness value of the target-object area according to the projection result, and obtaining the span difference between the highest and lowest brightness values;
Step 145: if the span difference is greater than a second preset threshold, performing illumination compensation on the target-object area.
Of course, in a variant of this embodiment, the highest and lowest brightness values of the target-object area may also be determined directly from the projection result, the span difference between them obtained, and, if the span difference is greater than the second preset threshold, illumination compensation performed on the target-object area.
In still another embodiment of the invention, step 14 may also specifically include:
Step 146: determining the highest brightness value and the lowest brightness value of the target-object area according to the projection result, and obtaining the span difference between the highest and lowest brightness values;
Step 147: if the span difference is greater than the second preset threshold, dividing the image to be processed into a high-brightness region, a uniform-brightness region, and a low-brightness region according to the projection result;
Step 148: obtaining the percentage of the image to be processed occupied by the uniform-brightness region, and, if the percentage is less than a third preset threshold, performing illumination compensation on the target-object area.
In a variant of this embodiment, the judgments of steps 143 and 144 may also precede step 146; that is, step 146 is performed when the statistical mean is greater than or equal to the first preset threshold.
With reference to Fig. 2, after steps 11 to 13 have been carried out, a specific preferred implementation flow of step 14 of the invention includes:
Step 1401: determining the statistical mean of the target-object area according to the projection result. Specifically, the statistical mean of the gray values of the target-object area in the x-axis direction can be obtained from the first gray-value projection, or the statistical mean of the gray values in the y-axis direction from the second gray-value projection;
Step 1402: if the statistical mean is less than a first preset threshold Thr1, determining that the brightness of the target-object area is low and performing illumination compensation directly; otherwise, entering step 1403;
Step 1403: determining the highest brightness value and the lowest brightness value of the target-object area according to the projection result, and obtaining the span difference between the highest and lowest brightness values;
Step 1404: if the span difference is less than a second preset threshold Thr2, considering the brightness of the target area relatively uniform and outputting the image to be processed directly; otherwise, entering step 1405 for further judgment. The reason is that, when the face region is obtained, the captured region may contain not only the face but also some background, which in some cases makes the span difference too high; in such cases the span difference alone cannot serve as the sole criterion of whether the illumination of the face region is uniform, so a further judgment is needed after the preliminary judgment of illumination uniformity;
Step 1405: dividing the image to be processed into a high-brightness region, a uniform-brightness region, and a low-brightness region according to the projection result;
Step 1406: obtaining the percentage of the image to be processed occupied by the uniform-brightness region; if the percentage is greater than a third preset threshold a, considering that the target-object area needs no illumination compensation; otherwise, performing illumination compensation on the target-object area.
The first preset threshold Thr1 and the second preset threshold Thr2 in the above steps may be equal or unequal and can be set according to the actual situation.
Further, in step 1401, for ease of understanding and taking only the projection in the y-axis direction as an example, the statistical mean of the gray means of the target area in the y-axis direction can be obtained by the formula Vmean = (1/n) Σ yi, where Vmean is the statistical mean, yi is the gray value in the y-axis direction, and n is the number of pixels along the y-axis.
Likewise, the projection in the x-axis direction can be handled by the same method as the projection in the y-axis direction; it is only necessary to replace the y-axis-direction gray value with the x-axis-direction gray value.
In step 1403, again taking the y-axis direction as an example, the highest brightness value of the target-object area can be determined according to Vmax = max(yi), and the lowest brightness value according to Vmin = min(yi).
Likewise, for the projection in the x-axis direction, replacing the y-axis-direction gray value with the x-axis-direction gray value yields the highest and lowest brightness values of the target-object area.
Further, the span difference of the target-object area can be obtained by the formula G = Vmax - Vmin. If G < Thr2, the brightness of the target-object area is determined to be relatively uniform and the image to be processed can be output directly;
where Thr2 is the second preset threshold. Preferably, 40 ≤ Thr2 ≤ 80.
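The coarse judgment built from Vmean, Thr1, and the span G = Vmax - Vmin can be sketched as follows (Thr1 = 60 and Thr2 = 60 are assumed example values; the patent only states that 40 ≤ Thr2 ≤ 80 is preferred and does not fix Thr1):

```python
import numpy as np

def coarse_judgment(projection, thr1=60.0, thr2=60.0):
    """Coarse uniformity test on one gray-mean projection.

    Returns "compensate" when Vmean < Thr1 (the area is simply dark),
    "uniform" when the span G = Vmax - Vmin stays below Thr2, and
    "refine" when the finer region-based judgment is still required.
    Thr1 and Thr2 here are assumed example values.
    """
    y = np.asarray(projection, dtype=np.float64)
    if y.mean() < thr1:
        return "compensate"
    g = y.max() - y.min()  # span difference G = Vmax - Vmin
    return "uniform" if g < thr2 else "refine"

print(coarse_judgment([30, 35, 40]))     # compensate: mean 35 < Thr1
print(coarse_judgment([120, 130, 125]))  # uniform: G = 10 < Thr2
print(coarse_judgment([60, 200, 100]))   # refine: G = 140
```

The "refine" branch corresponds to steps 1405 and 1406, where background pixels may inflate the span and a region-based check decides the matter.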
In the above specific embodiment of the invention, in step 1405:
the uniform-brightness region can be determined by the formula 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin);
accordingly, the region where yi ≤ 0.3(Vmean - Vmin) is determined to be the low-brightness region;
the region where yi ≥ (Vmax - Vmean) is determined to be the high-brightness region.
In step 1406, if yi ≤ 0.3(Vmean - Vmin) holds, the low-brightness region is directly determined to be of non-uniform brightness and needs illumination compensation;
if yi ≥ (Vmax - Vmean) holds, the high-brightness region is directly determined to be of non-uniform brightness and needs illumination compensation;
if 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin) holds, then n1 = n1 + 1, and the percentage of the whole image occupied by the uniform-brightness region is obtained by the formula p = n1/n;
where n1 is the number of pixel values within the specified gray range.
When n1/n ≥ a, the target-object area is considered to be of uniform brightness and needs no illumination compensation; otherwise the target-object area is considered to be of non-uniform brightness and needs illumination compensation.
Preferably, the range of a is: 0.4 ≤ a ≤ 0.7.
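The uniform-band count n1 and the threshold a can be sketched as below. Note this is an interpretation: the band 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin) is read here as Vmean plus or minus 0.3 of the distance to each extreme, matching the upper and lower uniform-area limit lines drawn in the histograms; a = 0.55 is an assumed value inside the preferred range 0.4 to 0.7.

```python
import numpy as np

def uniform_fraction(projection, frac=0.3):
    """Fraction p = n1/n of projection samples in the uniform band."""
    y = np.asarray(projection, dtype=np.float64)
    v_mean, v_max, v_min = y.mean(), y.max(), y.min()
    upper = v_mean + frac * (v_max - v_mean)   # upper uniform-area limit
    lower = v_mean - frac * (v_mean - v_min)   # lower uniform-area limit
    n1 = int(np.count_nonzero((y >= lower) & (y <= upper)))
    return n1 / y.size

def needs_compensation(projection, a=0.55):
    """Compensate when the uniform band holds less than a of the samples."""
    return uniform_fraction(projection) < a

flat = [120, 120, 120, 120, 121, 119]     # nearly uniform profile
side_lit = [40, 60, 90, 150, 200, 230]    # strong left-to-right ramp
print(needs_compensation(flat))      # False: p = 4/6 >= a
print(needs_compensation(side_lit))  # True: p = 1/6 < a
```

Under this reading, a side-lit ramp puts few samples near the mean and triggers compensation, while a flat profile keeps most samples inside the band.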
In step 14, performing the illumination compensation may specifically include:
Step 151: obtaining image data I(x, y) of the target-object area of the image to be processed;
Step 152: obtaining the reflection component R(x, y) of the object in the face region of the image to be processed;
Step 153: blurring I(x, y) to obtain the illumination component L(x, y) of the ambient light on the object;
Step 154: compensating the face region according to I(x, y), R(x, y), and L(x, y) to obtain the original image of the target object.
In step 154, the value of log[R(x, y)] can be obtained by the formula log[R(x, y)] = log[I(x, y)] - log[L(x, y)]; the value of log[R(x, y)] is quantized to pixel values for output; the face region is compensated according to the output pixel values to obtain the original image of the target object.
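The log-domain compensation log[R] = log[I] - log[L] with a blurred illumination estimate can be sketched as a single-scale-Retinex-style routine (the Gaussian blur, the value of sigma, and the linear quantization of log R back to 0-255 are assumptions; the patent does not specify the blur or the quantization scheme):

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur used to estimate the illumination L(x, y)."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1, dtype=np.float64)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    k /= k.sum()
    rows = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, rows)

def compensate(i_xy, sigma=2.0):
    """log R = log I - log L, then linear quantization back to 0-255."""
    i_xy = np.asarray(i_xy, dtype=np.float64) + 1.0  # avoid log(0)
    l_xy = gaussian_blur(i_xy, sigma)
    log_r = np.log(i_xy) - np.log(l_xy)
    lo, hi = log_r.min(), log_r.max()
    if hi == lo:  # degenerate: perfectly flat reflectance
        return np.full(i_xy.shape, 128, dtype=np.uint8)
    return np.round((log_r - lo) / (hi - lo) * 255.0).astype(np.uint8)

# A side-lit ramp: bright on the left, dark on the right.
img = np.tile(np.linspace(220.0, 40.0, 32), (32, 1))
out = compensate(img)
print(out.dtype, out.shape)  # uint8 (32, 32)
```

Dividing out the blurred illumination in the log domain removes the slow left-to-right brightness gradient while preserving local detail, which is the effect the compensation step is after.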
The specific embodiment of the invention shown in Fig. 2 can accurately judge whether the illumination of the target-object area is uniform or non-uniform, serving as the condition for whether the illumination-compensation method is enabled; it avoids the erroneous face-tracking or target-object-segmentation results caused by enabling illumination compensation in situations where the target-object illumination is already uniform.
The effect of the above embodiment is illustrated below with reference to the drawings:
Histogram: the abscissa is in pixels, consistent with the abscissa of the original image; the ordinate is the gray level (between 0 and 255);
Fig. 3B is the histogram projected onto the x-axis: all pixels are accumulated along the y-axis, averaged, and projected onto the x-axis;
Fig. 3C is the histogram projected onto the y-axis: all pixels are accumulated along the x-axis, averaged, and projected onto the y-axis; wherein line 3 represents the lower limit of the uniform region, line 4 the upper limit of the uniform region, and line 5 the gray-level mean of the histogram.
The case of low illumination brightness is shown in Fig. 4A, with the resulting histograms in Figs. 4B and 4C: the face as a whole appears dark, yet fairly uniform. In the projection histograms, whichever axis is projected onto, the great majority of the histogram curve lies within the uniform range; but because the brightness is low, the coarse judgment directly deems illumination compensation necessary.
The case of normal illumination is shown in Fig. 5A:
Case 1: the face as a whole appears normally illuminated and fairly uniform. In the projection histograms of Figs. 5B and 5C, whichever axis is projected onto, the great majority of the histogram curve lies within the uniform range, so it is directly judged that no illumination compensation is needed.
Case 2: the face of Fig. 6A also appears normally and fairly uniformly illuminated, but the cropped face region contains background, and the background gray level differs considerably from that of the face region.
In the histogram projected onto the x-axis (Fig. 6B), the percentage occupied by the uniform-brightness part of the image satisfies the set third predetermined threshold, so it is still judged that no illumination compensation is needed;
Likewise, the histogram projected onto the y-axis (Fig. 6C) yields the same result.
The face histogram under a side light source is shown in Fig. 7A:
Under side lighting, the histogram projections onto the x-axis and the y-axis differ greatly.
The projection onto the x-axis (Fig. 7B) shows a clear distinction, so illumination compensation is judged necessary.
For the projection onto the y-axis (Fig. 7C), the preliminary judgment condition is not met, so this projection alone would indicate no illumination compensation; but according to the principle described herein, as long as either histogram meets the condition, illumination compensation is considered necessary.
In summary, the above embodiment of the present invention performs light projection on the target object area and performs illumination compensation on the target object area according to the projection result. Because the compensation treatment is differentiated according to the light projection, the problem of low image quality caused by performing illumination compensation under ordinary illumination conditions is avoided, and the above embodiment improves the uniformity of compensation of the target object area in the image, thereby improving image quality.
As shown in Fig. 8, another aspect of the embodiments of the present invention further provides an image processing apparatus 80, including:
an acquiring module 81 for obtaining a pending image containing a target object; afterwards, the pending image may also be pre-processed. Pre-processing commonly filters the whole image with a filter; in this embodiment, a Gaussian filter is used;
a first determining module 82 for determining the target object area in the pending image;
a second determining module 83 for performing light projection in different directions on the target object area to obtain a projection result;
a compensating module 84 for performing illumination compensation on the target object area according to the projection result.
Wherein, the second determining module 83 includes:
a first projection submodule for performing light projection in the x direction on the target object area to obtain a first gray-value projection, and processing the first gray-value projection to obtain the projection result;
a second projection submodule for performing light projection in the y direction on the target object area to obtain a second gray-value projection, and processing the second gray-value projection to obtain the projection result.
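The two projection submodules can be illustrated with mean gray-value projections. This is a sketch (`gray_projections` is an illustrative name); accumulating and then averaging follows the histogram description in the embodiment above:

```python
import numpy as np

def gray_projections(region):
    """First and second gray-value projections of a target object area."""
    region = region.astype(np.float64)
    x_proj = region.mean(axis=0)   # accumulate along y, average, project onto x
    y_proj = region.mean(axis=1)   # accumulate along x, average, project onto y
    return x_proj, y_proj
```

Either projection can then be fed to the uniformity judgment; the method treats them independently, and compensation is triggered if either one meets the condition.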
The compensating module is specifically configured to: determine the statistical mean of the target object area according to the projection result; and, if the statistical mean is less than a first predetermined threshold, perform illumination compensation on the target object area.
The compensating module may also be specifically configured to: determine the statistical mean of the target object area according to the projection result; if the statistical mean is greater than or equal to the first predetermined threshold, determine the highest brightness value and the lowest brightness value of the target object area according to the projection result, and obtain the span difference between the highest brightness value and the lowest brightness value; and, if the span difference is greater than a second predetermined threshold, perform illumination compensation on the target object area.
The compensating module may also be specifically configured to: determine the highest brightness value and the lowest brightness value of the target object area according to the projection result, and obtain the span difference between the highest brightness value and the lowest brightness value; if the span difference is greater than the second predetermined threshold, divide the pending image into a high-brightness region, a uniform-brightness region and a low-brightness region according to the projection result; and obtain the percentage of the pending image occupied by the uniform-brightness region, performing illumination compensation on the target object area if the percentage is less than a third predetermined threshold.
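Taken together, the configurations of the compensating module form a three-stage decision cascade, sketched below under stated assumptions: `t_mean`, `t_span` and `a` are illustrative threshold values, and the branch where the span stays within the second threshold, which the text leaves implicit, is treated here as needing no compensation:

```python
import numpy as np

def compensation_decision(proj, t_mean=40.0, t_span=80.0, a=0.5):
    """Three-stage cascade: dark mean -> compensate; small span ->
    no compensation; otherwise compensate when the uniform share < a."""
    v_mean = proj.mean()
    if v_mean < t_mean:                  # stage 1: first predetermined threshold
        return True
    v_max, v_min = proj.max(), proj.min()
    if v_max - v_min <= t_span:          # stage 2: span G vs second threshold
        return False
    in_band = (proj >= 0.3 * (v_mean - v_min)) & (proj <= 0.3 * (v_max - v_mean))
    return in_band.sum() / proj.size < a # stage 3: third threshold on p(n1)
```

A very dark projection triggers compensation immediately; a bright, flat projection never reaches the region-division stage.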
It should be noted that this apparatus embodiment corresponds to the above method embodiment; all implementations in the method embodiment apply to this apparatus embodiment and achieve the same technical effect.
The above apparatus embodiment of the present invention likewise gives an accurate judgment of both uniform and non-uniform illumination of the target object area, serving as the condition for deciding whether to enable the illumination compensation method, and thus avoids the erroneous results in target tracking or target object segmentation produced when the illumination compensation method is still enabled although the target object area is uniformly illuminated.
Each apparatus embodiment of the present invention may be implemented in hardware, in software modules running on one or more processors, or in a combination of the two. Those skilled in the art will understand that, in practice, a microprocessor or a digital signal processor (DSP) may be used to realize some or all of the functions of some or all of the modules of the apparatus according to the embodiments of the present invention. The present invention may also be implemented as programs for executing the method described herein (for example, computer programs and computer program products).
The above are preferred embodiments of the present invention. It should be pointed out that those of ordinary skill in the art may make further improvements and modifications without departing from the principle of the present invention, and such improvements and modifications shall also be regarded as falling within the protection scope of the present invention.

Claims (8)

1. An image processing method, characterized in that it includes:
obtaining a pending image containing a target object;
determining a target object area in the pending image;
performing light projection in different directions on the target object area to obtain a projection result;
determining a statistical mean of the target object area according to the projection result;
if the statistical mean is greater than or equal to a first predetermined threshold, determining a highest brightness value and a lowest brightness value of the target object area according to the projection result, and obtaining a span difference between the highest brightness value and the lowest brightness value; if the statistical mean is less than the first predetermined threshold, performing illumination compensation on the target object area;
if the span difference is greater than a second predetermined threshold, dividing the pending image into a high-brightness region, a uniform-brightness region and a low-brightness region according to the projection result;
obtaining a percentage of the pending image occupied by the uniform-brightness region, and performing illumination compensation on the target object area if the percentage is less than a third predetermined threshold.
2. The image processing method according to claim 1, characterized in that the step of performing light projection in different directions on the target object area to obtain a projection result includes:
performing light projection in the x direction on the target object area to obtain a first gray-value projection, and processing the first gray-value projection to obtain the projection result; or
performing light projection in the y direction on the target object area to obtain a second gray-value projection, and processing the second gray-value projection to obtain the projection result.
3. The image processing method according to claim 1, characterized in that the step of determining the highest brightness value and the lowest brightness value of the target object area according to the projection result includes:
determining the highest brightness value of the target object area according to: Vmax = max(yi);
determining the lowest brightness value of the target object area according to: Vmin = min(yi);
wherein Vmax is the highest brightness value, Vmin is the lowest brightness value, and yi is the gray value in the y-axis direction;
and the step of obtaining the span difference between the highest brightness value and the lowest brightness value includes:
obtaining the span difference according to the formula G = Vmax - Vmin, wherein G is the span difference.
4. The image processing method according to claim 1, characterized in that the step of dividing the pending image into a high-brightness region, a uniform-brightness region and a low-brightness region according to the projection result includes:
determining the uniform-brightness region by the formula: 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin);
determining the region where yi ≤ 0.3(Vmean - Vmin) as the low-brightness region;
determining the region where yi ≥ (Vmax - Vmean) as the high-brightness region;
wherein Vmax is the highest brightness value of the target object area, Vmin is the lowest brightness value of the target object area, Vmean is the statistical mean, and yi is the gray value of the target object area in the y-axis direction obtained according to the second gray-value projection.
5. The image processing method according to claim 4, characterized in that the step of obtaining the percentage of the pending image occupied by the uniform-brightness region and performing illumination compensation on the target object area if the percentage is less than the third predetermined threshold includes:
if 0.3(Vmax - Vmean) ≥ yi ≥ 0.3(Vmean - Vmin) holds, letting n1 = n1 + 1, and obtaining the percentage of the pending image occupied by the uniform-brightness region by the formula p(n1) = n1/n;
wherein n1 is the number of pixel values within the specified gray-scale range and n is the number of pixels on the y-axis; when p(n1) < a, performing illumination compensation on the target object area, wherein the range of a is: 0.4 ≤ a ≤ 0.7.
6. The image processing method according to claim 1, characterized in that the step of performing illumination compensation on the target object area includes:
obtaining image data I(x, y) of the target object area of the pending image;
obtaining a reflection component R(x, y) of the object in the target object area of the pending image;
blurring I(x, y) to obtain an illumination component L(x, y) of the ambient light falling on the object;
compensating the target object area according to I(x, y), R(x, y) and L(x, y) to obtain an original image of the object.
7. An image processing apparatus, characterized in that it includes:
an acquiring module for obtaining a pending image containing a target object;
a first determining module for determining a target object area in the pending image;
a second determining module for performing light projection in different directions on the target object area to obtain a projection result;
a compensating module for determining a statistical mean of the target object area according to the projection result; if the statistical mean is greater than or equal to a first predetermined threshold, determining a highest brightness value and a lowest brightness value of the target object area according to the projection result, and obtaining a span difference between the highest brightness value and the lowest brightness value; if the statistical mean is less than the first predetermined threshold, performing illumination compensation on the target object area; if the span difference is greater than a second predetermined threshold, dividing the pending image into a high-brightness region, a uniform-brightness region and a low-brightness region according to the projection result; and obtaining a percentage of the pending image occupied by the uniform-brightness region, and performing illumination compensation on the target object area if the percentage is less than a third predetermined threshold.
8. The image processing apparatus according to claim 7, characterized in that the second determining module includes:
a first projection submodule for performing light projection in the x direction on the target object area to obtain a first gray-value projection, and processing the first gray-value projection to obtain the projection result;
a second projection submodule for performing light projection in the y direction on the target object area to obtain a second gray-value projection, and processing the second gray-value projection to obtain the projection result.
CN201310632201.9A 2013-11-29 2013-11-29 Image processing method and device Active CN103617601B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310632201.9A CN103617601B (en) 2013-11-29 2013-11-29 Image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310632201.9A CN103617601B (en) 2013-11-29 2013-11-29 Image processing method and device

Publications (2)

Publication Number Publication Date
CN103617601A CN103617601A (en) 2014-03-05
CN103617601B true CN103617601B (en) 2017-02-22

Family

ID=50168305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310632201.9A Active CN103617601B (en) 2013-11-29 2013-11-29 Image processing method and device

Country Status (1)

Country Link
CN (1) CN103617601B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104754184A (en) * 2015-04-10 2015-07-01 四川理工学院 Real-time monitoring video illumination compensation method and real-time monitoring video illumination compensation system
CN105530498A (en) * 2015-12-15 2016-04-27 深圳市时代华影科技股份有限公司 3D projection system capable of compensating uniformity of metal curtain and compensation method thereof
CN105760868B (en) * 2016-02-03 2019-04-30 Oppo广东移动通信有限公司 Target in adjustment image looks for the method, device and mobile terminal of tendency

Citations (1)

Publication number Priority date Publication date Assignee Title
CN103116756A (en) * 2013-01-23 2013-05-22 北京工商大学 Face detecting and tracking method and device

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US7536030B2 (en) * 2005-11-30 2009-05-19 Microsoft Corporation Real-time Bayesian 3D pose tracking

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN103116756A (en) * 2013-01-23 2013-05-22 北京工商大学 Face detecting and tracking method and device

Non-Patent Citations (3)

Title
Combine image quality fusion and illumination compensation for video-based face recognition; Chao Wang et al.; Neurocomputing; 2010-03-31; vol. 73, no. 7, pp. 1478-1490 *
Method for judging the illumination compensation parameters and their necessity in face recognition; Sun Xuemei et al.; Application Research of Computers (《计算机应用研究》); 2007-07-31; vol. 24, no. 7, pp. 306-311 *
Face recognition method under variable illumination based on illumination classification; Cui Rui et al.; Computer Engineering and Applications (《计算机工程与应用》); 2010-10-01; vol. 46, no. 28, pp. 185-188 *

Also Published As

Publication number Publication date
CN103617601A (en) 2014-03-05

Similar Documents

Publication Publication Date Title
US11314979B2 (en) Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium
Chung et al. A non-parametric blur measure based on edge analysis for image processing applications
CN104519281B (en) The processing method and processing unit of a kind of image
US7639878B2 (en) Shadow detection in images
CN105046655B (en) A kind of automatic sharpening method of video image and device
CN107067382A (en) A kind of improved method for detecting image edge
CN105787902B (en) Utilize the image denoising method of block sorting detection noise
US8280122B2 (en) Registration device, collation device, extraction method, and program
CN102385753A (en) Illumination-classification-based adaptive image segmentation method
CN103617601B (en) Image processing method and device
CN104021533B (en) A kind of real time imaging noise-reduction method and device
CN109086724A (en) A kind of method for detecting human face and storage medium of acceleration
CN103702034A (en) Photographic method and device for improving brightness distribution of picture
CN103489168A (en) Enhancing method and system for infrared image being converted to pseudo color image in self-adaptive mode
CN102012770B (en) Image correction-based camera positioning method
CN110909631A (en) Finger vein image ROI extraction and enhancement method
CN113706521A (en) Carbon fiber surface hairiness detection method and device, storage medium and electronic equipment
CN115439523A (en) Method and equipment for detecting pin size of semiconductor device and storage medium
CN109064439A (en) Single-sided illumination formula light guide plate shadow defect extracting method based on subregion
TWI383690B (en) Method for image processing
CN104463812B (en) The method for repairing the video image by raindrop interference when shooting
CN109003246A (en) Eye repairs graph parameter detection method
CN105631816A (en) Iris image noise classification detection method
CN106951902A (en) A kind of image binaryzation processing method and processing device
CN105160635B (en) A kind of image filtering method based on fractional order differential estimation gradient field

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20180725

Address after: 518054 Room 201, building A, 1 front Bay Road, Shenzhen Qianhai cooperation zone, Shenzhen, Guangdong

Patentee after: Shenzhen super Technology Co., Ltd.

Address before: 518053 H-1 Tung 101, overseas Chinese town, Nanshan District, Shenzhen, Guangdong.

Patentee before: Shenzhen SuperD Photoelectronic Co., Ltd.