CN105872397A - Single frame wide dynamic enhancement method and system capable of automatically identifying bright area and dark area - Google Patents


Info

Publication number
CN105872397A
CN105872397A (application CN201610203548.5A)
Authority
CN
China
Prior art keywords
result
area
image
submodule
minrgb1
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610203548.5A
Other languages
Chinese (zh)
Inventor
刘军
向多春
沈建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHENGDU CORPRO TECHNOLOGY Co Ltd
Original Assignee
CHENGDU CORPRO TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CHENGDU CORPRO TECHNOLOGY Co Ltd filed Critical CHENGDU CORPRO TECHNOLOGY Co Ltd
Priority to CN201610203548.5A priority Critical patent/CN105872397A/en
Publication of CN105872397A publication Critical patent/CN105872397A/en
Pending legal-status Critical Current


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60: Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/63: Noise processing applied to dark current

Abstract

The invention discloses a single-frame wide-dynamic-range enhancement method and system that automatically identify bright and dark areas. The method captures each frame image of a video stream and applies the following steps to each frame: S1, adjust the low-brightness area; S2, adjust the high-brightness area; S3, intelligent fusion. The method modifies the existing single-frame wide-dynamic technique. Measurements show that in wide-dynamic scenes both the dark area and the bright area are adjusted, and the processing of the bright area in particular is noticeably clearer. Some operators are implemented by table lookup, which reduces the number of operations, and the fusion parameter is computed automatically, making the method suitable for engineering implementation. Compared with the prior art, the method and system have the following advantages: the high-brightness area is adjusted distinctly; both relatively dark and relatively bright scenes can be processed; the principle is simple and convenient for hardware implementation; and intelligent fusion is achieved without manual parameter tuning.

Description

A single-frame wide-dynamic-range enhancement method and system with automatic identification of bright and dark areas
Technical field
The present invention relates to a single-frame wide-dynamic-range enhancement method and system with automatic identification of bright and dark areas.
Background technology
Most current SoC chips integrate a single-frame wide-dynamic adjustment algorithm, but most such algorithms only adjust the dark area; the bright area is adjusted little or shows no clear improvement. They therefore cannot cope with complex and changing application environments, especially scenes with alternating bright and dark regions, where over-exposure causes image information to be lost. The adjustment method of the present invention aims to further bring out the detail of the bright area in wide-dynamic scenes and to improve on the prior art.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art by providing a single-frame wide-dynamic-range enhancement method and system that automatically identify bright and dark areas, designing a bright/dark-area adjustment framework, a bright-area adjustment algorithm, a table-generation lookup method, and an automatic bright/dark-area identification method.
This object is achieved through the following technical solution: a single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas captures each frame image img of the video stream and applies the following steps to each frame:
S1: adjust the low-brightness area, comprising the following sub-steps:
S11: take the minimum of the colour components of each pixel of the original image img; the result is denoted minRGB1;
S12: invert the minRGB1 obtained in step S11, i.e. compute 255 − minRGB1; the result is denoted MminRGB1;
S13: pass the MminRGB1 obtained in step S12 through a bilateral filter; the result is denoted bf1;
S14: look up the result of step S13 in a table; the result is denoted lookbf1. The table is generated as follows:
e1 = a(b − x)/b
table = y^e1
where a and b are adjustable parameters, e1 is an intermediate result, table is the generated table, and x and y each take the values 0, 1, 2, …, 255, with all operations applied element-wise. Because a zero base with a negative exponent amounts to division by zero, the 0 in y is replaced by 1 before the calculation.
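The table-generation rule above can be sketched in NumPy. Reading the MATLAB-style "./" and the power as element-wise operations over x = y = 0…255 is an assumption made explicit here:

```python
import numpy as np

def make_table(a=2.0, b=128.0):
    """Build the 256-entry lookup table of step S14 from the rule
    e1 = a*(b - x)/b,  table = y**e1,
    with x = y = 0..255 read element-wise (an assumption), and with
    y = 0 replaced by 1 to avoid a zero base with a negative exponent."""
    x = np.arange(256, dtype=np.float64)
    y = x.copy()
    y[0] = 1.0                      # avoid 0 ** negative
    e1 = a * (b - x) / b
    table = y ** e1
    return np.clip(table, 0, 255).astype(np.uint8)

table = make_table()
```

With these assumed defaults the curve boosts mid-dark values and compresses values above b; a and b would be tuned within the ranges [1, 5] and [1, 255] given later in the text.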
S2: adjust the high-brightness area, comprising the following sub-steps:
S21: invert the original image img; the result is denoted img2, i.e. img2 = 255 − img;
S22: take the minimum colour component of each pixel of the image obtained in step S21; the result is denoted min2;
S23: invert the result of step S22; the result is denoted Mmin2, i.e. Mmin2 = 255 − min2;
S24: pass the result of step S23 through the bilateral filter; the result is denoted bf2;
S25: look up the result of step S24 in a table; the result is denoted lookbf2, the table being generated in the manner of step S14;
S26: invert the result of step S25; the result is denoted Mlookbf2, i.e. Mlookbf2 = 255 − lookbf2;
S3: intelligent fusion: the result lookbf1 produced in step S1 and the result Mlookbf2 produced in step S2 are fused as follows:
out = m · lookbf1 + (1 − m) · Mlookbf2
where out is the fusion result and m is the fusion parameter with range [0, 1]; when m is small the high-brightness region is mainly adjusted, and when m is large the low-brightness region is mainly adjusted.
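Steps S1, S2 and the fusion of S3 can be sketched together as follows. This is a sketch under stated assumptions: the bilateral filter of steps S13/S24 is replaced by a plain box blur so the sketch stays self-contained (the patent specifies an edge-preserving bilateral filter), and the fusion parameter m is fixed rather than computed automatically:

```python
import numpy as np

def min_channel(img):
    """Per-pixel minimum over the colour components (steps S11/S22)."""
    return img.min(axis=2)

def box_blur(img, k=3):
    """Placeholder for the bilateral filter of steps S13/S24: a plain
    k-by-k box blur via edge-padded summation (NOT edge-preserving)."""
    pad = k // 2
    p = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def enhance(img, table, m=0.5):
    """Steps S1-S3 of the method with a fixed fusion parameter m."""
    # S1: dark-area branch
    minRGB1 = min_channel(img)                              # S11
    MminRGB1 = 255 - minRGB1                                # S12
    bf1 = box_blur(MminRGB1)                                # S13 (placeholder)
    lookbf1 = table[np.clip(bf1, 0, 255).astype(np.uint8)]  # S14
    # S2: bright-area branch on the inverted image
    img2 = 255 - img                                        # S21
    min2 = min_channel(img2)                                # S22
    Mmin2 = 255 - min2                                      # S23
    bf2 = box_blur(Mmin2)                                   # S24 (placeholder)
    lookbf2 = table[np.clip(bf2, 0, 255).astype(np.uint8)]  # S25
    Mlookbf2 = 255 - lookbf2                                # S26
    # S3: fusion
    return m * lookbf1 + (1 - m) * Mlookbf2
```

In a hardware-oriented implementation the two branches share the same table-lookup and filter blocks, which is precisely why the patent factors them into reusable submodules.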
In step S1, the range of a is [1, 5] and the range of b is [1, 255].
The smaller interval of m is [0, 0.5) and the larger interval is [0.5, 1].
Step S3 comprises the following sub-steps:
S31: detect the bright area: substitute the minRGB1 of step S11 into the following formula to obtain the high-brightness-area image o(x, y):
o(x, y) = 1/(1 + r(x, y)^(−z))
where r(x, y) is the value of minRGB1 at pixel coordinate (x, y) and z is a parameter. Then traverse the high-brightness-area image o(x, y) and binarise each pixel:
o′(x, y) = 1 when L(x, y) ≥ M, and o′(x, y) = 0 when L(x, y) < M
where o′(x, y) is the binary image and M is the mean of the image minRGB1, obtained by summing all pixel values of the minRGB1 image and then dividing by its number of pixels;
S32: compute the area: sum all pixel values of the image o′(x, y); the result, denoted k′, is the area of the high-brightness region of the image. Let the number of pixels of the minRGB1 image be k″; then the proportion t of the bright area in the minRGB1 image is
t = k′/k″;
S33: compute the mean: multiply the bright-area binary image o′(x, y) obtained in step S31 by the corresponding pixels of the minRGB1 image; the result is denoted o″(x, y), i.e.
o″(x, y) = minRGB1(x, y) · o′(x, y);
sum all pixel values of the image o″(x, y); the result is denoted k. The bright-area mean M′ of the minRGB1 image is then
M′ = k/k′;
S34: determine the fusion parameter m:
(1) when the bright-area mean M′ is greater than or equal to a first threshold P, m is chosen according to the detected bright area: when the proportion t of the bright area in the total image area exceeds Q%, the bright area is mainly adjusted and m takes the value X, with X greater than 0.5; otherwise m takes 0;
(2) when the bright-area mean M′ is less than the first threshold P, the dark area is mainly adjusted and m takes 1;
S35: intelligent fusion: substitute the fusion parameter m obtained in step S34, the lookbf1 obtained in step S1 and the Mlookbf2 obtained in step S2 into the following formula:
out = m · lookbf1 + (1 − m) · Mlookbf2
where out is the fusion result and m is the computed fusion parameter.
The first threshold P takes the value 200, Q takes 40, X takes 0.6, and z takes 10.
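The automatic choice of the fusion parameter in steps S31–S34 can be sketched as follows. The sketch makes one assumption explicit: since the threshold M is defined as the mean of minRGB1, the binarisation of S31 is applied to minRGB1 itself (taking the patent's L(x, y) to be minRGB1), so the soft map o(x, y) is folded directly into the threshold test:

```python
import numpy as np

def fusion_parameter(minRGB1, P=200, Q=40, X=0.6):
    """Steps S31-S34: choose the fusion parameter m automatically from
    the minimum-channel image minRGB1, using the thresholds P, Q, X
    given in the text. Assumes L(x, y) in the binarisation is minRGB1."""
    minRGB1 = minRGB1.astype(np.float64)
    M = minRGB1.mean()                          # mean of minRGB1
    o_bin = (minRGB1 >= M).astype(np.float64)   # S31: bright-area mask o'
    k1 = o_bin.sum()                            # S32: bright-area size k'
    t = k1 / minRGB1.size                       # area ratio t = k'/k''
    k = (minRGB1 * o_bin).sum()                 # S33: sum over bright area
    M1 = k / k1 if k1 > 0 else 0.0              # bright-area mean M'
    # S34: decision rule
    if M1 >= P:
        m = X if t > Q / 100.0 else 0.0         # strongly bright scene
    else:
        m = 1.0                                 # mostly dark scene
    return m, t, M1
```

A uniformly dark frame thus yields m = 1 (dark-area branch dominates the fusion), while a frame whose bright region is both strong (M′ ≥ 200) and large (t > 40%) yields m = X = 0.6.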
A single-frame wide-dynamic-range enhancement system with automatic identification of bright and dark areas comprises:
an image-frame capture module, for capturing each frame image of the video stream;
a low-brightness-area adjustment module, for adjusting the low-brightness area, comprising, connected in sequence, a pixel-minimum submodule A, an inversion submodule A, a bilateral-filtering submodule A and a table-lookup submodule A, the table-lookup submodule A comprising a table-generation unit A and a table-query unit A;
a high-brightness-area adjustment module, for adjusting the high-brightness area, comprising, connected in sequence, an inversion submodule B, a pixel-minimum submodule B, an inversion submodule C, a bilateral-filtering submodule B, a table-lookup submodule B and an inversion submodule D, the table-lookup submodule B comprising a table-generation unit B and a table-query unit B; and
an intelligent-fusion module, for intelligently fusing the result produced by the low-brightness-area adjustment module with the result produced by the high-brightness-area adjustment module.
The intelligent-fusion module comprises, connected in sequence, a brightness-detection submodule, an area-calculation submodule, a mean-calculation submodule, a fusion-parameter determination submodule and an intelligent-fusion submodule.
The beneficial effects of the invention are as follows:
The present invention is a methodological improvement of the existing single-frame wide-dynamic technique. Measurements show that in wide-dynamic scenes both the dark area and the bright area can be adjusted, and the processing of the bright area in particular is clearly effective. Some operators are implemented by table lookup, reducing the number of operations, and the automatic computation of the fusion parameter suits engineering implementation. Compared with the prior art, the invention has the following advantages: (1) the high-brightness region is adjusted distinctly; (2) it suits the processing of both relatively dark and relatively bright scenes; (3) the principle is simple, facilitating hardware implementation; (4) intelligent fusion with parameter-free, automatic adjustment.
Brief description of the drawings
Fig. 1 is a flow chart of the method of the invention.
Detailed description of the invention
The technical solution of the present invention is described in further detail below in conjunction with the drawings. As shown in Fig. 1, a single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas captures each frame image img of the video stream and applies to each frame steps S1 to S3, with all sub-steps, formulas and parameter values as set out in the summary above; the corresponding system comprises the image-frame capture, low-brightness-area adjustment, high-brightness-area adjustment and intelligent-fusion modules likewise described above.
The key points of the present invention are:
1. the inversion-based processing method for the image's high-brightness area;
2. the table-generation method;
3. the computation method of the fusion parameter.

Claims (7)

1. A single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas, characterised in that each frame image img of the video stream is captured and the following steps are applied to each frame:
S1: adjust the low-brightness area, comprising the following sub-steps:
S11: take the minimum of the colour components of each pixel of the original image img; the result is denoted minRGB1;
S12: invert the minRGB1 obtained in step S11, i.e. compute 255 − minRGB1; the result is denoted MminRGB1;
S13: pass the MminRGB1 obtained in step S12 through a bilateral filter; the result is denoted bf1;
S14: look up the result of step S13 in a table; the result is denoted lookbf1, the table being generated as follows:
e1 = a(b − x)/b
table = y^e1
where a and b are adjustable parameters, e1 is an intermediate result, table is the generated table, and x and y each take the values 0, 1, 2, …, 255, with all operations applied element-wise; because a zero base with a negative exponent amounts to division by zero, the 0 in y is replaced by 1 before the calculation;
S2: adjust the high-brightness area, comprising the following sub-steps:
S21: invert the original image img; the result is denoted img2, i.e. img2 = 255 − img;
S22: take the minimum colour component of each pixel of the image obtained in step S21; the result is denoted min2;
S23: invert the result of step S22; the result is denoted Mmin2, i.e. Mmin2 = 255 − min2;
S24: pass the result of step S23 through the bilateral filter; the result is denoted bf2;
S25: look up the result of step S24 in a table; the result is denoted lookbf2, the table being generated in the manner of step S14;
S26: invert the result of step S25; the result is denoted Mlookbf2, i.e. Mlookbf2 = 255 − lookbf2;
S3: intelligent fusion: the result lookbf1 produced in step S1 and the result Mlookbf2 produced in step S2 are fused as follows:
out = m · lookbf1 + (1 − m) · Mlookbf2
where out is the fusion result and m is the fusion parameter with range [0, 1]; when m is small the high-brightness region is mainly adjusted, and when m is large the low-brightness region is mainly adjusted.
2. The single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas according to claim 1, characterised in that in step S1 the range of a is [1, 5] and the range of b is [1, 255].
3. The single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas according to claim 1, characterised in that the smaller interval of m is [0, 0.5) and the larger interval is [0.5, 1].
4. The single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas according to claim 1, characterised in that step S3 comprises the following sub-steps:
S31: detect the bright area: substitute the minRGB1 of step S11 into the following formula to obtain the high-brightness-area image o(x, y):
o(x, y) = 1/(1 + r(x, y)^(−z))
where r(x, y) is the value of minRGB1 at pixel coordinate (x, y) and z is a parameter; then traverse the high-brightness-area image o(x, y) and binarise each pixel:
o′(x, y) = 1 when L(x, y) ≥ M, and o′(x, y) = 0 when L(x, y) < M
where o′(x, y) is the binary image and M is the mean of the image minRGB1, obtained by summing all pixel values of the minRGB1 image and then dividing by its number of pixels;
S32: compute the area: sum all pixel values of the image o′(x, y); the result, denoted k′, is the area of the high-brightness region of the image; let the number of pixels of the minRGB1 image be k″; then the proportion t of the bright area in the minRGB1 image is
t = k′/k″;
S33: compute the mean: multiply the bright-area binary image o′(x, y) obtained in step S31 by the corresponding pixels of the minRGB1 image; the result is denoted o″(x, y), i.e.
o″(x, y) = minRGB1(x, y) · o′(x, y);
sum all pixel values of the image o″(x, y); the result is denoted k; the bright-area mean M′ of the minRGB1 image is then
M′ = k/k′;
S34: determine the fusion parameter m:
(1) when the bright-area mean M′ is greater than or equal to a first threshold P, m is chosen according to the detected bright area: when the proportion t of the bright area in the total image area exceeds Q%, the bright area is mainly adjusted and m takes the value X, with X greater than 0.5; otherwise m takes 0;
(2) when the bright-area mean M′ is less than the first threshold P, the dark area is mainly adjusted and m takes 1;
S35: intelligent fusion: substitute the fusion parameter m obtained in step S34, the lookbf1 obtained in step S1 and the Mlookbf2 obtained in step S2 into the following formula:
out = m · lookbf1 + (1 − m) · Mlookbf2
where out is the fusion result and m is the computed fusion parameter.
5. The single-frame wide-dynamic-range enhancement method with automatic identification of bright and dark areas according to claim 4, characterised in that the first threshold P takes the value 200, Q takes 40, X takes 0.6, and z takes 10.
6. the single frames width of a kind of automatic identification clear zone as described in any one in Claims 1 to 5 and dark space dynamically strengthens system, and it is special Levy and be: including:
Picture frame acquisition module: for each two field picture in settlement in video streaming;
Low brightness area adjusting module: for low brightness area being adjusted, including the capture vegetarian refreshments minima submodule being sequentially connected with A, negate submodule A, bilateral filtering submodule A and the submodule A that tables look-up;It is single that the described submodule A that tables look-up includes that table generates Unit A and table query unit A;
High-brightness region adjusting module: for being adjusted high-brightness region, negates submodule B, capture element including be sequentially connected with Point minima submodule B, negate submodule C, bilateral filtering submodule B, table look-up submodule B and negate submodule D;Institute The submodule B that tables look-up stated includes table signal generating unit B and table query unit B;
Intelligent Fusion module: for the result that low brightness area adjusting module produces is entered with the result that high-brightness region adjusting module produces Row Intelligent Fusion.
7. The single-frame wide-dynamic-range enhancement system with automatic identification of bright and dark areas according to claim 6, characterised in that the intelligent-fusion module comprises, connected in sequence, a brightness-detection submodule, an area-calculation submodule, a mean-calculation submodule, a fusion-parameter determination submodule and an intelligent-fusion submodule.
CN201610203548.5A 2016-04-01 2016-04-01 Single frame wide dynamic enhancement method and system capable of automatically identifying bright area and dark area Pending CN105872397A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610203548.5A CN105872397A (en) 2016-04-01 2016-04-01 Single frame wide dynamic enhancement method and system capable of automatically identifying bright area and dark area


Publications (1)

Publication Number Publication Date
CN105872397A true CN105872397A (en) 2016-08-17

Family

ID=56628074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610203548.5A Pending CN105872397A (en) 2016-04-01 2016-04-01 Single frame wide dynamic enhancement method and system capable of automatically identifying bright area and dark area

Country Status (1)

Country Link
CN (1) CN105872397A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109102481A (en) * 2018-07-11 2018-12-28 江苏安威士智能安防有限公司 automatic wide dynamic processing algorithm based on illumination analysis
CN116363017A (en) * 2023-05-26 2023-06-30 荣耀终端有限公司 Image processing method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101082992A (en) * 2007-07-06 2007-12-05 浙江大学 Drawing of real time high dynamic range image and display process
CN101340511A (en) * 2008-08-07 2009-01-07 中兴通讯股份有限公司 Adaptive video image enhancing method based on lightness detection
US20100103194A1 (en) * 2008-10-27 2010-04-29 Huawei Technologies Co., Ltd. Method and system for fusing images
CN101707666A (en) * 2009-11-26 2010-05-12 北京中星微电子有限公司 Adjusting method and device with high dynamic range
CN102075688A (en) * 2010-12-28 2011-05-25 青岛海信网络科技股份有限公司 Wide dynamic processing method for single-frame double-exposure image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101082992A (en) * 2007-07-06 2007-12-05 浙江大学 Drawing of real time high dynamic range image and display process
CN101340511A (en) * 2008-08-07 2009-01-07 中兴通讯股份有限公司 Adaptive video image enhancing method based on lightness detection
US20100103194A1 (en) * 2008-10-27 2010-04-29 Huawei Technologies Co., Ltd. Method and system for fusing images
CN101707666A (en) * 2009-11-26 2010-05-12 北京中星微电子有限公司 Adjusting method and device with high dynamic range
CN102075688A (en) * 2010-12-28 2011-05-25 青岛海信网络科技股份有限公司 Wide dynamic processing method for single-frame double-exposure image

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109102481A (en) * 2018-07-11 2018-12-28 江苏安威士智能安防有限公司 automatic wide dynamic processing algorithm based on illumination analysis
CN109102481B (en) * 2018-07-11 2021-09-28 江苏安威士智能安防有限公司 Automatic wide dynamic processing algorithm based on illumination analysis
CN116363017A (en) * 2023-05-26 2023-06-30 荣耀终端有限公司 Image processing method and device
CN116363017B (en) * 2023-05-26 2023-10-24 荣耀终端有限公司 Image processing method and device

Similar Documents

Publication Publication Date Title
JP4127411B1 (en) Image processing apparatus and method
CN104637064A (en) Defocus blurred image definition detecting method based on edge strength weight
CN103606132A (en) Multiframe digital image denoising method based on space domain and time domain combination filtering
US20150009355A1 (en) Motion adaptive cmos imaging system
CN101304489A (en) Automatic exposure method and apparatus
CN105100632A (en) Adjusting method and apparatus for automatic exposure of imaging device, and imaging device
CN103973957A (en) Binocular 3D camera automatic focusing system and method
CN106454014B (en) A kind of method and device improving backlight scene vehicle snapshot picture quality
CN105458490A (en) Real-time monitoring system and monitoring method for judging welding type of laser deep penetration welding by using high-speed camera
US20180053287A1 (en) Video Image Denoising and Enhancing Method and Device Based On Random Spray Retinex
US20150189155A1 (en) Focus control apparatus and method
CN107533756A (en) Image processing apparatus, camera device, image processing method and storage image processing unit image processing program storage medium
CN107392879B (en) A kind of low-light (level) monitoring image Enhancement Method based on reference frame
TW201351301A (en) Self-adaptive obstacle image detection method
CN105592258B (en) Auto focusing method and device
CN109040720B (en) A kind of method and device generating RGB image
CN105872397A (en) Single frame wide dynamic enhancement method and system capable of automatically identifying bright area and dark area
Yahiaoui et al. Optimization of ISP parameters for object detection algorithms
CN104243793A (en) Image-capturing device having image identification mechanism and method
CN102223545B (en) Rapid multi-view video color correction method
CN102256062A (en) Circuit and method for automatically detecting image flicker
US9013605B2 (en) Apparatus and method for processing intensity of image in digital camera
Jiang et al. Multiple templates auto exposure control based on luminance histogram for onboard camera
CN105704349A (en) Single frame width dynamic enhancing method based on individual adjustment of clear zone and dark zone
CN117036401A (en) Distribution network line inspection method and system based on target tracking

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20160817