CN113781317A - Brightness equalization method of panoramic all-around system - Google Patents
Brightness equalization method of panoramic all-around system
- Publication number
- CN113781317A (application CN202110880913.7A)
- Authority
- CN
- China
- Prior art keywords
- optimal
- camera
- cameras
- value
- view
- Prior art date
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a brightness equalization method for a panoramic surround-view system, which comprises the following steps: first, only the luminance Y component of each input view is processed; the overlapping regions of adjacent camera views are block-average down-sampled, inlier points are selected, and the resulting inlier sample index sets are used to calculate the optimal additive gains; then brightness-equalization preprocessing is performed using the optimal additive gains; finally, inlier samples are reselected from the preprocessed images, optimal piecewise-linear mapping functions are calculated, and brightness-equalization post-processing is performed. The method combines an additive gain model with optimal piecewise-linear mapping functions and processes only the luminance component of the input views, which reduces the computational complexity and improves the image stitching and fusion effect.
Description
Technical Field
The invention relates to the technical field of digital image processing, and in particular to a brightness equalization method for a panoramic surround-view system.
Background
In recent years, with the growing number of automobiles, parking spaces have become scarce and narrow. When a driver reverses into a roadside or parallel parking space, the blind zones of the left and right rear-view mirrors create obstacles to parking. A panoramic surround-view system displays a bird's-eye view of the vehicle's surroundings, provides the driver with road-surface information around the vehicle body, assists driving judgment, and improves vehicle safety during parking and in complex environments.
Because the illumination conditions of the different cameras in a panoramic surround-view system differ, and the Automatic Exposure (AE) and Automatic White Balance (AWB) parameters of each camera also differ, the synthesized surround view shows obvious seams between adjacent views, which degrades the visual effect and in turn affects the driver's judgment of the road conditions.
Disclosure of Invention
The object of the invention is to overcome the above drawbacks of the prior art by providing a brightness equalization method for a panoramic surround-view system that reduces the computational complexity of the brightness equalization algorithm and facilitates real-time implementation on embedded systems.
The object of the invention is achieved by the following technical solution:
a brightness equalization method for a panoramic surround view system, said brightness equalization method comprising the steps of:
S1, image format conversion: check the format of the input surround-view bird's-eye views of the P cameras of the panoramic surround-view system. If the input views are not in YUV format, perform a color-space conversion on the P views to convert them to YUV format, obtaining for each camera the input view I_m and its corresponding Y, U and V components Y_m, U_m and V_m, where m is the camera index, m = 0, 1, ..., P-1. If the input surround-view bird's-eye views are already in YUV format, the input view I_m and its corresponding Y, U and V components are obtained directly;
S2, block-average down-sampling: considering only the luminance Y component, denote by Ω_mn the overlapping region of the input view Y_m of camera m with the input view Y_n of the adjacent camera n, where n is the index of the next camera in the clockwise direction, n ≡ (m+1) mod P; the symbol "mod" denotes the modulo operation and "≡" denotes congruence under the modulo operation. Likewise, denote by Ω_nm the overlapping region of the input view Y_n of camera n with the input view Y_m of the adjacent camera m. Partition each pair of adjacent-camera overlapping-region images Ω_mn and Ω_nm into sub-blocks of N×N pixels, and take the mean of each sub-block as the output of that sub-block, obtaining down-sampled images Φ_mn and Φ_nm composed of the sub-block means;
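The block-average down-sampling of step S2 maps directly onto array reshaping; the following NumPy sketch is illustrative only (the function name, the cropping of ragged borders, and the assumption that the overlap region is supplied as a 2-D luminance array are choices made here, not taken from the patent).

```python
import numpy as np

def block_average_downsample(overlap_y: np.ndarray, n: int) -> np.ndarray:
    """Down-sample an overlap-region luminance image by averaging N x N blocks.

    overlap_y : 2-D array holding the Y component of one overlap region.
    n         : sub-block size in pixels (N in the text; N = 4 in the embodiment).
    """
    h, w = overlap_y.shape
    h, w = h - h % n, w - w % n               # drop any ragged border (a choice of this sketch)
    blocks = overlap_y[:h, :w].reshape(h // n, n, w // n, n)
    return blocks.mean(axis=(1, 3))           # one mean per N x N sub-block

# phi_mn = block_average_downsample(omega_mn, 4)
# phi_nm = block_average_downsample(omega_nm, 4)
```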
S3, sample selection: select the inlier points with small differences to obtain a sample index set. Based on the down-sampled images Φ_mn and Φ_nm of each pair of adjacent-camera overlapping regions, select the coordinates of the inlier points with small differences to form the sample index set S_mn:
S_mn = {[i, j] | (Φ_mn[i, j] - Φ_nm[i, j])² < Th, [i, j] ∈ [1, N_mn] × [1, M_mn]}
where i and j denote the abscissa and ordinate of the image, [i, j] denotes an image coordinate, Φ_mn[i, j] and Φ_nm[i, j] are the pixel gray values of the down-sampled images Φ_mn and Φ_nm at coordinate [i, j], Th is the inlier threshold, and N_mn and M_mn are the numbers of columns and rows of the down-sampled images Φ_mn and Φ_nm, respectively;
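The inlier selection of step S3 is a thresholded comparison of the two down-sampled overlap images; a minimal sketch under the same assumptions as above (the returned coordinate list plays the role of S_mn):

```python
import numpy as np

def select_inliers(phi_mn: np.ndarray, phi_nm: np.ndarray, th: float) -> np.ndarray:
    """Return the coordinates [i, j] at which the squared difference between the
    two down-sampled overlap images is below the threshold Th (the set S_mn)."""
    mask = (phi_mn.astype(np.float64) - phi_nm.astype(np.float64)) ** 2 < th
    return np.argwhere(mask)                  # each row is one inlier coordinate [i, j]

# s_mn = select_inliers(phi_mn, phi_nm, th=500.0)   # Th = 500 in the embodiment
```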
S4, calculate the optimal additive gains g_0*, g_1*, ..., g_(P-1)* of the P cameras such that, after adjustment, the mean square error between the corresponding inlier pixel gray values of the down-sampled images Φ_mn and Φ_nm of each adjacent-camera overlapping region is minimized, i.e. the gains satisfy:
(g_0*, ..., g_(P-1)*) = argmin over (g_0, ..., g_(P-1)) of Σ_{m=0}^{P-1} Σ_{[i,j]∈S_mn} ((Φ_mn[i, j] + g_m) - (Φ_nm[i, j] + g_n))², with n ≡ (m+1) mod P
where g_m is the additive gain of the m-th camera and g_m* is the optimal additive gain of the m-th camera;
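The patent's embodiment later derives the gains through a reference view of medium brightness (steps S401-S402 below); purely as an illustration of the same least-squares criterion, the gains can also be obtained numerically by solving the ring of pairwise constraints with one camera's gain fixed to zero. The sketch below makes two simplifying assumptions not taken from the patent: each overlap pair is weighted equally, and camera `ref` serves as the zero-gain reference.

```python
import numpy as np

def solve_additive_gains(pair_means: np.ndarray, ref: int = 0) -> np.ndarray:
    """Least-squares additive gains for P cameras arranged in a ring.

    pair_means[m] = (A_mn, A_nm): inlier sample means of the down-sampled
    overlap images of cameras m and n = (m + 1) mod P.  The gain of camera
    `ref` is fixed to 0 to remove the global offset ambiguity (an assumption
    of this sketch)."""
    p = len(pair_means)
    rows, rhs = [], []
    for m in range(p):
        n = (m + 1) % p
        r = np.zeros(p)
        r[m], r[n] = 1.0, -1.0                 # (A_mn + g_m) - (A_nm + g_n) ≈ 0
        rows.append(r)
        rhs.append(pair_means[m, 1] - pair_means[m, 0])
    rows.append(np.eye(p)[ref])                # constraint: g_ref = 0
    rhs.append(0.0)
    g, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return g
```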
S5, brightness-equalization preprocessing with the additive gain model: add the optimal additive gain to the Y component of each camera's input view, i.e.
Y_m^pre[i, j] = Y_m[i, j] + g_m*
where Y_m^pre is the output view obtained after the Y component of the m-th input view is preprocessed with the optimal additive gain;
S6, correct the down-sampled images: add the corresponding optimal additive gains to each down-sampled image Φ_mn and Φ_nm, i.e.
Φ'_mn[i, j] = Φ_mn[i, j] + g_m*,  Φ'_nm[i, j] = Φ_nm[i, j] + g_n*
where Φ'_mn and Φ'_nm are the down-sampled images Φ_mn and Φ_nm corrected by the optimal additive gains;
S7, update the sample index set: using the method of step S3, obtain again the sample index set S'_mn of the inliers of the corrected down-sampled images Φ'_mn and Φ'_nm of each adjacent-camera overlapping region, which satisfies:
S'_mn = {[i, j] | (Φ'_mn[i, j] - Φ'_nm[i, j])² < Th, [i, j] ∈ [1, N_mn] × [1, M_mn]}
where Φ'_mn[i, j] and Φ'_nm[i, j] are the pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_nm at coordinate [i, j];
S8, compute the optimal mapping curve functions. The mapping curve function y = T_m(x) of the m-th camera maps an input pixel value x in [0, 255] to an output value y in [0, 255]. The optimal mapping curve functions T_m*( ) of the P cameras are obtained by minimizing, over the inlier sample index sets S'_mn of all adjacent-camera overlapping regions, the squared differences between the mapped values T_m(Φ'_mn[i, j]) and T_n(Φ'_nm[i, j]), together with a penalty term weighted by β, where T_m( ) is the mapping curve function of the m-th camera, T_m*( ) is the corresponding optimal mapping curve function, and β is a penalty factor that constrains how far the output values of the mapping curve function may deviate from its input values;
S9, brightness-equalization post-processing with the optimal mapping curve functions: apply the optimal mapping curve functions to the optimal-additive-gain preprocessed views Y_m^pre of the P cameras to obtain the Y components Y_m^out of the output surround-view bird's-eye views:
Y_m^out[i, j] = T_m*(Y_m^pre[i, j])
Finally, the Y, U and V components of the brightness-equalized output surround-view bird's-eye views of the P cameras are obtained as Y_m^out together with the original chrominance components U_m and V_m.
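Since the mapping curve functions act on 8-bit luminance values, an implementation detail assumed here (not spelled out in the patent) is to tabulate each optimal T_m* as a 256-entry lookup table once per frame and apply it to the preprocessed Y component; building such a table from the anchor points is shown after step S801 below.

```python
import numpy as np

def apply_mapping_lut(y_pre: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply a tabulated mapping curve T_m (a 256-entry lookup table) to the
    gain-preprocessed luminance image of camera m."""
    # The additive gain may push values slightly outside [0, 255]; clipping here
    # is an assumption of this sketch, not a step stated in the patent.
    y_idx = np.clip(np.rint(y_pre), 0, 255).astype(np.uint8)
    return lut[y_idx]
```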
Further, after the Y component Y_m^out of the output surround-view bird's-eye view is obtained through the brightness-equalization post-processing with the optimal mapping curve functions in step S9, the brightness equalization method further comprises the following step:
S10, restore the image format: if the input surround view is not in YUV format, perform a color-space conversion on the brightness-equalized output surround view to convert it back to the original input format.
Further, the optimal mapping curve function calculation process in step S8 is as follows:
S801, the mapping curve function of the m-th camera is represented by a piecewise-linear mapping function defined by a set of anchor points. There are d+1 anchor points in total, the coordinate of the k-th anchor point is (x_{m,k}, y_{m,k}), and k is the anchor index, k = 0, 1, ..., d. The mapping curve function T_m( ) of the m-th camera can then be expressed by linear interpolation between adjacent anchor points:
T_m(x) = y_{m,k} + (y_{m,k+1} - y_{m,k}) · (x - x_{m,k}) / (x_{m,k+1} - x_{m,k}),  for x ∈ [x_{m,k}, x_{m,k+1}]
where x is the input pixel value and T_m(x) is the mapped output pixel value of the mapping curve function;
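A piecewise-linear function defined by sorted anchor points is exactly what NumPy's np.interp evaluates; the sketch below also shows how the 256-entry lookup table used in the post-processing sketch of step S9 could be built. The anchor values are placeholders, not values from the patent.

```python
import numpy as np

def piecewise_linear_map(x, anchors_x, anchors_y):
    """Evaluate T_m(x) by linear interpolation between the (d + 1) anchor
    points (x_{m,k}, y_{m,k}); anchors_x must be sorted in increasing order."""
    return np.interp(x, anchors_x, anchors_y)

# Example with d = 5 (six anchors; the coordinates are purely illustrative):
anchors_x = np.array([0.0, 40.0, 80.0, 120.0, 180.0, 255.0])
anchors_y = np.array([0.0, 35.0, 78.0, 125.0, 190.0, 255.0])
lut = piecewise_linear_map(np.arange(256), anchors_x, anchors_y)  # table for step S9
```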
S802, fix the 0-th anchor point and the d-th anchor point of the optimal piecewise-linear mapping function of the m-th camera. Based on the distribution range of the inlier pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_ml, determine the minimum and maximum pixel gray values to obtain the abscissa range of the remaining anchor points, where l is the index of the previous camera in the clockwise direction, l ≡ (m-1) mod P, Φ'_mn and Φ'_ml are the corrected down-sampled images corresponding to the two overlapping regions of the m-th camera in the clockwise direction, and the bounds of the range are obtained from the minimum and maximum inlier pixel gray values of Φ'_mn and Φ'_ml using the minimum-taking operation min( ) and the maximum-taking operation max( ). The remaining anchor abscissas x_{m,k}, k = 1, ..., d-1, are uniformly distributed within this abscissa range;
S803, compute the sample index sets S'_{mn,k} and S'_{ml,k}. The ordinate y_{m,k} of the k-th anchor point of the m-th camera is related only to the inliers of the corrected down-sampled images Φ'_mn and Φ'_ml whose pixel gray values lie in the interval [x_{m,k-1}, x_{m,k+1}]; the coordinates of the inliers of Φ'_mn and Φ'_ml whose pixel gray values lie in this interval therefore form the sample index sets S'_{mn,k} and S'_{ml,k},
where Φ'_mn[i, j] and Φ'_ml[i, j] are the pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_ml at coordinate [i, j], S'_{mn,k} and S'_{ml,k} are the corresponding inlier sample index sets, x_{m,k-1} is the abscissa of the (k-1)-th anchor point of the m-th camera, and x_{m,k+1} is the abscissa of the (k+1)-th anchor point;
S804, iteratively solve, in sequence, the d-1 interior anchor ordinates y_{m,1}, ..., y_{m,d-1} of the optimal piecewise-linear mapping function of the m-th camera, obtaining the optimal piecewise-linear mapping function of the m-th camera;
S805, repeat step S804 until, for all anchor ordinates of all cameras, the absolute value of the change between two successive iterations is smaller than a set threshold T_anchor; then stop the iteration and output the optimal piecewise-linear mapping functions determined by the anchor points as the optimal mapping curve functions.
Further, in step S804, the d-1 interior anchor ordinates y_{m,1}, ..., y_{m,d-1} of the optimal piecewise-linear mapping function of the m-th camera are solved iteratively in sequence to obtain the optimal piecewise-linear mapping function of the m-th camera, as follows:
S80401, with the ordinates of the other anchor points held fixed, update the ordinates y_{m,k} of the odd-indexed anchor points of the optimal piecewise-linear mapping function of the m-th camera,
where A, B and C are intermediate calculation variables computed from the corrected down-sampled images of the overlapping regions and the mapping curve functions of the adjacent cameras: Φ'_mn and Φ'_nm are the corrected down-sampled images of the overlapping region of cameras m and n, Φ'_lm and Φ'_ml are the corrected down-sampled images of the overlapping region of cameras l and m, T_n( ) and T_l( ) are the mapping curve functions of cameras n and l, respectively, and F_mk( ) and Y'_mk( ) are auxiliary piecewise functions;
S80402, with the ordinates of the other anchor points held fixed, update in the same way the ordinates y_{m,k} of the even-indexed anchor points of the optimal piecewise-linear mapping function of the m-th camera.
Compared with the prior art, the invention has the following advantages and effects:
1. In the prior art, all color-component channels of the surround-view bird's-eye view of the panoramic surround-view system must be processed; the invention performs brightness equalization only on the luminance Y component, so the amount of computation is about 1/3 of that of the prior art, reducing the computational complexity of the algorithm.
2. The method combines the additive gain model with the optimal piecewise-linear mapping functions: the additive gain model is used for preprocessing and the optimal piecewise-linear mapping functions are used for post-processing, which further improves the brightness equalization effect and reduces the brightness difference in the overlapping regions of the different camera views of the panoramic surround-view system.
Drawings
FIG. 1 is a flow chart illustrating the steps of a brightness equalization method for a panoramic surround view system;
FIG. 2 is a block average downsampling schematic diagram disclosed in the present invention;
FIG. 3 is a schematic diagram of a linear piecewise mapping function disclosed in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
This embodiment discloses a brightness equalization method for a panoramic surround-view system which, as shown in FIG. 1, comprises the following steps:
S1, image format conversion: check the format of the input surround-view bird's-eye views of the P cameras of the panoramic surround-view system. If the input views are not in YUV format, perform a color-space conversion on the P views to convert them to YUV format, obtaining for each camera the input view I_m and its corresponding Y, U and V components Y_m, U_m and V_m, where m is the camera index, m = 0, 1, ..., P-1. If the input surround-view bird's-eye views are already in YUV format, the input view I_m and its corresponding Y, U and V components are obtained directly. In this embodiment, P = 4.
S2, block-average down-sampling: as shown in FIG. 2, considering only the luminance Y component, denote by Ω_mn the overlapping region of the input view Y_m of camera m with the input view Y_n of the adjacent camera n, where m is the camera index, n is the index of the next camera in the clockwise direction, n ≡ (m+1) mod P, the symbol "mod" denotes the modulo operation and "≡" denotes congruence under the modulo operation. Likewise, denote by Ω_nm the overlapping region of the input view Y_n of camera n with the input view Y_m of the adjacent camera m. Partition each pair of adjacent-camera overlapping-region images Ω_mn and Ω_nm into sub-blocks of N×N pixels, and take the mean of each sub-block as the output of that sub-block, obtaining down-sampled images Φ_mn and Φ_nm composed of the sub-block means. In this embodiment, N = 4.
S3, sample selection: select the inlier points with small differences to obtain a sample index set. Based on the down-sampled images Φ_mn and Φ_nm of each pair of adjacent-camera overlapping regions, select the coordinates of the inlier points with small differences to form the sample index set S_mn:
S_mn = {[i, j] | (Φ_mn[i, j] - Φ_nm[i, j])² < Th, [i, j] ∈ [1, N_mn] × [1, M_mn]}
where i and j denote the abscissa and ordinate of the image, [i, j] denotes an image coordinate, Φ_mn[i, j] and Φ_nm[i, j] are the pixel gray values of the down-sampled images Φ_mn and Φ_nm at coordinate [i, j], Th is the inlier threshold, and N_mn and M_mn are the numbers of columns and rows of the down-sampled images Φ_mn and Φ_nm, respectively. In this embodiment, Th = 500.
S4, calculate the optimal additive gains g_0*, g_1*, ..., g_(P-1)* of the P cameras such that, after adjustment, the mean square error between the corresponding inlier pixel gray values of the down-sampled images Φ_mn and Φ_nm of each adjacent-camera overlapping region is minimized, i.e. the gains satisfy:
(g_0*, ..., g_(P-1)*) = argmin over (g_0, ..., g_(P-1)) of Σ_{m=0}^{P-1} Σ_{[i,j]∈S_mn} ((Φ_mn[i, j] + g_m) - (Φ_nm[i, j] + g_n))², with n ≡ (m+1) mod P
where g_m is the additive gain of the m-th camera, g_m* is the corresponding optimal additive gain, and S_mn is the inlier sample index set of the down-sampled images Φ_mn and Φ_nm of the adjacent-camera overlapping region;
S401, select the camera input view with medium brightness as the reference view; the index ref of the reference view is calculated as follows:
S40101, compute the inlier sample means A_mn and A_nm of the down-sampled images Φ_mn and Φ_nm:
A_mn = (1/ρ_mn) Σ_{[i,j]∈S_mn} Φ_mn[i, j],  A_nm = (1/ρ_mn) Σ_{[i,j]∈S_mn} Φ_nm[i, j]
where Φ_mn[i, j] and Φ_nm[i, j] are the pixel gray values of the down-sampled images Φ_mn and Φ_nm at coordinate [i, j], and ρ_mn is the number of indices in the sample index set S_mn;
S40102, compute the overlap-region luminance mean A_m of the Y component Y_m of the m-th camera's input view:
A_m = A_mn + A_ml
where n ≡ (m+1) mod P, l is the index of the previous camera in the clockwise direction, l ≡ (m-1) mod P, and A_mn and A_ml are the inlier sample means of the down-sampled images Φ_mn and Φ_ml corresponding to the two overlapping regions of the m-th camera in the clockwise direction;
S40104, compute the camera index ref of the reference view from the overlap-region luminance means A_m, and take the ref-th view as the reference view;
S402, compute the inlier sample mean difference d_m of the down-sampled images Φ_mn and Φ_nm: d_m = A_mn - A_nm,
where ref is the camera index of the reference view and d_m is the m-th sample mean difference;
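A sketch of steps S401-S402 under stated assumptions: the formula that selects ref is not reproduced in the text above, so taking the camera whose overlap-brightness mean A_m is the median of all A_m is an assumed reading of "medium brightness"; the array layout is likewise a choice made here.

```python
import numpy as np

def reference_view_and_mean_diffs(a_pair: np.ndarray) -> tuple[int, np.ndarray]:
    """a_pair[m] = (A_mn, A_nm): inlier sample means of the down-sampled overlap
    images of cameras m and n = (m + 1) mod P.  Returns the reference camera
    index ref and the per-pair mean differences d_m = A_mn - A_nm."""
    p = len(a_pair)
    # A_m = A_mn + A_ml, where l = (m - 1) mod P is the previous camera
    a_m = np.array([a_pair[m, 0] + a_pair[(m - 1) % p, 1] for m in range(p)])
    ref = int(np.argsort(a_m)[p // 2])   # medium-brightness camera (assumed reading)
    d = a_pair[:, 0] - a_pair[:, 1]      # d_m = A_mn - A_nm
    return ref, d
```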
S5, brightness-equalization preprocessing with the additive gain model: add the optimal additive gain to the Y component of each camera's input view, i.e.
Y_m^pre[i, j] = Y_m[i, j] + g_m*
where Y_m^pre is the output view obtained after the Y component of the m-th input view is preprocessed with the optimal additive gain, and g_m* is the optimal additive gain of the m-th camera;
S6, correct the down-sampled images: add the corresponding optimal additive gains to each down-sampled image Φ_mn and Φ_nm, i.e.
Φ'_mn[i, j] = Φ_mn[i, j] + g_m*,  Φ'_nm[i, j] = Φ_nm[i, j] + g_n*
where Φ'_mn and Φ'_nm are the down-sampled images Φ_mn and Φ_nm corrected by the optimal additive gains;
S7, update the sample index set: using the method described in step S3, obtain again the sample index set S'_mn of the inliers of the corrected down-sampled images Φ'_mn and Φ'_nm, which satisfies:
S'_mn = {[i, j] | (Φ'_mn[i, j] - Φ'_nm[i, j])² < Th, [i, j] ∈ [1, N_mn] × [1, M_mn]}
where Φ'_mn[i, j] and Φ'_nm[i, j] are the pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_nm at coordinate [i, j];
S8, compute the optimal mapping curve functions. The mapping curve function y = T_m(x) of the m-th camera maps an input pixel value x in [0, 255] to an output value y in [0, 255]. The optimal mapping curve functions T_m*( ) of the P cameras are obtained by minimizing, over the inlier sample index sets S'_mn of all adjacent-camera overlapping regions, the squared differences between the mapped values T_m(Φ'_mn[i, j]) and T_n(Φ'_nm[i, j]), together with a penalty term weighted by β,
where T_m( ) is the mapping curve function of the m-th camera, T_m*( ) is the corresponding optimal mapping curve function, n is the index of the next camera in the clockwise direction, β is a penalty factor that constrains how far the output values of the mapping curve function may deviate from its input values, and Φ'_mn and Φ'_nm are the corrected down-sampled images of an adjacent-camera overlapping region;
The calculation flow of the optimal mapping curve functions y = T_m*(x) of the P cameras is as follows:
S801, the mapping curve function of the m-th camera is represented by a piecewise-linear mapping function which, as shown in FIG. 3, is defined by a set of anchor points. There are d+1 anchor points in total (in this embodiment, d = 5), the coordinate of the k-th anchor point is (x_{m,k}, y_{m,k}), and k is the anchor index, k = 0, 1, ..., d. The mapping curve function T_m( ) of the m-th camera can then be expressed by linear interpolation between adjacent anchor points:
T_m(x) = y_{m,k} + (y_{m,k+1} - y_{m,k}) · (x - x_{m,k}) / (x_{m,k+1} - x_{m,k}),  for x ∈ [x_{m,k}, x_{m,k+1}]
where m is the camera index, x is the input pixel value and T_m(x) is the mapped output pixel value of the mapping curve function;
S802, fix the 0-th anchor point and the d-th anchor point of the optimal piecewise-linear mapping function of the m-th camera. Based on the distribution range of the inlier pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_ml, determine the minimum and maximum pixel gray values to obtain the abscissa range of the remaining anchor points, where m is the camera index, n is the index of the next camera in the clockwise direction, n ≡ (m+1) mod P, l is the index of the previous camera in the clockwise direction, l ≡ (m-1) mod P, Φ'_mn and Φ'_ml are the corrected down-sampled images corresponding to the two overlapping regions of the m-th camera in the clockwise direction, and the bounds of the range are obtained from the minimum and maximum inlier pixel gray values of Φ'_mn and Φ'_ml using the minimum-taking operation min( ) and the maximum-taking operation max( ). The remaining anchor abscissas x_{m,k}, k = 1, ..., d-1, are uniformly distributed within this abscissa range;
S803, compute the sample index sets S'_{mn,k} and S'_{ml,k}. The ordinate y_{m,k} of the k-th anchor point of the m-th camera is related only to the inliers of the corrected down-sampled images Φ'_mn and Φ'_ml whose pixel gray values lie in the interval [x_{m,k-1}, x_{m,k+1}]; the coordinates of the inliers of Φ'_mn and Φ'_ml whose pixel gray values lie in this interval therefore form the sample index sets S'_{mn,k} and S'_{ml,k},
where k is the anchor index, k = 1, 2, ..., d-1, Φ'_mn[i, j] and Φ'_ml[i, j] are the pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_ml at coordinate [i, j], S'_{mn,k} and S'_{ml,k} are the corresponding inlier sample index sets, x_{m,k-1} is the abscissa of the (k-1)-th anchor point of the m-th camera, and x_{m,k+1} is the abscissa of the (k+1)-th anchor point;
S804, solve iteratively, camera by camera in sequence (m = 0, 1, ..., P-1), for the optimal piecewise-linear mapping function of the m-th camera.
The d-1 interior anchor ordinates y_{m,1}, ..., y_{m,d-1} of the optimal piecewise-linear mapping function of the m-th camera are solved iteratively as follows:
S80401, with the ordinates of the other anchor points held fixed, update the ordinates y_{m,k} of the odd-indexed anchor points of the optimal piecewise-linear mapping function of the m-th camera,
where A, B and C are intermediate calculation variables computed from the corrected down-sampled images of the overlapping regions and the mapping curve functions of the adjacent cameras: Φ'_mn and Φ'_nm are the corrected down-sampled images of the overlapping region of cameras m and n, Φ'_lm and Φ'_ml are the corrected down-sampled images of the overlapping region of cameras l and m, T_n( ) and T_l( ) are the mapping curve functions of cameras n and l, respectively, and F_mk( ) and Y'_mk( ) are auxiliary piecewise functions;
S80402, with the ordinates of the other anchor points held fixed, update in the same way the ordinates y_{m,k} of the even-indexed anchor points of the optimal piecewise-linear mapping function of the m-th camera;
S805, repeat step S804 until, for all anchor ordinates of all cameras, the absolute value of the change between two successive iterations is smaller than a set threshold T_anchor; then stop the iteration and output the optimal piecewise-linear mapping functions determined by the anchor points as the optimal mapping curve functions. In this embodiment, T_anchor = 1.0e-6;
S9, brightness-equalization post-processing with the optimal mapping curve functions: apply the optimal mapping curve functions to the optimal-additive-gain preprocessed views Y_m^pre of the P cameras to obtain the Y components Y_m^out of the output surround-view bird's-eye views:
Y_m^out[i, j] = T_m*(Y_m^pre[i, j])
S10, restore the image format: the Y, U and V components of the brightness-equalized output surround-view bird's-eye views of the P cameras are Y_m^out, U_m and V_m, respectively. If the input surround-view bird's-eye views were not in YUV format, perform a color-space conversion on the brightness-equalized output views to convert them back to the original input format.
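The format conversions of steps S1 and S10 are standard color-space transforms; the sketch below uses full-range BT.601 coefficients, which is an assumption, since the patent does not specify which RGB/YUV variant the cameras deliver.

```python
import numpy as np

# Full-range BT.601 RGB -> YUV (U and V offset by 128); an assumed convention.
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.169, -0.331,  0.500],
                    [ 0.500, -0.419, -0.081]])

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Step S1 sketch: convert an H x W x 3 RGB image to YUV."""
    yuv = rgb.astype(np.float64) @ RGB2YUV.T
    yuv[..., 1:] += 128.0
    return yuv

def yuv_to_rgb(yuv: np.ndarray) -> np.ndarray:
    """Step S10 sketch: convert the equalized YUV image back to RGB."""
    tmp = yuv.astype(np.float64).copy()
    tmp[..., 1:] -= 128.0
    rgb = tmp @ np.linalg.inv(RGB2YUV).T
    return np.clip(rgb, 0, 255).astype(np.uint8)
```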
In summary, because the illumination conditions of the different cameras in a panoramic surround-view system differ, and the Automatic Exposure (AE) and Automatic White Balance (AWB) parameters of each camera also differ, the synthesized surround view shows obvious seams between adjacent views, which degrades the visual effect. To address this problem of the prior art, this embodiment discloses a brightness equalization method for a panoramic surround-view system that processes only the luminance Y component of the input views: the overlapping regions of adjacent camera views are block-average down-sampled, inlier points are selected, and the resulting inlier sample index sets are used to calculate the optimal additive gains; brightness-equalization preprocessing is then performed using the optimal additive gains; finally, inlier samples are reselected from the preprocessed images, optimal piecewise-linear mapping functions are calculated, and brightness-equalization post-processing is performed. Whereas the prior art applies an additive gain model (with an explicit mathematical solution) to all color-component channels, the invention processes only the luminance component of the input views, so the amount of computation is about 1/3 of that of the prior art and the computational complexity is reduced.
The above embodiment is a preferred embodiment of the present invention, but the present invention is not limited thereto; any change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be regarded as an equivalent replacement and falls within the protection scope of the present invention.
Claims (4)
1. A brightness equalization method for a panoramic surround view system, said brightness equalization method comprising the steps of:
S1, image format conversion: convert the input panoramic surround-view bird's-eye views of the P cameras of the panoramic surround-view system into YUV format to obtain for each camera the input view I_m and its corresponding Y, U and V components Y_m, U_m and V_m, where m is the camera index, m = 0, 1, ..., P-1;
S2, block-average down-sampling: considering only the luminance Y component, denote by Ω_mn the overlapping region of the input view Y_m of camera m with the input view Y_n of the adjacent camera n, where n is the index of the next camera in the clockwise direction, n ≡ (m+1) mod P; the symbol "mod" denotes the modulo operation and "≡" denotes congruence under the modulo operation. Likewise, denote by Ω_nm the overlapping region of the input view Y_n of camera n with the input view Y_m of the adjacent camera m. Partition each pair of adjacent-camera overlapping-region images Ω_mn and Ω_nm into sub-blocks of N×N pixels, and take the mean of each sub-block as the output of that sub-block, obtaining down-sampled images Φ_mn and Φ_nm composed of the sub-block means;
S3, sample selection: select the inlier points with small differences to obtain a sample index set. Based on the down-sampled images Φ_mn and Φ_nm of each pair of adjacent-camera overlapping regions, select the coordinates of the inlier points with small differences to form the sample index set S_mn:
S_mn = {[i, j] | (Φ_mn[i, j] - Φ_nm[i, j])² < Th, [i, j] ∈ [1, N_mn] × [1, M_mn]}
where i and j denote the abscissa and ordinate of the image, [i, j] denotes an image coordinate, Φ_mn[i, j] and Φ_nm[i, j] are the pixel gray values of the down-sampled images Φ_mn and Φ_nm at coordinate [i, j], Th is the inlier threshold, and N_mn and M_mn are the numbers of columns and rows of the down-sampled images Φ_mn and Φ_nm, respectively;
S4, calculate the optimal additive gains g_0*, g_1*, ..., g_(P-1)* of the P cameras such that, after adjustment, the mean square error between the corresponding inlier pixel gray values of the down-sampled images Φ_mn and Φ_nm of each adjacent-camera overlapping region is minimized, i.e. the gains satisfy:
(g_0*, ..., g_(P-1)*) = argmin over (g_0, ..., g_(P-1)) of Σ_{m=0}^{P-1} Σ_{[i,j]∈S_mn} ((Φ_mn[i, j] + g_m) - (Φ_nm[i, j] + g_n))², with n ≡ (m+1) mod P
where g_m is the additive gain of the m-th camera and g_m* is the optimal additive gain of the m-th camera;
S5, brightness-equalization preprocessing with the additive gain model: add the optimal additive gain to the Y component of each camera's input view, i.e.
Y_m^pre[i, j] = Y_m[i, j] + g_m*
where Y_m^pre is the output view obtained after the Y component of the m-th input view is preprocessed with the optimal additive gain;
S6, correct the down-sampled images: add the corresponding optimal additive gains to each down-sampled image Φ_mn and Φ_nm, i.e.
Φ'_mn[i, j] = Φ_mn[i, j] + g_m*,  Φ'_nm[i, j] = Φ_nm[i, j] + g_n*
where Φ'_mn and Φ'_nm are the down-sampled images Φ_mn and Φ_nm corrected by the optimal additive gains;
S7, update the sample index set: using the method of step S3, obtain again the sample index set S'_mn of the inliers of the corrected down-sampled images Φ'_mn and Φ'_nm of each adjacent-camera overlapping region, which satisfies:
S'_mn = {[i, j] | (Φ'_mn[i, j] - Φ'_nm[i, j])² < Th, [i, j] ∈ [1, N_mn] × [1, M_mn]}
where Φ'_mn[i, j] and Φ'_nm[i, j] are the pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_nm at coordinate [i, j];
S8, compute the optimal mapping curve functions. The mapping curve function y = T_m(x) of the m-th camera maps an input pixel value x in [0, 255] to an output value y in [0, 255]. The optimal mapping curve functions T_m*( ) of the P cameras are obtained by minimizing, over the inlier sample index sets S'_mn of all adjacent-camera overlapping regions, the squared differences between the mapped values T_m(Φ'_mn[i, j]) and T_n(Φ'_nm[i, j]), together with a penalty term weighted by β, where T_m( ) is the mapping curve function of the m-th camera, T_m*( ) is the corresponding optimal mapping curve function, and β is a penalty factor that constrains how far the output values of the mapping curve function may deviate from its input values;
S9, brightness-equalization post-processing with the optimal mapping curve functions: apply the optimal mapping curve functions to the optimal-additive-gain preprocessed views Y_m^pre of the P cameras to obtain the Y components Y_m^out of the output surround-view bird's-eye views:
Y_m^out[i, j] = T_m*(Y_m^pre[i, j]).
2. The brightness equalization method of a panoramic surround-view system according to claim 1, wherein after the Y component Y_m^out of the output surround-view bird's-eye view is obtained through the brightness-equalization post-processing with the optimal mapping curve functions in step S9, the method further comprises the following step:
S10, restore the image format: if the input surround view is not in YUV format, perform a color-space conversion on the brightness-equalized output surround view to convert it back to the original input format.
3. The brightness equalization method of a panoramic surround-view system according to claim 1, wherein the optimal mapping curve functions in step S8 are calculated as follows:
S801, the mapping curve function of the m-th camera is represented by a piecewise-linear mapping function defined by a set of anchor points. There are d+1 anchor points in total, the coordinate of the k-th anchor point is (x_{m,k}, y_{m,k}), and k is the anchor index, k = 0, 1, ..., d. The mapping curve function T_m( ) of the m-th camera can then be expressed by linear interpolation between adjacent anchor points:
T_m(x) = y_{m,k} + (y_{m,k+1} - y_{m,k}) · (x - x_{m,k}) / (x_{m,k+1} - x_{m,k}),  for x ∈ [x_{m,k}, x_{m,k+1}]
where x is the input pixel value and T_m(x) is the mapped output pixel value of the mapping curve function;
S802, fix the 0-th anchor point and the d-th anchor point of the optimal piecewise-linear mapping function of the m-th camera. Based on the distribution range of the inlier pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_ml, determine the minimum and maximum pixel gray values to obtain the abscissa range of the remaining anchor points, where l is the index of the previous camera in the clockwise direction, l ≡ (m-1) mod P, Φ'_mn and Φ'_ml are the corrected down-sampled images corresponding to the two overlapping regions of the m-th camera in the clockwise direction, and the bounds of the range are obtained from the minimum and maximum inlier pixel gray values of Φ'_mn and Φ'_ml using the minimum-taking operation min( ) and the maximum-taking operation max( ). The remaining anchor abscissas x_{m,k}, k = 1, ..., d-1, are uniformly distributed within this abscissa range;
S803, compute the sample index sets S'_{mn,k} and S'_{ml,k}. The ordinate y_{m,k} of the k-th anchor point of the m-th camera is related only to the inliers of the corrected down-sampled images Φ'_mn and Φ'_ml whose pixel gray values lie in the interval [x_{m,k-1}, x_{m,k+1}]; the coordinates of the inliers of Φ'_mn and Φ'_ml whose pixel gray values lie in this interval therefore form the sample index sets S'_{mn,k} and S'_{ml,k},
where Φ'_mn[i, j] and Φ'_ml[i, j] are the pixel gray values of the corrected down-sampled images Φ'_mn and Φ'_ml at coordinate [i, j], S'_{mn,k} and S'_{ml,k} are the corresponding inlier sample index sets, x_{m,k-1} is the abscissa of the (k-1)-th anchor point of the m-th camera, and x_{m,k+1} is the abscissa of the (k+1)-th anchor point;
S804, iteratively solve, in sequence, the d-1 interior anchor ordinates y_{m,1}, ..., y_{m,d-1} of the optimal piecewise-linear mapping function of the m-th camera, obtaining the optimal piecewise-linear mapping function of the m-th camera;
S805, repeat step S804 until, for all anchor ordinates of all cameras, the absolute value of the change between two successive iterations is smaller than a set threshold T_anchor; then stop the iteration and output the optimal piecewise-linear mapping functions determined by the anchor points as the optimal mapping curve functions.
4. The brightness equalization method according to claim 3, wherein in step S804 the d-1 interior anchor ordinates y_{m,1}, ..., y_{m,d-1} of the optimal piecewise-linear mapping function of the m-th camera are solved iteratively in sequence to obtain the optimal piecewise-linear mapping function of the m-th camera, as follows:
S80401, with the ordinates of the other anchor points held fixed, update the ordinates y_{m,k} of the odd-indexed anchor points of the optimal piecewise-linear mapping function of the m-th camera,
where A, B and C are intermediate calculation variables computed from the corrected down-sampled images of the overlapping regions and the mapping curve functions of the adjacent cameras: Φ'_mn and Φ'_nm are the corrected down-sampled images of the overlapping region of cameras m and n, Φ'_lm and Φ'_ml are the corrected down-sampled images of the overlapping region of cameras l and m, T_n( ) and T_l( ) are the mapping curve functions of cameras n and l, respectively, and F_mk( ) and Y'_mk( ) are auxiliary piecewise functions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110880913.7A CN113781317B (en) | 2021-08-02 | 2021-08-02 | Luminance balancing method for panoramic all-around system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113781317A true CN113781317A (en) | 2021-12-10 |
CN113781317B CN113781317B (en) | 2023-08-18 |
Family
ID=78836472
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110880913.7A Active CN113781317B (en) | 2021-08-02 | 2021-08-02 | Luminance balancing method for panoramic all-around system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113781317B (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060177150A1 (en) * | 2005-02-01 | 2006-08-10 | Microsoft Corporation | Method and system for combining multiple exposure images having scene and camera motion |
US20090231447A1 (en) * | 2008-03-12 | 2009-09-17 | Chung-Ang University Industry-Academic Cooperation Foundation | Apparatus and method for generating panorama images and apparatus and method for object-tracking using the same |
CN109166076A (en) * | 2018-08-10 | 2019-01-08 | 深圳岚锋创视网络科技有限公司 | Luminance regulating method, device and the portable terminal of polyphaser splicing |
Non-Patent Citations (1)
Title |
---|
FAN Xiang; XIA Shunren: "Feature-based fully automatic stitching of microscopic images", Journal of Zhejiang University (Engineering Science) *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117237237A (en) * | 2023-11-13 | 2023-12-15 | 深圳元戎启行科技有限公司 | Luminosity balancing method and device for vehicle-mounted 360-degree panoramic image |
Also Published As
Publication number | Publication date |
---|---|
CN113781317B (en) | 2023-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110148095A (en) | A kind of underwater picture Enhancement Method and enhancement device | |
US9135688B2 (en) | Method for brightness equalization of various images | |
JP7268001B2 (en) | Arithmetic processing unit, object identification system, learning method, automobile, vehicle lamp | |
US20150138312A1 (en) | Method and apparatus for a surround view camera system photometric alignment | |
TWI599989B (en) | Image processing method and image system for transportation | |
CN104794705B (en) | Image defogging method and device based on image local content characteristic | |
US11082631B2 (en) | Image processing device | |
CN113344820B (en) | Image processing method and device, computer readable medium and electronic equipment | |
CN113077505A (en) | Optimization method of monocular depth estimation network based on contrast learning | |
CN112731436A (en) | Multi-mode data fusion travelable area detection method based on point cloud up-sampling | |
CN112529813B (en) | Image defogging processing method and device and computer storage medium | |
CN113781317A (en) | Brightness equalization method of panoramic all-around system | |
US7995107B2 (en) | Enhancement of images | |
CN115965531A (en) | Model training method, image generation method, device, equipment and storage medium | |
CN113658058A (en) | Brightness balancing method and system in vehicle-mounted all-round system | |
CN113256516A (en) | Image enhancement method | |
US11523053B2 (en) | Image processing apparatus | |
CN113840123B (en) | Image processing device of vehicle-mounted image and automobile | |
CN111800586B (en) | Virtual exposure processing method for vehicle-mounted image, vehicle-mounted image splicing processing method and image processing device | |
KR101230909B1 (en) | Apparatus and method for processing wide angle image | |
CN114926331A (en) | Panoramic image splicing method applied to vehicle | |
CN115443651A (en) | Determining a current focal region of a camera image based on a position of a vehicle camera on a vehicle and based on current motion parameters | |
CN115100083B (en) | Image brightness self-adaptive adjusting method for vehicle-mounted image | |
CN111915520B (en) | Method, device and computer equipment for improving brightness of spliced image | |
CN113506218B (en) | 360-degree video splicing method for multi-compartment ultra-long vehicle type |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |