CN115460354A - Image brightness processing method and device, electronic equipment, vehicle and storage medium


Info

Publication number
CN115460354A
Authority
CN
China
Prior art keywords
brightness
target
image
images
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111389253.9A
Other languages
Chinese (zh)
Inventor
刘锋
李倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Rockwell Technology Co Ltd
Original Assignee
Beijing Rockwell Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Rockwell Technology Co Ltd
Priority to CN202111389253.9A
Publication of CN115460354A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/14: Picture signal circuitry for video frequency region
    • H04N 5/20: Circuitry for controlling amplitude response
    • H04N 5/202: Gamma control

Abstract

The disclosure provides an image brightness processing method and apparatus, an electronic device, a vehicle and a storage medium. The method includes: acquiring a target image collected by a first camera; acquiring a reference image collected by a second camera adjacent to the first camera, where an overlapping area exists between the target image and the reference image; and determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image, so as to adjust the brightness of the images collected by the first camera. By determining the target brightness balance parameter of the image collected by the first camera and using it to adjust the brightness of the images the first camera collects afterwards, the brightness information of the images collected by the multiple cameras in a multi-camera scene is kept consistent, brightness differences after stitching are avoided, and the visual effect is improved.

Description

Image brightness processing method and device, electronic equipment, vehicle and storage medium
Technical Field
The present disclosure relates to the field of graphics processing technologies, and in particular, to an image brightness processing method and apparatus, an electronic device, a vehicle, and a storage medium.
Background
At present, panoramic stitched images are used more and more widely. For example, while a vehicle is driving, a panoramic image is stitched from the images collected by multiple cameras mounted on the vehicle. Because the different regions of the stitched image come from different collection sources, their brightness differs, so the resulting panoramic stitched image shows an obvious visual seam and the stitching effect is poor. How to improve the brightness consistency of panoramic stitched images is therefore a technical problem to be solved.
Disclosure of Invention
The present disclosure is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, the present disclosure provides an image brightness processing method and apparatus, an electronic device, a vehicle, and a storage medium, so as to reduce the brightness differences in a panoramic stitched image and improve the stitching effect.
An embodiment of the disclosure provides an image brightness processing method, including:
acquiring a target image acquired by a first camera;
acquiring a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image;
and determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image so as to adjust the brightness of the image acquired by the first camera.
An embodiment of another aspect of the present disclosure provides an image brightness processing apparatus, including:
the first acquisition module is used for acquiring a target image acquired by the first camera;
the second acquisition module is used for acquiring a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image;
and the determining module is used for determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image so as to adjust the brightness of the image acquired by the first camera.
An embodiment of another aspect of the present disclosure provides an electronic device, including:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of the preceding aspect.
An embodiment of another aspect of the present disclosure provides a vehicle, which includes the electronic device in an embodiment of another aspect.
Another aspect of the present disclosure proposes a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method of the preceding aspect.
Another embodiment of the present disclosure provides a computer program product, which includes computer instructions that, when executed by a processor, implement the method of the foregoing aspect.
According to the image brightness processing method and apparatus, the electronic device, the vehicle and the storage medium of the present disclosure, the target brightness balance parameter of the target image collected by each first camera is determined according to the brightness information of the overlapping area in the corresponding reference image and is used to adjust the brightness of the images subsequently collected by that first camera. This keeps the brightness information of the images collected by the multiple cameras consistent, avoids brightness differences after stitching, and improves the visual effect.
Additional aspects and advantages of the disclosure will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the disclosure.
Drawings
The foregoing and/or additional aspects and advantages of the present disclosure will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of an image brightness processing method according to an embodiment of the disclosure;
fig. 2 is a schematic diagram of a stitched image provided by an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another image brightness processing method according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of another image brightness processing method according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an image brightness processing apparatus according to an embodiment of the present disclosure; and
fig. 6 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative and intended to explain the present disclosure, and should not be construed as limiting the present disclosure.
An image luminance processing method, apparatus, vehicle, electronic device, and storage medium of the embodiments of the present disclosure are described below with reference to the drawings.
Fig. 1 is a schematic flowchart of an image brightness processing method according to an embodiment of the disclosure.
As shown in fig. 1, the method comprises the steps of:
step 101, acquiring a target image acquired by a first camera.
The first camera may be any one of a plurality of cameras; the names "first", "second" and so on used below are only for convenience of distinction and do not indicate any priority.
In the embodiment of the disclosure, the target image collected by the first camera is the image whose brightness needs to be equalized. The target image may be a single frame collected at one moment, or multiple frames collected at multiple moments, that is, a group of images. Which frames are acquired depends on the collection frequency of the camera: the camera may collect one frame every 10 seconds, or multiple frames per second, for example 10 frames per second, which is not limited here. In other words, the images collected by the first camera in the embodiment of the present disclosure may be non-consecutive frames, which improves processing efficiency on the vehicle, or consecutive frames, which yields a better processing effect.
For example, suppose there are 4 cameras and camera 1 is taken as the first camera. In one scene, the target image collected by the first camera is a single frame. In another scene, the target image collected by the first camera is a group of 5 frames collected at 5 moments, denoted S11, S12, S13, S14 and S15; these 5 frames together form the target image, that is, the images collected by the same camera form one group.
And 102, acquiring a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image.
The reference image is an image collected by a second camera adjacent to the first camera. There may be one or more second cameras, that is, one or more groups of reference images, where each group contains one or more frames. Each frame of the target image corresponds to a frame of the reference image in collection time: the two frames may be collected at the same moment, or at moments separated by a preset time difference.
In the embodiment of the disclosure, the first camera and the second camera are adjacent. To enable panoramic image stitching, the target image collected by the first camera and the reference image collected by the second camera cover a common image area, and the overlapping area between the target image and the reference image can be determined from this common area. As an example, fig. 2 is a schematic diagram of a stitched image provided in an embodiment of the present application. Four cameras collect a front image, a left image, a right image and a rear image respectively. As shown in the left part of fig. 2, in a top-view scene the left image is the target image, the corresponding reference image is the front image, and the overlapping area between them is the area indicated by A in the left part of fig. 2; when unfolded, this overlapping area corresponds to the area A1 in the left image and the area A2 in the front image shown in the right part of fig. 2. Accordingly, for a group of left images and the corresponding group of front reference images, the overlapping area of the frames collected at each moment is the area indicated by A in fig. 2, that is, A1 in the left image and A2 in the front image when unfolded.
And 103, determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image so as to adjust the brightness of the image acquired by the first camera.
In the embodiment of the present disclosure, the brightness information of the overlapping area in the reference image depends on the brightness equalization parameter associated with the reference image: the reference image may use its own brightness information directly, or a set target brightness equalization parameter, or a target brightness equalization parameter that was previously determined for the reference image itself. The brightness information of the overlapping area of the reference image is therefore determined in different ways in different scenes, which are described below:
in the first scenario, the brightness information of the reference image may be the brightness information of the reference image itself, that is, in an initial stage of performing brightness equalization on the images acquired by the multiple cameras, an image acquired by any one of the multiple cameras is taken as the reference image, and thus, the brightness information of the overlapping area of the reference image is the brightness information of the overlapping area itself of the reference image.
For example, as shown in fig. 2, for the 4 frames collected by the 4 cameras, one frame can be selected as the reference image, for example the front image; the brightness information of its overlapping area is then its own, unadjusted brightness information.
In the second scenario, the brightness information of the reference image is brightness information adjusted according to a set target brightness equalization parameter. That is, at the initial stage of performing brightness equalization on the images collected by the multiple cameras, the image collected by any one of the cameras is taken as the reference image, and its adjusted brightness information is computed from the set target brightness equalization parameter. It should be noted that the target brightness equalization parameter set for the reference image may be chosen by a person skilled in the art according to service or scene requirements, and is not limited in this embodiment.
In the third scenario, the reference image already has a target brightness equalization parameter of its own. The brightness information of the overlapping area of the reference image is then obtained by adjusting that area's brightness with the determined target brightness equalization parameter of the reference image.
Further, when the reference image has a corresponding target brightness equalization parameter, in an implementation of the embodiment of the present disclosure, for each pixel in the overlapping area of each group of reference images, the brightness Y of the pixel in YUV space is adjusted with the corresponding target brightness equalization parameter to obtain the adjusted brightness Y1 of the pixel; the average of the adjusted brightness values Y1 over the overlapping area is then computed, and this average is used as the brightness information of the overlapping area in the reference image.
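The per-pixel adjustment and averaging described above can be sketched as follows. This is only an illustrative sketch assuming an affine parameter (a, b), an 8-bit BGR input frame and a boolean overlap mask; the function and argument names are not from the patent.

```python
import cv2
import numpy as np

def overlap_mean_luma(frame_bgr, overlap_mask, a=1.0, b=0.0):
    """Mean Y of the overlap area after applying the equalization parameter (a, b).

    frame_bgr    : HxWx3 uint8 image from one camera
    overlap_mask : HxW boolean array marking the overlapping area
    a, b         : previously determined brightness equalization parameter
                   (a=1, b=0 means the reference image is used unadjusted)
    """
    yuv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YUV)
    y = yuv[:, :, 0].astype(np.float32)
    y_adj = a * y + b                          # per-pixel adjusted brightness
    return float(y_adj[overlap_mask].mean())   # brightness information of the overlap
```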
In the embodiment of the present disclosure, there is an overlapping area between the target image collected by the first camera and the reference image collected by the adjacent second camera. Because the lens parameters of the two cameras may differ, or the lighting during collection differs due to their different mounting positions, the brightness of the overlapping area differs between the images collected by different cameras. The brightness of images collected by adjacent cameras therefore needs to be equalized, that is, adjusted so that their brightness is consistent. For example, for each pixel in the overlapping area of the left image and the front image, the Y (brightness) values in YUV space should be the same or close after brightness equalization. As one implementation, when the collection frequency of the camera is low, for example when the target image is a single frame, the target brightness equalization parameter of the target image can be determined according to the brightness information of the overlapping area of the corresponding reference image, so that the brightness of the target image collected by the first camera is kept consistent with that of the corresponding reference image, which improves the accuracy of the target brightness equalization parameter.
As another implementation, when the collection frequency of the camera is high, for example when the target image is a group of multiple frames, the target brightness balance parameter of the multi-frame target image can be determined by linear fitting according to the brightness information of the overlapping area of the corresponding multi-frame reference image. Since multiple reference frames contain richer brightness information, this improves the accuracy of the determined target brightness balance parameter compared with using a single reference frame.
For a scene with multiple cameras, for example when a panoramic surround view is stitched from the images collected by multiple cameras, the above process is repeated so that the target equalization parameter of the image collected by each camera is obtained by fitting against the brightness information of the overlapping area of its reference image. In this way the brightness information of the images collected by the multiple cameras is kept consistent through the reference images, and the visual effect is improved.
According to the image brightness processing method of this embodiment, a target image collected by a first camera and a reference image collected by an adjacent second camera are acquired, where an overlapping area exists between them, and the target brightness balance parameter of the target image is determined according to the brightness information of the overlapping area in the reference image, so that the brightness of the images subsequently collected by the first camera can be adjusted. This keeps the brightness information of the images collected by the multiple cameras consistent, avoids brightness differences after stitching, and improves the visual effect.
Based on the foregoing embodiments, another image brightness processing method is provided in the embodiments of the present disclosure, and fig. 3 is a schematic flow diagram of another image brightness processing method provided in the embodiments of the present disclosure, as shown in fig. 3, the method includes the following steps:
step 301, acquiring a target image acquired by a first camera.
Step 302, acquiring a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image.
Step 301 and step 302 may refer to the explanations in the foregoing embodiments, and the principle is the same, which is not described again in this embodiment.
The embodiment of the present disclosure considers the following scene: a vehicle-mounted surround-view stitching scene, in which the surround-view stitched image is obtained from images collected by multiple cameras mounted on the vehicle. As an example, the surround-view stitched image is obtained by stitching the images collected by 4 cameras mounted at the front, rear, left and right of the vehicle; the cameras may be fisheye cameras, which have a large field of view. The stitched panoramic image is illustrated with the top view shown in fig. 2 as an example.
Step 303, determining whether the reference images corresponding to the group of target images form a single group; if yes, executing step 304, and if not, executing step 305.
In the embodiment of the present disclosure, in the surround-view stitched image, each camera has two adjacent cameras; that is, the image collected by one or both of the two adjacent cameras can be used as its reference image. The details are as follows:
in an implementation manner of the embodiment of the present disclosure, if the reference image is a group of images acquired by the second camera, that is, the reference images corresponding to the group of target images acquired by the first camera are a group, step 304 is executed.
In another implementation manner of the embodiment of the present disclosure, if the reference images are two groups of images acquired by two second cameras, that is, if the reference image corresponding to one group of images acquired by the first camera is two groups of images, step 305 is executed.
And step 304, determining target brightness balance parameters by adopting a linear fitting mode according to the brightness information of the overlapped area in the group of reference images and the brightness information of the overlapped area in the group of target images.
In the embodiment of the present disclosure, the target image is a group of target images collected by the first camera, and the reference image is a group of reference images collected by the second camera. Each image in the group of target images and the corresponding image in the group of reference images have the same collection time. Collecting the target image and the reference image at the same time ensures that they are collected in the same scene, reduces the influence of changes in the external environment, and improves the accuracy of the target brightness balance parameter determined for the target image from the reference image.
In the embodiment of the present disclosure, the images collected by the 4 cameras in fig. 2 are described as an example under a top view angle, and for other angles, the principle is the same, and details are not repeated in this embodiment.
The left camera is taken as the first camera; its adjacent cameras are the front camera and the rear camera. For a group of target images collected by the left camera, suppose the corresponding reference images are determined to be a group of images collected by the front camera. For example, the frames collected by the front camera at multiple moments are C11, C12 and C13, and the frames collected by the left camera at the corresponding moments are C21, C22 and C23, where the top view corresponding to each front frame and the top view corresponding to each left frame share an overlapping area, such as the area A shown in fig. 2. Let Y denote the brightness information of the overlapping area in the reference image, Y1 the brightness information of the overlapping area in the target image collected by the first camera, and a and b the target brightness balance parameters used to adjust the brightness of the target image collected by the first camera. For example, in the group of reference images, the average brightness of the overlapping area of C11 is Y11, that of C12 is Y12, and that of C13 is Y13; in the group of target images collected by the first camera, the average brightness of the overlapping area of C21 is Y21, that of C22 is Y22, and that of C23 is Y23. Substituting each reference frame and the corresponding target frame of the group whose target brightness balance parameters are to be determined into the fitting formula gives:
Y11 = Y21 * a_l + b_l
Y12 = Y22 * a_l + b_l
Y13 = Y23 * a_l + b_l
The brightness balance parameters a_l and b_l of the group of images collected by the left camera are then solved by linear fitting, for example by fitting a straight line with the least squares method or with Random Sample Consensus (RANSAC).
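A minimal sketch of this fitting step is given below, assuming the per-frame overlap brightness averages have already been computed. The function name and the numeric values in the usage line are illustrative only; least squares is used here, with RANSAC noted as the alternative mentioned above.

```python
import numpy as np

def fit_brightness_params(y_reference, y_target):
    """Least-squares fit of Y_ref ≈ a * Y_tgt + b over a group of frames.

    y_reference : per-frame mean Y of the overlap area in the reference images
                  (e.g. [Y11, Y12, Y13] from the front camera)
    y_target    : per-frame mean Y of the overlap area in the target images
                  (e.g. [Y21, Y22, Y23] from the left camera)
    Returns the brightness balance parameters (a, b) of the target camera.
    """
    y_reference = np.asarray(y_reference, dtype=np.float64)
    y_target = np.asarray(y_target, dtype=np.float64)
    a, b = np.polyfit(y_target, y_reference, deg=1)  # straight-line fit
    # A RANSAC-style robust fit could be substituted when outlier frames are expected.
    return float(a), float(b)

# Illustrative usage with made-up overlap averages for three frames:
a_l, b_l = fit_brightness_params([110.0, 128.0, 150.0], [100.0, 120.0, 140.0])
```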
It should be noted that, in the embodiment of the present disclosure, a group of 3 frames is used for the fitting as an example. In practical applications, a group containing more frames of image data may be used for the fitting to improve the accuracy of the brightness equalization parameters obtained, which is neither enumerated one by one nor limited in this embodiment.
Step 305, determining a first candidate brightness equalization parameter by adopting a linear fitting mode according to the brightness information of the overlapped area in the group of reference images and the brightness information of the overlapped area in the group of target images.
In an implementation manner of the embodiment of the present disclosure, it is necessary to determine candidate luminance balance parameters corresponding to a group of target images acquired by a first camera according to two groups of reference images, respectively.
The method for determining, from each group of reference images, the candidate brightness equalization parameter of the target image collected by the first camera may refer to the description of step 304; the principle is the same and is not repeated here. When the candidate brightness balance parameter is fitted against a reference image whose target brightness balance parameter has already been determined, the brightness information of the overlapping area of that reference image is the brightness information adjusted with its determined target brightness balance parameter. In this way the brightness information is propagated through the reference images, and the brightness of the images collected by the multiple cameras is equalized so that the brightness adjustment stays consistent across them.
And step 306, determining a second candidate brightness balance parameter by adopting a linear fitting mode according to the brightness information of the overlapping area in the other group of reference images and the brightness information of the overlapping area in the group of target images.
Similarly, reference may be made to the description in step 305, and details are not repeated in this embodiment.
And 307, determining a target brightness balance parameter of a group of target images according to the first candidate brightness balance parameter and the second candidate brightness balance parameter so as to adjust the brightness of the image acquired by the first camera.
In an implementation of the embodiment of the present disclosure, after the two candidate brightness balance parameters of the group of target images collected by the first camera are determined, that is, the first candidate brightness balance parameter and the second candidate brightness balance parameter, they are averaged to obtain the target brightness balance parameter of the group of images collected by that camera.
In the embodiment of the present disclosure, to improve the accuracy of the target brightness equalization coefficient of the group of images collected by one camera, whether the target brightness equalization parameter of the group of target images collected by the first camera is usable needs to be decided according to the difference between its two candidate brightness equalization parameters. The details are as follows:
in a first implementation manner of the embodiment of the present disclosure, if the candidate luminance equalization parameter is a slope, determining a first ratio between a slope in the first candidate luminance equalization parameter and a slope in the second candidate luminance equalization parameter, and if the first ratio belongs to a set interval in a scene, taking an average value of the slope in the first candidate luminance equalization parameter and the slope in the second candidate luminance equalization parameter as a target luminance equalization parameter of a group of target images acquired by the first camera; and if the first ratio does not belong to the set interval, the accuracy of the first candidate brightness balance parameter and the second candidate brightness balance parameter is considered to be poor and not accord with the set requirement, and the target brightness balance parameter of the target image is determined to be abandoned.
In a second implementation of the embodiment of the present disclosure, if the candidate brightness equalization parameter is an intercept, a second ratio between the intercept in the first candidate brightness equalization parameter and the intercept in the second candidate brightness equalization parameter is determined. In one scene, if the second ratio falls within a set interval, the average of the two intercepts is taken as the target brightness equalization parameter of the group of target images collected by the first camera; if the second ratio does not fall within the set interval, the accuracy of the first and second candidate brightness equalization parameters is considered too poor to meet the set requirement, and the determination of the target brightness equalization parameter of the target image is abandoned.
In a third implementation of the embodiment of the present disclosure, there are two candidate brightness equalization parameters, that is, the two linear parameters obtained by linear fitting are the slope a and the intercept b. A first ratio between the slope in the first candidate brightness equalization parameter and the slope in the second candidate brightness equalization parameter is determined, and a second ratio between the intercept in the first candidate brightness equalization parameter and the intercept in the second candidate brightness equalization parameter is determined. When both the first ratio and the second ratio fall within the set interval, the average of the two slopes is taken as the slope of the target brightness equalization parameter of the group of target images collected by the first camera, and the average of the two intercepts is taken as the intercept of that target brightness equalization parameter.
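A hedged sketch of this acceptance test and averaging step follows; the function name, the tuple layout and the default interval are assumptions for illustration, not the patent's own implementation.

```python
def merge_candidate_params(cand1, cand2, lo=0.8, hi=1.2):
    """Combine two candidate (slope, intercept) pairs into one target parameter.

    cand1, cand2 : (a, b) pairs fitted against the two groups of reference images
    lo, hi       : the set interval for the slope ratio and the intercept ratio
                   (0.8-1.2 is only the example range given later in the text)
    Returns the averaged (a, b), or None when either ratio falls outside the
    interval and this round's parameters should be discarded.
    """
    a1, b1 = cand1
    a2, b2 = cand2
    # Note: intercepts near zero would need separate handling in practice.
    slope_ratio = a1 / a2
    intercept_ratio = b1 / b2
    if lo <= slope_ratio <= hi and lo <= intercept_ratio <= hi:
        return (a1 + a2) / 2.0, (b1 + b2) / 2.0
    return None  # accuracy does not meet the set requirement
```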
In the embodiment of the disclosure, the determined target brightness balance parameter of the target image collected by the first camera is used to adjust the brightness of the images subsequently collected by the first camera. In a scene with multiple cameras, each camera adjusts the brightness of its subsequently collected images according to its own target brightness balance parameter, so that the brightness information of the images stays consistent and the display effect is improved.
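For completeness, a minimal sketch of how a stored parameter pair might be applied to each subsequently collected frame before stitching; the 8-bit BGR/YUV handling and the clipping are assumptions.

```python
import cv2
import numpy as np

def apply_brightness_params(frame_bgr, a, b):
    """Adjust a newly collected frame with its camera's target parameters (a, b)."""
    yuv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YUV)
    y = yuv[:, :, 0].astype(np.float32)
    yuv[:, :, 0] = np.clip(a * y + b, 0, 255).astype(np.uint8)  # equalized Y channel
    return cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR)
```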
It should be noted that, in the embodiment of the present disclosure, the target brightness equalization parameter of each camera may be re-determined according to the above image brightness processing method at a set period, so as to improve the reliability of the brightness equalization applied to the images collected by each camera.
In the image brightness processing method of the embodiment of the disclosure, the target brightness balance parameter of the target image collected by each first camera is obtained by fitting against the brightness information of the overlapping area of its reference images. When the target image collected by a first camera has two groups of reference images, the reliability of the target brightness balance parameter obtained in this round is judged from the difference between the two candidate brightness balance parameters determined from the two groups. This improves the accuracy of the brightness adjustment across the multiple cameras, avoids brightness differences in the image stitched from their images, and improves the visual effect.
Based on the foregoing embodiments, an embodiment of the present disclosure provides another image brightness processing method, and fig. 4 is a schematic flowchart of the another image brightness processing method provided by the embodiment of the present disclosure, as shown in fig. 4, the method includes the following steps:
step 401, a target image acquired by a first camera is acquired.
And 402, acquiring a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image.
For details, reference may be made to the description in the foregoing embodiments, and the principle is the same, which is not described in detail in this embodiment.
And 403, mapping the target image and the reference image from the image coordinate system to the world coordinate system according to the mapping relation between the image coordinate system and the world coordinate system to obtain an overlapping area between the target image and the reference image.
In the embodiment of the disclosure, the images collected by the plurality of cameras are used for performing image splicing to obtain a spliced image, and the spliced image can be a closed-loop panoramic all-around spliced image or a non-closed-loop spliced image.
In the embodiment of the disclosure, the target image collected by the first camera at at least one moment and the reference image collected by the adjacent second camera at the corresponding moment are mapped from the image coordinate system into the world coordinate system. As one implementation, a stitched image is obtained from the frame collected by each camera at each moment according to the relative position information between the cameras or a preset stitching order, and the overlapping area between the collected images is then determined from their stitching positions in the stitched image. Further, for the target image collected by any first camera, the brightness equalization parameter of that target image is determined according to the brightness information of the overlapping area in the target image and the brightness information of the overlapping area in the corresponding reference image, so that the brightness of each target image can be adjusted and, at the same time, the brightness equalization parameter can be used to adjust the brightness of the images subsequently collected by the first camera.
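The mapping and overlap determination could look roughly as follows, assuming each camera frame has already been undistorted with its calibrated intrinsics and that a per-camera homography to the ground (world) plane is available from the extrinsics; all names and the output size are illustrative.

```python
import cv2
import numpy as np

def top_view_and_mask(frame, homography, out_size=(800, 800)):
    """Warp one (already undistorted) camera frame onto the ground plane."""
    top = cv2.warpPerspective(frame, homography, out_size)
    ones = np.full(frame.shape[:2], 255, dtype=np.uint8)
    mask = cv2.warpPerspective(ones, homography, out_size) > 0  # valid-pixel footprint
    return top, mask

def overlap_region(mask_a, mask_b):
    """Overlapping area between two adjacent top views (e.g. left and front)."""
    return mask_a & mask_b

# Hypothetical usage with precomputed ground-plane homographies H_left and H_front:
# top_left, m_left = top_view_and_mask(frame_left, H_left)
# top_front, m_front = top_view_and_mask(frame_front, H_front)
# overlap = overlap_region(m_left, m_front)   # boolean mask of area A in fig. 2
```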
Step 404, determining whether the reference images corresponding to the group of target images form a single group; if yes, executing step 405, and if not, executing step 406.
Step 405, determining target brightness balance parameters by adopting a linear fitting mode according to the brightness information of the overlapping area in the group of reference images and the brightness information of the overlapping area in the group of target images.
Step 406, determining a first candidate luminance equalization parameter by using a linear fitting manner according to the luminance information of the overlapping region in the group of reference images and the luminance information of the overlapping region in the group of target images.
Step 407, determining a second candidate luminance equalization parameter by using a linear fitting manner according to the luminance information of the overlapping region in the other group of reference images and the luminance information of the overlapping region in the group of target images.
And step 408, determining a target brightness balance parameter of a group of target images according to the first candidate brightness balance parameter and the second candidate brightness balance parameter so as to adjust the brightness of the image acquired by the first camera.
The steps 404 to 408 can refer to the explanations in the foregoing embodiments, and the principle is the same, which is not described again in this embodiment.
In the image brightness processing method of the embodiment of the disclosure, the overlapping areas between the target images collected by the cameras are determined by mapping each frame of target image into the world coordinate system, so that the brightness equalization parameter of each target image can be determined by fitting based on the overlapping area, which improves the accuracy of the brightness equalization.
Based on the foregoing embodiments, in order to further clearly describe the foregoing embodiments, the embodiments of the present disclosure describe the foregoing image brightness processing method with reference to the scene of fig. 2.
In the embodiment of the present disclosure, the fitting formula is Y = Y1 × a + b, where Y is luminance information of an overlapping region of one collected image in the reference image, and Y1 is luminance information of an overlapping region of one frame of target image for which a target luminance balance coefficient is to be determined.
In the embodiment of the disclosure, cameras arranged at the front, rear, left and right of the vehicle body collect images in the four directions at at least one moment. The 4 original images collected at each moment are read, together with the calibrated intrinsic and extrinsic parameters of the fisheye cameras. The original images in the four directions are then mapped into the world coordinate system according to the mapping relation between the image coordinate system and the world coordinate system, giving 4 images in the world coordinate system and their corresponding top views, that is, the four top views shown in fig. 2 that make up the panoramic surround top view: a front top view, a left top view, a right top view and a rear top view. Adjacent top views share an overlapping area, which may be irregular.
In the embodiment of the disclosure, before the brightness balance parameters corresponding to the collected images are determined, the brightness of the image collected by each camera is the brightness of the original collected image, and the brightness between the stitched images is not uniform. The image collected by any camera, that is, the first camera, can therefore be taken as the target image whose brightness balance parameter needs to be determined; the reference image corresponding to this target image is determined, and the brightness balance coefficient of the target image is determined based on the reference image. Specifically, the left top view and the front top view are adjacent top views. When the left top view is taken as the target image and its brightness equalization parameter is to be determined, the front top view is taken as the reference image and its brightness information as the reference, and the target brightness equalization parameter of the left top view is determined by fitting against the brightness information of the overlapping area between multiple frames of the front top view and the left top view.
Further, after the target brightness balance parameter of the left top view and the target brightness balance parameter of the right top view have been determined, the target brightness balance parameter of the rear top view still needs to be determined. The reference images of the rear top view are the left top view and the right top view whose target brightness balance parameters have already been determined, so the brightness information of each frame of the left top view is first updated with its determined target brightness balance parameter, the overlapping area between the multiple frames of the left top view and the rear top view is determined, and the first candidate brightness balance parameter of the rear top view is obtained by fitting against the brightness information of this overlapping area. Similarly, the second candidate brightness balance parameter of the rear top view is obtained by fitting against the updated brightness information of the overlapping area between the multiple frames of the right top view and the rear top view.
In an embodiment of the present disclosure, the brightness equalization parameter includes a slope and/or an intercept. In one example, the brightness equalization parameter includes both a slope and an intercept, for example the first candidate brightness equalization parameter includes slope a1 and intercept b1, and the second candidate brightness equalization parameter includes slope a2 and intercept b2. The ratio of a1 to a2 is compared against a set range, and the ratio of b1 to b2 is compared against a set range, for example 0.8 to 1.2; the set ranges for the two parameters may also differ, which is not limited in this embodiment.
In one scene, when both ratios, that of the slopes and that of the intercepts, fall within the set range, the accuracy of the target brightness balance parameter of the top view corresponding to each camera is confirmed to meet the set requirement, and the target brightness balance parameters are then used to adjust the brightness of the images subsequently collected by each camera, so that the brightness of the subsequently stitched panoramic surround view stays consistent and the stitching effect is improved. In another scene, if either ratio, that of the slopes or that of the intercepts, falls outside the set range, the accuracy of the target brightness balance parameters of the top views is judged not to meet the set requirement, and they are not used to adjust the brightness of the images subsequently collected by the cameras.
As one implementation, for adjusting the brightness of the images subsequently collected by each camera in that case: if historical brightness balance parameters exist for the top views, the historical parameters are used to adjust the brightness of the panoramic surround view stitched from the top views; if no historical brightness balance parameters exist, the brightness of the top views is not adjusted, that is, the original images are output, the multiple cameras are driven to collect images at at least one moment again, and the above brightness equalization processing method is applied again to improve the reliability of the brightness equalization adjustment.
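Pulling the above together, a hedged end-to-end sketch of one estimation round is given below. It assumes that the per-frame overlap brightness averages are already available for each adjacent camera pair, that the front view serves as the initial reference, and that the parameters are (slope, intercept) pairs; the dictionary layout, names and fallback behaviour are illustrative assumptions, not the patent's own implementation.

```python
import numpy as np

def surround_view_brightness_params(overlap_means, prev_params=None, lo=0.8, hi=1.2):
    """One round of brightness-parameter estimation for a 4-camera surround view.

    overlap_means maps an adjacent camera pair to two aligned lists of per-frame
    overlap brightness averages, e.g.
        overlap_means[("front", "left")] = (front_means, left_means)
    prev_params holds historical parameters used as a fallback (or None).
    Returns {"left": (a, b), "right": (a, b), "rear": (a, b)}, or the fallback.
    """
    def fit(ref_means, tgt_means):
        a, b = np.polyfit(np.asarray(tgt_means, float), np.asarray(ref_means, float), 1)
        return float(a), float(b)

    # 1. Left and right views are fitted directly against the front (reference) view.
    params = {"left": fit(*overlap_means[("front", "left")]),
              "right": fit(*overlap_means[("front", "right")])}

    # 2. The rear view gets two candidates, fitted against the already-adjusted
    #    brightness of the left and right views.
    def adjusted(means, ab):
        a, b = ab
        return [a * m + b for m in means]

    left_means, rear_means_l = overlap_means[("left", "rear")]
    right_means, rear_means_r = overlap_means[("right", "rear")]
    cand1 = fit(adjusted(left_means, params["left"]), rear_means_l)
    cand2 = fit(adjusted(right_means, params["right"]), rear_means_r)

    # 3. Accept only when the slope ratio and intercept ratio fall in the set interval.
    if lo <= cand1[0] / cand2[0] <= hi and lo <= cand1[1] / cand2[1] <= hi:
        params["rear"] = ((cand1[0] + cand2[0]) / 2, (cand1[1] + cand2[1]) / 2)
        return params
    return prev_params  # fall back to historical parameters (None means output originals)
```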
In order to implement the above embodiments, the present disclosure also provides an image brightness processing apparatus.
Fig. 5 is a schematic structural diagram of an image brightness processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the apparatus includes:
the first obtaining module 51 is configured to obtain a target image collected by the first camera.
A second obtaining module 52, configured to obtain a reference image collected by a second camera adjacent to the first camera, where an overlapping area exists between the target image and the reference image.
The determining module 53 is configured to determine a target brightness balance parameter of the target image according to brightness information of an overlapping area in the reference image, so as to adjust brightness of an image acquired by the first camera.
Further, in an implementation manner provided in the embodiment of the present application, the target image is a group of target images acquired by the first camera, the reference image is a group of reference images acquired by the second camera, and the determining module 53 is specifically configured to:
and determining the target brightness balance parameters by adopting a linear fitting mode according to the brightness information of the overlapped area in the group of reference images and the brightness information of the overlapped area in the group of target images.
In an implementation manner provided in the embodiment of the present application, the determining module 53 is further specifically configured to:
when the reference images are two groups of reference images respectively acquired by two second cameras, determining a first candidate brightness balance parameter by adopting a linear fitting mode according to brightness information of an overlapping region in the group of reference images and brightness information of an overlapping region in the group of target images;
determining a second candidate brightness balance parameter by adopting a linear fitting mode according to the brightness information of the overlapping area in the other group of reference images and the brightness information of the overlapping area in the group of target images;
and determining the target brightness balance parameters of the group of target images according to the first candidate brightness balance parameters and the second candidate brightness balance parameters.
In an implementation manner provided in the embodiment of the present application, the determining module 53 is further specifically configured to:
determining a ratio between the first candidate luminance equalization parameter and the second candidate luminance equalization parameter;
and taking the average value of the first candidate brightness equalization parameter and the second candidate brightness equalization parameter as the target brightness equalization parameter of the group of target images under the condition that the ratio belongs to a set interval.
In an implementation manner provided in the embodiment of the present application, the candidate luminance equalization parameter includes a slope and/or an intercept obtained by linear fitting, and the determining module 53 is specifically further configured to:
determining a first ratio between a slope in the first candidate luminance equalization parameter and a slope in the second candidate luminance equalization parameter and/or determining a second ratio between an intercept in the first candidate luminance equalization parameter and an intercept in the second candidate luminance equalization parameter;
and under the condition that the first ratio and/or the second ratio belong to a set interval, taking the average value of the two slopes as the slope of the target brightness equalization parameters of the group of target images, and/or taking the average value of the two intercepts as the intercept of the target brightness equalization parameters of the group of target images.
In one implementation provided by the embodiments of the present application, any image in a set of target images and a corresponding image in the set of reference images have the same acquisition time.
In an implementation manner provided by the embodiment of the present application, the apparatus further includes:
and the mapping module is used for mapping the target image and the reference image from the image coordinate system to the world coordinate system according to the mapping relation between the image coordinate system and the world coordinate system to obtain an overlapping area between the target image and the reference image.
It should be noted that the foregoing explanation of the method embodiment is also applicable to the apparatus of the embodiment, and is not repeated herein.
With the image brightness processing apparatus of the embodiment of the disclosure, a target image collected by a first camera and a reference image collected by an adjacent second camera are acquired, where an overlapping area exists between them, and the target brightness balance parameter of the target image is determined according to the brightness information of the overlapping area in the reference image, so as to adjust the brightness of the images collected by the first camera. This keeps the brightness information of the images collected by the multiple cameras consistent, avoids brightness differences after stitching, and improves the visual effect.
In order to implement the above embodiments, an embodiment of the present disclosure provides an electronic device, including:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of the foregoing method embodiments.
In order to achieve the above embodiments, the embodiments of the present disclosure provide a vehicle including the electronic device described in the foregoing embodiments.
To achieve the above embodiments, the embodiments of the present disclosure propose a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method described in the foregoing method embodiments.
To achieve the above embodiments, the present disclosure provides a computer program product, which includes computer instructions that, when executed by a processor, implement the method described in the foregoing method embodiments.
Fig. 6 is a block diagram of an electronic device according to an embodiment of the present disclosure. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 10 includes a processor 11, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 12 or a program loaded from a Memory 16 into a Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic apparatus 10 are also stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An Input/Output (I/O) interface 15 is also connected to the bus 14.
The following components are connected to the I/O interface 15: a memory 16 including a hard disk and the like; and a communication section 17 including a Network interface card such as a LAN (Local Area Network) card, a modem, or the like, the communication section 17 performing communication processing via a Network such as the internet; a drive 18 is also connected to the I/O interface 15 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program, carried on a computer readable medium, containing program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 17. The computer program, when executed by the processor 11, performs the above-described functions defined in the method of the present disclosure.
In an exemplary embodiment, there is also provided a storage medium comprising instructions, such as the memory 16 comprising instructions, executable by the processor 11 of the electronic device 10 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, "plurality" means at least two, e.g., two or three, unless explicitly defined otherwise.
Any process or method description in the flowcharts, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing steps of a custom logic function or process. The scope of the preferred embodiments of the present disclosure includes alternate implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, as would be understood by those skilled in the art to which the embodiments of the present disclosure pertain.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processor, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium could even be paper or another suitable medium on which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a computer memory.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are well known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like. Although embodiments of the present disclosure have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present disclosure, and that those of ordinary skill in the art may make changes, modifications, substitutions, and alterations to the above embodiments within the scope of the present disclosure.

Claims (18)

1. An image brightness processing method, comprising:
acquiring a target image acquired by a first camera;
acquiring a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image;
and determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image so as to adjust the brightness of the image acquired by the first camera.
2. The method of claim 1, wherein the target image is a set of target images captured by the first camera and the reference image is a set of reference images captured by the second camera; and the determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image comprises:
determining the target brightness balance parameter by linear fitting according to the brightness information of the overlapping area in the set of reference images and the brightness information of the overlapping area in the set of target images.
3. The method of claim 2, wherein the determining a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image comprises:
when the reference images are two sets of reference images respectively acquired by two second cameras, determining a first candidate brightness balance parameter by linear fitting according to the brightness information of the overlapping area in one set of reference images and the brightness information of the overlapping area in the set of target images;
determining a second candidate brightness balance parameter by linear fitting according to the brightness information of the overlapping area in the other set of reference images and the brightness information of the overlapping area in the set of target images;
and determining the target brightness balance parameter of the set of target images according to the first candidate brightness balance parameter and the second candidate brightness balance parameter.
4. The method of claim 3, wherein the determining the target brightness balance parameter of the set of target images according to the first candidate brightness balance parameter and the second candidate brightness balance parameter comprises:
determining a ratio between the first candidate brightness balance parameter and the second candidate brightness balance parameter;
and taking an average value of the first candidate brightness balance parameter and the second candidate brightness balance parameter as the target brightness balance parameter of the set of target images in a case where the ratio falls within a preset interval.
5. The method of claim 3, wherein the candidate brightness balance parameters comprise a slope and/or an intercept obtained by the linear fitting, and the determining the target brightness balance parameter of the set of target images according to the first candidate brightness balance parameter and the second candidate brightness balance parameter comprises:
determining a first ratio between the slope in the first candidate brightness balance parameter and the slope in the second candidate brightness balance parameter, and/or determining a second ratio between the intercept in the first candidate brightness balance parameter and the intercept in the second candidate brightness balance parameter;
and when the first ratio and/or the second ratio falls within a preset interval, taking an average value of the two slopes as the slope of the target brightness balance parameter of the set of target images, and/or taking an average value of the two intercepts as the intercept of the target brightness balance parameter of the set of target images.
6. The method of claim 2, wherein any image in the set of target images has the same acquisition time as a corresponding image in the set of reference images.
7. The method of any one of claims 1-6, further comprising:
mapping the target image and the reference image from an image coordinate system to a world coordinate system according to a mapping relationship between the image coordinate system and the world coordinate system, to obtain the overlapping area between the target image and the reference image.
8. An image brightness processing apparatus, comprising:
a first acquisition module configured to acquire a target image acquired by a first camera;
a second acquisition module configured to acquire a reference image acquired by a second camera adjacent to the first camera, wherein an overlapping area exists between the target image and the reference image;
and a determining module configured to determine a target brightness balance parameter of the target image according to the brightness information of the overlapping area in the reference image, so as to adjust the brightness of the image acquired by the first camera.
9. The apparatus of claim 8, wherein the target image is a set of target images captured by the first camera and the reference image is a set of reference images captured by the second camera; and the determining module is specifically configured to:
determine the target brightness balance parameter by linear fitting according to the brightness information of the overlapping area in the set of reference images and the brightness information of the overlapping area in the set of target images.
10. The apparatus of claim 8, wherein the determining module is further specifically configured to:
when the reference images are two sets of reference images respectively acquired by two second cameras, determine a first candidate brightness balance parameter by linear fitting according to the brightness information of the overlapping area in one set of reference images and the brightness information of the overlapping area in the set of target images;
determine a second candidate brightness balance parameter by linear fitting according to the brightness information of the overlapping area in the other set of reference images and the brightness information of the overlapping area in the set of target images;
and determine the target brightness balance parameter of the set of target images according to the first candidate brightness balance parameter and the second candidate brightness balance parameter.
11. The apparatus of claim 10, wherein the determining module is further specifically configured to:
determine a ratio between the first candidate brightness balance parameter and the second candidate brightness balance parameter;
and take an average value of the first candidate brightness balance parameter and the second candidate brightness balance parameter as the target brightness balance parameter of the set of target images in a case where the ratio falls within a preset interval.
12. The apparatus of claim 10, wherein the candidate brightness balance parameters comprise a slope and/or an intercept obtained by the linear fitting, and the determining module is further configured to:
determine a first ratio between the slope in the first candidate brightness balance parameter and the slope in the second candidate brightness balance parameter, and/or determine a second ratio between the intercept in the first candidate brightness balance parameter and the intercept in the second candidate brightness balance parameter;
and when the first ratio and/or the second ratio falls within a preset interval, take an average value of the two slopes as the slope of the target brightness balance parameter of the set of target images, and/or take an average value of the two intercepts as the intercept of the target brightness balance parameter of the set of target images.
13. The apparatus of claim 9, wherein any image in the set of target images has the same acquisition time as a corresponding image in the set of reference images.
14. The apparatus of any one of claims 8-13, further comprising:
a mapping module configured to map the target image and the reference image from an image coordinate system to a world coordinate system according to a mapping relationship between the image coordinate system and the world coordinate system, to obtain the overlapping area between the target image and the reference image.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A vehicle, comprising the electronic device of claim 15.
17. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-7.
18. A computer program product comprising computer instructions, wherein the computer instructions, when executed by a processor, implement the method of any one of claims 1-7.
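
As a concrete illustration of the linear-fitting step recited in claims 1 and 2, the sketch below fits a straight line that maps the target camera's overlap brightness onto the reference camera's overlap brightness and then applies the fitted parameters to subsequent frames. It is a minimal Python sketch, not part of the patent disclosure: brightness is assumed to be 8-bit luma sampled from the overlapping area, and all function and variable names are invented for the example.

```python
import numpy as np

def fit_brightness_balance(target_overlap_luma, reference_overlap_luma):
    """Least-squares line y = slope * x + intercept mapping the target
    camera's overlap brightness x onto the reference camera's overlap
    brightness y; the pair (slope, intercept) plays the role of the
    brightness balance parameter in this sketch."""
    x = np.asarray(target_overlap_luma, dtype=np.float64).ravel()
    y = np.asarray(reference_overlap_luma, dtype=np.float64).ravel()
    slope, intercept = np.polyfit(x, y, 1)  # degree-1 (linear) fit
    return slope, intercept

def apply_brightness_balance(target_luma, slope, intercept):
    """Adjust a later frame from the first camera with the fitted
    parameters and clip back to the 8-bit range."""
    adjusted = slope * target_luma.astype(np.float64) + intercept
    return np.clip(adjusted, 0, 255).astype(np.uint8)
```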
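
For the two-reference-camera case of claims 3 to 5, one plausible reading is sketched below: two candidate parameters are fitted independently, their slope and intercept ratios are checked against an interval, and the candidates are averaged when they agree. The bounds 0.8 and 1.25 stand in for the preset interval, and the fallback when the candidates disagree is an assumption; the claims specify neither.

```python
def combine_candidates(left, right, low=0.8, high=1.25):
    """left and right are (slope, intercept) pairs fitted against the two
    adjacent reference cameras. Average them when both ratios fall within
    [low, high]; otherwise keep the first candidate (assumed fallback)."""
    slope_ratio = left[0] / right[0]
    intercept_ratio = left[1] / right[1] if right[1] != 0 else float("inf")
    if low <= slope_ratio <= high and low <= intercept_ratio <= high:
        return (left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0
    return left
```

In a full pipeline the combined pair would then be passed to apply_brightness_balance above for every subsequent frame captured by the first camera.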
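
Claim 6 only requires that each target image be paired with a reference image captured at the same time. One trivial way to enforce that pairing, with timestamps rounded to the millisecond (a tolerance assumed for the example, not stated in the claims), is:

```python
def pair_by_timestamp(target_frames, reference_frames):
    """target_frames and reference_frames are lists of (timestamp_s, image).
    Return the (target, reference) pairs whose timestamps coincide after
    rounding to the millisecond."""
    ref_by_ts = {round(t, 3): img for t, img in reference_frames}
    return [(img, ref_by_ts[round(t, 3)])
            for t, img in target_frames if round(t, 3) in ref_by_ts]
```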
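
The image-to-world mapping of claims 7 and 14 is commonly realized with a ground-plane homography obtained from camera calibration. The sketch below approximates each camera's footprint and their overlap with axis-aligned boxes in world coordinates; this simplification is chosen for the example and is not the construction stated in the patent.

```python
import numpy as np

def image_to_world(points_px, homography):
    """Map Nx2 pixel coordinates onto the ground plane using a 3x3
    calibration homography (homogeneous coordinates)."""
    pts = np.hstack([points_px, np.ones((len(points_px), 1))])
    world = (homography @ pts.T).T
    return world[:, :2] / world[:, 2:3]

def footprint_bbox(image_shape, homography):
    """Axis-aligned bounding box of a camera footprint in world coordinates."""
    h, w = image_shape[:2]
    corners = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float64)
    world = image_to_world(corners, homography)
    return world.min(axis=0), world.max(axis=0)

def overlap_region(shape_a, homography_a, shape_b, homography_b):
    """Intersection of the two footprints, or None when they do not overlap."""
    lo_a, hi_a = footprint_bbox(shape_a, homography_a)
    lo_b, hi_b = footprint_bbox(shape_b, homography_b)
    lo, hi = np.maximum(lo_a, lo_b), np.minimum(hi_a, hi_b)
    return (lo, hi) if np.all(hi > lo) else None
```
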
CN202111389253.9A 2021-11-22 2021-11-22 Image brightness processing method and device, electronic equipment, vehicle and storage medium Pending CN115460354A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111389253.9A CN115460354A (en) 2021-11-22 2021-11-22 Image brightness processing method and device, electronic equipment, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111389253.9A CN115460354A (en) 2021-11-22 2021-11-22 Image brightness processing method and device, electronic equipment, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN115460354A (en) 2022-12-09

Family

ID=84295105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111389253.9A Pending CN115460354A (en) 2021-11-22 2021-11-22 Image brightness processing method and device, electronic equipment, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN115460354A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104240211A (en) * 2014-08-06 2014-12-24 中国船舶重工集团公司第七0九研究所 Image brightness and color balancing method and system for video stitching
CN104463786A (en) * 2014-12-03 2015-03-25 中国科学院自动化研究所 Mobile robot figure stitching method and device
CN105321151A (en) * 2015-10-27 2016-02-10 Tcl集团股份有限公司 Panorama stitching brightness equalization method and system
CN107330872A (en) * 2017-06-29 2017-11-07 无锡维森智能传感技术有限公司 Luminance proportion method and apparatus for vehicle-mounted viewing system
CN111080557A (en) * 2019-12-24 2020-04-28 科大讯飞股份有限公司 Brightness equalization processing method and related device

Similar Documents

Publication Publication Date Title
KR101343220B1 (en) Real-time preview for panoramic images
US9591237B2 (en) Automated generation of panning shots
CN103517041B (en) Based on real time panoramic method for supervising and the device of polyphaser rotation sweep
CN107516294B (en) Method and device for splicing images
CN101673395B (en) Image mosaic method and image mosaic device
WO2017000484A1 (en) Panoramic image generation method and apparatus for user terminal
CN105516614B (en) Information processing method and electronic equipment
CN106713755A (en) Method and apparatus for processing panoramic image
CN109040596B (en) Method for adjusting camera, mobile terminal and storage medium
CN108200360A (en) A kind of real-time video joining method of more fish eye lens panoramic cameras
CN108053438B (en) Depth of field acquisition method, device and equipment
CN112954194B (en) Image acquisition device adjusting method, system, terminal and medium
CN111462503A (en) Vehicle speed measuring method and device and computer readable storage medium
KR101705558B1 (en) Top view creating method for camera installed on vehicle and AVM system
CN114390262A (en) Method and electronic device for splicing three-dimensional spherical panoramic image
US20130222376A1 (en) Stereo image display device
CN110278366A (en) A kind of panoramic picture weakening method, terminal and computer readable storage medium
CN114494448A (en) Calibration error evaluation method and device, computer equipment and storage medium
CN115460354A (en) Image brightness processing method and device, electronic equipment, vehicle and storage medium
CN108833874B (en) Panoramic image color correction method for automobile data recorder
CN114663521A (en) All-round-view splicing processing method for assisting parking
CN115460390A (en) Image color processing method, image color processing device, vehicle, electronic device, and storage medium
CN112150355B (en) Image processing method and related equipment
CN110545375B (en) Image processing method, image processing device, storage medium and electronic equipment
US10796451B2 (en) Object tracking method, device, augmented reality system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination