CN113240582B - Image stitching method and device - Google Patents


Publication number: CN113240582B (application CN202110391998.2A)
Authority: CN (China)
Prior art keywords: image, shooting, acquisition module, compensation, parameters
Legal status: Active
Application number: CN202110391998.2A
Other languages: Chinese (zh)
Other versions: CN113240582A
Inventors: 易荣刚 (Yi Ronggang), 李俊英 (Li Junying)
Current assignee: Zhejiang Dahua Technology Co., Ltd.
Original assignee: Zhejiang Dahua Technology Co., Ltd.
Application filed by Zhejiang Dahua Technology Co., Ltd.
Priority to CN202110391998.2A
Publication of CN113240582A
Application granted
Publication of CN113240582B


Classifications

    • G06T 3/4038: Image mosaicing, e.g. composing plane images from plane sub-images (under G06T 3/40, scaling of whole images or parts thereof; G06T 3/00, geometric image transformations in the plane of the image)
    • G06T 5/90: Dynamic range modification of images or parts thereof (under G06T 5/00, image enhancement or restoration)
    • G06T 7/90: Determination of colour characteristics (under G06T 7/00, image analysis)
    (All within G: Physics; G06: Computing, calculating or counting; G06T: Image data processing or generation, in general.)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The embodiment of the invention discloses an image stitching method and device for improving the overall visual effect of stitched images. The method comprises the following steps: determining compensation parameters according to a first image acquired by a first image acquisition module of the image stitching device and a second image acquired by a second image acquisition module of the device, where the first image and the second image are obtained by shooting the same object or the same shooting scene from different angles; determining a first shooting parameter of the first image acquisition module; and adjusting a second shooting parameter of the second image acquisition module according to the first shooting parameter and the compensation parameters, so that a third image shot by the first image acquisition module with the first shooting parameter is the same as or close to a fourth image shot by the second image acquisition module with the adjusted second shooting parameter.

Description

Image stitching method and device
Technical Field
The present invention relates to the field of image video technologies, and in particular, to an image stitching method and apparatus.
Background
With the development of image and video technology, image stitching is being applied ever more widely across many fields. Panoramic image acquisition and stitching is one of the most important techniques in this field: multiple images captured of the same scene by different image acquisition devices are seamlessly stitched to obtain a panoramic image. Stitching multiple images, however, depends on how well image parameters such as spatial position, brightness, and color match across the images; if these parameters differ greatly between images, the quality of the resulting panoramic image suffers.
Disclosure of Invention
The embodiment of the invention provides an image stitching method and device to solve the problem that, when different images of the same scene captured by different devices are stitched, brightness and color are inconsistent in areas outside the stitching transition region, so that the stitched image has a better visual effect.
In a first aspect, an embodiment of the present invention provides an image stitching method applied to an image stitching device that includes a first image acquisition module and a second image acquisition module, the method comprising:
determining compensation parameters according to a first image and a second image, wherein the first image is acquired by the first image acquisition module, the second image is acquired by the second image acquisition module, and the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles;
Determining a first shooting parameter of the first image acquisition module;
according to the first shooting parameters and the compensation parameters, adjusting second shooting parameters of the second image acquisition module;
and the third image obtained by the first image acquisition module through shooting by using the first shooting parameters is the same as or close to the fourth image obtained by the second image acquisition module through shooting by using the adjusted second shooting parameters.
Optionally, the compensation parameters include a first compensation parameter and/or a second compensation parameter, wherein the first compensation parameter is a luminance ratio of the first image and the second image; the second compensation parameter is a difference in white balance gain of the first image and the second image.
Optionally, when the compensation parameter includes the first compensation parameter, adjusting a second shooting parameter of the second image acquisition module according to the first shooting parameter and the first compensation parameter, where brightness of a third image obtained by the first image acquisition module shooting with the first shooting parameter is the same as or is close to brightness of a fourth image obtained by the second image acquisition module shooting with the adjusted second shooting parameter;
when the compensation parameters comprise the second compensation parameter, adjusting the second shooting parameter of the second image acquisition module according to the first shooting parameter and the second compensation parameter, where the color of the third image shot by the first image acquisition module with the first shooting parameter is the same as or close to the color of the fourth image shot by the second image acquisition module with the adjusted second shooting parameter.
Optionally, when the compensation parameter includes a first compensation parameter, the first shooting parameter includes at least one of exposure, gain and aperture value, and the second shooting parameter includes at least one of exposure, gain and aperture value;
when the compensation parameter includes a second compensation parameter, the first photographing parameter includes a white balance gain, and the second photographing parameter includes a white balance gain.
Optionally, the first compensation parameter is a luminance ratio of the first image and the second image, including: the first compensation parameter is a luminance ratio of an overlapping area on the first image and the second image;
the second compensation parameter is a difference in white balance gain of the first image and the second image, comprising: the second compensation parameter is a difference in white balance gain of an overlapping region on the first image and the second image.
Optionally, the second compensation parameter includes a white balance gain compensation parameter of the R component and/or a white balance gain compensation parameter of the B component, where the white balance gain compensation parameter of the R component satisfies:
delta_Rgain = ROI_Rgain2 - ROI_Rgain1
where delta_Rgain indicates the white balance gain compensation parameter of the R component, ROI_Rgain1 indicates the white balance gain of the overlapping region on the first image, and ROI_Rgain2 indicates the white balance gain of the overlapping region on the second image;
and the white balance gain compensation parameter of the B component satisfies:
delta_Bgain = ROI_Bgain2 - ROI_Bgain1
where delta_Bgain indicates the white balance gain compensation parameter of the B component, ROI_Bgain1 indicates the white balance gain of the overlapping region on the first image, and ROI_Bgain2 indicates the white balance gain of the overlapping region on the second image.
In a second aspect, an embodiment of the present invention provides an image stitching apparatus, including:
the first image acquisition module is used for acquiring a first image;
the second image acquisition module is used for acquiring a second image; the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles;
The processing module is used for determining compensation parameters according to the first image and the second image;
the processing module is further used for determining a first shooting parameter of the first image acquisition module; according to the first shooting parameters and the compensation parameters, adjusting second shooting parameters of the second image acquisition module;
and the third image shot by the first image acquisition module by using the first shooting parameters is the same as the fourth image shot by the second image acquisition module by using the adjusted second shooting parameters.
Optionally, the compensation parameters include a first compensation parameter and/or a second compensation parameter, wherein the first compensation parameter is a luminance ratio of the first image and the second image; the second compensation parameter is a difference in white balance gain of the first image and the second image.
Optionally, when the compensation parameter includes the first compensation parameter, the processing module is specifically configured to: adjusting a second shooting parameter of the second image acquisition module according to the first shooting parameter and the first compensation parameter, wherein the brightness of a third image shot by the first image acquisition module through the first shooting parameter and the brightness of a fourth image shot by the second image acquisition module through the adjusted second shooting parameter are the same or close to each other;
When the compensation parameter includes the second compensation parameter, the processing module is specifically configured to: and adjusting a second shooting parameter of the second image acquisition module according to the first shooting parameter and the second compensation parameter, wherein the color of a third image shot by the first image acquisition module by using the first shooting parameter is the same as or is close to that of a fourth image shot by the second image acquisition module by using the adjusted second shooting parameter.
Optionally, when the compensation parameter includes a first compensation parameter, the first shooting parameter includes at least one of exposure, gain and aperture value, and the second shooting parameter includes at least one of exposure, gain and aperture value;
when the compensation parameter includes a second compensation parameter, the first photographing parameter includes a white balance gain, and the second photographing parameter includes a white balance gain.
Optionally, the first compensation parameter is a luminance ratio of the first image and the second image, including: the first compensation parameter is a luminance ratio of an overlapping area on the first image and the second image;
the second compensation parameter is a difference in white balance gain of the first image and the second image, comprising: the second compensation parameter is a difference in white balance gain of an overlapping region on the first image and the second image.
Optionally, the second compensation parameter includes a white balance gain compensation parameter of the R component and/or a white balance gain compensation parameter of the B component, where the white balance gain compensation parameter of the R component satisfies:
delta_Rgain = ROI_Rgain2 - ROI_Rgain1
where delta_Rgain indicates the white balance gain compensation parameter of the R component, ROI_Rgain1 indicates the white balance gain of the overlapping region on the first image, and ROI_Rgain2 indicates the white balance gain of the overlapping region on the second image;
and the white balance gain compensation parameter of the B component satisfies:
delta_Bgain = ROI_Bgain2 - ROI_Bgain1
where delta_Bgain indicates the white balance gain compensation parameter of the B component, ROI_Bgain1 indicates the white balance gain of the overlapping region on the first image, and ROI_Bgain2 indicates the white balance gain of the overlapping region on the second image.
In a third aspect, an embodiment of the present invention provides an image stitching apparatus, including a memory and a processor, where the memory stores computer instructions that, when executed on the processor, cause the processor to perform a method as provided in the first aspect above.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing computer instructions which, when run on a computer, cause the computer to perform a method as provided in the first aspect above.
In a fifth aspect, embodiments of the present application provide a computer program product which, when run on a computer, causes the computer to perform the method as provided in the first aspect above.
In the embodiment of the application, the image stitching device comprises a first image acquisition module and a second image acquisition module. The image stitching device determines compensation parameters according to a first image and a second image, wherein the first image is acquired by the first image acquisition module, the second image is acquired by the second image acquisition module, and the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles; determining a first shooting parameter of the first image acquisition module; according to the first shooting parameters and the compensation parameters, adjusting second shooting parameters of the second image acquisition module; and the third image shot by the first image acquisition module by using the first shooting parameters is the same as the fourth image shot by the second image acquisition module by using the adjusted second shooting parameters. In this way, the image parameters of the images acquired by the two image acquisition modules (namely, the third image and the fourth image) are the same or close to each other, so that the image quality obtained after the images acquired by the two image acquisition modules are spliced is higher.
Drawings
Fig. 1 is a schematic view of a scene of an image stitching method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an image stitching method according to an embodiment of the present application;
fig. 3 is a specific exemplary diagram of an image stitching method according to an embodiment of the present application;
fig. 4 is a specific exemplary diagram of an image stitching method according to an embodiment of the present application;
fig. 5 is a specific schematic diagram of an image stitching device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an image stitching device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another image stitching device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the present application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the technical solution of the present application. All other embodiments obtained by a person skilled in the art from the described embodiments without creative effort fall within the scope of protection of the technical solutions of the present application.
As described above, acquiring a panoramic image requires stitching multiple images captured by different image acquisition devices. Current image stitching techniques do not account for the differences in image parameters between the images being stitched, which degrades the visual effect of the stitched result.
In view of this, an embodiment of the present invention provides an image stitching method, which may be applied to an image stitching apparatus including more than one image capturing module, and specifically, the image stitching apparatus includes a first image capturing module and a second image capturing module. The image stitching device determines compensation parameters according to a first image and a second image, wherein the first image is acquired by the first image acquisition module, the second image is acquired by the second image acquisition module, and the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles; determining a first shooting parameter of the first image acquisition module; according to the first shooting parameters and the compensation parameters, adjusting second shooting parameters of the second image acquisition module; and the third image shot by the first image acquisition module by using the first shooting parameters is the same as the fourth image shot by the second image acquisition module by using the adjusted second shooting parameters. In this way, the image parameters of the images acquired by the two image acquisition modules (namely, the third image and the fourth image) are the same or close to each other, so that the image quality obtained after the images acquired by the two image acquisition modules are spliced is higher.
The following describes the technical scheme provided by the embodiment of the application with reference to the attached drawings.
Fig. 1 is a schematic diagram of an application scenario provided in an embodiment of the present application. As shown in fig. 1, the image stitching device includes a first image acquisition module and a second image acquisition module, where the two image acquisition modules respectively shoot the same object or the same scene from different angles to obtain a first image and a second image. Optionally, there may be an overlapping region between the first image and the second image.
Compared with either the first image or the second image alone, the image obtained by stitching them contains more of the scene, or a larger scene; stitching the images acquired by multiple image acquisition modules yields a panoramic image, expanding the shooting field of view.
In the embodiment of the application, the image stitching device determines compensation parameters according to the first image and the second image; and adjusting a second shooting parameter of the second image acquisition module according to the first shooting parameter and the compensation parameter. And enabling the third image shot by the first image acquisition module by using the first shooting parameters and the fourth image shot by the second image acquisition module by using the adjusted second shooting parameters to be the same or close. In this way, the image parameters of the images collected by the two image collecting modules (namely, the third image and the fourth image) are the same or close, so that the image quality obtained after the images collected by the two image collecting modules are spliced is higher.
It can be understood that, besides the application scenario, the image stitching device of the present application may further include a greater number of image acquisition modules, and accordingly, the image stitching device may stitch a greater number of images to obtain a panoramic image. It should be noted that the above-mentioned application scenarios are only shown for facilitating understanding of the spirit and principles of the present application, and the present application examples are not limited in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
Referring to fig. 2, a flowchart of an image stitching method according to an embodiment of the present application may be applicable to the scenario shown in fig. 1, and specifically, the method includes the following steps:
step 201: the first image acquisition module acquires a first image, and the second image acquisition module acquires a second image. As shown in fig. 1, the first image acquisition module and the second image acquisition module are located on the same image stitching device, and the two image acquisition modules acquire images at the same time, and the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles;
For example, the first image capturing module may be a camera, a lens group, a camera group, etc., and the second image capturing module may be a camera, a lens group, a camera group, etc. The first image acquisition module and the second image acquisition module are arranged at different positions on the same image splicing device. The first image acquisition module and the second image acquisition module may be the same or different.
Step 202: and determining compensation parameters according to the first image and the second image.
In an embodiment of the present application, the compensation parameter includes a first compensation parameter and/or a second compensation parameter, where the first compensation parameter is a luminance ratio of the first image and the second image, and the second compensation parameter is a difference value of white balance gains of the first image and the second image.
The process of determining the first compensation parameter and the second compensation parameter from the first image and the second image, respectively, is described below.
In the first case, the compensation parameter includes the first compensation parameter, that is, the luminance ratio; its determination includes the following steps 1 to 3.
Step 1: determine the overlapping area of the first image and the second image.
The first step is to take the first image as the base and match the first image with the second image to obtain a geometric correction parameter, the tform matrix. The matching may use, without limitation, techniques such as feature points, optical flow, or CNN-based methods. The correction parameter is a 3×3 matrix; its inverse is the tform_inv matrix.
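As an illustrative sketch (not part of the patent), the tform matrix described above can be estimated from matched feature points. The helper below is a minimal pure-NumPy direct linear transform (DLT); it uses the standard column-vector convention, whereas the patent's formulas use a row-vector convention, so the two matrices differ by a transpose. The point pairs are hypothetical.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    # Minimal DLT: solve A h = 0 for the 9 entries of the 3x3 transform,
    # taking the right singular vector of the smallest singular value.
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2, 2] == 1

# Hypothetical matches: the second view is the first shifted right by 100 px
src = [(0, 0), (50, 0), (0, 50), (50, 50)]
dst = [(100, 0), (150, 0), (100, 50), (150, 50)]
tform = estimate_homography(src, dst)   # geometric correction parameter
tform_inv = np.linalg.inv(tform)        # inverse mapping, as in the text
```

In practice the correspondences would come from feature-point, optical-flow, or CNN-based matching as the text notes, typically with many more than four pairs and a robust estimator.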
Second, using the forward mapping from the first image to the second image, i.e., the tform matrix, calculate the position coordinate range of the first image within the second image. The four original corner coordinates of the first image, (0, 0), (w1 - 1, 0), (0, h1 - 1), (w1 - 1, h1 - 1), are forward-mapped to obtain the mapped corner coordinates, and the result is intersected with the extent of the second image (corners (0, 0), (w2 - 1, 0), (0, h2 - 1), (w2 - 1, h2 - 1)); the intersection is overlapping region 2 between the first image and the second image. The forward mapping is computed as:
w = x·t11 + y·t21 + t31
u = x·t12 + y·t22 + t32
z = x·t13 + y·t23 + t33
(tx, ty) = (w/z, u/z)
where w1 and h1 are the width and height of the first image, w2 and h2 are the width and height of the second image, x and y are the original corner coordinates, t11 through t33 are the entries of the tform matrix, and (tx, ty) are the mapped corner coordinates (bx0, by0), (bx1, by1), (bx2, by2), (bx3, by3).
Third, using the inverse mapping from the second image to the first image, i.e., the tform_inv matrix, calculate the position coordinate range of the second image within the corrected first image. The four original corner coordinates of the second image, (0, 0), (w2 - 1, 0), (0, h2 - 1), (w2 - 1, h2 - 1), are inverse-mapped to obtain the mapped corner coordinates (bx0', by0'), (bx1', by1'), (bx2', by2'), (bx3', by3'), and the result is intersected with the extent of the first image (corners (0, 0), (w1 - 1, 0), (0, h1 - 1), (w1 - 1, h1 - 1)); the intersection is overlapping region 1 of the first image with the second image. The calculation is the same as for the forward mapping and is not repeated here.
It should be noted here that the overlapping area 2 and the overlapping area 1 refer to coordinate ranges of overlapping areas on the first image and the second image, respectively, on the respective images, that is, the overlapping area 1 and the overlapping area 2 are different coordinate ranges of the same overlapping area on different images.
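The second and third steps above can be sketched as follows. This is an illustrative reading of the patent's row-vector formulas ((tx, ty) = (w/z, u/z)); the axis-aligned intersection and the example matrix are assumptions for demonstration.

```python
import numpy as np

def map_points_rowvec(pts, T):
    # Patent convention: [w, u, z] = [x, y, 1] . T, then (tx, ty) = (w/z, u/z)
    pts_h = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
    wuz = pts_h @ T
    return wuz[:, :2] / wuz[:, 2:3]

def overlap_on_second(T, w1, h1, w2, h2):
    # Forward-map image 1's four corners and intersect with image 2's extent
    corners = [(0, 0), (w1 - 1, 0), (0, h1 - 1), (w1 - 1, h1 - 1)]
    m = map_points_rowvec(corners, T)
    x0 = max(m[:, 0].min(), 0.0); x1 = min(m[:, 0].max(), w2 - 1.0)
    y0 = max(m[:, 1].min(), 0.0); y1 = min(m[:, 1].max(), h2 - 1.0)
    return (x0, y0, x1, y1)     # overlapping region 2, as a bounding box

# Hypothetical tform: pure horizontal shift of 100 px between the views
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [-100.0, 0.0, 1.0]])
box = overlap_on_second(T, 200, 100, 200, 100)   # -> (0.0, 0.0, 99.0, 99.0)
```

Overlapping region 1 is computed the same way, using tform_inv with the roles of the two images swapped.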
Step 2: separately collect the RAW-domain (original image file) statistics of the overlapping regions of the first image and the second image. The RAW-domain statistics include the luminance value of each pixel in the overlapping region and/or the sum of the luminance values of all pixels in the overlapping region.
Step 3: determine the brightness ratio of the overlapping region of the first image and the second image from the RAW-domain statistics of the overlapping region; this brightness ratio is the first compensation parameter.
Assume the sum of the brightness values of all pixels of the first image in the overlapping region is sum_y1, and the sum for the second image is sum_y2. The luminance ratio of the first image to the second image in the overlapping region is then sum_y1/sum_y2, i.e., the first compensation parameter is sum_y1/sum_y2.
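A toy sketch of steps 2 and 3, with made-up pixel values and a hypothetical region layout: sum the luminance over each overlap region and take the ratio.

```python
import numpy as np

def luminance_ratio(raw1, roi1, raw2, roi2):
    # roi = (x0, y0, x1, y1): inclusive coordinates of the overlap on that image
    def roi_sum(img, roi):
        x0, y0, x1, y1 = roi
        return float(img[y0:y1 + 1, x0:x1 + 1].sum())
    return roi_sum(raw1, roi1) / roi_sum(raw2, roi2)   # sum_y1 / sum_y2

# Hypothetical 4x4 RAW-domain luminance planes; image 2 is half as bright
img1 = np.full((4, 4), 100.0)
img2 = np.full((4, 4), 50.0)
first_comp = luminance_ratio(img1, (0, 0, 3, 3), img2, (0, 0, 3, 3))  # -> 2.0
```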
It should be noted that step 1 above is optional; if it is not performed, the luminance ratio may be computed over the whole first image and the whole second image.
In a second case, the compensation parameter includes the second compensation parameter, that is, a white balance difference value, and the determining process of the second compensation parameter includes the following steps 1 to 3.
Step 1: determine the overlapping area of the first image and the second image.
Regarding step 1, please refer to the implementation manner of step 1 in the first case, and the description thereof is omitted herein.
Step 2: collect the RAW-domain statistics of the overlapping area of the first image and the second image, including the white balance gain of each pixel in the overlapping area and/or the sum of the white balance gains of all pixels in the overlapping area.
Step 3: determine, from the RAW-domain statistics of the overlapping area of the first image and the second image, the difference of the white balance gains of the two images in the overlapping area, i.e., the second compensation parameter.
Assume the white balance gain of the overlapping region on the first image includes ROI_Rgain1 and/or ROI_Bgain1, where ROI_Rgain1 is the white balance gain of the R component of the overlapping region on the first image and ROI_Bgain1 is the white balance gain of the B component of the overlapping region on the first image.
Similarly, assume the white balance gain of the overlapping region on the second image includes ROI_Rgain2 and/or ROI_Bgain2, where ROI_Rgain2 is the white balance gain of the R component of the overlapping region on the second image and ROI_Bgain2 is the white balance gain of the B component of the overlapping region on the second image.
The white balance gain difference of the overlapping region includes delta_Rgain and delta_Bgain, where delta_Rgain indicates the white balance gain difference of the R component and delta_Bgain indicates the white balance gain difference of the B component, computed as:
delta_Rgain = ROI_Rgain2 - ROI_Rgain1
delta_Bgain = ROI_Bgain2 - ROI_Bgain1
the delta_Rgain and the delta_Bgain are the second compensation parameters.
After determining the first and second compensation parameters, step 203 may be performed.
Step 203: and determining a first shooting parameter of the first image acquisition module. The first photographing parameter includes at least one of exposure, gain, aperture value, or white balance gain.
Before step 203, the method further includes: judging, from the collected RAW-domain statistics of the first image, whether the brightness of the first image is suitable (e.g., whether it falls within a preset brightness range). If the brightness is suitable, the exposure, gain, and aperture value in the first shooting parameter are the current exposure, gain, and aperture value of the first image. If the brightness is not suitable, the RAW-domain statistics are input to an automatic exposure (AE) algorithm module, which can adaptively adjust them so that the adjusted image brightness is suitable; the AE algorithm outputs the exposure, gain, and aperture value corresponding to the adjusted image, and these become the exposure, gain, and aperture value in the first shooting parameter.
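The brightness check can be sketched as follows; the brightness range, the parameter names, and the stand-in for the AE module's output are all assumptions for illustration, not the patent's AE algorithm.

```python
def first_shooting_params(mean_luma, current, luma_range=(100, 140)):
    # If the first image's brightness is already suitable, keep the current
    # exposure/gain/aperture; otherwise emulate an AE module by scaling the
    # exposure toward the middle of the target range (stand-in only).
    lo, hi = luma_range
    if lo <= mean_luma <= hi:
        return dict(current)
    adjusted = dict(current)
    adjusted["exposure"] = current["exposure"] * ((lo + hi) / 2) / mean_luma
    return adjusted

current = {"exposure": 10.0, "gain": 1.0, "aperture": 2.8}
kept = first_shooting_params(120, current)     # in range: parameters kept
boosted = first_shooting_params(60, current)   # too dark: exposure scaled up
```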
Step 204: and adjusting a second shooting parameter of the second image acquisition module according to the first shooting parameter and the compensation parameter.
When the compensation parameter includes the first compensation parameter, the second shooting parameter of the second image acquisition module is adjusted according to the first shooting parameter and the first compensation parameter, so that the brightness of a third image shot by the first image acquisition module using the first shooting parameter is the same as or close to that of a fourth image shot by the second image acquisition module using the adjusted second shooting parameter.
When the compensation parameter includes the second compensation parameter, the second shooting parameter of the second image acquisition module is adjusted according to the first shooting parameter and the second compensation parameter, so that the color of the third image shot by the first image acquisition module using the first shooting parameter is the same as or close to that of the fourth image shot by the second image acquisition module using the adjusted second shooting parameter.
Illustratively, the second shooting parameter includes exposure, gain, aperture value, and white balance gain.
The exposure, gain, and aperture value are obtained from the first compensation parameter and the first shooting parameter; specifically, this is implemented by dynamic adjustment in an AE adjustment module in the image stitching device.
For example, assume the first image acquisition module is referred to as Sensor1 and the second image acquisition module is referred to as Sensor2. The AE adjustment module converts the exposure, gain, and aperture value to be configured for Sensor2 from the first compensation parameter and the exposure, gain, and aperture value currently configured for Sensor1 (namely the first shooting parameter), and configures them to Sensor2, thereby completing the adjustment of the exposure, gain, and aperture value in the second shooting parameter.
The white balance gain is adjusted according to the second compensation parameter and the first shooting parameter; specifically, this is implemented by an automatic white balance (Auto White Balance, AWB) adjustment module in the image stitching device.
For example, assume the first image acquisition module is referred to as Sensor1 and the second image acquisition module is referred to as Sensor2. The AWB adjustment module sums the second compensation parameter obtained in step 202 with the white balance gain currently configured for Sensor1 (namely the first shooting parameter) to obtain the white balance gain to be configured for Sensor2, and configures it to Sensor2, thereby completing the adjustment of the white balance gain in the second shooting parameter.
Two examples are presented below.
Example 1 takes the case where the compensation parameter is the luminance ratio of the first image and the second image.
Referring to fig. 3, the process includes:
Step 301: input the first image and the second image, and obtain RAW-domain statistical information of the two images.
Step 302: determine whether the static calibration function is enabled; if so, go to step 303; otherwise, go to step 310.
The static calibration function refers to the function of the image stitching device for determining the first compensation parameter. The user may choose to turn the static calibration function on or off, so after the first image and the second image are input, it can be determined whether the function is on (i.e. enabled).
Step 303: determine whether the calibration preparation work is complete; if so, go to step 307; otherwise, go to step 304. The calibration preparation work refers to storing the coordinates of the overlapping area of the first image and the second image in the image stitching device: if those coordinates are already stored, the calibration preparation work is judged to be complete; if not, it is judged to be incomplete. Alternatively, the calibration preparation may also refer to activating related operation modules (such as the AE adjustment module and/or the AWB adjustment module) in the image stitching device.
Step 304: judge whether the brightness of the first image is appropriate; if so, go to step 305; otherwise, go to step 310.
Step 305: judge whether an overlapping area exists; if so, go to step 306; otherwise, end. The overlapping area is obtained by the feature matching module; if no overlapping area exists, the first image and the second image have no matching region, i.e. the two images cannot be stitched.
Step 306: read the coordinates of the overlapping area, completing the calibration preparation work.
Step 307: separately count the RAW-domain statistical information of the first image and the second image in the overlapping area.
Step 308: calculate sum_y1 and sum_y2, the sums of the luminance values of all pixels in the overlapping areas of the two images: sum_y1 is the sum over the overlapping area on the first image, and sum_y2 is the sum over the overlapping area on the second image.
Step 309: determine and save the luminance ratio, i.e. the first compensation parameter, and turn off the static calibration function, where luminance ratio = sum_y1 / sum_y2.
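Steps 307–309 above can be sketched as follows. This is an illustrative sketch: representing the luminance (Y) planes as 2-D lists and the overlap coordinates as rectangles are assumptions made for the example:

```python
# Illustrative sketch of steps 307-309. The 2-D-list representation of the
# luminance planes and the rectangle form of the overlap coordinates are
# assumptions; the original method only specifies the sums and their ratio.

def luminance_ratio(y1, y2, rect1, rect2):
    """First compensation parameter: sum_y1 / sum_y2 over the overlap areas.

    y1, y2: luminance planes of the first and second images (rows of pixels);
    rect1, rect2: (left, top, right, bottom) overlap rectangle on each image,
    with right/bottom exclusive.
    """
    def region_sum(plane, rect):
        left, top, right, bottom = rect
        return sum(sum(row[left:right]) for row in plane[top:bottom])

    sum_y1 = region_sum(y1, rect1)  # step 308, first image
    sum_y2 = region_sum(y2, rect2)  # step 308, second image
    return sum_y1 / sum_y2          # step 309
```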
The above steps apply when the brightness of the first image is appropriate. If the brightness of the first image is not appropriate, step 310 is performed directly after step 304.
Step 310: input the RAW-domain statistical information of the first image into the AE algorithm module. The output of the AE algorithm module is the exposure, gain, and aperture value corresponding to the first image at appropriate brightness after adaptive adjustment.
Step 311: configure Sensor1 with the current exposure, gain, and aperture value, where Sensor1 is the first image acquisition module.
Step 312: calculate the exposure, gain, and aperture value to be configured for Sensor2 according to the luminance ratio and Sensor1's current exposure, gain, and aperture value.
Step 313: configure Sensor2 with the exposure, gain, and aperture value calculated in step 312.
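A worked numeric sketch of steps 311–313 follows; all values are invented for illustration, and treating the luminance ratio as a pure exposure multiplier is an assumption (a real AE module might also redistribute the compensation across gain and aperture):

```python
# Worked numeric sketch of steps 311-313; every value here is invented.

luminance_ratio = 1.25                        # sum_y1 / sum_y2, from step 309
exposure1, gain1, aperture1 = 8000, 2.0, 2.8  # Sensor1's configuration (step 311)

# Step 312: Sensor2 must be brighter by the same factor to match Sensor1.
exposure2 = exposure1 * luminance_ratio       # 10000.0
gain2, aperture2 = gain1, aperture1

# Step 313: configure Sensor2 with (exposure2, gain2, aperture2).
```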
Example 2 takes the case where the compensation parameter is the white balance gain difference of the first image and the second image. Referring to fig. 4, the process includes:
Step 401: input the first image and the second image, and obtain RAW-domain statistical information of the two images.
Step 402: input the RAW-domain statistical information of the first image in the overlapping area into an AWB algorithm module. The output of the AWB algorithm module is the current white balance gain of the first image, which includes Rgain1 and Bgain1.
Step 403: configure Sensor1 with the white balance gains Rgain1 and Bgain1.
Step 404: determine whether the overlapping-area coordinates need to be reacquired; if so, go to step 405; otherwise, go to step 407. This is judged according to a flag bit: each time the image stitching device starts, the flag defaults to 1, indicating that the overlapping-area coordinates need to be reacquired; after the device has acquired the corresponding coordinates, the flag is set to 0, indicating that they do not need to be reacquired.
Step 405: read the coordinates of the overlapping area of the first image and the second image.
Step 406: set the flag bit for reacquiring the overlapping-area coordinates to 0.
Step 407: determine whether the overlapping-area coordinates exist; if so, go to step 408; otherwise, end. If the coordinates do not exist, the first image and the second image do not match, i.e. the two images cannot be stitched.
Step 408: acquire the white balance gains of the first image in the overlapping area: ROI_Rgain1 and ROI_Bgain1.
Step 409: acquire the white balance gains of the second image in the overlapping area: ROI_Rgain2 and ROI_Bgain2.
Step 410: calculate the white balance gain differences of the first image and the second image in the overlapping area:
delta_Rgain = ROI_Rgain2 - ROI_Rgain1
delta_Bgain = ROI_Bgain2 - ROI_Bgain1
Step 411: calculate the final white balance gains of the second image:
Rgain2 = Rgain1 + delta_Rgain
Bgain2 = Bgain1 + delta_Bgain
Step 412: configure Sensor2 with the white balance gains Rgain2 and Bgain2, i.e. the final white balance gains.
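A worked numeric sketch of steps 408–412 follows; all gain values are invented for illustration:

```python
# Worked numeric sketch of steps 408-412; every gain value is invented.

roi_rgain1, roi_bgain1 = 1.50, 2.00  # first image, overlap area (step 408)
roi_rgain2, roi_bgain2 = 1.75, 1.50  # second image, overlap area (step 409)
rgain1, bgain1 = 1.60, 2.10          # Sensor1's configured gains (step 403)

# Step 410: white balance gain differences in the overlap area.
delta_rgain = roi_rgain2 - roi_rgain1  # 0.25
delta_bgain = roi_bgain2 - roi_bgain1  # -0.5

# Step 411: final white balance gains for the second image.
rgain2 = rgain1 + delta_rgain          # ≈ 1.85
bgain2 = bgain1 + delta_bgain          # ≈ 1.6

# Step 412: configure Sensor2 with (rgain2, bgain2).
```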
Fig. 5 is a schematic diagram of an embodiment of the present invention. Taking fig. 5 as an example, the image stitching method provided by an embodiment of the present invention is described in detail below. Referring to fig. 5, the two camera lenses of a binocular camera (i.e. the first image acquisition module and the second image acquisition module of the image stitching device) shoot two images, namely the first image and the second image in fig. 5. The RAW-domain statistical information of the two images is input into the feature matching module to obtain the overlapping area and the overlapping-area coordinates of the first image and the second image; for the specific method, reference may be made to the corresponding description in the embodiment shown in fig. 2.
In one possible implementation, the AE adjustment module and the AWB adjustment module operate in parallel, as shown in fig. 5. In other embodiments of the present invention, the two modules may also operate in series, with the AE adjustment module either before or after the AWB adjustment module.
The RAW-domain statistical information of the first image and the second image is input into the AE adjustment module, which adjusts the brightness shooting parameters (namely exposure, gain, and aperture value) corresponding to the second image according to the image parameters of the two images, so that the two images shot by the two lenses of the binocular camera at the next shooting have the same brightness.
The RAW-domain statistical information of the first image and the second image is likewise input into the AWB adjustment module, which adjusts the color shooting parameter (namely white balance gain) corresponding to the second image according to the image parameters of the two images, so that the two images shot by the two lenses of the binocular camera at the next shooting have the same color.
The shooting parameters of the two lenses adjusted by the AE adjustment module and the AWB adjustment module are configured to the sensors of the two camera lenses (namely Sensor1 and Sensor2), so that the images subsequently shot by the two lenses of the binocular camera have the same brightness and color, which facilitates stitching them into a panoramic image with a good overall visual effect.
Based on the same inventive concept, an embodiment of the present invention provides an image stitching device that can realize the functions corresponding to the foregoing image stitching method. The image stitching device may be a hardware structure, a software module, or a hardware structure plus a software module, and may be implemented by a chip system, which may consist of a chip or may include a chip and other discrete devices. Referring to fig. 6, the device includes an acquisition module 601 and a processing module 602, wherein:
the acquisition module 601 comprises a first image acquisition module and a second image acquisition module, wherein the first image acquisition module is used for acquiring the first image, and the second image acquisition module is used for acquiring the second image.
a processing module 602, configured to determine a compensation parameter according to the first image and the second image; further configured to determine a first shooting parameter of the first image acquisition module; and configured to adjust a second shooting parameter of the second image acquisition module according to the first shooting parameter and the compensation parameter;
and the third image obtained by the first image acquisition module through shooting by using the first shooting parameters is the same as or close to the fourth image obtained by the second image acquisition module through shooting by using the adjusted second shooting parameters.
In a possible implementation, the compensation parameters determined by the processing module include a first compensation parameter and/or a second compensation parameter, where:
the first compensation parameter is a luminance ratio of the first image and the second image;
the second compensation parameter is a difference in white balance gain of the first image and the second image.
In a possible implementation, when the compensation parameter includes the first compensation parameter, the second shooting parameter of the second image acquisition module is adjusted according to the first shooting parameter and the first compensation parameter, so that the brightness of a third image shot by the first image acquisition module using the first shooting parameter is the same as or close to that of a fourth image shot by the second image acquisition module using the adjusted second shooting parameter;
when the compensation parameter includes the second compensation parameter, the second shooting parameter of the second image acquisition module is adjusted according to the first shooting parameter and the second compensation parameter, so that the color of the third image shot by the first image acquisition module using the first shooting parameter is the same as or close to that of the fourth image shot by the second image acquisition module using the adjusted second shooting parameter.
In one possible embodiment, when the compensation parameter includes the first compensation parameter, the first shooting parameter includes at least one of exposure, gain, and aperture value, and the second shooting parameter includes at least one of exposure, gain, and aperture value;
when the compensation parameter includes the second compensation parameter, the first shooting parameter includes a white balance gain, and the second shooting parameter includes a white balance gain.
In one possible implementation, the first compensation parameter is a luminance ratio of the first image and the second image, including: the first compensation parameter is a luminance ratio of an overlapping area on the first image and the second image;
the second compensation parameter is a difference in white balance gain of the first image and the second image, comprising: the second compensation parameter is a difference in white balance gain of the first image and the second image in an overlapping region.
In one possible implementation, the second compensation parameter includes a white balance gain compensation parameter of an R component and a white balance gain compensation parameter of a B component;
the white balance gain compensation parameters of the R component satisfy the following conditions:
delta_Rgain=ROI_Rgain 2 -ROI_Rgain 1
Wherein delta_Rgain is used to indicate the white balance gain compensation parameter of the R component, ROI_Rgain 1 White balance gain, ROI_Rgain, for indicating overlapping region of the first image and the second image 2 A white balance gain for indicating an overlapping region on the second image with the first image;
the white balance gain compensation parameters of the component B satisfy the following conditions:
delta_Bgain=ROI_Bgain 2 -ROI_Bgain 1
wherein delta_bgain is used to indicate the white balance gain compensation parameter of the B component, roi_bgain 1 White balance gain, roi_bgain, for indicating overlapping region of the first image and the second image 2 For indicating a white balance gain of an overlapping area on the second image with the first image.
For all relevant content of the steps involved in the foregoing image stitching method embodiments, reference may be made to the functional descriptions of the corresponding functional modules of the image stitching device in the embodiments of the present application, which are not repeated here.
The division of the modules in the embodiments of the present application is merely a schematic logical-function division; in actual implementation there may be other division manners. In addition, the functional modules in the embodiments of the present application may be integrated in one processor, may exist separately and physically, or two or more modules may be integrated in one module. The integrated module may be implemented in hardware or as a software functional module.
Referring to fig. 7, based on the same inventive concept, an embodiment of the present invention provides an image stitching apparatus, which includes at least one processor 701, where the processor 701 is configured to execute a computer program stored in a memory, to implement the steps of the image stitching method shown in fig. 3 according to the embodiment of the present invention.
Optionally, the processor 701 may be a general-purpose processor such as a Central Processing Unit (CPU), a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the image stitching method disclosed in connection with the embodiments of the present invention may be executed directly by a hardware processor, or by a combination of hardware and software modules in the processor.
Optionally, the image stitching apparatus may further include a memory 702 connected to the at least one processor 701, where the memory 702 stores instructions executable by the at least one processor 701, and the at least one processor 701 may execute the steps included in the image stitching method by executing the instructions stored in the memory 702.
The embodiments of the present invention do not limit the specific connection medium between the processor 701 and the memory 702. The memory 702 may include at least one type of storage medium, for example flash memory, hard disk, multimedia card, card memory, random access memory (Random Access Memory, RAM), static random access memory (Static Random Access Memory, SRAM), programmable read-only memory (Programmable Read Only Memory, PROM), read-only memory (Read Only Memory, ROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), magnetic memory, magnetic disk, or optical disc. The memory 702 may also be, but is not limited to, any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 702 in the embodiments of the present invention may also be a circuit or any other device capable of performing a storage function, for storing program instructions and/or data.
By programming the processor 701, the code corresponding to the image stitching method described in the foregoing embodiments may be burned into a chip, so that the chip can execute the steps of the image stitching method when running; how to program the processor 701 is a technique known to those skilled in the art and is not repeated here. The physical device corresponding to the processing module 602 may be the aforementioned processor 701. The image stitching device may be used to perform the method provided by the embodiment shown in fig. 2; therefore, for the functions that can be implemented by the functional modules of the device, reference may be made to the corresponding description in the embodiment shown in fig. 2, which is not repeated.
Optionally, the image stitching device in fig. 7 may further include an image acquisition module (including a first image acquisition module and a second image acquisition module), such as a camera. The first image acquisition module and the second image acquisition module may be different cameras, for example cameras arranged at different positions on the image stitching device. Each camera may be a common camera or a wide-angle camera, which is not limited in the embodiments of the present application.
Based on the same inventive concept, embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when run on a computer, cause the computer to perform the steps of the image stitching method as described above.
In some possible embodiments, aspects of the image stitching method provided by the present application may also be implemented in the form of a program product comprising program code for causing a detection device to carry out the steps of the image stitching method according to the various exemplary embodiments of the application as described in the present specification, when the program product is run on an electronic device.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (15)

1. An image stitching method, applied to an image stitching device, the image stitching device comprising a first image acquisition module, a second image acquisition module, and an automatic exposure module, the method comprising:
determining compensation parameters according to a first image and a second image, wherein the first image is acquired by the first image acquisition module, the second image is acquired by the second image acquisition module, the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles, the compensation parameters comprise a first compensation parameter and a second compensation parameter, the first compensation parameter is the brightness ratio of the first image and the second image, the second compensation parameter comprises a white balance gain compensation parameter of an R component and/or a white balance gain compensation parameter of a B component, and the second compensation parameter is the difference value of the white balance gains of the first image and the second image;
if the brightness of the first image falls within a preset brightness range, determining a first shooting parameter of the first image acquisition module;
if the brightness of the first image does not fall within the preset brightness range, adjusting the brightness of the first image to the preset brightness range through the automatic exposure module, and determining the first shooting parameter of the first image acquisition module according to the adjusted brightness of the first image;
according to the first shooting parameters and the compensation parameters, adjusting second shooting parameters of the second image acquisition module; and the third image obtained by the first image acquisition module through shooting by using the first shooting parameters is the same as or close to the fourth image obtained by the second image acquisition module through shooting by using the adjusted second shooting parameters.
2. The method of claim 1, wherein,
the compensation parameters comprise the second compensation parameters, the second shooting parameters of the second image acquisition module are adjusted according to the first shooting parameters and the second compensation parameters, and the color of a third image shot by the first image acquisition module through the first shooting parameters is the same as or is close to that of a fourth image shot by the second image acquisition module through the adjusted second shooting parameters.
3. The method of claim 1, wherein,
when the compensation parameters comprise the first compensation parameters, adjusting second shooting parameters of the second image acquisition module according to the first shooting parameters and the first compensation parameters, wherein the brightness of a third image shot by the first image acquisition module through the first shooting parameters is the same as or close to that of a fourth image shot by the second image acquisition module through the adjusted second shooting parameters.
4. The method of claim 2, wherein,
the compensation parameters include the second compensation parameter, the first shooting parameter includes a white balance gain, and the second shooting parameter includes a white balance gain.
5. The method of claim 1, wherein,
when the compensation parameter includes the first compensation parameter, the first shooting parameter includes at least one of exposure, gain, and aperture value, and the second shooting parameter includes at least one of exposure, gain, and aperture value.
6. The method of claim 1, wherein,
the first compensation parameter is a luminance ratio of the first image and the second image, comprising: the first compensation parameter is a luminance ratio of an overlapping area on the first image and the second image.
7. The method of claim 1, wherein,
the white balance gain compensation parameter of the R component satisfies the following condition:
delta_Rgain = ROI_Rgain2 - ROI_Rgain1
wherein delta_Rgain is used to indicate the white balance gain compensation parameter of the R component, ROI_Rgain1 is used to indicate the white balance gain of the overlapping area of the first image with the second image, and ROI_Rgain2 is used to indicate the white balance gain of the overlapping area on the second image with the first image;
the white balance gain compensation parameter of the B component satisfies the following condition:
delta_Bgain = ROI_Bgain2 - ROI_Bgain1
wherein delta_Bgain is used to indicate the white balance gain compensation parameter of the B component, ROI_Bgain1 is used to indicate the white balance gain of the overlapping area of the first image with the second image, and ROI_Bgain2 is used to indicate the white balance gain of the overlapping area on the second image with the first image.
8. An image stitching device, comprising:
the first image acquisition module is used for acquiring a first image;
the second image acquisition module is used for acquiring a second image; the first image and the second image are images obtained by shooting the same object or the same shooting scene from different angles;
a processing module, configured to determine a compensation parameter according to the first image and the second image, where the compensation parameter includes a first compensation parameter and a second compensation parameter, the first compensation parameter is a luminance ratio of the first image and the second image, the second compensation parameter includes a white balance gain compensation parameter of an R component and/or a white balance gain compensation parameter of a B component, and the second compensation parameter is a difference value of white balance gains of the first image and the second image;
the processing module is further configured to: determine a first shooting parameter of the first image acquisition module if the brightness of the first image falls within a preset brightness range; if the brightness of the first image does not fall within the preset brightness range, adjust the brightness of the first image to the preset brightness range through the automatic exposure module and determine the first shooting parameter of the first image acquisition module according to the adjusted brightness of the first image; and adjust a second shooting parameter of the second image acquisition module according to the first shooting parameter and the compensation parameter;
and the third image obtained by the first image acquisition module through shooting by using the first shooting parameters is the same as or close to the fourth image obtained by the second image acquisition module through shooting by using the adjusted second shooting parameters.
9. The apparatus of claim 8, wherein,
the compensation parameters comprise the second compensation parameters, the second shooting parameters of the second image acquisition module are adjusted according to the first shooting parameters and the second compensation parameters, and the color of a third image shot by the first image acquisition module through the first shooting parameters is the same as or is close to that of a fourth image shot by the second image acquisition module through the adjusted second shooting parameters.
10. The apparatus of claim 8, wherein,
when the compensation parameter includes the first compensation parameter, the processing module is specifically configured to: adjust the second shooting parameter of the second image acquisition module according to the first shooting parameter and the first compensation parameter, where the brightness of a third image captured by the first image acquisition module with the first shooting parameter is the same as or close to the brightness of a fourth image captured by the second image acquisition module with the adjusted second shooting parameter.
11. The apparatus of claim 8, wherein,
the compensation parameter includes the second compensation parameter, the first shooting parameter includes a white balance gain, and the second shooting parameter includes a white balance gain.
12. The apparatus of claim 8, wherein,
when the compensation parameter includes the first compensation parameter, the first shooting parameter includes at least one of an exposure, a gain, and an aperture value, and the second shooting parameter includes at least one of an exposure, a gain, and an aperture value.
13. The apparatus of claim 8, wherein,
the first compensation parameter being a luminance ratio of the first image and the second image comprises: the first compensation parameter is a ratio of the luminance of the overlapping area on the first image to the luminance of the overlapping area on the second image.
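The luminance ratio over the overlapping area could be computed as below. How the overlap is located and the use of the mean gray level as the luminance measure are assumptions for illustration; the function and box parameters are hypothetical.

```python
import numpy as np

def overlap_luminance_ratio(img1, img2, overlap1_box, overlap2_box):
    """First compensation parameter: ratio of the mean luminance of the
    overlapping area on the first image to that on the second image.
    Boxes are (top, bottom, left, right) in each image's coordinates."""
    t1, b1, l1, r1 = overlap1_box
    t2, b2, l2, r2 = overlap2_box
    roi1 = img1[t1:b1, l1:r1].astype(np.float64)
    roi2 = img2[t2:b2, l2:r2].astype(np.float64)
    # Mean gray level stands in for luminance of each overlap region.
    return roi1.mean() / roi2.mean()
```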
14. The apparatus of claim 8, wherein,
the white balance gain compensation parameter of the R component satisfies:
delta_Rgain = ROI_Rgain2 - ROI_Rgain1
where delta_Rgain indicates the white balance gain compensation parameter of the R component, ROI_Rgain1 indicates the white balance gain of the overlapping area on the first image with the second image, and ROI_Rgain2 indicates the white balance gain of the overlapping area on the second image with the first image;
the white balance gain compensation parameter of the B component satisfies:
delta_Bgain = ROI_Bgain2 - ROI_Bgain1
where delta_Bgain indicates the white balance gain compensation parameter of the B component, ROI_Bgain1 indicates the white balance gain of the overlapping area on the first image with the second image, and ROI_Bgain2 indicates the white balance gain of the overlapping area on the second image with the first image.
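The two difference formulas can be computed directly once the per-region white balance gains are known. In the sketch below the gains are estimated with a simple gray-world rule, which is an assumption for illustration: the patent does not specify how ROI_Rgain and ROI_Bgain are obtained.

```python
import numpy as np

def grayworld_gains(roi_rgb):
    """Estimate white balance gains of an RGB region under the gray-world
    assumption: scale R and B so their means match the G mean."""
    r, g, b = (roi_rgb[..., i].mean() for i in range(3))
    return g / r, g / b  # (Rgain, Bgain)

def wb_gain_compensation(roi1_rgb, roi2_rgb):
    """Second compensation parameters, per the claim:
    delta_Rgain = ROI_Rgain2 - ROI_Rgain1
    delta_Bgain = ROI_Bgain2 - ROI_Bgain1"""
    rgain1, bgain1 = grayworld_gains(roi1_rgb)
    rgain2, bgain2 = grayworld_gains(roi2_rgb)
    return rgain2 - rgain1, bgain2 - bgain1
```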
15. A computer readable storage medium storing computer instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN202110391998.2A 2021-04-13 2021-04-13 Image stitching method and device Active CN113240582B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110391998.2A CN113240582B (en) 2021-04-13 2021-04-13 Image stitching method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110391998.2A CN113240582B (en) 2021-04-13 2021-04-13 Image stitching method and device

Publications (2)

Publication Number Publication Date
CN113240582A CN113240582A (en) 2021-08-10
CN113240582B true CN113240582B (en) 2023-12-12

Family

ID=77128083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110391998.2A Active CN113240582B (en) 2021-04-13 2021-04-13 Image stitching method and device

Country Status (1)

Country Link
CN (1) CN113240582B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114040179B (en) * 2021-10-20 2023-06-06 重庆紫光华山智安科技有限公司 Image processing method and device
CN115460354A (en) * 2021-11-22 2022-12-09 北京罗克维尔斯科技有限公司 Image brightness processing method and device, electronic equipment, vehicle and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107659769A (en) * 2017-09-07 2018-02-02 维沃移动通信有限公司 A kind of image pickup method, first terminal and second terminal
KR101831429B1 (en) * 2017-11-10 2018-02-22 (주)대지이엔지 Apparatus for air shooting able to get the image of blind spot and to control resolution automatically
CN109598673A (en) * 2017-09-30 2019-04-09 深圳超多维科技有限公司 Image split-joint method, device, terminal and computer readable storage medium
CN110012209A (en) * 2018-01-05 2019-07-12 广东欧珀移动通信有限公司 Panorama image generation method, device, storage medium and electronic equipment
WO2020042858A1 (en) * 2018-08-29 2020-03-05 上海商汤智能科技有限公司 Image stitching method and device, on-board image processing device, electronic apparatus, and storage medium
WO2020093651A1 (en) * 2018-11-09 2020-05-14 浙江宇视科技有限公司 Method and apparatus for automatically detecting and suppressing fringes, electronic device and computer-readable storage medium
CN111182217A (en) * 2020-01-07 2020-05-19 徐梦影 Image white balance processing method and device
WO2021026822A1 (en) * 2019-08-14 2021-02-18 深圳市大疆创新科技有限公司 Image processing method and apparatus, image photographing device, and mobile terminal


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liao W S. Real-time spherical panorama image stitching using OpenCL. In: Proceedings of International Conference on Computer Graphics and Virtual Reality, Las Vegas, 2011, pp. 113-119. *
A luminance compensation algorithm in fisheye video panorama stitching; Li Jiguo; Wang Yue; Zhang Xinfeng; Ma Siwei; Scientia Sinica Informationis (Issue 03); pp. 33-45 *

Also Published As

Publication number Publication date
CN113240582A (en) 2021-08-10

Similar Documents

Publication Publication Date Title
US10834316B2 (en) Image processing apparatus, image processing method, and imaging system
CN111028189B (en) Image processing method, device, storage medium and electronic equipment
US20200288059A1 (en) Image processor, image processing method and program, and imaging system
US11430103B2 (en) Method for image processing, non-transitory computer readable storage medium, and electronic device
US10389948B2 (en) Depth-based zoom function using multiple cameras
US9325899B1 (en) Image capturing device and digital zooming method thereof
WO2019105262A1 (en) Background blur processing method, apparatus, and device
US8724007B2 (en) Metadata-driven method and apparatus for multi-image processing
CN113240582B (en) Image stitching method and device
KR20180109918A (en) Systems and methods for implementing seamless zoom functionality using multiple cameras
CN111028190A (en) Image processing method, image processing device, storage medium and electronic equipment
US20130121525A1 (en) Method and Apparatus for Determining Sensor Format Factors from Image Metadata
WO2019056527A1 (en) Capturing method and device
CN109803086B (en) Method, device and camera for blending first and second images with overlapping fields of view
CN110288511B (en) Minimum error splicing method and device based on double camera images and electronic equipment
CN109040596B (en) Method for adjusting camera, mobile terminal and storage medium
CN109166076B (en) Multi-camera splicing brightness adjusting method and device and portable terminal
CN112930677B (en) Method for switching between first lens and second lens and electronic device
TWI785162B (en) Method of providing image and electronic device for supporting the method
CN108198189B (en) Picture definition obtaining method and device, storage medium and electronic equipment
CN110650288A (en) Focusing control method and device, electronic equipment and computer readable storage medium
CN113592739A (en) Method and device for correcting lens shadow and storage medium
WO2013109192A1 (en) Method and device for image processing
US20190052815A1 (en) Dual-camera image pick-up apparatus and image capturing method thereof
CN113965664A (en) Image blurring method, storage medium and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant