CN111652916B - Panoramic image generation method, panoramic image generation device and computer storage medium - Google Patents

Panoramic image generation method, panoramic image generation device and computer storage medium

Info

Publication number
CN111652916B
Authority
CN
China
Prior art keywords
image
sequence
scene
images
exposure level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010394240.XA
Other languages
Chinese (zh)
Other versions
CN111652916A (en)
Inventor
王元炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202010394240.XA priority Critical patent/CN111652916B/en
Publication of CN111652916A publication Critical patent/CN111652916A/en
Application granted granted Critical
Publication of CN111652916B publication Critical patent/CN111652916B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a panoramic image generation method, a panoramic image generation device and a computer storage medium. The method comprises the following steps: obtaining an image sequence comprising scene images at a plurality of viewpoints; selecting one scene image from the image sequence as a reference image, and registering the other scene images in the image sequence with the reference image; calibrating a camera response curve based on the registered scene images; transforming all the registered scene images from the pixel domain to the irradiation domain by using the camera response curve to obtain irradiation images; and fusing all the irradiation images in the irradiation domain to obtain a panoramic image. The panoramic image generation method and device improve the efficiency of panoramic image generation.

Description

Panoramic image generation method, panoramic image generation device and computer storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a panoramic image generation method, apparatus, and computer storage medium.
Background
In the prior art, because of hardware limitations, obtaining a panoramic image with high dynamic range requires high-dynamic-range image synthesis and panoramic image synthesis to be completed as two independent steps, and image synthesis generally involves multiple rounds of image registration and multiple rounds of image fusion. However, repeated image registration causes a loss of image precision, so that errors accumulate across the image transformations, and the amount of image information remaining after repeated image fusion is always smaller than the sum of the information of the individual images before fusion, i.e. part of the image information is lost, resulting in a poor-quality generated image.
Disclosure of Invention
The application provides a panoramic image generation method, a panoramic image generation device and a computer storage medium, which mainly solve the technical problem of how to improve the efficiency of panoramic image generation.
In order to solve the technical problems, the application provides a panoramic image generation method, which comprises the following steps:
obtaining an image sequence comprising images of a scene at a plurality of viewpoints;
selecting one scene image from the image sequence as a reference image, and registering other scene images in the image sequence with the reference image;
calibrating a camera response curve based on the registered scene images;
transforming all the scene images after registration from a pixel domain to an irradiation domain by using the camera response curve to obtain an irradiation image;
and fusing all the irradiation images in the irradiation domain to obtain the panoramic image.
According to an embodiment of the present application, the acquiring an image sequence includes:
acquiring scene images based on an exposure level sequence and a viewpoint sequence to obtain scene images at a plurality of viewpoints; the viewpoint sequence comprises a plurality of viewpoints whose shooting angles change in sequence, and the difference in exposure level between every two adjacent viewpoints is smaller than the difference between the maximum exposure level and the minimum exposure level in the exposure level sequence.
According to an embodiment of the present application, the capturing a scene image based on an exposure level sequence and a viewpoint sequence includes:
scene images are acquired sequentially along the viewpoint sequence at a first sequence of exposure levels and a second sequence of exposure levels, wherein the first sequence of exposure levels consists of the odd-indexed exposure levels of the exposure level sequence ordered from smallest to largest, and the second sequence of exposure levels consists of the even-indexed exposure levels ordered from largest to smallest.
According to an embodiment of the present application, the selecting a scene image in the image sequence as a reference image includes:
if the horizontal view angle range covered by the viewpoints of the image sequence reaches 360 degrees, taking the scene image at any viewpoint as the reference image;
and if the horizontal view angle range covered by the viewpoints of the image sequence is less than 360 degrees, taking the scene image at the middle viewpoint as the reference image.
According to an embodiment of the present application, the fusing all the irradiation images in the irradiation domain to obtain the panoramic image includes:
fusing all the irradiation images based on the following formula:

T_i = \exp\left( \frac{\sum_{j=1}^{N} \omega_1(Z_{ij})\, \omega_2(Z_{ij}) \left( g(Z_{ij}) - \ln \Delta t_j \right)}{\sum_{j=1}^{N} \omega_1(Z_{ij})\, \omega_2(Z_{ij})} \right)

where T_i is the irradiance value of the pixel at the i-th position of the panoramic image, Z_ij is the pixel value of the pixel in the j-th irradiation image corresponding to the i-th position in the panoramic image, Δt_j is the exposure time of the j-th irradiation image, g is the pixel value transformation function in the camera response curve, ω_1 is a weight function related to the irradiance value, ω_2 is a weight function related to the coordinate position of the irradiance value, and N is the total number of scene images to be acquired.
According to one embodiment of the present application, the weight function ω_1 related to the irradiance value is calculated using a triangle weighting method, and the weight function ω_2 related to the coordinate position of the irradiance value is calculated using a fade-in fade-out method.
According to an embodiment of the present application, after the step of obtaining an image sequence comprising scene images at a plurality of viewpoints, the method includes:
and projectively transforming the image sequence to a space coordinate system to obtain the image sequence in the space coordinate system.
According to an embodiment of the present application, the calibrating a camera response curve based on the registered scene images includes:
calibrating the camera response curve using the Debevec algorithm.
To solve the above technical problems, the present application provides a panoramic image generation apparatus, the apparatus including a memory and a processor coupled to the memory;
the memory is used for storing program data, and the processor is used for executing the program data to realize the panoramic image generation method.
In order to solve the above technical problem, the present application also provides a computer storage medium for storing program data, which when executed by a processor, is configured to implement the panoramic image generation method as described above.
According to the panoramic image generation method provided by the application, one scene image is selected from the image sequence as a reference image, and the other scene images are registered with the reference image, so that the number of mutual registrations between images in the sequence is reduced, the accuracy of image registration is improved, and the accumulation of registration errors is reduced. The camera response curve is calibrated based on the registered scene images, and all registered scene images are transformed from the pixel domain to the irradiation domain to obtain irradiation images, which are then fused; this reduces the number of fusion operations on the irradiation images and prevents the loss of too much useful information during image fusion, which would cause serious image distortion and a poor-quality panoramic image.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. Wherein:
fig. 1 is a schematic flow chart of an embodiment of a panoramic image generation method provided by the present application;
FIG. 2 is a schematic view of a weight of a fade-in fade-out method in the panoramic image generation method of FIG. 1;
FIG. 3 is a schematic view of scene image acquisition in the panoramic image generation method of FIG. 1;
Fig. 4 is a schematic structural view of an embodiment of a panoramic image generation apparatus provided by the present application;
fig. 5 is a schematic structural diagram of an embodiment of a computer storage medium provided by the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The application provides a panoramic image generation method, and particularly referring to fig. 1, fig. 1 is a schematic flow chart of an embodiment of the panoramic image generation method provided by the application. The panoramic image generation method in this embodiment may be applied to a panoramic image generation apparatus, and may also be applied to a server having data processing capability. The panoramic image generation method of the embodiment specifically includes the following steps:
s101: a sequence of images is obtained.
Because the redundancy of existing multi-viewpoint multi-exposure image acquisition techniques is too high and the workload of acquiring the image sequence is therefore large, in this embodiment a single scene image is acquired at each viewpoint through a single-camera, one-exposure-per-viewpoint multi-exposure acquisition technique, thereby obtaining the scene images at multiple viewpoints.
In this step, the image sequence includes scene images at a plurality of viewpoints. A viewpoint represents a shooting angle of the image capturing device, and the scene images at different viewpoints are images captured by the device at different shooting angles. Specifically, during shooting of the image sequence, the viewpoints may be ordered in a clockwise or counterclockwise sequence, and the image capturing device captures the scene images by rotating through the viewpoint sequence in order.
In practical applications, the image acquisition device may be a single device, such as a common mobile phone, a common camera or a fisheye camera, or may be a plurality of devices, such as a plurality of cameras, for synchronous shooting, but the performance of each camera needs to be similar, so as to facilitate the subsequent acquisition of the irradiation image.
S102: and selecting one scene image from the image sequence as a reference image, and registering other scene images in the image sequence with the reference image.
To avoid losing registration accuracy through repeated image registration, this embodiment selects one scene image from the image sequence as a reference image and registers the other scene images in the image sequence onto the reference image.
For the selection of the reference image, if the horizontal view angle range covered by the viewpoints of the image sequence reaches 360 degrees, the scene image at any viewpoint can be used as the reference image; if the horizontal view angle range covered by the viewpoints of the image sequence is less than 360 degrees, the scene image at the middle viewpoint is used as the reference image.
Specifically, a scene image is selected from each viewpoint as a reference scene image under the viewpoint, and one viewpoint is selected from all viewpoints as a reference viewpoint, wherein the reference scene image in the reference viewpoint is the selected reference image.
In practical applications, to reduce the difficulty of image registration, an image registration algorithm that adaptively partitions images according to feature matching points may be used, such as an adaptive quadtree blocking registration algorithm. Specifically, the maximum depth of the quadtree is set, a matching threshold for each scene image block is estimated from the total number of image matching points, and the scene image is adaptively partitioned by the quadtree based on the maximum depth and the matching threshold to obtain a homography matrix between the reference image and the scene image, so that all scene images are registered onto the reference image. Alternatively, a feature-point-based image registration algorithm may be adopted: feature points are extracted from the reference image and any scene image and matched against each other, several pairs of feature points with a high matching degree are selected from them, and a transformation matrix between the scene image and the reference image is estimated, so that all scene images are registered onto the reference image based on the transformation matrix.
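Purely as an illustration of the feature-point-based variant just described (a sketch, not code from the application), the following Python snippet registers one scene image onto the reference image; the use of OpenCV, the ORB detector, and the RANSAC threshold are assumptions of this sketch rather than choices made in the application.

    import cv2
    import numpy as np

    def register_to_reference(reference, scene, max_matches=500):
        # Detect and describe feature points in both images (ORB assumed).
        orb = cv2.ORB_create(nfeatures=5000)
        kp_ref, des_ref = orb.detectAndCompute(reference, None)
        kp_scn, des_scn = orb.detectAndCompute(scene, None)

        # Match descriptors and keep the pairs with the highest matching degree.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des_scn, des_ref), key=lambda m: m.distance)
        matches = matches[:max_matches]

        src = np.float32([kp_scn[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        # Transformation (homography) between the scene image and the reference image.
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        warped = cv2.warpPerspective(scene, H, (reference.shape[1], reference.shape[0]))
        return H, warped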
S103: calibrating a camera response curve based on the registered scene images.
The camera response curve is a curve specific to each image capturing device (e.g. a camera); the camera response curves of different image capturing devices are generally different. In order to transform the scene images from the pixel domain into the irradiation domain and thereby facilitate fusion of the irradiation images, this embodiment calibrates the camera response curve of each image acquisition device, for example using the Debevec algorithm. The calibration method of the camera response curve is not limited in this embodiment.
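Assuming the Debevec algorithm is realized with OpenCV's built-in implementation, this calibration step could be sketched as follows; the file names and exposure times are hypothetical example values, not data from the application.

    import cv2
    import numpy as np

    # A list of 8-bit images of the same scene taken at different exposures,
    # together with their exposure times (hypothetical values, in seconds).
    images = [cv2.imread(p) for p in ("under.jpg", "normal.jpg", "over.jpg")]
    times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0], dtype=np.float32)

    # Debevec calibration recovers the inverse camera response curve per channel.
    calibrator = cv2.createCalibrateDebevec()
    response = calibrator.process(images, times)  # array of shape (256, 1, 3)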
S104: and transforming all the registered scene images from the pixel domain to the irradiation domain by using a camera response curve to obtain an irradiation image.
Based on the camera response curve calibrated in S103, a functional mapping between pixel values and irradiance is obtained, so that all the scene images registered in S102 are transformed from the pixel domain to the irradiation domain, yielding the irradiation images in the irradiation domain.
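A minimal sketch of this pixel-to-irradiance transform is shown below; it assumes that g is the log-domain response function of Debevec's formulation, stored as a 256-entry lookup table for a single channel, which is an assumption about the curve's parameterization rather than something specified by the application.

    import numpy as np

    def to_irradiance(image, log_g, exposure_time):
        # Pixel domain -> irradiation domain for one 8-bit channel:
        # ln E = g(Z) - ln(dt), hence E = exp(g(Z)) / dt.
        lookup = np.asarray(log_g).ravel()            # g(Z) indexed by pixel value 0..255
        log_e = lookup[image] - np.log(exposure_time)
        return np.exp(log_e)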
S105: and fusing all the irradiation images of the irradiation field to obtain a panoramic image.
In order to avoid excessive errors in the irradiance values of overexposed and underexposed areas, the irradiation images can be fused by a weighted fusion method; in other embodiments, the irradiation images can also be fused by a fade-in fade-out method or a pyramid fusion method.
For example, when the irradiation images are fused by the weighted fusion method, the fusion of all irradiation images satisfies the following formula:

T_i = \exp\left( \frac{\sum_{j=1}^{N} \omega_1(Z_{ij})\, \omega_2(Z_{ij}) \left( g(Z_{ij}) - \ln \Delta t_j \right)}{\sum_{j=1}^{N} \omega_1(Z_{ij})\, \omega_2(Z_{ij})} \right)

where T_i is the irradiance value of the pixel at the i-th position of the panoramic image, Z_ij is the pixel value of the pixel in the j-th irradiation image corresponding to the i-th position in the panoramic image, Δt_j is the exposure time of the j-th irradiation image, g is the pixel value transformation function in the camera response curve, ω_1 is a weight function related to the irradiance value, ω_2 is a weight function related to the coordinate position of the irradiance value, and N is the total number of scene images to be acquired.
Further, the weight function ω_1 related to the irradiance value can be calculated using the triangle weighting method, and the weight function ω_2 related to the coordinate position of the irradiance value can be calculated using the fade-in fade-out method.
The triangle weighting method is formulated as follows:

\omega_1(z) = \begin{cases} z - Z_{\min}, & z \le (Z_{\min} + Z_{\max})/2 \\ Z_{\max} - z, & z > (Z_{\min} + Z_{\max})/2 \end{cases}

where Z_min and Z_max are the minimum and maximum pixel values.
referring to fig. 2, fig. 2 is a schematic view of a weight of a fade-in fade-out method in the panoramic image generation method of fig. 1. As can be seen from FIG. 2, the left border of the overlap region has an abscissa x 1 The abscissa of the right boundary of the overlapping region is x r Omega is then 1 and ω2 The calculation formula of (2) is as follows:
in this embodiment, an image sequence is obtained, the image sequence including scene images at a plurality of viewpoints; selecting a scene image from the image sequence as a reference image, and registering other scene images in the image sequence with the reference image; calibrating a camera response curve based on the registered scene images; transforming all the registered scene images from a pixel domain to an irradiation domain by using a camera response curve to obtain an irradiation image; and fusing all the irradiation images of the irradiation field to obtain a panoramic image. By selecting one scene image from the image sequences as a reference image, registering other image sequences with the reference image, the number of times of mutual registration between the image sequences is reduced, the accuracy of image registration is improved, and the error accumulation of image registration is reduced; and based on the registered scene image calibration response curve, all registered scene images are transformed from a pixel domain to an irradiation domain to obtain irradiation images, so that the irradiation images are fused, the frequency of fusion of the irradiation images is reduced, the efficiency of panoramic image generation is improved, and serious image distortion caused by excessive useful information loss in the image fusion process is avoided.
Further, regarding the image sequence obtained in step S101: existing image acquisition techniques capture several scene images with different exposure degrees at each viewpoint, i.e. one viewpoint yields multiple scene images at different exposure levels. As a result, the scene images of two adjacent viewpoints share a large overlapping portion, the view angle overlap between the two viewpoints is large, and the total number of scene images to be acquired is:
where N represents the total number of images to be acquired, n represents the number of exposure levels, ω represents the shooting angle (field of view) of the camera, and d represents the view angle overlap degree between two adjacent viewpoints.
In order to reduce the number of scene images collected, reduce the workload of the subsequent scene image fusion, and avoid collecting a large amount of redundant information, this embodiment acquires the scene images based on an exposure level sequence and a viewpoint sequence to obtain scene images at a plurality of viewpoints, i.e. a single scene image is captured at each viewpoint while the exposure degree varies from viewpoint to viewpoint, so that the view angle overlap between adjacent viewpoints is reduced. The viewpoint sequence includes a plurality of viewpoints whose shooting angles change in sequence.
Referring to fig. 3, fig. 3 is a schematic view of scene image acquisition in the panoramic image generation method of fig. 1. As shown in fig. 3, assume there are 3 exposure levels and 9 viewpoints in total. The dotted arrows indicate the shooting angle of each viewpoint, the numbers on the dotted arrows indicate the viewpoint positions, and the arcs pointed to by the dotted arrows represent the scene image captured each time: a dotted arc denotes an underexposed image, a thin solid arc a normally exposed image, and a thick solid arc an overexposed image. During acquisition, the camera is placed at viewpoint 1 and an underexposed scene image is captured there; the camera is rotated clockwise around the node by 40° to viewpoint 2, where a normally exposed scene image is captured; it is rotated clockwise by another 40° to viewpoint 3, where an overexposed scene image is captured; it is rotated clockwise by a further 40° to viewpoint 4, where an underexposed scene image is captured; and so on, until the scene images of all 9 viewpoints have been acquired.
As can be seen from the above, in this embodiment a single scene image is captured at each viewpoint with the exposure degree varying between viewpoints, so the view angle overlap between adjacent viewpoints is reduced, and the total number of images required to be acquired in this embodiment is:

N = \frac{360^{\circ}}{\alpha}
the shooting angle satisfies the following formula:
where α represents the angle of each rotation of the camera during shooting.
Further, the captured image sequence in the prior art can be expressed as:
I_{11}; I_{22}; ...; I_{MM}; I_{(M+1)1}; ...; I_{ij}; ...
where i is the viewpoint index (the i-th viewpoint), j is the exposure level index (a larger j means a larger exposure value), M is the total number of exposure levels, and I_{ij} denotes the scene image captured at viewpoint i with exposure level j. After the scene image I_{MM} at the M-th viewpoint is acquired, the exposure degree drops back to the minimum for the scene image at the (M+1)-th viewpoint, so the difference in exposure degree between these two adjacent viewpoints is large, which increases the difficulty of registering the scene images between the two adjacent viewpoints.
In order to reduce the difficulty of registering scene images between adjacent viewpoints, the difference in exposure level between every two adjacent viewpoints is made smaller than the difference between the maximum and minimum exposure levels in the exposure level sequence when the image sequence is acquired.
In a specific embodiment, the scene images may be acquired sequentially along the viewpoint sequence at a first sequence of exposure levels followed by a second sequence of exposure levels, where the first sequence consists of the odd-indexed exposure levels of the exposure level sequence ordered from smallest to largest, and the second sequence consists of the even-indexed exposure levels ordered from largest to smallest. In other embodiments, the first sequence may consist of the even-indexed exposure levels ordered from smallest to largest and the second sequence of the odd-indexed exposure levels ordered from largest to smallest; the specific ordering of the first and second sequences is not limited in this embodiment.
For example, when the first sequence consists of the odd-indexed exposure levels ordered from smallest to largest and the second sequence consists of the even-indexed exposure levels ordered from largest to smallest, the image sequence may be expressed as:
I_{11}; I_{23}; ...; I_{(M/2)(M-1)}; I_{(M/2+1)M}; I_{(M/2+2)(M-2)}; ...; I_{M2}; I_{(M+1)1}; ...; I_{ij}; ...
where the total number of exposure levels M is an even number.
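As an illustration of this interleaved ordering (a sketch only; the function and variable names are assumptions of this example, not terminology from the application), the following snippet builds the per-viewpoint exposure schedule, taking the odd-indexed levels in ascending order followed by the even-indexed levels in descending order, and assigns it cyclically to the viewpoints:

    def exposure_schedule(levels, num_viewpoints):
        # `levels` is the exposure level sequence ordered from smallest to largest;
        # the total number of levels M is assumed to be even, as in the text.
        ascending_odds = levels[0::2]          # 1st, 3rd, 5th, ... exposure levels
        descending_evens = levels[1::2][::-1]  # M-th, (M-2)-th, ..., 2nd exposure levels
        cycle = ascending_odds + descending_evens
        return [cycle[v % len(cycle)] for v in range(num_viewpoints)]

    # Example with M = 4 exposure levels and 9 viewpoints (illustrative values):
    # the cycle is [EV1, EV3, EV4, EV2], so adjacent viewpoints never jump from the
    # maximum exposure level straight back to the minimum one.
    print(exposure_schedule(["EV1", "EV2", "EV3", "EV4"], 9))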
Compared with the conventional image acquisition method, this image sequence acquisition method reduces the redundant information collected by capturing the scene image only once at each viewpoint, increases the efficiency of image acquisition, facilitates the subsequent registration and fusion of scene images, and improves the efficiency of panoramic image generation.
In order to avoid unnatural deformation of the registered scene images, after the image sequence is acquired in step S101, the image sequence needs to be projectively transformed into a spatial coordinate system to obtain the image sequence in that coordinate system. In a specific embodiment, spherical projection transformation or cylindrical projection transformation may be used to project the scene images on the two-dimensional plane into the spatial coordinate system, facilitating registration of the scene images.
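A minimal sketch of a cylindrical projection for a single image is given below; it assumes a pinhole camera with its principal point at the image centre and a known focal length in pixels, and the OpenCV-based implementation is only one possible way to realize the projection transformation mentioned above.

    import cv2
    import numpy as np

    def cylindrical_project(image, focal_px):
        # Project a planar image onto a cylinder of radius `focal_px` (in pixels).
        h, w = image.shape[:2]
        cx, cy = w / 2.0, h / 2.0

        # Cylindrical coordinates (theta, height) for every output pixel.
        xs, ys = np.meshgrid(np.arange(w), np.arange(h))
        theta = (xs - cx) / focal_px
        height = (ys - cy) / focal_px

        # Back-project each cylinder point onto the original image plane.
        map_x = (focal_px * np.tan(theta) + cx).astype(np.float32)
        map_y = (focal_px * height / np.cos(theta) + cy).astype(np.float32)
        return cv2.remap(image, map_x, map_y, cv2.INTER_LINEAR,
                         borderMode=cv2.BORDER_CONSTANT)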
In this embodiment, the scene images are acquired based on an exposure level sequence and a viewpoint sequence to obtain scene images at multiple viewpoints, which reduces the view angle overlap of the scene images between adjacent viewpoints, greatly reduces the number of scene images to be collected, avoids acquiring excessive redundant image information, reduces the workload of the subsequent fusion of the irradiation images, and improves the efficiency of panoramic image generation. By selecting one scene image from the image sequence as a reference image and registering the other images with it in a single registration pass, the number of mutual registrations between images is reduced, the accuracy of image registration is improved, and the accumulation of registration errors is reduced. By calibrating the response curve on the registered scene images and transforming all registered scene images from the pixel domain to the irradiation domain before fusing the irradiation images, the number of fusion operations is reduced, the loss of too much useful information during image fusion and the serious image distortion it would cause are avoided, and the efficiency of panoramic image generation is improved.
In order to implement the panoramic image generation method of the above embodiment, another device is proposed in the present application, and referring specifically to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of the panoramic image generation device provided in the present application.
The device 400 comprises a memory 41 and a processor 42, wherein the memory 41 and the processor 42 are coupled.
The memory 41 is used for storing program data, and the processor 42 is used for executing the program data to implement the panoramic image generation method of the above-described embodiment.
In the present embodiment, the processor 42 may also be referred to as a CPU (Central Processing Unit). The processor 42 may be an integrated circuit chip having signal processing capabilities. The processor 42 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general purpose processor may be a microprocessor, or the processor 42 may be any conventional processor or the like.
The present application also provides a computer storage medium 500, as shown in fig. 5, where the computer storage medium 500 is configured to store program data 51, and the program data 51, when executed by a processor, is configured to implement a panoramic image generation method according to an embodiment of the method of the present application.
The method described in the embodiments of the panoramic image generation method of the present application may be stored in a device, such as a computer readable storage medium, when it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the present application.

Claims (9)

1. A panoramic image generation method, the method comprising:
acquiring an image sequence, wherein the image sequence comprises scene images under a plurality of viewpoints;
selecting one scene image from the image sequence as a reference image, and registering other scene images in the image sequence with the reference image;
calibrating a camera response curve based on the registered scene images;
transforming all the scene images after registration from a pixel domain to an irradiation domain by using the camera response curve to obtain an irradiation image;
fusing all the irradiation images in the irradiation domain to obtain the panoramic image;
wherein the acquiring an image sequence comprises:
acquiring scene images based on an exposure level sequence and a viewpoint sequence, acquiring the scene image once at each viewpoint to obtain scene images at a plurality of viewpoints; the viewpoint sequence comprises a plurality of viewpoints whose shooting angles change in sequence, and the difference in exposure level between every two adjacent viewpoints is smaller than the difference between the maximum exposure level and the minimum exposure level in the exposure level sequence.
2. The method of claim 1, wherein the acquiring a scene image based on the sequence of exposure levels and the sequence of viewpoints comprises:
scene images are acquired sequentially along the viewpoint sequence at a first sequence of exposure levels and a second sequence of exposure levels, wherein the first sequence of exposure levels consists of the odd-indexed exposure levels of the exposure level sequence ordered from smallest to largest, and the second sequence of exposure levels consists of the even-indexed exposure levels ordered from largest to smallest.
3. The method of claim 1, wherein selecting a scene image in the sequence of images as a reference image comprises:
if the horizontal view angle range covered by the viewpoints of the image sequence reaches 360 degrees, taking the scene image at any viewpoint as the reference image;
and if the horizontal view angle range covered by the viewpoints of the image sequence is less than 360 degrees, taking the scene image at the middle viewpoint as the reference image.
4. The method of claim 1, wherein the fusing all the irradiation images in the irradiation domain to obtain the panoramic image comprises:
fusing all the irradiation images based on the following formula:

T_i = \exp\left( \frac{\sum_{j=1}^{N} \omega_1(Z_{ij})\, \omega_2(Z_{ij}) \left( g(Z_{ij}) - \ln \Delta t_j \right)}{\sum_{j=1}^{N} \omega_1(Z_{ij})\, \omega_2(Z_{ij})} \right)

where T_i is the irradiance value of the pixel at the i-th position of the panoramic image, Z_ij is the pixel value of the pixel in the j-th irradiation image corresponding to the i-th position in the panoramic image, Δt_j is the exposure time of the j-th irradiation image, g is the pixel value transformation function in the camera response curve, ω_1 is a weight function related to the irradiance value, ω_2 is a weight function related to the coordinate position of the irradiance value, and N is the total number of scene images that need to be acquired.
5. The method of claim 4, wherein the weight function ω_1 related to the irradiance value is calculated using a triangle weighting method, and the weight function ω_2 related to the coordinate position of the irradiance value is calculated using a fade-in fade-out method.
6. The method according to claim 1, wherein after the step of acquiring an image sequence comprising scene images at a plurality of viewpoints, the method comprises:
and projectively transforming the image sequence to a space coordinate system to obtain the image sequence in the space coordinate system.
7. The method of claim 1, wherein the calibrating a camera response curve based on the registered scene images comprises:
calibrating the camera response curve using the Debevec algorithm.
8. A panoramic image generation device, the device comprising a memory and a processor coupled to the memory;
the memory is configured to store program data, and the processor is configured to execute the program data to implement the panoramic image generation method according to any one of claims 1 to 7.
9. A computer storage medium for storing program data which, when executed by a processor, is adapted to carry out the panoramic image generation method of any one of claims 1 to 7.
CN202010394240.XA 2020-05-11 2020-05-11 Panoramic image generation method, panoramic image generation device and computer storage medium Active CN111652916B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010394240.XA CN111652916B (en) 2020-05-11 2020-05-11 Panoramic image generation method, panoramic image generation device and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010394240.XA CN111652916B (en) 2020-05-11 2020-05-11 Panoramic image generation method, panoramic image generation device and computer storage medium

Publications (2)

Publication Number Publication Date
CN111652916A CN111652916A (en) 2020-09-11
CN111652916B true CN111652916B (en) 2023-09-29

Family

ID=72346637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010394240.XA Active CN111652916B (en) 2020-05-11 2020-05-11 Panoramic image generation method, panoramic image generation device and computer storage medium

Country Status (1)

Country Link
CN (1) CN111652916B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853524A (en) * 2010-05-13 2010-10-06 北京农业信息技术研究中心 Method for generating corn ear panoramic image by using image sequence
CN101963751A (en) * 2010-08-19 2011-02-02 西北工业大学 Device and method for acquiring high-resolution full-scene image in high dynamic range in real time
CN105122302A (en) * 2013-04-15 2015-12-02 高通股份有限公司 Generation of ghost-free high dynamic range images
JP2016119693A (en) * 2016-02-02 2016-06-30 株式会社ソニー・インタラクティブエンタテインメント Imaging apparatus and imaging method
CN105933617A (en) * 2016-05-19 2016-09-07 中国人民解放军装备学院 High dynamic range image fusion method used for overcoming influence of dynamic problem
CN106097244A (en) * 2016-06-03 2016-11-09 上海小蚁科技有限公司 Method and apparatus for stitching image and the method for combination image
US10009551B1 (en) * 2017-03-29 2018-06-26 Amazon Technologies, Inc. Image processing for merging images of a scene captured with differing camera parameters
CN109506591A (en) * 2018-09-14 2019-03-22 天津大学 A kind of adaptive illumination optimization method being adapted to complex illumination scene

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003083773A2 (en) * 2002-03-27 2003-10-09 The Trustees Of Columbia University In The City Of New York Imaging method and system
WO2011093994A1 (en) * 2010-01-27 2011-08-04 Thomson Licensing High dynamic range (hdr) image synthesis with user input
US10721448B2 (en) * 2013-03-15 2020-07-21 Edge 3 Technologies, Inc. Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
US9424632B2 (en) * 2013-04-05 2016-08-23 Ittiam Systems (P) Ltd. System and method for generating high dynamic range images
US10764496B2 (en) * 2018-03-16 2020-09-01 Arcsoft Corporation Limited Fast scan-type panoramic image synthesis method and device
US20190335077A1 (en) * 2018-04-25 2019-10-31 Ocusell, LLC Systems and methods for image capture and processing

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853524A (en) * 2010-05-13 2010-10-06 北京农业信息技术研究中心 Method for generating corn ear panoramic image by using image sequence
CN101963751A (en) * 2010-08-19 2011-02-02 西北工业大学 Device and method for acquiring high-resolution full-scene image in high dynamic range in real time
CN105122302A (en) * 2013-04-15 2015-12-02 高通股份有限公司 Generation of ghost-free high dynamic range images
JP2016119693A (en) * 2016-02-02 2016-06-30 株式会社ソニー・インタラクティブエンタテインメント Imaging apparatus and imaging method
CN105933617A (en) * 2016-05-19 2016-09-07 中国人民解放军装备学院 High dynamic range image fusion method used for overcoming influence of dynamic problem
CN106097244A (en) * 2016-06-03 2016-11-09 上海小蚁科技有限公司 Method and apparatus for stitching image and the method for combination image
US10009551B1 (en) * 2017-03-29 2018-06-26 Amazon Technologies, Inc. Image processing for merging images of a scene captured with differing camera parameters
CN109506591A (en) * 2018-09-14 2019-03-22 天津大学 A kind of adaptive illumination optimization method being adapted to complex illumination scene

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Direct fusion method for images with different exposure values; Zhang Jun; Dai Xia; Sun Dequan; Wang Bangping; Journal of Software (04); full text *
Hua Shungang; Wang Lidan; Ou Zongying. Registration of differently exposed images of the same scene and HDR image synthesis. Journal of Computer-Aided Design & Computer Graphics. 2007, (04), full text. *
Zhou Jiquan; Wang Qing. High dynamic range image synthesis method based on a camera array. Application Research of Computers. 2013, (09), full text. *
Multi-exposure-based high dynamic range panoramic image synthesis; Quan Wei et al.; Journal of System Simulation; Vol. 27, No. 10; full text *
Xu Yali; Yu Mei; Shao Hua; Xie Dengmei. A de-ghosting high dynamic range image fusion algorithm. Laser Journal. 2018, (03), full text. *

Also Published As

Publication number Publication date
CN111652916A (en) 2020-09-11

Similar Documents

Publication Publication Date Title
CN111457886B (en) Distance determination method, device and system
US8447140B1 (en) Method and apparatus for estimating rotation, focal lengths and radial distortion in panoramic image stitching
JP4566591B2 (en) Image deformation estimation method and image deformation estimation apparatus
EP2494524A2 (en) Algorithms for estimating precise and relative object distances in a scene
JP6308748B2 (en) Image processing apparatus, imaging apparatus, and image processing method
CN111583119B (en) Orthoimage splicing method and equipment and computer readable medium
CN111935398B (en) Image processing method and device, electronic equipment and computer readable medium
CN110611767A (en) Image processing method and device and electronic equipment
CN112017215B (en) Image processing method, device, computer readable storage medium and computer equipment
TWI626500B (en) Image capturing apparatus, lens unit, and signal processing apparatus
CN109598763A (en) Camera calibration method, device, electronic equipment and computer readable storage medium
CN103262523A (en) Imaging device, imaging system, imaging method, and image-processing method
JP4354096B2 (en) Imaging device
CN113286084A (en) Terminal image acquisition method and device, storage medium and terminal
KR101938067B1 (en) Method and Apparatus for Stereo Matching of Wide-Angle Images using SIFT Flow
CN111652916B (en) Panoramic image generation method, panoramic image generation device and computer storage medium
CN111292380B (en) Image processing method and device
CN111028296A (en) Method, device, equipment and storage device for estimating focal length value of dome camera
CN115984348A (en) Panoramic image processing method and device, electronic equipment and storage medium
CN111885297B (en) Image definition determining method, image focusing method and device
CN110519486B (en) Distortion compensation method and device based on wide-angle lens and related equipment
US11893704B2 (en) Image processing method and device therefor
CN114862934B (en) Scene depth estimation method and device for billion pixel imaging
CN109600552B (en) Image refocusing control method and system
CN117422650B (en) Panoramic image distortion correction method and device, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant