CN113962867B - Image processing method, image processing device, computer equipment and storage medium - Google Patents


Info

Publication number
CN113962867B
CN113962867B (Application CN202111576663.4A)
Authority
CN
China
Prior art keywords
image
gradient
optimized
displacement
downsampled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111576663.4A
Other languages
Chinese (zh)
Other versions
CN113962867A
Inventor
吴海浪
蒋念娟
沈小勇
吕江波
Current Assignee
Beijing Simou Intelligent Technology Co ltd
Shenzhen Smartmore Technology Co Ltd
Original Assignee
Beijing Simou Intelligent Technology Co ltd
Shenzhen Smartmore Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Simou Intelligent Technology Co Ltd and Shenzhen Smartmore Technology Co Ltd
Priority to CN202111576663.4A
Publication of CN113962867A
Application granted
Publication of CN113962867B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general involving image mosaicing
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Abstract

The application relates to an image processing method, an apparatus, a computer device, a storage medium and a computer program product. The method comprises the following steps: acquiring a downsampled image of an initial panoramic image; optimizing the downsampled image to obtain an optimized downsampled image, in which each optimized sub-image has the same image style and the edge features among the optimized sub-images are matched; determining a first displacement image according to a difference between the downsampled image and the optimized downsampled image; performing guided filtering processing on an up-sampled image of the first displacement image to obtain a second displacement image, the second displacement image having boundary information correlated with the initial panoramic image and a resolution equal to that of the initial panoramic image; and superimposing the second displacement image and the initial panoramic image to obtain a target fusion image. By adopting the method, the effect of sub-image fusion in the panoramic image can be improved.

Description

Image processing method, image processing device, computer equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a computer device, a storage medium, and a computer program product.
Background
Images, as the visual basis of how humans perceive the world, are an important means for humans to acquire, express and transmit information. Image processing refers to techniques by which a computer processes an image to achieve a desired result.
In the field of image processing, image stitching is frequently used, that is, a plurality of images with overlapping parts are stitched into a seamless panoramic image or a high-resolution image. In the conventional technology, problems such as inconsistent brightness and color and misalignment of the overlapping areas often occur between adjacent images during stitching, so that the image fusion effect is not ideal.
Therefore, the conventional technique has a problem that the image fusion effect is not ideal.
Disclosure of Invention
In view of the above technical problems, it is necessary to provide an image processing method, an apparatus, a computer device, a computer-readable storage medium, and a computer program product capable of improving image fusion quality in a panoramic image.
In a first aspect, the present application provides an image processing method. The method comprises the following steps:
acquiring a downsampled image of an initial panoramic image; the initial panoramic image is an image obtained by stitching a plurality of sub-images;
optimizing the downsampled image to obtain an optimized downsampled image; each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched;
determining a first displacement image according to a difference between the downsampled image and the optimized downsampled image;
performing guided filtering processing on an up-sampled image of the first displacement image to obtain a second displacement image; the second displacement image has boundary information correlated with the initial panoramic image; a resolution of the second displacement image is equal to a resolution of the initial panoramic image;
and superimposing the second displacement image and the initial panoramic image to obtain a target fusion image.
In one embodiment, the performing guided filtering processing on the up-sampled image of the first displacement image to obtain a second displacement image includes:
taking the initial panoramic image as a guide map;
and performing edge-preserving filtering processing on the up-sampling image of the first displacement image by adopting the guide image to obtain the second displacement image.
In one embodiment, the optimizing the downsampled image to obtain an optimized downsampled image includes:
performing a gradient operation on the downsampled image, and generating a processed gradient image of the downsampled image according to the gradient operation result;
performing the Laplacian operation on the processed gradient image to obtain a Laplacian gradient image of the downsampled image;
and determining a Poisson fusion result of the Laplacian gradient image to obtain the optimized downsampled image.
In one embodiment, the performing a gradient operation on the downsampled image and generating a processed gradient image of the downsampled image according to the gradient operation result includes:
performing a gradient operation on the downsampled image, and determining the gradient images of the downsampled image in the horizontal direction and the vertical direction respectively, to obtain a horizontal gradient image and a vertical gradient image of the downsampled image;
determining the stitching boundaries of the horizontal gradient image and the vertical gradient image respectively, according to the positions of the seam lines in the horizontal gradient image and the vertical gradient image;
in the horizontal gradient image, setting corresponding smoothing coefficients for the adjacent images at the stitching boundary respectively; the sum of the smoothing coefficients of the adjacent images is equal to one;
performing a weighted summation of the adjacent images according to their corresponding smoothing coefficients to obtain a processed horizontal gradient image;
and,
in the vertical gradient image, setting corresponding smoothing coefficients for the adjacent images at the stitching boundary respectively; the sum of the smoothing coefficients of the adjacent images is equal to one;
and performing a weighted summation of the adjacent images according to their corresponding smoothing coefficients to obtain a processed vertical gradient image.
In one embodiment, the processed gradient image comprises a processed horizontal gradient image and a processed vertical gradient image; the performing the Laplacian operation on the processed gradient image to obtain a Laplacian gradient image of the downsampled image includes:
performing the Laplacian operation on the processed horizontal gradient image and the processed vertical gradient image respectively, to obtain a Laplacian horizontal gradient image and a Laplacian vertical gradient image;
and superimposing the Laplacian horizontal gradient image and the Laplacian vertical gradient image to obtain the Laplacian gradient image of the downsampled image.
In one embodiment, the determining a Poisson fusion result of the Laplacian gradient image of the downsampled image to obtain the optimized downsampled image includes:
acquiring a Fourier transform result of a Green's function;
determining the product of the Fourier transform result of the Laplacian gradient image and the Fourier transform result of the Green's function;
determining the inverse Fourier transform result of the product;
and obtaining the optimized downsampled image according to a constant and the real part of the inverse Fourier transform result of the product.
In a second aspect, the present application further provides an image processing apparatus. The apparatus comprises:
an acquisition module, configured to acquire a downsampled image of an initial panoramic image; the initial panoramic image is an image obtained by stitching a plurality of sub-images;
an optimization module, configured to optimize the downsampled image to obtain an optimized downsampled image; each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched;
a determination module, configured to determine a first displacement image according to a difference between the downsampled image and the optimized downsampled image;
a filtering module, configured to perform guided filtering processing on an up-sampled image of the first displacement image to obtain a second displacement image; the second displacement image has boundary information correlated with the initial panoramic image; a resolution of the second displacement image is equal to a resolution of the initial panoramic image;
and a superimposition module, configured to superimpose the second displacement image and the initial panoramic image to obtain a target fusion image.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor that, when executing the computer program, implements the following steps:
acquiring a downsampled image of an initial panoramic image; the initial panoramic image is an image obtained by stitching a plurality of sub-images;
optimizing the downsampled image to obtain an optimized downsampled image; each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched;
determining a first displacement image according to a difference between the downsampled image and the optimized downsampled image;
performing guided filtering processing on an up-sampled image of the first displacement image to obtain a second displacement image; the second displacement image has boundary information correlated with the initial panoramic image; a resolution of the second displacement image is equal to a resolution of the initial panoramic image;
and superimposing the second displacement image and the initial panoramic image to obtain a target fusion image.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium stores a computer program which, when executed by a processor, performs the following steps:
acquiring a downsampled image of an initial panoramic image; the initial panoramic image is an image obtained by stitching a plurality of sub-images;
optimizing the downsampled image to obtain an optimized downsampled image; each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched;
determining a first displacement image according to a difference between the downsampled image and the optimized downsampled image;
performing guided filtering processing on an up-sampled image of the first displacement image to obtain a second displacement image; the second displacement image has boundary information correlated with the initial panoramic image; a resolution of the second displacement image is equal to a resolution of the initial panoramic image;
and superimposing the second displacement image and the initial panoramic image to obtain a target fusion image.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, performs the following steps:
acquiring a downsampled image of an initial panoramic image; the initial panoramic image is an image obtained by stitching a plurality of sub-images;
optimizing the downsampled image to obtain an optimized downsampled image; each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched;
determining a first displacement image according to a difference between the downsampled image and the optimized downsampled image;
performing guided filtering processing on an up-sampled image of the first displacement image to obtain a second displacement image; the second displacement image has boundary information correlated with the initial panoramic image; a resolution of the second displacement image is equal to a resolution of the initial panoramic image;
and superimposing the second displacement image and the initial panoramic image to obtain a target fusion image.
The image processing method, apparatus, computer device, storage medium and computer program product acquire a downsampled image of an initial panoramic image, the initial panoramic image being obtained by stitching a plurality of sub-images; then optimize the downsampled image to obtain an optimized downsampled image, in which each optimized sub-image has the same image style and the edge features among the optimized sub-images are matched, so that inconsistent color and brightness and misaligned overlapping areas are eliminated in the optimized downsampled image, improving the image fusion effect of the initial panoramic image at low resolution; then determine a first displacement image according to the difference between the downsampled image and the optimized downsampled image; then perform guided filtering processing on the up-sampled image of the first displacement image to obtain a second displacement image, where the second displacement image has boundary information correlated with the initial panoramic image and a resolution equal to that of the initial panoramic image; and finally superimpose the second displacement image on the initial panoramic image to obtain a target fusion image. In this way, the guided filtering of the up-sampled image of the low-resolution first displacement image does not change its color style or brightness style, so the resulting second displacement image is consistent with the first displacement image in overall brightness and color; and since the first displacement image is determined from the difference between the downsampled image of the initial panoramic image at low resolution and the optimized downsampled image, in which the color and brightness of each optimized sub-image are consistent, directly superimposing the second displacement image on the initial panoramic image does not produce inconsistent color or brightness in the target fusion image. Meanwhile, because the second displacement image has boundary information correlated with the initial panoramic image, the target fusion image obtained by superimposing them is free of misalignment in the overlapping areas; and because the resolution of the second displacement image is equal to that of the initial panoramic image, the resolution of the target fusion image is also equal to that of the initial panoramic image, ensuring the image quality of the target fusion image and further improving the effect of sub-image fusion in the panoramic image.
Drawings
FIG. 1 is a flow diagram illustrating an exemplary method for image processing;
FIG. 2 is a schematic diagram of an initial panoramic image in one embodiment;
FIG. 3 is a flow chart illustrating an image processing method according to another embodiment;
FIG. 4 is a flow diagram illustrating another exemplary image processing method in one embodiment;
FIG. 5 is a block diagram showing an example of an image processing apparatus;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. Furthermore, the terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
In one embodiment, as shown in fig. 1, an image processing method is provided, and this embodiment is illustrated by applying the method to a terminal, and it is to be understood that the method may also be applied to a server, and may also be applied to a system including a terminal and a server, and is implemented by interaction between the terminal and the server. In this embodiment, the method includes the steps of:
step S110, a downsampled image of the initial panoramic image is obtained.
The initial panoramic image is obtained by splicing a plurality of sub-images.
Wherein the initial panoramic image may be a color panoramic image.
In specific implementation, a terminal can first obtain an initial panoramic image with high resolution; the initial panoramic image can be obtained by directly stitching a plurality of high-resolution color sub-images, and it exhibits inconsistent color and brightness between the sub-images and misalignment of the overlapping areas, as shown in fig. 2. The terminal then performs downsampling processing on the initial panoramic image to obtain a downsampled image of the initial panoramic image; downsampling reduces the image resolution of the initial panoramic image. For example, for an initial panoramic image with an image size of N × M and a downsampling coefficient of k, every k-th pixel in each row and column of the initial panoramic image may be taken as a pixel of the new image, and the new image is the downsampled image of the initial panoramic image.
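The sampling scheme described above can be sketched in a few lines (a minimal illustration assuming a NumPy array image; the function name is illustrative, not the patent's):

```python
import numpy as np

def downsample(image: np.ndarray, k: int) -> np.ndarray:
    """Keep every k-th pixel of each row and column, as described above."""
    return image[::k, ::k]

# A 4x6 "panorama" downsampled with coefficient k = 2 becomes 2x3.
panorama = np.arange(24).reshape(4, 6)
small = downsample(panorama, 2)
print(small.shape)  # (2, 3)
```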
And step S120, optimizing the downsampled image to obtain an optimized downsampled image.
And the optimized sub-images in the optimized downsampled image have the same image style, and the edge characteristics of the optimized sub-images are matched.
The image style at least comprises a color style and a brightness style.
Wherein, the optimization processing comprises gradient editing processing and Poisson fusion processing.
The gradient editing processing comprises gradient operation, smoothing processing and Laplace operation on the down-sampled image.
In specific implementation, the terminal performs optimization processing on the downsampled image of the initial panoramic image. Specifically, the terminal determines a horizontal gradient image and a vertical gradient image of the downsampled image; smooths the stitching boundary of the horizontal gradient image and of the vertical gradient image respectively, obtaining a smoothed horizontal gradient image and a smoothed vertical gradient image; then generates a Laplacian gradient image of the downsampled image from the smoothed horizontal gradient image and the smoothed vertical gradient image; and finally calculates a Poisson fusion result image of the initial panoramic image at low resolution from the Laplacian gradient image, taking this Poisson fusion result image as the optimized downsampled image, i.e. the sub-image fusion result of the initial panoramic image at low resolution. Each optimized sub-image in the optimized downsampled image thereby has the same image style; since the image style comprises a color style and a brightness style, the color and brightness of each optimized sub-image are consistent; and the edge features of the optimized sub-images are matched, which resolves problems such as misalignment of the overlapping areas and ghosting in the optimized downsampled image.
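The gradient-editing portion of this step can be illustrated with a small sketch. Assumptions not fixed by the patent: single-channel float images, forward differences, a vertical seam, and equal smoothing coefficients w = 1 − w = 0.5 (the coefficients need only sum to one):

```python
import numpy as np

def gradients(img):
    """Forward-difference gradients of a single-channel image in the
    horizontal (x) and vertical (y) directions."""
    gx = np.zeros(img.shape, dtype=float)
    gy = np.zeros(img.shape, dtype=float)
    gx[:, :-1] = img[:, 1:].astype(float) - img[:, :-1].astype(float)
    gy[:-1, :] = img[1:, :].astype(float) - img[:-1, :].astype(float)
    return gx, gy

def smooth_seam(grad, seam_col, w=0.5):
    """Replace the two gradient columns adjacent to a vertical seam by a
    weighted sum of both sides; the weights w and (1 - w) sum to one."""
    out = grad.copy()
    blended = w * grad[:, seam_col - 1] + (1.0 - w) * grad[:, seam_col]
    out[:, seam_col - 1] = blended
    out[:, seam_col] = blended
    return out
```

A horizontal seam is handled the same way on the rows of the vertical gradient image.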
Step S130, determining a first displacement image according to a difference between the down-sampled image and the optimized down-sampled image.
In specific implementation, the terminal can compare the downsampled image with the optimized downsampled image and determine the difference between the downsampled image and the optimized downsampled image; specifically, the terminal may determine corresponding pixels in the downsampled image and the optimized downsampled image, so that a displacement of each corresponding pixel in the downsampled image and the optimized downsampled image may be determined, and then the first displacement image may be generated.
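The patent does not spell out how the per-pixel "displacement" is represented; a common choice, assumed here, is the signed difference between the optimized and the original downsampled image, which by construction turns the original image plus the displacement into the optimized result:

```python
import numpy as np

def displacement_image(downsampled: np.ndarray, optimized: np.ndarray) -> np.ndarray:
    """Signed per-pixel difference; adding it back to the downsampled
    image reproduces the optimized downsampled image exactly."""
    return optimized.astype(float) - downsampled.astype(float)

down = np.array([[10., 20.], [30., 40.]])
opt = np.array([[12., 19.], [30., 44.]])
d1 = displacement_image(down, opt)
assert np.allclose(down + d1, opt)
```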
Step S140, performing guided filtering processing on the up-sampled image of the first displacement image to obtain a second displacement image.
Wherein the second displacement image has boundary information correlated with the initial panoramic image.
Wherein the resolution of the second displacement image is equal to the resolution of the initial panoramic image.
Wherein the second displacement image has the same color style and brightness style as the first displacement image.
In a specific implementation, the terminal may perform up-sampling processing on the first displacement image, so that the resolution of the up-sampled image of the first displacement image is consistent with the resolution of the initial panoramic image. Then, the terminal may perform guided filtering processing on the up-sampled image of the first displacement image based on the initial panoramic image, retaining the boundary information in the up-sampled image that is correlated with the initial panoramic image, so that the second displacement image obtained through the guided filtering has boundary information correlated with the initial panoramic image. The guided filtering does not change the color style or the brightness style of the up-sampled image of the first displacement image, so the color style and brightness style of the second displacement image are the same as those of the first displacement image; meanwhile, the guided filtering does not change the resolution of the up-sampled image, so the resolution of the second displacement image is the same as that of the initial panoramic image.
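The patent names guided filtering without fixing an implementation; the classical guided filter of He et al. is the standard choice and is sketched below for single-channel float images (`box` is a naive mean filter; the radius `r` and regularizer `eps` are illustrative parameters, and OpenCV's `ximgproc` module also provides an implementation):

```python
import numpy as np

def box(img, r):
    """Mean filter with a (2r+1) x (2r+1) window (edge-padded)."""
    pad = np.pad(img, r, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += pad[r + dy : r + dy + img.shape[0],
                       r + dx : r + dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

def guided_filter(guide, src, r=2, eps=1e-3):
    """Edge-preserving smoothing of `src`, steered by the edges of `guide`:
    locally fits src ~ a * guide + b and averages the coefficients."""
    I, p = guide.astype(float), src.astype(float)
    mean_I, mean_p = box(I, r), box(p, r)
    cov_Ip = box(I * p, r) - mean_I * mean_p
    var_I = box(I * I, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box(a, r) * I + box(b, r)
```

Calling `guided_filter(panorama, upsampled_displacement)` smooths the displacement field while keeping its discontinuities aligned with the panorama's edges, which is the property the text relies on.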
Step S150, superimposing the second displacement image and the initial panoramic image to obtain a target fusion image.
In specific implementation, the terminal can directly superimpose the second displacement image and the initial panoramic image, so that a target fusion image can be obtained; and the resolution of the target fusion image is the same as that of the initial panoramic image, and the target fusion image is a high-resolution panoramic image.
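The final superposition is a per-pixel addition; clipping back to the valid 8-bit range is an assumption added here for illustration, since the patent only states that the two images are superimposed:

```python
import numpy as np

def fuse(panorama: np.ndarray, displacement: np.ndarray) -> np.ndarray:
    """Superimpose the full-resolution displacement image on the
    initial panorama and clip back to the valid 8-bit range."""
    out = panorama.astype(float) + displacement
    return np.clip(out, 0, 255).astype(np.uint8)

pano = np.array([[250, 100], [0, 30]], dtype=np.uint8)
disp = np.array([[10.0, -20.0], [-5.0, 0.0]])
print(fuse(pano, disp))  # [[255  80] [  0  30]]
```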
In the image processing method, a downsampled image of an initial panoramic image is acquired, the initial panoramic image being obtained by stitching a plurality of sub-images; the downsampled image is then optimized to obtain an optimized downsampled image, in which each optimized sub-image has the same image style and the edge features among the optimized sub-images are matched, so that inconsistent color and brightness and misaligned overlapping areas are eliminated in the optimized downsampled image, improving the image fusion effect of the initial panoramic image at low resolution; a first displacement image is then determined according to the difference between the downsampled image and the optimized downsampled image; guided filtering processing is then performed on the up-sampled image of the first displacement image to obtain a second displacement image, where the second displacement image has boundary information correlated with the initial panoramic image and a resolution equal to that of the initial panoramic image; finally, the second displacement image is superimposed on the initial panoramic image to obtain a target fusion image. In this way, the guided filtering of the up-sampled image of the low-resolution first displacement image does not change its color style or brightness style, so the resulting second displacement image is consistent with the first displacement image in overall brightness and color; and since the first displacement image is determined from the difference between the downsampled image of the initial panoramic image at low resolution and the optimized downsampled image, in which the color and brightness of each optimized sub-image are consistent, directly superimposing the second displacement image on the initial panoramic image does not produce inconsistent color or brightness in the target fusion image. Meanwhile, because the second displacement image has boundary information correlated with the initial panoramic image, the target fusion image obtained by superimposing them is free of misalignment in the overlapping areas; and because the resolution of the second displacement image is equal to that of the initial panoramic image, the resolution of the target fusion image is also equal to that of the initial panoramic image, ensuring the image quality of the target fusion image and further improving the effect of sub-image fusion in the panoramic image.
In one embodiment, the performing guided filtering processing on the up-sampled image of the first displacement image to obtain the second displacement image includes: taking the initial panoramic image as a guide map; and performing edge-preserving filtering processing on the up-sampled image of the first displacement image using the guide map to obtain the second displacement image.
In a specific implementation, the terminal may perform up-sampling processing on the first displacement image so that the resolution of the resulting up-sampled image is the same as the resolution of the initial panoramic image. Then, using the initial panoramic image as the guide map, the terminal may perform edge-preserving filtering processing on the up-sampled image of the first displacement image, filtering it according to the boundary information of the initial panoramic image and preserving the boundary information in the up-sampled image that is correlated with the initial panoramic image, while smoothing the non-boundary areas, so that the texture features of the obtained second displacement image are similar to those of the initial panoramic image. The color style and the brightness style of the up-sampled image of the first displacement image are not changed, so the color style and brightness style of the obtained second displacement image are consistent with those of the first displacement image.
Specifically, the second displacement image is calculated as:

$$S = f(U)$$

where $S$ is the second displacement image, $f$ denotes the guided filtering operation, and $k$ denotes the normalization parameter. Per pixel,

$$S_{i,j} = \frac{1}{k} \sum_{i \in N,\, j \in M} w_{i,j}\, U_{i,j}$$

where $i$ and $j$ represent the coordinates of the pixel points; $N$ and $M$ respectively correspond to the ranges of $i$ and $j$, with $i \in N$ and $j \in M$; $U$ is the up-sampled image of the first displacement image; $U_{i,j}$ is the pixel value of the pixel point located at the $(i, j)$ coordinate in the up-sampled image of the first displacement image; and $w_{i,j}$ is the weight value of the guide map over the value range, with the normalization parameter

$$k = \sum_{i \in N,\, j \in M} w_{i,j}$$

and the weight computed from the guide map, for example as

$$w_{i,j} = \exp\left(-\frac{\mathrm{abs}\left(P_{i,j} - P_{i',j'}\right)}{\sigma_C}\right)$$

where $\mathrm{abs}$ is the absolute-value operation, $P$ is the initial panoramic image, $(i', j')$ denotes the neighbouring pixel being weighted, and $\sigma_C$ is a constant in the range $[0.0, 1.0]$.
In the technical solution of this embodiment, performing guided filtering processing on the up-sampled image of the first displacement image to obtain the second displacement image includes: taking the initial panoramic image as a guide map; and performing edge-preserving filtering processing on the up-sampled image of the first displacement image using the guide map to obtain the second displacement image. In this way, the boundary information in the up-sampled image of the first displacement image that is correlated with the initial panoramic image is retained, so that the filtered second displacement image has boundary information correlated with the initial panoramic image; the non-boundary regions of the up-sampled image are smoothed, so that the texture features of the filtered second displacement image are similar to those of the initial panoramic image; and the color style and brightness style of the up-sampled image are not changed, so that the color style and brightness style of the filtered second displacement image are consistent with those of the first displacement image. Therefore, the target fusion image obtained by superimposing the second displacement image and the initial panoramic image is consistent in color style and brightness style and free of misalignment in the overlapping areas, which ensures the image quality of the target fusion image and improves the fusion effect of the sub-images in the target fusion image.
In one embodiment, optimizing the downsampled image to obtain an optimized downsampled image includes: performing gradient operation on the down-sampled image, and generating a processed gradient image of the down-sampled image according to a gradient operation result of the down-sampled image; performing Laplace operation on the processed gradient image to obtain a Laplace gradient image of the down-sampled image; and determining a Poisson fusion result of the Laplace gradient image to obtain an optimized down-sampling image.
Wherein the gradient operation result comprises a horizontal gradient image and a vertical gradient image of the down-sampled image.
In specific implementation, the terminal performs gradient operation on the down-sampled image in the horizontal direction and the vertical direction to obtain a gradient map of the down-sampled image in the horizontal direction, namely a horizontal gradient image, and obtain a gradient map of the down-sampled image in the vertical direction, namely a vertical gradient image; taking the horizontal gradient image and the vertical gradient image as gradient operation results of the down-sampling image; then, smoothing the horizontal gradient image and the vertical gradient image to obtain a processed horizontal gradient image and a processed vertical gradient image; then, performing laplacian operation on the processed horizontal gradient image and the processed vertical gradient image, so as to obtain a laplacian image of the downsampled image; and finally, obtaining an optimized down-sampling image according to a Poisson fusion result of the Laplace image.
According to the technical scheme of the embodiment, gradient operation is carried out on the down-sampled image, and a processed gradient image of the down-sampled image is generated according to the gradient operation result of the down-sampled image; performing Laplace operation on the processed gradient image to obtain a Laplace gradient image of the down-sampled image; determining a Poisson fusion result of the Laplace gradient image to obtain an optimized downsampled image; therefore, the optimized downsampled image obtained by performing Poisson fusion operation on the Laplace gradient image of the downsampled image retains the gradient information of the downsampled image, solves the problems of inconsistent color and brightness and overlapping dislocation in the downsampled image of the initial panoramic image, and improves the image fusion quality of the initial panoramic image on low resolution.
In one embodiment, performing a gradient operation on the down-sampled image and generating a processed gradient image of the down-sampled image according to a gradient operation result of the down-sampled image includes: performing gradient operation on the down-sampled image in the horizontal direction and the vertical direction to obtain a horizontal gradient image and a vertical gradient image of the down-sampled image; respectively determining splicing boundaries of the horizontal gradient image and the vertical gradient image according to the positions of the suture lines in the horizontal gradient image and the vertical gradient image; in the horizontal gradient image, setting corresponding smooth coefficients for adjacent images at the splicing boundary respectively; the sum of the smoothing coefficients of adjacent images is equal to one; according to the smooth coefficients corresponding to the adjacent images, carrying out weighted summation on the adjacent images to obtain a processed horizontal gradient image; in the vertical gradient image, setting corresponding smooth coefficients for adjacent images at the splicing boundary respectively; the sum of the smoothing coefficients of adjacent images is equal to one; and according to the smoothing coefficients corresponding to the adjacent images, carrying out weighted summation on the adjacent images to obtain the processed vertical gradient image.
The adjacent images are sub-images adjacent to each other in the gradient image.
In the specific implementation, the terminal performs gradient operation on the down-sampled image in the horizontal direction and the vertical direction to obtain a horizontal gradient image and a vertical gradient image of the down-sampled image; wherein, the gradient operation formula is as follows:

$$G_x = \nabla_x I, \qquad G_y = \nabla_y I$$

where $G_x$ is the horizontal gradient image, $I$ is the down-sampled image, and $\nabla_x$ is the gradient operator in the horizontal direction; $G_y$ is the vertical gradient image, and $\nabla_y$ is the gradient operator in the vertical direction.
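The horizontal and vertical gradient operations above can be sketched with simple forward differences (one possible discretization of $\nabla_x$ and $\nabla_y$; the embodiment does not fix a particular operator):

```python
import numpy as np

def gradients(img):
    """Return (G_x, G_y): forward-difference gradients of a 2-D image.

    G_x = nabla_x I (horizontal direction), G_y = nabla_y I (vertical).
    The last column/row is zero-padded so the outputs keep the input shape.
    """
    img = np.asarray(img, dtype=float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    return gx, gy
```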
Then, the terminal respectively determines the position of a suture line in the horizontal gradient image and the vertical gradient image by means of dynamic programming search, so that the splicing boundary of the horizontal gradient image and the splicing boundary of the vertical gradient image can be determined respectively. Next, the terminal can set smoothing coefficients for the adjacent images at the splicing boundary in the horizontal gradient image, namely two adjacent sub-images in the horizontal gradient image, so that the sum of the smoothing coefficients of the adjacent images is equal to one; the adjacent images are then weighted and summed according to their corresponding smoothing coefficients, thereby merging the adjacent images over the overlapping region while making the gradient within the boundary neighborhood as close to zero as possible, and finally obtaining the processed horizontal gradient image. Specifically, the smoothing coefficient of the left sub-image at the splicing boundary may be set to $w$, with $w \in [0, 1]$, and the smoothing coefficient of the right sub-image is then $1 - w$; when the weighted summation is performed on the adjacent images, the left sub-image $A$ becomes $A = A \times w$ and the right sub-image $B$ becomes $B = B \times (1 - w)$.
Based on the same approach, a processed vertical gradient image can be obtained.
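The weighted summation at the splicing boundary can be sketched as follows, assuming for illustration a vertical seam at column `seam_col` and a linear weight ramp over a small neighborhood (the ramp shape and half-width are assumptions of this sketch, not fixed by the embodiment):

```python
import numpy as np

def blend_overlap(grad_a, grad_b, seam_col, halfwidth=4):
    """Merge two overlapping gradient sub-images across a vertical seam.

    The left image gets weight w and the right image gets 1 - w, so the
    weights always sum to one; w ramps linearly from 1 to 0 across the
    seam neighborhood so the transition between sub-images is smooth.
    """
    h, w = grad_a.shape
    cols = np.arange(w)
    wl = np.clip(0.5 - (cols - seam_col) / (2.0 * halfwidth), 0.0, 1.0)
    return wl * grad_a + (1.0 - wl) * grad_b
```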
According to the technical scheme of the embodiment, a horizontal gradient image and a vertical gradient image of the down-sampled image are obtained by performing gradient operation on the down-sampled image in the horizontal direction and the vertical direction; respectively determining splicing boundaries of the horizontal gradient image and the vertical gradient image according to the positions of the suture lines in the horizontal gradient image and the vertical gradient image; in the horizontal gradient image, setting corresponding smooth coefficients for adjacent images at the splicing boundary respectively; the sum of the smoothing coefficients of adjacent images is equal to one; according to the smooth coefficients corresponding to the adjacent images, carrying out weighted summation on the adjacent images to obtain a processed horizontal gradient image; and obtaining a processed vertical gradient image based on the same method; thus, by determining the horizontal gradient image and the vertical gradient image of the down-sampled image, the edge information of the down-sampled image is extracted; and the splicing boundary of the horizontal gradient image and the vertical gradient image is subjected to smoothing treatment, so that the gradient in the boundary neighborhood range is as close to zero as possible, and meanwhile, adjacent images with overlapping regions at the splicing boundary can be merged, thereby reducing the abrupt gradient in the gradient image of the down-sampling image and improving the quality of the gradient image.
In one embodiment, the processed gradient image comprises a processed horizontal gradient image and a processed vertical gradient image; performing laplacian operation on the processed gradient image to obtain a laplacian gradient image of the downsampled image, including: respectively carrying out Laplace operation on the processed horizontal gradient image and the processed vertical gradient image to obtain a Laplace horizontal gradient image and a Laplace vertical gradient image; and superposing the Laplace horizontal gradient image and the Laplace vertical gradient image to obtain a Laplace gradient image of the downsampled image.
In specific implementation, the terminal respectively performs Laplace operation on the processed horizontal gradient image and the processed vertical gradient image to obtain a Laplace horizontal gradient image and a Laplace vertical gradient image; the calculation formula of the Laplace horizontal gradient image and the Laplace vertical gradient image is as follows:

$$Lap_x = \nabla_x G'_x, \qquad Lap_y = \nabla_y G'_y$$

where $Lap_x$ is the Laplace horizontal gradient image, $\nabla_x$ is the gradient operator in the horizontal direction, and $G'_x$ is the processed horizontal gradient image; $Lap_y$ is the Laplace vertical gradient image, $\nabla_y$ is the gradient operator in the vertical direction, and $G'_y$ is the processed vertical gradient image.
Then, the terminal superposes the Laplace horizontal gradient image and the Laplace vertical gradient image to obtain the Laplace gradient image of the down-sampled image. Specifically, the calculation formula of the Laplace gradient image is as follows:

$$Lap = Lap_x + Lap_y$$

wherein Lap is the Laplace gradient image.
According to the technical scheme of the embodiment, the Laplace operation is respectively carried out on the processed horizontal gradient image and the processed vertical gradient image to obtain a Laplace horizontal gradient image and a Laplace vertical gradient image; superposing the Laplace horizontal gradient image and the Laplace vertical gradient image to obtain a Laplace gradient image of a down-sampled image; in this way, by performing laplacian operation on the horizontal gradient image after the smoothing processing and the vertical gradient image after the smoothing processing, the details of the downsampled image can be enhanced while the image noise is prevented from being increased by the laplacian operation, and the edge of the downsampled image can be found; so that the information contained in the downsampled image can be acquired.
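Continuing the forward-difference convention, the superposition $Lap = Lap_x + Lap_y$ can be sketched by taking one more derivative of each processed gradient image and adding the results; a backward difference is used here so that, in the interior, the composition reduces to the familiar 5-point Laplacian (this pairing is an implementation assumption of the sketch):

```python
import numpy as np

def laplacian_from_gradients(gpx, gpy):
    """Lap = Lap_x + Lap_y from processed gradient images G'_x, G'_y."""
    lap = np.zeros_like(gpx, dtype=float)
    lap[:, 0] += gpx[:, 0]                   # boundary term
    lap[:, 1:] += gpx[:, 1:] - gpx[:, :-1]   # Lap_x = nabla_x G'_x
    lap[0, :] += gpy[0, :]                   # boundary term
    lap[1:, :] += gpy[1:, :] - gpy[:-1, :]   # Lap_y = nabla_y G'_y
    return lap
```

For an image whose gradients were taken with forward differences, the interior of `lap` equals the standard second difference in each direction.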
In one embodiment, determining a poisson fusion result of a laplacian gradient image of a downsampled image to obtain an optimized downsampled image includes: acquiring a Fourier transform result of a Green function; determining the product of the Fourier transform result of the Laplace gradient image and the Fourier transform result of the Greens function; determining an inverse fourier transform result of the product; and obtaining the optimized down-sampled image according to the constant and the real part of the inverse Fourier transform result of the product.
In a specific implementation, in order to avoid iteratively solving a huge sparse linear matrix, this embodiment uses a Green's function to solve the Poisson equation. A Green's function is a function for solving an inhomogeneous differential equation with an initial condition or a boundary condition: for a linear operator $L$, the output under the action of a point source $\delta$ is the Green's function $G$, expressed mathematically as:

$$L\,G = \delta$$

In this embodiment, the relationship between the Green's function and the Laplace gradient image can be expressed as:

$$I_{opt} = Lap * G$$

where $I_{opt}$ is the optimized down-sampled image, $Lap$ is the Laplace gradient image, $G$ is the Green's function, and $*$ is the convolution operator.
And the relationship between the Green's function and the Laplace operator can be expressed as:

$$\nabla^2 G = \delta$$

where $\nabla^2$ is the Laplace operator, $\delta$ is the Dirac delta function, and $G$ is the Green's function. The closed form of $G$ [equation images not recovered] involves a constant $S$, the dimension information $n$ — for an image, $n = 2$ — the Euclidean distance $r$, and the gamma function $\Gamma$.
Therefore, the terminal can determine the Fourier transform result $F(G)$ of the Green's function from the Fourier transform result $F(\nabla^2)$ of the Laplace operator and the Fourier transform result $F(\delta)$ of the Dirac delta function, wherein:

$$F(G) = \frac{F(\delta)}{F(\nabla^2)}$$

Then, the terminal determines the product of the Fourier transform result of the Laplace gradient image and the Fourier transform result of the Green's function; thereby determines the inverse Fourier transform result of the product; and obtains the optimized down-sampled image from the constant and the real part of the inverse Fourier transform result of the product. Specifically, the calculation formula of the optimized down-sampled image is as follows:

$$I_{opt} = R\!\left(F^{-1}\big(F(Lap)\cdot F(G)\big)\right) + c$$

where $R$ denotes taking the real part, $F^{-1}$ is the inverse Fourier transform operation, $F(Lap)$ is the Fourier transform result of the Laplace gradient image, and $c$ is a constant term. In this way, the convolution used to solve for the optimized down-sampled image is converted into forward and inverse Fourier transforms.
According to the technical scheme of this embodiment, the Fourier transform result of the Green's function is obtained; the product of the Fourier transform result of the Laplace gradient image and the Fourier transform result of the Green's function is determined; the inverse Fourier transform result of the product is determined; and the optimized down-sampled image is obtained from the constant and the real part of the inverse Fourier transform result of the product. In this way, the optimized down-sampled image can be obtained by solving the Poisson fusion result of the Laplace gradient image of the down-sampled image through the Green's function, which avoids the need of the traditional Poisson fusion calculation to iteratively solve a huge sparse linear matrix under explicit boundary color constraints, so that the boundary color of the optimized down-sampled image remains continuous while the processing speed is improved.
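The Fourier-domain solve above can be sketched in numpy. Instead of explicitly forming $F(G)$, the sketch divides by the transfer function of the discrete Laplacian — equivalent to multiplying by $F(G)$, since $F(G) = F(\delta)/F(\nabla^2)$ — and assumes periodic boundaries, which is an assumption of this sketch rather than a requirement of the embodiment:

```python
import numpy as np

def poisson_solve_fft(lap):
    """Solve nabla^2 I = Lap in the Fourier domain and return a real image.

    The DC coefficient of the Laplacian's transfer function is zero, so the
    solution is defined only up to a constant c; here c pins the minimum to 0.
    """
    h, w = lap.shape
    fy = np.fft.fftfreq(h).reshape(-1, 1)
    fx = np.fft.fftfreq(w).reshape(1, -1)
    # transfer function of the 5-point Laplacian under periodic boundaries
    denom = (2.0 * np.cos(2.0 * np.pi * fx) - 2.0) \
          + (2.0 * np.cos(2.0 * np.pi * fy) - 2.0)
    denom[0, 0] = 1.0                      # avoid 0/0 at the DC term
    f = np.fft.fft2(lap) / denom
    f[0, 0] = 0.0                          # DC is left to the constant term c
    img = np.real(np.fft.ifft2(f))         # real part of the inverse transform
    return img - img.min()                 # choose c so the result is >= 0
```

Applied to the Laplacian of a periodic image, this recovers that image up to the additive constant.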
In another embodiment, as shown in fig. 3, an image processing method is provided, which is described by taking the method as an example for being applied to a terminal, and includes the following steps:
step S302, a down-sampling image of the initial panoramic image is obtained; the initial panoramic image is an image obtained by splicing a plurality of sub-images.
Step S304, performing gradient operation on the down-sampled image in the horizontal direction and the vertical direction to obtain a horizontal gradient image and a vertical gradient image of the down-sampled image.
Step S306, determining the splicing boundary of the horizontal gradient image and the vertical gradient image according to the position of the suture line in the horizontal gradient image and the vertical gradient image.
Step S308, respectively setting corresponding smoothing coefficients for adjacent images at the splicing boundary in the horizontal gradient image; the sum of the smoothing coefficients of the neighboring images is equal to one.
Step S310, carrying out weighted summation on the adjacent images according to the smoothing coefficients corresponding to the adjacent images to obtain a processed horizontal gradient image.
Step S312, in the vertical gradient image, setting corresponding smooth coefficients for the adjacent images at the splicing boundary respectively; the sum of the smoothing coefficients of the neighboring images is equal to one.
And step S314, carrying out weighted summation on the adjacent images according to the smoothing coefficients corresponding to the adjacent images to obtain a processed vertical gradient image.
And step S316, performing Laplace operation on the processed horizontal gradient image and the processed vertical gradient image respectively to obtain a Laplace horizontal gradient image and a Laplace vertical gradient image.
And step S318, overlapping the Laplace horizontal gradient image and the Laplace vertical gradient image to obtain a Laplace gradient image of the downsampled image.
Step S320, determining a poisson fusion result of the laplacian gradient image to obtain an optimized downsampled image.
Step S322, determining a first displacement image according to a difference between the downsampled image and the optimized downsampled image.
In step S324, the initial panoramic image is used as a guide map.
Step S326, performing edge-preserving filtering processing on the up-sampled image of the first displacement image by using the guide map to obtain a second displacement image; the second displacement image has associated boundary information with the initial panoramic image; the resolution of the second displacement image is equal to the resolution of the initial panoramic image.
And step S328, overlapping the second displacement image and the initial panoramic image to obtain a target fusion image.
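Steps S302 to S328 can be condensed into the following skeleton, with the low-resolution optimization passed in as a callable: `optimize_fn` stands in for the gradient-editing plus fast-Poisson steps, and nearest-neighbor down/up-sampling replaces whatever resampling and guided filtering a real implementation would use — all of these are assumptions of the sketch:

```python
import numpy as np

def fuse_panorama(pano, optimize_fn, downscale=4):
    """Optimize at low resolution, then lift the correction back up as a
    displacement image and superpose it onto the initial panorama."""
    low = pano[::downscale, ::downscale]       # down-sampled image
    low_opt = optimize_fn(low)                 # optimized down-sampled image
    disp = low_opt - low                       # first displacement image
    # nearest-neighbor up-sampling; a real pipeline would then refine this
    # with guided filtering against `pano` before the superposition
    disp_up = np.kron(disp, np.ones((downscale, downscale)))
    disp_up = disp_up[:pano.shape[0], :pano.shape[1]]
    return pano + disp_up                      # target fusion image
```

With an `optimize_fn` that merely adds a constant offset at low resolution, the same offset propagates to the full-resolution result, illustrating how the displacement image carries the correction.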
It should be noted that, for the specific limitations of the above steps, reference may be made to the specific limitations of an image processing method.
To facilitate understanding by those skilled in the art, FIG. 4 provides a flow chart of another image processing method. First, a high-resolution color map $I$ (equivalent to the "initial panoramic image" in the above embodiment) needs to be acquired; then, $I$ is down-sampled to obtain the down-sampled image $I_{low}$ of the high-resolution color map; then, gradient editing processing (corresponding to steps S304 to S320 in the above embodiment) is performed on $I_{low}$ to obtain the Laplace gradient image Lap; then, a fast Poisson solution is performed on the Laplace gradient image (equivalent to "solving the Poisson equation using the Green's function" in the above embodiment) to obtain the low-resolution fusion image $I_{fuse}$ (equivalent to the "optimized down-sampled image" in the above embodiment); then, the displacement of each pixel between the down-sampled image $I_{low}$ and the low-resolution fusion image $I_{fuse}$ is calculated to obtain the first displacement image $D$, and guided-filtering up-sampling is performed on $D$ to obtain the second displacement image $D'$ of the high-resolution color map. Specifically, the first displacement image $D$ may first be directly up-sampled to obtain the up-sampled image $D_{up}$ of the first displacement image, whose resolution is the same as that of the high-resolution color map $I$; then, with $I$ as the guide map, edge-preserving filtering is performed on $D_{up}$ to obtain the second displacement image $D'$. Finally, the second displacement image $D'$ and the high-resolution color map $I$ are superposed to obtain the high-resolution fusion image $I'$ (equivalent to the "target fusion image" in the above embodiment).
Therefore, the improved fast Poisson solving method can quickly and effectively resolve the series of problems in the image splicing process such as brightness, color, and overlap misalignment, and avoids the need of the traditional Poisson equation solution to iteratively solve a huge sparse linear matrix under explicit boundary color constraints, so that the boundary color of the high-resolution fusion result remains continuous while the processing speed is improved; in addition, the guided-filtering up-sampling processing optimizes the boundary and keeps the definition of the high resolution, thereby further improving the fusion effect of the sub-images in the high-resolution panoramic image.
It should be understood that, although the steps in the flowcharts of the embodiments described above are displayed sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated otherwise herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least a part of the steps in the flowcharts of the embodiments described above may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the execution order of these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least a part of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the present application further provides an image processing apparatus for implementing the image processing method. The implementation scheme for solving the problem provided by the apparatus is similar to the implementation scheme described in the above method, so the specific limitations in one or more embodiments of the image processing apparatus provided below may refer to the limitations of one of the image processing methods in the foregoing description, and details are not repeated here.
In one embodiment, as shown in fig. 5, there is provided an image processing apparatus including: an obtaining module 510, an optimizing module 520, a determining module 530, a filtering module 540, and a superimposing module 550, wherein:
an obtaining module 510, configured to obtain a downsampled image of the initial panoramic image; the initial panoramic image is an image obtained by splicing a plurality of sub-images.
An optimizing module 520, configured to perform optimization processing on the downsampled image to obtain an optimized downsampled image; and each optimized sub-image in the optimized downsampled image has the same image style, and the edge characteristics of each optimized sub-image are matched.
A determining module 530, configured to determine a first displacement image according to a difference between the downsampled image and the optimized downsampled image.
The filtering module 540 is configured to perform guided filtering processing on the upsampled image of the first displacement image to obtain a second displacement image; the second shifted image has associated boundary information with the initial panoramic image; the resolution of the second displaced image is equal to the resolution of the initial panoramic image.
And an overlaying module 550, configured to overlay the second displacement image and the initial panoramic image to obtain a target fusion image.
In one embodiment, the filtering module 540 is specifically configured to use the initial panoramic image as a guide map; and performing edge-preserving filtering processing on the up-sampling image of the first displacement image by adopting the guide image to obtain the second displacement image.
In one embodiment, the optimization module 520 is specifically configured to perform a gradient operation on the downsampled image, and generate a processed gradient image of the downsampled image according to a gradient operation result of the downsampled image; performing laplacian operation on the processed gradient image to obtain a laplacian gradient image of the downsampled image; and determining a Poisson fusion result of the Laplace gradient image to obtain the optimized down-sampling image.
In one embodiment, the optimization module 520 is specifically configured to perform gradient operation on the down-sampled image in a horizontal direction and a vertical direction to obtain a horizontal gradient image and a vertical gradient image of the down-sampled image; respectively determining the splicing boundaries of the horizontal gradient image and the vertical gradient image according to the positions of the suture lines in the horizontal gradient image and the vertical gradient image; in the horizontal gradient image, setting corresponding smoothing coefficients for adjacent images at the splicing boundary respectively; the sum of the smoothing coefficients of the adjacent images is equal to one; according to the smoothing coefficient corresponding to the adjacent image, carrying out weighted summation on the adjacent image to obtain a processed horizontal gradient image; setting corresponding smoothing coefficients for adjacent images at the splicing boundary in the vertical gradient image; the sum of the smoothing coefficients of the adjacent images is equal to one; and according to the smoothing coefficients corresponding to the adjacent images, carrying out weighted summation on the adjacent images to obtain a processed vertical gradient image.
In one embodiment, the optimization module 520 is specifically configured to perform laplacian operation on the processed horizontal gradient image and the processed vertical gradient image respectively to obtain a laplacian horizontal gradient image and a laplacian vertical gradient image; and superposing the Laplace horizontal gradient image and the Laplace vertical gradient image to obtain a Laplace gradient image of the down-sampling image.
In one embodiment, the optimization module 520 is specifically configured to obtain a fourier transform result of the green's function; determining a product of a Fourier transform result of the Laplace gradient image and a Fourier transform result of the Green function; determining an inverse fourier transform result of the product; and obtaining the optimized down-sampled image according to the constant and the real part of the inverse Fourier transform result of the product.
The respective modules in the image processing apparatus described above may be wholly or partially implemented by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer device is used for storing image processing data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an image processing method.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It should be noted that, the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high-density embedded nonvolatile Memory, resistive Random Access Memory (ReRAM), Magnetic Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene Memory, and the like. Volatile Memory can include Random Access Memory (RAM), external cache Memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others. The databases referred to in various embodiments provided herein may include at least one of relational and non-relational databases. The non-relational database may include, but is not limited to, a block chain based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, quantum computing based data processing logic devices, etc., without limitation.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (9)

1. An image processing method, characterized in that the method comprises:
acquiring a downsampled image of an initial panoramic image; the initial panoramic image being an image obtained by stitching a plurality of sub-images;
optimizing the downsampled image to obtain an optimized downsampled image, which specifically comprises: performing a gradient operation on the downsampled image to obtain a processed gradient image of the downsampled image; performing a Laplacian operation on the processed gradient image to obtain a Laplacian gradient image of the downsampled image; and determining a Poisson fusion result of the Laplacian gradient image to obtain the optimized downsampled image; wherein each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched with one another;
determining a first displacement image according to the difference between the downsampled image and the optimized downsampled image;
performing guided filtering processing on an upsampled image of the first displacement image to obtain a second displacement image; the second displacement image having boundary information associated with the initial panoramic image, and a resolution of the second displacement image being equal to a resolution of the initial panoramic image;
and superposing the second displacement image and the initial panoramic image to obtain a target fusion image.
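The data flow of claim 1 can be sketched in a few lines. The following is an illustrative NumPy sketch, not the patented implementation: the gradient/Laplacian/Poisson optimization is abstracted as a callable, the guided-filtering refinement of the upsampled displacement image is omitted, and the names `downsample`, `upsample`, and `fuse_panorama` are hypothetical (the claim does not fix the sampling operators).

```python
import numpy as np

def downsample(img, factor=2):
    # Block-average downsampling (a stand-in; the claim does not fix the operator).
    h = img.shape[0] // factor * factor
    w = img.shape[1] // factor * factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def upsample(img, shape):
    # Nearest-neighbour upsampling back to the panorama resolution.
    ys = (np.arange(shape[0]) * img.shape[0] // shape[0]).clip(0, img.shape[0] - 1)
    xs = (np.arange(shape[1]) * img.shape[1] // shape[1]).clip(0, img.shape[1] - 1)
    return img[np.ix_(ys, xs)]

def fuse_panorama(panorama, optimize):
    # Steps of claim 1: downsample; optimize; the difference is the first
    # displacement image; upsample it (guided filtering omitted here) to
    # get the full-resolution displacement; superpose onto the panorama.
    small = downsample(panorama)
    displacement = optimize(small) - small
    displacement_full = upsample(displacement, panorama.shape)
    return panorama + displacement_full
```

Because only the low-frequency displacement is computed at reduced resolution, the expensive Poisson step runs on the small image while the full-resolution detail of the panorama is preserved by the final superposition.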
2. The method according to claim 1, wherein the performing guided filtering processing on the upsampled image of the first displacement image to obtain the second displacement image comprises:
taking the initial panoramic image as a guide image;
and performing edge-preserving filtering processing on the upsampled image of the first displacement image by using the guide image to obtain the second displacement image.
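One well-known edge-preserving filter that uses a guide image in exactly this way is the classic guided filter. The sketch below is a plain NumPy rendition under assumed parameters `r` (window radius) and `eps` (regularization), neither of which is fixed by the claim; the panorama plays the role of the guide and the upsampled displacement image is the filtering input.

```python
import numpy as np

def box(img, r):
    # Naive box filter: mean over a (2r+1)x(2r+1) window, clipped at borders.
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for y in range(h):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        for x in range(w):
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

def guided_filter(guide, src, r=2, eps=1e-3):
    # Classic guided-filter formulation: fit src ~ a*guide + b in each
    # local window, then average the per-window coefficients.
    mean_I = box(guide, r)
    mean_p = box(src, r)
    cov_Ip = box(guide * src, r) - mean_I * mean_p
    var_I = box(guide * guide, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box(a, r) * guide + box(b, r)
```

The local linear model transfers the guide's edges into the output, which is why the second displacement image carries boundary information associated with the initial panorama.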
3. The method of claim 1, wherein performing a gradient operation on the downsampled image to obtain a processed gradient image of the downsampled image comprises:
performing a gradient operation on the downsampled image in a horizontal direction to obtain a horizontal gradient image of the downsampled image;
determining a stitching boundary of the horizontal gradient image according to a position of a seam line in the horizontal gradient image;
in the horizontal gradient image, setting corresponding smoothing coefficients for adjacent images at the stitching boundary, respectively; a sum of the smoothing coefficients of the adjacent images being equal to one;
and performing a weighted summation on the adjacent images according to the smoothing coefficients corresponding to the adjacent images to obtain the processed horizontal gradient image.
4. The method of claim 1, wherein performing a gradient operation on the downsampled image to obtain a processed gradient image of the downsampled image further comprises:
performing a gradient operation on the downsampled image in a vertical direction to obtain a vertical gradient image of the downsampled image;
determining a stitching boundary of the vertical gradient image according to a position of a seam line in the vertical gradient image;
in the vertical gradient image, setting corresponding smoothing coefficients for adjacent images at the stitching boundary, respectively; a sum of the smoothing coefficients of the adjacent images being equal to one;
and performing a weighted summation on the adjacent images according to the smoothing coefficients corresponding to the adjacent images to obtain the processed vertical gradient image.
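Claims 3 and 4 describe the same seam treatment in the two directions. The sketch below illustrates one reading of the horizontal case: the gradient columns on either side of the seam are blended with coefficients `alpha` and `1 - alpha` that sum to one. Both `seam_col` and `alpha` are assumed inputs; the patent fixes neither how the seam line is found nor how the coefficients are chosen.

```python
import numpy as np

def smooth_gradient_at_seam(img, seam_col, alpha=0.5):
    # Horizontal gradient via forward differences (last column replicated).
    grad = np.diff(img, axis=1, append=img[:, -1:])
    # Weighted summation of the adjacent gradient values at the stitching
    # boundary; the two smoothing coefficients sum to one, as claimed.
    left = grad[:, seam_col - 1]
    right = grad[:, seam_col]
    blended = alpha * left + (1.0 - alpha) * right
    grad[:, seam_col - 1] = blended
    grad[:, seam_col] = blended
    return grad
```

Blending the gradients, rather than the intensities, suppresses the spurious gradient spike at the seam so that the later Poisson reconstruction does not reproduce the visible stitching edge.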
5. The method of claim 1, wherein the processed gradient image comprises a processed horizontal gradient image and a processed vertical gradient image; and the performing a Laplacian operation on the processed gradient image to obtain a Laplacian gradient image of the downsampled image comprises:
performing a Laplacian operation on the processed horizontal gradient image and the processed vertical gradient image, respectively, to obtain a Laplacian horizontal gradient image and a Laplacian vertical gradient image;
and superposing the Laplacian horizontal gradient image and the Laplacian vertical gradient image to obtain the Laplacian gradient image of the downsampled image.
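One way to read the per-direction operation of claim 5 is as a second derivative taken along each gradient's own direction, so that the superposition of the two results is the divergence of the gradient field, i.e. the Laplacian of the underlying image. This is an interpretive sketch with hypothetical names, not the claimed implementation:

```python
import numpy as np

def laplacian_from_gradients(gx, gy):
    # d(gx)/dx and d(gy)/dy via forward differences (first row/column
    # replicated); their superposition is div(grad) = Laplacian.
    lx = np.diff(gx, axis=1, prepend=gx[:, :1])
    ly = np.diff(gy, axis=0, prepend=gy[:1, :])
    return lx + ly
```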
6. The method of claim 1, wherein the determining a Poisson fusion result of the Laplacian gradient image to obtain the optimized downsampled image comprises:
acquiring a Fourier transform result of a Green's function;
determining a product of a Fourier transform result of the Laplacian gradient image and the Fourier transform result of the Green's function;
determining an inverse Fourier transform result of the product;
and obtaining the optimized downsampled image according to a constant and a real part of the inverse Fourier transform result of the product.
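The steps of claim 6 amount to solving the Poisson equation in the Fourier domain: multiplying the transform of the Laplacian image by the transform of the Green's function is equivalent to dividing by the eigenvalues of the discrete Laplacian, and the undetermined DC component is supplied by the constant. A minimal periodic-boundary sketch, assuming a 5-point Laplacian (the claim does not specify the discretization or boundary handling):

```python
import numpy as np

def poisson_solve_fft(laplacian_img, constant=0.0):
    # Eigenvalues of the periodic 5-point Laplacian at each frequency:
    # 2cos(2*pi*fy) + 2cos(2*pi*fx) - 4.
    h, w = laplacian_img.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    denom = 2 * np.cos(2 * np.pi * fy) + 2 * np.cos(2 * np.pi * fx) - 4.0
    denom[0, 0] = 1.0                  # avoid divide-by-zero at DC
    F = np.fft.fft2(laplacian_img) / denom
    F[0, 0] = 0.0                      # DC is undetermined by the Laplacian
    # Real part of the inverse transform, plus the claimed constant offset.
    return np.real(np.fft.ifft2(F)) + constant
```

Round-tripping a smooth periodic image through its discrete Laplacian and back recovers it up to the additive constant, which is exactly the degree of freedom the claim's "constant" fixes.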
7. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire a downsampled image of an initial panoramic image; the initial panoramic image being an image obtained by stitching a plurality of sub-images;
an optimization module, configured to optimize the downsampled image to obtain an optimized downsampled image, which specifically comprises: performing a gradient operation on the downsampled image to obtain a processed gradient image of the downsampled image; performing a Laplacian operation on the processed gradient image to obtain a Laplacian gradient image of the downsampled image; and determining a Poisson fusion result of the Laplacian gradient image to obtain the optimized downsampled image; wherein each optimized sub-image in the optimized downsampled image has the same image style, and edge features among the optimized sub-images are matched with one another;
a determining module, configured to determine a first displacement image according to a difference between the downsampled image and the optimized downsampled image;
a filtering module, configured to perform guided filtering processing on an upsampled image of the first displacement image to obtain a second displacement image; the second displacement image having boundary information associated with the initial panoramic image, and a resolution of the second displacement image being equal to a resolution of the initial panoramic image;
and a superposition module, configured to superpose the second displacement image and the initial panoramic image to obtain a target fusion image.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
CN202111576663.4A 2021-12-22 2021-12-22 Image processing method, image processing device, computer equipment and storage medium Active CN113962867B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111576663.4A CN113962867B (en) 2021-12-22 2021-12-22 Image processing method, image processing device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113962867A CN113962867A (en) 2022-01-21
CN113962867B true CN113962867B (en) 2022-03-15

Family

ID=79473514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111576663.4A Active CN113962867B (en) 2021-12-22 2021-12-22 Image processing method, image processing device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113962867B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101119442A (en) * 2007-08-10 2008-02-06 浙江大学 Video editing method based on panorama sketch split joint
CN101853524A (en) * 2010-05-13 2010-10-06 北京农业信息技术研究中心 Method for generating corn ear panoramic image by using image sequence
DE102015121084A1 (en) * 2015-05-25 2016-12-01 Htc Corporation Zoom control method for camera and electronic device with camera
CN107146201A (en) * 2017-05-08 2017-09-08 重庆邮电大学 A kind of image split-joint method based on improvement image co-registration
GB2548358A (en) * 2016-03-15 2017-09-20 Nokia Technologies Oy A method, an apparatus and a computer program product for coding a 360-degree panoramic images and video
CN109785233A (en) * 2018-12-25 2019-05-21 合肥埃科光电科技有限公司 A kind of image super-resolution rebuilding method
CN109934772A (en) * 2019-03-11 2019-06-25 深圳岚锋创视网络科技有限公司 A kind of image interfusion method, device and portable terminal
CN110782424A (en) * 2019-11-08 2020-02-11 重庆紫光华山智安科技有限公司 Image fusion method and device, electronic equipment and computer readable storage medium
CN111553841A (en) * 2020-04-21 2020-08-18 东南大学 Real-time video stitching algorithm based on optimal suture line updating
CN111652800A (en) * 2020-04-30 2020-09-11 清华大学深圳国际研究生院 Single image super-resolution method and computer readable storage medium
CN113313774A (en) * 2020-02-26 2021-08-27 华为技术有限公司 Image processing method, image processing device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230274A (en) * 2018-01-16 2018-06-29 华南理工大学 Multiresolution rapid denoising method and device under a kind of mixed noise model
CN109961497A (en) * 2019-03-22 2019-07-02 刘文龙 Real-time three-dimensional method for reconstructing based on unmanned plane image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research and Application of Image Fusion Technology; Guo Xiongfei; China Master's Theses Full-text Database, Information Science and Technology, No. 08, 2014-08-15; I138-1447 *
Real-time 4K Panoramic Video Stitching Based on GPU Acceleration; Lu Jiaming et al.; Computer Science, Vol. 44, No. 8, August 2017; 18-21, 26 *


Similar Documents

Publication Publication Date Title
CN109978756B (en) Target detection method, system, device, storage medium and computer equipment
US20180300855A1 (en) Method and a system for image processing
CN102246204B (en) Devices and methods for processing images using scale space
CN108109109B (en) Super-resolution image reconstruction method, device, medium and computing equipment
KR102221225B1 (en) Method and Apparatus for Improving Image Quality
EP3929867A2 (en) Image fusion method and portable terminal
CN107657586A (en) A kind of single photo super-resolution Enhancement Method based on depth residual error network
CN111951167B (en) Super-resolution image reconstruction method, super-resolution image reconstruction device, computer equipment and storage medium
CN109448088B (en) Method and device for rendering three-dimensional graphic wire frame, computer equipment and storage medium
US20230252605A1 (en) Method and system for a high-frequency attention network for efficient single image super-resolution
CN113344793A (en) Image super-resolution reconstruction method, device, equipment and storage medium
Zhu et al. Low-light image enhancement network with decomposition and adaptive information fusion
CN113962867B (en) Image processing method, image processing device, computer equipment and storage medium
KR101341617B1 (en) Apparatus and method for super-resolution based on error model of single image
CN111681165A (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN115423697A (en) Image restoration method, terminal and computer storage medium
CN111524072B (en) Super-resolution reconstruction network training method and device, electronic equipment and storage medium
CN110322478B (en) Feature point observation window processing method, tracking method, device, equipment and medium
US20230010980A1 (en) Planning radiation therapy using a personalized hematologic risk score
CN114782300A (en) Product image reconstruction method, product reconstruction model training method and device
CN108364273B (en) Method for multi-focus image fusion in spatial domain
CN114549377B (en) Medical image fusion method
CN110895790A (en) Scene image super-resolution method based on posterior degradation information estimation
CN110751627B (en) Image processing method, device, computer equipment and storage medium
CN116580131B (en) Static image rendering method, device, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant