CN111062902B - Image deformation method, medium, device and apparatus - Google Patents

Image deformation method, medium, device and apparatus

Info

Publication number
CN111062902B
CN111062902B CN201911194886.7A
Authority
CN
China
Prior art keywords
image
processed
texture
texture image
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911194886.7A
Other languages
Chinese (zh)
Other versions
CN111062902A (en)
Inventor
朱钰恒
李文镇
林杰兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gaoding Xiamen Technology Co Ltd
Original Assignee
Gaoding Xiamen Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gaoding Xiamen Technology Co Ltd filed Critical Gaoding Xiamen Technology Co Ltd
Priority to CN201911194886.7A priority Critical patent/CN111062902B/en
Publication of CN111062902A publication Critical patent/CN111062902A/en
Application granted granted Critical
Publication of CN111062902B publication Critical patent/CN111062902B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/20 Image enhancement or restoration using local operators

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses an image deformation method, medium, device and apparatus, wherein the method comprises the following steps: acquiring an image to be processed and a texture image to be processed of the same size; processing each pixel point in the texture image to be processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; moving the pixel points at the same positions in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; and superposing the image to be processed after the image deformation with the texture image to be processed to output a fused image. The image to be processed is thus deformed according to the texture of the texture image to be processed before fusion, so that the fusion effect is natural and well-fitted.

Description

Image deformation method, medium, device and apparatus
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image deformation method, a computer-readable storage medium, a computer device, and an image deformation apparatus.
Background
In the related art, one image often needs to be fused into another. However, because the textures of the two images differ, directly fusing two completely different images together makes the fused image look unnatural, which greatly degrades the fusion effect.
Disclosure of Invention
The present invention is directed to solving at least one of the technical problems in the related art to some extent. Therefore, an object of the present invention is to provide an image deformation method, which deforms the image to be processed according to the texture of the texture image to be processed and then performs fusion, so that the fusion effect is natural and well-fitted.
A second object of the invention is to propose a computer-readable storage medium.
A third object of the invention is to propose a computer device.
A fourth object of the present invention is to provide an image deformation apparatus.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides an image deformation method, including: acquiring an image to be processed and a texture image to be processed of the same size; processing each pixel point in the texture image to be processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; moving the pixel points at the same positions in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; and superposing the image to be processed after the image deformation with the texture image to be processed to output a fused image.
According to the image deformation method of the embodiment of the present invention, an image to be processed and a texture image to be processed of the same size are first acquired; each pixel point in the texture image to be processed is then processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; the pixel points at the same positions in the image to be processed are then moved according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; finally, the image to be processed after the image deformation is superposed with the texture image to be processed to output a fused image. The image to be processed is thus deformed according to the texture of the texture image to be processed before fusion, so that the fused image looks natural and well-fitted.
In addition, the image deformation method proposed according to the above-mentioned embodiment of the present invention may also have the following additional technical features:
optionally, when the texture image to be processed is a grayscale image, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i = (X_i - 128) * sqrt(0.5) * A

where d_i represents the variation distance corresponding to the i-th pixel point in the texture image to be processed, X_i represents the current pixel value of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the magnitude of the movement amplitude.
Optionally, when the texture image to be processed is an RGB image, the variation distance corresponding to any two channels of each pixel point in the RGB image is calculated.
Optionally, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i^R = (X_i^R - 128) * sqrt(0.5) * A

d_i^G = (X_i^G - 128) * sqrt(0.5) * A

where d_i^R represents the variation distance corresponding to the R channel of the i-th pixel point in the texture image to be processed, X_i^R represents the current pixel value of the R channel of the i-th pixel point in the texture image to be processed, d_i^G represents the variation distance corresponding to the G channel of the i-th pixel point in the texture image to be processed, X_i^G represents the current pixel value of the G channel of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the magnitude of the movement amplitude.
Optionally, before the to-be-processed image after the image deformation is superimposed on the to-be-processed texture image, a bilinear interpolation method is further adopted to process the pixel points in the to-be-processed image after the image deformation.
Optionally, when the to-be-processed image after the image deformation is superimposed on the to-be-processed texture image, a multiply (positive-film-superimposed) fusion algorithm is further adopted to perform the superimposition of the to-be-processed image after the image deformation and the to-be-processed texture image.
Optionally, when the to-be-processed image after the image deformation is superimposed with the to-be-processed texture image, a highlight fusion algorithm is further adopted to superimpose the to-be-processed image after the image deformation and the to-be-processed texture image.
To achieve the above object, an embodiment of a second aspect of the present invention provides a computer-readable storage medium on which an image deformation program is stored; when executed by a processor, the program implements the image deformation method described above.
According to the computer-readable storage medium of the embodiment of the present invention, the stored image deformation program, when executed by a processor, implements the image deformation method described above, so that the fused image looks natural and well-fitted.
To achieve the above object, an embodiment of a third aspect of the present invention provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the image deformation method described above is implemented.
According to the computer device of the embodiment of the present invention, the memory stores the image deformation program, and the image deformation program, when executed by the processor, implements the image deformation method described above, so that the fused image looks natural and well-fitted.
In order to achieve the above object, an embodiment of a fourth aspect of the present invention provides an image deformation apparatus, including: an acquisition module for acquiring an image to be processed and a texture image to be processed of the same size; a distance calculation module for processing each pixel point in the texture image to be processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; a deformation module for moving the pixel points at the same positions in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; and a superposition module for superposing the image to be processed after the image deformation with the texture image to be processed, so as to output a fused image.
According to the image deformation apparatus of the embodiment of the present invention, the acquisition module acquires an image to be processed and a texture image to be processed of the same size; the distance calculation module processes each pixel point in the texture image to be processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; the deformation module moves the pixel points at the same positions in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; and the superposition module superposes the image to be processed after the image deformation with the texture image to be processed to output a fused image. The image to be processed is thus deformed according to the texture of the texture image to be processed before fusion, so that the fused image looks natural and well-fitted.
Drawings
FIG. 1 is a flowchart illustrating an image deformation method according to an embodiment of the present invention;
FIG. 2 is an image to be processed according to one embodiment of the invention;
FIG. 3 is a texture image to be processed according to one embodiment of the present invention;
FIG. 4 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 5 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 6 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 7 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 8 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 9 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 10 is an effect diagram of a fused image according to an embodiment of the invention;
FIG. 11 is a block diagram of an image deformation apparatus according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to elements that are the same or similar or have the same or similar functions throughout. The embodiments described below with reference to the drawings are illustrative, are intended to explain the invention, and are not to be construed as limiting the invention.
In the related art, after two images with different textures are fused, an obvious unnatural feeling remains, so the fusion effect is poor. According to the image deformation method of the embodiment of the present invention, an image to be processed and a texture image to be processed of the same size are acquired; each pixel point in the texture image to be processed is then processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; the pixel points at the same positions in the image to be processed are then moved according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to deform the image to be processed; finally, the image to be processed after the image deformation is superposed with the texture image to be processed to output a fused image, so that the fused image is natural and well-fitted.
In order to better understand the above technical solution, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
FIG. 1 is a schematic flowchart of an image deformation method according to an embodiment of the present invention; as shown in FIG. 1, the image deformation method according to the embodiment of the present invention includes the following steps:
step 101, obtaining an image to be processed and a texture image to be processed with the same size.
That is, the size of the acquired image to be processed and the size of the acquired texture image to be processed are the same.
As an embodiment, FIG. 2 is an image to be processed and FIG. 3 is a texture image to be processed. The image to be processed and the texture image to be processed input by a user are acquired; if the sizes of the two input images are inconsistent, they are resized so that the image to be processed and the texture image to be processed end up the same size.
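By way of illustration, a minimal Python sketch of this size-matching step using OpenCV and NumPy; the function name and the choice to resize the texture image rather than the base image are assumptions of this example, not requirements of the method:

```python
import cv2

def match_sizes(image, texture):
    """Ensure the texture image matches the image to be processed in size.

    Resizing the texture image (not the base image) is an assumption made
    for this sketch; the method only requires equal sizes.
    """
    h, w = image.shape[:2]
    if texture.shape[:2] != (h, w):
        # cv2.resize takes the target size as (width, height)
        texture = cv2.resize(texture, (w, h), interpolation=cv2.INTER_LINEAR)
    return image, texture
```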
And 102, processing each pixel point in the texture image to be processed according to the image type of the texture image to be processed so as to obtain the corresponding variation distance of each pixel point in the texture image to be processed.
That is to say, after acquiring the to-be-processed image and the to-be-processed texture image, firstly judging the image type of the to-be-processed texture image, and if the image type of the to-be-processed texture image is a gray-scale image, processing each pixel point in the to-be-processed texture image according to the corresponding processing mode of the gray-scale image so as to obtain the corresponding change distance of each pixel point in the to-be-processed texture image; and if the image type of the texture image to be processed is an RGB image, processing each pixel point in the texture image to be processed according to the corresponding processing mode of the RGB image so as to obtain the corresponding change distance of each pixel point in the texture image to be processed.
As an embodiment, when the texture image to be processed is a gray-scale image, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i = (X_i - 128) * sqrt(0.5) * A

where d_i represents the variation distance corresponding to the i-th pixel point in the texture image to be processed, X_i represents the current pixel value of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the movement amplitude.
It should be noted that the larger the value of A is, the larger the movement amplitude of the image to be processed is; preferably, A may take the value 0.1 in this embodiment.
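For example, the grayscale formula can be applied to a whole texture image at once; a minimal NumPy sketch (the function name is illustrative, and A = 0.1 follows the preferred value just mentioned):

```python
import numpy as np

def grayscale_variation_distance(texture_gray, A=0.1):
    """Per-pixel variation distance d_i = (X_i - 128) * sqrt(0.5) * A.

    texture_gray is an (H, W) uint8 grayscale texture image; pixels
    brighter than 128 yield positive distances, darker ones negative.
    """
    x = texture_gray.astype(np.float32)
    return (x - 128.0) * np.sqrt(0.5) * A
```

With A = 0.1, a pure white pixel (X_i = 255) gives d_i = (255 - 128) * sqrt(0.5) * 0.1 ≈ 9.0, so the movement stays within a few pixels.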
As an embodiment, when the texture image to be processed is an RGB image, the variation distance corresponding to any two channels of each pixel point in the RGB image is calculated.
That is to say, if the texture image to be processed is an RGB image, the variation distances corresponding to each pixel point need to be calculated for any two channels of the RGB image, which may be the R channel and the G channel, the R channel and the B channel, or the G channel and the B channel.
As an embodiment, taking an R channel and a G channel as an example, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i^R = (X_i^R - 128) * sqrt(0.5) * A

d_i^G = (X_i^G - 128) * sqrt(0.5) * A

where d_i^R represents the variation distance corresponding to the R channel of the i-th pixel point in the texture image to be processed, X_i^R represents the current pixel value of the R channel of the i-th pixel point in the texture image to be processed, d_i^G represents the variation distance corresponding to the G channel of the i-th pixel point in the texture image to be processed, X_i^G represents the current pixel value of the G channel of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the movement amplitude.
It should be noted that the larger the value of A is, the larger the movement amplitude of the image to be processed is; preferably, A may take the value 0.1 in this embodiment. In addition, the order of the two chosen channels does not matter.
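Analogously for the RGB case, a sketch computing the two per-channel distance maps; the channel indices assume RGB channel order (OpenCV's default BGR order would need the indices adjusted), and the function name is an assumption of this example:

```python
import numpy as np

def rgb_variation_distances(texture_rgb, channels=(0, 1), A=0.1):
    """Variation distances for two chosen channels of an RGB texture image.

    channels=(0, 1) selects R and G; any two of R, G and B may be chosen,
    and as noted above their order does not matter.
    """
    x = texture_rgb.astype(np.float32)
    d_a = (x[..., channels[0]] - 128.0) * np.sqrt(0.5) * A
    d_b = (x[..., channels[1]] - 128.0) * np.sqrt(0.5) * A
    return d_a, d_b
```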
And 103, moving the pixel points at the same position in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed so as to deform the image to be processed.
That is to say, when the image to be processed is deformed, the movement distance of the i-th pixel point in the image to be processed is determined by the variation distance of the pixel point at the same position, namely the i-th pixel point, in the texture image to be processed.
As an embodiment, if the image type of the texture image to be processed is a grayscale image, after the variation distance d_i corresponding to the i-th pixel point in the texture image to be processed is calculated according to the above formula, a corresponding moving coordinate is generated from the variation distance so as to obtain the moving distance; for example, the generated moving coordinate may be one of four cases: (d_i, d_i), (-d_i, d_i), (d_i, -d_i) and (-d_i, -d_i).
The moving direction corresponding to each moving coordinate is defined in advance based on the signs of the moving coordinate; for example, (-|d_i|, |d_i|) lies in the second quadrant and moves to the upper left; (|d_i|, |d_i|) lies in the first quadrant and moves to the upper right; (-|d_i|, -|d_i|) lies in the third quadrant and moves to the lower left; and (|d_i|, -|d_i|) lies in the fourth quadrant and moves to the lower right.
As a specific embodiment, suppose the calculated variation distances corresponding to three pixel points in the texture image to be processed are d_1 = 80, d_2 = -70 and d_3 = 50. Taking the first case (d_i, d_i) of the generated moving coordinates as an example, the coordinates are (80, 80), (-70, -70) and (50, 50) respectively; the pixel point in the image to be processed corresponding to d_1 is taken as the origin and moved to the upper right according to the moving coordinate (80, 80), with a moving distance of sqrt(80^2 + 80^2) = 80 * sqrt(2) ≈ 113.1; the pixel point corresponding to d_2 is taken as the origin and moved to the lower left according to the moving coordinate (-70, -70), with a moving distance of 70 * sqrt(2) ≈ 99.0; and the pixel point corresponding to d_3 is taken as the origin and moved to the upper right according to the moving coordinate (50, 50), with a moving distance of 50 * sqrt(2) ≈ 70.7.
As a specific embodiment, again with d_1 = 80, d_2 = -70 and d_3 = 50, taking the second case (-d_i, d_i) as an example, the generated moving coordinates are (-80, 80), (70, -70) and (-50, 50) respectively; the pixel point corresponding to d_1 is taken as the origin and moved to the upper left according to the moving coordinate (-80, 80), with a moving distance of 80 * sqrt(2) ≈ 113.1; the pixel point corresponding to d_2 is taken as the origin and moved to the lower right according to the moving coordinate (70, -70), with a moving distance of 70 * sqrt(2) ≈ 99.0; and the pixel point corresponding to d_3 is taken as the origin and moved to the upper left according to the moving coordinate (-50, 50), with a moving distance of 50 * sqrt(2) ≈ 70.7.
As a specific embodiment, again with d_1 = 80, d_2 = -70 and d_3 = 50, taking the third case (d_i, -d_i) as an example, the generated moving coordinates are (80, -80), (-70, 70) and (50, -50) respectively; the pixel point corresponding to d_1 is taken as the origin and moved to the lower right according to the moving coordinate (80, -80), with a moving distance of 80 * sqrt(2) ≈ 113.1; the pixel point corresponding to d_2 is taken as the origin and moved to the upper left according to the moving coordinate (-70, 70), with a moving distance of 70 * sqrt(2) ≈ 99.0; and the pixel point corresponding to d_3 is taken as the origin and moved to the lower right according to the moving coordinate (50, -50), with a moving distance of 50 * sqrt(2) ≈ 70.7.
As a specific embodiment, again with d_1 = 80, d_2 = -70 and d_3 = 50, taking the fourth case (-d_i, -d_i) as an example, the generated moving coordinates are (-80, -80), (70, 70) and (-50, -50) respectively; the pixel point corresponding to d_1 is taken as the origin and moved to the lower left according to the moving coordinate (-80, -80), with a moving distance of 80 * sqrt(2) ≈ 113.1; the pixel point corresponding to d_2 is taken as the origin and moved to the upper right according to the moving coordinate (70, 70), with a moving distance of 70 * sqrt(2) ≈ 99.0; and the pixel point corresponding to d_3 is taken as the origin and moved to the lower left according to the moving coordinate (-50, -50), with a moving distance of 50 * sqrt(2) ≈ 70.7.
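The four moving-coordinate cases above can be condensed into a single warp. Below is a minimal Python sketch under stated assumptions: cv2.remap samples backward, meaning each output pixel fetches the input at the offset position, which is used here as a practical stand-in for the forward pixel movement the text describes; also, image arrays index y downward, so which sign combination reads as "up" or "down" on screen depends on that convention. Function and parameter names are illustrative.

```python
import cv2
import numpy as np

def warp_image(image, d, sign_x=1.0, sign_y=1.0):
    """Deform `image` using the variation-distance map `d` of the texture.

    (sign_x, sign_y) picks one of the four cases (d_i, d_i), (-d_i, d_i),
    (d_i, -d_i), (-d_i, -d_i). Bilinear interpolation fills the
    non-integer sample positions.
    """
    h, w = image.shape[:2]
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    d = d.astype(np.float32)
    map_x = xs - sign_x * d  # backward sampling approximates forward movement
    map_y = ys - sign_y * d
    return cv2.remap(image, map_x, map_y,
                     interpolation=cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_REFLECT)
```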
As an embodiment, if the image type of the texture image to be processed is an RGB image, the variation distances corresponding to any two channels of the i-th pixel point in the texture image to be processed are calculated according to the above formulas. Taking the R channel and the G channel as an example, the variation distance d_i^R corresponding to the R channel of the i-th pixel point and the variation distance d_i^G corresponding to the G channel of the i-th pixel point are calculated; corresponding moving coordinates are then generated from the variation distances so as to deform the image to be processed. For example, the generated moving coordinate may be one of four cases: (d_i^R, d_i^G), (-d_i^R, d_i^G), (d_i^R, -d_i^G) and (-d_i^R, -d_i^G).
It should be noted that the moving direction corresponding to each moving coordinate is defined in advance according to the signs of the moving coordinate; for example, (-|d_i^R|, |d_i^G|) lies in the second quadrant and moves to the upper left; (|d_i^R|, |d_i^G|) lies in the first quadrant and moves to the upper right; (-|d_i^R|, -|d_i^G|) lies in the third quadrant and moves to the lower left; and (|d_i^R|, -|d_i^G|) lies in the fourth quadrant and moves to the lower right.
The specific moving manner is the same as in the grayscale-image embodiments above and is not repeated here.
And step 104, overlapping the to-be-processed image after the image deformation and the to-be-processed texture image to output a fused image.
It should be noted that, here, the to-be-processed image after the image deformation and the to-be-processed texture image may be directly superimposed, so as to output the fused image.
As an example, for the first case (d_i, d_i) of the generated moving coordinates, an effect diagram of superposing the deformed image to be processed with the texture image to be processed is shown in FIG. 4; for the second case (-d_i, d_i), the effect diagram is shown in FIG. 5; for the third case (d_i, -d_i), it is shown in FIG. 6; and for the fourth case (-d_i, -d_i), it is shown in FIG. 7.
In summary, according to the image deformation method of the embodiment of the present invention, an image to be processed and a texture image to be processed of the same size are first acquired; each pixel point in the texture image to be processed is then processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; the pixel points at the same positions in the image to be processed are then moved according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; finally, the image to be processed after the image deformation is superposed with the texture image to be processed to output a fused image. Because the image to be processed is deformed according to the texture of the texture image to be processed before fusion, the fused image looks natural and well-fitted.
In addition, according to a specific embodiment of the present invention, before the to-be-processed image after the image deformation and the to-be-processed texture image are superimposed, a bilinear interpolation method is further adopted to process the pixel points in the to-be-processed image after the image deformation.
That is to say, after the pixel points at the same positions in the to-be-processed image have been moved according to the variation distance corresponding to each pixel point in the to-be-processed texture image, a bilinear interpolation method is also adopted to process the pixel points of the to-be-processed image after the image deformation, so that non-integer pixel positions are resampled smoothly and the deformed image does not show discontinuous pixels or noticeable gaps.
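For reference, bilinear interpolation blends the four integer-coordinate neighbours of a non-integer sample position; the cv2.remap call in the earlier sketch already applies it internally. A standalone sketch, assuming (x, y) lies within the image bounds and with an illustrative helper name:

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sample `img` at a non-integer position (x, y).

    The four surrounding pixels are weighted by their fractional
    distances, which smooths over the gaps left by moved pixels.
    """
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bottom
```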
In addition, according to a specific embodiment of the present invention, when the to-be-processed image after the image deformation is superposed with the to-be-processed texture image, a multiply (positive-film-superimposed) fusion algorithm may further be adopted for the superposition; the effect diagram after fusion is shown in FIG. 8.
In addition, according to a specific embodiment of the present invention, when the to-be-processed image after the image deformation is superposed with the to-be-processed texture image, a highlight fusion algorithm may further be adopted for the superposition; the effect diagram after fusion is shown in FIG. 9.
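Both fusion options can be sketched with standard blend-mode formulas. Multiply is the usual reading of the "positive-film-superimposed" mode; the exact "highlight" formula is not spelled out in the text, so screen blending is shown only as one plausible interpretation, not as the patent's algorithm:

```python
import numpy as np

def multiply_blend(base, overlay):
    """Multiply (positive-film-superimposed) blending: darkens overall."""
    a = base.astype(np.float32) / 255.0
    b = overlay.astype(np.float32) / 255.0
    return np.clip(a * b * 255.0, 0, 255).astype(np.uint8)

def screen_blend(base, overlay):
    """Screen blending, one plausible 'highlight' fusion: brightens overall."""
    a = base.astype(np.float32) / 255.0
    b = overlay.astype(np.float32) / 255.0
    return np.clip((1.0 - (1.0 - a) * (1.0 - b)) * 255.0, 0, 255).astype(np.uint8)
```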
In addition, according to an embodiment of the present invention, as shown in FIG. 10, when a four-channel to-be-processed text image with clear boundaries is deformed and fused by the above deformation method, the pixel values of the to-be-processed texture image are used to replace the pixel values of the first three channels of the to-be-processed text image.
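A minimal sketch of this channel-replacement step, assuming an RGBA channel layout for the four-channel text image; the function name is illustrative:

```python
import numpy as np

def replace_color_channels(text_rgba, texture_rgb):
    """Replace the first three channels of a four-channel text image with
    the texture image's pixel values, keeping the fourth (alpha) channel."""
    out = text_rgba.copy()
    out[..., :3] = texture_rgb[..., :3]
    return out
```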
In addition, to implement the above embodiments, an embodiment of the present invention provides a computer-readable storage medium on which an image deformation program is stored; when executed by a processor, the program implements the image deformation method described above.
According to the computer-readable storage medium of the embodiment of the present invention, the stored image deformation program, when executed by a processor, implements the image deformation method described above, so that the fused image looks natural and well-fitted.
In addition, to implement the above embodiments, an embodiment of the present invention provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the image deformation method described above is implemented.
According to the computer device of the embodiment of the present invention, the memory stores the image deformation program, and the image deformation program, when executed by the processor, implements the image deformation method described above, so that the fused image looks natural and well-fitted.
In addition, an embodiment of the present invention further provides an image deformation apparatus, as shown in FIG. 11, including: an acquisition module 201, a distance calculation module 202, a deformation module 203, and a superposition module 204.
The acquiring module 201 is configured to acquire an image to be processed and a texture image to be processed, which have the same size;
the distance calculation module 202 is configured to process each pixel point in the texture image to be processed according to the image type of the texture image to be processed, so as to obtain a variation distance corresponding to each pixel point in the texture image to be processed;
the deformation module 203 is configured to move the pixel points at the same position in the to-be-processed image according to the variation distance corresponding to each pixel point in the to-be-processed texture image, so as to perform image deformation on the to-be-processed image;
the superposition module 204 is configured to superpose the to-be-processed image after the image deformation and the to-be-processed texture image, so as to output a fused image.
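Purely as an illustration of how the four modules could be wired together, the following sketch reuses the hypothetical helpers from the method description above; a grayscale texture image and the multiply fusion are assumed, and none of these names come from the patent:

```python
import numpy as np

class ImageDeformationApparatus:
    """Illustrative wiring of acquisition, distance calculation,
    deformation and superposition; not the patent's reference code."""

    def run(self, image, texture_gray, A=0.1):
        image, texture_gray = match_sizes(image, texture_gray)  # acquisition module
        d = grayscale_variation_distance(texture_gray, A)       # distance calculation module
        warped = warp_image(image, d)                           # deformation module
        texture_3ch = np.repeat(texture_gray[..., None], 3, axis=2)
        return multiply_blend(warped, texture_3ch)              # superposition module
```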
In some embodiments, when the texture image to be processed is a gray scale image, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i = (X_i - 128) * sqrt(0.5) * A

where d_i represents the variation distance corresponding to the i-th pixel point in the texture image to be processed, X_i represents the current pixel value of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the movement amplitude.
In some embodiments, when the texture image to be processed is an RGB image, the variation distance corresponding to any two channels of each pixel point in the RGB image is calculated.
Further, in some embodiments, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i^R = (X_i^R - 128) * sqrt(0.5) * A

d_i^G = (X_i^G - 128) * sqrt(0.5) * A

where d_i^R represents the variation distance corresponding to the R channel of the i-th pixel point in the texture image to be processed, X_i^R represents the current pixel value of the R channel of the i-th pixel point in the texture image to be processed, d_i^G represents the variation distance corresponding to the G channel of the i-th pixel point in the texture image to be processed, X_i^G represents the current pixel value of the G channel of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the movement amplitude.
As an embodiment, before the to-be-processed image after the image deformation is superimposed with the to-be-processed texture image, a bilinear interpolation method is further adopted to process the pixel points in the to-be-processed image after the image deformation.
As an embodiment, when the to-be-processed image after the image deformation is superposed with the to-be-processed texture image, a multiply (positive-film-superimposed) fusion algorithm is further adopted to perform the superposition of the to-be-processed image after the image deformation and the to-be-processed texture image.
As an embodiment, when the to-be-processed image after the image deformation and the to-be-processed texture image are superimposed, a highlight fusion algorithm is further adopted to perform superimposition processing on the to-be-processed image after the image deformation and the to-be-processed texture image.
It should be noted that the description of the image deformation method in the foregoing embodiment is also applicable to the image deformation apparatus in this embodiment, and is not repeated herein.
In summary, according to the image deformation apparatus of the embodiment of the present invention, the acquisition module acquires an image to be processed and a texture image to be processed of the same size; the distance calculation module processes each pixel point in the texture image to be processed according to the image type of the texture image to be processed, so as to obtain the variation distance corresponding to each pixel point in the texture image to be processed; the deformation module moves the pixel points at the same positions in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed; and the superposition module superposes the image to be processed after the image deformation with the texture image to be processed to output a fused image. The image to be processed is thus deformed according to the texture of the texture image to be processed before fusion, so that the fused image looks natural and well-fitted.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second, third and so on does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, the first feature "on" or "under" the second feature may be directly contacting the first and second features or indirectly contacting the first and second features through an intermediate. Also, a first feature "on," "over," and "above" a second feature may be directly or diagonally above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may be directly under or obliquely under the first feature, or may simply mean that the first feature is at a lesser elevation than the second feature.
In the description of the specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above should not be understood to necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. An image deformation method, characterized by comprising the steps of:
acquiring an image to be processed and a texture image to be processed which have the same size;
processing each pixel point in the texture image to be processed according to the image type of the texture image to be processed so as to obtain the corresponding variation distance of each pixel point in the texture image to be processed;
moving pixel points at the same positions in the image to be processed according to the variation distance corresponding to each pixel point in the texture image to be processed, so as to perform image deformation on the image to be processed;
and superposing the to-be-processed image after the image deformation and the to-be-processed texture image to output a fusion image.
2. The image deformation method according to claim 1, wherein when the texture image to be processed is a gray-scale image, the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i = (X_i - 128) * sqrt(0.5) * A

wherein d_i represents the variation distance corresponding to the i-th pixel point in the texture image to be processed, X_i represents the current pixel value of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the movement amplitude.
3. The image deformation method according to claim 1, wherein when the texture image to be processed is an RGB image, the variation distance corresponding to any two channels of each pixel point in the RGB image is calculated.
4. The image deformation method according to claim 3, wherein the variation distance corresponding to each pixel point in the texture image to be processed is calculated according to the following formula:
d_i^R = (X_i^R - 128) * sqrt(0.5) * A

d_i^G = (X_i^G - 128) * sqrt(0.5) * A

wherein d_i^R represents the variation distance corresponding to the R channel of the i-th pixel point in the texture image to be processed, X_i^R represents the current pixel value of the R channel of the i-th pixel point in the texture image to be processed, d_i^G represents the variation distance corresponding to the G channel of the i-th pixel point in the texture image to be processed, X_i^G represents the current pixel value of the G channel of the i-th pixel point in the texture image to be processed, and A is a preset constant used to control the magnitude of the movement amplitude.
5. The image deformation method according to any one of claims 1 to 4, wherein before the image to be processed after the image deformation is superimposed on the texture image to be processed, a bilinear interpolation method is further adopted to process pixel points in the image to be processed after the image deformation.
6. The image deformation method according to any one of claims 1 to 4, wherein when the image to be processed after the image deformation is superposed with the texture image to be processed, a multiply (positive-film-superimposed) fusion algorithm is further adopted to perform the superposition of the image to be processed after the image deformation and the texture image to be processed.
7. The image deformation method according to any one of claims 1 to 4, wherein when the image to be processed after the image deformation is superimposed on the texture image to be processed, a highlight fusion algorithm is further used for superimposing the image to be processed after the image deformation and the texture image to be processed.
8. A computer-readable storage medium, having stored thereon an image deformation program which, when executed by a processor, implements the image deformation method according to any one of claims 1 to 7.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the image deformation method as claimed in any one of claims 1 to 7.
10. An image deformation apparatus, characterized by comprising:
the acquisition module is used for acquiring the image to be processed and the texture image to be processed which have the same size;
the distance calculation module is used for processing each pixel point in the texture image to be processed according to the image type of the texture image to be processed so as to obtain the corresponding change distance of each pixel point in the texture image to be processed;
the deformation module is used for moving the pixel points at the same position in the to-be-processed image according to the change distance corresponding to each pixel point in the to-be-processed texture image so as to perform image deformation on the to-be-processed image;
and the superposition module is used for superposing the image to be processed after the image deformation and the texture image to be processed so as to output a fusion image.
CN201911194886.7A 2019-11-28 2019-11-28 Image deformation method, medium, device and apparatus Active CN111062902B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911194886.7A CN111062902B (en) 2019-11-28 2019-11-28 Image deformation method, medium, device and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911194886.7A CN111062902B (en) 2019-11-28 2019-11-28 Image deformation method, medium, device and apparatus

Publications (2)

Publication Number Publication Date
CN111062902A CN111062902A (en) 2020-04-24
CN111062902B (en) 2022-07-19

Family

ID=70299321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911194886.7A Active CN111062902B (en) 2019-11-28 2019-11-28 Image deformation method, medium, device and apparatus

Country Status (1)

Country Link
CN (1) CN111062902B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112633233B (en) * 2020-12-30 2021-06-29 广州粤建三和软件股份有限公司 Engineering sample anti-counterfeiting detection method and device, computer equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105741229A (en) * 2016-02-01 2016-07-06 成都通甲优博科技有限责任公司 Method for realizing quick fusion of face image
CN107133981A (en) * 2017-03-30 2017-09-05 腾讯科技(深圳)有限公司 Image processing method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106294681B (en) * 2016-08-05 2019-11-05 腾讯科技(深圳)有限公司 The methods, devices and systems of multiple-exposure

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105741229A (en) * 2016-02-01 2016-07-06 成都通甲优博科技有限责任公司 Method for realizing quick fusion of face image
CN107133981A (en) * 2017-03-30 2017-09-05 腾讯科技(深圳)有限公司 Image processing method and device

Also Published As

Publication number Publication date
CN111062902A (en) 2020-04-24

Similar Documents

Publication Publication Date Title
US10728513B2 (en) Image processing apparatus, image processing method, and storage medium
KR102095626B1 (en) Image processing method and apparatus
JP3935500B2 (en) Motion vector calculation method and camera shake correction device, imaging device, and moving image generation device using this method
CN106447602B (en) Image splicing method and device
US20120306874A1 Method and system for single view image 3D face synthesis
CN112669429A (en) Image distortion rendering method and device
EP2383701B1 (en) Image processing method and apparatus
CN106952247B (en) Double-camera terminal and image processing method and system thereof
WO2017050083A1 (en) Element identification method and device
JP5911292B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
CN111062902B (en) Image deformation method, medium, device and apparatus
KR20150046810A (en) Method and apparatus for image processing
JP2010205095A (en) Three-dimensional object recognition device, three-dimensional object recognition program, and computer readable recording medium having program recorded therein
CN107077719A (en) Perspective correction based on depth map in digital photograph
CN111932594B (en) Billion pixel video alignment method and device based on optical flow and medium
CN111311724B (en) Shadow adding method, medium, device and apparatus for 3D characters
JP2014082678A (en) Marker embedding device, marker detection device, marker embedding method, marker detection method, and program
EP3175613B1 (en) Projection apparatus, image processing apparatus, and image processing method
CN108876704A (en) The method, apparatus and computer storage medium of facial image deformation
CN113658280B (en) Data augmentation method, device, equipment and storage medium based on artificial intelligence
JP2010152529A (en) Vertex texture mapping device and program
CN111461960B (en) Multi-layer matrix transformation method and device
GB2594249A (en) Method for creating a virtual environment reconstruction of an actual location
Pan et al. Blind watermarking of NURBS curves and surfaces
KR101609786B1 (en) A Method Providing Comparing Image Of Human Face

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant