CN112634124A - Image deformation method, image deformation device, electronic apparatus, and storage medium - Google Patents

Image deformation method, image deformation device, electronic apparatus, and storage medium

Info

Publication number: CN112634124A (application CN202011455468.1A)
Authority: CN (China)
Prior art keywords: image, pixels, boundary, scaling, taking
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202011455468.1A
Other languages: Chinese (zh)
Other versions: CN112634124B (en)
Inventors: 权甲, 赵健, 陈海波
Current Assignee: Shenlan Industrial Intelligent Innovation Research Institute Ningbo Co ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Shenlan Industrial Intelligent Innovation Research Institute Ningbo Co ltd
Application filed by Shenlan Industrial Intelligent Innovation Research Institute Ningbo Co ltd
Priority to CN202011455468.1A
Publication of CN112634124A
Application granted
Publication of CN112634124B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/18 Image warping, e.g. rearranging pixels individually

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

The embodiments of the present application relate to the field of image processing and provide an image deformation method, an image deformation device, an electronic device, and a storage medium. The image deformation method includes the following steps: receiving a first input instruction from a user; outputting a target boundary in response to the first input instruction, the target boundary being the boundary of a two-dimensional closed figure; determining an inscribed rectangle or a circumscribed rectangle of the target boundary; scaling the original image with the inscribed rectangle or the circumscribed rectangle as a bound, and outputting a primary deformed image; and scaling the primary deformed image with the target boundary as a bound, and outputting a target deformed image of the original image. The image deformation method can be applied to target boundaries of various shapes. Because the whole procedure starts from the target boundary, which directly reflects the final result, an ideal deformation effect can be ensured; the procedure is simple to operate, requires little manual intervention from the operator, and is convenient and fast.

Description

Image deformation method, image deformation device, electronic apparatus, and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image deformation method, an image deformation apparatus, an electronic device, and a storage medium.
Background
Image warping is a common means of secondary creation based on an original image. In the prior art, an original image is generally deformed as follows: a deformation mode is triggered through a target control, a number of control points specified on the image are dragged with a mouse, and the original image is adaptively deformed according to the positions of the control points.
This image deformation approach has at least two problems: (1) the operation is complex and the threshold for use is high; (2) the user can complete the deformation only by dragging control points, which offers little flexibility, requires repeated fine adjustment of details, and makes the expected effect difficult to achieve.
Disclosure of Invention
The present application provides an image deformation method that achieves rapid deformation of an image to a target shape.
The present application provides an image deformation method, which includes the following steps: receiving a first input instruction from a user; outputting a target boundary in response to the first input instruction, the target boundary being the boundary of a two-dimensional closed figure; determining an inscribed rectangle or a circumscribed rectangle of the target boundary; scaling the original image with the inscribed rectangle or the circumscribed rectangle as a bound, and outputting a primary deformed image; and scaling the primary deformed image with the target boundary as a bound, and outputting a target deformed image of the original image.
According to the image deformation method provided by the present application, scaling the primary deformed image with the target boundary as a bound and outputting the target deformed image of the original image includes the following steps:
determining, among the intersection points of the target boundary with the straight line on which a row of pixels of the primary deformed image along a first direction lies, the two first intersection points that are farthest apart;
scaling the corresponding row of pixels of the primary deformed image with the two first intersection points as bounds, and outputting a first-direction deformed image;
determining, among the intersection points of the target boundary with the straight line on which a row of pixels of the first-direction deformed image along a second direction lies, the two second intersection points that are farthest apart;
scaling the corresponding row of pixels of the first-direction deformed image with the two second intersection points as bounds, and outputting the target deformed image; wherein
the second direction is perpendicular to the first direction.
According to the image deformation method provided by the application, the scaling of the corresponding row of pixels of the primary deformation image by taking the two first intersection points as boundaries comprises the following steps: scaling the corresponding row of pixels of the primary deformed image in an equal proportion by taking the midpoint of the two first intersection points as a first boundary center;
the scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries comprises the following steps: scaling the corresponding row of pixels of the first direction deformation image in an equal proportion by taking the middle point of the two second intersection points as a second boundary center;
Alternatively,
the scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries comprises the following steps: determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the primary deformed image to the first boundary center by taking the midpoint of the two first intersection points as the first boundary center;
the scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries comprises the following steps: and determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the first direction deformation image to the second boundary center by taking the middle point of the two second intersection points as the second boundary center.
According to the image deformation method provided by the application, the scaling of the corresponding row of pixels of the primary deformation image by taking the two first intersection points as boundaries comprises the following steps: scaling the corresponding row of pixels of the primary deformed image in an equal proportion by taking the middle point of the row of pixels of the primary deformed image as the center of a first pixel;
the scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries comprises the following steps: scaling the corresponding row of pixels of the first direction deformation image in an equal proportion by taking the second pixel center of the row of pixels of the first direction deformation image as a center;
Alternatively,
the scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries comprises the following steps: determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the primary deformed image to the first pixel center by taking the middle point of the row of pixels of the primary deformed image as the first pixel center;
the scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries comprises the following steps: and determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the first direction deformation image to the center of the second pixel by taking the middle point of the row of pixels of the first direction deformation image as the center of the second pixel.
According to the image deformation method provided by the application, the determining of the inscribed rectangle or circumscribed rectangle of the target boundary comprises the following steps:
judging the concave-convex attribute of the target boundary;
determining that the target boundary is convex, outputting an inscribed rectangle of the target boundary, and displaying the inscribed rectangle on the target boundary;
or determining that the target boundary is concave, outputting a circumscribed rectangle of the target boundary, and displaying the circumscribed rectangle on the target boundary.
According to the image deformation method provided by the application, the judging the concave-convex attribute of the target boundary comprises the following steps:
scanning each row and each column of the target boundary with two mutually perpendicular straight lines;
determining the target boundary to be convex under the condition that the number of intersection points of the target boundary and each straight line is not more than 2;
or, when the number of intersections between the target boundary and at least one of the straight lines is greater than 2, determining that the target boundary is concave.
The present application also provides an image warping apparatus, including:
the receiving module is used for receiving a first input instruction of a user;
the first processing module is used for responding to the first input instruction and outputting a target boundary, wherein the target boundary is the boundary of the two-dimensional closed graph;
the second processing module is used for determining an inscribed rectangle or a circumscribed rectangle of the target boundary;
the third processing module is used for zooming the original image and outputting a primary deformed image by taking the inscribed rectangle or the circumscribed rectangle as a boundary;
and the fourth processing module is used for scaling the primary deformed image by taking the target boundary as a boundary and outputting a target deformed image of the original image.
According to the image warping apparatus provided by the present application, the fourth processing module includes:
the first determining module is used for determining two first intersection points which are farthest away from each other in intersection points of a straight line where a row of pixels of the primary deformed image along the first direction is located and the target boundary;
the first processing submodule is used for scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries and outputting a first-direction deformed image;
the second determining module is used for determining two second intersection points which are farthest away from the target boundary and are positioned on a straight line where a row of pixels of the first direction deformation image along the second direction is positioned;
the second processing submodule is used for scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries and outputting a target deformation image; wherein
The second direction is perpendicular to the first direction.
According to the image warping device provided by the application, the first processing submodule is further used for scaling the corresponding row of pixels of the primary warped image in an equal proportion by taking the midpoint of the two first intersection points as a first boundary center;
the second processing submodule is further configured to scale the corresponding row of pixels of the first-direction deformed image in an equal proportion by taking a midpoint of the two second intersection points as a second boundary center;
Alternatively,
the first processing submodule is further configured to determine, with a midpoint of the two first intersection points as a first boundary center, a scaling of the pixel based on a distance from each pixel in a corresponding row of pixels of the primary deformed image to the first boundary center;
the second processing submodule is further configured to determine, by using a midpoint of the two second intersection points as a second boundary center, a scaling of the pixel based on a distance from each pixel in a corresponding row of pixels of the first direction deformed image to the second boundary center.
According to the image warping device provided by the application, the first processing sub-module is further configured to scale the corresponding row of pixels of the primary warped image in an equal proportion by taking a midpoint of the row of pixels of the primary warped image as a first pixel center;
the second processing submodule is further used for scaling the corresponding row of pixels of the first direction deformation image by taking the second pixel center of the row of pixels of the first direction deformation image as the center;
Alternatively,
the first processing sub-module is further configured to determine, with a midpoint of a row of pixels of the primary distorted image as a first pixel center, a scaling of the pixels based on distances from each pixel in a corresponding row of pixels of the primary distorted image to the first pixel center;
the second processing sub-module is further configured to determine, with a midpoint of a row of pixels of the first direction deformation image as a second pixel center, a scaling of the pixels based on a distance from each pixel in a corresponding row of pixels of the first direction deformation image to the second pixel center.
According to the image warping device provided by the present application, the second processing module includes:
the first judgment module is used for judging the concave-convex attribute of the target boundary;
the first display module is used for determining that the target boundary is convex, outputting an inscribed rectangle of the target boundary and displaying the inscribed rectangle on the target boundary;
or the second display module is used for determining that the target boundary is concave, outputting a circumscribed rectangle of the target boundary, and displaying the circumscribed rectangle on the target boundary.
According to the image warping apparatus provided by the present application, the first determining module includes:
a scanning module for scanning each row and each column of the target boundary with two mutually perpendicular straight lines;
the first determining submodule is used for determining that the target boundary is convex under the condition that the number of intersection points of the target boundary and each straight line is not more than 2;
or, the second determining submodule is configured to determine that the target boundary is concave when the number of intersections between the target boundary and at least one of the straight lines is greater than 2.
The present application further provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the image warping method as described in any one of the above when executing the computer program.
The present application also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the image warping method as described in any one of the above.
According to the image deformation method, image deformation device, electronic device, and storage medium provided by the present application, the original image is first scaled to the inscribed or circumscribed rectangle of the target boundary and then scaled to the target boundary itself, so the method can be applied to target boundaries of various shapes. Because the whole procedure starts from the target boundary, which directly reflects the final result, an ideal deformation effect can be ensured; the procedure is simple to operate, requires little manual intervention from the operator, and is convenient and fast.
Drawings
In order to more clearly illustrate the technical solutions in the present application or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic flow chart of an image warping method provided herein;
FIG. 2 is a schematic flow chart diagram illustrating an embodiment of step 130 of the image warping method provided herein;
FIG. 3 is a schematic flow chart diagram illustrating an embodiment of step 131 in the image warping method provided herein;
FIG. 4 is a schematic diagram of step 131b in the image warping method provided in the present application;
FIG. 5 is a schematic diagram of step 131c in the image warping method provided in the present application;
FIG. 6 is a schematic flow chart diagram illustrating an embodiment of step 150 in an image warping method provided herein;
FIG. 7 is a schematic diagram of an original image to be deformed in the image deformation method provided by the present application;
FIG. 8 is a schematic diagram of a target boundary in the image morphing method provided in the present application;
FIG. 9 is a schematic diagram of an inscribed rectangle for determining a boundary of an object in the image deformation method provided by the present application;
FIG. 10 is a schematic diagram of a primary warped image in the image warping method provided in the present application;
fig. 11 is a schematic diagram of an image deformed in a first direction in the image deforming method provided in the present application;
FIG. 12 is a schematic diagram of a deformed image of an object in the image deformation method provided in the present application;
FIG. 13 is a schematic diagram of an object boundary obtained in the image deformation method provided by the present application;
FIG. 14 is a schematic diagram of an original image to be deformed in the image deformation method provided by the present application;
FIG. 15 is a schematic diagram of a deformed image of an object in the image deformation method provided by the present application;
FIG. 16 is a schematic structural diagram of an image warping device provided in the present application;
fig. 17 is a schematic structural diagram of a second processing module of the image warping apparatus provided in the present application;
fig. 18 is a schematic structural diagram of a first determining module of the image warping apparatus provided in the present application;
fig. 19 is a schematic structural diagram of a fourth processing module of the image warping apparatus provided in the present application;
fig. 20 is a schematic structural diagram of an electronic device provided in the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The image morphing method of the present application is described below with reference to fig. 1 to 15.
The image warping method may be applied to a terminal, and may be specifically, but not limited to, executed by hardware or software in the terminal.
In some embodiments, the terminal includes, but is not limited to, a desktop computer or a notebook; in other embodiments, the terminal is a portable device such as a mobile phone or a tablet computer having a touch-sensitive surface (e.g., a touch screen display and/or a touchpad).
In the following embodiments, a terminal including a display is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a touch panel, a physical keyboard, a mouse, and/or a joystick.
As shown in fig. 1, an image warping method provided in an embodiment of the present application includes: step 110-step 150.
Step 110, receiving a first input instruction of a user;
in this step, the first input instruction is used to output a target boundary.
Wherein, the first input instruction can be expressed in at least one of the following modes:
first, the first input command may be input by the user through an external input device (e.g., mouse, keyboard, electronic pen, button).
In this embodiment, the terminal is connected to an external input device such as a mouse or a keyboard, and receives a first input instruction from the user, which may be represented by receiving an input from the user clicking, pressing or dragging the mouse, or receiving an input from the user hitting the keyboard.
In a specific embodiment, the first input instruction may be represented by a user operation of clicking or dragging a target control by a mouse, and the target control is used for selecting a target two-dimensional closed graph.
In another specific embodiment, the first input instruction may include that a user selects a local or cloud target image by clicking or dragging a first target control through a mouse, the target image includes a target two-dimensional closed graph, and the first input instruction may further include that the user clicks a second target control through the mouse to identify a target boundary from the target image.
Secondly, in the case that the terminal is included to have a touch-sensitive surface, the first input instruction may be represented as a touch input, including but not limited to a click input, a slide input, a press input, and the like.
In this embodiment, receiving the first input instruction of the user may be represented by receiving a touch operation of the user in a display area of a display screen of the terminal.
In order to reduce the misoperation rate of the user, the action area of the first input instruction can be limited to a specific area, such as the upper middle area of the interface; or displaying the target control on the current interface, and touching the target control to realize the first input instruction.
Third, the first input instruction may be expressed as a voice input.
In this embodiment, the terminal may trigger the output of the target boundary upon receiving a voice command such as "take a circle with a radius of 5 cm as the target boundary".
Of course, in other embodiments, the first input command may also be expressed in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and the embodiment of the present application does not limit this.
Step 120, responding to the first input instruction, outputting a target boundary, wherein the target boundary is the boundary of the two-dimensional closed graph;
in this step, after receiving the first input instruction, the terminal may output a target boundary in response to the first input instruction, and the target boundary may be displayed on the interface.
The target boundary is the boundary of the two-dimensional closed graph. The object boundary may be a polygon, circle, ellipse, or other irregular closed figure.
For example, the target boundary shown in fig. 4 is a circle, the target boundary shown in fig. 5 is an irregular closed figure, the target boundary shown in fig. 8 is a circle, and the target boundary shown in fig. 13 is a quadrangle.
Step 130, determining an inscribed rectangle or a circumscribed rectangle of the target boundary;
with the target boundary known, an inscribed rectangle or circumscribed rectangle of the target boundary may be obtained, preferably having a length and width parallel to the lateral and longitudinal directions of the interface, respectively.
In some embodiments, this step may result in an inscribed rectangle of the target boundary, which may be displayed on the image of the target boundary.
As shown in fig. 9, the target boundary is a circle, and the inscribed rectangle is displayed as a wireframe on the target boundary.
In other embodiments, this step may yield a circumscribed rectangle of the target boundary, and the circumscribed rectangle may be displayed on the image of the target boundary.
Step 140, scaling the original image with the inscribed rectangle or the circumscribed rectangle as a bound, and outputting a primary deformed image;
As shown in fig. 10, in the case where the inscribed rectangle is output in step 130, this step scales the original image using the inscribed rectangle as the scaling limit and outputs the primary deformed image.
In actual implementation, this step specifically includes: scaling the original image along the length direction based on the ratio of the size of the inscribed rectangle in the length direction to the size of the original image in the length direction; and scaling the original image along the width direction based on the ratio of the size of the inscribed rectangle in the width direction to the size of the original image in the width direction.
In the case that the circumscribed rectangle is output in step 130, this step is used to scale the original image by using the circumscribed rectangle as a scaling limit, and output a primary deformed image.
In actual implementation, this step specifically includes: scaling the original image along the length direction based on the ratio of the size of the circumscribed rectangle in the length direction to the size of the original image in the length direction; and scaling the original image along the width direction based on the ratio of the size of the circumscribed rectangle in the width direction to the size of the original image in the width direction.
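As a minimal sketch of step 140 (an illustration only, assuming Python with OpenCV/NumPy, a rectangle given as (x0, y0, x1, y1) in interface coordinates, and a hypothetical function name scale_to_rect), the original image can be resampled so that its width and height equal those of the inscribed or circumscribed rectangle, which applies exactly the length-direction and width-direction ratios described above:

```python
import cv2  # assumed dependency; any image-resampling routine would do


def scale_to_rect(original, rect):
    """Scale `original` so that its size equals the size of `rect`.

    `rect` is an axis-aligned rectangle (x0, y0, x1, y1): either the inscribed
    or the circumscribed rectangle of the target boundary (step 130).
    Returns the primary deformed image of step 140.
    """
    x0, y0, x1, y1 = rect
    new_w, new_h = x1 - x0, y1 - y0
    # cv2.resize scales rows and columns independently, so the length-direction
    # ratio and the width-direction ratio are both applied in this single call.
    return cv2.resize(original, (new_w, new_h), interpolation=cv2.INTER_LINEAR)
```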
And 150, zooming the primary deformed image by taking the target boundary as a boundary, and outputting a target deformed image of the original image.
In this step, the scaling limit is the target boundary obtained in step 120, the object to be scaled is the primary deformed image obtained in step 140, and scaling is performed to output the target deformed image of the original image.
As shown in fig. 7, the original image is rectangular; after scaling according to the circular target boundary, the target deformed image shown in fig. 12 is obtained.
According to the image deformation method of the embodiments of the present application, the original image is first scaled to the inscribed or circumscribed rectangle of the target boundary and then scaled to the target boundary itself, so the method can be applied to target boundaries of various shapes. Because the whole procedure starts from the target boundary, which directly reflects the final result, an ideal deformation effect can be ensured; the procedure is simple to operate, requires little manual intervention from the operator, and is convenient and fast.
In some embodiments, as shown in FIG. 2, step 130, determining the inscribed rectangle or circumscribed rectangle of the target boundary, includes step 131 and step 132a, or step 131 and step 132 b.
Step 131, judging the concave-convex attribute of the target boundary;
the two-dimensional closed figure has both concave and convex types, for example, circular and triangular shapes are convex, and the irregular figure shown in fig. 5 is concave.
Step 132a, determining that the target boundary is convex, outputting an inscribed rectangle of the target boundary, and displaying the inscribed rectangle on the target boundary;
as shown in fig. 9, in the case where the target boundary is convex, an inscribed rectangle of the target boundary is output, and the inscribed rectangle is displayed on the target boundary.
And 132b, determining the target boundary to be concave, outputting a circumscribed rectangle of the target boundary, and displaying the circumscribed rectangle on the target boundary.
In the case where the target boundary is concave, a circumscribed rectangle of the target boundary is output, and the circumscribed rectangle is displayed on the target boundary.
Under the condition that the target boundary is a polygon, determining that the target boundary is a convex polygon, outputting an inscribed rectangle of the target boundary, and displaying the inscribed rectangle on the target boundary; and under the condition that the target boundary is a polygon, determining that the target boundary is a concave polygon, outputting a circumscribed rectangle of the target boundary, and displaying the circumscribed rectangle on the target boundary.
The length and width of the inscribed rectangle are parallel to the length and width of the interface, respectively, and the length and width of the circumscribed rectangle are parallel to the length and width of the interface, respectively.
Therefore, when zooming is carried out, each row of pixel points of the original image can be ensured to have two definite boundary points.
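A small sketch of the concave (circumscribed-rectangle) branch of step 130 follows, assuming the target boundary is available as a binary mask aligned with the interface; the helper name circumscribed_rect is hypothetical. Computing a maximal axis-aligned inscribed rectangle for a convex boundary is a separate search problem that the patent does not spell out, so it is not sketched here.

```python
import numpy as np


def circumscribed_rect(boundary_mask):
    """Axis-aligned circumscribed rectangle (bounding box) of the target boundary.

    `boundary_mask` is a 2-D array whose non-zero pixels trace the closed curve.
    Returns (x0, y0, x1, y1); the sides are parallel to the lateral and
    longitudinal directions of the interface, as required above.
    """
    ys, xs = np.nonzero(boundary_mask)
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```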
In actual implementation, as shown in fig. 3, the step 131 of determining the concave-convex attribute of the target boundary includes: step 131a and step 131b, or step 131a and step 131 c.
Step 131a, scanning each line and each column of the target boundary by using two mutually perpendicular straight lines;
131b, determining the target boundary to be convex under the condition that the number of the intersection points of the target boundary and each straight line is not more than 2;
as shown in fig. 4, the line-by-line scanning is performed with straight lines parallel to the lateral direction of the interface, and the intersection point of any straight line with the boundary of the object is at most 2; scanning line by using straight lines parallel to the longitudinal direction of the interface, wherein the intersection point of any straight line and the target boundary is at most 2; the target boundary is convex.
And 131c, determining that the target boundary is concave when the number of the intersection points of the target boundary and the at least one straight line is more than 2.
As shown in fig. 5, when scanning line by line with straight lines parallel to the lateral direction of the interface, the straight line shown in the figure has more than 2 intersection points with the target boundary; the target boundary is therefore concave.
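The concavity test of steps 131a to 131c can be sketched as follows (an illustration under stated assumptions: the boundary is rasterized into a binary mask, and each connected run of boundary pixels on a scan line is counted as one intersection; the names _runs and is_convex_boundary are hypothetical):

```python
import numpy as np


def _runs(line):
    """Number of connected runs of non-zero pixels in a 1-D scan line.

    Each run is counted as one intersection of the scan line with the boundary.
    """
    on = line > 0
    # A run starts wherever a non-zero pixel follows a zero pixel (or the edge).
    starts = np.logical_and(on, np.logical_not(np.concatenate(([False], on[:-1]))))
    return int(starts.sum())


def is_convex_boundary(boundary_mask):
    """Step 131: True if every horizontal and vertical scan line meets the
    target boundary at most twice, i.e. the closed figure is convex."""
    for row in boundary_mask:        # scan every row (lines parallel to the lateral direction)
        if _runs(row) > 2:
            return False             # step 131c: more than 2 intersections, concave
    for col in boundary_mask.T:      # scan every column (lines parallel to the longitudinal direction)
        if _runs(col) > 2:
            return False
    return True                      # step 131b: convex
```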
In some embodiments, step 150, scaling the primary deformed image with the target boundary as a bound and outputting the target deformed image of the original image, includes steps 151 to 154.
Step 151, determining, among the intersection points of the target boundary with the straight line on which a row of pixels of the primary deformed image along the first direction lies, the two first intersection points that are farthest apart;
Step 152, scaling the corresponding row of pixels of the primary deformed image with the two first intersection points as bounds, and outputting a first-direction deformed image;
Step 153, determining, among the intersection points of the target boundary with the straight line on which a row of pixels of the first-direction deformed image along the second direction lies, the two second intersection points that are farthest apart;
Step 154, scaling the corresponding row of pixels of the first-direction deformed image with the two second intersection points as bounds, and outputting the target deformed image; wherein the second direction is perpendicular to the first direction.
The first direction and the second direction are respectively parallel to the length and the width of the inscribed rectangle or the circumscribed rectangle.
The following describes in detail the image deformation method according to the embodiment of the present application, with reference to fig. 7 to 12, taking the first direction as a vertical direction and the second direction as a horizontal direction as an example.
Fig. 7 shows an original image to be deformed.
As shown in fig. 8, in response to a first input instruction from the user, a target boundary image is output, the target boundary image including a target boundary, the target boundary being a circle.
As shown in fig. 9, the inscribed rectangle of the object boundary is determined and displayed at the corresponding position of the object boundary image.
As shown in fig. 10, the original image is scaled with the inscribed rectangle as a bound, and the primary deformed image is output.
As shown in fig. 11, two first intersection points which are farthest from each other among the intersection points of the straight line where any column of pixels of the primary deformed image is located in the longitudinal direction and the target boundary are determined; and scaling the corresponding column of pixels of the primary deformed image by taking the two first intersection points as boundaries, and executing the operation on each column of pixels to output the first-direction deformed image. In other words, the image scaling algorithm is used to scale each column in the longitudinal direction of the primary deformed image to the boundary of the target boundary.
As shown in fig. 12, two second intersection points, which are farthest from the target boundary, of the straight line where any row of pixels of the first direction deformation image is located in the transverse direction are determined; and scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries, and outputting the target deformation image. In other words, the image scaling algorithm is used to scale each line of the first direction warped image to the boundary of the target boundary.
Of course, the order of the row and column operations may be reversed.
It should be noted that, for a target boundary with a concave property, the straight line on which a row of pixels of the primary deformed image or the first-direction deformed image lies (in the transverse or longitudinal direction) may intersect the target boundary at more than two points; in this case, the two intersection points that are farthest apart are used.
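Steps 151 to 154 can be combined into the following two-pass sketch (again only an illustration under stated assumptions: a grayscale primary deformed image already placed on a canvas of the same size as a binary boundary mask, and a hypothetical 1-D resampler scale_line(pixels, lo, hi) that returns a full-length line whose content has been squeezed or stretched into the index range [lo, hi]; concrete forms of scale_line for two of the scaling modes are sketched further below):

```python
import numpy as np


def warp_to_boundary(primary, boundary_mask, scale_line):
    """Scale `primary` column by column, then row by row, into the target boundary.

    For every column (first direction) the two boundary intersections that are
    farthest apart are found and the column is resampled between them
    (steps 151-152); the same is then done for every row of the intermediate
    result (steps 153-154).
    """
    first_dir = np.zeros_like(primary)

    # First direction (longitudinal): one pass over the columns.
    for x in range(primary.shape[1]):
        ys = np.flatnonzero(boundary_mask[:, x])
        if ys.size >= 2:
            lo, hi = ys.min(), ys.max()          # the two intersections farthest apart
            first_dir[:, x] = scale_line(primary[:, x], lo, hi)

    # Second direction (transverse): one pass over the rows of the result.
    target = np.zeros_like(first_dir)
    for y in range(first_dir.shape[0]):
        xs = np.flatnonzero(boundary_mask[y, :])
        if xs.size >= 2:
            lo, hi = xs.min(), xs.max()
            target[y, :] = scale_line(first_dir[y, :], lo, hi)
    return target
```

Because lo and hi are simply the minimum and maximum intersection indices on each scan line, the sketch automatically uses the two farthest-apart intersections for concave boundaries as well.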
There are at least the following four scaling modes for steps 151 to 154:
First mode: the midpoint of the two intersection points of the target boundary with the straight line is taken as the boundary center, and scaling is performed in equal proportion.
Step 152, scaling the corresponding row of pixels of the primary deformed image with the two first intersection points as bounds, includes: scaling the corresponding row of pixels of the primary deformed image in equal proportion with the midpoint of the two first intersection points as the first boundary center.
In other words, each pixel or group of pixels is scaled in a direction away from or towards the centre of the first boundary on the line on which the row of pixels lies, according to the same scaling.
For example, if a row of pixels of a primary deformed image includes 100 pixels and there are 200 pixels between two first intersections, each pixel is copied once in a direction away from the center of the first boundary with the center of the first boundary being the center (the center point is not moved).
Step 154, scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries, including: scaling the corresponding row of pixels of the first-direction deformation image in an equal proportion by taking the midpoint of the two second intersection points as a second boundary center;
in other words, each pixel or group of pixels is scaled in a direction away from or towards the centre of the second boundary on the line on which the row of pixels lies, at the same scale.
For example, if a row of pixels of the first-direction deformed image includes 100 pixels and only 50 pixels lie between the two second intersection points, then, with the second boundary center as the center (the center point is not moved), every two adjacent pixels are replaced by their average toward the center.
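A minimal sketch of the first mode, written as a 1-D resampler compatible with the warp_to_boundary sketch above (the function name and the use of linear interpolation are assumptions, not the patent's reference implementation): the entire row is mapped onto [lo, hi] with a single scale factor, so the content ends up centered on the midpoint of the two intersection points. The third mode differs only in that the scaled content would be centered on the midpoint of the pixel row itself rather than on (lo + hi) / 2.

```python
import numpy as np


def scale_line_equal(pixels, lo, hi):
    """Mode one: equal-proportion 1-D scaling about the boundary center.

    The whole line of `pixels` is resampled so that it exactly fills the index
    range [lo, hi]; the midpoint of the two intersection points is therefore
    the center of the scaled content.  Positions outside [lo, hi] stay empty.
    """
    n = pixels.shape[0]
    out = np.zeros_like(pixels)
    span = hi - lo
    if span <= 0:
        return out
    # Every destination index in [lo, hi] maps back to a source index in [0, n-1]
    # with the same scale factor for every pixel.
    src = np.linspace(0.0, n - 1.0, span + 1)
    out[lo:hi + 1] = np.interp(src, np.arange(n), pixels.astype(float)).astype(pixels.dtype)
    return out
```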
Second mode: the midpoint of the two intersection points of the target boundary with the straight line is taken as the boundary center, and scaling is performed in unequal proportion.
Step 152, scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries, includes: and determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the primary deformed image to the first boundary center by taking the midpoint of the two first intersection points as the first boundary center.
In other words, each pixel or group of pixels determines the scaling according to the distance between itself and the center of the first boundary, the distance and the scaling are positively or negatively correlated, and the scaling is performed on the straight line where the row of pixels is located in the direction away from or close to the center of the first boundary.
Step 154, scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries, including: and determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the first direction deformation image to the second boundary center by taking the middle point of the two second intersection points as the second boundary center.
In other words, each pixel or each group of pixels determines the scaling according to the distance between the pixel or each group of pixels and the center of the second boundary, the distance is in positive correlation or negative correlation with the scaling, and the scaling is carried out on the straight line where the row of pixels is located in the direction away from or close to the center of the second boundary.
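The second mode can be sketched, for illustration only, by warping the normalized distance of each destination pixel from the boundary center before resampling; the patent does not fix the correlation function, so the power-law curve and the gamma parameter below are assumptions:

```python
import numpy as np


def scale_line_radial(pixels, lo, hi, gamma=1.5):
    """Mode two: non-uniform 1-D scaling about the boundary center.

    Each destination pixel in [lo, hi] is mapped back to the source line through
    a power-law curve of its normalized distance to the midpoint, so the local
    scale factor varies with that distance.  With gamma > 1, content near the
    center is magnified and content near the edges is compressed; gamma < 1
    does the opposite.  gamma is an illustrative, not a prescribed, parameter.
    """
    n = pixels.shape[0]
    out = np.zeros_like(pixels)
    span = hi - lo
    if span <= 0:
        return out
    dst = np.arange(lo, hi + 1, dtype=float)
    center = (lo + hi) / 2.0
    t = (dst - center) / (span / 2.0)            # signed, normalized distance in [-1, 1]
    warped = np.sign(t) * np.abs(t) ** gamma     # distance-dependent remapping
    src = (warped + 1.0) / 2.0 * (n - 1)         # back onto the source index range
    out[lo:hi + 1] = np.interp(src, np.arange(n), pixels.astype(float)).astype(pixels.dtype)
    return out
```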
Third mode: the midpoint of the row of pixels of the image to be scaled is taken as the pixel center, and scaling is performed in equal proportion.
Step 152, scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries, includes: and scaling the corresponding row of pixels of the primary deformed image in equal proportion by taking the middle point of the row of pixels of the primary deformed image as the center of the first pixel.
In other words, each pixel or group of pixels is scaled in a direction away from or towards the centre of the first boundary on the line on which the row of pixels lies, according to the same scaling.
For example, if a row of pixels of a primary deformed image includes 100 pixels and there are 200 pixels between two first intersections, each pixel is copied once in a direction away from the center of the first pixel with the center of the first pixel as the center (center point is not moved).
Step 154, scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries, including: and scaling the corresponding row of pixels of the first direction deformation image by taking the second pixel center of the row of pixels of the first direction deformation image as a center.
In other words, each pixel or group of pixels is scaled in a direction away from or towards the centre of the second pixel on the line on which the row of pixels lies, according to the same scaling.
For example, if a row of pixels of the first-direction deformed image includes 100 pixels and only 50 pixels lie between the two second intersection points, then, with the second pixel center as the center (the center point is not moved), every two adjacent pixels are replaced by their average toward the center.
Fourth mode: the midpoint of the row of pixels of the image to be scaled is taken as the pixel center, and scaling is performed in unequal proportion.
Step 152, scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries, includes: and determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the primary deformed image to the first pixel center by taking the middle point of the row of pixels of the primary deformed image as the first pixel center.
In other words, each pixel or each group of pixels determines the scaling according to the distance between the pixel or each group of pixels and the center of the first pixel, the distance is in positive correlation or negative correlation with the scaling, and the scaling is carried out on the straight line where the row of pixels is located in the direction away from or close to the center of the first pixel.
Step 154, scaling the corresponding row of pixels of the first direction deformation image by taking the two second intersection points as boundaries, including: and determining the scaling of the pixels based on the distance from each pixel in the corresponding row of pixels of the first direction deformation image to the center of the second pixel by taking the middle point of the row of pixels of the first direction deformation image as the center of the second pixel.
In other words, each pixel or each group of pixels determines the scaling according to the distance between the pixel or each group of pixels and the center of the second pixel, the distance is in positive correlation or negative correlation with the scaling, and the scaling is carried out on the straight line where the row of pixels is located in the direction away from or close to the center of the second pixel.
These four scaling modes achieve different scaling effects, and the user can set the desired mode in advance as needed.
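Tying the sketches above together, a hypothetical end-to-end usage for the concave (circumscribed-rectangle) case might look as follows; the placement of the primary deformed image on a canvas aligned with the boundary mask is only one possible convention, since the patent does not prescribe a coordinate frame:

```python
import numpy as np

# Hypothetical usage of the sketches above (grayscale `original` image and a
# binary `boundary_mask` the size of the interface/canvas are assumed given).
rect = circumscribed_rect(boundary_mask)                 # step 130, concave case
x0, y0, x1, y1 = rect
primary = scale_to_rect(original, rect)                  # step 140
# Place the primary deformed image on a canvas aligned with the boundary mask
# so that step 150 and the mask share one coordinate frame.
canvas = np.zeros_like(boundary_mask, dtype=original.dtype)
canvas[y0:y1, x0:x1] = primary
target = warp_to_boundary(canvas, boundary_mask, scale_line_equal)  # step 150, mode one
```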
Fig. 13 shows that the first input instruction includes an operation of selecting a local or cloud target image, and an operation of clicking a second target control by a mouse by a user to identify a target boundary from the target image. The object boundaries in fig. 13 are rectangular boundaries in the photograph.
Fig. 14 is an original image to be deformed.
Fig. 15 is a target deformed image obtained by deforming an original image and filling the deformed image into the rectangular boundary in fig. 13.
The following describes an image warping apparatus provided in the present application, and the image warping apparatus described below and the image warping method described above may be referred to in correspondence with each other.
As shown in fig. 16, an image warping apparatus according to an embodiment of the present application includes: a receiving module 1610, a first processing module 1620, a second processing module 1630, a third processing module 1640, and a fourth processing module 1650.
A receiving module 1610, configured to receive a first input instruction of a user;
a first processing module 1620, configured to output a target boundary in response to the first input instruction, where the target boundary is a boundary of a two-dimensional closed graph;
a second processing module 1630, configured to determine an inscribed rectangle or a circumscribed rectangle of the target boundary;
the third processing module 1640 is configured to scale the original image and output a primary deformed image by taking the inscribed rectangle or the circumscribed rectangle as a boundary;
the fourth processing module 1650 is configured to scale the primary deformed image with the target boundary as a boundary, and output a target deformed image of the original image.
According to the image deformation device of the embodiments of the present application, the original image is first scaled to the inscribed or circumscribed rectangle of the target boundary and then scaled to the target boundary itself, so the device can be applied to target boundaries of various shapes. Because the whole procedure starts from the target boundary, which directly reflects the final result, an ideal deformation effect can be ensured; the procedure is simple to operate, requires little manual intervention from the operator, and is convenient and fast.
In some embodiments, as shown in fig. 17, the second processing module 1630 may include: a first judging module 1631 and a first display module 1632, or a first judging module 1631 and a second display module 1633.
A first judging module 1631, configured to judge a concave-convex attribute of the target boundary;
a first display module 1632, configured to determine that the target boundary is convex, output an inscribed rectangle of the target boundary, and display the inscribed rectangle on the target boundary;
the second display module 1633 is configured to determine that the target boundary is concave, output a circumscribed rectangle of the target boundary, and display the circumscribed rectangle on the target boundary.
In some embodiments, as shown in fig. 18, the first determining module 1631 may include: a scan module 1631a and a first determination sub-module 1631b, or a scan module 1631a and a second determination sub-module 1631 c.
A scanning module 1631a, configured to scan each row and each column of the target boundary with two mutually perpendicular straight lines;
the first determining submodule 1631b is configured to determine that the target boundary is convex when the number of intersections between the target boundary and each straight line is not greater than 2;
the second determining submodule 1631c is configured to determine that the target boundary is concave if the number of intersections of the target boundary and the at least one straight line is greater than 2.
In some embodiments, as shown in fig. 19, the fourth processing module 1650 may comprise: a first determination module 1651, a first processing sub-module 1652, a second determination module 1653, and a second processing sub-module 1654.
A first determining module 1651, configured to determine two first intersection points which are farthest from each other among intersection points of a straight line where a line of pixels of the primary deformed image along the first direction is located and the boundary of the object;
a first processing sub-module 1652, configured to scale the corresponding row of pixels of the primary warped image by taking two first intersection points as boundaries, and output a first-direction warped image;
a second determining module 1653, configured to determine two second intersection points, which are farthest from the boundary of the object, of straight lines where a row of pixels of the first-direction deformed image along the second direction is located;
a second processing sub-module 1654, configured to scale the corresponding row of pixels of the first-direction deformed image with two second intersection points as boundaries, and output a target deformed image; wherein
The second direction is perpendicular to the first direction.
In some embodiments, the first processing sub-module 1652 may be further configured to scale the corresponding row of pixels of the primary warped image by taking a midpoint of the two first intersections as a first boundary center;
the second processing sub-module 1654 may be further configured to scale the corresponding row of pixels of the first direction warped image equally with a midpoint of the two second intersections as a second boundary center.
In some embodiments, the first processing sub-module 1652 may be further configured to determine a scaling of the pixels based on distances from the first boundary center to each of the pixels in the corresponding row of pixels of the primary transformed image, with a midpoint of the two first intersections as the first boundary center;
the second processing sub-module 1654 may be further configured to determine a scaling of the pixels based on distances from each pixel in the corresponding row of pixels of the first direction warped image to the second boundary center with a midpoint of the two second intersections as the second boundary center.
In some embodiments, the first processing sub-module 1652 may be further configured to scale the corresponding row of pixels of the primary warped image by taking a midpoint of the row of pixels of the primary warped image as a first pixel center;
the second processing sub-module 1654 may also be configured to scale the corresponding line of pixels of the first-direction warped image equally, centered at a second pixel center of the line of pixels of the first-direction warped image.
In some embodiments, the first processing sub-module 1652 may be further configured to determine a scaling of the pixels based on distances from respective pixels in a corresponding line of pixels of the primary warped image to a first pixel center, with a midpoint of the line of pixels of the primary warped image as the first pixel center;
the second processing sub-module 1654 may be further configured to determine a scaling of the pixels based on distances from respective pixels in a corresponding line of pixels of the first-direction warped image to a center of the second pixel, with a midpoint of the line of pixels of the first-direction warped image as the center of the second pixel.
The image warping device provided in the embodiment of the present application is used for executing the image warping method, and the specific implementation manner thereof is consistent with the method implementation manner, and the same beneficial effects can be achieved, and details are not repeated here.
Fig. 20 illustrates a physical structure diagram of an electronic device, and as shown in fig. 20, the electronic device may include: a processor (processor)2010, a communication Interface (Communications Interface)2020, a memory (memory)2030 and a communication bus 2040, wherein the processor 2010, the communication Interface 2020 and the memory 2030 communicate with each other via the communication bus 2040. Processor 2010 may invoke logic instructions in memory 2030 to perform a method of image warping, the method comprising: receiving a first input instruction of a user; responding to a first input instruction, and outputting a target boundary, wherein the target boundary is the boundary of the two-dimensional closed graph; determining an inscribed rectangle or a circumscribed rectangle of the target boundary; scaling the original image by taking the inscribed rectangle or the circumscribed rectangle as a boundary, and outputting a primary deformed image; and zooming the primary deformed image by taking the target boundary as a boundary, and outputting a target deformed image of the original image.
Furthermore, the logic instructions in the memory 2030 may be implemented in software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The processor 2010 in the electronic device provided in the embodiment of the present application may call a logic instruction in the memory 2030 to implement the image deformation method, and a specific implementation manner of the method is consistent with that of the method and may achieve the same beneficial effects, which is not described herein again.
On the other hand, the present application further provides a computer program product, which is described below, and the computer program product described below and the image morphing method described above may be referred to correspondingly.
The computer program product comprises a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the image warping method provided by the above methods, the method comprising: receiving a first input instruction of a user; responding to a first input instruction, and outputting a target boundary, wherein the target boundary is the boundary of the two-dimensional closed graph; determining an inscribed rectangle or a circumscribed rectangle of the target boundary; scaling the original image by taking the inscribed rectangle or the circumscribed rectangle as a boundary, and outputting a primary deformed image; and zooming the primary deformed image by taking the target boundary as a boundary, and outputting a target deformed image of the original image.
When the computer program product provided in the embodiment of the present application is executed, the image deformation method is implemented, and the specific implementation manner is consistent with the method implementation manner, and the same beneficial effects can be achieved, which is not described herein again.
In yet another aspect, the present application further provides a non-transitory computer-readable storage medium, which is described below, and the non-transitory computer-readable storage medium described below and the image morphing method described above are referred to in correspondence with each other.
The present application also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the image deformation method provided above, the method comprising: receiving a first input instruction of a user; in response to the first input instruction, outputting a target boundary, wherein the target boundary is the boundary of a two-dimensional closed figure; determining an inscribed rectangle or a circumscribed rectangle of the target boundary; scaling the original image by taking the inscribed rectangle or the circumscribed rectangle as a boundary, and outputting a primary deformed image; and scaling the primary deformed image by taking the target boundary as a boundary, and outputting a target deformed image of the original image.
When the computer program stored on the non-transitory computer-readable storage medium provided in the embodiment of the present application is executed, the image deformation method is implemented. The specific implementation is consistent with the method embodiment and achieves the same beneficial effects, and is not described again here.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (14)

1. An image deformation method, comprising:
receiving a first input instruction of a user;
in response to the first input instruction, outputting a target boundary, wherein the target boundary is the boundary of a two-dimensional closed figure;
determining an inscribed rectangle or a circumscribed rectangle of the target boundary;
scaling an original image by taking the inscribed rectangle or the circumscribed rectangle as a boundary, and outputting a primary deformed image;
and scaling the primary deformed image by taking the target boundary as a boundary, and outputting a target deformed image of the original image.
2. The image deformation method according to claim 1, wherein the scaling the primary deformed image by taking the target boundary as a boundary and outputting a target deformed image of the original image comprises:
determining, among the intersection points of the target boundary and the straight line on which a row of pixels of the primary deformed image along a first direction lies, the two first intersection points that are farthest from each other;
scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries, and outputting a first-direction deformed image;
determining, among the intersection points of the target boundary and the straight line on which a row of pixels of the first-direction deformed image along a second direction lies, the two second intersection points that are farthest from each other;
scaling the corresponding row of pixels of the first-direction deformed image by taking the two second intersection points as boundaries, and outputting the target deformed image; wherein
the second direction is perpendicular to the first direction.
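A minimal sketch of the first-direction pass of claim 2, assuming the closed figure bounded by the target boundary has been rasterised into a filled binary mask of the same size as the primary deformed image; the equal-proportion per-row resampling (detailed in claim 3 below) is used, and an analogous pass over the columns of the result would yield the target deformed image. The function name scale_rows_to_boundary is illustrative only.

import numpy as np
import cv2

def scale_rows_to_boundary(primary: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """First-direction pass: fit each image row between the two farthest
    intersections of that row with the target boundary."""
    out = np.zeros_like(primary)
    for y in range(mask.shape[0]):
        cols = np.flatnonzero(mask[y])             # pixels of this row inside the figure
        if cols.size < 2:
            continue                               # this row does not cross the figure
        left, right = int(cols[0]), int(cols[-1])  # the two first intersection points
        # Equal-proportion resampling of the whole row into [left, right].
        row = primary[y:y + 1, :]
        out[y:y + 1, left:right + 1] = cv2.resize(row, (right - left + 1, 1))
    return out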
3. The image deformation method according to claim 2, wherein
the scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries comprises: scaling the corresponding row of pixels of the primary deformed image in equal proportion by taking the midpoint of the two first intersection points as a first boundary center;
and the scaling the corresponding row of pixels of the first-direction deformed image by taking the two second intersection points as boundaries comprises: scaling the corresponding row of pixels of the first-direction deformed image in equal proportion by taking the midpoint of the two second intersection points as a second boundary center;
or,
the scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries comprises: taking the midpoint of the two first intersection points as the first boundary center, and determining the scaling of each pixel in the corresponding row of pixels of the primary deformed image based on the distance from that pixel to the first boundary center;
and the scaling the corresponding row of pixels of the first-direction deformed image by taking the two second intersection points as boundaries comprises: taking the midpoint of the two second intersection points as the second boundary center, and determining the scaling of each pixel in the corresponding row of pixels of the first-direction deformed image based on the distance from that pixel to the second boundary center.
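One possible reading of the per-row coordinate mapping of claim 3 is sketched below, assuming the two first intersection points of a row sit at columns left and right. The equal-proportion variant is shown; the distance-based variant would replace the single ratio with a per-pixel ratio derived from the distance to the boundary center, and since the claim does not fix that formula, none is asserted here.

import numpy as np

def destination_columns(width: int, left: int, right: int) -> np.ndarray:
    """Destination column for every source column of a row, equal-proportion
    variant: one ratio for the whole row, centred on the midpoint of the two
    first intersection points (the first boundary center)."""
    centre = (left + right) / 2.0
    scale = (right - left) / float(width - 1)      # single ratio for the whole row
    src = np.arange(width, dtype=np.float64)
    # The distance-based variant would make `scale` a function of |src - centre|.
    return centre + (src - (width - 1) / 2.0) * scale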
4. The image deformation method according to claim 2, wherein
the scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries comprises: scaling the corresponding row of pixels of the primary deformed image in equal proportion by taking the midpoint of the row of pixels of the primary deformed image as a first pixel center;
and the scaling the corresponding row of pixels of the first-direction deformed image by taking the two second intersection points as boundaries comprises: scaling the corresponding row of pixels of the first-direction deformed image in equal proportion by taking the midpoint of the row of pixels of the first-direction deformed image as a second pixel center;
or,
the scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries comprises: taking the midpoint of the row of pixels of the primary deformed image as the first pixel center, and determining the scaling of each pixel in the corresponding row of pixels of the primary deformed image based on the distance from that pixel to the first pixel center;
and the scaling the corresponding row of pixels of the first-direction deformed image by taking the two second intersection points as boundaries comprises: taking the midpoint of the row of pixels of the first-direction deformed image as the second pixel center, and determining the scaling of each pixel in the corresponding row of pixels of the first-direction deformed image based on the distance from that pixel to the second pixel center.
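For comparison with claim 3, a brief sketch of the claim 4 variant under the same assumptions: only the scaling center changes, from the midpoint of the two intersection points to the midpoint of the pixel row itself.

import numpy as np

def destination_columns_pixel_centre(width: int, left: int, right: int) -> np.ndarray:
    """Claim 4 variant: same single ratio, but the scaling is centred on the
    midpoint of the pixel row itself (the first pixel center)."""
    pixel_centre = (width - 1) / 2.0
    scale = (right - left) / float(width - 1)
    src = np.arange(width, dtype=np.float64)
    # The distance-based variant would again derive a per-pixel ratio from
    # |src - pixel_centre| instead of using the single `scale`.
    return pixel_centre + (src - pixel_centre) * scale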
5. The image deformation method according to any one of claims 1-4, wherein the determining an inscribed rectangle or a circumscribed rectangle of the target boundary comprises:
determining whether the target boundary is convex or concave;
upon determining that the target boundary is convex, outputting an inscribed rectangle of the target boundary and displaying the inscribed rectangle on the target boundary;
or, upon determining that the target boundary is concave, outputting a circumscribed rectangle of the target boundary and displaying the circumscribed rectangle on the target boundary.
6. The image deformation method according to claim 5, wherein the determining whether the target boundary is convex or concave comprises:
scanning each row and each column of the target boundary with two mutually perpendicular straight lines;
determining that the target boundary is convex in the case that the number of intersection points between the target boundary and each straight line is not greater than 2;
or, determining that the target boundary is concave in the case that the number of intersection points between the target boundary and at least one of the straight lines is greater than 2.
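A minimal sketch of the scanline test of claims 5 and 6, assuming the closed figure bounded by the target boundary has been rasterised into a filled binary mask: a horizontal or vertical line meets the boundary more than twice exactly when the pixels inside the figure form more than one contiguous run along that line. Note that this is the criterion stated in claim 6, which is weaker than strict geometric convexity.

import numpy as np

def foreground_runs(line: np.ndarray) -> int:
    """Number of contiguous runs of foreground pixels along a 1-D scanline."""
    line = (line > 0).astype(np.int8)
    return int(np.count_nonzero(np.diff(line, prepend=0) == 1))

def is_convex(mask: np.ndarray) -> bool:
    """At most 2 boundary crossings per row and per column, i.e. at most one
    foreground run along every scanline."""
    rows_ok = all(foreground_runs(mask[y, :]) <= 1 for y in range(mask.shape[0]))
    cols_ok = all(foreground_runs(mask[:, x]) <= 1 for x in range(mask.shape[1]))
    return rows_ok and cols_ok

Claim 5 then selects the inscribed rectangle for a convex target boundary and the circumscribed rectangle for a concave one.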
7. An image deformation apparatus, comprising:
a receiving module, for receiving a first input instruction of a user;
a first processing module, for outputting a target boundary in response to the first input instruction, wherein the target boundary is the boundary of a two-dimensional closed figure;
a second processing module, for determining an inscribed rectangle or a circumscribed rectangle of the target boundary;
a third processing module, for scaling an original image by taking the inscribed rectangle or the circumscribed rectangle as a boundary, and outputting a primary deformed image;
and a fourth processing module, for scaling the primary deformed image by taking the target boundary as a boundary, and outputting a target deformed image of the original image.
8. The image deformation apparatus according to claim 7, wherein the fourth processing module comprises:
a first determining module, for determining, among the intersection points of the target boundary and the straight line on which a row of pixels of the primary deformed image along a first direction lies, the two first intersection points that are farthest from each other;
a first processing submodule, for scaling the corresponding row of pixels of the primary deformed image by taking the two first intersection points as boundaries, and outputting a first-direction deformed image;
a second determining module, for determining, among the intersection points of the target boundary and the straight line on which a row of pixels of the first-direction deformed image along a second direction lies, the two second intersection points that are farthest from each other;
and a second processing submodule, for scaling the corresponding row of pixels of the first-direction deformed image by taking the two second intersection points as boundaries, and outputting the target deformed image; wherein
the second direction is perpendicular to the first direction.
9. The image deformation apparatus according to claim 8, wherein
the first processing submodule is further configured to scale the corresponding row of pixels of the primary deformed image in equal proportion by taking the midpoint of the two first intersection points as a first boundary center,
and the second processing submodule is further configured to scale the corresponding row of pixels of the first-direction deformed image in equal proportion by taking the midpoint of the two second intersection points as a second boundary center;
or,
the first processing submodule is further configured to take the midpoint of the two first intersection points as the first boundary center and determine the scaling of each pixel in the corresponding row of pixels of the primary deformed image based on the distance from that pixel to the first boundary center,
and the second processing submodule is further configured to take the midpoint of the two second intersection points as the second boundary center and determine the scaling of each pixel in the corresponding row of pixels of the first-direction deformed image based on the distance from that pixel to the second boundary center.
10. The image deformation apparatus according to claim 8, wherein
the first processing submodule is further configured to scale the corresponding row of pixels of the primary deformed image in equal proportion by taking the midpoint of the row of pixels of the primary deformed image as a first pixel center,
and the second processing submodule is further configured to scale the corresponding row of pixels of the first-direction deformed image in equal proportion by taking the midpoint of the row of pixels of the first-direction deformed image as a second pixel center;
or,
the first processing submodule is further configured to take the midpoint of the row of pixels of the primary deformed image as the first pixel center and determine the scaling of each pixel in the corresponding row of pixels of the primary deformed image based on the distance from that pixel to the first pixel center,
and the second processing submodule is further configured to take the midpoint of the row of pixels of the first-direction deformed image as the second pixel center and determine the scaling of each pixel in the corresponding row of pixels of the first-direction deformed image based on the distance from that pixel to the second pixel center.
11. The image deformation apparatus according to any one of claims 7-10, wherein the second processing module comprises:
a first judgment module, for determining whether the target boundary is convex or concave;
a first display module, for outputting an inscribed rectangle of the target boundary and displaying the inscribed rectangle on the target boundary in the case that the target boundary is determined to be convex;
or a second display module, for outputting a circumscribed rectangle of the target boundary and displaying the circumscribed rectangle on the target boundary in the case that the target boundary is determined to be concave.
12. The image deformation apparatus according to claim 11, wherein the first judgment module comprises:
a scanning module, for scanning each row and each column of the target boundary with two mutually perpendicular straight lines;
a first determining submodule, for determining that the target boundary is convex in the case that the number of intersection points between the target boundary and each straight line is not greater than 2;
or a second determining submodule, for determining that the target boundary is concave in the case that the number of intersection points between the target boundary and at least one of the straight lines is greater than 2.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the image deformation method according to any one of claims 1-6 are implemented when the program is executed by the processor.
14. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the image deformation method according to any one of claims 1-6.
CN202011455468.1A 2020-12-10 2020-12-10 Image warping method, image warping device, electronic apparatus, and storage medium Active CN112634124B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011455468.1A CN112634124B (en) 2020-12-10 2020-12-10 Image warping method, image warping device, electronic apparatus, and storage medium

Publications (2)

Publication Number Publication Date
CN112634124A (en) 2021-04-09
CN112634124B CN112634124B (en) 2024-04-12

Family

ID=75309926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011455468.1A Active CN112634124B (en) 2020-12-10 2020-12-10 Image warping method, image warping device, electronic apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN112634124B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004184738A (en) * 2002-12-04 2004-07-02 Sony Corp Image scaling device, image scaling method, program and recording medium
CN101490708A (en) * 2006-09-13 2009-07-22 索尼株式会社 Image processing device, image processing method, and program
CN106408508A (en) * 2015-07-30 2017-02-15 腾讯科技(深圳)有限公司 Image deformation processing method and apparatus
CN110443751A (en) * 2019-07-10 2019-11-12 广东智媒云图科技股份有限公司 Image distortion method, device, equipment and storage medium based on painting line
CN110599564A (en) * 2019-09-19 2019-12-20 浙江大搜车软件技术有限公司 Image display method and device, computer equipment and storage medium
CN110852932A (en) * 2018-08-21 2020-02-28 北京市商汤科技开发有限公司 Image processing method and apparatus, image device, and storage medium
CN111028686A (en) * 2019-12-13 2020-04-17 维沃移动通信有限公司 Image processing method, image processing apparatus, electronic device, and medium
CN111078090A (en) * 2019-11-29 2020-04-28 上海联影医疗科技有限公司 Display method, device, equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105205826B (en) * 2015-10-15 2017-10-27 电子科技大学 A kind of SAR image azimuth of target method of estimation screened based on direction straight line
CN108305218B (en) * 2017-12-29 2022-09-06 浙江水科文化集团有限公司 Panoramic image processing method, terminal and computer readable storage medium
CN109741394B (en) * 2018-12-10 2021-02-26 北京拓尔思信息技术股份有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN111935397B (en) * 2020-07-07 2022-04-22 北京迈格威科技有限公司 Image processing method and device, electronic equipment and computer readable medium

Also Published As

Publication number Publication date
CN112634124B (en) 2024-04-12

Similar Documents

Publication Publication Date Title
US10430075B2 (en) Image processing for introducing blurring effects to an image
CN112967381B (en) Three-dimensional reconstruction method, apparatus and medium
CN112991178B (en) Image splicing method, device, equipment and medium
CN112288665A (en) Image fusion method and device, storage medium and electronic equipment
US9679353B2 (en) Plan display device that displays enlarged/reduced image of original image with indication and plan display program for displaying same
US10831338B2 (en) Hiding regions of a shared document displayed on a screen
CN113703630A (en) Interface display method and device
CN112634124A (en) Image deformation method, image deformation device, electronic apparatus, and storage medium
CN109766530B (en) Method and device for generating chart frame, storage medium and electronic equipment
CN114138141A (en) Display method and device and electronic equipment
EP4325344A1 (en) Multi-terminal collaborative display update method and apparatus
CN111124246B (en) Interface interaction method, equipment and storage medium
US10878641B1 (en) Editing bezier patch by selecting multiple anchor points
CN112882636B (en) Picture processing method and device
CN114792283A (en) Image processing method, device and equipment and computer readable storage medium
CN108288298B (en) Method and device for drawing function image, computer equipment and storage medium
CN104317522A (en) Terminal equipment, display device, and document icon showing method and system
CN114820908B (en) Virtual image generation method and device, electronic equipment and storage medium
CN111767700B (en) Graph adjustment method and device
CN111708472A (en) Gesture implementation method and device on intelligent device, electronic device and storage medium
CN114237535A (en) Split screen display implementation method and device and computing equipment
CN114895832A (en) Object adjustment method and device, electronic equipment and computer readable medium
CN114418855A (en) Picture processing method and device, electronic equipment and storage medium
CN113012264A (en) Picture processing method, device and equipment
CN116027962A (en) Virtual key setting method, device, equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant