WO2020007241A1 - Image processing method and apparatus, electronic device, and computer-readable storage medium - Google Patents
Image processing method and apparatus, electronic device, and computer-readable storage medium
- Publication number
- WO2020007241A1 (application PCT/CN2019/093551)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- deformation
- grid
- preset
- image
- target object
- Prior art date
Links
- 238000003860 storage Methods 0.000 title claims abstract description 29
- 238000003672 processing method Methods 0.000 title claims abstract description 26
- 230000000694 effects Effects 0.000 claims abstract description 55
- 238000000034 method Methods 0.000 claims abstract description 53
- 238000012545 processing Methods 0.000 claims abstract description 45
- 238000004422 calculation algorithm Methods 0.000 claims description 70
- 239000011159 matrix material Substances 0.000 claims description 48
- 238000004590 computer program Methods 0.000 claims description 18
- 230000008569 process Effects 0.000 claims description 16
- 210000004709 eyebrow Anatomy 0.000 claims description 11
- 210000001508 eye Anatomy 0.000 claims description 10
- 238000010586 diagram Methods 0.000 description 22
- 238000004891 communication Methods 0.000 description 11
- 230000001815 facial effect Effects 0.000 description 10
- 230000006870 function Effects 0.000 description 10
- 230000008859 change Effects 0.000 description 8
- 238000003384 imaging method Methods 0.000 description 7
- 238000005516 engineering process Methods 0.000 description 6
- 238000003708 edge detection Methods 0.000 description 5
- 230000009471 action Effects 0.000 description 4
- 230000005540 biological transmission Effects 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 4
- 230000005236 sound signal Effects 0.000 description 4
- 241001465754 Metazoa Species 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 2
- 238000003491 array Methods 0.000 description 2
- 210000005069 ears Anatomy 0.000 description 2
- 239000000835 fiber Substances 0.000 description 2
- 230000003993 interaction Effects 0.000 description 2
- 210000001331 nose Anatomy 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 230000001902 propagating effect Effects 0.000 description 2
- 238000005070 sampling Methods 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 238000013459 approach Methods 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 229910052802 copper Inorganic materials 0.000 description 1
- 239000010949 copper Substances 0.000 description 1
- 210000000887 face Anatomy 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000000214 mouth Anatomy 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 230000000717 retained effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
- G06T17/205—Re-meshing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2004—Aligning objects, relative positioning of parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
Definitions
- the present disclosure relates to the field of image processing, and in particular, to an image processing method and apparatus, an electronic device, and a computer-readable storage medium.
- the present disclosure proposes a technical solution for image processing.
- an image processing method including:
- Determining a first region that matches a target object in a first image;
- Determining a deformation parameter based on a preset deformation effect where the deformation parameter is used to determine a position deviation of each pixel point of the target object based on the preset deformation effect;
- Deformation processing is performed on the target object in the first image based on the deformation parameters to obtain a second image.
- the determining that the first region matches the target object in the first image includes:
- a first mesh corresponding to the target object is formed, and the first mesh matches a first region.
- the deformation parameter is a deformed pixel matrix, and each parameter in the deformed pixel matrix is used to determine a position deviation of a corresponding pixel point of a target object based on the preset deformation effect.
- the determining that the first region matches the target object in the first image includes:
- the first region is determined based on a relative position between the feature points.
- the determining a deformation parameter based on a preset deformation effect includes:
- the deformation parameter is determined based on the preset deformation template by using a preset algorithm.
- the determining the deformation parameter based on the preset deformation effect further includes:
- the deformation parameter is determined using a preset algorithm and a second position of a pixel in the second grid.
- the determining whether a preset deformation template exists includes:
- the determining the deformation parameter based on the preset deformation template includes:
- determining, based on a preset algorithm and the four control points of each sub-mesh, a third position of each first pixel point in the third grid before the deformation;
- the deformation parameter is determined based on the third position of each first pixel point before the deformation and a corresponding fourth position of the first pixel point in the third grid.
- the determining the deformation parameter by using a preset algorithm and a second position of a second pixel point in the second grid includes:
- the deformation parameter is determined based on a first position of each second pixel point before the deformation and a corresponding second position in the second grid.
- the expression of the preset algorithm is:
- where d_i(a_i, b_i) denotes the coordinates of the control points of each sub-mesh in the deformed grid, i being a positive integer less than 5; u and v are the abscissa and ordinate of each pixel point in the deformed grid before deformation; p(u, v) is the position coordinate of each pixel point in the deformed grid; and the deformed grid is the second grid or the third grid.
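The expression of the preset algorithm is only described, not written out, in the text above. One common four-control-point form consistent with that description (a bilinear, degree-1×1 Bezier patch evaluated at the undeformed coordinates (u, v)) can be sketched as follows; the function name and corner ordering are assumptions, since the patent does not state the formula.

```python
def bezier_bilinear(d, u, v):
    """Evaluate a bilinear Bezier patch at (u, v), with u, v in [0, 1].

    d holds the four control points d_1..d_4 (i < 5) of one sub-mesh of
    the deformed grid, each an (a_i, b_i) coordinate pair.  (u, v) is the
    undeformed position of a pixel point; the returned p(u, v) is its
    position in the deformed grid.  The bilinear form is an assumption:
    the text names a Bezier surface algorithm but elides the expression.
    """
    (a1, b1), (a2, b2), (a3, b3), (a4, b4) = d
    w1 = (1 - u) * (1 - v)   # weight of the (0, 0) corner
    w2 = u * (1 - v)         # weight of the (1, 0) corner
    w3 = (1 - u) * v         # weight of the (0, 1) corner
    w4 = u * v               # weight of the (1, 1) corner
    return (w1 * a1 + w2 * a2 + w3 * a3 + w4 * a4,
            w1 * b1 + w2 * b2 + w3 * b3 + w4 * b4)

# With an undeformed unit sub-mesh, the patch is the identity map:
print(bezier_bilinear([(0, 0), (1, 0), (0, 1), (1, 1)], 0.25, 0.5))  # → (0.25, 0.5)
```

Moving any of the four control points displaces every interior pixel of that sub-mesh smoothly, which is why the method only needs the sub-mesh corners as control points.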
- the acquiring a second grid for performing image transformation includes:
- the method further includes:
- the performing a deformation process on the target object in the first image based on the deformation parameter to obtain a second image includes:
- a linear interpolation algorithm is used to adjust the position of the corresponding pixel point on the target object based on the position deviation of each pixel point in the deformation parameter.
- the target object includes a face
- the determining the first region based on a relative position between the feature points includes:
- the height of the first region is determined based on the distance between the center of the eyebrow and the tip of the nose.
- the forming a first grid corresponding to the target object includes:
- a first grid is determined that matches the position and size of the target object in the first image, and the first grid is equally divided into a plurality of sub-grids.
- an image processing apparatus including:
- a first determining module configured to determine a first region that matches a target object in a first image
- a second determining module configured to determine a deformation parameter based on a preset deformation effect, where the deformation parameter is used to determine a position deviation of each pixel point of the target object based on the preset deformation effect;
- a deformation module configured to perform a deformation process on a target object in the first image based on the deformation parameter to obtain a second image.
- the first determining module is further configured to determine the first region by forming a first grid corresponding to the target object, and the first grid matches the first region.
- the deformation parameter is a deformed pixel matrix, and each parameter in the deformed pixel matrix is used to determine a position deviation of a corresponding pixel point of a target object based on the preset deformation effect.
- the first determining module is further configured to determine a position of a feature point of a target object in the first image, and determine the first region based on a relative position between the feature points.
- the second determination module is further configured to determine whether a preset deformation template exists, and in the case where the preset deformation template exists, use a preset algorithm to determine the deformation parameter based on the preset deformation template.
- the second determining module is further configured to obtain a second mesh for image deformation in a case where the preset deformation template does not exist, and determine the deformation parameter by using a preset algorithm and the second position of the pixels in the second grid, where the second grid is a deformed grid.
- the second determining module is further configured to determine that the preset deformation template exists when a first instruction for performing a deformation operation based on a preset deformation image is received.
- the second determination module is further configured to determine a third mesh that matches a deformation region of the preset deformation template
- determine, based on a preset algorithm and the four control points of each sub-mesh, a third position of each first pixel point in the third grid before the deformation;
- the deformation parameter is determined based on the third position of each first pixel point before the deformation and a corresponding fourth position in the third grid.
- the second determining module is further configured to use four vertices of each sub-mesh in the second grid as four control points;
- the deformation parameter is determined based on a first position of each second pixel point before the deformation and a corresponding second position in the second grid.
- the expression of the preset algorithm is:
- where d_i(a_i, b_i) denotes the coordinates of the control points of each sub-mesh in the deformed grid, i being a positive integer less than 5; u and v are the abscissa and ordinate of each pixel point in the deformed grid before deformation; p(u, v) is the position coordinate of each pixel point in the deformed grid; and the deformed grid is the second grid or the third grid.
- the second determination module is further configured to receive a deformation operation on the first region, and obtain the second mesh based on the deformation operation; or obtain a stored second mesh.
- the second determining module is further configured to adjust the dimension of the deformed pixel matrix based on the number of pixels in the length direction and the number of pixels in the width direction of the first region, so that the parameters in the deformed pixel matrix correspond one-to-one with the pixels in the first region;
- the deformation module is further configured to obtain the second image by using the deformed pixel matrix after adjusting the dimensions.
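The dimension adjustment above can be read as resampling the deformed pixel matrix to the pixel count of the first region. A minimal sketch, assuming bilinear resampling of a scalar offset matrix (the text only says the dimension is adjusted, not how):

```python
def resize_matrix(mat, new_h, new_w):
    """Resample a deformed pixel matrix to new_h x new_w entries so its
    parameters match the first region pixel-for-pixel.

    Bilinear resampling is an assumption; mat is a 2D list of scalar
    offset values.
    """
    old_h, old_w = len(mat), len(mat[0])
    out = []
    for y in range(new_h):
        # Map the output row back into the source matrix.
        sy = y * (old_h - 1) / max(new_h - 1, 1)
        y0 = int(sy)
        y1 = min(y0 + 1, old_h - 1)
        fy = sy - y0
        row = []
        for x in range(new_w):
            sx = x * (old_w - 1) / max(new_w - 1, 1)
            x0 = int(sx)
            x1 = min(x0 + 1, old_w - 1)
            fx = sx - x0
            top = mat[y0][x0] * (1 - fx) + mat[y0][x1] * fx
            bot = mat[y1][x0] * (1 - fx) + mat[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out
```

For example, resizing a 2x2 matrix to 3x3 keeps the four corner values and interpolates the center.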
- the deformation module is further configured to use a linear interpolation algorithm to adjust a position of a corresponding pixel point on the target object based on a position deviation of each pixel point in the deformation parameter.
- the first determining module is further configured to determine the center of the bridge of the nose as the center of the first region when the target object includes a face;
- the height of the first region is determined based on the distance between the center of the eyebrow and the tip of the nose.
- the first determining module is further configured to determine a first grid that matches the position and size of the target object in the first image, where the first grid is equally divided into a plurality of sub-grids.
- an electronic device including:
- a processor; and a memory for storing processor-executable instructions;
- the processor is configured to execute the image processing method according to any one of the first aspects.
- a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, implement the image processing method according to any one of the first aspects.
- a computer program comprising computer-readable code; when the computer-readable code is run in an electronic device, a processor in the electronic device executes instructions for implementing the method of any one of the first aspects.
- the image can be directly deformed according to the required deformation effect, rather than performing the deformation operation simply by dragging, adding pictures, or adding labels.
- it is more convenient and flexible to perform image deformation, and the applicability is better.
- FIG. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure
- FIG. 2 illustrates a flowchart of determining a first region when a target object is a face in an image processing method according to an embodiment of the present disclosure
- FIG. 3 illustrates a flowchart of step S200 of the image processing method according to an embodiment of the present disclosure
- FIG. 4 shows a flowchart of determining the deformation parameters based on the preset deformation template in an image processing method according to an embodiment of the present disclosure
- FIG. 5 illustrates a flowchart of step S204 in the image processing method according to an embodiment of the present disclosure
- FIG. 6 illustrates a schematic diagram of determining a first region according to an image processing method according to an embodiment of the present disclosure
- FIG. 7 illustrates a structural diagram of performing a deformation process based on a deformation parameter according to an embodiment of the present disclosure
- FIG. 8 illustrates a schematic diagram of a second image obtained according to an embodiment of the present disclosure
- FIG. 9 illustrates a block diagram of an image processing apparatus according to an embodiment of the present disclosure.
- FIG. 10 illustrates a block diagram of an electronic device according to an embodiment of the present disclosure
- FIG. 11 illustrates a block diagram of another electronic device according to an embodiment of the present disclosure.
- exemplary means “serving as an example, embodiment, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as superior or better than other embodiments.
- FIG. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure.
- the image processing method according to the embodiment of the present disclosure can be applied to an electronic device having an image processing function, such as a mobile phone, a camera, a video camera, a PAD, and a computer device.
- Such electronic devices are not limited in this disclosure.
- the image processing method in the embodiment of the present disclosure may include:
- S100 Determine a first region that matches a target object in the first image
- the image processing method provided by the embodiment of the present disclosure may perform a deformation process on an image to obtain an image having a corresponding deformation effect.
- the first image may be an image captured by an electronic device, an image received through a communication connection with another electronic device, or an image stored in the electronic device, which is not limited in the embodiment of the present disclosure.
- the target object is a part of the first image that needs to be deformed, and may be, for example, an object such as a face, an animal, a plant, or a scene, or an image area object selected by the user, or the entire image.
- a target object in the first image that needs to be deformed may be determined.
- the method for determining the target object may include: determining the target object according to the current imaging mode of the electronic device; receiving selection information input by the user and determining the target object based on the selection information; or determining the target object based on feature information in the first image.
- the target object may also be determined in other ways.
- a target object that needs to be deformed may be determined according to a current imaging mode.
- the shooting modes in the implementation of the present disclosure may include a portrait mode and a scene mode, or may include other modes.
- in a portrait mode, a face in the image may be determined as the target object.
- when the current imaging mode of the electronic device is a scene mode, the scene in the image can be used as the target object. That is, in different imaging modes, the corresponding target object can be determined and deformed without user operation, which is simple and convenient.
- the foregoing is merely an example for describing an embodiment in which a target object can be determined according to an imaging mode, which is not limited in the embodiments of the present disclosure.
- the target object may also be determined according to the selection information input by the user.
- a selection operation on the first image by the user may be received; that is, the user may select a region of the first image through a touch operation or a frame selection operation, and the image information selected by that operation is the selection information.
- the user can also select the category of the target object to be determined from the option list of the target object.
- the option list can include options such as faces, animals, plants, and scenes; the selection information about these options can then be received to determine the target object in the selected image.
- the target object in the first image can be obtained, and the first region corresponding to the target object can be determined based on the target object.
- the size of the first region in the embodiment of the present disclosure matches the size of the target object, and the position of the first region also matches the position of the target object; the deformation of the target object can be achieved by performing a deformation process on each pixel point in the first region.
- determining the first region that matches the target object in the first image may include forming a first grid corresponding to the target object, where the first grid matches the first region. That is, in the embodiment of the present disclosure, a first grid corresponding to the determined target object can be used, and the position and size of the first grid match the target object; the position where the first grid is located is the first region, or in other words, the first region is represented in the form of the first grid.
- the first grid may include multiple sub-grids, and each sub-grid may correspond to at least one pixel.
- by forming the first grid it is possible to conveniently analyze the position, gray scale, and other features of each pixel in the grid.
- the first grid is taken as an example for description.
- S200 Determine a deformation parameter based on a preset deformation effect, where the deformation parameter is used to determine a position deviation of each pixel of the target object based on the preset deformation effect;
- a corresponding deformation parameter may be determined according to a preset deformation effect.
- the deformation parameters can be determined by the deformation effect corresponding to the existing deformation image.
- the first region (the first grid) can also be directly operated to obtain the deformation effect of the first region, so as to determine the deformation parameters.
- a deformation operation can be performed on each pixel in the first region through a preset algorithm.
- the deformation parameter may be used to determine a position deviation of each pixel point of the target object based on the preset deformation effect, that is, the deformation parameter may include or be able to determine a deformed position of each pixel point of the target object.
- the deformation parameter in the embodiment of the present disclosure may be a deformed pixel matrix, and each parameter in the deformed pixel matrix is used to determine a position deviation of a corresponding pixel point of a target object based on the preset deformation effect.
- the dimension (size) of the deformed pixel matrix can correspond to the first grid, that is, the parameter values in the deformed pixel matrix correspond to the pixels in the first grid one by one.
- the above parameter values can be used to determine the position offset amount of the corresponding pixel points.
- the parameters of the deformed pixel matrix can be used to determine the position change amount of each pixel during the deformation process, that is, the above-mentioned position deviation.
- the step of determining the deformation parameter based on the preset deformation effect in the embodiment of the present disclosure includes using a preset algorithm to determine the deformation parameter based on the preset deformation effect.
- the preset algorithm may include a Bezier surface algorithm.
- the deformation parameter or the deformed pixel matrix may include a position offset of each pixel in the preset deformation effect.
- the position offset includes not only a position offset value, but also an offset direction, so that the deformed position of each pixel point can be easily determined.
- the deformed pixel matrix may include the original position coordinates, before deformation, of the pixel points at the current coordinates in the preset deformation effect, so that the position deviation may be determined based on the deviation between the original position coordinates and the current coordinates.
- S300 Perform deformation processing on the target object in the first image based on the deformation parameters to obtain a second image.
- the deformation parameters of the deformed pixel matrix can be used to determine the positional deviation of the corresponding pixel point when the deformation process is performed. Therefore, the deformed position of each pixel point of the target object can be determined based on the positional deviation, and the warping operation can be executed to obtain the second image.
- pixel values can be evaluated by a bilinear interpolation algorithm, so that the deformed image can be made smoother.
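As a sketch of step S300: the deformed pixel matrix gives each output pixel a position deviation, and the source location it points at is sampled with bilinear interpolation so the result stays smooth. The names, the (dy, dx) layout, and the grayscale 2D-list image format are illustrative assumptions.

```python
def warp_image(img, offsets):
    """Apply a deformed pixel matrix to a grayscale image.

    img      -- 2D list of pixel values (H x W)
    offsets  -- 2D list of (dy, dx) position deviations, same shape;
                pixel (y, x) of the output samples the input at
                (y + dy, x + dx), with bilinear interpolation so the
                deformed image stays smooth.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dy, dx = offsets[y][x]
            # Clamp the sample point to the image bounds.
            sy = min(max(y + dy, 0.0), h - 1.0)
            sx = min(max(x + dx, 0.0), w - 1.0)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Bilinear blend of the four neighbouring pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out
```

With all-zero offsets the output reproduces the input, which is a convenient sanity check before plugging in a real deformed pixel matrix.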
- the deformation processing of the first image to the second image can be completed, and since the embodiment of the present disclosure can deform the first image according to the deformation parameters corresponding to the preset deformation effect, it is more flexible and convenient.
- the deformation effect is applicable to all kinds of images at the same time, and the user experience is better.
- a first region matching the target object may be determined according to the target object.
- determining that the first region matches the target object in the first image may include:
- the first region is determined based on a relative position between the feature points.
- FIG. 6 illustrates a schematic diagram of determining a first region according to an image processing method according to an embodiment of the present disclosure.
- feature points of the target object can be identified, and the first grid is determined based on the positions of the feature points.
- the edge feature points of the target object can be obtained; for example, the edge feature points of the target object are extracted by an image edge detection algorithm, and the first mesh is determined by connecting the edge feature points.
- the image edge detection algorithm may include a Canny operator edge detection algorithm or a Sobel operator edge detection algorithm. In other embodiments, edge feature points may also be determined by other edge detection methods.
- the outline of the target object may be determined based on the edge feature points, and the first region may correspond to the outline or a square grid generated based on the outline. That is, the outer border of the first region can be determined by the upper, lower, left, and right edges of the outline: the height of the first region is determined according to the distance between the uppermost and lowermost edge feature points, and the width of the first region is determined according to the distance between the leftmost and rightmost edge feature points, thereby determining the first region corresponding to the target image.
- the first region may also be determined according to feature points of each part in the target object.
- the first region may be determined based on the feature points of characteristic parts such as the nose, eyes, ears, and eyebrows.
- determining the first region based on a relative position between the feature points may include:
- S101 Determine the center of the bridge of the nose as the center of the first region
- S102 Determine the width of the first region based on the distance between the outer edges of the two eyes;
- S103 Determine the height of the first region based on the distance between the center of the eyebrow and the tip of the nose.
- the positions of the characteristic parts of the face, such as the nose, eyes, ears, eyebrows, and mouth, can be determined by means of facial recognition, and the first region can then be determined from these positions.
- the center position of the bridge of the nose can be determined as the center of the first region; that is, in embodiments of the present disclosure, the first region is arranged symmetrically about the center position of the bridge of the nose.
- the width of the first region may be determined according to a first distance between the outer edge of the left eye and the outer edge of the right eye. The first distance may be directly set as the width of the first region, or the width may be determined according to a first preset correspondence relationship, for example, a preset multiple of the first distance. The first preset correspondence relationship can be set by a person skilled in the art according to different requirements, which is not limited in the embodiments of the present disclosure.
- the height of the first region can be determined according to the second distance between the center of the eyebrow and the tip of the nose.
- the center of the eyebrow refers to the center position of the two eyebrows.
- the first and second distances can be calculated by facial recognition.
- a person skilled in the art may select an appropriate algorithm to obtain the first distance and the second distance, for example, a PCA algorithm to identify features.
- there may be a second preset correspondence relationship between the height of the first grid and the second distance; for example, the height of the first region may be twice the second distance. The second preset correspondence relationship can be set by a person skilled in the art according to different requirements, which is not limited in the embodiments of the present disclosure.
- the height and width of the first region can be determined according to the identified feature points, thereby determining the size of the first region, and at the same time, the first region corresponds to the position of the target object.
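The three rules above (S101 to S103, with the preset correspondence relationships discussed alongside them) can be sketched as follows. The landmark tuples and the two scale factors are hypothetical, since the text leaves the preset multiples open; coordinates are (x, y) with y increasing downward.

```python
def face_region(nose_bridge, left_eye_outer, right_eye_outer,
                brow_center, nose_tip, width_scale=1.0, height_scale=2.0):
    """Compute the first region from facial landmark coordinates.

    The landmark names and the two scale factors are illustrative: the
    text only says the width/height follow *some* preset correspondence
    with the two distances (e.g. a preset multiple).
    Returns (center, width, height).
    """
    # S101: the center of the bridge of the nose is the region center.
    center = nose_bridge
    # S102: width from the first distance between the eye outer edges.
    width = width_scale * abs(right_eye_outer[0] - left_eye_outer[0])
    # S103: height from the second distance, eyebrow center to nose tip.
    height = height_scale * abs(nose_tip[1] - brow_center[1])
    return center, width, height
```

For instance, with eye outer edges 40 pixels apart and an eyebrow-to-nose-tip distance of 30 pixels, the defaults give a 40 x 60 region centered on the nose bridge.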
- the first region may also be divided into multiple sub-grids to form the first grid.
- the multiple sub-grids may be the same.
- the first region may be divided according to a preset dimension. The preset dimension may be determined based on the size of the first grid, or may be determined according to pre-configured dimension information.
- the first grid may be divided into sub-grids of a preset dimension according to the size of the first grid, where the size of each sub-grid is smaller than a size threshold; that is, in the embodiment of the present disclosure, the first grid is divided into sub-grids smaller than the size threshold.
- the first grid may also be divided according to the pre-configured dimensional information.
- the size threshold and the pre-configured dimensional information may be preset, and a person skilled in the art may set it according to different requirements.
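As a hedged sketch of the division step: given the first region's width and height, one can choose the number of equal sub-grids per axis so that each cell stays no larger than the size threshold. The threshold value and function name are illustrative assumptions, not values from the disclosure.

```python
import math

def divide_into_subgrids(width, height, size_threshold=32.0):
    """Sketch: pick the number of equal sub-grids along each axis so
    that every sub-grid cell is no larger than size_threshold (the
    threshold value is a hypothetical stand-in for the pre-configured
    dimension information)."""
    nx = max(1, math.ceil(width / size_threshold))
    ny = max(1, math.ceil(height / size_threshold))
    # all sub-grids are the same size, as described above
    return nx, ny, width / nx, height / ny
```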
- a deformed pixel matrix (deformation parameter) for performing the deformation operation may also be determined according to a preset deformation effect.
- the deformation parameters may be determined according to an image template with a preset deformation effect, or the deformation parameters may be determined according to a deformation operation on the first region.
- FIG. 3 illustrates a flowchart of step S200 of the image processing method according to an embodiment of the present disclosure.
- Step S200 may include:
- S204 Determine the deformation parameter by using a preset algorithm and a second position of a second pixel point in the second grid.
- in a case where the preset deformation template exists, that is, the user has selected an image with the desired deformation effect as the image template of the preset deformation effect (the foregoing preset deformation template),
- the deformation parameters can be determined according to the deformation effect corresponding to the preset deformation template. If there is no preset deformation template, a second mesh for image deformation is obtained, and the deformation parameters are determined based on the difference between the first mesh and the second mesh.
- the second mesh may be obtained based on the user's direct deformation operation on the first mesh, or may be obtained in other ways, for example by retrieving a stored second mesh. The two approaches are described in detail below.
- FIG. 4 shows a flowchart of determining the deformation parameters based on the preset deformation template in an image processing method according to an embodiment of the present disclosure, where step S202 may include:
- S2022: Use the four vertices of each sub-mesh in the third grid as four control points, and determine, based on a preset algorithm and the four control points of each sub-mesh, a third position of each first pixel point in the third grid before deformation;
- S2023 Determine the deformation parameter based on a third position of each first pixel before deformation and a corresponding fourth position in the third grid.
- the preset deformation template may correspond to a deformation area where the deformation operation is performed, and the deformation area corresponds to the region where the target object in the first image is located.
- a third mesh matching the deformation region of the preset deformation template may be obtained; the third mesh may be the mesh obtained after the deformation region of the preset deformation template has been deformed, and it matches the size and position of the deformation region.
- the third position of each first pixel point in the third grid when the deformation operation is not performed may be determined according to the four control points of each sub-grid in the third grid.
- the third position when each first pixel is not deformed may be obtained according to a preset algorithm.
- the preset algorithm may be a Bezier surface algorithm, which, for the four control points of a sub-grid, may be expressed as:
- p(u, v) = (1 - u)(1 - v)·b00 + (1 - u)v·b01 + u(1 - v)·b10 + u·v·b11
- where b00, b01, b10, and b11 are the coordinate values of the four control points d_i(a_i, b_i) located at the upper left, lower left, upper right, and lower right of each sub-grid of the third grid, respectively, determined by the coordinates of the control points of each sub-grid in the third grid, with i being a positive integer less than 5;
- u and v are the abscissa and ordinate of the third position of each first pixel point in the third grid before deformation, and p(u, v) is the coordinate value of the fourth position of each first pixel point in the third grid after deformation.
- a rectangular coordinate system can be established, so that the position of the third grid in the rectangular coordinate system, as well as the position of each pixel point of the third grid in that coordinate system, can be determined; the position of each first pixel point before deformation, that is, the third position, can then be obtained in the foregoing manner.
- the third position of each pixel in the third grid before performing the deformation can be obtained, so as to determine the deformation parameter according to the difference between the third position and the fourth position.
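A minimal sketch of evaluating the surface for one sub-grid, assuming the four-control-point case reduces to the bilinear Bezier patch (the b00/b01/b10/b11 corner naming follows the description above; the axis conventions, with u increasing to the right and v increasing downward, are an assumption):

```python
def bezier_bilinear(b00, b01, b10, b11, u, v):
    """Sketch of the four-control-point (bilinear) Bezier surface:
    p(u, v) blends the upper-left, lower-left, upper-right and
    lower-right control points of one sub-grid, with u, v in [0, 1]."""
    x = ((1 - u) * (1 - v) * b00[0] + (1 - u) * v * b01[0]
         + u * (1 - v) * b10[0] + u * v * b11[0])
    y = ((1 - u) * (1 - v) * b00[1] + (1 - u) * v * b01[1]
         + u * (1 - v) * b10[1] + u * v * b11[1])
    return x, y
```

On the unit square the patch reproduces its corners exactly and its center is the average of the four control points.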
- the deformation parameters may include the initial position of each pixel in the deformation area of the preset deformation template. Based on the difference between the current position of each pixel and its initial position, the position deviation of each pixel corresponding to this deformation effect can be determined.
- the parameter value in the deformation parameters may be a position deviation corresponding to the deformation operation performed for each pixel point.
- the position deviations in the embodiments of the present disclosure are all vector values, which include deviations in length and direction.
- the dimensions of the deformation parameters can be adjusted according to the size of the first grid, such as the dimensions of the deformed pixel matrix, so that the pixels of the deformed pixel matrix correspond to the pixels in the first grid one-to-one.
- the deformation parameters obtained through step S2023 in the embodiment of the present disclosure may be the deformation parameters corresponding to the deformation region in the preset deformation template (referred to below as the first pixel matrix to distinguish them). In order to achieve the same deformation effect as the preset deformation template,
- a deformation operation on the first image may be performed according to the first pixel matrix.
- to this end, the dimension of the first pixel matrix needs to be adjusted according to the size of the first grid.
- the number of pixels in the length direction and in the width direction of the first grid can be determined, and the dimension of the first pixel matrix corresponding to the deformation region in the preset deformation template is adjusted based on these numbers.
- uniform sampling may be performed on the pixels in the deformation area, where the number of samples is the same as the number of pixels corresponding to the first grid, so that the first pixel matrix of the adjusted dimension is determined based on the position deviations of the sampled pixels. The pixels of the first pixel matrix then correspond one-to-one to the pixels in the first grid, and the first pixel matrix after dimension adjustment is the deformation parameter in the embodiment of the present disclosure.
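The dimension adjustment by uniform sampling might be sketched as below. Nearest-sample selection is an illustrative choice; a real implementation could interpolate between samples instead, and the function name is an assumption.

```python
def resample_matrix(matrix, out_h, out_w):
    """Sketch: uniformly sample a 2-D list of position deviations so
    its dimensions match the pixel counts of the first grid
    (nearest-sample uniform sampling)."""
    in_h, in_w = len(matrix), len(matrix[0])
    out = []
    for r in range(out_h):
        # map the output row/column uniformly back into the source matrix
        src_r = min(in_h - 1, r * in_h // out_h)
        row = [matrix[src_r][min(in_w - 1, c * in_w // out_w)]
               for c in range(out_w)]
        out.append(row)
    return out
```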
- the above embodiment is to determine the deformation parameters by using a preset deformation template.
- the following describes in detail the determination of the deformation parameters by using the obtained second mesh.
- the user may directly perform a drag or another deformation operation on the first area, that is, change the shape of the first area, thereby determining the deformation parameter.
- This process can include:
- determining the position deviation of each pixel point based on the deformation operation, and determining the deformation parameter based on the position deviation.
- specifically, the user may perform a touch operation on the first region to change its shape; from the difference between the position of each pixel point after the shape change and its position before the shape change, the position deviation of each pixel point generated by the touch operation can be determined and used to establish the deformation parameters.
- the deformation parameter may also be determined according to a directly obtained second mesh, where the second mesh is a deformed mesh.
- FIG. 5 shows a flowchart of step S204 in the image processing method according to an embodiment of the present disclosure.
- the step S204 may include:
- S2042 Determine a first position of each second pixel point in the second grid before deformation based on the preset algorithm
- S2043 Determine the deformation parameter based on a first position of each second pixel point before the deformation and a corresponding second position in the second grid.
- the four vertices of each sub-mesh in the second grid are used as the four control points, so that the position of each second pixel point in the second grid before deformation can be determined.
- the first position when each second pixel is not deformed can be obtained according to a preset algorithm.
- the preset algorithm can be a Bezier surface algorithm, which, for the four control points of a sub-grid, can be expressed as:
- p(u, v) = (1 - u)(1 - v)·b00 + (1 - u)v·b01 + u(1 - v)·b10 + u·v·b11
- where b00, b01, b10, and b11 are the coordinate values of the four control points d_i(a_i, b_i) located at the upper left, lower left, upper right, and lower right of each sub-grid of the second grid, respectively, determined by the coordinates of the control points of each sub-grid in the second grid, with i being a positive integer less than 5;
- u and v are the abscissa and ordinate of the first position of each second pixel point in the second grid before deformation, and p(u, v) is the coordinate value of the corresponding second position after deformation.
- in this way, the correspondence between the first position (u, v) of each second pixel point before deformation and the second position p(u, v) after deformation can be established. On the second grid, the four control points of each sub-grid and the second position of each corresponding second pixel point are known.
- a rectangular coordinate system can be established, so that the position of the second grid in the rectangular coordinate system, as well as the position of each pixel point of the second grid in that coordinate system, can be determined; the position of each second pixel point before deformation, that is, the first position, can then be obtained in the foregoing manner.
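Recovering the pre-deformation position (u, v) from a known deformed position p and the four control points amounts to inverting a bilinear patch. The disclosure does not specify a solver, so the Newton iteration below is one plausible approach, sketched under the assumption that each sub-grid is a non-degenerate quadrilateral.

```python
def invert_bilinear(b00, b01, b10, b11, p, iters=20):
    """Sketch: find (u, v) such that the bilinear patch over the four
    control points maps it to p, via Newton iteration. A production
    version would need safeguards for degenerate sub-grids."""
    u, v = 0.5, 0.5
    for _ in range(iters):
        # current mapped point p(u, v)
        x = ((1-u)*(1-v)*b00[0] + (1-u)*v*b01[0]
             + u*(1-v)*b10[0] + u*v*b11[0])
        y = ((1-u)*(1-v)*b00[1] + (1-u)*v*b01[1]
             + u*(1-v)*b10[1] + u*v*b11[1])
        # Jacobian columns: dp/du and dp/dv
        dxu = (1-v)*(b10[0]-b00[0]) + v*(b11[0]-b01[0])
        dyu = (1-v)*(b10[1]-b00[1]) + v*(b11[1]-b01[1])
        dxv = (1-u)*(b01[0]-b00[0]) + u*(b11[0]-b10[0])
        dyv = (1-u)*(b01[1]-b00[1]) + u*(b11[1]-b10[1])
        det = dxu*dyv - dxv*dyu
        if abs(det) < 1e-12:
            break
        # Newton step: solve J * [du, dv] = p - p(u, v) by Cramer's rule
        rx, ry = p[0] - x, p[1] - y
        u += (rx*dyv - dxv*ry) / det
        v += (dxu*ry - rx*dyu) / det
    return u, v
```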
- the deformation algorithm based on the sub-mesh can change the shape of the grid by pulling the vertices of the grid, and then reflect the deformation of the grid to the image to achieve free deformation of the image.
- for existing control-point deformation algorithms such as TPS, MLS, and MRLS,
- the larger the number of control points, the higher the algorithm complexity and the worse the real-time performance of image deformation.
- when such algorithms are applied to deform an image within a grid structure, the result may also exhibit discontinuous edges.
- the Bezier surface algorithm used in the embodiments of the present disclosure requires only a small number of control points and can still achieve a good mesh deformation effect.
- the optimization of the Bezier surface algorithm can reduce complexity and improve real-time performance.
- the efficiency of the algorithm was verified on the processor of an electronic device according to the embodiment of the present disclosure, using face deformation on a 720p image.
- the forward calculation of the Bezier surface algorithm takes about 40-50 ms in total.
- the optimized algorithm takes about 8 ms for face deformation on the same image.
- the deformed pixel matrix can also be directly obtained, which further improves the speed of deformation.
- the first position of each pixel in the second grid before performing the deformation can be obtained, so as to determine the deformation parameter according to the difference between the first position and the second position.
- the deformation parameters may include the initial position of each second pixel point in the second grid; based on the difference between the current position of the second pixel point and its initial position, the position deviation of each pixel point corresponding to the deformation effect can be determined.
- the parameter value in the deformation parameters may be a position deviation corresponding to the deformation operation performed for each pixel point.
- the position deviations in the embodiments of the present disclosure are all vector values, which include deviations in length and direction.
- the second grid in the embodiment of the present disclosure may also correspond to the positional deviation generated by each pixel point when the deformation is performed.
- the position deviation of each pixel point generated by the touch operation may be determined from the difference between the position of each pixel point after the shape change and its position before the shape change, and used to establish the deformation parameters.
- the second grid may be a grid that matches the size of the first grid, or one that does not. When the size of the second grid does not match the size of the first grid, the size of the second mesh may be changed while retaining its deformation effect; for example, a grid matching the size of the first grid may be obtained from the second grid by uniform sampling. In other embodiments, the first grid and the second grid can also be matched in other ways, which is not limited in the embodiments of the present disclosure.
- the dimensions of the deformation parameters may be adjusted according to the size of the first grid, so that the pixels of the deformation parameters correspond one-to-one to the pixels in the first grid. That is, the deformation parameters obtained through step S2043 in the embodiment of the present disclosure may likewise be regarded as a first pixel matrix, and in order to achieve the deformation effect of the second grid, a deformation operation on the first image may be performed according to the first pixel matrix.
- the dimension of the first pixel matrix needs to be adjusted according to the size of the first grid.
- performing the deformation processing on the target object in the first image based on the deformation parameter, and obtaining the second image may include:
- a linear interpolation algorithm is used to adjust the position of the corresponding pixel point on the target object based on the position deviation of each pixel in the deformation parameters; the gray value of the pixel at the adjusted position is set to the gray value that pixel had before the position was adjusted.
- the pixel value at the new position is replaced with the pixel value of the pixel point at the original position. That is, when performing pixel position processing and adjusting a pixel from an original position to a new position, the pixel value of the pixel at the original position can be retained at the new position, thereby achieving the deformation process of the target object.
- the pixel value may be expressed as a gray value, but it is not a specific limitation of the present disclosure.
- the linear interpolation algorithm in the above embodiment may be a bilinear interpolation algorithm, through which the pixel value of each pixel point can be obtained quickly.
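A compact sketch of the remapping step: each output pixel keeps the gray value read, via bilinear interpolation, from the source location given by its position deviation. Backward warping, grayscale images, and the function names are simplifying assumptions, not the disclosed implementation.

```python
def bilinear_sample(img, x, y):
    """Bilinear interpolation of a grayscale image (list of rows) at a
    generally non-integer location (x, y), clamped to the image."""
    h, w = len(img), len(img[0])
    x0 = max(0, min(w - 2, int(x)))
    y0 = max(0, min(h - 2, int(y)))
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0][x0] + fx * img[y0][x0 + 1]
    bot = (1 - fx) * img[y0 + 1][x0] + fx * img[y0 + 1][x0 + 1]
    return (1 - fy) * top + fy * bot

def warp(img, deviations):
    """Backward-warp sketch: each output pixel takes the gray value of
    the source location displaced by its (dx, dy) position deviation."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = deviations[y][x]
            out[y][x] = bilinear_sample(img, x + dx, y + dy)
    return out
```

With all deviations zero the warp leaves the image unchanged, which is a quick sanity check for the sampling logic.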
- FIG. 7 shows a schematic structural diagram of performing a deformation process based on a deformation parameter according to an embodiment of the present disclosure
- FIG. 8 shows a schematic diagram of a second image obtained according to an embodiment of the present disclosure, wherein the embodiment of the present disclosure can perform image deformation processing conveniently and quickly.
- the embodiments of the present disclosure can directly deform the image according to the required deformation effect, instead of simply performing the deformation operation by dragging, adding pictures, and labels.
- controlling the deformation of the image through the control points of the mesh deformation algorithm optimized by the Bezier surface algorithm can greatly reduce the complexity of control-point-based image deformation algorithms.
- mesh deformation can also be used to reduce the complexity of image deformation processing.
- the embodiments of the present disclosure can combine grids with facial key points to achieve rich facial deformation effects. Good deformation results can be obtained for facial images at different angles and of different sizes, and the performance is stable when processing facial deformation in videos.
- the present disclosure also provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program.
- the foregoing can be used to implement any one of the image processing methods provided by the present disclosure.
- for the corresponding technical solutions and descriptions, refer to the corresponding records in the method section, which are not repeated here.
- An embodiment of the present disclosure also provides a computer-readable storage medium having computer program instructions stored thereon, and the computer program instructions implement the above method when executed by a processor.
- the computer-readable storage medium may be a non-volatile computer-readable storage medium or a volatile computer-readable storage medium.
- An embodiment of the present disclosure further provides an electronic device including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
- the electronic device may be provided as a terminal, a server, or other forms of devices.
- FIG. 9 illustrates a block diagram of an image processing apparatus according to an embodiment of the present disclosure, where the apparatus may include:
- a first determining module 100 configured to determine a first region that matches a target object in a first image
- the second determining module 200 is configured to determine a deformation parameter based on a preset deformation effect, where the deformation parameter is used to determine a position deviation of each pixel point of the target object based on the preset deformation effect;
- the deformation module 300 is configured to perform a deformation process on the target object in the first image based on the deformation parameter to obtain a second image.
- the first determination module is further configured to determine the first region by forming a first grid corresponding to the target object, and the first grid matches the first region.
- the deformation parameter is a deformed pixel matrix, and each parameter in the deformed pixel matrix is used to determine a position deviation of a corresponding pixel point of a target object based on the preset deformation effect.
- the first determination module is further configured to determine a position of a feature point of a target object in a first image, and determine the first region based on a relative position between each of the feature points.
- the second determination module is further configured to determine whether a preset deformation template exists, and in a case where a preset deformation template exists, use a preset algorithm to determine the deformation based on the preset deformation template. parameter.
- the second determining module is further configured to obtain a second mesh for image deformation in a case where the preset deformation template does not exist, and to determine the deformation parameter by using a preset algorithm and the second positions of the second pixel points in the second grid, wherein the second grid is a deformed grid.
- the second determination module is further configured to determine that the preset deformation template exists when a first instruction for performing a deformation operation based on a preset deformation image is received.
- the second determination module is further configured to determine a third mesh that matches a deformation region of the preset deformation template, use the four vertices of each sub-mesh in the third grid as four control points, and determine, based on a preset algorithm and the four control points of each sub-mesh, a third position of each first pixel point in the third grid before deformation;
- the deformation parameter is determined based on the third position of each first pixel point before the deformation and a corresponding fourth position in the third grid.
- the second determination module is further configured to use the four vertices of each sub-mesh in the second mesh as four control points, and to determine, based on the preset algorithm and the four control points of each sub-mesh, a first position of each second pixel point in the second grid before deformation;
- the deformation parameter is determined based on the first position of each second pixel point before the deformation and a corresponding second position in the second grid.
- the expression of the preset algorithm is:
- p(u, v) = (1 - u)(1 - v)·b00 + (1 - u)v·b01 + u(1 - v)·b10 + u·v·b11
- where b00, b01, b10, and b11 are the coordinate values of the four control points d_i(a_i, b_i) located at the upper left, lower left, upper right, and lower right of each sub-grid of the deformed grid, respectively, determined by the coordinates of the control points of each sub-grid in the deformed grid, with i being a positive integer less than 5; u and v are the abscissa and ordinate of the position of each pixel point in the deformed grid when it is not deformed, and p(u, v) is the position coordinate of each pixel point in the deformed grid; the deformed grid is the second grid or the third grid.
- the second determination module is further configured to receive a deformation operation on the first region, and obtain the second mesh based on the deformation operation; or obtain a stored second mesh.
- the second determination module is further configured to adjust the dimension of the deformed pixel matrix based on the number of pixels in the length direction and the number of pixels in the width direction of the first region, so that the pixels of the deformed pixel matrix correspond one-to-one to the pixels in the first region;
- the deformation module is further configured to obtain the second image by using the deformed pixel matrix after adjusting the dimensions.
- the deformation module is further configured to use a linear interpolation algorithm to adjust the position of the corresponding pixel point on the target object based on the position deviation of each pixel point in the deformation parameter.
- the first determining module is further configured to determine the center of the bridge of the nose as the center of the first region when the target object includes a face;
- the height of the first region is determined based on the distance between the center of the eyebrow and the tip of the nose.
- the first determination module is further configured to determine a first grid that matches the position and size of the target object in the first image, and the first grid is equally divided into a plurality of sub-grids.
- the embodiments of the present disclosure can directly deform the image according to the required deformation effect, instead of simply performing the deformation operation by dragging, adding pictures, and labels.
- controlling the deformation of the image through the control points of the mesh deformation algorithm optimized by the Bezier surface algorithm can greatly reduce the complexity of control-point-based image deformation algorithms.
- mesh deformation can also be used to reduce the complexity of image deformation processing.
- the embodiments of the present disclosure can combine grids with facial key points to achieve rich facial deformation effects. Good deformation results can be obtained for facial images at different angles and of different sizes, and the performance is stable when processing facial deformation in videos.
- FIG. 10 illustrates a block diagram of an electronic device according to an embodiment of the present disclosure.
- the electronic device may be provided as a terminal, a server, or other forms of devices.
- the electronic device may include an image processing apparatus 800.
- the device 800 may be a terminal such as a mobile phone, a computer, a digital broadcasting terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
- the device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input / output (I / O) interface 812, a sensor component 814, And communication component 816.
- the processing component 802 generally controls the overall operations of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
- the processing component 802 may include one or more processors 820 to execute instructions to complete all or part of the steps of the method described above.
- the processing component 802 may include one or more modules to facilitate the interaction between the processing component 802 and other components.
- the processing component 802 may include a multimedia module to facilitate the interaction between the multimedia component 808 and the processing component 802.
- the memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method for operating on the device 800, contact data, phone book data, messages, pictures, videos, and the like.
- the memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
- the power component 806 provides power to various components of the device 800.
- the power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
- the multimedia component 808 includes a screen that provides an output interface between the device 800 and a user.
- the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user.
- the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure related to the touch or slide operation.
- the multimedia component 808 includes a front camera and / or a rear camera. When the device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and / or the rear camera can receive external multimedia data. Each front camera and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
- the audio component 810 is configured to output and / or input audio signals.
- the audio component 810 includes a microphone (MIC) that is configured to receive an external audio signal when the device 800 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode.
- the received audio signal may be further stored in the memory 804 or transmitted via the communication component 816.
- the audio component 810 further includes a speaker for outputting audio signals.
- the I / O interface 812 provides an interface between the processing component 802 and a peripheral interface module.
- the peripheral interface module may be a keyboard, a click wheel, a button, or the like. These buttons can include, but are not limited to: a home button, a volume button, a start button, and a lock button.
- the sensor component 814 includes one or more sensors for providing status assessment of various aspects of the device 800.
- the sensor component 814 can detect the on / off state of the device 800 and the relative positioning of the components, such as the display and keypad of the device 800.
- the sensor component 814 can also detect the position change of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and temperature changes of the device 800.
- the sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- the sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
- the sensor component 814 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- the communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices.
- the device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof.
- the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel.
- the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication.
- the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
- the device 800 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components to perform the above method.
- a non-volatile computer-readable storage medium is also provided, and computer program instructions are stored thereon, and the computer program instructions implement the image processing method according to the foregoing embodiment when executed by a processor.
- the memory 804 includes computer program instructions, and the computer program instructions may be executed by the processor 820 of the device 800 to complete the foregoing method.
- Fig. 11 is a block diagram illustrating another electronic device according to an embodiment of the present disclosure.
- the electronic device 1900 may be provided as a server. Referring to FIG. 11, the electronic device 1900 includes a processing component 1922, which further includes one or more processors, and a memory resource represented by a memory 1932 for storing instructions executable by the processing component 1922, such as an application program.
- the application program stored in the memory 1932 may include one or more modules each corresponding to a set of instructions.
- the processing component 1922 is configured to execute instructions to perform the method described above.
- the electronic device 1900 may further include a power supply component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958.
- the electronic device 1900 can operate based on an operating system stored in the memory 1932, such as Windows ServerTM, Mac OSXTM, UnixTM, LinuxTM, FreeBSDTM, or the like.
- a non-volatile computer-readable storage medium, such as the memory 1932 including computer program instructions, is also provided; the computer program instructions may be executed by the processing component 1922 of the electronic device 1900 to complete the above method.
- the present disclosure may be a system, a method, and/or a computer program product.
- the computer program product may include a computer-readable storage medium having computer-readable program instructions for causing a processor to implement various aspects of the present disclosure.
- the computer-readable storage medium may be a tangible device that can hold and store instructions used by the instruction execution device.
- the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of computer-readable storage media includes: portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disc read-only memory (CD-ROM), digital versatile discs (DVD), memory sticks, floppy disks, mechanical encoding devices such as punched cards or raised structures in grooves on which instructions are stored, and any suitable combination of the foregoing.
- computer-readable storage media used herein are not to be interpreted as transient signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (for example, light pulses through fiber optic cables), or electrical signals transmitted via electrical wires.
- the computer-readable program instructions described herein can be downloaded from a computer-readable storage medium to various computing/processing devices, or downloaded to an external computer or external storage device via a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
- the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
- the network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in each computing/processing device.
- computer program instructions for performing the operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in one or more programming languages.
- the programming languages include object-oriented programming languages, such as Smalltalk, C++, and the like, and conventional procedural programming languages, such as the "C" language or similar programming languages.
- computer-readable program instructions may be executed entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
- the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
- electronic circuits, such as programmable logic circuits, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), are personalized by using state information of the computer-readable program instructions.
- such electronic circuits may execute the computer-readable program instructions to implement various aspects of the present disclosure.
- these computer-readable program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, thereby producing a machine such that these instructions, when executed by the processor of the computer or other programmable data processing apparatus, produce means for implementing the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
- these computer-readable program instructions may also be stored in a computer-readable storage medium, and these instructions cause a computer, a programmable data processing apparatus, and/or other devices to work in a specific manner.
- thus, the computer-readable medium storing the instructions includes an article of manufacture that includes instructions implementing various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
- computer-readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other device, so that a series of operational steps are performed on the computer, other programmable apparatus, or other device to produce a computer-implemented process, such that the instructions executed on the computer, other programmable apparatus, or other device implement the functions/actions specified in one or more blocks of the flowchart and/or block diagram.
- each block in the flowchart or block diagram may represent a module, a program segment, or a part of an instruction that contains one or more executable instructions for implementing a specified logical function.
- in some alternative implementations, the functions noted in the blocks may occur in a different order than that noted in the drawings. For example, two consecutive blocks may actually be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved.
- each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Health & Medical Sciences (AREA)
- Geometry (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Architecture (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Testing Of Coins (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Editing Of Facsimile Originals (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (31)
- An image processing method, comprising: determining a first region matching a target object in a first image; determining a deformation parameter based on a preset deformation effect, the deformation parameter being used to determine a position offset of each pixel of the target object produced according to the preset deformation effect; and performing deformation processing on the target object in the first image based on the deformation parameter to obtain a second image.
- The method according to claim 1, wherein determining the first region matching the target object in the first image comprises: forming a first grid corresponding to the target object, the first grid matching the first region.
- The method according to claim 1 or 2, wherein the deformation parameter is a deformation pixel matrix, and each parameter in the deformation pixel matrix is used to determine the position offset of a corresponding pixel of the target object produced according to the preset deformation effect.
- The method according to any one of claims 1-3, wherein determining the first region matching the target object in the first image comprises: determining positions of feature points of the target object in the first image; and determining the first region based on the relative positions between the feature points.
- The method according to any one of claims 1-4, wherein determining the deformation parameter based on the preset deformation effect comprises: determining whether a preset deformation template exists; and in the case that the preset deformation template exists, determining the deformation parameter based on the preset deformation template by using a preset algorithm.
- The method according to claim 5, wherein determining the deformation parameter based on the preset deformation effect further comprises: in the case that the preset deformation template does not exist, acquiring a second grid for image deformation, the second grid being a deformation grid; and determining the deformation parameter by using a preset algorithm and second positions of pixels in the second grid.
- The method according to claim 5 or 6, wherein determining whether the preset deformation template exists comprises: upon receiving a first instruction for performing a deformation operation based on a preset deformation image, determining that the preset deformation template exists.
- The method according to any one of claims 5-7, wherein determining the deformation parameter based on the preset deformation template comprises: determining a third grid matching a deformation region of the preset deformation template; taking the four vertices of each subgrid in the third grid as four control points, and determining, based on a preset algorithm and the four control points of each subgrid, a third position of each first pixel in the third grid before deformation; and determining the deformation parameter based on the third position of each first pixel before deformation and its corresponding fourth position in the third grid.
- The method according to claim 6, wherein determining the deformation parameter by using the preset algorithm and the second positions of second pixels in the second grid comprises: taking the four vertices of each subgrid in the second grid as four control points; determining, based on the preset algorithm, a first position of each second pixel in the second grid before deformation; and determining the deformation parameter based on the first position of each second pixel before deformation and the corresponding second position in the second grid.
- The method according to claim 6, wherein acquiring the second grid for performing image deformation comprises: receiving a deformation operation on the first region and acquiring the second grid based on the deformation operation; or acquiring a stored second grid.
- The method according to claim 3, further comprising: adjusting the dimensions of the deformation pixel matrix based on the number of pixels of the first region in the length direction and the number of pixels in the width direction, so that the pixels in the deformation pixel matrix correspond one-to-one to the pixels in the first region, and obtaining the second image by using the dimension-adjusted deformation pixel matrix.
- The method according to any one of claims 1-11, wherein performing deformation processing on the target object in the first image based on the deformation parameter to obtain the second image comprises: adjusting, by using a linear interpolation algorithm, the positions of corresponding pixels on the target object based on the position offset of each pixel in the deformation parameter.
- The method according to claim 4, wherein the target object includes a face, and determining the first region based on the relative positions between the feature points comprises: determining the center of the nose bridge as the center of the first region; determining the width of the first region based on the distance between the outer edges of the two eyes; and determining the height of the first region based on the distance between the center of the eyebrows and the tip of the nose.
- The method according to claim 2, wherein forming the first grid corresponding to the target object comprises: determining a first grid matching the position and size of the target object in the first image, the first grid being equally divided into a plurality of subgrids.
- An image processing apparatus, comprising: a first determining module configured to determine a first region matching a target object in a first image; a second determining module configured to determine a deformation parameter based on a preset deformation effect, the deformation parameter being used to determine a position offset of each pixel of the target object produced according to the preset deformation effect; and a deformation module configured to perform deformation processing on the target object in the first image based on the deformation parameter to obtain a second image.
- The apparatus according to claim 15, wherein the first determining module is further configured to determine the first region by forming a first grid corresponding to the target object, the first grid matching the first region.
- The apparatus according to claim 15 or 16, wherein the deformation parameter is a deformation pixel matrix, and each parameter in the deformation pixel matrix is used to determine the position offset of a corresponding pixel of the target object produced according to the preset deformation effect.
- The apparatus according to any one of claims 15-17, wherein the first determining module is further configured to determine positions of feature points of the target object in the first image, and determine the first region based on the relative positions between the feature points.
- The apparatus according to any one of claims 15-18, wherein the second determining module is further configured to determine whether a preset deformation template exists, and in the case that the preset deformation template exists, determine the deformation parameter based on the preset deformation template by using a preset algorithm.
- The apparatus according to claim 19, wherein the second determining module is further configured to, in the case that the preset deformation template does not exist, acquire a second grid for image deformation, and determine the deformation parameter by using a preset algorithm and second positions of pixels in the second grid, the second grid being a deformation grid.
- The apparatus according to claim 19 or 20, wherein the second determining module is further configured to, upon receiving a first instruction for performing a deformation operation based on a preset deformation image, determine that the preset deformation template exists.
- The apparatus according to any one of claims 19-21, wherein the second determining module is further configured to determine a third grid matching a deformation region of the preset deformation template, take the four vertices of each subgrid in the third grid as four control points, determine, based on a preset algorithm and the four control points of each subgrid, a third position of each first pixel in the third grid before deformation, and determine the deformation parameter based on the third position of each first pixel before deformation and its corresponding fourth position in the third grid.
- The apparatus according to claim 20, wherein the second determining module is further configured to take the four vertices of each subgrid in the second grid as four control points; determine, based on the preset algorithm, a first position of each second pixel in the second grid before deformation; and determine the deformation parameter based on the first position of each second pixel before deformation and the corresponding second position in the second grid.
- The apparatus according to claim 20, wherein the second determining module is further configured to receive a deformation operation on the first region and acquire the second grid based on the deformation operation, or acquire a stored second grid.
- The apparatus according to claim 17, wherein the second determining module is further configured to adjust the dimensions of the deformation pixel matrix based on the number of pixels of the first region in the length direction and the number of pixels in the width direction, so that the pixels in the deformation pixel matrix correspond one-to-one to the pixels in the first region; and the deformation module is further configured to obtain the second image by using the dimension-adjusted deformation pixel matrix.
- The apparatus according to any one of claims 15-25, wherein the deformation module is further configured to adjust, by using a linear interpolation algorithm, the positions of corresponding pixels on the target object based on the position offset of each pixel in the deformation parameter.
- The apparatus according to claim 18, wherein the first determining module is further configured to, when the target object includes a face, determine the center of the nose bridge as the center of the first region, determine the width of the first region based on the distance between the outer edges of the two eyes, and determine the height of the first region based on the distance between the center of the eyebrows and the tip of the nose.
- The apparatus according to claim 16, wherein the first determining module is further configured to determine a first grid matching the position and size of the target object in the first image, the first grid being equally divided into a plurality of subgrids.
- An electronic device, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to execute the image processing method according to any one of claims 1 to 14.
- A computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the image processing method according to any one of claims 1 to 14.
- A computer program, comprising computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device executes the method according to any one of claims 1-14.
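The face-region rule in the claims above (center at the nose bridge, width from the distance between the outer edges of the two eyes, height from the distance between the eyebrow center and the nose tip) can be sketched as follows. The landmark argument names and the axis-aligned rectangle output are illustrative assumptions, not part of the claims:

```python
import numpy as np

def face_region_from_landmarks(nose_bridge_center, left_eye_outer,
                               right_eye_outer, eyebrow_center, nose_tip):
    """Derive the first region as a rectangle from facial feature points:
    center = nose bridge center, width = distance between the outer eye
    edges, height = distance from the eyebrow center to the nose tip.
    (Landmark names are hypothetical; the claims only state the rule.)"""
    cx, cy = nose_bridge_center
    width = float(np.hypot(right_eye_outer[0] - left_eye_outer[0],
                           right_eye_outer[1] - left_eye_outer[1]))
    height = float(np.hypot(nose_tip[0] - eyebrow_center[0],
                            nose_tip[1] - eyebrow_center[1]))
    # Return (x, y, w, h) with (x, y) the top-left corner of the region.
    return (cx - width / 2.0, cy - height / 2.0, width, height)
```

With the eyes 80 px apart and the eyebrow-to-nose-tip distance 50 px, the region is an 80x50 rectangle centered on the nose bridge.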
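The control-point scheme in the claims (the four vertices of each subgrid act as control points, a per-pixel position offset is derived from them, and the image is deformed with linear interpolation) can be illustrated with a minimal numpy sketch. Bilinear blending of the corner displacements and the backward-warping convention (each output pixel samples the source at its own position plus its offset) are assumptions made here for illustration; the claims only require a preset algorithm and a linear interpolation algorithm:

```python
import numpy as np

def offsets_from_control_points(h, w, src_corners, dst_corners):
    """Bilinearly interpolate four control-point displacements over an
    h-by-w subgrid, giving a per-pixel (dy, dx) offset field (one entry
    of the deformation pixel matrix per pixel)."""
    # Displacement of each corner: pre-deformation position minus the
    # position in the deformed grid; corner order is tl, tr, bl, br.
    d = np.asarray(src_corners, float) - np.asarray(dst_corners, float)
    v = np.linspace(0.0, 1.0, h)[:, None, None]   # vertical blend weight
    u = np.linspace(0.0, 1.0, w)[None, :, None]   # horizontal blend weight
    top = (1 - u) * d[0] + u * d[1]
    bottom = (1 - u) * d[2] + u * d[3]
    return (1 - v) * top + v * bottom             # shape (h, w, 2)

def warp_with_offsets(image, offsets):
    """Warp a single-channel image: each output pixel samples the input
    at its own position plus the offset, using bilinear interpolation."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    sy = np.clip(ys + offsets[..., 0], 0, h - 1)
    sx = np.clip(xs + offsets[..., 1], 0, w - 1)
    y0, x0 = np.floor(sy).astype(int), np.floor(sx).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = sy - y0, sx - x0
    return ((1 - wy) * (1 - wx) * image[y0, x0]
            + (1 - wy) * wx * image[y0, x1]
            + wy * (1 - wx) * image[y1, x0]
            + wy * wx * image[y1, x1])
```

When the deformed grid equals the original grid the offsets are zero and the warp is the identity, which is a convenient sanity check for the scheme.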
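Claim 11 above adjusts the dimensions of the deformation pixel matrix so that its entries correspond one-to-one to the pixels of the first region. Bilinear resampling of the offset field, as sketched below, is one plausible way to do this; the claims do not fix the resampling algorithm, so this is an assumption:

```python
import numpy as np

def resize_offset_matrix(matrix, out_h, out_w):
    """Bilinearly resample a deformation pixel matrix of shape (H, W, 2)
    to (out_h, out_w, 2), matching the pixel counts of the first region
    in the length and width directions."""
    in_h, in_w = matrix.shape[:2]
    ys = np.linspace(0.0, in_h - 1, out_h)
    xs = np.linspace(0.0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]   # vertical interpolation weights
    wx = (xs - x0)[None, :, None]   # horizontal interpolation weights
    top = (1 - wx) * matrix[y0][:, x0] + wx * matrix[y0][:, x1]
    bottom = (1 - wx) * matrix[y1][:, x0] + wx * matrix[y1][:, x1]
    return (1 - wy) * top + wy * bottom
```

The resized matrix can then drive the per-pixel warp directly, since each entry now maps to exactly one pixel of the region.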
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11202010410WA SG11202010410WA (en) | 2018-07-04 | 2019-06-28 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
KR1020207030576A KR102442485B1 (ko) | 2018-07-04 | 2019-06-28 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
JP2020558500A JP7038853B2 (ja) | 2018-07-04 | 2019-06-28 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
US17/073,778 US11481975B2 (en) | 2018-07-04 | 2020-10-19 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810724309.3 | 2018-07-04 | ||
CN201810724309.3A CN109087238B (zh) | 2018-07-04 | 2018-07-04 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/073,778 Continuation US11481975B2 (en) | 2018-07-04 | 2020-10-19 | Image processing method and apparatus, electronic device, and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020007241A1 true WO2020007241A1 (zh) | 2020-01-09 |
Family
ID=64837358
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/093551 WO2020007241A1 (zh) | Image processing method and apparatus, electronic device, and computer-readable storage medium |
Country Status (6)
Country | Link |
---|---|
US (1) | US11481975B2 (zh) |
JP (1) | JP7038853B2 (zh) |
KR (1) | KR102442485B1 (zh) |
CN (1) | CN109087238B (zh) |
SG (1) | SG11202010410WA (zh) |
WO (1) | WO2020007241A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111768393A (zh) * | 2020-07-01 | 2020-10-13 | Shanghai SenseTime Intelligent Technology Co., Ltd. | Image processing method and apparatus, electronic device, and storage medium |
CN111901499A (zh) * | 2020-07-17 | 2020-11-06 | Qingdao Juhaolian Technology Co., Ltd. | Method and device for calculating the actual distance of pixels in a video image |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109087238B (zh) * | 2018-07-04 | 2021-04-23 | Beijing SenseTime Technology Development Co., Ltd. | Image processing method and apparatus, electronic device, and computer-readable storage medium |
CN109087239B (zh) * | 2018-07-25 | 2023-03-21 | Tencent Technology (Shenzhen) Co., Ltd. | Face image processing method, apparatus, and storage medium |
CN109934766B (zh) | 2019-03-06 | 2021-11-30 | Beijing SenseTime Technology Development Co., Ltd. | Image processing method and apparatus |
CN110443745B (zh) * | 2019-07-03 | 2024-03-19 | Ping An Technology (Shenzhen) Co., Ltd. | Image generation method and apparatus, computer device, and storage medium |
CN110856014B (zh) * | 2019-11-05 | 2023-03-07 | Beijing QIYI Century Science & Technology Co., Ltd. | Dynamic image generation method and apparatus, electronic device, and storage medium |
CN112767235A (zh) * | 2019-11-06 | 2021-05-07 | Tencent Technology (Shenzhen) Co., Ltd. | Image processing method and apparatus, computer-readable storage medium, and computer device |
CN110944230B (zh) * | 2019-11-21 | 2021-09-10 | Beijing Dajia Internet Information Technology Co., Ltd. | Method and apparatus for adding video special effects, electronic device, and storage medium |
CN112258386A (zh) * | 2020-10-26 | 2021-01-22 | Institute of Microelectronics, Chinese Academy of Sciences | Image deformation acceleration processing method and apparatus, electronic device, and readable storage medium |
CN112767288B (zh) * | 2021-03-19 | 2023-05-12 | Beijing SenseTime Technology Development Co., Ltd. | Image processing method and apparatus, electronic device, and storage medium |
CN114549597A (zh) * | 2021-12-24 | 2022-05-27 | Beijing Megvii Technology Co., Ltd. | Method, device, storage medium, and program product for matching a target object in an image |
CN114995738B (zh) * | 2022-05-31 | 2023-06-16 | Chongqing Changan Automobile Co., Ltd. | Transformation method and apparatus, electronic device, storage medium, and program product |
CN116341587B (zh) * | 2023-05-25 | 2023-09-26 | Beijing Unigroup Tsingteng Microsystems Co., Ltd. | Method and apparatus for barcode recognition, and barcode acquisition device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103605975A (zh) * | 2013-11-28 | 2014-02-26 | Xiaomi Inc. | Image processing method, apparatus, and terminal device |
CN104063842A (zh) * | 2014-05-30 | 2014-09-24 | Xiaomi Inc. | Method, apparatus, and terminal for processing images |
CN106548117A (zh) * | 2015-09-23 | 2017-03-29 | Tencent Technology (Shenzhen) Co., Ltd. | Face image processing method and apparatus |
US20180032797A1 (en) * | 2016-07-29 | 2018-02-01 | Samsung Electronics Co., Ltd. | Apparatus and method for processing a beauty effect |
CN109087238A (zh) * | 2018-07-04 | 2018-12-25 | Beijing SenseTime Technology Development Co., Ltd. | Image processing method and apparatus, electronic device, and computer-readable storage medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004234333A (ja) * | 2003-01-30 | 2004-08-19 | Nippon Hoso Kyokai <Nhk> | Image deformation information generation method, apparatus, and program |
US9336314B2 (en) * | 2010-12-29 | 2016-05-10 | Microsoft Technology Licensing, Llc | Dynamic facet ordering for faceted search |
CN104063890A (zh) * | 2013-03-22 | 2014-09-24 | China Mobile Group Fujian Co., Ltd. | Face cartoon animation imaging method and system |
CN103824253B (zh) * | 2014-02-19 | 2017-01-18 | Sun Yat-sen University | Facial feature deformation method based on locally accurate image deformation |
JP6143199B2 (ja) * | 2015-03-18 | 2017-06-07 | Casio Computer Co., Ltd. | Image correction apparatus, image correction method, and program |
US9646195B1 (en) * | 2015-11-11 | 2017-05-09 | Adobe Systems Incorporated | Facial feature liquifying using face mesh |
-
2018
- 2018-07-04 CN CN201810724309.3A patent/CN109087238B/zh active Active
-
2019
- 2019-06-28 JP JP2020558500A patent/JP7038853B2/ja active Active
- 2019-06-28 KR KR1020207030576A patent/KR102442485B1/ko active IP Right Grant
- 2019-06-28 WO PCT/CN2019/093551 patent/WO2020007241A1/zh active Application Filing
- 2019-06-28 SG SG11202010410WA patent/SG11202010410WA/en unknown
-
2020
- 2020-10-19 US US17/073,778 patent/US11481975B2/en active Active
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111768393A (zh) * | 2020-07-01 | 2020-10-13 | Shanghai SenseTime Intelligent Technology Co., Ltd. | Image processing method and apparatus, electronic device, and storage medium |
CN111901499A (zh) * | 2020-07-17 | 2020-11-06 | Qingdao Juhaolian Technology Co., Ltd. | Method and device for calculating the actual distance of pixels in a video image |
CN111901499B (zh) * | 2020-07-17 | 2022-04-01 | Qingdao Juhaolian Technology Co., Ltd. | Method and device for calculating the actual distance of pixels in a video image |
Also Published As
Publication number | Publication date |
---|---|
CN109087238A (zh) | 2018-12-25 |
SG11202010410WA (en) | 2020-11-27 |
JP7038853B2 (ja) | 2022-03-18 |
CN109087238B (zh) | 2021-04-23 |
JP2021518956A (ja) | 2021-08-05 |
US20210035362A1 (en) | 2021-02-04 |
KR20200135492A (ko) | 2020-12-02 |
US11481975B2 (en) | 2022-10-25 |
KR102442485B1 (ko) | 2022-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020007241A1 (zh) | Image processing method and apparatus, electronic device, and computer-readable storage medium | |
WO2020224457A1 (zh) | Image processing method and apparatus, electronic device, and storage medium | |
WO2020135529A1 (zh) | Pose estimation method and apparatus, electronic device, and storage medium | |
JP6134446B2 (ja) | Image segmentation method, image segmentation apparatus, image segmentation device, program, and recording medium | |
US9959484B2 (en) | Method and apparatus for generating image filter | |
WO2020155711A1 (zh) | Image generation method and apparatus, electronic device, and storage medium | |
WO2016011747A1 (zh) | Skin color adjustment method and apparatus | |
CN110321048B (zh) | Three-dimensional panoramic scene information processing and interaction method and apparatus | |
WO2017031901A1 (zh) | Face recognition method and apparatus, and terminal | |
US20210256672A1 (en) | Method, electronic device and storage medium for processing image | |
CN109325908B (zh) | Image processing method and apparatus, electronic device, and storage medium | |
CN107341777B (zh) | Picture processing method and apparatus | |
CN110889382A (zh) | Avatar rendering method and apparatus, electronic device, and storage medium | |
CN109840939B (zh) | Three-dimensional reconstruction method and apparatus, electronic device, and storage medium | |
TWI718631B (zh) | Face image processing method and apparatus, electronic device, and storage medium | |
CN106648063B (zh) | Gesture recognition method and apparatus | |
US11310443B2 (en) | Video processing method, apparatus and storage medium | |
CN114025105B (zh) | Video processing method and apparatus, electronic device, and storage medium | |
CN110782532B (zh) | Image generation method, generation apparatus, electronic device, and storage medium | |
EP3905660A1 (en) | Method and device for shooting image, and storage medium | |
WO2022193466A1 (zh) | Image processing method and apparatus, electronic device, and storage medium | |
WO2022134390A1 (zh) | Annotation method and apparatus, electronic device, and storage medium | |
WO2022193507A1 (zh) | Image processing method and apparatus, device, storage medium, program, and program product | |
WO2015196715A1 (zh) | Image retargeting method and apparatus, and terminal | |
US9665925B2 (en) | Method and terminal device for retargeting images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19830653 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020558500 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20207030576 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19830653 Country of ref document: EP Kind code of ref document: A1 |