WO2020156058A1 - Method and device for implementing drag-based image deformation on a terminal - Google Patents

Method and device for implementing drag-based image deformation on a terminal

Info

Publication number
WO2020156058A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
drag
image
deformed
terminal
Prior art date
Application number
PCT/CN2020/070738
Other languages
English (en)
Chinese (zh)
Inventor
倪光耀
杨辉
Original Assignee
北京字节跳动网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字节跳动网络技术有限公司
Publication of WO2020156058A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Definitions

  • the present disclosure relates to the field of image processing technology, and in particular to a method, device, electronic device, and computer-readable storage medium for implementing terminal-based image drag deformation.
  • image deformation is a common method in image processing, which refers to changing one image into another according to certain rules or methods. It can be used for entertainment, face deformation, and the like. For example, for a given face image, it can produce complex expression changes such as happiness, anger, sorrow, and joy.
  • an embodiment of the present disclosure provides a terminal-based image drag deformation implementation method, which includes: determining a drag point in the image to be deformed displayed on the terminal screen; dragging and deforming the image to be deformed according to a trigger operation on the drag point; and displaying the deformed image on the terminal screen.
  • determining the drag point in the image to be deformed displayed on the terminal screen includes:
  • the selected candidate drag point is used as the drag point.
  • the dragging and deforming the image to be deformed according to the trigger operation on the drag point includes:
  • the receiving a trigger operation on the drag point includes:
  • a second trigger operation on the drag point is received, and the end position of the drag point is determined according to the second trigger operation on the drag point.
  • the method further includes:
  • the drag point is in the first form after the first trigger operation, and the first form is displayed on the terminal screen; and/or,
  • the drag point is in the second form after the second triggering operation, and the second form is displayed on the terminal screen.
  • the determining the position of the drag point according to the trigger operation includes:
  • the position of the drag point mapped to the image to be deformed is determined.
  • the determining the position of the drag point mapped to the image to be deformed according to the position mapping relationship between the source key point on the template image and the target key point on the image to be deformed includes:
  • the location of the drag point mapped to the image to be deformed is determined according to the transformation matrix and the translation vector.
  • the determining the transformation matrix and translation vector of the drag point according to the position of the drag point on the template image, the position of the source key point, and the position of the target key point includes:
  • the transformation matrix and translation vector of the drag point are determined according to the weight, the position of the source key point, and the position of the target key point.
  • the dragging and deforming the image to be deformed according to the starting point position and the ending point position includes:
  • the determining the target position of the grid point according to the original position of the grid point, the starting point position and the ending point position includes:
  • the target position is determined according to the transformation matrix and the translation vector.
  • the determining the transformation matrix and translation vector of the grid point according to the original position of the grid point, the starting point position and the ending point position includes:
  • the transformation matrix and translation vector of the grid point are determined according to the weight, the start position and the end position.
  • the determining the transformation matrix and translation vector of the grid point according to the weight, the starting point position and the ending point position includes:
  • the transformation matrix and translation vector of the grid point are determined according to the optimized weight, the starting point position and the ending point position.
  • the image to be deformed is a video face image.
  • embodiments of the present disclosure provide a terminal-based image drag deformation implementation device, including: a drag point determination module, used to determine a drag point in the image to be deformed displayed on the terminal screen; a drag deformation module, used to drag and deform the image according to the trigger operation on the drag point; and an image display module, used to display the deformed image on the terminal screen.
  • the drag point determination module includes:
  • a drag point display unit configured to display at least one candidate drag point contained in the image on the terminal screen
  • the drag point selection unit is configured to use the selected candidate drag point as the drag point according to the selection operation generated on the terminal screen.
  • the drag deformation module includes:
  • a receiving unit configured to receive a trigger operation on the drag point
  • a drag point determining unit configured to determine a position of the drag point according to the trigger operation, the position including a starting point position and an ending point position;
  • the drag deformation unit is configured to drag and deform the image to be deformed according to the start point position and the end point position.
  • the receiving unit is specifically configured to: receive a first trigger operation on the drag point, and determine the start position of the drag point according to the first trigger operation on the drag point; and receive a second trigger operation on the drag point, and determine the end position of the drag point according to the second trigger operation on the drag point.
  • the image display module is specifically configured to: display the first form on the terminal screen when the drag point is in the first form after the first trigger operation; and/or display the second form on the terminal screen when the drag point is in the second form after the second trigger operation.
  • the drag point determination unit includes:
  • a drag point determination subunit configured to determine a template image corresponding to the image according to the first trigger operation of the drag point, and determine the drag point on the template image
  • the drag point position determination subunit is used to determine the position where the drag point is mapped to the image to be deformed according to the position mapping relationship between the source key point on the template image and the target key point on the image to be deformed .
  • the drag point position determination subunit is specifically configured to: determine the transformation matrix and translation vector of the drag point according to the position of the drag point on the template image, the position of the source key point, and the position of the target key point; and determine the location of the drag point mapped to the image to be deformed according to the transformation matrix and the translation vector.
  • the drag point position determining subunit is specifically configured to: determine the weight between the drag point and the source key point according to the position of the drag point on the template image and the position of the source key point; and determine the transformation matrix and translation vector of the drag point according to the weight, the position of the source key point, and the position of the target key point.
  • the drag deformation unit includes:
  • a grid processing subunit for performing grid processing on the image to be deformed to obtain at least one grid point
  • the target position determining subunit is configured to determine the target position of the grid point according to the original position of the grid point, the start position and the end position;
  • the drag deformation sub-unit is used to drag and deform the image to be deformed according to the target position, and display the deformed image.
  • the target position determining subunit is specifically configured to: determine the transformation matrix and translation vector of the grid point according to the original position of the grid point, the start position and the end position; and determine the target position according to the transformation matrix and the translation vector.
  • the target position determining subunit is specifically configured to: determine the weight between the grid point and the starting point of the drag point according to the original position of the grid point and the starting point position; and determine the transformation matrix and translation vector of the grid point according to the weight, the start position and the end position.
  • the target position determining subunit is specifically configured to: perform optimization processing on the weight; and determine the transformation matrix and translation vector of the grid point according to the optimized weight, the start position and the end position.
  • the image to be deformed is a video face image.
  • embodiments of the present disclosure provide an electronic device, including: at least one processor; and a memory storing instructions executable by the at least one processor, so that the device can execute any of the terminal-based image drag deformation implementation methods described in the foregoing first aspect.
  • embodiments of the present disclosure provide a non-transitory computer-readable storage medium, characterized in that the non-transitory computer-readable storage medium stores computer instructions, and the computer instructions are used to cause a computer to execute any of the terminal-based image drag deformation implementation methods described in the foregoing first aspect.
  • the present disclosure discloses a method, a device, an electronic device and a computer-readable storage medium for implementing terminal-based image drag deformation.
  • the method for implementing terminal-based image drag deformation includes: determining a drag point in the image to be deformed displayed on the terminal screen; dragging and deforming the image according to the trigger operation on the drag point; and displaying the deformed image on the terminal screen.
  • a drag point is determined in the image to be deformed displayed on the terminal screen, the image is dragged and deformed according to the trigger operation of the drag point, and the deformed image is displayed on the terminal screen.
  • in the prior art, image deformation cannot be freely controlled by the user, the deformation effect that the user wants cannot be achieved, and controllability is poor.
  • FIG. 1a is a flowchart of a method for implementing a terminal-based image drag deformation provided by an embodiment of the disclosure
  • FIG. 1b is a flowchart of a method for implementing a terminal-based image dragging deformation according to another embodiment of the present disclosure
  • FIG. 1c is a schematic diagram of the implementation of a drag point to be selected in the method for implementing a terminal-based image drag deformation provided by an embodiment of the disclosure
  • FIG. 1d is a flowchart of a method for implementing a terminal-based image drag and deformation according to another embodiment of the present disclosure
  • FIG. 1e is a flowchart of a method for implementing a terminal-based image dragging deformation provided by another embodiment of the present disclosure
  • FIG. 2a is a schematic structural diagram of a terminal-based image drag deformation implementation device provided by an embodiment of the disclosure
  • FIG. 2b is a schematic structural diagram of a terminal-based image drag deformation implementation device provided by an embodiment of the disclosure
  • FIG. 2c is a schematic structural diagram of a terminal-based image drag deformation implementation device provided by another embodiment of the present disclosure
  • Fig. 3 is a schematic structural diagram of an electronic device provided according to an embodiment of the present disclosure.
  • Figure 1a is a flowchart of a method for implementing terminal-based image drag deformation provided by an embodiment of the present disclosure.
  • the method for implementing terminal-based image drag deformation provided in this embodiment can be executed by a terminal-based image drag deformation implementation device. The device can be implemented as software, or as a combination of software and hardware, and can be integrated in a certain device of a terminal-based image drag deformation implementation system, such as a terminal-based image drag deformation implementation server or a terminal device. As shown in Figure 1a, the method includes the following steps:
  • Step S1 Determine the drag point in the image to be deformed displayed on the terminal screen
  • the terminal may be a mobile terminal (for example, a smartphone, an iPhone, or a tablet computer) or a fixed terminal (for example, a desktop computer)
  • the image to be deformed may be a video face image.
  • a face image can be collected as the image to be deformed through the camera function of the terminal device.
  • Step S2 Drag and deform the image to be deformed according to the trigger operation on the drag point
  • the trigger operation includes, but is not limited to, a single-click operation, a double-click operation, and a drag operation.
  • Step S3 Display the deformed image on the terminal screen.
  • a drag point is determined in the image to be deformed displayed on the terminal screen, the image is dragged and deformed according to the trigger operation on the drag point, and the deformed image is displayed on the terminal screen, which solves the technical problem in the prior art that image deformation cannot be freely controlled by the user, the deformation effect that the user wants cannot be achieved, and controllability is poor.
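  • As a concrete illustration of the S1-S3 flow described above, the following minimal Python sketch wires the three steps together; all function and type names are illustrative assumptions rather than names from this publication, and the deformation itself is left as a placeholder (the key-point mapping and grid warping are sketched further below).

```python
# Minimal control-flow sketch of steps S1-S3. Names and signatures are
# assumptions for illustration; the publication does not define an API.
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np

Point = Tuple[float, float]

@dataclass
class Drag:
    start: Point  # position fixed by the first trigger operation
    end: Point    # position fixed by the second trigger operation

def determine_drag_point(candidates: List[Point], selected: int) -> Point:
    """S1: the user picks one of the candidate drag points shown on screen."""
    return candidates[selected]

def drag_deform(image: np.ndarray, drag: Drag) -> np.ndarray:
    """S2: deform the image according to the drag; a real implementation would
    move grid points using per-point transformation matrices and translation
    vectors (see the grid-warping sketch further below)."""
    return image.copy()  # placeholder: identity "deformation"

def display(image: np.ndarray) -> None:
    """S3: hand the deformed frame to the terminal's rendering layer."""
    pass

frame = np.zeros((480, 640, 3), dtype=np.uint8)            # e.g. one video face frame
point = determine_drag_point([(100.0, 120.0)], 0)           # S1
display(drag_deform(frame, Drag(point, (130.0, 150.0))))    # S2 + S3
```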
  • step S1 includes:
  • Step S11 Display at least one candidate drag point included in the image to be deformed on the terminal screen
  • the white dots are candidate drag points included in the image to be deformed.
  • Step S12 According to the selection operation generated on the terminal screen, the selected candidate drag point is used as the drag point.
  • step S2 includes:
  • Step S21 Receive a trigger operation on the drag point
  • Step S22 Determine the position of the drag point according to the trigger operation, where the position includes a starting point and an ending point;
  • Step S23 drag and deform the image to be deformed according to the start position and the end position.
  • the pixel point of the image to be deformed corresponding to the drag point is moved from the start position to the end position, so as to realize the image deformation.
  • step S21 includes:
  • Step S211 Receive a first trigger operation on the drag point, and determine the starting point of the drag point according to the first trigger operation on the drag point;
  • Step S212 Receive a second trigger operation on the drag point, and determine the end position of the drag point according to the second trigger operation on the drag point.
  • the first trigger operation precedes the second trigger operation in time sequence.
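  • A minimal sketch of how steps S211/S212 might be handled on a touch screen is given below; treating the first trigger operation as a touch-down on the drag point and the second as the release at the end of the drag, as well as the handler names, are assumptions for illustration, since the publication only requires that the first operation fixes the start position and the second fixes the end position.

```python
# Sketch of steps S211/S212: the first trigger operation records the start
# position of the drag point, the second records the end position.
from typing import Optional, Tuple

Point = Tuple[float, float]

class DragPointTracker:
    def __init__(self) -> None:
        self.start: Optional[Point] = None
        self.end: Optional[Point] = None

    def on_first_trigger(self, x: float, y: float) -> None:
        """E.g. a touch-down or tap on the drag point: record the start position."""
        self.start = (x, y)

    def on_second_trigger(self, x: float, y: float) -> None:
        """E.g. the release at the end of the drag: record the end position."""
        self.end = (x, y)

    def ready(self) -> bool:
        """Both positions are known, so the deformation can be applied."""
        return self.start is not None and self.end is not None

tracker = DragPointTracker()
tracker.on_first_trigger(100.0, 120.0)
tracker.on_second_trigger(130.0, 150.0)
assert tracker.ready()
```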
  • step S22 includes:
  • Step S221 Determine a template image corresponding to the image to be deformed according to the trigger operation, and determine the drag point on the template image;
  • the drag point includes a drag start point and a drag end point.
  • this step includes: obtaining a template image corresponding to the drag start point according to the first trigger operation, and obtaining a template image corresponding to the drag end point according to the second trigger operation.
  • Step S222 According to the position mapping relationship between the source key point on the template image and the target key point on the image to be deformed, determine the position where the drag point is mapped to the image to be deformed.
  • the template image is an image preset to be associated with the image to be deformed, such as a frontal image of a human face. Any pixel on the template image can be selected as the drag point.
  • the source key points are pixels on the template image, and specifically may be all pixels on the template image, or a selected preset number of pixels.
  • the target key point is a key point on the image to be deformed, and specifically may be all pixels on the image to be deformed, or a selected preset number of pixels.
  • the numbers of the source key points and the target key points are the same, and they are pixels with the same features contained in both images.
  • the key points can be customized by the user or obtained through machine learning algorithms.
  • the specific number of pixels can be customized, for example, 106. When the key points are a selected preset number of pixels, the amount of calculation can be effectively reduced.
  • step S222 includes:
  • the location of the drag point mapped to the image to be deformed is determined according to the transformation matrix and the translation vector.
  • the step of determining the transformation matrix and translation vector of the drag point according to the position of the drag point, the position of the source key point, and the position of the target key point includes:
  • the transformation matrix and translation vector of the drag point are determined according to the weight, the position of the source key point, and the position of the target key point.
  • the weight is calculated according to a preset weight formula; a further formula can then be used to optimize the weight, and the optimized weight is used to locate the position of the drag point. The optimization formula involves an optimized weight coefficient and an optimized weight offset.
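  • The weight and mapping formulas themselves are not reproduced in this text, but the description (a weight between the drag point and each source key point, followed by a transformation matrix and translation vector that map the point onto the image to be deformed) matches an affine moving-least-squares mapping such as the one in the non-patent literature cited in this record. The sketch below is one such reading; the inverse-squared-distance weight and the small regularization term are assumptions.

```python
# One plausible realization of "weights + transformation matrix + translation
# vector": affine moving-least-squares (MLS) mapping from source key points on
# the template image to target key points on the image to be deformed.
import numpy as np

def mls_affine_map(v, src_pts, dst_pts, alpha=1.0, eps=1e-8):
    """Map point v (e.g. the drag point on the template image) to the image to be deformed.

    v       : (2,) point on the template image
    src_pts : (N, 2) source key points on the template image
    dst_pts : (N, 2) target key points on the image to be deformed
    returns : (2,) position of v mapped onto the image to be deformed
    """
    v = np.asarray(v, dtype=float)
    p = np.asarray(src_pts, dtype=float)
    q = np.asarray(dst_pts, dtype=float)

    # Weight of v with respect to each source key point: closer key points dominate.
    w = 1.0 / (np.sum((p - v) ** 2, axis=1) ** alpha + eps)

    # Weighted centroids of source and target key points.
    p_star = (w[:, None] * p).sum(axis=0) / w.sum()
    q_star = (w[:, None] * q).sum(axis=0) / w.sum()
    p_hat, q_hat = p - p_star, q - q_star

    # Transformation matrix M and translation vector t of the affine map
    # f(x) = (x - p_star) @ M + q_star = x @ M + t.
    A = (w[:, None, None] * p_hat[:, :, None] * p_hat[:, None, :]).sum(axis=0)
    B = (w[:, None, None] * p_hat[:, :, None] * q_hat[:, None, :]).sum(axis=0)
    M = np.linalg.solve(A, B)
    t = q_star - p_star @ M
    return v @ M + t

# Example: map a drag point chosen on the template face onto the live face image.
src = np.array([[10.0, 10.0], [100.0, 12.0], [55.0, 90.0], [50.0, 40.0]])
dst = src + np.array([5.0, -3.0])              # target key points (here: a pure shift)
print(mls_affine_map([52.0, 45.0], src, dst))  # -> [57. 42.]
```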
  • step S23 includes:
  • Step S231 Perform gridding processing on the image to be deformed to obtain at least one grid point
  • Step S232 Determine the target position of the grid point according to the original position of the grid point, the start position and the end position;
  • Step S233 Drag and deform the image to be deformed according to the target position, and display the deformed image.
  • step S232 includes:
  • the target position is determined according to the transformation matrix and the translation vector.
  • the step of determining the transformation matrix and translation vector of the grid point according to the original position of the grid point, the start position and the end position includes:
  • the transformation matrix and translation vector of the grid point are determined according to the weight, the start position and the end position.
  • the step of determining the transformation matrix and translation vector of the grid point according to the weight, the starting point position and the ending point position includes:
  • the transformation matrix and translation vector of the grid point are determined according to the optimized weight, the starting point position and the ending point position.
  • the weight is calculated according to a preset weight formula; the weight is then optimized, and the optimization formula involves an optimized weight coefficient and an optimized weight offset.
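  • A hedged sketch of the grid-based deformation follows: the image is divided into grid points, each grid point is moved according to the drag start and end positions, and the image is resampled. It reuses the mls_affine_map helper from the previous sketch; the choice of the image corners as fixed anchor points and the use of OpenCV's remap for resampling are assumptions made only for illustration.

```python
# Grid-based warping sketch. mls_affine_map() is the helper defined in the
# earlier key-point mapping sketch; anchors and resampling strategy are
# illustrative assumptions, not details from the publication.
import cv2
import numpy as np

def drag_warp(image, drag_start, drag_end, grid_step=20):
    h, w = image.shape[:2]

    # Control points: the dragged point moves, the image corners stay fixed.
    anchors = np.array([[0, 0], [w - 1, 0], [0, h - 1], [w - 1, h - 1]], dtype=float)
    src = np.vstack([np.asarray(drag_start, dtype=float)[None, :], anchors])
    dst = np.vstack([np.asarray(drag_end, dtype=float)[None, :], anchors])

    # Grid points at their original positions.
    gy, gx = np.mgrid[0:h:grid_step, 0:w:grid_step]
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1).astype(float)

    # Target position of every grid point. For backward resampling we map
    # output grid points back into the source image, i.e. we swap src and dst.
    mapped = np.array([mls_affine_map(v, dst, src) for v in grid])

    # Densify the sparse grid maps to per-pixel maps and resample the image.
    map_x = cv2.resize(mapped[:, 0].reshape(gy.shape).astype(np.float32), (w, h))
    map_y = cv2.resize(mapped[:, 1].reshape(gy.shape).astype(np.float32), (w, h))
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)

# Example: drag a point on a face frame 30 px to the right and 20 px down.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
warped = drag_warp(frame, drag_start=(320, 240), drag_end=(350, 260))
```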
  • the method further includes:
  • the drag point is in the first form after the first trigger operation, and the first form is displayed on the terminal screen; and/or,
  • the drag point is in the second form after the second triggering operation, and the second form is displayed on the terminal screen.
  • the first form and the second form are different states, and can be represented by different colors or different shapes.
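  • As a small illustration of the first form / second form idea, the drag point's visual state can simply be keyed off which trigger operations have occurred; the concrete colors below are arbitrary assumptions, since the publication only requires the two forms to be visually distinct.

```python
# Illustrative only: represent the drag point's "form" (here, a color) as a
# function of which trigger operations have already happened.
def drag_point_color(after_first_trigger: bool, after_second_trigger: bool) -> str:
    if after_second_trigger:
        return "green"   # second form, e.g. drag finished
    if after_first_trigger:
        return "red"     # first form, e.g. drag in progress
    return "white"       # candidate drag point, not yet triggered

print(drag_point_color(True, False))   # -> "red"
```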
  • Figure 2a is a schematic structural diagram of a terminal-based image drag deformation implementation device provided by an embodiment of the disclosure.
  • the terminal-based image drag deformation implementation device can be implemented as software or as a combination of software and hardware.
  • the image drag deformation realization device may be integrated in a device in a terminal-based image drag deformation realization system, such as a terminal-based image drag deformation realization server or a terminal-based image drag deformation realization terminal device.
  • the device includes: a drag point determination module 21, a drag deformation module 22, and an image display module 23, wherein:
  • the drag point determination module 21 is used to determine a drag point in the image to be deformed displayed on the terminal screen; the drag deformation module 22 is used to drag and deform the image according to the trigger operation on the drag point; and the image display module 23 is used to display the deformed image on the terminal screen.
  • the terminal may be a mobile terminal (for example, a smartphone, an iPhone, or a tablet computer) or a fixed terminal (for example, a desktop computer)
  • the image to be deformed may be a video face image.
  • a face image can be collected as the image to be deformed through the camera function of the terminal device.
  • the trigger operation includes, but is not limited to, a single-click operation, a double-click operation, and a drag operation.
  • the drag point determination module 21 includes: a drag point display unit 211 and a drag point selection unit 212; wherein,
  • the drag point display unit 211 is configured to display at least one candidate drag point included in the image on the terminal screen;
  • the drag point selection unit 212 is configured to use the selected candidate drag point as the drag point according to the selection operation generated on the terminal screen.
  • the drag deformation module 22 includes: a receiving unit 221, a drag point determination unit 222, and a drag deformation unit 223; wherein,
  • the receiving unit 221 is configured to receive a trigger operation on the drag point
  • the drag point determination unit 222 is configured to determine the position of the drag point according to the trigger operation, where the position includes a starting point and an ending point;
  • the drag deformation unit 223 is configured to drag and deform the image to be deformed according to the start position and the end position.
  • the receiving unit 221 is specifically configured to: receive a first trigger operation on the drag point, and determine the start position of the drag point according to the first trigger operation on the drag point; and receive a second trigger operation on the drag point, and determine the end position of the drag point according to the second trigger operation on the drag point.
  • the image display module 23 is specifically configured to: display the first form on the terminal screen when the drag point is in the first form after the first trigger operation; and/or display the second form on the terminal screen when the drag point is in the second form after the second trigger operation.
  • the drag point determination unit 222 includes: a drag point determination subunit and a drag point position determination subunit; wherein,
  • the drag point determination subunit is configured to determine a template image corresponding to the image according to the first trigger operation of the drag point, and determine the drag point on the template image;
  • the drag point position determination subunit is used to determine the position of the drag point mapped to the image to be deformed according to the position mapping relationship between the source key point on the template image and the target key point on the image to be deformed.
  • the drag point position determination subunit is specifically configured to: determine the transformation matrix and translation vector of the drag point according to the position of the drag point on the template image, the position of the source key point, and the position of the target key point; and determine the location of the drag point mapped to the image to be deformed according to the transformation matrix and the translation vector.
  • the drag point position determining subunit is specifically configured to: determine the weight between the drag point and the source key point according to the position of the drag point on the template image and the position of the source key point; and determine the transformation matrix and translation vector of the drag point according to the weight, the position of the source key point, and the position of the target key point.
  • the drag deformation unit 223 includes: a grid processing subunit, a target position determination subunit, and a drag deformation subunit; wherein,
  • the grid processing subunit is used to perform grid processing on the image to be deformed to obtain at least one grid point;
  • the target position determining subunit is used to determine the target position of the grid point according to the original position of the grid point, the start position and the end position;
  • the drag deformation sub-unit is used to drag and deform the image to be deformed according to the target position, and display the deformed image.
  • the target position determining subunit is specifically configured to: determine the transformation matrix and translation vector of the grid point according to the original position of the grid point, the start position and the end position; and determine the target position according to the transformation matrix and the translation vector.
  • the target position determining subunit is specifically configured to: determine the weight between the grid point and the starting point of the drag point according to the original position of the grid point and the starting point position; and determine the transformation matrix and translation vector of the grid point according to the weight, the start position and the end position.
  • the target position determining subunit is specifically configured to: perform optimization processing on the weight; and determine the transformation matrix and translation vector of the grid point according to the optimized weight, the start position and the end position.
  • FIG. 3 shows a schematic structural diagram of an electronic device suitable for implementing embodiments of the present disclosure.
  • the electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablets), PMPs (portable multimedia players), vehicle-mounted terminals (such as Mobile terminals such as car navigation terminals) and fixed terminals such as digital TVs, desktop computers, etc.
  • the electronic device shown in FIG. 3 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present disclosure.
  • the electronic device may include a processing device (such as a central processing unit, a graphics processor, etc.) 301, which may execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage device 308 into a random access memory (RAM) 303.
  • in the RAM 303, various programs and data required for the operation of the electronic device are also stored.
  • the processing device 301, the ROM 302, and the RAM 303 are connected to each other through a bus 304.
  • An input/output (I/O) interface 305 is also connected to the bus 304.
  • the following devices can be connected to the I/O interface 305: input devices 306 such as a touch screen, touch pad, keyboard, mouse, image sensor, microphone, accelerometer, and gyroscope; output devices 307 such as a liquid crystal display (LCD), a speaker, and a vibrator; storage devices 308 such as a magnetic tape and a hard disk; and a communication device 309.
  • the communication device 309 may allow the electronic device to perform wireless or wired communication with other devices to exchange data.
  • although FIG. 3 shows an electronic device with various devices, it should be understood that it is not required to implement or have all the devices shown; more or fewer devices may alternatively be implemented or provided.
  • the process described above with reference to the flowchart can be implemented as a computer software program.
  • the embodiments of the present disclosure include a computer program product, which includes a computer program carried on a computer-readable medium, and the computer program contains program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from the network through the communication device 309, or installed from the storage device 308, or installed from the ROM 302.
  • when the computer program is executed by the processing device 301, the above-mentioned functions defined in the method of the embodiment of the present disclosure are executed.
  • the aforementioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: electrical connections with one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • the computer-readable signal medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device .
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or it may exist alone without being assembled into the electronic device.
  • the above-mentioned computer-readable medium carries one or more programs.
  • when the above-mentioned one or more programs are executed by the electronic device, the electronic device is caused to: determine the position of a drag point in the image to be deformed, where the drag point includes a drag start point and a drag end point; and drag and deform the image to be deformed according to the position of the drag start point and the position of the drag end point.
  • the computer program code used to perform the operations of the present disclosure may be written in one or more programming languages or a combination thereof.
  • the above-mentioned programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, executed as an independent software package, partly on the user's computer and partly executed on a remote computer, or entirely executed on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagram can represent a module, program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for realizing the specified logical function.
  • the functions marked in the block may also occur in a different order from the order marked in the drawings. For example, two blocks shown in succession can actually be executed substantially in parallel, or they can sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagram and/or flowchart, and the combination of blocks in the block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments described in the present disclosure can be implemented in software or hardware. Among them, the name of the unit does not constitute a limitation on the unit itself under certain circumstances.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides a method and device for implementing drag-based image deformation on a terminal, an electronic device, and a computer-readable storage medium. The method for implementing drag-based image deformation on a terminal includes the following steps: determining a drag point in an image to be deformed displayed on a terminal screen (S1); dragging and deforming the image according to a trigger operation on the drag point; and displaying the deformed image on the terminal screen (S3). The method solves the technical problem in the prior art that image deformation is poorly controllable, so that a user is unable to freely manipulate an image to obtain the desired deformation effect.
PCT/CN2020/070738 2019-01-31 2020-01-07 Method and device for implementing drag-based image deformation on a terminal WO2020156058A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910100528.9A CN110069191B (zh) 2019-01-31 2019-01-31 Terminal-based image drag deformation implementation method and device
CN201910100528.9 2019-01-31

Publications (1)

Publication Number Publication Date
WO2020156058A1 true WO2020156058A1 (fr) 2020-08-06

Family

ID=67366116

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/070738 WO2020156058A1 (fr) 2019-01-31 2020-01-07 Procédé et dispositif de réalisation de déformation d'image basée sur glissement sur un terminal

Country Status (2)

Country Link
CN (1) CN110069191B (fr)
WO (1) WO2020156058A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069191B (zh) * 2019-01-31 2021-03-30 北京字节跳动网络技术有限公司 Terminal-based image drag deformation implementation method and device
CN110837332A (zh) * 2019-11-13 2020-02-25 北京字节跳动网络技术有限公司 Face image deformation method and device, electronic device, and computer-readable medium
CN111199512B (zh) * 2019-12-24 2023-08-15 远光软件股份有限公司 SVG vector graphics adjustment method and device, storage medium, and terminal
CN113986105B (zh) * 2020-07-27 2024-05-31 北京达佳互联信息技术有限公司 Face image deformation method and device, electronic device, and storage medium
CN112258653A (zh) * 2020-10-28 2021-01-22 北京字跳网络技术有限公司 Elastic object rendering method, device, equipment, and storage medium


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100787977B1 (ko) * 2006-03-30 2007-12-24 삼성전자주식회사 Apparatus and method for adjusting the size of user data in a mobile terminal
CN104077798B (zh) * 2014-07-01 2017-05-03 中国科学技术大学 Highly realistic animation synthesis method for deformable objects
JP2016207197A (ja) * 2015-04-15 2016-12-08 株式会社ウェブサービス・ディベロップメント Information processing device, information processing method, and information processing program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080192049A1 (en) * 2007-02-13 2008-08-14 Thomas Schiwietz Image deformation using physical models
CN105184735A (zh) * 2014-06-19 2015-12-23 腾讯科技(深圳)有限公司 Portrait deformation method and device
CN109003224A (zh) * 2018-07-27 2018-12-14 北京微播视界科技有限公司 Face-based deformed image generation method and device
CN110069191A (zh) * 2019-01-31 2019-07-30 北京字节跳动网络技术有限公司 Terminal-based image drag deformation implementation method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DU XIAORONG , PING SHUWEN ,ZHANG YONG : "Graphics and Image Local Deformation Based on Moving Least Squares Method", JOURNAL OF SYSTEM SIMULATION, vol. 27, no. 4, 30 April 2015 (2015-04-30), pages 816 - 823, XP009522483, ISSN: 1004-731X *

Also Published As

Publication number Publication date
CN110069191B (zh) 2021-03-30
CN110069191A (zh) 2019-07-30

Similar Documents

Publication Publication Date Title
WO2020156058A1 (fr) Procédé et dispositif de réalisation de déformation d'image basée sur glissement sur un terminal
WO2020186935A1 (fr) Procédé et dispositif d'affichage d'objet virtuel, appareil électronique, et support de stockage lisible par ordinateur
WO2021139408A1 (fr) Procédé et appareil pour afficher un effet spécial, et support d'enregistrement et dispositif électronique
WO2021104365A1 (fr) Procédé de partage d'objets et dispositif électronique
TWI522894B (zh) 用於電子元件中的方法、電腦程式產品以及非暫時性電腦可讀記錄媒體
WO2021179882A1 (fr) Procédé et appareil de dessin d'image, support lisible et dispositif électronique
US20160004425A1 (en) Method of displaying graphic user interface and electronic device implementing same
WO2020220809A1 (fr) Procédé et dispositif de reconnaissance d'action pour objet cible, et appareil électronique
WO2021238679A1 (fr) Procédé et appareil de commande d'affichage d'interface d'appel vidéo, support de stockage et dispositif
WO2020200263A1 (fr) Procédé et dispositif de traitement d'image dans un flux d'informations, et support de stockage lisible par ordinateur
WO2022022689A1 (fr) Procédé et appareil d'interaction et dispositif électronique
US12019669B2 (en) Method, apparatus, device, readable storage medium and product for media content processing
WO2022242379A1 (fr) Procédé et dispositif de rendu à base de trait, support de stockage et terminal
WO2020125405A1 (fr) Procédé de commande pour appareil terminal et appareil terminal
CN108804067A (zh) 信息显示方法、设备和计算机可读介质
WO2021244651A1 (fr) Procédé et dispositif d'affichage d'informations, et terminal et support de stockage
WO2021244650A1 (fr) Procédé et dispositif de commande, terminal et support d'enregistrement
JP7007168B2 (ja) プログラム、情報処理方法、及び情報処理装置
WO2024051639A1 (fr) Procédé, appareil et dispositif de traitement d'image, support de stockage et produit
CN110069195B (zh) 图像拖拽变形方法和装置
WO2020207083A1 (fr) Procédé et appareil de partage d'informations, et dispositif électronique et support de stockage lisible par ordinateur
WO2023169287A1 (fr) Procédé et appareil de génération d'effet spécial de maquillage de beauté, dispositif, support d'enregistrement et produit de programme
CN110378282A (zh) 图像处理方法及装置
CN110070479B (zh) 图像变形拖拽点定位方法和装置
WO2020133386A1 (fr) Procédé de sélection partielle de notes, appareil, terminal électronique et support de stockage lisible

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20747651

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01.12.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 20747651

Country of ref document: EP

Kind code of ref document: A1