WO2018176958A1 - Method and system for adaptive mapping following the movement of key points in an image - Google Patents

Method and system for adaptive mapping following the movement of key points in an image

Info

Publication number
WO2018176958A1
WO2018176958A1 PCT/CN2017/120135
Authority
WO
WIPO (PCT)
Prior art keywords
coordinates
point
key
image
texture
Prior art date
Application number
PCT/CN2017/120135
Other languages
English (en)
Chinese (zh)
Inventor
李亮
陈少杰
张文明
Original Assignee
武汉斗鱼网络科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 武汉斗鱼网络科技有限公司
Publication of WO2018176958A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/20 Linear translation of whole images or parts thereof, e.g. panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/60 Rotation of whole images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • the present invention relates to the field of image processing and, more particularly, to an adaptive mapping method and system that moves with key points in an image.
  • Adding textures to video images is becoming more and more common. A texture effect adds a dynamic overlay to a specific part of the image.
  • For a human face, the specific part can be, for example, an eye, the nose, or the mouth.
  • The existing mapping method simply places the texture at a fixed position in the picture: when that specific position changes, the position of the texture changes with it, but the display scale changes without taking the specific position into account, so the shape of the texture is distorted to a certain extent, which greatly degrades the display effect of the texture.
  • the present invention provides a method and system for adaptive mapping of key points moving in an image that overcomes the above problems or at least partially solves the above problems.
  • a method of adaptive mapping that moves with key points in an image comprising:
  • an adaptive mapping system that moves with key points in an image, including:
  • the nearby-coordinate generating device obtains the coordinates within a certain range around the key point's new position, according to the moved coordinates of the key point and the scaling factor of the texture;
  • the texture-sampling-coordinate obtaining means obtains the texture sampling coordinates from the coordinates in that range, the coordinates of the key point, and the scaling factor of the texture, and displays the texture at the texture sampling coordinates.
  • an adaptive mapping apparatus comprising:
  • at least one processor, at least one memory, and a bus;
  • the processor and the memory communicate with each other through the bus;
  • the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the following methods:
  • a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to perform the following method:
  • a non-transitory computer readable storage medium storing computer instructions that cause the computer to perform the following method:
  • The present application proposes an adaptive mapping method and system in which the texture moves with key points in an image: the position of a key point on a key object in the image is located; according to the position change after the key point moves, a rotation matrix transformation is applied to any point on the image to obtain that point's transformed coordinates, ensuring that points on the image move synchronously with the key point; then, from the moved coordinates of the key point and the scaling factor of the texture, the coordinates within a certain range around the key point's new position are obtained; finally, combining these coordinates with the key point's coordinates and the texture's scaling factor yields the texture sampling coordinates at which the texture is displayed.
  • The present invention thus allows the texture to move adaptively when a key point on the image moves, without changing its original display ratio, which improves the user experience.
  • FIG. 1 is a flow chart of an adaptive mapping method for moving key points in an image according to an embodiment of the present invention
  • FIG. 2 is a flowchart of step S1 in the embodiment of the present invention.
  • FIG. 3 is a flowchart of step S2 in the embodiment of the present invention.
  • FIG. 4 is a block diagram of an adaptive mapping system that moves with key points in an image according to an embodiment of the present invention
  • FIG. 5 is a structural block diagram of a conversion module according to an embodiment of the present invention.
  • FIG. 6 is a functional block diagram of an adaptive mapping device in an embodiment of the present invention.
  • the present invention provides an adaptive mapping method in which the texture rotates with the human face.
  • FIG. 1 is a flowchart of an adaptive mapping method for face rotation according to an embodiment of the present invention. As shown in FIG. 1, the method includes:
  • As summarized above, the method locates the position of a key point on a key object in the image; applies a rotation matrix transformation to any point on the image according to the position change after the key point moves, obtaining the transformed coordinates of that point; then filters out, from the key point's coordinates and the texture's scaling factor, the coordinates within a certain range around the key point's new position; and finally combines the key point's coordinates with the texture's scaling factor to obtain the texture sampling coordinates at which the texture is displayed.
  • The present invention thus allows the texture to move adaptively when key points on the image move, without changing the original display ratio, thereby improving the user experience.
  • FIG. 2 is a flowchart of step S1 in the embodiment of the present invention, including:
  • In the embodiment of the present invention, the coordinate system of the image is referred to as the world coordinate system, and the coordinate system of the movable key object in the image is referred to as the object coordinate system.
  • Although the texture is located in the image's coordinate system, it always follows the motion of the object, i.e., it effectively lives in the object's coordinate system. Therefore, to ensure that the texture rotates to match the transformation between the two coordinate systems, a rotation matrix transformation must be applied to any point on the image, based on the aspect ratio of the image, the distance moved by the key point, and the rotation angle of the key object.
  • step S1.3 further includes:
  • The rotation center point is a key point. A key point, simply put, is the point with which the texture must stay aligned at all times; for example, for a texture on the human eye, the key point can be the center of the eye region (assume these are the coordinates of the pupil).
  • The rotation center point in this embodiment refers to the key point after the face has moved in the image; in the above example, the coordinates of the pupil after the face has moved.
  • the present invention is not limited to the mapping of the human eye, and the present invention is equally applicable to the mapping of other parts of the face, such as the nose, mouth or eyebrows, and will not be described herein.
  • The rotation angle of the face can be calculated from the positions of the key points on the face, for example as the angle between the bridge of the nose and the horizontal X-axis; the calculation formula is not detailed here. To ensure that the rotated coordinates match the rotation angle of the face without changing shape, this embodiment obtains the rotated coordinates by rotating the coordinates after the first translation according to the rotation angle of the face and the coordinates of the rotation center point.
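The patent does not give its angle formula. As an illustrative sketch only (the two-point nose-bridge approximation and all names here are assumptions, not the patent's method), the angle between the nose-bridge line and the horizontal X-axis could be estimated with `atan2`:

```python
import math

def face_rotation_angle(bridge_top, bridge_bottom):
    """Illustrative estimate of the face rotation angle: the angle, in
    radians, between the line through two assumed nose-bridge key points
    and the horizontal X-axis."""
    dx = bridge_bottom[0] - bridge_top[0]
    dy = bridge_bottom[1] - bridge_top[1]
    return math.atan2(dy, dx)
```

For an upright face the bridge is vertical, so this returns π/2; the deviation from π/2 gives the tilt of the face.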
  • step S1.3 further includes:
  • The coordinates (x, y) of the point are converted into homogeneous coordinates (x, y, 1), and the homogeneous coordinates are scaled by the aspect ratio R of the image to obtain the scaled coordinates (R*x, y, 1).
  • A homogeneous coordinate represents a vector that is originally n-dimensional with an (n+1)-dimensional vector.
  • The homogeneous coordinates of the two-dimensional point (x, y) are expressed as (hx, hy, h).
  • The homogeneous representation of a vector is not unique: different values of h give homogeneous coordinates representing the same point; for example, the homogeneous coordinates (8, 4, 2) and (4, 2, 1) both represent the two-dimensional point (4, 2).
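The equivalence noted above can be checked directly: a homogeneous triple (hx, hy, h) maps back to the 2D point (hx/h, hy/h).

```python
def to_cartesian(hx, hy, h):
    """Convert homogeneous coordinates (hx, hy, h) back to the 2D point."""
    return (hx / h, hy / h)

# (8, 4, 2) and (4, 2, 1) are the same 2D point:
assert to_cartesian(8, 4, 2) == to_cartesian(4, 2, 1) == (4.0, 2.0)
```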
  • The scaled coordinates are represented by matrix M1 as:
  • The coordinates of the original rotation center point are the origin (0, 0); the coordinates after translation are assumed to be (Dx, Dy), i.e. the point is translated Dx units along the x-axis and Dy units along the y-axis.
  • The coordinates after the first translation are represented by matrix M2 as:
  • The rotated coordinates are represented by matrix M3 as:
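The text names matrices M1 (scaling), M2 (first translation), and M3 (rotation), but their entries are not reproduced here. The following is a minimal sketch, assuming the standard 2D homogeneous-coordinate forms and a rotation pivoted at the moved center (Dx, Dy); it is not the patent's exact formulation:

```python
import numpy as np

def transform_point(x, y, R, Dx, Dy, theta):
    """Apply the scale -> translate -> rotate chain of steps S1.1-S1.3
    using standard 2D homogeneous-coordinate matrices.

    R        : aspect ratio of the image
    (Dx, Dy) : distance the rotation center (key point) has moved
    theta    : rotation angle of the key object, in radians
    """
    p = np.array([x, y, 1.0])                    # homogeneous coordinates

    # M1: scale the x-coordinate by the aspect ratio -> (R*x, y, 1)
    M1 = np.diag([R, 1.0, 1.0])

    # M2: first translation, by the key point's displacement (Dx, Dy)
    M2 = np.array([[1, 0, Dx],
                   [0, 1, Dy],
                   [0, 0, 1.0]])

    # M3: rotate by theta about the moved rotation center (Dx, Dy)
    c, s = np.cos(theta), np.sin(theta)
    to_origin = np.array([[1, 0, -Dx], [0, 1, -Dy], [0, 0, 1.0]])
    rot       = np.array([[c, -s, 0], [s,  c, 0], [0, 0, 1.0]])
    back      = np.array([[1, 0, Dx], [0, 1, Dy], [0, 0, 1.0]])
    M3 = back @ rot @ to_origin

    return M3 @ M2 @ M1 @ p
```

The full method then applies the second translation and second scaling (described later for the conversion module) to return the result to the original image scale.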
  • FIG. 3 is a schematic flowchart of step S2 in the embodiment of the present invention. As shown in FIG. 3, step S2 includes:
  • Define the coordinates after the rotation matrix transformation as (dst_x, dst_y) and the coordinates of the key point as (center_x, center_y). The transformed coordinates are taken as coordinates within the range near the key point's new position when they satisfy the conditions:
  • (center_x - resize_x*0.5) ≤ dst_x ≤ (center_x + resize_x*0.5) and (center_y - resize_y*0.5) ≤ dst_y ≤ (center_y + resize_y*0.5).
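The condition above amounts to testing membership in an axis-aligned rectangle of size resize_x by resize_y centered on the moved key point. A direct sketch (function and argument names are assumptions):

```python
def in_texture_range(dst_x, dst_y, center_x, center_y, resize_x, resize_y):
    """True if the rotated point (dst_x, dst_y) falls inside the texture's
    display rectangle centered on the key point (center_x, center_y)."""
    return (center_x - resize_x * 0.5 <= dst_x <= center_x + resize_x * 0.5
            and center_y - resize_y * 0.5 <= dst_y <= center_y + resize_y * 0.5)
```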
  • l represents the longitudinal length of the key object, such as the length of the face;
  • w represents the width of the texture
  • h represents the height of the texture
  • r represents the display coefficient
  • W represents the width of the image
  • H represents the height of the image.
  • the step of obtaining the texture sampling coordinates according to the coordinates within the range near the key point's new position, the coordinates of the key point, and the scaling factor of the texture further includes:
  • Let the key point coordinates be (center_x, center_y), the scaling factor of the texture on the x-axis be resize_x, the scaling factor of the texture on the y-axis be resize_y, and the texture sampling coordinates be (coord_x, coord_y); then:
  • coord_x = (near_x - center_x + resize_x*0.5)/resize_x;
  • coord_y = (near_y - center_y + resize_y*0.5)/resize_y.
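The two formulas normalize a nearby image coordinate into the texture's own [0, 1] sampling space. A direct transcription (function and argument names are assumptions):

```python
def texture_sampling_coords(near_x, near_y, center_x, center_y,
                            resize_x, resize_y):
    """Map a nearby image coordinate (near_x, near_y) into normalized
    texture sampling coordinates, per the formulas above."""
    coord_x = (near_x - center_x + resize_x * 0.5) / resize_x
    coord_y = (near_y - center_y + resize_y * 0.5) / resize_y
    return coord_x, coord_y
```

Points at the left/top edge of the display rectangle map to 0, the center to 0.5, and the right/bottom edge to 1, so the texture keeps its original display ratio wherever the key point has moved.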
  • FIG. 4 is a block diagram of an adaptive mapping system that moves with key points in an image according to an embodiment of the present invention, including:
  • the nearby-coordinate generating device obtains the coordinates within a certain range around the key point's new position, according to the moved coordinates of the key point and the scaling factor of the texture;
  • the texture-sampling-coordinate obtaining means obtains the texture sampling coordinates from the coordinates in that range, the coordinates of the key point, and the scaling factor of the texture, and displays the texture at the texture sampling coordinates.
  • A rotation matrix transformation is performed on any point of the image to obtain the transformed coordinates corresponding to that point, ensuring that points on the image move synchronously with the position of the key point; then, according to the coordinates of the key point and the scaling factor of the texture, the coordinates within a certain range around the key point's new position are filtered out; finally, combining the key point's coordinates with the texture's scaling factor yields the texture sampling coordinates at which the texture is displayed.
  • The present invention allows the texture to move adaptively when key points on the image move, without changing the original display ratio, thereby improving the user experience.
  • the coordinate conversion device comprises:
  • a positioning module for acquiring the aspect ratio of the image and locating the position of a key point on a key object in the image;
  • a conversion module configured to perform the rotation matrix conversion on any point on the image according to the aspect ratio of the image, the distance moved by the key point, and the rotation angle of the key object, to obtain the converted coordinates corresponding to that point.
  • The coordinate system of the image is called the world coordinate system, and the coordinate system of the movable key object in the image is called the object coordinate system.
  • Although the texture is located in the image's coordinate system, it always follows the motion of the object, i.e., it effectively lives in the object's coordinate system; therefore, to ensure that the texture rotates to match the conversion between the two coordinate systems, a rotation matrix transformation should be applied to any point on the image based on the aspect ratio of the image, the distance moved by the key point, and the rotation angle of the key object.
  • FIG. 5 is a structural block diagram of a conversion module in an embodiment of the present invention, including:
  • the first scaling unit is configured to scale, for any point on the image, the coordinates of that point according to the aspect ratio of the image, to obtain the scaled coordinates;
  • the first translation unit is configured to translate the coordinates of the rotation center point from the origin to the coordinates of the key point, and to translate the scaled coordinates according to the moving distance of the key point, to obtain the coordinates after the first translation.
  • The rotation center point is a key point. A key point, simply put, is the point with which the texture must stay aligned at all times; for example, for a texture on the human eye, the key point can be the center of the eye region (assume these are the coordinates of the pupil).
  • The rotation center point in this embodiment refers to the key point after the face has moved in the image; in the above example, the coordinates of the pupil after the face has moved.
  • The present invention is not limited to textures on the human eye; it applies equally to textures on other parts of the face, such as the nose, mouth, or eyebrows, which are not described in detail here.
  • the rotation unit is configured to rotate the coordinates after the first translation according to the rotation angle of the key object and the coordinates of the rotation center point, and obtain the rotated coordinates.
  • The rotation angle of the face can be calculated from the positions of the key points on the face, for example as the angle between the bridge of the nose and the horizontal X-axis; the calculation formula is not detailed here. To ensure that the rotated coordinates match the rotation angle of the face without changing shape, this embodiment obtains the rotated coordinates by rotating the coordinates after the first translation according to the rotation angle of the face and the coordinates of the rotation center point.
  • a second translation unit configured to translate the coordinates of the rotation center point from the coordinates of the key point back to the origin, and to translate the rotated coordinates by the corresponding distance, to obtain the coordinates after the second translation;
  • a second scaling unit configured to scale the coordinates after the second translation back to the original scale to obtain the converted coordinates.
  • the nearby coordinate generation device comprises:
  • a scaling-factor acquisition module for obtaining the scaling factor of the texture on the x-axis from the longitudinal length of the key object, the width of the texture, and the width of the image, and the scaling factor of the texture on the y-axis from the longitudinal length of the key object, the height of the texture, and the height of the image;
  • a nearby-coordinate acquisition module configured, for any one of the rotated coordinates, to define the rotated coordinate as (dst_x, dst_y) and the key point coordinates as (center_x, center_y); when the rotated coordinates satisfy (center_x - resize_x*0.5) ≤ dst_x ≤ (center_x + resize_x*0.5) and (center_y - resize_y*0.5) ≤ dst_y ≤ (center_y + resize_y*0.5), the rotated coordinates are taken as coordinates within the range near the key point's new position.
  • the texture sampling coordinate obtaining device is further configured to:
  • coord_x = (near_x - center_x + resize_x*0.5)/resize_x;
  • coord_y = (near_y - center_y + resize_y*0.5)/resize_y.
  • the present invention discloses an adaptive mapping device, as shown in FIG. 6, comprising:
  • The processor 601 and the memory 602 communicate with each other through the bus 603; the memory 602 stores program instructions executable by the processor 601, and the processor 601 invokes the program instructions to execute the methods provided by the foregoing method embodiments, for example:
  • the present invention discloses a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions that, when executed by a computer, cause the computer to perform the methods provided by the foregoing method embodiments, including, for example:
  • the present invention discloses a non-transitory computer-readable storage medium storing computer instructions that cause the computer to perform the methods provided by the foregoing method embodiments, including, for example:
  • the foregoing program may be stored in a computer-readable storage medium; when the program is executed, it performs the steps of the foregoing method embodiments. The foregoing storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The present invention relates to a method and system for adaptive mapping that follows the movement of key points in an image. The method comprises: locating the positions of key points on a key object in the image, and performing a rotation matrix transformation on any point on the image according to the change in position of the moved key point, to obtain the coordinates of the corresponding point after the rotation matrix transformation; obtaining the coordinates within a certain range around the moved key point's position, according to the coordinates of the moved key point and the scaling factor of a texture; and obtaining texture sampling coordinates from the coordinates within that range, the coordinates of the key point, and the scaling factor of the texture, then displaying the texture at the texture sampling coordinates. The present invention allows the texture to rotate adaptively when a human face rotates, without changing the original display scale, thereby improving the user experience.
PCT/CN2017/120135 2017-03-28 2017-12-29 Method and system for adaptive mapping following the movement of key points in an image WO2018176958A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710193044.4A CN107122774B (zh) 2017-03-28 2017-03-28 Adaptive mapping method and system following the movement of key points in an image
CN201710193044.4 2017-03-28

Publications (1)

Publication Number Publication Date
WO2018176958A1 true WO2018176958A1 (fr) 2018-10-04

Family

ID=59717365

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/120135 WO2018176958A1 (fr) 2017-12-29 Method and system for adaptive mapping following the movement of key points in an image

Country Status (2)

Country Link
CN (1) CN107122774B (fr)
WO (1) WO2018176958A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109614059A (zh) * 2018-12-19 2019-04-12 森大(深圳)技术有限公司 Gerber文件处理方法、装置及计算机可读存储介质
CN111126344A (zh) * 2019-12-31 2020-05-08 杭州趣维科技有限公司 一种生成人脸额头关键点的方法与系统
CN111652918A (zh) * 2020-06-04 2020-09-11 深圳地平线机器人科技有限公司 确定3d脸部模型的方法、装置、介质以及电子设备
CN113487717A (zh) * 2021-07-13 2021-10-08 网易(杭州)网络有限公司 图片处理方法及装置、计算机可读存储介质、电子设备
CN114119684A (zh) * 2020-08-31 2022-03-01 中国移动通信集团浙江有限公司 基于四面体结构的标记点配准方法
CN114429666A (zh) * 2022-04-06 2022-05-03 深圳市大头兄弟科技有限公司 视频的人脸替换方法、装置、设备及存储介质

Families Citing this family (15)

Publication number Priority date Publication date Assignee Title
CN107122774B (zh) * 2017-03-28 2019-12-03 武汉斗鱼网络科技有限公司 一种随图像中关键点移动的自适应贴图方法和系统
CN109509140A (zh) * 2017-09-15 2019-03-22 阿里巴巴集团控股有限公司 显示方法及装置
CN109587390B (zh) * 2017-09-29 2021-05-07 腾讯科技(深圳)有限公司 画面渲染方法、画面渲染装置及存储介质
CN107679497B (zh) * 2017-10-11 2023-06-27 山东新睿信息科技有限公司 视频面部贴图特效处理方法及生成系统
CN107808372B (zh) * 2017-11-02 2022-01-28 北京奇虎科技有限公司 图像穿越处理方法、装置、计算设备及计算机存储介质
CN108205822B (zh) * 2017-12-13 2020-09-08 中兴通讯股份有限公司 贴图方法及装置
CN108305212A (zh) * 2018-01-09 2018-07-20 武汉斗鱼网络科技有限公司 图像混合方法、存储介质、电子设备及系统
CN110020577B (zh) * 2018-01-10 2021-03-16 武汉斗鱼网络科技有限公司 人脸关键点扩展计算方法、存储介质、电子设备及系统
CN108921000B (zh) * 2018-04-16 2024-02-06 深圳市深网视界科技有限公司 头部角度标注、预测模型训练、预测方法、设备和介质
CN108846878A (zh) * 2018-06-07 2018-11-20 奇酷互联网络科技(深圳)有限公司 人脸贴图生成方法、装置、可读存储介质及移动终端
CN110378991A (zh) * 2018-11-07 2019-10-25 深圳格调网络运营有限公司 虚拟环境下模型与贴图的匹配方法
CN111383170B (zh) * 2018-12-28 2023-08-15 广州市百果园网络科技有限公司 图片关键点的调整方法、装置及终端
CN109819316B (zh) * 2018-12-28 2021-06-01 北京字节跳动网络技术有限公司 处理视频中人脸贴纸的方法、装置、存储介质及电子设备
CN112489157B (zh) * 2020-12-18 2024-03-19 广州视源电子科技股份有限公司 相框绘制方法、设备及存储介质
CN113687823B (zh) * 2021-07-30 2023-08-01 稿定(厦门)科技有限公司 基于html的四边形区块非线性变换方法及其系统

Citations (7)

Publication number Priority date Publication date Assignee Title
US20130083976A1 (en) * 2011-10-03 2013-04-04 Qualcomm Incorporated Image-based head position tracking method and system
CN103090875A (zh) * 2012-11-26 2013-05-08 华南理工大学 一种基于双摄像头的实时实景匹配车载导航方法及装置
CN104881526A (zh) * 2015-05-13 2015-09-02 深圳彼爱其视觉科技有限公司 一种基于3d的物品穿戴方法及眼镜试戴方法
CN104881114A (zh) * 2015-05-13 2015-09-02 深圳彼爱其视觉科技有限公司 一种基于3d眼镜试戴的角度转动实时匹配方法
CN104899917A (zh) * 2015-05-13 2015-09-09 深圳彼爱其视觉科技有限公司 一种基于3d的物品虚拟穿戴的图片保存和分享方法
CN104898832A (zh) * 2015-05-13 2015-09-09 深圳彼爱其视觉科技有限公司 一种基于智能终端的3d实时眼镜试戴方法
CN107122774A (zh) * 2017-03-28 2017-09-01 武汉斗鱼网络科技有限公司 一种随图像中关键点移动的自适应贴图方法和系统

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN101231658A (zh) * 2008-02-02 2008-07-30 谢亦玲 饰品计算机模拟配戴装置
CN105719351B (zh) * 2014-12-04 2018-09-28 高德软件有限公司 一种显示电子地图的方法和装置
CN105069745A (zh) * 2015-08-14 2015-11-18 济南中景电子科技有限公司 基于普通图像传感器及增强现实技术的带表情变脸系统及方法
CN105354876B (zh) * 2015-10-20 2018-10-09 何家颖 一种基于移动终端的实时立体试衣方法

Patent Citations (7)

Publication number Priority date Publication date Assignee Title
US20130083976A1 (en) * 2011-10-03 2013-04-04 Qualcomm Incorporated Image-based head position tracking method and system
CN103090875A (zh) * 2012-11-26 2013-05-08 华南理工大学 一种基于双摄像头的实时实景匹配车载导航方法及装置
CN104881526A (zh) * 2015-05-13 2015-09-02 深圳彼爱其视觉科技有限公司 一种基于3d的物品穿戴方法及眼镜试戴方法
CN104881114A (zh) * 2015-05-13 2015-09-02 深圳彼爱其视觉科技有限公司 一种基于3d眼镜试戴的角度转动实时匹配方法
CN104899917A (zh) * 2015-05-13 2015-09-09 深圳彼爱其视觉科技有限公司 一种基于3d的物品虚拟穿戴的图片保存和分享方法
CN104898832A (zh) * 2015-05-13 2015-09-09 深圳彼爱其视觉科技有限公司 一种基于智能终端的3d实时眼镜试戴方法
CN107122774A (zh) * 2017-03-28 2017-09-01 武汉斗鱼网络科技有限公司 一种随图像中关键点移动的自适应贴图方法和系统

Cited By (11)

Publication number Priority date Publication date Assignee Title
CN109614059A (zh) * 2018-12-19 2019-04-12 森大(深圳)技术有限公司 Gerber文件处理方法、装置及计算机可读存储介质
CN111126344A (zh) * 2019-12-31 2020-05-08 杭州趣维科技有限公司 一种生成人脸额头关键点的方法与系统
CN111126344B (zh) * 2019-12-31 2023-08-01 杭州趣维科技有限公司 一种生成人脸额头关键点的方法与系统
CN111652918A (zh) * 2020-06-04 2020-09-11 深圳地平线机器人科技有限公司 确定3d脸部模型的方法、装置、介质以及电子设备
CN111652918B (zh) * 2020-06-04 2023-08-18 深圳地平线机器人科技有限公司 确定3d脸部模型的方法、装置、介质以及电子设备
CN114119684A (zh) * 2020-08-31 2022-03-01 中国移动通信集团浙江有限公司 基于四面体结构的标记点配准方法
CN114119684B (zh) * 2020-08-31 2023-02-28 中国移动通信集团浙江有限公司 基于四面体结构的标记点配准方法
CN113487717A (zh) * 2021-07-13 2021-10-08 网易(杭州)网络有限公司 图片处理方法及装置、计算机可读存储介质、电子设备
CN113487717B (zh) * 2021-07-13 2024-02-23 网易(杭州)网络有限公司 图片处理方法及装置、计算机可读存储介质、电子设备
CN114429666A (zh) * 2022-04-06 2022-05-03 深圳市大头兄弟科技有限公司 视频的人脸替换方法、装置、设备及存储介质
CN114429666B (zh) * 2022-04-06 2022-07-01 深圳市大头兄弟科技有限公司 视频的人脸替换方法、装置、设备及存储介质

Also Published As

Publication number Publication date
CN107122774B (zh) 2019-12-03
CN107122774A (zh) 2017-09-01

Similar Documents

Publication Publication Date Title
WO2018176958A1 (fr) Method and system for adaptive mapping following the movement of key points in an image
US11645801B2 (en) Method for synthesizing figure of virtual object, electronic device, and storage medium
WO2022002032A1 (fr) Formation de modèle entraîné par image et génération d'image
CN107491174B (zh) 用于远程协助的方法、装置、系统及电子设备
US20180204052A1 (en) A method and apparatus for human face image processing
CN110517214B (zh) 用于生成图像的方法和装置
CN108769517A (zh) 一种基于增强现实进行远程辅助的方法与设备
WO2023019974A1 (fr) Procédé et appareil de correction destinés à une image de document, dispositif électronique et support de stockage
US20220358675A1 (en) Method for training model, method for processing video, device and storage medium
CN109754464B (zh) 用于生成信息的方法和装置
CN115147265B (zh) 虚拟形象生成方法、装置、电子设备和存储介质
CN115668300A (zh) 利用纹理解析的对象重建
WO2018113339A1 (fr) Procédé et dispositif de construction d'image de projection
JP2021523455A (ja) 顔に対する射影歪み補正
WO2022016996A1 (fr) Dispositif et procédé de traitement d'image, appareil électronique et support de stockage lisible par ordinateur
US9196096B2 (en) 3D-consistent 2D manipulation of images
US11170550B2 (en) Facial animation retargeting using an anatomical local model
CN112528707A (zh) 图像处理方法、装置、设备及存储介质
US10768692B2 (en) Systems and methods for capturing and rendering hand skeletons over documents images for telepresence
US20240046554A1 (en) Presenting virtual representation of real space using spatial transformation
CN112734628A (zh) 一种经三维转换后的跟踪点的投影位置计算方法及系统
CN108171802B (zh) 一种云端与终端结合实现的全景增强现实实现方法
US11663752B1 (en) Augmented reality processing device and method
CN112465692A (zh) 图像处理方法、装置、设备及存储介质
EP4350605A1 (fr) Procédé et appareil de traitement d'image, dispositif et support

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17902963

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17902963

Country of ref document: EP

Kind code of ref document: A1