CN111147842B - Wearable object-based matching degree determination method, device and equipment - Google Patents


Info

Publication number
CN111147842B
CN111147842B · Application CN201811309225.XA
Authority
CN
China
Prior art keywords
limb, preset, determining, point, dimensional model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811309225.XA
Other languages
Chinese (zh)
Other versions
CN111147842A (en)
Inventor
付佳
彭碧
李诗卉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN201811309225.XA priority Critical patent/CN111147842B/en
Publication of CN111147842A publication Critical patent/CN111147842A/en
Application granted granted Critical
Publication of CN111147842B publication Critical patent/CN111147842B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers


Abstract

Embodiments of the invention provide a method, device, and equipment for determining a matching degree based on a wearable object, wherein the method includes: determining a first three-dimensional model of a first limb of a user; acquiring a second three-dimensional model of the wearable object corresponding to the first limb, the first limb being used to wear the wearable object; and determining the matching degree of the first limb and the wearable object according to the first three-dimensional model and the second three-dimensional model. This improves the sizing accuracy of wearable objects purchased through an internet platform.

Description

Wearable object-based matching degree determination method, device and equipment
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a method, a device and equipment for determining matching degree based on a wearable object.
Background
Currently, with the continuous development of internet technology, more and more users are buying wearable objects through internet platforms, for example, the wearable objects may include shoes, hats, gloves, clothing, etc.
In practical applications, since wearable objects often come in different sizes, different users need to purchase different sizes in order for the purchased wearable object to fit them. In the prior art, when a merchant sells a wearable object on an internet platform, the size of the wearable object is generally labeled; accordingly, a user selects a size according to the size labeled on the internet platform and personal purchase experience. However, the actual sizes of wearable objects with the same labeled size often vary, so the wearable object purchased by the user through the internet platform may not actually fit, resulting in poor sizing accuracy for wearable objects purchased through the internet platform.
Disclosure of Invention
The embodiment of the invention provides a method, a device and equipment for determining the matching degree based on a wearable object, which improve the accuracy of the wearable object purchased through an Internet platform.
In a first aspect, an embodiment of the present invention provides a method for determining a matching degree based on a wearable object, including:
determining a first three-dimensional model of a first limb of the user;
acquiring a second three-dimensional model of the wearable object corresponding to the first limb, wherein the first limb is used for wearing the wearable object;
and determining the matching degree of the first limb and the wearable object according to the first three-dimensional model and the second three-dimensional model.
In one possible embodiment, the determining the first three-dimensional model of the first limb of the user includes:
acquiring limb information of the first limb, wherein the limb information comprises a plurality of limb images of the first limb, and shooting distance and shooting angle of each limb image;
and determining a first three-dimensional model of the first limb according to the limb information and the standard three-dimensional model corresponding to the first limb.
In another possible implementation manner, the determining the first three-dimensional model of the first limb according to the limb information and the standard three-dimensional model corresponding to the first limb includes:
According to the limb information, determining characteristic parameters of each preset limb point in the first limb, wherein the characteristic parameters of one preset limb point comprise three-dimensional coordinates of the preset limb point and normal vectors of a limb surface where the preset limb point is located, and the first limb comprises a plurality of preset limb points;
and modifying model parameters of the standard three-dimensional model according to characteristic parameters of each preset limb point in the first limb to obtain the first three-dimensional model.
In another possible embodiment, for any first preset limb point of the plurality of preset limb points, determining, according to the limb information, a characteristic parameter of the first preset limb point includes:
acquiring a plurality of first limb images corresponding to the first preset limb points and shooting distances and shooting angles of each first limb image from the limb information;
determining three-dimensional coordinates of the first preset limb points according to the plurality of first limb images and the shooting distance and shooting angle of each first limb image;
and determining a normal vector of a limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images and the shooting distance and the shooting angle of each first limb image.
In another possible implementation manner, the determining the three-dimensional coordinates of the first preset limb point according to the plurality of first limb images and the shooting distance and shooting angle of each first limb image includes:
determining parallax images of the first preset limb points according to the plurality of first limb images;
and determining the three-dimensional coordinates of the first preset limb point according to the parallax image and the shooting distance and the shooting angle of each first limb image.
In another possible implementation manner, the determining the normal vector of the limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images, and the shooting distance and shooting angle of each first limb image includes:
determining voxels corresponding to the first preset limb points according to the plurality of first limb images, wherein the voxels are three-dimensional volume elements corresponding to the first preset limb points;
determining three-dimensional coordinates of each vertex of the voxel according to the shooting distance and the shooting angle of each first limb image;
determining an isosurface corresponding to the first preset limb point in the voxel according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point;
Obtaining a normal vector of the isosurface;
and determining the normal vector of the limb surface where the first preset limb point is located as the normal vector of the isosurface.
In another possible implementation manner, the obtaining the normal vector of the isosurface includes:
determining the coordinates of each vertex of the isosurface according to the vertex coordinates of the edge where each vertex of the isosurface is located, wherein the edge is the edge of the voxel;
obtaining a unit normal vector of each vertex of the isosurface;
and determining the normal vector of the isosurface according to the unit normal vector of each vertex of the isosurface.
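The last step above — deriving one normal vector for the isosurface from the unit normal vectors of its vertices — can be sketched as a simple average-and-renormalize operation. The patent does not give a formula, so this is an illustrative reading; the function name and sample data are hypothetical:

```python
import numpy as np

def isosurface_normal(vertex_unit_normals):
    """Average the unit normals of the isosurface's vertices and
    re-normalize, giving a single normal vector for the isosurface patch."""
    n = np.asarray(vertex_unit_normals, dtype=float).mean(axis=0)
    return n / np.linalg.norm(n)

# Three vertex normals tilted symmetrically around +z average back to +z.
normals = [[0.6, 0.0, 0.8], [-0.6, 0.0, 0.8], [0.0, 0.0, 1.0]]
print(isosurface_normal(normals))  # [0. 0. 1.]
```

A weighted average (e.g. by the area of the triangles incident to each vertex) would be an equally plausible variant; the plain mean is the minimal interpretation.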
In another possible embodiment, determining the degree of matching of the first limb and the wearable object from the first three-dimensional model and the second three-dimensional model includes:
determining at least one first preset matching area in the first three-dimensional model;
determining at least one second preset matching area in the second three-dimensional model;
and matching each first preset matching area with a corresponding second preset matching area to obtain the matching degree of each first preset matching area and the corresponding second preset matching area.
In a second aspect, an embodiment of the present invention provides a wearable object-based matching degree determining apparatus, including a first determining module, an acquiring module, and a second determining module, where,
the first determining module is used for determining a first three-dimensional model of a first limb of a user;
the acquisition module is used for acquiring a second three-dimensional model of the wearable object corresponding to the first limb, and the first limb is used for wearing the wearable object;
the second determining module is used for determining the matching degree of the first limb and the wearable object according to the first three-dimensional model and the second three-dimensional model.
In another possible implementation manner, the first determining module is specifically configured to:
acquiring limb information of the first limb, wherein the limb information comprises a plurality of limb images of the first limb, and shooting distance and shooting angle of each limb image;
and determining a first three-dimensional model of the first limb according to the limb information and the standard three-dimensional model corresponding to the first limb.
In another possible implementation manner, the first determining module is specifically configured to:
according to the limb information, determining characteristic parameters of each preset limb point in the first limb, wherein the characteristic parameters of one preset limb point comprise three-dimensional coordinates of the preset limb point and normal vectors of a limb surface where the preset limb point is located, and the first limb comprises a plurality of preset limb points;
And modifying model parameters of the standard three-dimensional model according to characteristic parameters of each preset limb point in the first limb to obtain the first three-dimensional model.
In another possible implementation manner, for any first preset limb point of the plurality of preset limb points, the first determining module is specifically configured to:
acquiring a plurality of first limb images corresponding to the first preset limb points and shooting distances and shooting angles of each first limb image from the limb information;
determining three-dimensional coordinates of the first preset limb points according to the plurality of first limb images and the shooting distance and shooting angle of each first limb image;
and determining a normal vector of a limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images and the shooting distance and the shooting angle of each first limb image.
In another possible implementation manner, the first determining module is specifically configured to:
determining parallax images of the first preset limb points according to the plurality of first limb images;
and determining the three-dimensional coordinates of the first preset limb point according to the parallax image and the shooting distance and the shooting angle of each first limb image.
In another possible implementation manner, the first determining module is specifically configured to:
determining voxels corresponding to the first preset limb points according to the plurality of first limb images, wherein the voxels are three-dimensional volume elements corresponding to the first preset limb points;
determining three-dimensional coordinates of each vertex of the voxel according to the shooting distance and the shooting angle of each first limb image;
determining an isosurface corresponding to the first preset limb point in the voxel according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point;
obtaining a normal vector of the isosurface;
and determining the normal vector of the limb surface where the first preset limb point is located as the normal vector of the isosurface.
In another possible implementation manner, the first determining module is specifically configured to:
determining the coordinates of each vertex of the isosurface according to the vertex coordinates of the edge where each vertex of the isosurface is located, wherein the edge is the edge of the voxel;
obtaining a unit normal vector of each vertex of the isosurface;
and determining the normal vector of the isosurface according to the unit normal vector of each vertex of the isosurface.
In another possible implementation manner, the second determining module is specifically configured to:
determining at least one first preset matching area in the first three-dimensional model;
determining at least one second preset matching area in the second three-dimensional model;
and matching each first preset matching area with a corresponding second preset matching area to obtain the matching degree of each first preset matching area and the corresponding second preset matching area.
In a third aspect, an embodiment of the present invention provides a wearable object-based matching degree determining apparatus, including: a processor coupled to the memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory, so that the terminal device executes the wearable object-based matching degree determining method according to any one of the above first aspects.
In a fourth aspect, an embodiment of the present invention provides a readable storage medium, including a program or instructions, which when executed on a computer, performs a method for determining matching degree based on a wearable object according to any one of the first aspect.
According to the method, the device and the equipment for determining the matching degree based on the wearable object, when a user purchases the wearable object of the first limb on the Internet platform, the terminal equipment (or the server) can acquire the first three-dimensional model of the first limb and the second three-dimensional model of the wearable object, and the matching degree of the first limb and the wearable object is determined according to the first three-dimensional model and the second three-dimensional model. Because the size of the first three-dimensional model is the same as the size of the first limb, and the size of the second three-dimensional model is the same as the size of the wearable object, the matching degree of the first three-dimensional model and the second three-dimensional model can truly reflect the matching degree of the size of the first limb and the wearable object, so that a user can accurately determine whether the size of the wearable object is suitable for the user according to the matching degree, and the accuracy of the wearable object purchased by the user through the Internet platform is improved.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present invention or of the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of a wearable object-based matching degree determining method according to an embodiment of the present invention;
fig. 2 is a flow chart of a method for determining matching degree based on a wearable object according to an embodiment of the present invention;
FIG. 3 is a flowchart of a method for determining a first three-dimensional model according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a matching degree determination interface according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a wearable object-based matching degree determining device according to an embodiment of the present invention;
fig. 6 is a schematic hardware structure diagram of a wearable object-based matching degree determining device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a schematic diagram of a wearable object-based matching degree determining method according to an embodiment of the present invention. Referring to fig. 1, when a user purchases a wearable object through an internet platform, the user may first scan a limb through a terminal device to obtain limb information of the limb. The terminal device may determine a limb model from the limb information and match the limb model with a model of the wearable object to be purchased to determine whether the size of the wearable object to be purchased is suitable for the user. Or, the terminal device may also send the limb information to the server, and the server determines a limb model according to the limb information, and matches the limb model with the model of the wearable object to be purchased to determine whether the size of the wearable object to be purchased is suitable for the user.
In the application, when the user purchases the wearable object on the internet platform, the terminal equipment (or the server) can determine the limb model of the user according to the limb information of the user, and determine the matching degree of the limb and the wearable object according to the limb model and the model of the wearable object, wherein the matching degree reflects the size similarity degree of the limb of the user and the wearable object, so that the user can accurately determine whether the size of the wearable object is suitable for the user according to the matching degree, and the accuracy of the wearable object purchased by the user through the internet platform is improved.
The technical scheme shown in the application is described in detail through specific embodiments. It should be noted that the following embodiments may be combined with each other, and for the same or similar matters, the description will not be repeated in different embodiments.
Fig. 2 is a flow chart of a method for determining matching degree based on a wearable object according to an embodiment of the present invention. Referring to fig. 2, the method may include:
s201, determining a first three-dimensional model of a first limb of a user.
Optionally, the execution body of the embodiment of the present invention may be a terminal device or a server, and may also be a wearable object recommendation device provided in the terminal device or the server. The wearable object recommendation device may be implemented by software, or may be implemented by a combination of software and hardware.
Optionally, the terminal device may be a mobile phone, a computer, or other devices.
Alternatively, the server may be a server corresponding to an internet platform selling the wearable object.
Alternatively, the first limb may be any body part of the user.
For example, the first limb may be a user's hand, foot, head, or the like.
Optionally, the contour of the first three-dimensional model is the same as the contour of the first limb, and the size of the first three-dimensional model is the same as the size of the first limb.
S202, acquiring a second three-dimensional model of the wearable object corresponding to the first limb.
Alternatively, a second three-dimensional model of the wearable object may be generated before the vendor sells the wearable object, or after the manufacturer produces the wearable object.
Alternatively, if the wearable object has a fixed morphology, the wearable object may be scanned to determine a second three-dimensional model of the wearable object.
For example, a wearable object having a fixed morphology may include shoes, hats, and the like.
Alternatively, since the fabrication material of the wearable object has a certain thickness, after the second three-dimensional model is generated by scanning the wearable object, the size of the second three-dimensional model may be corrected according to the thickness of the fabrication material of the wearable object.
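This thickness correction can be sketched as offsetting the scanned (outer) surface inward along its normals to approximate the inner surface that actually contacts the limb. The patent does not specify the correction method, so this is only an assumption; it further assumes the second model is a mesh with outward per-vertex unit-length(ish) normals, and all names are illustrative:

```python
import numpy as np

def correct_for_thickness(vertices, normals, thickness):
    """Offset each mesh vertex inward along its normal by the material
    thickness, approximating the wearable object's inner surface."""
    unit = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    return vertices - thickness * unit

# Example: one vertex on the +x side of the scanned shoe, normal pointing out;
# a 0.2-unit material thickness moves it 0.2 units inward.
inner = correct_for_thickness(np.array([[10.0, 0.0, 0.0]]),
                              np.array([[1.0, 0.0, 0.0]]),
                              thickness=0.2)
print(inner)  # [[9.8 0.  0. ]]
```

A per-region thickness (sole vs. upper, say) could be passed as an array instead of a scalar without changing the structure of the sketch.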
Optionally, if the wearable object does not have a fixed shape, the wearable object may be worn on a limb model that matches the size of the wearable object, and the limb model may be scanned to obtain a second three-dimensional model corresponding to the wearable object.
For example, a wearable object that does not have a fixed form may include a sweater, pants, and the like.
S203, determining the matching degree of the first limb and the wearable object according to the first three-dimensional model and the second three-dimensional model.
Alternatively, the degree of matching of the first limb and the wearable object may be determined by the following possible implementations: at least one first preset matching area is determined in the first three-dimensional model, at least one second preset matching area is determined in the second three-dimensional model, and each first preset matching area is matched with the corresponding second preset matching area, so that the matching degree of each first preset matching area and the corresponding second preset matching area is obtained.
Optionally, a portion with a high requirement for comfort of the wearable object may be determined in the first limb, and an area corresponding to the portion in the first three-dimensional model is determined as a first preset matching area.
For example, assuming the first limb is a foot and the toe and heel in the foot are in high demand for comfort in the shoe, the corresponding areas of the toe and heel in the first three-dimensional model of the foot may be determined as the first preset matching area.
Optionally, the second preset matching area and the first preset matching area have a preset corresponding relationship.
For example, assuming the first preset matching region is the toe and heel in the first three-dimensional model, the second preset matching region is the toe and heel in the second three-dimensional model.
For example, assuming that the first limb is a foot and the wearable object is a shoe, the degree of matching between the toe in the first three-dimensional model and the toe in the second three-dimensional model, and the degree of matching between the heel in the first three-dimensional model and the heel in the second three-dimensional model may be obtained.
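The patent does not define the matching metric itself. One plausible sketch, under the assumption that each preset matching area is represented as a point cloud in a common coordinate frame (all names, the clearance metric, and the tolerance value are hypothetical):

```python
import numpy as np

def region_matching_degree(limb_pts, wear_pts, tolerance=5.0):
    """Hypothetical matching metric: for each limb point in the region,
    find the nearest point on the wearable object's inner surface and
    score how small the mean clearance is relative to `tolerance`
    (same length unit as the models, e.g. millimetres)."""
    # Pairwise distances from every limb point to every wearable point.
    d = np.linalg.norm(limb_pts[:, None, :] - wear_pts[None, :, :], axis=2)
    clearance = d.min(axis=1)
    # Map mean clearance into [0, 1]: 1.0 = surfaces coincide,
    # 0.0 = off by `tolerance` or more.
    return float(np.clip(1.0 - clearance.mean() / tolerance, 0.0, 1.0))

# Toe region of the foot model vs. toe region of the shoe model (toy data,
# uniform 1-unit gap between the two surfaces).
limb = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
wear = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0]])
print(region_matching_degree(limb, wear))  # 0.8
```

For real models a k-d tree (e.g. `scipy.spatial.cKDTree`) would replace the brute-force distance matrix, and a signed clearance could additionally penalize interpenetration (limb larger than the wearable object).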
According to the wearable object-based matching degree determining method provided by the embodiment of the invention, when a user purchases a wearable object of a first limb on an Internet platform, a terminal device (or a server) can acquire a first three-dimensional model of the first limb and a second three-dimensional model of the wearable object, and the matching degree of the first limb and the wearable object is determined according to the first three-dimensional model and the second three-dimensional model. Because the size of the first three-dimensional model is the same as the size of the first limb, and the size of the second three-dimensional model is the same as the size of the wearable object, the matching degree of the first three-dimensional model and the second three-dimensional model can truly reflect the matching degree of the size of the first limb and the wearable object, so that a user can accurately determine whether the size of the wearable object is suitable for the user according to the matching degree, and the accuracy of the wearable object purchased by the user through the Internet platform is improved.
On the basis of any of the above embodiments, optionally, the first three-dimensional model of the first limb may be determined by the following possible implementation, specifically, please refer to the embodiment shown in fig. 3.
Fig. 3 is a flowchart of a method for determining a first three-dimensional model according to an embodiment of the present invention. Referring to fig. 3, the method may include:
s301, acquiring limb information of a first limb.
The limb information comprises a plurality of limb images of the first limb and shooting distance and shooting angle of each limb image.
Optionally, the user may perform multi-angle shooting on the first limb through an image capturing device in the terminal device, so as to obtain multiple limb images of the first limb.
Optionally, a gyroscope is further arranged in the terminal device, and when the terminal device collects the limb images through the camera device, shooting distances and shooting angles corresponding to each limb image can be obtained through the gyroscope.
Alternatively, the plurality of limb images and the shooting distance and shooting angle of each limb image can be acquired in the following possible implementation manner: the terminal device includes a camera device and a gyroscope. Initially, the camera device is started and a shooting frame is displayed in the terminal device, and the user adjusts the distance and angle between the camera device and the first limb so that the first limb lies within the shooting frame. At this moment, the distance between the camera device and the first limb is a preset distance (the initial shooting distance) and the angle between them is a preset angle (the initial shooting angle). As the user then moves the terminal device so that the camera device captures the first limb from multiple angles, the shooting distance and shooting angle of each limb image are determined from the gyroscope's changes on the X, Y, and Z axes together with the preset distance and preset angle.
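The pose-tracking idea above can be sketched as accumulating per-frame changes onto the preset baseline. Note this is a simplification: a gyroscope alone reports angular rate, so in practice the distance deltas would come from additional motion sensing (accelerometer integration or visual tracking); the function and its inputs are illustrative only:

```python
def shooting_pose_per_image(preset_distance, preset_angle, per_frame_deltas):
    """Derive the shooting distance and angle of each captured limb image
    by accumulating per-frame (distance, angle) changes reported by the
    device's motion sensors onto the preset initial values established
    by the shooting frame."""
    poses = []
    distance, angle = preset_distance, preset_angle
    for d_dist, d_angle in per_frame_deltas:
        distance += d_dist
        angle += d_angle
        poses.append((distance, angle))
    return poses

# Preset: 30 cm, 0 degrees; three frames of sensed motion while the user
# sweeps the phone around the limb.
poses = shooting_pose_per_image(30.0, 0.0,
                                [(0.0, 15.0), (-2.0, 15.0), (0.0, 30.0)])
print(poses)  # [(30.0, 15.0), (28.0, 30.0), (28.0, 60.0)]
```

A full implementation would track a 3-axis orientation (e.g. a quaternion) rather than a single scalar angle, but the accumulation structure is the same.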
It should be noted that the limb information may also include other information, which is not particularly limited in the embodiment of the present invention.
S302, determining characteristic parameters of each preset limb point in the first limb according to limb information.
The characteristic parameters of one preset limb point comprise three-dimensional coordinates of the preset limb point and normal vectors of the limb surface where the preset limb point is located, and the first limb comprises a plurality of preset limb points.
Optionally, the preset limb point is a limb point with obvious characteristics in the first limb or a limb point with high requirements on comfort of the wearable object.
For example, assuming the first limb is a foot, the preset limb points may include a toe midpoint, a heel midpoint, and the like.
Of course, in the actual application process, the preset limb point in the first limb may be set according to actual needs, which is not particularly limited in the embodiment of the present invention.
It should be noted that the first limb includes a plurality of preset limb points and the process of acquiring the characteristic parameters is the same for each of them; the following description therefore takes the first preset limb point as an example:
alternatively, the three-dimensional coordinates of the first preset limb point may be obtained by the following possible implementation manners: and acquiring a plurality of first limb images corresponding to the first preset limb points and shooting distances and shooting angles of each first limb image in the limb information, and determining three-dimensional coordinates of the first preset limb points according to the plurality of first limb images and the shooting distances and shooting angles of each first limb image. Alternatively, the parallax image of the first preset limb point may be determined according to the plurality of first limb images, and the three-dimensional coordinates of the first preset limb point may be determined according to the parallax image and the photographing distance and photographing angle of each of the first limb images.
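The disparity-to-coordinates step can be illustrated with the standard rectified-stereo back-projection formulas. The patent does not state which formulas it uses, so this is an assumption; the camera parameters (`f`, `baseline`, principal point) and the sample pixel are hypothetical:

```python
def disparity_to_point(u, v, disparity, f, baseline, cx, cy):
    """Standard rectified-stereo back-projection: convert a pixel (u, v)
    with disparity d into camera-frame 3-D coordinates using
    Z = f * B / d,  X = (u - cx) * Z / f,  Y = (v - cy) * Z / f."""
    z = f * baseline / disparity
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return (x, y, z)

# f = 500 px, baseline = 0.1 m, principal point (320, 240):
# a toe-midpoint pixel at (420, 240) with disparity 50 px lands
# 1 m in front of the camera, 0.2 m to the right.
pt = disparity_to_point(420, 240, 50, f=500, baseline=0.1, cx=320, cy=240)
print(pt)  # (0.2, 0.0, 1.0)
```

Here the two "first limb images" play the role of a stereo pair; in the patent's setup the baseline between shots would itself be derived from the per-image shooting distances and angles.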
Optionally, the first limb image includes an image of a first preset limb point.
Optionally, image recognition may be performed on each image included in the limb information, and if a first preset limb point is obtained by recognition in one limb image, the limb image is determined to be the first limb image.
Optionally, the shooting angle of the first preset limb point differs across the first limb images.
It should be noted that the parallax (disparity) image of the first preset limb point may be determined from the plurality of first limb images using existing stereo-matching techniques, which are not described in detail in the embodiment of the present invention.
Alternatively, the three-dimensional coordinates of the first preset limb point may be coordinates of the first preset limb point with respect to the preset origin.
For example, the preset origin may be a point where the imaging device is located when the terminal device initially shoots, or the preset origin may be a point preset in the first limb.
Of course, in the actual application process, the preset origin may be set according to actual needs, which is not specifically limited in the embodiment of the present invention.
Alternatively, the normal vector of the limb surface where the first preset limb point is located may be obtained in the following possible implementation manner: determining, according to the plurality of first limb images, the voxel corresponding to the first preset limb point; determining the three-dimensional coordinates of each vertex of the voxel according to the shooting distance and shooting angle of each first limb image; determining, according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point, the iso-surface corresponding to the first preset limb point in the voxel; obtaining the normal vector of the iso-surface; and determining the normal vector of the limb surface where the first preset limb point is located to be the normal vector of the iso-surface.
It should be noted that, the voxel is a three-dimensional volume element corresponding to the first preset limb point.
For example, voxels may be cuboid elements, cube elements, or the like.
It should be noted that the iso-surface indicates a part of the surface of the first limb. The iso-surface corresponding to the first preset limb point may be the iso-surface on which the first preset limb point is located.
Alternatively, the state of each vertex in the voxel may be determined according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point, where the state of each vertex is 1 or 0.
Alternatively, when the state of the vertex is 1, it is indicated that the vertex is located outside the iso-surface, and when the state of the vertex is 0, it is indicated that the vertex is located inside the iso-surface. Alternatively, when the state of the vertex is 1, it is indicated that the vertex is located inside the iso-surface, and when the state of the vertex is 0, it is indicated that the vertex is located outside the iso-surface.
Alternatively, the state of each vertex of a voxel may be determined by the following program code:
(The program code is shown as an image in the original publication and is not reproduced here.)
where s[jj] is one coordinate component of the jj-th vertex, for example its coordinate on the X-axis, the Y-axis, or the Z-axis; isoCurrentValue is a preset iso-value, namely a preset coordinate on the X-axis, the Y-axis, or the Z-axis; and currentIndex records the state of the vertex.
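Since the listing itself appears only as an image in the source, the following is a hedged reconstruction of the described vertex-state loop; the names (`s`, `jj`, `iso_value`, `current_index`) mirror the surrounding description, and the packing of the eight vertex states into the bits of one index is the usual marching-cubes convention, assumed here rather than stated in the text.

```python
# Hedged reconstruction of the vertex-state loop described above; the original
# listing was an image and did not survive extraction. The bit-packing of the
# eight vertex states into current_index is an assumed marching-cubes convention.

def classify_vertices(s, iso_value):
    """Return an 8-bit index whose jj-th bit is 1 when vertex jj's chosen
    coordinate component s[jj] exceeds the iso-value (state 1), else 0."""
    current_index = 0
    for jj in range(8):           # a cuboid voxel has eight vertices
        if s[jj] > iso_value:     # state 1: on one side of the iso-surface
            current_index |= 1 << jj
    return current_index
```

With coordinate components 0 through 7 and an iso-value of 3.5, vertices 4 to 7 are in state 1 and the packed index is 240 (binary 11110000).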
Optionally, after the state of each vertex is obtained, if the states of the two vertices of an edge of the voxel differ, the iso-surface intersects that edge, and the iso-surface can be determined from its intersections with the edges.
Alternatively, the normal vector of the iso-surface may be determined in the following possible implementation manner: the coordinates of each vertex of the iso-surface are determined from the coordinates of the endpoints of the edge on which that vertex lies, the edge being an edge of the voxel; the unit normal vector of each vertex of the iso-surface is obtained; and the normal vector of the iso-surface is determined from the unit normal vectors of the vertices of the iso-surface.
Alternatively, one vertex coordinate of the iso-surface may be determined by the following formula one:
P = M1 + (isoCurrentValue - S1)(M2 - M1) / (S2 - S1) (formula one);
wherein P is the coordinate of a vertex of the iso-surface, M1 and M2 are the coordinates of the two endpoints of the voxel edge on which that vertex of the iso-surface lies, S1 and S2 are the scalar data of the two endpoints, and isoCurrentValue is the iso-value.
Alternatively, the normal vector of the iso-surface may be determined from the unit normal vector of each vertex of the iso-surface by the following formula two:
X = X1 + (isoCurrentValue - S1)(X2 - X1) / (S2 - S1) (formula two);
wherein X is the normal vector of the iso-surface, X1 and X2 are the unit normal vectors of two vertices of the iso-surface, S1 and S2 are the scalar data of those two vertices, and isoCurrentValue is the iso-value.
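Formula one and formula two are the same linear interpolation, applied to endpoint coordinates and to unit normal vectors respectively. A minimal sketch, assuming the standard linear-interpolation denominator (S2 - S1):

```python
def lerp_on_edge(m1, m2, s1, s2, iso_value):
    """Formulas one/two: interpolate along a voxel edge to the point (or unit
    normal) where the scalar field takes the iso-value. m1/m2 are 3-tuples
    (coordinates or unit normals) at the endpoints with scalar data s1/s2."""
    t = (iso_value - s1) / (s2 - s1)  # fraction of the way from endpoint 1 to 2
    return tuple(a + t * (b - a) for a, b in zip(m1, m2))
```

For endpoints at (0, 0, 0) and (2, 0, 0) with scalar data 0 and 1, the iso-value 0.5 falls at the edge midpoint (1, 0, 0).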
S303, modifying model parameters of the standard three-dimensional model according to characteristic parameters of each preset limb point in the first limb to obtain the first three-dimensional model.
Optionally, the model parameters of the standard three-dimensional model include a plurality of feature parameters, and the corresponding feature parameters in the model parameters can be modified into feature parameters of preset limb points, so as to obtain the first three-dimensional model.
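A minimal sketch of this parameter-modification step, assuming, purely for illustration, that model parameters are stored as a mapping from each preset limb point to its characteristic parameters; the embodiment does not fix a storage format:

```python
# Illustrative sketch of step S303: overwrite the standard model's per-point
# parameters with the measured characteristic parameters. The dictionary layout
# is an assumption, not a format specified by the patent.

def personalize_model(standard_model, measured):
    """measured maps a preset limb point name to its characteristic parameters,
    e.g. a (three-dimensional coordinate, surface normal vector) pair."""
    model = dict(standard_model)        # leave the standard model unmodified
    for point_name, params in measured.items():
        if point_name in model:
            model[point_name] = params  # replace the corresponding parameters
    return model
```

The copy keeps the standard three-dimensional model reusable for the next user while the returned mapping carries the personalized first three-dimensional model's parameters.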
In the embodiment shown in fig. 3, the user only needs to scan (shoot at multiple angles) the first limb for the terminal device to acquire the limb information of the first limb. The characteristic parameters of each preset limb point in the first limb are then determined according to the limb information, and the corresponding characteristic parameters in the preset standard three-dimensional model are modified into the characteristic parameters of each preset limb point to obtain the first three-dimensional model of the first limb. The operation of the user is simple, and the first three-dimensional model is generated efficiently.
The technical solution shown in the foregoing method embodiment will be described in detail below by way of a specific example with reference to fig. 4.
Fig. 4 is a schematic diagram of a matching degree determination interface according to an embodiment of the present invention. Referring to fig. 4, an interface 401-403 is included.
Referring to the interface 401, in the process of purchasing shoes through the internet platform, the interface 401 displays the shoes selected by the user and further includes a try-on icon, so that the user can click the "try-on" icon to virtually try on the shoes. After the user clicks the "try-on" icon, the terminal device displays the interface 402.
Referring to the interface 402, the terminal device invokes the camera device and performs image acquisition through the camera device. The interface 402 includes a right foot image display area A and prompts the user to adjust the distance and angle between the camera device and the right foot, so that the right foot image is exactly located in the right foot display area A. After adjusting the distance and angle, the user can click the "shoot" icon to shoot the right foot at multiple angles; once the "shoot" icon is clicked, the right foot display area A is no longer displayed. After the user shoots the right foot at multiple angles, the terminal device may generate a three-dimensional model of the right foot according to the method shown in the embodiments of fig. 2-3 and display it; specifically, please refer to the interface 403.
Referring to the interface 403, the terminal device displays the generated three-dimensional model of the right foot, and may also obtain and display the three-dimensional model of the shoe. The terminal device matches the three-dimensional model of the right foot against the three-dimensional model of the shoe to generate a try-on result. Assuming that the length of the shoe matches the length of the right foot, but the toe is wider than the heel, the try-on result may be as shown in interface 403.
In the embodiment shown in fig. 4, the foot model has the same size as the user's real foot, and the shoe model has the same size as the real shoe, so the matching degree of the foot model and the shoe model truly reflects how well the user's foot fits the size of the shoe. The user can therefore accurately determine, according to the matching degree, whether the size of the shoe to be purchased fits the foot, and can purchase a shoe of the proper size through the internet platform. The operation is simple and convenient, which improves the user experience.
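As a hedged illustration of the region matching in this example, the sketch below scores each preset matching region (e.g. toe and heel) by a simple dimension ratio; the ratio metric and the per-region widths are assumptions for illustration only, since the embodiment leaves the exact matching computation open.

```python
# Illustrative sketch: per-region matching of a foot model against a shoe
# model. The ratio-based score is an assumption; the patent does not define
# the matching metric.

def region_match_degrees(foot_regions, shoe_regions):
    """Both arguments map a region name (e.g. "toe", "heel") to a width in
    millimetres; the score is 1.0 for a perfect fit and lower otherwise."""
    degrees = {}
    for name, foot_width in foot_regions.items():
        shoe_width = shoe_regions[name]
        # symmetric ratio so too-tight and too-loose both score below 1.0
        degrees[name] = min(shoe_width / foot_width, foot_width / shoe_width)
    return degrees
```

In the interface-403 scenario, a toe region slightly wider than the shoe's would score below 1.0 while a well-fitting heel scores 1.0.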
Fig. 5 is a schematic structural diagram of a wearable object-based matching degree determining device according to an embodiment of the present invention. Referring to fig. 5, the wearable object-based matching degree determining apparatus 10 may include a first determining module 11, an acquiring module 12, and a second determining module 13, wherein,
The first determining module 11 is configured to determine a first three-dimensional model of a first limb of a user;
the acquiring module 12 is configured to acquire a second three-dimensional model of a wearable object corresponding to the first limb, where the first limb is used to wear the wearable object;
the second determining module 13 is configured to determine a matching degree of the first limb and the wearable object according to the first three-dimensional model and the second three-dimensional model.
The wearable object-based matching degree determining device provided by the embodiment of the invention can execute the technical scheme shown in the embodiment of the method, and the implementation principle and the beneficial effects are similar, and are not repeated here.
In one possible embodiment, the first determining module 11 is specifically configured to:
acquiring limb information of the first limb, wherein the limb information comprises a plurality of limb images of the first limb, and shooting distance and shooting angle of each limb image;
and determining a first three-dimensional model of the first limb according to the limb information and the standard three-dimensional model corresponding to the first limb.
In another possible embodiment, the first determining module 11 is specifically configured to:
According to the limb information, determining characteristic parameters of each preset limb point in the first limb, wherein the characteristic parameters of one preset limb point comprise three-dimensional coordinates of the preset limb point and normal vectors of a limb surface where the preset limb point is located, and the first limb comprises a plurality of preset limb points;
and modifying model parameters of the standard three-dimensional model according to characteristic parameters of each preset limb point in the first limb to obtain the first three-dimensional model.
In another possible implementation manner, for any first preset limb point of the plurality of preset limb points, the first determining module is specifically configured to:
acquiring a plurality of first limb images corresponding to the first preset limb points and shooting distances and shooting angles of each first limb image from the limb information;
determining three-dimensional coordinates of the first preset limb points according to the plurality of first limb images and the shooting distance and shooting angle of each first limb image;
and determining a normal vector of a limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images and the shooting distance and the shooting angle of each first limb image.
In another possible embodiment, the first determining module 11 is specifically configured to:
determining parallax images of the first preset limb points according to the plurality of first limb images;
and determining the three-dimensional coordinates of the first preset limb point according to the parallax image and the shooting distance and the shooting angle of each first limb image.
In another possible embodiment, the first determining module 11 is specifically configured to:
determining voxels corresponding to the first preset limb points according to the plurality of first limb images, wherein the voxels are three-dimensional volume elements corresponding to the first preset limb points;
determining three-dimensional coordinates of each vertex of the voxel according to the shooting distance and the shooting angle of each first limb image;
determining an isosurface corresponding to the first preset limb point in the voxel according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point;
obtaining a normal vector of the isosurface;
and determining the normal vector of the limb surface where the first preset limb point is located as the normal vector of the isosurface.
In another possible embodiment, the first determining module 11 is specifically configured to:
Determining the coordinates of each vertex of the isosurface according to the vertex coordinates of the edge where each vertex of the isosurface is located, wherein the edge is the edge of the voxel;
obtaining a unit normal vector of each vertex of the isosurface;
and determining the normal vector of the isosurface according to the unit normal vector of each vertex of the isosurface.
In another possible embodiment, the second determining module 13 is specifically configured to:
determining at least one first preset matching area in the first three-dimensional model;
determining at least one second preset matching area in the second three-dimensional model;
and matching each first preset matching area with a corresponding second preset matching area to obtain the matching degree of each first preset matching area and the corresponding second preset matching area.
Fig. 6 is a schematic hardware structure diagram of a wearable object-based matching degree determining device according to an embodiment of the present invention. As shown in fig. 6, the wearable object-based matching degree determination device 20 includes: at least one processor 21 and a memory 22. Optionally, the wearable object-based matching degree determination device further comprises a communication component 23. The processor 21, the memory 22, and the communication component 23 are connected via a bus 24.
In a specific implementation, at least one processor 21 executes computer-executable instructions stored in the memory 22, so that the at least one processor 21 performs the method as shown in the above method embodiments.
The communication component 23 may exchange data with other devices.
The specific implementation process of the processor 21 can be referred to the above method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
In the embodiment shown in fig. 6, it should be understood that the processor may be a central processing unit (CPU), or another general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in a processor for execution.
The memory may comprise high speed RAM memory or may further comprise non-volatile storage NVM, such as at least one disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The present application also provides a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement the method shown in the method embodiments described above.
The computer readable storage medium described above may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. A readable storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Alternatively, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application specific integrated circuit (ASIC), or may reside as discrete components in a device.
The division of the units is merely a logical division of functions; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections via interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the embodiments of the present invention, and are not limited thereto; although embodiments of the present invention have been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions.

Claims (16)

1. A wearable object-based matching degree determination method, comprising:
determining a first three-dimensional model of a first limb of the user;
acquiring a second three-dimensional model of the wearable object corresponding to the first limb, wherein the first limb is used for wearing the wearable object;
determining at least one first preset matching area in the first three-dimensional model, wherein the comfort level requirement of the part of the first preset matching area corresponding to the first limb on the wearable object is greater than that of the parts of the other areas corresponding to the first limb;
determining at least one second preset matching area in the second three-dimensional model, wherein the second preset matching area and the first preset matching area have a preset corresponding relation;
And matching each first preset matching area with a corresponding second preset matching area to obtain the matching degree of each first preset matching area with the corresponding second preset matching area so as to determine the matching degree of the first limb and the wearable object.
2. The method of claim 1, wherein the determining the first three-dimensional model of the first limb of the user comprises:
acquiring limb information of the first limb, wherein the limb information comprises a plurality of limb images of the first limb, and shooting distance and shooting angle of each limb image;
and determining a first three-dimensional model of the first limb according to the limb information and the standard three-dimensional model corresponding to the first limb, wherein the first limb is a foot, and the first preset matching area corresponds to a position in the first limb and comprises a toe and a heel in the foot.
3. The method of claim 2, wherein the determining the first three-dimensional model of the first limb from the limb information and the standard three-dimensional model corresponding to the first limb comprises:
according to the limb information, determining characteristic parameters of each preset limb point in the first limb, wherein the characteristic parameters of one preset limb point comprise three-dimensional coordinates of the preset limb point and normal vectors of a limb surface where the preset limb point is located, and the first limb comprises a plurality of preset limb points;
And modifying model parameters of the standard three-dimensional model according to characteristic parameters of each preset limb point in the first limb to obtain the first three-dimensional model.
4. A method according to claim 3, wherein for any first preset limb point of the plurality of preset limb points, determining a characteristic parameter of the first preset limb point from the limb information comprises:
acquiring a plurality of first limb images corresponding to the first preset limb points and shooting distances and shooting angles of each first limb image from the limb information;
determining three-dimensional coordinates of the first preset limb points according to the plurality of first limb images and the shooting distance and shooting angle of each first limb image;
and determining a normal vector of a limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images and the shooting distance and the shooting angle of each first limb image.
5. The method of claim 4, wherein determining the three-dimensional coordinates of the first preset limb point according to the plurality of first limb images and the photographing distance and photographing angle of each first limb image comprises:
Determining parallax images of the first preset limb points according to the plurality of first limb images;
and determining the three-dimensional coordinates of the first preset limb point according to the parallax image and the shooting distance and the shooting angle of each first limb image.
6. The method according to claim 4, wherein determining the normal vector of the limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images, and the photographing distance and photographing angle of each first limb image comprises:
determining voxels corresponding to the first preset limb points according to the plurality of first limb images, wherein the voxels are three-dimensional volume elements corresponding to the first preset limb points;
determining three-dimensional coordinates of each vertex of the voxel according to the shooting distance and the shooting angle of each first limb image;
determining an isosurface corresponding to the first preset limb point in the voxel according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point;
obtaining a normal vector of the isosurface;
and determining the normal vector of the limb surface where the first preset limb point is located as the normal vector of the isosurface.
7. The method of claim 6, wherein the obtaining the normal vector of the iso-surface comprises:
determining the coordinates of each vertex of the isosurface according to the vertex coordinates of the edge where each vertex of the isosurface is located, wherein the edge is the edge of the voxel;
obtaining a unit normal vector of each vertex of the isosurface;
and determining the normal vector of the isosurface according to the unit normal vector of each vertex of the isosurface.
8. A wearable object-based matching degree determining device is characterized by comprising a first determining module, an acquiring module and a second determining module, wherein,
the first determining module is used for determining a first three-dimensional model of a first limb of a user;
the acquisition module is used for acquiring a second three-dimensional model of the wearable object corresponding to the first limb, and the first limb is used for wearing the wearable object;
the second determining module is configured to:
determining at least one first preset matching area in the first three-dimensional model, wherein the comfort level requirement of the part of the first preset matching area corresponding to the first limb on the wearable object is greater than that of the parts of the other areas corresponding to the first limb;
Determining at least one second preset matching area in the second three-dimensional model, wherein the second preset matching area and the first preset matching area have a preset corresponding relation;
and matching each first preset matching area with a corresponding second preset matching area to obtain the matching degree of each first preset matching area with the corresponding second preset matching area so as to determine the matching degree of the first limb and the wearable object.
9. The apparatus of claim 8, wherein the first determining module is specifically configured to:
acquiring limb information of the first limb, wherein the limb information comprises a plurality of limb images of the first limb, and shooting distance and shooting angle of each limb image;
and determining a first three-dimensional model of the first limb according to the limb information and the standard three-dimensional model corresponding to the first limb, wherein the first limb is a foot, and the first preset matching area corresponds to a position in the first limb and comprises a toe and a heel in the foot.
10. The apparatus of claim 9, wherein the first determining module is specifically configured to:
according to the limb information, determining characteristic parameters of each preset limb point in the first limb, wherein the characteristic parameters of one preset limb point comprise three-dimensional coordinates of the preset limb point and normal vectors of a limb surface where the preset limb point is located, and the first limb comprises a plurality of preset limb points;
And modifying model parameters of the standard three-dimensional model according to characteristic parameters of each preset limb point in the first limb to obtain the first three-dimensional model.
11. The apparatus of claim 10, wherein for any first preset limb point of the plurality of preset limb points, the first determining module is specifically configured to:
acquiring a plurality of first limb images corresponding to the first preset limb points and shooting distances and shooting angles of each first limb image from the limb information;
determining three-dimensional coordinates of the first preset limb points according to the plurality of first limb images and the shooting distance and shooting angle of each first limb image;
and determining a normal vector of a limb surface where the first preset limb point is located according to the three-dimensional coordinates of the first preset limb point, the plurality of first limb images and the shooting distance and the shooting angle of each first limb image.
12. The apparatus of claim 11, wherein the first determining module is specifically configured to:
determining parallax images of the first preset limb points according to the plurality of first limb images;
and determining the three-dimensional coordinates of the first preset limb point according to the parallax image and the shooting distance and the shooting angle of each first limb image.
13. The apparatus of claim 11, wherein the first determining module is specifically configured to:
determining voxels corresponding to the first preset limb points according to the plurality of first limb images, wherein the voxels are three-dimensional volume elements corresponding to the first preset limb points;
determining three-dimensional coordinates of each vertex of the voxel according to the shooting distance and the shooting angle of each first limb image;
determining an isosurface corresponding to the first preset limb point in the voxel according to the three-dimensional coordinates of each vertex of the voxel and the three-dimensional coordinates of the first preset limb point;
obtaining a normal vector of the isosurface;
and determining the normal vector of the limb surface where the first preset limb point is located as the normal vector of the isosurface.
14. The apparatus of claim 13, wherein the first determining module is specifically configured to:
determining the coordinates of each vertex of the isosurface according to the vertex coordinates of the edge where each vertex of the isosurface is located, wherein the edge is the edge of the voxel;
obtaining a unit normal vector of each vertex of the isosurface;
and determining the normal vector of the isosurface according to the unit normal vector of each vertex of the isosurface.
15. A wearable object-based matching degree determination device, comprising: a memory and a processor coupled to the memory;
the memory is configured to store a computer program;
the processor is configured to execute the computer program stored in the memory, so that the device executes the wearable object-based matching degree determination method according to any one of claims 1 to 7.
16. A readable storage medium comprising a program or instructions which, when run on a computer, perform the wearable object-based matching degree determination method according to any one of claims 1 to 7.
CN201811309225.XA 2018-11-05 2018-11-05 Wearable object-based matching degree determination method, device and equipment Active CN111147842B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811309225.XA CN111147842B (en) 2018-11-05 2018-11-05 Wearable object-based matching degree determination method, device and equipment


Publications (2)

Publication Number Publication Date
CN111147842A CN111147842A (en) 2020-05-12
CN111147842B true CN111147842B (en) 2023-05-02

Family

ID=70515757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811309225.XA Active CN111147842B (en) 2018-11-05 2018-11-05 Wearable object-based matching degree determination method, device and equipment

Country Status (1)

Country Link
CN (1) CN111147842B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101271574A (en) * 2008-03-20 2008-09-24 华南师范大学 Three-dimensional visualization method and device
CN102956004A (en) * 2011-08-25 2013-03-06 鸿富锦精密工业(深圳)有限公司 Virtual fitting system and method
CN104091269A (en) * 2014-06-30 2014-10-08 京东方科技集团股份有限公司 Virtual fitting method and virtual fitting system
CN104637084A (en) * 2015-01-29 2015-05-20 吴宇晖 Method for building garment virtual three-dimensional model and virtual garment trying-on system
CN104637083A (en) * 2015-01-29 2015-05-20 吴宇晖 Virtual fitting system
CN108010134A (en) * 2017-11-29 2018-05-08 湘潭大学 A kind of real-time three-dimensional virtual fit method based on mobile terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298897A1 (en) * 2010-06-08 2011-12-08 Iva Sareen System and method for 3d virtual try-on of apparel on an avatar



Similar Documents

Publication Publication Date Title
US11055890B2 (en) Electronic device for generating avatar and method thereof
EP3951721A1 (en) Method and apparatus for determining occluded area of virtual object, and terminal device
Delaunoy et al. Photometric bundle adjustment for dense multi-view 3d modeling
CN107564080B (en) Face image replacement system
US20190228556A1 (en) Estimating accurate face shape and texture from an image
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN111163303B (en) Image display method, device, terminal and storage medium
WO2018210308A1 (en) Blurring method and apparatus for image, storage medium, and electronic device
US11069115B2 (en) Method of controlling display of avatar and electronic device therefor
CN110706283B (en) Calibration method and device for sight tracking, mobile terminal and storage medium
CN110850961A (en) Calibration method of head-mounted display equipment and head-mounted display equipment
CN110825079A (en) Map construction method and device
US20220277512A1 (en) Generation apparatus, generation method, system, and storage medium
WO2020100111A1 (en) Methods and systems for evaluating the size of a garment
CN110688002A (en) Virtual content adjusting method and device, terminal equipment and storage medium
CN111147842B (en) Wearable object-based matching degree determination method, device and equipment
CN110533775B (en) Glasses matching method and device based on 3D face and terminal
KR20220158866A (en) Colored three-dimensional digital model generation
CN112017276A (en) Three-dimensional model construction method and device and electronic equipment
CN108764135B (en) Image generation method and device and electronic equipment
CN111710044A (en) Image processing method, apparatus and computer-readable storage medium
CN115937299A (en) Method for placing virtual object in video and related equipment
CN109300191A (en) AR model processing method and apparatus, electronic device, and readable storage medium
TWI637353B (en) Measurement device and measurement method
CN111368675A (en) Method, device and equipment for processing gesture depth information and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant