CN106920146B - Three-dimensional fitting method based on somatosensory characteristic parameter extraction - Google Patents


Info

Publication number
CN106920146B
CN106920146B (application CN201710090213.1A)
Authority
CN
China
Prior art keywords
fitting
model
parameter
clothing
clothes
Prior art date
Legal status
Active
Application number
CN201710090213.1A
Other languages
Chinese (zh)
Other versions
CN106920146A (en)
Inventor
郑紫微 (Zheng Ziwei)
赵婷 (Zhao Ting)
骆绪龙 (Luo Xulong)
郭建广 (Guo Jianguang)
Current Assignee
Ningbo University
Original Assignee
Ningbo University
Priority date
Filing date
Publication date
Application filed by Ningbo University filed Critical Ningbo University
Priority to CN201710090213.1A priority Critical patent/CN106920146B/en
Publication of CN106920146A publication Critical patent/CN106920146A/en
Application granted granted Critical
Publication of CN106920146B publication Critical patent/CN106920146B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0641: Shopping interfaces
    • G06Q30/0643: Graphical representation of items or shoppers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00: Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0022: ... arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G06K17/0029: ... the arrangement being specially adapted for wireless interrogation of grouped or bundled articles tagged with wireless record carriers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a three-dimensional fitting method based on somatosensory characteristic parameter extraction, used in a three-dimensional fitting system composed of a mobile Kinect camera, an RFID scanner, at least one garment with an RFID tag, a clothing model parameter database, a human body model parameter database, a central processing unit, a touch display screen and a fitting background generator. After a clothing fitting model is constructed and an original dynamic model of the human body is generated, a fitting background is produced. The human body model parameter database generates an actual dynamic human body model bearing the fitter's real facial image, and the clothing model parameter database yields a virtual clothing fitting model, which it adaptively drapes onto the corresponding actual dynamic human body model. The touch display screen sets the maximum deformation area of the virtual clothing fitting model during display and its deformation rate during deformation, so that the fitter can obtain the corresponding satisfactory garment.

Description

Three-dimensional fitting method based on somatosensory characteristic parameter extraction
Technical Field
The invention relates to the field of three-dimensional fitting, in particular to a three-dimensional fitting method based on somatosensory characteristic parameter extraction.
Background
In a traditional brick-and-mortar clothing store, people try garments on to learn whether a prospective purchase suits their body type and whether they are satisfied with the wearing effect. During major holidays, however, and especially in mall clothing stores, large numbers of consumers may crowd in to try on garments before buying.
This exposes a significant drawback of in-store purchasing: because the number of consumers is huge while the garments displayed in a physical store are limited, once several consumers want to try on the same garment (or the same style) at the same time, they must queue for it, wasting a great deal of their time. Consumers unwilling to queue may simply leave the store, reducing the purchasing efficiency between the store and its customers; this is also one reason online clothing shopping is favored by consumers.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a three-dimensional fitting method based on somatosensory characteristic parameter extraction that lets a consumer select a satisfactory wearing effect without having to try garments on in a brick-and-mortar clothing store, as required in the prior art.
The technical scheme adopted by the invention for solving the technical problems is as follows: a three-dimensional fitting method based on somatosensory characteristic parameter extraction is used for a three-dimensional fitting system formed by a mobile Kinect camera, an RFID scanner, at least one piece of clothes with an RFID label, a clothes model parameter database, a human body model parameter database, a central processing unit, a touch display screen and a fitting background generator, and is characterized by comprising the following steps 1-8:
step 1, scanning RFID labels on various clothes through an RFID scanner to obtain clothes parameters corresponding to the clothes, storing the obtained clothes parameters into a clothes model parameter database, and then generating an original clothes model corresponding to the obtained clothes parameters through the clothes model parameter database; the clothing parameters at least comprise clothing texture, clothing neckline size, clothing width, clothing length and clothing sleeve length, and the clothing texture represents the density of clothing cloth; the process of generating the original clothing model by the clothing model parameter database comprises the following steps 1-1 to 1-3:
step 1-1, let the two original particles on the original garment model to be generated be i and j, and obtain the distance l_ij between them:

l_ij = √((x_i − x_j)² + (y_i − y_j)² + (z_i − z_j)²)

where the coordinates of particle i are (x_i, y_i, z_i) and the coordinates of particle j are (x_j, y_j, z_j);
Step 1-2, according to the clothing texture corresponding to clothing and the compensation index constant corresponding to the clothing texture, presetting the space displacement index of each particle coordinate when the original clothing model is modeled by a clothing model parameter database; the spatial displacement index is labeled Δ, where:
spatial displacement index of each particle coordinate on original clothing model
Figure GDA0002589048240000022
Lambda is a compensation index constant corresponding to the texture of the garment;
step 1-3, according to the spatial displacement index of each particle coordinate, obtain the new coordinates of the two particles after spatial displacement, thereby obtaining the original garment model for the acquired clothing parameters; wherein:

the particles obtained from original particles i and j after spatial displacement are denoted i′ and j′; the coordinates of particle i′ are (x_i′, y_i′, z_i′) and the coordinates of particle j′ are (x_j′, y_j′, z_j′); the distance between particles i′ and j′ is

l_i′j′ = √((x_i′ − x_j′)² + (y_i′ − y_j′)² + (z_i′ − z_j′)²)

with x_i′ = x_i + Δ, y_j′ = y_j + Δ, z_j′ = z_j + Δ, where Δ is the spatial displacement index from step 1-2;
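The particle arithmetic of steps 1-1 and 1-3 can be sketched directly. The helper names below are ours, and Δ is passed in as a plain number because its defining formula in step 1-2 appears only as an image in the source.

```python
import math

def particle_distance(p, q):
    # Euclidean distance l_ij between particles p = (x, y, z) and q
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def displace(p, delta):
    # Shift every coordinate of a particle by the spatial displacement
    # index delta (step 1-2); delta's own formula is image-only in the
    # source, so it is supplied here as an argument.
    return tuple(c + delta for c in p)

i, j = (0.0, 0.0, 0.0), (3.0, 4.0, 0.0)
print(particle_distance(i, j))                                # 5.0
print(particle_distance(displace(i, 0.1), displace(j, 0.1)))  # 5.0
```

Note that shifting both particles by the same Δ leaves the inter-particle distance unchanged; any change in l_i′j′ relative to l_ij must come from per-particle differences in the displacement index.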
step 2, pre-storing a human body initial model in the three-dimensional fitting system, and generating a human body original dynamic model aiming at different somatosensory characteristic parameters by the three-dimensional fitting system according to the somatosensory characteristic parameters of the externally input human body; the somatosensory characteristic parameters at least comprise height parameters, shoulder width parameters, chest parameters, waist parameters, hip parameters, arm length parameters, leg thickness parameters, foot length parameters, foot width parameters and neck thickness parameters; the generation process of the human body original dynamic model at least comprises the following steps 2-1 to 2-5:
step 2-1, the three-dimensional fitting system extracts the original length parameter data from the pre-stored somatosensory characteristic parameters of the human body initial model; the pre-stored human body initial model is denoted C_0, and the original length parameter data comprise at least a height parameter, shoulder width parameter, chest circumference parameter, waist circumference parameter, hip circumference parameter, arm length parameter, leg thickness parameter, foot length parameter, foot width parameter and neck thickness parameter;
step 2-2, initialize each original length parameter datum in the somatosensory characteristic parameters of the human body initial model to obtain the corresponding initialized data values; the set of initialized original length parameter data in the human body initial model is S_0, containing the height, shoulder width, chest circumference, waist circumference, hip circumference, arm length, leg thickness, foot length, foot width and neck thickness parameters; n = 1, 2, …, 11; the initialized original length parameter data value in the human body initial model is denoted L_xn, where x_n represents the n-th initialized original length parameter of the human body initial model;
step 2-3, extract the length data of the human body from the received externally input somatosensory characteristic parameters, and screen out the length parameter data corresponding to step 2-1; the screened length parameter data are denoted L′_yn, where y_n is the name of the human-body length parameter corresponding to x_n;
step 2-4, from the initialized data value L_xn of step 2-2 and the externally input corresponding length parameter data value L′_yn, obtain the matching correction error parameter value for each length parameter used in constructing the original dynamic model of the human body; the value corresponding to the n-th length parameter is denoted ω_n:

ω_n = L′_yn − L_xn
Step 2-5, correcting the error parameter value omega according to the obtained matchingnAnd the length original parameter data value L initialized in the human body initial modelxnGenerating a human body original dynamic model aiming at different somatosensory characteristic parameters; wherein, each length parameter mark in the constructed human body original dynamic model is Lyn
Lyn=Lxnn;n=1,2,…,11;
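Steps 2-2 through 2-5 amount to nudging each initialized length L_xn by the correction ω_n so the model matches the externally measured value. A minimal sketch follows; the dictionary keys and the plain-difference reading of ω_n are our assumptions, since the patent's ω_n formula appears only as an image.

```python
def build_dynamic_model(initialized, measured):
    # initialized: L_xn values from the pre-stored initial model C_0
    # measured:    externally input L'_yn values for the same parameters
    model = {}
    for name, l_xn in initialized.items():
        omega_n = measured[name] - l_xn   # matching correction error (assumed plain difference)
        model[name] = l_xn + omega_n      # L_yn = L_xn + omega_n (step 2-5)
    return model

initialized = {"height": 170.0, "shoulder_width": 42.0, "waist": 78.0}
measured = {"height": 168.0, "shoulder_width": 44.0, "waist": 80.0}
print(build_dynamic_model(initialized, measured))
```

With the plain-difference reading, the generated model simply reproduces the measured lengths; a more elaborate ω_n (weighted or clamped, say) would instead make the model a compromise between C_0 and the measurements.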
Step 3, the fitting background generator generates fitting original background databases aiming at different fitting environments in advance according to the command of the central processing unit; after a fitting person carries a garment to be fitted into a three-dimensional fitting area, scanning an RFID label on the fitting garment by an RFID scanner to obtain various garment parameters corresponding to the fitting garment, and sending the garment parameters corresponding to the fitting garment to a central processing unit by the RFID scanner; the method comprises the steps that a mobile Kinect camera collects actual somatosensory characteristic parameters of a fitting person and a real face image of the fitting person in a mode of moving around the fitting person for at least one circle, and sends the obtained actual somatosensory characteristic parameters of the fitting person and the real face image to a human body model parameter database;
step 4, the human body model parameter database generates, from the stored actual somatosensory characteristic parameters and real facial image, an actual dynamic human body model matched to the fitter and bearing the fitter's real facial image;
the clothing model parameter database retrieves the corresponding original garment model according to the clothing parameters of the garment to be tried on;
the clothing model parameter database then fine-tunes the original garment model according to the fitting command processed by the central processing unit to obtain a virtual clothing fitting model;
the fine-tuning process by which the clothing model parameter database obtains the virtual clothing fitting model comprises the following steps 4-1 to 4-5:
step 4-1, the central processing unit sets the preset deformation control parameters P_clothes of the original garment model and, according to the fitter's somatosensory characteristic parameters P_user, sets the maximum deformation index S_max and the minimum deformation index S_min of the original garment model in the horizontal direction; the preset deformation control parameters of the original garment model correspond one-to-one to the garment collar size, garment width, garment length and garment sleeve length of step 1, and the preset deformation control parameters P_clothes correspond one-to-one to the fitter's somatosensory characteristic parameters P_user;
step 4-2, from the preset deformation control parameters P_clothes of the original garment model, the fitter's somatosensory characteristic parameters P_user, and the maximum and minimum horizontal deformation indices S_max and S_min, the central processing unit calculates the adaptive fitting deformation ratio R of the clothing parameters for the original garment model during fitting, and sends R to the clothing model parameter database for processing:

when P_user ≥ P_clothes, the adaptive fitting deformation ratio R is given by a first formula, and when P_user < P_clothes by a second formula; both formulas appear only as images in the source;
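Because the two branch formulas for R are image-only, the sketch below substitutes an assumed stand-in, a proportional stretch P_user/P_clothes clamped to [S_min, S_max]. It only illustrates the branching on P_user versus P_clothes and the role of the deformation bounds, not the patent's actual formulas.

```python
def adaptive_ratio(p_user, p_clothes, s_min, s_max):
    # Illustrative stand-in for step 4-2: stretch in proportion to the
    # fitter/garment parameter ratio, clamped to the deformation bounds
    # so the on-screen garment never distorts beyond [S_min, S_max].
    r = p_user / p_clothes
    return max(s_min, min(s_max, r))

print(adaptive_ratio(96.0, 90.0, 0.9, 1.05))  # clamped to the maximum, 1.05
print(adaptive_ratio(84.0, 90.0, 0.9, 1.05))  # within bounds
```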
step 4-3, the clothing model parameter database adjusts each original clothing parameter of the original garment model in the same proportion according to the adaptive fitting deformation ratio R, obtaining a fine-tuned clothing fitting model that suits the comfort of the human body;
step 4-4, preset a wind index in the clothing model parameter database, and obtain the virtual fitting spatial displacement index of each particle on the fine-tuned clothing fitting model from that model's clothing texture parameter and the adaptive fitting deformation ratio R; the virtual fitting spatial displacement index of any particle x on the fine-tuned clothing fitting model is denoted ζ(x), given by a formula that appears only as an image in the source, together with

f_wx = k_w · (v_w − v_x)

where C is the fine-tuned clothing fitting model; ρ(x) is the density of particle x on C and represents the corresponding clothing texture parameter; S_C is the overall surface area of C; g is the gravitational acceleration at the fitter's geographical location; f_wx is the wind force on particle x of C; θ is the angle between the wind direction and the direction of gravitational acceleration; k_w is the preset wind index constant; v_w is the wind speed; v_x is the speed of particle x on C; ψ is the bonding index for the degree of adhesion of the cloth corresponding to C; α is the deviation angle of C's central axis from its original position during fitting; and R is the adaptive fitting deformation ratio of the clothing parameters;
step 4-5, the clothing model parameter database compares the obtained virtual fitting spatial displacement index of each particle on the fine-tuned clothing fitting model with a preset virtual fitting spatial displacement threshold constant, denoted ζ_0, and a preset human-body-to-clothing spacing constant, denoted D_0, to generate the virtual clothing fitting model, denoted C′, used during virtual fitting:
when ζ_0 ≤ ζ(x) ≤ D_0, particle x on the fine-tuned clothing fitting model C is moved to position x′ to obtain the virtual clothing fitting model; when ζ(x) < ζ_0 or ζ(x) > D_0, particle x is not moved, and the original fine-tuned clothing fitting model continues to serve as the virtual clothing fitting model; wherein:
x′ is the actual position of particle x on the fine-tuned clothing fitting model C during virtual fitting, with x′ = ζ²(x)·g·R; g is the gravitational acceleration at the fitter's geographical location and R is the adaptive fitting deformation ratio of the clothing parameters; the virtual clothing fitting model C′ is formed by the moved particles x of the fine-tuned clothing fitting model C;
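The move-or-keep decision of step 4-5 is mechanical once ζ(x) is known. A sketch with scalar positions follows; the relation x′ = ζ²(x)·g·R is read from the text, while the function name and scalar simplification are our assumptions.

```python
def updated_position(x_pos, zeta_x, zeta0, d0, g, r):
    # Move the particle only when its displacement index lies in
    # [zeta0, D0]; otherwise keep the fine-tuned model's position.
    if zeta0 <= zeta_x <= d0:
        return zeta_x ** 2 * g * r   # x' = zeta(x)^2 * g * R
    return x_pos                     # particle left unchanged

print(updated_position(1.0, 0.05, 0.01, 0.1, 9.8, 1.0))   # moved
print(updated_position(1.0, 0.005, 0.01, 0.1, 9.8, 1.0))  # kept: 1.0
```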
step 5, the clothing model parameter database in the three-dimensional fitting system adaptively drapes the virtual clothing fitting model onto the corresponding actual dynamic human body model; this draping process comprises the following steps 5-1 to 5-5:
step 5-1, the clothing model parameter database partitions the virtual clothing fitting model into M independent cloth partitions; the r-th cloth partition is denoted C_r, r ≤ M;
step 5-2, the clothing model parameter database likewise partitions the actual dynamic human body model into M independent body-model partitions in one-to-one correspondence with the cloth partitions, forming M cloth-to-body-model pairs; the r-th body-model partition is denoted B_r, with B_r corresponding one-to-one to cloth partition C_r; r ≤ M;
step 5-3, within the M cloth-to-body-model pairs, the clothing model parameter database computes in turn the vertical distance between each cloth partition and its corresponding body-model partition; the vertical distance between the r-th cloth partition C_r and the r-th body-model partition B_r is denoted H_r, r ≤ M;
step 5-4, from the vertical distance between each cloth partition and its corresponding body-model partition, the clothing model parameter database obtains the optimal fit distance between them, and takes this optimal fit distance as the clearance value reserved when the virtual clothing fitting model is draped on the actual dynamic human body model; the optimal fit distance is denoted h_opt, given by a formula that appears only as an image in the source, with r ≤ M, where H_r is the vertical distance between the r-th cloth partition and the r-th body-model partition, D_0 is the preset human-body-to-clothing spacing constant, ζ_0 is the preset virtual fitting spatial displacement threshold constant, and R is the adaptive fitting deformation ratio of the clothing parameters;
step 5-5, using the clearance value obtained in step 5-4 and the actual dynamic human body model, the clothing model parameter database drapes the virtual clothing fitting model onto the corresponding actual dynamic human body model;
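Steps 5-1 to 5-3 can be sketched as pairing partitions and measuring distances. Each partition is reduced to a single representative surface height here (our simplification), and h_opt itself is not computed because its formula is image-only.

```python
def vertical_distances(cloth_parts, body_parts):
    # Pair cloth partition C_r with body-model partition B_r one-to-one
    # (step 5-2) and compute each vertical distance H_r (step 5-3).
    assert len(cloth_parts) == len(body_parts)  # M pairs required
    return [abs(c - b) for c, b in zip(cloth_parts, body_parts)]

H = vertical_distances([10.0, 11.0, 12.5], [9.0, 9.5, 11.0])
print(H)  # the H_r inputs to the image-only h_opt formula of step 5-4
```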
step 6, the touch display screen of the three-dimensional fitting system sets, according to the clothing parameters of the virtual clothing fitting model, the maximum deformation area within which the model may deform during display and the deformation rate at which it deforms, and displays the virtual clothing fitting model within the limited maximum deformation area; the maximum deformation area is obtained through the following steps 6-1 to 6-3:
step 6-1, the touch display screen of the three-dimensional fitting system constructs a new three-dimensional coordinate system from the shoulder width and waist circumference parameters among the clothing parameters of the virtual clothing fitting model;
step 6-2, from the shoulder width and waist circumference parameters and the constructed coordinate system, the touch display screen obtains the shoulder-width left point A with coordinates (x_shoulder-L, y_shoulder-L, z_shoulder-L), the shoulder-width right point A′ (x_shoulder-R, y_shoulder-R, z_shoulder-R), the waist left point B (x_waist-L, y_waist-L, z_waist-L) and the waist right point B′ (x_waist-R, y_waist-R, z_waist-R);
step 6-3, from these four coordinates the touch display screen calculates the limited maximum deformation area, denoted Region_max and given by a formula that appears only as an image in the source; any point (x, y, z) on the virtual clothing fitting model must satisfy an inequality that likewise appears only as an image;
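Since the Region_max inequality is image-only, the sketch below substitutes an axis-aligned bounding box over the four shoulder/waist corner points, purely to illustrate how a point test against the limited deformation area could look. The numbers and the box reading are assumptions, not the patent's formula.

```python
def in_region(point, corners):
    # Bounding-box stand-in for Region_max: a point lies inside when each
    # coordinate falls between the extremes of the shoulder-left/right
    # and waist-left/right corner points.
    lo = [min(c[k] for c in corners) for k in range(3)]
    hi = [max(c[k] for c in corners) for k in range(3)]
    return all(lo[k] <= point[k] <= hi[k] for k in range(3))

A  = (-20.0, 150.0, 0.0)   # shoulder-width left point (illustrative)
Ar = ( 20.0, 150.0, 0.0)   # shoulder-width right point
B  = (-15.0, 110.0, 0.0)   # waist left point
Br = ( 15.0, 110.0, 0.0)   # waist right point
print(in_region((0.0, 130.0, 0.0), [A, Ar, B, Br]))   # True
print(in_region((30.0, 130.0, 0.0), [A, Ar, B, Br]))  # False
```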
step 7, the fitter inputs the name of the desired fitting background image through the central processing unit, and the fitting background generator generates the corresponding fitting background image; the touch display screen performs depth preprocessing on the generated fitting background image so as to display the fitter's chosen background in high definition; the depth preprocessing comprises the following:
expand the edge of the fitting background image outward by several pixels in the horizontal direction so that the depth values over the image surface remain consistent, avoiding geometric distortion of the fitting background image; the expansion model for the edge appears only as an image in the source, where d(x, y) is the gray level of the pixel at coordinates (x, y) before edge expansion, d′(x, y) is the gray level at (x, y) after expansion, and dir(x, y) is the outward expansion direction at (x, y): 1 means expand right in the horizontal direction, −1 expand left, and 0 no horizontal expansion; k is the number of pixels by which (x, y) is expanded outward, determined by a second formula that also appears only as an image; th is a preset threshold constant for preprocessing the fitting background image selected by the fitter on the touch display screen;
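The patent's expansion model and dir(x, y) selector are image-only; one plausible reading, sketched per pixel row, is to overwrite up to k edge pixels with the nearest interior depth whenever the edge jump exceeds th. The function name and logic below are our assumptions, not the patent's formula.

```python
def smooth_row_edges(depth_row, k, th):
    # Replace the outer k pixels at each end of a horizontal row with the
    # nearest interior depth value when the edge differs from it by more
    # than th, keeping surface depth values consistent (step 7).
    row = list(depth_row)
    if abs(row[0] - row[k]) > th:        # left edge inconsistent
        row[:k] = [row[k]] * k
    if abs(row[-1] - row[-k - 1]) > th:  # right edge inconsistent
        row[-k:] = [row[-k - 1]] * k
    return row

print(smooth_row_edges([9, 9, 2, 2, 2, 9, 9], k=2, th=3))  # [2, 2, 2, 2, 2, 2, 2]
```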
step 8, once the fitter is satisfied with how the virtual clothing fitting model fits the actual dynamic human body model under the chosen fitting background image, the fitter inputs a fitting-satisfied instruction to the central processing unit, which retrieves the clothing parameters corresponding to the instruction from the clothing model parameter database and feeds them back so that the fitter can obtain the corresponding satisfactory garment.
Further, after step 8 the method may also comprise the following steps:
step 9, after the fitter inputs the fitting-satisfied instruction to the central processing unit of the three-dimensional fitting system, the touch display screen shows a prompt asking whether the fitter wishes to obtain the fitting-effect model and the corresponding actual dynamic human body models; once the fitter selects the instruction to obtain the fitting-effect model, the screen prompts the fitter for payment;
step 10, the three-dimensional fitting system pre-stores an encrypted-image extraction key with the bank system; after the fitter enters a bank card number and payment password through the touch display screen, the system continuously captures a preset number of the fitter's facial images as the original images to be encrypted, using the fitter's biometrically unique facial images as a preliminary security measure; the received payment information, comprising the card number and password, undergoes image-based hybrid encryption to produce an encrypted image, and the touch display screen securely delivers the payment information embedded in that image to the corresponding bank system for decryption;
step 11, the bank system uses the extraction key of the embedded secret image pre-stored with the three-dimensional fitting system to extract the payment information embedded in the fitter's facial image; when the bank system judges that the extracted payment information matches the payment information the fitter pre-stored with the bank, it completes the deduction from the fitter's bank account and the transfer to the three-dimensional fitting system's account.
Compared with the prior art, the invention has the following advantages.
First, when the three-dimensional fitting system constructs the original garment model, it takes into account the cloth texture of different garments and the coordinates of the model's particles in all three spatial directions. This fully meets the requirements of a three-dimensional fitting effect, avoids the limitation of traditional virtual fitting schemes that consider only the two-dimensional fitting effect, better matches the fitter's real try-on experience, and lets a garment buyer inspect the virtual garment in three dimensions. Moreover, by capturing the fitter's real facial image and adding it to the subsequent actual dynamic human body model, the Kinect camera presents a more realistic fitting effect to the fitter.
Second, in obtaining the virtual clothing fitting model, the central processing unit sets the maximum and minimum horizontal deformation indices of the original garment model. This ensures normal horizontal display on the touch display screen, prevents the model from distorting when it deforms, and lets the fitter view the original garment model without distortion.
Third, the virtual fitting spatial displacement index is introduced, folding into one calculation the wind forces acting on the fine-tuned clothing fitting model, the bonding state of the cloth, and the gravity on each particle of the garment; this satisfies the requirements of virtual fitting and better matches the fitter's real fitting environment. Setting the deformation rate of the virtual clothing fitting model matches the display speed of the whole model on the touch display screen, so its display does not lag.
Finally, after the fitting background image is generated, the touch display screen performs depth preprocessing on it so that the fitter's chosen background is displayed in high definition; expanding the image edge by several pixels in the horizontal direction keeps the surface depth values consistent, avoids geometric distortion, and produces a more lifelike fitting effect against the chosen background image.
Drawings
Fig. 1 is a schematic diagram of a frame structure of a three-dimensional fitting system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a three-dimensional fitting method based on somatosensory characteristic parameter extraction in the embodiment of the invention.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings and embodiments.
As shown in fig. 2, the three-dimensional fitting method based on somatosensory characteristic parameter extraction of this embodiment is used with a three-dimensional fitting system formed by a mobile Kinect camera, an RFID scanner, at least one piece of clothing carrying an RFID tag, a clothing model parameter database, a human body model parameter database, a central processing unit, a touch display screen and a fitting background generator; the Kinect camera itself is prior art and is not described in detail here, and the three-dimensional fitting system is shown in fig. 1. The three-dimensional fitting method comprises the following steps 1 to 8:
Step 1: the RFID scanner scans the RFID tags on the various garments to obtain the clothing parameters corresponding to each garment, stores them in the clothing model parameter database, and the database then generates the original clothing model corresponding to those parameters. The clothing parameters at least comprise clothing texture, neckline size, garment width, garment length and sleeve length, where the clothing texture represents the density of the cloth: cloths of different textures have different densities (for example, a cotton garment and a polyester garment differ in texture), and the texture of each garment cloth is known in advance. Because step 1 fully considers the textures adopted by different garments, the result better matches the effect of really trying on the clothes. The process by which the clothing model parameter database generates the original clothing model comprises the following steps 1-1 to 1-3:
Step 1-1: let two original mass points on the original garment model to be generated be i and j, with coordinates (x_i, y_i, z_i) and (x_j, y_j, z_j) respectively; the distance l_ij between them is

l_ij = √((x_i − x_j)² + (y_i − y_j)² + (z_i − z_j)²)

Because the coordinates of each original mass point in all three spatial directions are considered, the requirements of a three-dimensional fitting effect are fully met and the limitation of traditional virtual fitting schemes, which consider only the fitting effect on a two-dimensional plane, is avoided, so the original garment model formed in this invention better matches the real garment and satisfies the buyer's three-dimensional observation of the virtual garment;
Step 1-2: according to the clothing texture of the garment and the compensation index constant λ corresponding to that texture, the clothing model parameter database presets the spatial displacement index of each mass point coordinate used when modelling the original clothing model; the spatial displacement index is denoted Δ. [The closed-form expression for Δ in terms of the texture density and λ is rendered as an image in the source and is not reproduced here.] In an actual finished garment, the overall weight of the garment causes every mass point on it to shift slightly; for the original garment model to be established, introducing a spatial displacement index on the mass point coordinates therefore reflects the real displacement of each point on a real garment more faithfully and enhances the fidelity of the original garment model formed subsequently;
Step 1-3: according to the spatial displacement index of each mass point coordinate, obtain the new coordinates of the two mass points after the spatial displacement change, and thereby the original clothing model for the obtained clothing parameters. The mass points corresponding to i and j after the spatial displacement are denoted i' and j'; the coordinates of i' are (x_i', y_i', z_i') and those of j' are (x_j', y_j', z_j'); the distance between i' and j' is

l_i'j' = √((x_i' − x_j')² + (y_i' − y_j')² + (z_i' − z_j')²)

with x_i' = x_i + Δ, y_i' = y_i + Δ, z_i' = z_i + Δ (and likewise for j'), where Δ is the spatial displacement index of step 1-2. By adding the spatial displacement index to each original mass point coordinate, the real displacement of each point on the real garment is reflected more faithfully, further enhancing the fidelity of the original garment model formed here;
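Steps 1-1 to 1-3 can be sketched as follows. Since the exact expression for Δ is rendered as an image in the original patent, the sketch takes Δ as a given value; the function names and the numeric values are illustrative assumptions, not part of the patent.

```python
import math

def distance(p, q):
    """Euclidean distance between two 3-D mass points (step 1-1)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def displace(point, delta):
    """Shift every coordinate of a mass point by the spatial
    displacement index delta (steps 1-2 and 1-3)."""
    return tuple(c + delta for c in point)

# Two illustrative original mass points i and j on the garment model.
i = (0.0, 0.0, 0.0)
j = (3.0, 4.0, 0.0)
l_ij = distance(i, j)            # 5.0

delta = 0.1                      # stand-in value for the image-only formula
i_p, j_p = displace(i, delta), displace(j, delta)
l_ipjp = distance(i_p, j_p)      # a uniform shift preserves the distance
```

Note that adding the same Δ to both points leaves l_i'j' equal to l_ij; since the patent makes Δ depend on the garment texture, different mass points would presumably receive different displacements in practice.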
Specifically, in the three-dimensional fitting system of this embodiment, the touch display screen is provided with a zoom control button for enlarging or reducing the original garment model and the virtual garment fitting model, so that the fitting person can conveniently adjust and observe them. The touch display screen is also provided with a virtual visual-field distance control button, adapted to fitting persons observing the garment model from different distances; it increases the displayed distance between the virtual garment fitting model and the position of the fitting person, so that the fitting person can observe the overall virtual fitting effect at any desired distance, including the wearing effect as seen from the perspective of passers-by.
Step 2: a human body initial model is pre-stored in the three-dimensional fitting system, and the system generates human body original dynamic models for different somatosensory characteristic parameters according to the externally input somatosensory characteristic parameters of the human body; different people, for example male and female consumers, have different characteristic parameters. The somatosensory characteristic parameters at least comprise a height parameter, shoulder width parameter, chest circumference parameter, waist circumference parameter, hip circumference parameter, arm length parameter, leg thickness parameter, foot length parameter, foot width parameter and neck thickness parameter. The generation of the human body original dynamic model at least comprises the following steps 2-1 to 2-5:
Step 2-1: the three-dimensional fitting system extracts the original length parameter data from the pre-stored somatosensory characteristic parameters of the human body initial model, denoted C_0; the original length parameter data at least comprise the height, shoulder width, chest circumference, waist circumference, hip circumference, arm length, leg thickness, foot length, foot width and neck thickness parameters;
Step 2-2: each original length parameter of the human body initial model is initialized to obtain the corresponding initialized value. The set of initialized original length parameters of the initial model is denoted S_0 and includes the parameters listed above. The initialized value of the n-th original length parameter is denoted L_xn, where x_n is the n-th original length parameter of the initial model and n = 1, 2, …, 11. For example, if the 2nd original length parameter is the shoulder width parameter, it is written x_2 and its initialized value L_x2; each original length parameter x_n corresponds one-to-one with its initialized value L_xn;
Step 2-3: according to the received, externally input somatosensory characteristic parameters of the human body, extract the length data of the body and screen out the length parameter data corresponding to step 2-1. The screened length parameter data are denoted L'_yn, where y_n is the name of the human body length parameter corresponding to x_n. For example, since x_2 denotes the 2nd original length parameter in step 2-2, y_2 corresponds to the shoulder width parameter and L'_y2 is the screened shoulder width data; likewise x_4 denotes the 4th original parameter, y_4 corresponds to the waist circumference parameter, and L'_y4 is the screened waist circumference data;
Step 2-4: from the initialized values L_xn of step 2-2 and the externally input values L'_yn, obtain the matching correction error parameter value corresponding to each length parameter when constructing the human body original dynamic model; the matching correction error for the n-th length parameter is denoted ω_n. [Its closed-form expression is rendered as an image in the source and is not reproduced here.] When constructing the model, the accumulated sums and products of the screened values L'_yn and of the initialized values L_xn are calculated to obtain ω_n, which compensates the errors produced when computing each length datum and thus yields more accurate length data for generating the subsequent human body original dynamic model;
Step 2-5: according to the obtained matching correction errors ω_n and the initialized values L_xn, generate the human body original dynamic models for the different somatosensory characteristic parameters; each length parameter of the constructed model is denoted L_yn, with

L_yn = L_xn + ω_n; n = 1, 2, …, 11.

Because each length parameter value L_yn of the constructed human body original dynamic model C_1 adds the matching correction error ω_n on top of the initialized value L_xn, the generated model has higher fidelity, which in turn secures the fidelity of every model in the three-dimensional fitting process;
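Steps 2-2 to 2-5 can be sketched as follows. The patent's closed form for ω_n (built from accumulated sums and products) is rendered as an image in the source, so the simple difference below is an illustrative stand-in only; the parameter names and numbers are assumed.

```python
# Sketch of steps 2-2 to 2-5: correcting the initialized length
# parameters of the initial model C0 toward the scanned values.
initial = {   # L_xn: initialized values of the human body initial model
    "height": 170.0, "shoulder_width": 42.0, "waist": 78.0,
}
scanned = {   # L'_yn: values screened from the externally input scan
    "height": 176.0, "shoulder_width": 44.5, "waist": 81.0,
}

omega = {k: scanned[k] - initial[k] for k in initial}    # stand-in for omega_n
corrected = {k: initial[k] + omega[k] for k in initial}  # L_yn = L_xn + omega_n
```

With this stand-in the corrected values simply equal the scanned ones; the patent's actual ω_n presumably blends the two sources rather than replacing one with the other.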
Step 3: the fitting background generator generates, in advance and on command of the central processing unit, original fitting background databases for different fitting environments. After the fitting person enters the three-dimensional fitting area with the garment to be tried on, the RFID scanner scans the RFID tag on the garment, obtains its various clothing parameters and sends them to the central processing unit. The mobile Kinect camera collects the actual somatosensory characteristic parameters and the real face image of the fitting person by moving around the person at least one full circle, and sends both to the human body model parameter database. Acquiring the real face image of the fitting person, that is, the consumer's actual face, and adding it to the subsequent human body actual dynamic model allows a more realistic fitting effect to be displayed; compared with current fitting schemes that use a model's head portrait, this better serves the real purpose of the fitting. As an improvement, in step 3 the touch display screen may additionally offer garment types and garment names for the fitting person to choose, so that after a garment to be tried on is selected on the screen, the corresponding garment model is called from the clothing model parameter database and displayed on the touch display screen.
Step 4: the human body model parameter database generates, from the stored actual somatosensory characteristic parameters and real face image of the fitting person, a human body actual dynamic model matched to the fitting person and carrying the fitting person's real face;
the clothing model parameter database calls the original clothing model corresponding to the clothing parameters of the garment to be tried on;
the clothing model parameter database then micro-adjusts the original clothing model, according to the fitting command processed by the central processing unit, to obtain the virtual garment fitting model;
the micro-adjustment process by which the clothing model parameter database obtains the virtual garment fitting model comprises the following steps 4-1 to 4-5:
Step 4-1: the central processing unit sets the preset deformation control parameters P_clothes of the original clothing model and, according to the somatosensory characteristic parameters P_user of the fitting person, sets the maximum deformation index S_max and the minimum deformation index S_min of the original clothing model in the horizontal direction. The preset deformation control parameters correspond one-to-one with the neckline size, garment width, garment length and sleeve length of step 1, and the parameters P_clothes correspond one-to-one with the parameters P_user. Setting the horizontal maximum and minimum deformation indexes through the central processing unit allows normal horizontal display of the original clothing model on the screen while avoiding deformation distortion during deformation;
Step 4-2: according to the preset deformation control parameters P_clothes of the original clothing model, the somatosensory characteristic parameters P_user of the fitting person, and the horizontal maximum and minimum deformation indexes S_max and S_min, the central processing unit calculates the adaptive fitting deformation ratio R of the clothing parameters of the original clothing model during fitting, and sends R to the clothing model parameter database for processing: when P_user ≥ P_clothes, that is, when the somatosensory characteristic parameter value is not less than the preset deformation control parameter value, R takes one closed form; when P_user < P_clothes, that is, when the somatosensory characteristic parameter value is smaller than the preset deformation control parameter value, R takes the other. [Both expressions for R are rendered as images in the source and are not reproduced here.]
Introducing the adaptive fitting deformation ratio R of the clothing parameters lets the three-dimensional fitting system adjust automatically according to the real somatosensory characteristic parameters of the fitting person, controlling the deformation of every parameter of the original clothing model so that all clothing parameters deform synchronously in the same proportion and the generated garment model is not distorted in any direction;
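The role of R in steps 4-2 and 4-3 can be sketched as follows. Because the patent's two closed-form expressions for R are rendered as images in the source, the sketch substitutes a simple ratio clamped into [S_min, S_max]; that clamping rule, the function names and the numbers are all assumptions for illustration.

```python
def adaptive_ratio(p_user, p_clothes, s_min, s_max):
    """Stand-in for the adaptive fitting deformation ratio R of step 4-2.
    The patent's two closed forms (for P_user >= P_clothes and
    P_user < P_clothes) are image-only; this sketch simply clamps the
    raw ratio into [s_min, s_max] so the control flow can be shown."""
    raw = p_user / p_clothes
    return max(s_min, min(s_max, raw))

def deform(params, r):
    """Step 4-3: scale every garment parameter by the same ratio R so the
    model deforms synchronously and in the same proportion everywhere."""
    return {name: value * r for name, value in params.items()}

garment = {"collar": 40.0, "width": 52.0, "length": 70.0, "sleeve": 60.0}
r = adaptive_ratio(p_user=55.0, p_clothes=52.0, s_min=0.8, s_max=1.2)
fitted = deform(garment, r)
```

Scaling every parameter by one shared R is what guarantees the "synchronous, same-proportion" deformation the text emphasizes.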
Step 4-3: the clothing model parameter database scales every original clothing parameter of the original clothing model by the same adaptive fitting deformation ratio R to obtain a micro-adjusted garment fitting model that matches human comfort. The ratio R both guarantees synchronous, equal-proportion deformation in every direction of the original model, preserving the model's realism, and makes the resulting micro-adjusted garment fitting model better satisfy the requirement of human comfort;
Step 4-4: a wind index is preset in the clothing model parameter database, and the virtual fitting space displacement index of each mass point on the micro-adjusted garment fitting model is obtained from the clothing texture parameters of the model and the adaptive fitting deformation ratio R. The virtual fitting space displacement index of any mass point x on the micro-adjusted garment fitting model C is denoted ζ(x) [its closed-form expression is rendered as an image in the source and is not reproduced here], with

f_wx = k_w · (v_w − v_x);

here ρ(x) is the density at mass point x on C and represents the corresponding clothing texture parameter; S_C is the overall surface area of C; g is the gravitational acceleration at the fitting person's geographical location; f_wx is the wind force on mass point x of C; θ is the angle between the wind direction and the direction of gravitational acceleration; k_w is the preset wind index constant; v_w is the wind speed; v_x is the speed of mass point x on C; ψ is the bonding index describing the adhesion of the cloth of C; α is the angle by which the central axis of C deviates from its original position during fitting; and R is the obtained adaptive fitting deformation ratio. Because the three-dimensional fitting system must faithfully reproduce a virtual scene matching the real dressing environment during fitting, ζ(x) is introduced; it folds the wind force on the micro-adjusted garment fitting model, the adhesion of the garment cloth and the gravity on each mass point into one calculation, so the virtual fitting of the virtual garment better matches the fitting person's real fitting environment;
Step 4-5: the clothing model parameter database compares the obtained virtual fitting space displacement index of each mass point with a preset virtual fitting space displacement threshold constant ζ_0 and a preset human body-clothing distance constant D_0 to generate the virtual garment fitting model, denoted C', for virtual fitting: when ζ_0 ≤ ζ(x) ≤ D_0, the mass point x on the micro-adjusted garment fitting model C is moved to the position x', giving the virtual garment fitting model; when ζ(x) < ζ_0 or ζ(x) > D_0, the mass point x is not moved, and the original micro-adjusted garment fitting model continues to serve as the virtual garment fitting model. Here x' is the actual position of mass point x on C during virtual fitting, with x' = ζ²(x)·g·R, where g is the gravitational acceleration at the fitting person's geographical location and R is the adaptive fitting deformation ratio; the virtual garment fitting model C' is formed by moving the mass points x of C in this way;
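The per-mass-point decision of steps 4-4 and 4-5 can be sketched as follows. ζ(x) itself is image-only in the source, so it is taken here as an input; the function names and numeric values are illustrative assumptions.

```python
def wind_force(k_w, v_w, v_x):
    """f_wx = k_w * (v_w - v_x): wind force on mass point x (step 4-4)."""
    return k_w * (v_w - v_x)

def updated_position(zeta_x, zeta0, d0, g, r):
    """Step 4-5: a mass point moves only when its virtual fitting space
    displacement index lies inside [zeta0, d0]; the moved position is
    x' = zeta(x)**2 * g * R as stated in the text.  Returns the new
    position value, or None when the point stays put."""
    if zeta0 <= zeta_x <= d0:
        return zeta_x ** 2 * g * r
    return None

moved = updated_position(zeta_x=0.05, zeta0=0.01, d0=0.1, g=9.8, r=1.05)
kept = updated_position(zeta_x=0.2, zeta0=0.01, d0=0.1, g=9.8, r=1.05)   # None
```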
step 5, the clothes model parameter database in the three-dimensional fitting system adaptively sleeves the virtual clothes fitting model on the corresponding human body actual dynamic model according to the human body actual dynamic model and the virtual clothes fitting model; the process of adaptively sleeving the virtual garment fitting model to the corresponding actual dynamic model of the human body by the garment model parameter database comprises the following steps 5-1 to 5-5:
Step 5-1: the clothing model parameter database slices the virtual garment fitting model into M independent cloth partitions; the r-th cloth partition is denoted C_r, with r ≤ M and M the total number of cloth partitions of the virtual garment fitting model;
Step 5-2: the clothing model parameter database likewise slices the human body actual dynamic model into M independent mannequin partitions, one per cloth partition, forming M cloth-mannequin partition pairs; the r-th mannequin partition is denoted B_r and corresponds one-to-one with cloth partition C_r, r ≤ M. For example, cloth partition C_1 of the virtual garment fitting model corresponds to mannequin partition B_1, and cloth partition C_2 to mannequin partition B_2. Compared with current virtual fitting schemes, this partitioning makes it easier to match each cloth partition of the virtual fitting model onto the human body actual dynamic model;
Step 5-3: for the M cloth-mannequin partition pairs, the clothing model parameter database calculates in turn the vertical distance between each cloth partition and its corresponding mannequin partition; the vertical distance between the r-th cloth partition C_r and the r-th mannequin partition B_r is denoted H_r, r ≤ M, and is the distance between the cloth partition section and the corresponding mannequin partition section;
Step 5-4: from the vertical distances between each cloth partition and its corresponding mannequin partition, the clothing model parameter database obtains the optimal fitting distance between the cloth partitions and the mannequin partitions, and uses it as the gap value reserved when the virtual garment fitting model is fitted onto the human body actual dynamic model. The optimal fitting distance is denoted h_opt [its closed-form expression in terms of H_r, D_0, ζ_0 and R is rendered as an image in the source and is not reproduced here], where H_r is the vertical distance between C_r and B_r (r ≤ M), D_0 is the preset human body-clothing distance constant, ζ_0 is the preset virtual fitting space displacement threshold constant, and R is the adaptive fitting deformation ratio. In the virtual fitting process, obtaining the optimal fitting distance supplies the best gap when matching the human body actual dynamic model with the virtual garment fitting model, thereby satisfying the comfort requirement when the garment is worn;
step 5-5, the clothing model parameter database sleeves the virtual clothing fitting model on the corresponding human body actual dynamic model according to the clearance value constant reserved by the virtual clothing fitting model obtained in the step 5-4 and the human body actual dynamic model;
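Steps 5-1 to 5-4 can be sketched as follows. The patent's closed form for h_opt is rendered as an image in the source, so the clamped mean below is an illustrative stand-in only; the partition heights and constants are assumed values.

```python
# Sketch of steps 5-1 to 5-4: pair the M cloth partitions C_r with the
# M mannequin partitions B_r, measure each vertical distance H_r, and
# derive the single gap value reserved when dressing the body model.
cloth = [1.62, 1.30, 0.95, 0.40]   # illustrative section positions of C_r (m)
body  = [1.60, 1.27, 0.93, 0.38]   # matching section positions of B_r (m)

H = [abs(c - b) for c, b in zip(cloth, body)]   # vertical distances H_r

zeta0, D0 = 0.005, 0.06                         # preset constants
h_opt = max(zeta0, min(D0, sum(H) / len(H)))    # stand-in for h_opt
```

Clamping into [ζ_0, D_0] mirrors the patent's use of those two constants as lower and upper bounds on how close the cloth may sit to the body.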
Step 6: the touch display screen of the three-dimensional fitting system sets, according to the clothing parameters of the virtual garment fitting model, the maximum deformation region within which the model may deform when displayed, together with the deformation rate of the model while deforming, and displays the virtual garment fitting model inside that limited maximum deformation region. Setting the deformation rate matches the display speed of the whole virtual garment fitting model on the touch display screen so that its display does not lag. The maximum deformation region is obtained through the following steps 6-1 to 6-3:
Step 6-1: the touch display screen of the three-dimensional fitting system constructs a new three-dimensional coordinate system according to the shoulder width and waist circumference parameters among the clothing parameters of the virtual garment fitting model;
Step 6-2: from the shoulder width and waist circumference parameters and the constructed three-dimensional coordinate system, the touch display screen obtains the shoulder-left point A = (x_shoulder-L, y_shoulder-L, z_shoulder-L), the shoulder-right point A' = (x_shoulder-R, y_shoulder-R, z_shoulder-R), the waist-left point B = (x_waist-L, y_waist-L, z_waist-L) and the waist-right point B' = (x_waist-R, y_waist-R, z_waist-R); from these shoulder and waist coordinates the contour of the virtual garment fitting model matching the fitting person's body shape can be drawn preliminarily;
Step 6-3: the touch display screen calculates the limited maximum deformation region from the shoulder-left, shoulder-right, waist-left and waist-right coordinates; calculating this region helps the three-dimensional fitting system control the display boundary of the virtual garment fitting model on the touch display screen, so as to use the screen interface to the fullest when displaying the model. The limited maximum deformation region is denoted Region_max, and any point (x, y, z) on the virtual garment fitting model must lie within it. [The defining inequalities of Region_max are rendered as images in the source and are not reproduced here.] Because every point (x, y, z) of the virtual garment fitting model is placed inside the limited maximum deformation region, all mass points of the model are guaranteed to be constructed within the limited region;
Step 7: the fitting person inputs, through the central processing unit, the name of the fitting background against which the garment is to be shown, and the fitting background generator produces the corresponding fitting background image; generating the background helps the fitting person judge the effect of the garment against that background in the virtual fitting environment. The touch display screen then performs depth preprocessing on the generated fitting background image so that the background selected by the fitting person is displayed in high definition. The depth preprocessing of the generated fitting background image comprises the following:
expanding a plurality of pixels outwards along the horizontal direction at the edge of the fitting background image so as to enable the depth values on the surface of the fitting background image to be consistent, thereby avoiding causing geometric distortion of the fitting background image and reflecting more vivid fitting effect of a fitting person under the condition of the fitting background image; wherein, the expansion model of the edge of the fitting background image expanding outwards along the horizontal direction is as follows:
Figure GDA0002589048240000161
d(x, y) denotes the gray level of the pixel at coordinate (x, y) on the fitting background image before the edge expansion, d'(x, y) the gray level after the expansion, and dir(x, y) the direction in which coordinate (x, y) is expanded: 1 means expansion to the right along the horizontal direction, -1 expansion to the left, and 0 no horizontal expansion; k is the number of pixels by which coordinate (x, y) is expanded outward;
[equation image: definition of dir(x, y) via the threshold constant th]
th is a preset threshold constant for preprocessing the fitting background image selected by the fitting person on the touch display screen.
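The expansion model itself is preserved only as an equation image, so the following is a hedged sketch of the described preprocessing: edge pixels whose horizontal depth step exceeds the threshold th have their gray value copied k pixels outward (the comparison rule and copy direction are our assumptions; the patent's exact dir definition may differ):

```python
def expand_edges(img, k, th):
    """Copy a pixel's gray value up to k pixels to the right wherever the
    horizontal depth step exceeds th (hedged reconstruction: the patent's
    exact expansion equation survives only as an image)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            # dir = +1 would mean expand right, -1 left, 0 none; here we
            # illustrate only the rightward case for a step at (x, x+1).
            if x + 1 < w and abs(img[y][x] - img[y][x + 1]) > th:
                for step in range(1, k + 1):
                    if x + step < w:
                        out[y][x + step] = img[y][x]
    return out
```

Smoothing the depth step this way keeps the background surface's depth values consistent, which is the stated goal of the preprocessing.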
And 8, once the fitting person is satisfied with how the virtual garment fitting model sits on the actual dynamic model of the human body under the selected fitting background image, the fitting person inputs a fitting-satisfaction instruction to the central processing unit; the central processing unit then retrieves the garment parameters corresponding to the current instruction from the garment model parameter database and feeds them back to the fitting person, so that the fitting person can obtain the corresponding garment from the fed-back garment parameters.
To meet the demand for fitting accessories, the three-dimensional fitting system in this embodiment further comprises a wearing-accessory generator, and step 7 further comprises: the fitting person inputs the required accessories through the central processing unit, the wearing-accessory generator generates the accessories needed for the fitting, and the generated accessories are matched to the actual dynamic model of the human body and/or the virtual clothes fitting model as the fitting person requires; the fitting person may also select a fitting hairstyle to personal taste on the touch display screen of the three-dimensional fitting system.
To further meet the actual needs of fitting persons, the method also provides, after step 8, that the central processing unit of the three-dimensional fitting system establishes a satisfaction-degree data list for each garment and records the number of satisfaction instructions fed back for it, forming a garment satisfaction trend table for reference by the garment seller; the central processing unit also commands the touch display screen to record the movements of the actual dynamic model of the human body corresponding to the fitting person, forming virtual-fitting video data for the fitting person to watch.
Step 9: after the fitting person inputs a fitting-satisfaction instruction to the central processing unit of the three-dimensional fitting system, the touch display screen displays a prompt asking whether the fitting person wants to obtain the fitting-effect model and the corresponding actual dynamic models of the human body; once the fitting person selects the instruction to obtain the fitting-effect model, the touch display screen prompts the fitting person to pay.
Step 10: the three-dimensional fitting system pre-stores an encrypted image in the bank system for key extraction; after the fitting person enters a bank card number and the corresponding payment password on the touch display screen, the system continuously acquires a preset number of face images of the fitting person as the original images to be encrypted; these face images, carrying unique biometric features, serve as the preliminary security measure; the received payment information, comprising the bank card number and payment password, undergoes image-based hybrid encryption to produce an encrypted image, and the touch display screen securely delivers the payment information embedded in that image to the corresponding bank system for decryption.
Step 11: the bank system extracts the key from the embedded secret image pre-stored by the three-dimensional fitting system and recovers the payment information embedded in the fitting person's face image; when the bank system finds the extracted payment information consistent with the payment information the fitting person pre-stored at the bank, it completes the deduction from the fitting person's bank account and the transfer to the three-dimensional fitting system's account. The touch display screen of the three-dimensional fitting system may additionally provide a hair-dressing button for adjusting the hair color and hairstyle of the actual dynamic model of the human body.
In addition, after step 11, once the three-dimensional fitting system confirms receipt of the fitting person's payment, the 3D printer prints, on command of the central processing unit, the actual dynamic model of the human body selected by the fitting person, producing a 3D model of the fitting person wearing the corresponding fitting clothes; the system also provides the fitting person with his or her actual somatosensory characteristic parameters, so that they can be input to the three-dimensional fitting system again at the next fitting. Furthermore, the fitting person can move in front of the touch display screen: after the screen recognizes the movement, the central processing unit makes the actual dynamic model of the human body corresponding to the fitting person in the human body model parameter database perform the same movement, realizing synchronized interaction between the fitting person and the model on the screen; the touch display screen provides a speed-adjustment control with which the fitting person sets the movement speed of the actual dynamic model of the human body, and a rotation-angle control with which the fitting person turns the model.

Claims (3)

1. A three-dimensional fitting method based on somatosensory characteristic parameter extraction is used for a three-dimensional fitting system formed by a mobile Kinect camera, an RFID scanner, at least one piece of clothes with an RFID label, a clothes model parameter database, a human body model parameter database, a central processing unit, a touch display screen and a fitting background generator, and is characterized by comprising the following steps 1-8:
step 1, scanning RFID labels on various clothes through an RFID scanner to obtain clothes parameters corresponding to the clothes, storing the obtained clothes parameters into a clothes model parameter database, and then generating an original clothes model corresponding to the obtained clothes parameters through the clothes model parameter database; the clothing parameters at least comprise clothing texture, clothing neckline size, clothing width, clothing length and clothing sleeve length, and the clothing texture represents the density of clothing cloth; the process of generating the original clothing model by the clothing model parameter database comprises the following steps 1-1 to 1-3:
step 1-1, setting the two original mass points on the original garment model to be generated as i and j respectively, and obtaining the distance l_ij between the two original mass points i and j:
l_ij = √[(x_i − x_j)² + (y_i − y_j)² + (z_i − z_j)²]
where the coordinates of original particle i are (x_i, y_i, z_i) and the coordinates of original particle j are (x_j, y_j, z_j);
Step 1-2, according to the clothing texture corresponding to clothing and the compensation index constant corresponding to the clothing texture, presetting the space displacement index of each particle coordinate when the original clothing model is modeled by a clothing model parameter database; the spatial displacement index is labeled Δ, where:
the spatial displacement index of each particle coordinate on the original clothing model is
[equation image: Δ expressed in terms of the garment texture and the compensation index constant λ]
λ is the compensation index constant corresponding to the garment texture;
step 1-3, obtaining new coordinates of the original clothing model after the two corresponding mass points are subjected to space displacement change according to the space displacement index of each mass point coordinate when the original clothing model is modeled, so as to obtain the original clothing model of the obtained clothing parameters; wherein:
the particles obtained from the original particles i and j after the spatial displacement change are marked i' and j' respectively; the coordinates of particle i' are (x_i', y_i', z_i'), the coordinates of particle j' are (x_j', y_j', z_j'), and the distance between particles i' and j' is l'_ij; wherein:
l'_ij = √[(x_i' − x_j')² + (y_i' − y_j')² + (z_i' − z_j')²]
x_i' = x_i + Δ, y_j' = y_j + Δ, z_j' = z_j + Δ; Δ is the corresponding spatial displacement index in step 1-2;
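Steps 1-1 to 1-3 can be sketched as follows; the Euclidean distance matches the l_ij definition, while applying Δ uniformly to every coordinate is our reading of the garbled displacement equations (the exact Δ formula survives only as an image):

```python
import math

def distance(p, q):
    """Euclidean distance l_ij between two particles i and j."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def displace(p, delta):
    """Shift a particle by the spatial displacement index delta on every
    coordinate (assumption: the patent applies delta per coordinate)."""
    return tuple(c + delta for c in p)
```

The new inter-particle distance l'_ij is then just `distance(displace(i, delta), displace(j, delta))`.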
step 2, pre-storing a human body initial model in the three-dimensional fitting system, and generating a human body original dynamic model aiming at different somatosensory characteristic parameters by the three-dimensional fitting system according to the somatosensory characteristic parameters of the externally input human body; the somatosensory characteristic parameters at least comprise height parameters, shoulder width parameters, chest parameters, waist parameters, hip parameters, arm length parameters, leg thickness parameters, foot length parameters, foot width parameters and neck thickness parameters; the generation process of the human body original dynamic model at least comprises the following steps 2-1 to 2-5:
step 2-1, extracting the original length parameter data from the pre-stored somatosensory characteristic parameters of the human body initial model by the three-dimensional fitting system; the human body initial model pre-stored in the three-dimensional fitting system is marked C_0, and the original length parameter data at least comprise a height parameter, a shoulder width parameter, a chest circumference parameter, a waist circumference parameter, a hip circumference parameter, an arm length parameter, a leg thickness parameter, a foot length parameter, a foot width parameter and a neck thickness parameter;
step 2-2, initializing each original length parameter datum in the somatosensory characteristic parameters of the human body initial model to obtain the corresponding initialized original length parameter data values in the human body initial model; the set of initialized original length parameter data in the human body initial model is S_0, comprising a height parameter, a shoulder width parameter, a chest circumference parameter, a waist circumference parameter, a hip circumference parameter, an arm length parameter, a leg thickness parameter, a foot length parameter, a foot width parameter and a neck thickness parameter; n = 1, 2, …, 11; the initialized original length parameter data value in the human body initial model is set as L_xn, where x_n denotes the n-th initialized original length parameter in the human body initial model;
step 2-3, extracting the length data of the human body from the received externally input somatosensory characteristic parameters, and screening out the length parameter data corresponding to step 2-1; the screened length parameter data are marked L'_yn, where y_n denotes the human-body length parameter name corresponding to x_n;
step 2-4, obtaining, from the original length parameter data value L_xn of step 2-2 and the externally input corresponding length parameter data value L'_yn, the matching correction error parameter value for each length parameter when constructing the original dynamic model of the human body; the matching correction error parameter corresponding to the n-th length parameter in the original dynamic model of the human body is marked ω_n:
ω_n = L'_yn − L_xn; n = 1, 2, …, 11
Step 2-5, generating the original dynamic model of the human body for the different somatosensory characteristic parameters from the obtained matching correction error parameter values ω_n and the initialized original length parameter data values L_xn of the human body initial model; each length parameter in the constructed original dynamic model of the human body is marked L_yn:
L_yn = L_xn + ω_n; n = 1, 2, …, 11;
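Steps 2-2 to 2-5 reduce to a per-parameter correction; a minimal sketch, assuming the correction ω_n is simply the difference between the measured and initialized lengths (consistent with L_yn = L_xn + ω_n):

```python
def correction_errors(L_init, L_measured):
    """omega_n = L'_yn - L_xn for each of the 11 length parameters
    (assumption: the patent's omega_n formula survives only as an image)."""
    return [lm - li for li, lm in zip(L_init, L_measured)]

def build_model_lengths(L_init, omegas):
    """L_yn = L_xn + omega_n, as stated in step 2-5."""
    return [li + w for li, w in zip(L_init, omegas)]
```

Under this reading the corrected model lengths exactly reproduce the externally measured lengths, which is what fitting the initial model to the scanned person requires.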
Step 3, the fitting background generator generates fitting original background databases aiming at different fitting environments in advance according to the command of the central processing unit; after a fitting person carries a garment to be fitted into a three-dimensional fitting area, scanning an RFID label on the fitting garment by an RFID scanner to obtain various garment parameters corresponding to the fitting garment, and sending the garment parameters corresponding to the fitting garment to a central processing unit by the RFID scanner; the method comprises the steps that a mobile Kinect camera collects actual somatosensory characteristic parameters of a fitting person and a real face image of the fitting person in a mode of moving around the fitting person for at least one circle, and sends the obtained actual somatosensory characteristic parameters of the fitting person and the real face image to a human body model parameter database;
step 4, generating a human body actual dynamic model which is matched with the fitting person and has the fitting person real facial image according to the stored fitting person actual body feeling characteristic parameters and the real facial image by the human body model parameter database;
the clothing model parameter database calls a corresponding original clothing model according to clothing parameters corresponding to the clothing to be tried on;
the clothes model parameter database makes micro-adjustment on the original clothes model according to the fitting command processed by the central processing unit to obtain a virtual clothes fitting model;
wherein, the process of the clothing model parameter database for fine adjustment of the clothing fitting model to obtain the virtual clothing fitting model comprises the following steps 4-1 to 4-5:
step 4-1, the central processing unit sets a preset deformation control parameter P_clothes of the original clothing model, and sets, according to the somatosensory characteristic parameter P_user of the fitting person, the maximum deformation index S_max and minimum deformation index S_min of the original clothing model in the horizontal direction; the preset deformation control parameters of the original garment model correspond one-to-one to the garment collar size, garment width, garment length and garment sleeve length of step 1, and the preset deformation control parameter P_clothes corresponds one-to-one to the somatosensory characteristic parameter P_user of the fitting person;
step 4-2, the central processing unit calculates the adaptive fitting deformation ratio R of the clothing parameters of the original clothing model during fitting from the preset deformation control parameter P_clothes, the somatosensory characteristic parameter P_user of the fitting person, and the maximum and minimum horizontal deformation indexes S_max and S_min of the original clothing model, and sends the obtained adaptive fitting deformation ratio R to the clothing model parameter database for processing:
when P_user ≥ P_clothes, the adaptive fitting deformation ratio R of the clothing parameters is
[equation image: R for the case P_user ≥ P_clothes, in terms of P_user, P_clothes, S_max and S_min]
when P_user < P_clothes, the adaptive fitting deformation ratio R of the clothing parameters is
[equation image: R for the case P_user < P_clothes, in terms of P_user, P_clothes, S_max and S_min]
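Both branch formulas for R survive only as equation images; the sketch below therefore substitutes an illustrative rule, clamping the user/garment parameter ratio into [S_min, S_max]. This is an assumption for illustration, not the patent's exact expression:

```python
def fit_ratio(p_user, p_clothes, s_min, s_max):
    """Hedged stand-in for the adaptive fitting deformation ratio R:
    the ratio of the fitter's parameter to the garment's parameter,
    clamped into [s_min, s_max] (illustrative assumption only)."""
    r = p_user / p_clothes
    return max(s_min, min(s_max, r))
```

The clamp reflects the stated role of S_max and S_min: however large or small the fitter is relative to the garment, the horizontal deformation never leaves the allowed range.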
4-3, the garment model parameter database adjusts each original garment parameter corresponding to the original garment model in the same proportion according to the adaptive fitting deformation proportion R of the garment parameters so as to obtain a micro-adjustment garment fitting model which accords with the comfort degree of a human body;
step 4-4, presetting a wind power index in the clothing model parameter database, and obtaining the virtual fitting spatial displacement index of each mass point on the micro-adjusted garment fitting model from the garment texture parameters of that model and the adaptive fitting deformation ratio R of the clothing parameters; the virtual fitting spatial displacement index of any particle x on the micro-adjusted garment fitting model is written ζ(x):
[equation image: ζ(x) in terms of ρ(x), S_C, g, f_wx, θ, k_w, v_w, v_x, ψ, α and R]
C is the micro-adjusted garment fitting model; ρ(x) is the density of any particle x on C and represents the corresponding garment texture parameter; S_C is the overall surface area of C; g is the gravitational acceleration at the fitting person's geographical location; f_wx is the wind force on particle x of C; θ is the angle between the wind direction and the direction of gravitational acceleration; k_w is the preset wind power index constant; v_w is the wind speed; v_x is the speed of particle x on C; ψ is the bonding index of the adhesion degree of the garment cloth corresponding to C; α is the deviation angle of the central axis of C from its original position during fitting; and R is the obtained adaptive fitting deformation ratio of the clothing parameters;
step 4-5, the garment model parameter database decides, from the obtained virtual fitting spatial displacement index of each mass point on the micro-adjusted garment fitting model, a preset virtual fitting spatial displacement threshold constant and a preset human body-clothing distance constant, how to generate the virtual garment fitting model for virtual fitting; the preset virtual fitting spatial displacement threshold constant is marked ζ_0, the preset human body-clothing distance constant is marked D_0, and the virtual garment fitting model is marked C';
when ζ_0 ≤ ζ(x) ≤ D_0, particle x on the micro-adjusted garment fitting model C is moved to position x' to obtain the virtual garment fitting model for virtual fitting; when ζ(x) < ζ_0 or ζ(x) > D_0, particle x on C is not moved, and the original micro-adjusted garment fitting model continues to serve as the virtual garment fitting model; wherein:
x' is the actual position of particle x on the micro-adjusted garment fitting model C during virtual fitting, with x' = ζ²(x)·g·R; g is the gravitational acceleration at the fitting person's geographical location and R is the adaptive fitting deformation ratio of the clothing parameters; the virtual garment fitting model C' is formed by moving the particles x of the micro-adjusted garment fitting model C;
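The move-or-keep decision of step 4-5 can be sketched directly, since the displacement rule x' = ζ²(x)·g·R is stated in the text (treating the displacement as a scalar magnitude here for illustration):

```python
def move_particle(zeta_x, zeta0, d0, g, r):
    """Return the displacement applied to particle x in step 4-5:
    move only when zeta0 <= zeta(x) <= d0, using x' = zeta(x)^2 * g * R;
    otherwise the particle keeps its micro-adjusted position."""
    if zeta0 <= zeta_x <= d0:
        return zeta_x ** 2 * g * r
    return 0.0
```

Particles whose displacement index falls below the threshold ζ_0 or beyond the body-clothing limit D_0 are left untouched, so the virtual garment model C' only deviates from C where the deformation is physically plausible.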
step 5, the clothes model parameter database in the three-dimensional fitting system adaptively sleeves the virtual clothes fitting model on the corresponding human body actual dynamic model according to the human body actual dynamic model and the virtual clothes fitting model; the process of adaptively sleeving the virtual garment fitting model to the corresponding actual dynamic model of the human body by the garment model parameter database comprises the following steps 5-1 to 5-5:
step 5-1, the clothes model parameter database divides the virtual clothes fitting model into M independent fabric partitions; the r-th cloth partition is marked C_r, r ≤ M;
Step 5-2, the clothing model parameter database slices the actual dynamic model of the human body into M independent human-body model partitions in one-to-one correspondence with the cloth partitions, forming M cloth-body model partition pairs; the r-th body model partition is marked B_r, with B_r and C_r in one-to-one correspondence, r ≤ M;
step 5-3, within the M cloth-body model partition pairs, the clothing model parameter database sequentially calculates the vertical distance between each cloth partition and its corresponding body model partition; the vertical distance between the r-th cloth partition C_r and the r-th body model partition B_r is denoted H_r, r ≤ M;
Step 5-4, from the vertical distance between each cloth partition and its corresponding body model partition, the clothing model parameter database obtains the optimal fitting distance between the cloth partitions and the body model partitions, which is used as the gap value reserved when the virtual garment fitting model is sleeved onto the actual dynamic model of the human body; the optimal fitting distance between a cloth partition and the corresponding body model partition is marked h_opt:
[equation image: h_opt in terms of H_r, D_0, ζ_0 and R]
H_r denotes the vertical distance between the r-th cloth partition and the r-th body model partition, D_0 is the preset human body-clothing distance constant, ζ_0 is the preset virtual fitting spatial displacement threshold constant, and R is the adaptive fitting deformation ratio of the clothing parameters;
step 5-5, the clothing model parameter database sleeves the virtual garment fitting model onto the corresponding actual dynamic model of the human body, keeping the gap value obtained in step 5-4 between them;
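Steps 5-3 to 5-5 can be sketched as follows; H_r is taken as the mean vertical gap between paired partition point sets, and because the h_opt formula survives only as an image, the aggregation shown (smallest partition gap clamped into [ζ_0, D_0]) is an illustrative assumption:

```python
def vertical_distances(cloth_parts, body_parts):
    """H_r: mean vertical (y-axis) gap between each pair of partitions,
    each partition given as a list of (x, y, z) points."""
    return [
        sum(c[1] - b[1] for c, b in zip(cp, bp)) / len(cp)
        for cp, bp in zip(cloth_parts, body_parts)
    ]

def optimal_gap(h_list, zeta0, d0):
    """Hedged stand-in for h_opt: the patent's formula is an equation
    image, so we ASSUME the smallest partition gap, clamped into
    [zeta0, d0], as the reserved clearance."""
    return max(zeta0, min(min(h_list), d0))
```

Clamping keeps the reserved clearance within the same [ζ_0, D_0] band that governs particle movement in step 4-5, which is why those constants reappear in the h_opt description.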
step 6, setting a maximum deformation area deformed when the virtual clothes fitting model is displayed and a deformation rate of the virtual clothes fitting model when the virtual clothes fitting model is deformed according to clothes parameters corresponding to the virtual clothes fitting model by a touch display screen of the three-dimensional fitting system, and displaying the virtual clothes fitting model in the limited maximum deformation area by the display screen; the acquisition process of the maximum deformation area comprises the following steps of 6-1 to 6-3:
6-1, constructing a new three-dimensional stereo coordinate system by a touch display screen of the stereo fitting system according to shoulder width parameters and waist circumference parameters in the garment parameters corresponding to the virtual garment fitting model;
step 6-2, the touch display screen of the three-dimensional fitting system obtains, from the shoulder width and waist circumference parameters of the clothing parameters and the constructed three-dimensional coordinate system, the shoulder-width left point A coordinates (x_Shoulder-L, y_Shoulder-L, z_Shoulder-L), the shoulder-width right point A' coordinates (x_Shoulder-R, y_Shoulder-R, z_Shoulder-R), the waist left point B coordinates (x_Waist-L, y_Waist-L, z_Waist-L) and the waist right point B' coordinates (x_Waist-R, y_Waist-R, z_Waist-R);
6-3, the touch display screen of the three-dimensional fitting system calculates the limited maximum deformation area from the obtained shoulder-width left and right point coordinates and waist left and right point coordinates; the limited maximum deformation area is marked Region_max:
[equation image: definition of Region_max in terms of the coordinates of points A, A', B and B']
any point (x, y, z) on the virtual clothes fitting model satisfies
[equation image: constraint keeping (x, y, z) inside Region_max];
Step 7, the fitting person inputs the fitting background image name of the fitting to be worn through the central processing unit, and the fitting background generator generates a corresponding fitting background image; the touch display screen performs depth preprocessing on the generated fitting background image to display the fitting background image selected by the fitting person in high definition; the process of performing depth preprocessing on the generated fitting background image by the touch display screen comprises the following steps:
expanding a plurality of pixels outwards along the horizontal direction at the edge of the fitting background image so as to enable the depth values on the surface of the fitting background image to be consistent, and further avoiding causing geometric distortion of the fitting background image; wherein, the expansion model of the edge of the fitting background image expanding outwards along the horizontal direction is as follows:
[equation image: horizontal edge-expansion model relating d(x, y), d'(x, y), dir(x, y) and k]
d(x, y) denotes the gray level of the pixel at coordinate (x, y) on the fitting background image before the edge expansion, d'(x, y) the gray level after the expansion, and dir(x, y) the direction in which coordinate (x, y) is expanded: 1 means expansion to the right along the horizontal direction, -1 expansion to the left, and 0 no horizontal expansion; k is the number of pixels by which coordinate (x, y) is expanded outward;
[equation image: definition of dir(x, y) via the threshold constant th]
th is a preset threshold constant for preprocessing a fitting background image selected by a fitting person on the touch display screen;
and 8, once the fitting person is satisfied with how the virtual garment fitting model sits on the actual dynamic model of the human body under the selected fitting background image, the fitting person inputs a fitting-satisfaction instruction to the central processing unit; the central processing unit then retrieves the garment parameters corresponding to the current instruction from the garment model parameter database and feeds them back to the fitting person, so that the fitting person can obtain the corresponding garment from the fed-back garment parameters.
2. The stereoscopic fitting method based on somatosensory feature parameter extraction according to claim 1, further comprising, after step 8:
step 9, after the fitting person inputs a fitting-satisfaction instruction to the central processing unit of the three-dimensional fitting system, the touch display screen displays a prompt asking whether the fitting person wants to obtain the fitting-effect model and the corresponding actual dynamic models of the human body; after the fitting person selects the instruction to obtain the fitting-effect model, the touch display screen prompts the fitting person to pay;
step 10, the three-dimensional fitting system pre-stores an encrypted image in the bank system for key extraction; after the fitting person enters a bank card number and the corresponding payment password on the touch display screen, the system continuously acquires a preset number of face images of the fitting person as the original images to be encrypted; these face images, carrying unique biometric features, serve as the preliminary security measure; the received payment information, comprising the bank card number and payment password, undergoes image-based hybrid encryption to produce an encrypted image, and the touch display screen securely delivers the payment information embedded in that image to the corresponding bank system for decryption;
and 11, the bank system extracts the key from the embedded secret image pre-stored by the three-dimensional fitting system and recovers the payment information embedded in the fitting person's face image; when the bank system finds the extracted payment information consistent with the payment information the fitting person pre-stored at the bank, the bank system completes the deduction from the fitting person's bank account and the transfer to the three-dimensional fitting system's account.
3. The stereoscopic fitting method based on somatosensory feature parameter extraction according to claim 2, further comprising, after step 11: once the three-dimensional fitting system confirms receipt of the fitting person's payment, the 3D printer prints, on command of the central processing unit, the actual dynamic model of the human body selected by the fitting person, producing a 3D model of the fitting person wearing the corresponding fitting clothes, and the three-dimensional fitting system provides the fitting person with his or her actual somatosensory characteristic parameters, so that they can be input to the three-dimensional fitting system again at the next fitting.
CN201710090213.1A 2017-02-20 2017-02-20 Three-dimensional fitting method based on somatosensory characteristic parameter extraction Active CN106920146B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710090213.1A CN106920146B (en) 2017-02-20 2017-02-20 Three-dimensional fitting method based on somatosensory characteristic parameter extraction


Publications (2)

Publication Number Publication Date
CN106920146A CN106920146A (en) 2017-07-04
CN106920146B true CN106920146B (en) 2020-12-11

Family

ID=59453895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710090213.1A Active CN106920146B (en) 2017-02-20 2017-02-20 Three-dimensional fitting method based on somatosensory characteristic parameter extraction

Country Status (1)

Country Link
CN (1) CN106920146B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107392957B (en) * 2017-07-19 2021-06-08 杭州中赛实业有限公司 Children dress fitting method based on somatosensory technology and children dress thereof
CN107705279B (en) * 2017-09-22 2021-07-23 北京奇虎科技有限公司 Image data real-time processing method and device for realizing double exposure and computing equipment
CN108389213B (en) * 2018-01-24 2021-06-11 上海工程技术大学 Adhesion circumference measuring method based on tangent plane point cloud
CN108830783B (en) 2018-05-31 2021-07-02 北京市商汤科技开发有限公司 Image processing method and device and computer storage medium
CN109064260B (en) * 2018-07-11 2022-03-01 北京知足科技有限公司 Shoe type data acquisition method and device
CN111353844A (en) * 2018-12-24 2020-06-30 上海时元互联网科技有限公司 Comfort evaluation method and system for virtual fitting, storage medium and terminal
CN109934613A (en) * 2019-01-16 2019-06-25 中德(珠海)人工智能研究院有限公司 A kind of virtual costume system for trying
CN110675214A (en) * 2019-08-27 2020-01-10 杭州海飘科技有限公司 Virtual fitting somatosensory simulation method and system
CN114556402A (en) 2019-09-03 2022-05-27 程立苇 Data processing method and device, computer equipment and computer readable storage medium
CN112488779A (en) * 2019-09-12 2021-03-12 爱唯秀股份有限公司 Three-dimensional fitting method
CN114347065B (en) * 2022-02-09 2024-05-03 望墨科技(武汉)有限公司 Continuous fitting method of fitting robot
CN116503569B (en) * 2023-06-29 2023-09-22 深圳市镭神智能系统有限公司 Virtual fitting method and system, computer readable storage medium and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156746A (en) * 2014-08-13 2014-11-19 西华大学 Monopoly experience store management system based on RFID garment labels, and monopoly experience store management method based on RFID garment labels
US9261526B2 (en) * 2010-08-26 2016-02-16 Blast Motion Inc. Fitting system for sporting equipment
CN106407605A (en) * 2016-11-01 2017-02-15 南京大学 Particle computer dynamic simulation method for 3D garment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101582143A (en) * 2008-05-16 2009-11-18 杨政宪 Terminal try-on simulation system and method for generating try-on image
CN106339929A (en) * 2016-08-31 2017-01-18 潘剑锋 3D fitting system

Also Published As

Publication number Publication date
CN106920146A (en) 2017-07-04

Similar Documents

Publication Publication Date Title
CN106920146B (en) Three-dimensional fitting method based on somatosensory characteristic parameter extraction
US10777021B2 (en) Virtual representation creation of user for fit and style of apparel and accessories
CN106910115B (en) Virtual fitting method based on intelligent terminal
KR101671649B1 (en) Method and System for 3D manipulated image combined physical data and clothing data
CN104637084B (en) A kind of method and virtual fitting system for establishing garment virtual threedimensional model
EP3370208A2 (en) Virtual reality-based apparatus and method to generate a three dimensional (3d) human face model using image and depth data
CN104123753B (en) Three-dimensional virtual fitting method based on garment pictures
JP6874772B2 (en) Image generator, image generator, and program
US8976230B1 (en) User interface and methods to adapt images for approximating torso dimensions to simulate the appearance of various states of dress
US10311508B2 (en) Garment modeling simulation system and process
CN109035413B (en) Virtual fitting method and system for image deformation
US20020024517A1 (en) Apparatus and method for three-dimensional image production and presenting real objects in virtual three-dimensional space
EP4089615A1 (en) Method and apparatus for generating an artificial picture
CN101493930B (en) Loading exchanging method and transmission exchanging method
CN109801380A (en) A kind of method, apparatus of virtual fitting, storage medium and computer equipment
JP2004506276A (en) Three-dimensional face modeling system and modeling method
CN106097442A (en) A kind of intelligent simulation dressing system and application process thereof
CN106897916B (en) Personalized clothing remote customization method based on mobile terminal
CN106846124B (en) Intelligent processing method of personalized clothes based on industrial 4.0
KR102506352B1 (en) Digital twin avatar provision system based on 3D anthropometric data for e-commerce
KR101767144B1 (en) Apparatus, method and computer program for generating 3-dimensional model of clothes
JP6537419B2 (en) Template selection system, template selection method, template selection program and recording medium storing the program
Yamada et al. Image-based virtual fitting system with garment image reshaping
CN112802031A (en) Real-time virtual hair trial method based on three-dimensional human head tracking
JP2001249957A (en) Automatic preparation method for paper pattern for clothes and automatic preparation system therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant