CN109816492B - Method, terminal and medium for realizing virtual fitting room - Google Patents

Method, terminal and medium for realizing virtual fitting room

Publication number: CN109816492B (granted publication of CN109816492A)
Application number: CN201910084555.1A
Authority: CN (China)
Inventor: 许广明
Current assignee / original assignee: Individual
Legal status: Active
Classification (Landscapes): Processing Or Creating Images (AREA)

Abstract

The invention provides a method, a terminal and a medium for realizing a virtual fitting room. The method collects a user three-dimensional model and stores it in a user database; acquires a commodity three-dimensional model and stores it in a commodity database; when a selection instruction of a user is received, obtains the commodity number in the selection instruction and reads the commodity three-dimensional model corresponding to the commodity number from the commodity database; reads the user three-dimensional model of the user from the user database; compares the commodity three-dimensional model with the user three-dimensional model and generates an imaging model according to the comparison result; and displays the imaging model. The method can superpose the selected commodity three-dimensional model on the user three-dimensional model, so that the user can intuitively see the wearing effect of the commodity. This improves the user's interest in shopping at online stores, reduces the frequency of returns caused by ill-fitting goods bought online, and removes the repeated physical try-ons required when shopping in the traditional way.

Description

Method, terminal and medium for realizing virtual fitting room
Technical Field
The invention belongs to the technical field of the Internet, and particularly relates to a method, a terminal and a medium for realizing a virtual fitting room.
Background
In the conventional process of purchasing clothes, a customer has to take off the clothes currently worn in order to try on new garments properly, and this consumes a great deal of the customer's time. Because of time and garment-stock limitations, a customer sometimes cannot find a suitable garment even after several hours. Fitting also occupies a large share of the clothing store's time and garment resources, which costs the store profit. During golden-week holidays in particular, when customers are numerous, clothing stores are often crowded and work efficiency is low. At the same time, because of limited floor space, a clothing store cannot present every kind of garment for a customer to consider, which restricts the store's development.
Disclosure of Invention
In view of the defects in the prior art, the invention provides a method, a terminal and a medium for realizing a virtual fitting room, so as to overcome the low purchasing efficiency and cumbersome operation caused by users having to try on new clothes in the traditional clothes-purchasing process.
In a first aspect, a method for implementing a virtual fitting room includes the following steps:
collecting a user three-dimensional model and storing the user three-dimensional model in a user database;
acquiring a three-dimensional model of a commodity and storing the three-dimensional model in a commodity database;
when a selection instruction of a user is received, acquiring a commodity number in the selection instruction, and reading a commodity three-dimensional model corresponding to the commodity number from the commodity database;
reading a user three-dimensional model of the user in a user database;
comparing the commodity three-dimensional model with the user three-dimensional model, and generating an imaging model according to the comparison result;
and displaying the imaging model.
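For illustration only (this is not part of the patent text), the following Python sketch wires the six steps of the first aspect together. The dictionary databases and every function name in it are assumptions introduced for this example, and the models are simplified to dictionaries of named measurements.

```python
# Minimal sketch of the virtual fitting room flow of the first aspect.
# All names (user_db, commodity_db, helper functions) are illustrative assumptions.

user_db = {}       # user id -> user three-dimensional model (here: dict of measurements)
commodity_db = {}  # commodity number -> commodity three-dimensional model

def store_user_model(user_id, user_model):
    """Step 1: collect a user three-dimensional model and store it in the user database."""
    user_db[user_id] = user_model

def store_commodity_model(commodity_number, commodity_model):
    """Step 2: acquire a commodity three-dimensional model and store it in the commodity database."""
    commodity_db[commodity_number] = commodity_model

def compare(commodity_model, user_model):
    """Step 5, first half: size difference = commodity parameter value - user parameter value."""
    return {k: commodity_model[k] - user_model[k]
            for k in commodity_model if k in user_model}

def handle_selection(user_id, selection_instruction):
    """Steps 3 to 6: look up both models, compare them, generate and display the imaging model."""
    commodity_number = selection_instruction["commodity_number"]   # step 3
    commodity_model = commodity_db[commodity_number]
    user_model = user_db[user_id]                                  # step 4
    differences = compare(commodity_model, user_model)             # step 5
    imaging_model = {"user": user_model, "garment": commodity_model,
                     "size_differences": differences}              # superposition placeholder
    print(imaging_model)                                           # step 6: display
    return imaging_model

# Usage example (measurements in centimetres)
store_user_model("u1", {"chest": 92, "shoulder_width": 44})
store_commodity_model("1001", {"chest": 96, "shoulder_width": 45})
handle_selection("u1", {"commodity_number": "1001"})
```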
Preferably, the acquiring the three-dimensional model of the user specifically includes:
collecting user model parameters input by a user;
generating the user three-dimensional model according to the user model parameters and a preset user model template;
the user model parameters include one or more of the following parameters: chest circumference, shoulder width, collar circumference, sleeve length, waist circumference, pant length, body length, head circumference, foot length, foot width, hip circumference, and skin color.
Preferably, the three-dimensional model of the commodity comprises a three-dimensional model of clothes, a three-dimensional model of shoes and a three-dimensional model of hats;
the acquiring of the three-dimensional model of the commodity specifically comprises the following steps:
collecting commodity model parameters input by a merchant;
generating the commodity three-dimensional model according to the commodity model parameters and a preset commodity model template;
the commodity model parameters comprise one or more of the following parameters: shoulder width, chest circumference, clothing length, waist circumference, hip circumference, sleeve length, front crotch, back rail, thigh circumference, arm circumference, cuff, sleeve circumference, sling length, color, material, number, head circumference, foot length, and foot width.
Preferably, when the three-dimensional model of the commodity is a three-dimensional model of a garment, comparing the three-dimensional model of the garment with the three-dimensional model of the user, and generating an imaging model according to the comparison result specifically includes:
respectively comparing the clothes three-dimensional models with different code numbers with the user three-dimensional model to obtain comparison results;
and acquiring the code number of the clothes three-dimensional model with the minimum difference of the comparison results, and generating the imaging model according to the clothes three-dimensional model with the code number and the user three-dimensional model.
Preferably, the generating the imaging model according to the clothes three-dimensional model of the code number and the user three-dimensional model specifically includes:
sequentially calculating the size difference value of the same position in the clothes three-dimensional model and the user three-dimensional model, wherein the calculation method of the size difference value is as follows:
size difference = commodity model parameter value - user model parameter value;
adjusting the corresponding three-dimensional model of the clothes according to the size difference,
superposing the adjusted clothes three-dimensional model to the user three-dimensional model to generate the imaging model;
the adjusting the corresponding three-dimensional model of the garment according to the size difference specifically includes:
if the size difference value is a negative number and the material of the clothes three-dimensional model is elastic, uniformly extending the size of the position in the clothes three-dimensional model to be equal to the corresponding user model parameter value;
and if the size difference is a positive number, adjusting the wrinkle degree of the three-dimensional model of the clothes according to the size difference and the position.
Preferably, if the size difference is a positive number, the adjusting the wrinkle degree of the three-dimensional model of the garment according to the size difference and the position specifically comprises:
if the size difference value of the shoulder width is positive, the shoulder width value in the clothes three-dimensional model is adjusted to the shoulder width value of the user three-dimensional model, and the sleeve length value in the clothes three-dimensional model is adjusted according to the cuff, the sleeve circumference and the size difference value;
and if the size difference of the garment length is positive and the size difference of the waistline is 0, increasing the wrinkle degree of the waistline in the three-dimensional model of the garment to enable the garment length to be equal to the corresponding body length.
Preferably, the acquiring the three-dimensional model of the user specifically includes:
receiving images, videos or data input by a user, and establishing the three-dimensional model of the user according to the images, videos or data.
Preferably, before the acquiring the three-dimensional model of the user and storing the three-dimensional model in the user database, the method further comprises:
receiving registration information input by a user to complete registration;
the registration information includes a username and a password.
In a second aspect, a terminal comprises a processor, an input device, an output device, and a memory, which are connected to each other, wherein the memory is used for storing a computer program, which includes program instructions, and the processor is configured to call the program instructions to execute the method of the first aspect.
In a third aspect, a computer-readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of the first aspect.
According to the technical scheme, the method, the terminal and the medium for realizing the virtual fitting room provided by the invention acquire a user three-dimensional model corresponding to the user's figure and a commodity three-dimensional model corresponding to the commodity's shape. When trying on, the user selects the commodity three-dimensional model of the commodity to be purchased, and the method superposes the selected commodity three-dimensional model on the user three-dimensional model, so that the user can intuitively see the wearing effect of the commodity. This improves the user's interest in online shopping, reduces the frequency of returns caused by ill-fitting goods, and removes the repeated try-ons required when shopping in the traditional way.
Drawings
In order to more clearly illustrate the detailed description of the invention or the technical solutions in the prior art, the drawings used in the detailed description or the prior art description will be briefly described below. Throughout the drawings, like elements or portions are generally identified by like reference numerals. In the drawings, elements or portions are not necessarily drawn to scale.
Fig. 1 is a flowchart of a method for implementing a virtual fitting room according to an embodiment.
Fig. 2 is a flowchart of a method for generating an imaging model according to the second embodiment.
Fig. 3 is a flowchart of a method for generating an imaging model from the comparison result according to the second embodiment.
Fig. 4 is a flowchart of a method for adjusting a three-dimensional model of a garment according to a size difference according to the second embodiment.
Fig. 5 is a flowchart of a method for adjusting the wrinkle degree of the three-dimensional model of the garment according to the second embodiment when the size difference is a positive number.
Fig. 6 is a block diagram of a terminal according to a third embodiment.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings. The following examples are only for illustrating the technical solutions of the present invention more clearly, and therefore are only examples, and the protection scope of the present invention is not limited thereby. It is to be noted that, unless otherwise specified, technical or scientific terms used herein shall have the ordinary meaning as understood by those skilled in the art to which the invention pertains.
The first embodiment:
a method for implementing a virtual fitting room, see fig. 1, includes the following steps:
S1: collecting a user three-dimensional model and storing the user three-dimensional model in a user database;
Specifically, the acquiring of the user three-dimensional model includes:
receiving images, videos or data input by a user, and establishing the user three-dimensional model according to the images, videos or data.
Thus, the user can provide images or videos containing the user's figure, and the method performs image processing or video processing on them to obtain the user's body shape data; the user can also input the body shape data directly. The method then builds the user three-dimensional model from the body shape data. Each login account acquires only one user three-dimensional model. Because the user's figure changes over time, the user can modify the established user three-dimensional model by inputting new images, videos or data so that a new user three-dimensional model is built.
S2: acquiring a three-dimensional model of a commodity and storing the three-dimensional model in a commodity database;
Specifically, the commodity three-dimensional model includes three-dimensional models of commodities such as clothes, trousers, shoes and hats. Similarly, the commodity three-dimensional model can be established by receiving images, videos or data input by the merchant and building the model from them. Each commodity corresponds to one commodity three-dimensional model.
S3: when a selection instruction of a user is received, acquiring a commodity number in the selection instruction, and reading a commodity three-dimensional model corresponding to the commodity number from the commodity database;
Specifically, when the user wants to try on a product, the user provides the commodity number corresponding to that product. The selection instruction can be sent by the user clicking a certain commodity on the terminal's touch screen, or obtained from a commodity number entered by the user. Each commodity three-dimensional model corresponds to a commodity number, and the corresponding commodity three-dimensional model can be retrieved through the commodity number.
S4: reading a user three-dimensional model of the user in a user database;
Specifically, when the user starts fitting, the user three-dimensional model of the user and the commodity three-dimensional model selected by the user need to be read.
S5: comparing the commodity three-dimensional model with the user three-dimensional model, and generating an imaging model according to the comparison result;
S6: displaying the imaging model.
Specifically, by comparing the commodity three-dimensional model with the user three-dimensional model, the method determines whether the commodity matches the user's figure. The imaging model simulates the effect of the commodity worn on the user.
The method collects a user three-dimensional model corresponding to the user's figure and a commodity three-dimensional model corresponding to the commodity's shape. When trying on, the user selects the commodity three-dimensional model of the commodity to be purchased, and the method superposes the selected commodity three-dimensional model on the user three-dimensional model, so that the user can intuitively see the wearing effect of the commodity. This improves the user's interest in online shopping, reduces the frequency of returns caused by ill-fitting goods, and removes the repeated try-ons required when shopping in the traditional way.
Preferably, before the acquiring the three-dimensional model of the user and storing the three-dimensional model in the user database, the method further comprises:
receiving registration information input by a user and finishing registration;
the registration information includes a username and password.
Specifically, the method provides a user registration function, and a user logs in through a user name and a password to enter a fitting function.
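A small, hedged illustration of this registration and login step follows; storing a salted password hash is standard practice and an assumption here, not something the patent specifies.

```python
# Illustrative sketch of user registration and login before entering the fitting function.
# The salted-hash storage layout is an assumption made for this example.

import hashlib
import os

accounts = {}  # username -> {"salt": bytes, "hash": bytes}

def register(username, password):
    """Receive the registration information (username and password) and complete registration."""
    salt = os.urandom(16)
    accounts[username] = {"salt": salt,
                          "hash": hashlib.sha256(salt + password.encode()).digest()}

def login(username, password):
    """Check the username and password; a successful login opens the fitting function."""
    record = accounts.get(username)
    if record is None:
        return False
    return hashlib.sha256(record["salt"] + password.encode()).digest() == record["hash"]

register("alice", "secret")
assert login("alice", "secret")
```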
The second embodiment:
In the second embodiment, on the basis of the first embodiment, the following contents are added:
The acquiring of the user three-dimensional model specifically comprises the following steps:
collecting user model parameters input by a user;
generating the user three-dimensional model according to the user model parameters and a preset user model template;
the user model parameters include one or more of the following parameters: chest circumference, shoulder width, collar circumference, sleeve length, waist circumference, pant length, body length, head circumference, foot length, foot width, hip circumference, and skin color.
Specifically, the user model parameters, i.e., the user's body shape data, may be obtained by analysing an image or video, or may be input directly. The body shape data are measured in the traditional way. For example: the chest circumference is the circumference of the outer ring of the chest, expressed in inches or centimetres, and is divided into upper chest circumference and lower chest circumference. The shoulder width is the horizontal straight-line distance between the most outward-protruding points of the upper arms at the deltoid region. The collar circumference is traditionally measured by wrapping a tape once around the neck, passing just below the laryngeal prominence at the front, with about half an inch of ease. The sleeve length is the distance from the shoulder seam to the cuff. The waist circumference is the horizontal circumference of the waist through the umbilical point and is a comprehensive index reflecting total fat amount and fat distribution. The trouser length is measured from the waist down to the ankle or below.
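Purely as an illustration of generating the user three-dimensional model from the user model parameters and a preset user model template, the sketch below overlays the supplied measurements on assumed template defaults; every field name and default value here is hypothetical.

```python
# Illustrative sketch: fill a preset user model template with user model parameters.
# Field names and default values are assumptions for this example only.

USER_MODEL_TEMPLATE = {
    "chest": 90.0, "shoulder_width": 42.0, "collar": 38.0, "sleeve_length": 58.0,
    "waist": 76.0, "pant_length": 100.0, "body_length": 66.0, "head": 56.0,
    "foot_length": 25.0, "foot_width": 9.5, "hip": 94.0, "skin_color": "neutral",
}

def generate_user_model(user_params):
    """Overlay the measurements the user actually supplied onto the preset template."""
    model = dict(USER_MODEL_TEMPLATE)           # start from the preset user model template
    for name, value in user_params.items():
        if name not in model:
            raise ValueError(f"unknown user model parameter: {name}")
        model[name] = value                     # replace the template value with the input value
    return model

# Usage example: the user supplies only some measurements (in centimetres)
user_model = generate_user_model({"chest": 92, "shoulder_width": 44, "waist": 80})
```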
Preferably, the three-dimensional model of the commodity comprises a three-dimensional model of clothes, a three-dimensional model of shoes and a three-dimensional model of hats;
the acquiring of the three-dimensional model of the commodity specifically comprises:
collecting commodity model parameters input by a merchant;
generating the three-dimensional commodity model according to the commodity model parameters and a preset commodity model template;
the commodity model parameters comprise one or more of the following parameters: shoulder width, chest circumference, clothing length, waist circumference, hip circumference, sleeve length, front crotch, back crotch, thigh circumference, arm circumference, cuff, sleeve circumference, sling length, color, material, code number, head circumference, foot length, and foot width.
Specifically, the merchant can input different commodity model parameters for different commodities. For example: for a garment, the shoulder width, chest circumference, clothes length, waist circumference, sleeve length, arm circumference, cuff, sleeve circumference, sling length, color, material, code number and the like can be input. For trousers, the hip circumference, front crotch, back crotch, thigh circumference, color, material and code number can be input. For a hat, the head circumference, color and material can be input. For shoes, the color, material, code number, foot length and foot width can be input.
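The sketch below encodes, for illustration only, the per-commodity-type parameter lists given in the paragraph above; the field names and the simple validation behaviour are assumptions, not the patent's interface.

```python
# Illustrative sketch: commodity model parameters collected per commodity type,
# following the examples listed above. Field names are assumptions.

COMMODITY_PARAMETERS = {
    "garment":  ["shoulder_width", "chest", "clothes_length", "waist", "sleeve_length",
                 "arm_circumference", "cuff", "sleeve_circumference", "sling_length",
                 "color", "material", "code_number"],
    "trousers": ["hip", "front_crotch", "back_crotch", "thigh_circumference",
                 "color", "material", "code_number"],
    "hat":      ["head", "color", "material"],
    "shoes":    ["color", "material", "code_number", "foot_length", "foot_width"],
}

def generate_commodity_model(commodity_type, merchant_params):
    """Keep only the parameters relevant to this commodity type and flag missing ones."""
    expected = COMMODITY_PARAMETERS[commodity_type]
    missing = [p for p in expected if p not in merchant_params]
    if missing:
        raise ValueError(f"missing parameters for {commodity_type}: {missing}")
    return {p: merchant_params[p] for p in expected}

# Usage example
hat_model = generate_commodity_model("hat", {"head": 58, "color": "black", "material": "wool"})
```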
In addition, this embodiment provides the following method of generating the imaging model.
Referring to fig. 2, when the three-dimensional model of the commodity is a three-dimensional model of a garment, comparing the three-dimensional model of the garment with the three-dimensional model of the user, and generating an imaging model according to the comparison result specifically includes:
S11: respectively comparing the clothes three-dimensional models with different code numbers with the user three-dimensional model to obtain comparison results;
S12: acquiring the code number of the clothes three-dimensional model with the minimum difference of the comparison results, and generating the imaging model according to the clothes three-dimensional model with the code number and the user three-dimensional model.
Specifically, even for the same item, the clothes three-dimensional models corresponding to different code numbers are different. The method compares the user three-dimensional model with the clothes three-dimensional models of the different code numbers of the same style of clothes and obtains the clothes three-dimensional model with the minimum difference in the comparison results; the code number corresponding to that clothes three-dimensional model is the code number most suitable for the user. The code numbers include XS, S, M, L, XL, XXL, XXXL, etc. In this way the user directly obtains the most suitable code number for that style of clothes together with the fitting effect of the corresponding code number.
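As a hedged sketch of this size-selection step, the code below compares the user three-dimensional model with the garment model of each code number and picks the code with the smallest overall difference. The sum-of-absolute-differences metric is an assumption, since the text only requires choosing the comparison result with the minimum difference.

```python
# Illustrative sketch: pick the code number (XS, S, M, ...) whose clothes
# three-dimensional model differs least from the user three-dimensional model.

def total_difference(garment_model, user_model):
    """Aggregate the size differences over all positions present in both models."""
    shared = [k for k in garment_model
              if k in user_model and isinstance(garment_model[k], (int, float))]
    return sum(abs(garment_model[k] - user_model[k]) for k in shared)

def best_code_number(garment_models_by_code, user_model):
    """garment_models_by_code: e.g. {"S": {...}, "M": {...}, "L": {...}}."""
    return min(garment_models_by_code,
               key=lambda code: total_difference(garment_models_by_code[code], user_model))

# Usage example
codes = {
    "S": {"chest": 88, "shoulder_width": 42},
    "M": {"chest": 94, "shoulder_width": 44},
    "L": {"chest": 100, "shoulder_width": 46},
}
print(best_code_number(codes, {"chest": 92, "shoulder_width": 44}))  # prints "M"
```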
Referring to fig. 3, the generating the imaging model according to the clothes three-dimensional model of the code number and the user three-dimensional model specifically includes:
S21: sequentially calculating the size difference value of the same position in the clothes three-dimensional model and the user three-dimensional model, wherein the calculation method of the size difference value is as follows:
size difference = commodity model parameter value - user model parameter value;
Specifically, the commodity model parameter value is the value corresponding to a commodity model parameter in the commodity three-dimensional model, and the user model parameter value is the value corresponding to a user model parameter in the user three-dimensional model. When comparing, the method compares the sizes at the same positions of the two three-dimensional models, for example: the size difference between the chest circumference in the user three-dimensional model and the chest circumference in the clothes three-dimensional model, and the size difference between the shoulder width in the user three-dimensional model and the shoulder width in the clothes three-dimensional model.
S22: adjusting the corresponding clothes three-dimensional model according to the size difference;
S23: superposing the adjusted clothes three-dimensional model on the user three-dimensional model to generate the imaging model;
Specifically, different size differences in the comparison results lead to different fitting effects for the user. The method adjusts the corresponding clothes three-dimensional model according to the size differences, and then superposes the adjusted clothes three-dimensional model directly onto the user three-dimensional model to obtain the imaging model.
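For illustration, the following sketch strings S21, S22 and S23 together under the assumption that the models are dictionaries of named measurements; the adjust_garment argument stands in for the adjustment rules S31 to S42 described below and is not defined by the patent in this form.

```python
# Illustrative sketch of S21-S23. Models are dictionaries of named positions.

def size_differences(garment_model, user_model):
    """S21: size difference = commodity model parameter value - user model parameter value."""
    return {pos: garment_model[pos] - user_model[pos]
            for pos in garment_model
            if pos in user_model and isinstance(garment_model[pos], (int, float))}

def generate_imaging_model(garment_model, user_model, adjust_garment):
    """S21-S23: compare, adjust, then superpose the adjusted garment on the user model."""
    diffs = size_differences(garment_model, user_model)   # S21
    adjusted = adjust_garment(garment_model, user_model)  # S22: rules S31-S42, see below
    return {"user": user_model, "garment": adjusted,      # S23: superposition, represented
            "size_differences": diffs}                    # here as a combined dictionary
```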
Referring to fig. 4, the adjusting the corresponding three-dimensional clothes model according to the size difference specifically includes:
S31: if the size difference value is a negative number and the material of the clothes three-dimensional model is elastic, uniformly extending the size of the position in the clothes three-dimensional model to be equal to the corresponding user model parameter value;
Specifically, a negative size difference means the garment is too small at that position. If the garment is made of elastic material, the user will stretch it when wearing it; the size at that position in the clothes three-dimensional model is therefore extended uniformly until it equals the corresponding user model parameter value, i.e. the imaging effect is that of the garment stretched uniformly so as to conform to the user's figure.
S32: and if the size difference is a positive number, adjusting the wrinkle degree of the three-dimensional model of the clothes according to the size difference and the position.
Specifically, a positive size difference means the garment is too large at that position. A garment that is too large will wrinkle, or hang down and cover other parts of the body, when worn. For example: if the waist circumference fits but the garment length above the waist is too long, the garment will fold above the waist when worn; if the whole garment is too long, it will cover the hips, thighs and so on. The wrinkle degree of the clothes three-dimensional model is adjusted according to the size of the difference and its position. The wrinkle degree may be one fold, two folds or more folds, and more folds indicate a longer garment.
Referring to fig. 5, if the size difference is a positive number, the adjusting the wrinkle degree of the three-dimensional model of the garment according to the size difference and the position specifically includes:
S41: if the size difference of the shoulder widths is positive, the shoulder width value in the clothes three-dimensional model is adjusted to the shoulder width value of the user three-dimensional model, and the sleeve length value in the clothes three-dimensional model is adjusted according to the cuff, the sleeve circumference and the size difference;
Specifically, if the shoulder width of the garment is too large, the shoulder portion of the garment sags when the user wears it and becomes part of the sleeve. The shoulder width value in the clothes three-dimensional model is therefore adjusted to the shoulder width value of the user three-dimensional model, and the sagging part of the shoulder is treated as part of the sleeve, i.e. the sleeve is lengthened; the added length is determined from the sagging part, the cuff and the sleeve circumference. In other words, the new sleeve length value in the clothes three-dimensional model equals the original sleeve length value plus the length that sags down. If, after this adjustment, the size of some position in the clothes three-dimensional model is smaller than that of the corresponding position in the user three-dimensional model, folds form above that position, and the wrinkle degree of that position in the clothes three-dimensional model is adjusted.
S42: and if the size difference of the garment length is positive and the size difference of the waistline is 0, increasing the wrinkle degree of the waistline in the three-dimensional model of the garment to enable the garment length to be equal to the corresponding body length.
If the size difference of the waistline is 0, the waist fits; if the garment length above the waistline is too long, the wrinkle degree at the waist in the clothes three-dimensional model is increased so that the overall garment length equals the corresponding body length in the user three-dimensional model.
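To make the adjustment rules S31, S32, S41 and S42 concrete, here is a hedged Python sketch assuming a garment model is a dictionary carrying its material, its measurements and a per-position wrinkle count. The wrinkle representation and the even split of the sagging shoulder excess between the two sleeves are assumptions; the patent derives the new sleeve length from the cuff, the sleeve circumference and the size difference.

```python
# Illustrative sketch of the adjustment rules S31, S32, S41 and S42.
# A garment model is assumed to contain the measurements referenced below,
# a "material" field and a "wrinkles" dict counting folds per position.

def adjust_garment(garment, user):
    g = dict(garment)
    g.setdefault("wrinkles", {})  # position -> number of folds

    # S31 / S32 for a generic position such as the chest circumference.
    chest_diff = g["chest"] - user["chest"]  # garment value minus user value
    if chest_diff < 0 and g.get("material") == "elastic":
        g["chest"] = user["chest"]                                   # S31: stretch elastic garment
    elif chest_diff > 0:
        g["wrinkles"]["chest"] = g["wrinkles"].get("chest", 0) + 1   # S32: fabric folds here

    # S41: shoulders too wide, so the excess sags down and lengthens the sleeve
    # (splitting the excess evenly between both sleeves is an assumption).
    shoulder_diff = g["shoulder_width"] - user["shoulder_width"]
    if shoulder_diff > 0:
        g["shoulder_width"] = user["shoulder_width"]
        g["sleeve_length"] = g["sleeve_length"] + shoulder_diff / 2

    # S42: waist difference is 0 but the garment is longer than the body length,
    # so increase the waist wrinkle degree until the lengths match.
    if g["waist"] == user["waist"] and g["clothes_length"] > user["body_length"]:
        g["wrinkles"]["waist"] = g["wrinkles"].get("waist", 0) + 1
        g["clothes_length"] = user["body_length"]

    return g

# Usage example (measurements in centimetres)
garment = {"material": "cotton", "chest": 96, "shoulder_width": 46,
           "sleeve_length": 58, "waist": 80, "clothes_length": 72}
user = {"chest": 92, "shoulder_width": 44, "waist": 80, "body_length": 66}
print(adjust_garment(garment, user))
```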
The method can adjust the imaging model according to the size and the material of the clothes, so that the fitting effect is closer to the actual wearing effect, and the use is convenient.
For brevity, for parts of the method provided by this embodiment that are not described here, reference may be made to the corresponding content in the foregoing method embodiment.
The third embodiment:
a terminal, see fig. 6, comprising a processor 801, an input device 802, an output device 803 and a memory 804, the processor 801, the input device 802, the output device 803 and the memory 804 being interconnected, wherein the memory 804 is adapted to store a computer program comprising program instructions, the processor 801 is configured to invoke the program instructions to perform the above-mentioned method.
In particular implementations, the terminals described in the embodiments of the invention include, but are not limited to, portable devices such as mobile phones, laptop computers or tablet computers having touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads). It should also be understood that, in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that, in the embodiment of the present invention, the processor 801 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 802 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, and the like, and the output device 803 may include a display (LCD, and the like), a speaker, and the like.
The memory 804 may include both read-only memory and random access memory, and provides instructions and data to the processor 801. A portion of the memory 804 may also include non-volatile random access memory. For example, the memory 804 may also store device type information.
For brevity, for parts of the terminal provided by this embodiment that are not described here, reference may be made to the corresponding content in the foregoing method embodiments.
The fourth embodiment:
a computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the above-mentioned method.
The computer-readable storage medium may be an internal storage unit of the terminal according to any of the foregoing embodiments, for example, a hard disk or a memory of the terminal. The computer-readable storage medium may also be an external storage device of the terminal, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card equipped on the terminal. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the terminal. The computer-readable storage medium is used to store the computer program and other programs and data required by the terminal, and may also be used to temporarily store data that has been output or is to be output.
For brevity, for the medium provided by this embodiment, reference may be made to the corresponding content in the foregoing method embodiments.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solution of the present invention and not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the present invention and shall be construed as falling within the scope of the claims of the present invention.

Claims (7)

1. A method for realizing a virtual fitting room is characterized by comprising the following steps:
collecting a user three-dimensional model and storing the user three-dimensional model in a user database;
acquiring a three-dimensional model of a commodity and storing the three-dimensional model in a commodity database;
when a selection instruction of a user is received, acquiring a commodity number in the selection instruction, and reading a commodity three-dimensional model corresponding to the commodity number from the commodity database;
reading a user three-dimensional model of the user in a user database;
comparing the commodity three-dimensional model with the user three-dimensional model, and generating an imaging model according to a comparison result;
displaying the imaging model;
the three-dimensional model of the commodity comprises a three-dimensional model of clothing;
when the three-dimensional model of the commodity is a three-dimensional model of clothes, comparing the three-dimensional model of the clothes with the three-dimensional model of the user, and generating an imaging model according to a comparison result specifically comprises:
respectively comparing the clothes three-dimensional models with different code numbers with the user three-dimensional model to obtain comparison results;
acquiring the code number of the clothes three-dimensional model with the minimum difference of the comparison results, and generating the imaging model according to the clothes three-dimensional model with the code number and the user three-dimensional model;
the generating of the imaging model according to the clothes three-dimensional model of the code number and the user three-dimensional model specifically comprises the following steps:
sequentially calculating the size difference value of the same position in the clothes three-dimensional model and the user three-dimensional model, wherein the calculation method of the size difference value is as follows:
the size difference = commodity model parameter value - user model parameter value, the commodity model parameter value is a value corresponding to a commodity model parameter in the commodity three-dimensional model, the user model parameter value is a value corresponding to a user model parameter in the user three-dimensional model, and the user model parameters include one or more of the following parameters: chest circumference, shoulder width, collar circumference, sleeve length, waist circumference, trouser length, body length, head circumference, foot length, foot width, hip circumference and skin color, wherein the commodity model parameters comprise one or more of the following parameters: shoulder width, chest circumference, clothes length, waist circumference, hip circumference, sleeve length, front crotch, back crotch, thigh circumference, arm circumference, cuff, sleeve circumference, sling length, color, material, code number, head circumference, foot length and foot width, and when comparing, the sizes at the same positions of the two three-dimensional models are compared;
adjusting the corresponding three-dimensional model of the clothes according to the size difference,
superposing the adjusted three-dimensional model of the clothes to the three-dimensional model of the user to generate the imaging model;
the adjusting of the corresponding three-dimensional clothes model according to the size difference specifically includes:
if the size difference is a negative number and the material of the clothes three-dimensional model is elastic, uniformly extending the size of the position in the clothes three-dimensional model to be equal to the corresponding user model parameter value;
if the size difference is a positive number, adjusting the wrinkle degree of the three-dimensional model of the clothes according to the size difference and the position;
if the size difference is a positive number, adjusting the wrinkle degree of the three-dimensional clothes model according to the size difference and the position specifically comprises:
if the size difference of the shoulder widths is positive, the shoulder width value in the three-dimensional clothes model is adjusted to the shoulder width value of the three-dimensional user model, and the sleeve length value in the three-dimensional clothes model is adjusted according to the cuff, the sleeve circumference and the size difference;
and if the size difference of the garment length is positive and the size difference of the waist is 0, increasing the wrinkle degree of the waist in the three-dimensional model of the garment to enable the garment length to be equal to the corresponding body length.
2. The method for implementing a virtual fitting room according to claim 1,
the acquiring of the three-dimensional model of the user specifically comprises:
collecting user model parameters input by a user;
generating the user three-dimensional model according to the user model parameters and a preset user model template;
the user model parameters include one or more of the following parameters: chest circumference, shoulder width, collar circumference, sleeve length, waist circumference, pant length, body length, head circumference, foot length, foot width, hip circumference, and skin color.
3. The method for implementing a virtual fitting room according to claim 1,
the commodity three-dimensional model comprises a shoe three-dimensional model and a hat three-dimensional model;
the acquiring of the three-dimensional model of the commodity specifically comprises:
collecting commodity model parameters input by a merchant;
generating the commodity three-dimensional model according to the commodity model parameters and a preset commodity model template;
the commodity model parameters comprise one or more of the following parameters: shoulder width, chest circumference, clothing length, waist circumference, hip circumference, sleeve length, front crotch, back crotch, thigh circumference, arm circumference, cuff, sleeve circumference, sling length, color, material, code number, head circumference, foot length, and foot width.
4. The method for implementing a virtual fitting room according to claim 1,
the acquiring of the three-dimensional model of the user specifically comprises:
receiving images, videos or data input by a user, and establishing the user three-dimensional model according to the images, videos or data.
5. The method for implementing a virtual fitting room according to claim 1,
before the acquiring the three-dimensional model of the user and storing the three-dimensional model in the user database, the method further comprises the following steps:
receiving registration information input by a user and finishing registration;
the registration information includes a username and password.
6. A terminal, comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-5.
7. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the method according to any one of claims 1-5.
CN201910084555.1A, filed 2019-01-29: Method, terminal and medium for realizing virtual fitting room (Active; granted as CN109816492B)

Priority Application (1)

CN201910084555.1A, priority date 2019-01-29, filing date 2019-01-29: Method, terminal and medium for realizing virtual fitting room

Publications (2)

CN109816492A (en), published 2019-05-28
CN109816492B (en), published 2023-01-03

Family ID: 66605660

Cited By (1)

CN112685649A (深圳创维-Rgb电子有限公司; priority 2021-01-25, published 2021-04-20; cited by examiner): Clothing recommendation method and device, storage medium and terminal equipment

Citations (4)

GB2561275A (Farfetch Uk Ltd; priority 2017-04-07, published 2018-10-10; cited by examiner): Tracking user interaction in a retail environment
CN108510594A (吉林省行氏动漫科技有限公司; priority 2018-02-27, published 2018-09-07; cited by examiner): Virtual fit method, device and terminal device
CN108648053A (南京衣谷互联网科技有限公司; priority 2018-05-10, published 2018-10-12; cited by examiner): A kind of imaging method for virtual fitting
CN109003168A (深圳Tcl数字技术有限公司; priority 2018-08-16, published 2018-12-14; cited by examiner): Virtual fit method, smart television and computer readable storage medium

Family Cites Families (1)

US20180032818A1 (International Business Machines Corporation; priority 2016-07-27, published 2018-02-01; cited by examiner): Providing a personalized fitting room experience



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant