CN111179417B - Virtual try-on and try-on system and electronic equipment - Google Patents
- Publication number
- CN111179417B CN201911409269.4A
- Authority
- CN
- China
- Prior art keywords
- try
- virtual
- module
- human body
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
Abstract
The invention relates to a virtual try-on and try-on system comprising a virtual try-on module, a virtual scene demonstration module and a collocation module. The collocation module matches an object to be tested with a three-dimensional model of the user's body; the virtual try-on module displays how the human body three-dimensional model and the object to be tested fit together; the virtual scene demonstration module provides a plurality of realistic virtual scenes showing the relationship between the dressed virtual human body model and the selected environment. The virtual try-on module, the collocation module and the virtual scene demonstration module communicate through a unified data format. The invention also relates to an electronic device comprising the virtual try-on and try-on system. The invention improves the accuracy of garment-size selection, saves time by providing try-on demonstration scenes, and reduces the user's economic burden.
Description
Technical Field
The present invention relates to the field of virtual try-on technologies, and in particular, to a virtual try-on and try-on system and an electronic device.
Background
With the development of the e-commerce industry, online shopping has become the main way for many people in China to purchase clothing, popular for its convenience and speed. However, because online shopping is virtual, customers cannot try garments on their own bodies or judge whether a garment's size is right and whether it suits their skin tone. As a result, customers often spend a great deal of time choosing, and garments that turn out to fit poorly after purchase go unworn, causing economic burden for the customer and wasting resources.
At present, existing virtual try-on technologies only provide garment selection and recommendation decisions for customers. They cannot place the customer in a realistic environment, cannot truly simulate whether a garment harmonizes with the real scene the customer intends to attend, and cannot give the customer an immersive, visually realistic impression.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a virtual try-on and try-on system, characterized in that: the system comprises a virtual try-on module, a virtual scene demonstration module and a collocation module;
the matching module is used for matching the to-be-tested object with the human body three-dimensional model;
the virtual try-on module is used for displaying the matching condition of the human body three-dimensional model and the object to be tested;
the virtual scene demonstration module is used for providing a plurality of real virtual scenes and is used for showing the relationship between the virtual human body model after try-on/try-on and the selected environment;
the virtual try-on module, the collocation module and the virtual scene demonstration module communicate through a unified data format.
Further, the system comprises a human body information acquisition module for acquiring the body parameters of the user;
the virtual try-on module comprises a three-dimensional human body model unit and a characteristic matching unit;
the three-dimensional human body model unit is used for constructing the three-dimensional human body model according to the body parameters;
the characteristic matching unit is used for screening out the to-be-tested objects with the parameters corresponding to the body parameters within a first preset deviation range.
Further,
the physical parameter includes at least one of: an overall parameter set, a head parameter set, an upper body parameter set, a lower body parameter set, and a foot parameter set;
the overall parameter set includes height and/or weight;
the head parameter set comprises a head circumference and/or a neck circumference;
the upper body parameter set at least comprises one of the neck circumference, chest circumference, waist circumference, arm length, back length and shoulder width;
the lower body parameter set comprises at least one of the waistline, the length of the inner leg, the circumference of the thigh and the circumference of the calf;
the set of foot parameters includes foot length and/or ankle circumference.
Further,
the way in which the three-dimensional mannequin unit builds the three-dimensional mannequin of the human body includes,
a first mode: according to the body parameters preset by the user, searching the nearest model from a database to be used as the human body three-dimensional model, and replacing the facial features of the model with the facial features of the user;
the second mode is as follows: carrying out integral scanning on a human body and fine scanning on the body parameters through a 3D scanning technology to construct a three-dimensional model of the human body;
the three-dimensional human body model unit preferentially establishes the human body three-dimensional model using the first mode;
the second mode is used when, in the first mode, no model within a second predetermined deviation range is found.
Further,
the feature matching unit is further used for selecting a matching mode, including a loose mode, a moderate mode and/or a close-fitting mode;
the first predetermined deviation ranges of the loose mode, the moderate mode and/or the close-fitting mode are different.
Further,
during screening, the feature matching unit compares the deviation value S between the object to be tested and the human body three-dimensional model with the first predetermined deviation range;
the deviation value S = [(a1 - b1)² + (a2 - b2)² + … + (an - bn)²] / n, wherein n represents the number of body-parameter items; a1, a2, …, an represent the body parameters; and b1, b2, …, bn represent the corresponding parameters of the object to be tested.
Further,
the real virtual scene includes a season portion, a real field portion, and/or a gesture portion;
the seasonal portion includes spring, summer, autumn and/or winter;
the real site portion includes a campus, a forest, a mall, and/or a wedding site;
the gestures include bending, bowing, stretching, jumping and/or normal walking.
Further,
the objects to be tested comprise purchased objects and non-purchased objects;
the collocation module comprises an existing database storing the purchased test objects and an online database storing the non-purchased test objects.
Further, the system comprises an analysis and prediction module for transferring related non-purchased objects into the existing database according to the object to be tested selected by the user and/or the selected real virtual scene.
The invention also provides electronic equipment, which comprises the virtual try-on and try-on system.
The invention has the following beneficial effects: the accuracy of garment-size selection is greatly improved; the fitting effect of garments in various realistic scenes can be simulated, giving the user a realistic visual experience; and the user can find suitable combinations among garments already owned, without purchasing new ones, reducing economic burden.
Drawings
FIG. 1 is a schematic diagram of a virtual try-on and try-on system according to an embodiment of the present invention;
FIG. 2 is a second schematic diagram of a virtual try-on and try-on system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a virtual try-on module according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a collocation module according to an embodiment of the present invention;
fig. 5 is a third schematic diagram of a virtual try-on and try-on system according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, interfaces, techniques, etc., in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
Fig. 1 is a schematic diagram of a virtual try-on and try-on system according to an embodiment of the present invention, including a virtual try-on module, a virtual scene demonstration module, and a collocation module;
the matching module is used for matching the to-be-tested object with the human body three-dimensional model of the user;
the virtual try-on module is used for displaying the matching condition of the human body three-dimensional model and the object to be tested;
the virtual scene demonstration module is used for providing a plurality of real virtual scenes and is used for showing the relationship between the virtual human body model after try-on/try-on and the selected environment;
the virtual try-on module, the collocation module and the virtual scene demonstration module communicate through a unified data format.
The beneficial effects of this embodiment are: the accuracy of garment-size selection is greatly improved; the fitting effect of garments in various realistic scenes can be simulated, giving the user a realistic visual experience; and the user can find suitable combinations among garments already owned, without purchasing new ones, reducing economic burden.
Fig. 2 is a schematic diagram of a virtual fitting and wearing system according to an embodiment of the present invention, and further includes a human body information acquisition module, configured to acquire a physical parameter of a user.
Fig. 3 is a schematic diagram of a virtual try-on module according to an embodiment of the present invention based on the above embodiment. The virtual try-on module comprises a three-dimensional human body model unit and a characteristic matching unit; the three-dimensional human body model unit is used for constructing the three-dimensional human body model according to the body parameters; and the characteristic matching unit is used for screening the to-be-tested objects with the corresponding parameters in a first preset deviation range according to the body parameters.
Fig. 4 is a schematic diagram of a collocation module according to the present invention based on the above embodiment, wherein the collocation module includes an existing database storing the purchased objects and an online database storing the non-purchased objects. The user can directly call models of already-purchased garments and match them against the target garment in simulation, so that newly selected garments combine sensibly with existing resources and the user saves money; meanwhile, when a matching garment still needs to be purchased, the online garment-selection database can be called, providing convenient service to the user.
Fig. 5 is a third schematic diagram of a virtual try-on and try-on system according to an embodiment of the present invention based on the above embodiment. This embodiment further provides an analysis and prediction module, configured to transfer related non-purchased objects into the existing database according to the object to be tested selected by the user and/or the selected real virtual scene.
In the above embodiment, the physical parameter includes at least one of: an overall parameter set, a head parameter set, an upper body parameter set, a lower body parameter set, and a foot parameter set; the overall parameter set includes height and/or weight; the head parameter set includes a head circumference and/or a neck circumference; the upper body parameter set at least comprises one of neck circumference, chest circumference, waistline, arm length, back length and shoulder width; the lower body parameter set at least comprises one of the waistline, the inner leg length, the thigh circumference and the calf circumference; the foot parameter set includes foot length and/or ankle circumference.
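As a rough illustration (not part of the patent text), the parameter-set grouping above can be represented as a simple lookup table; all key names and the validation rule are assumptions of this sketch:

```python
# Illustrative grouping of the body parameters described above.
# Assumption: a user's measurements arrive as a flat dict (cm / kg);
# the key names are invented for this sketch.

PARAMETER_SETS = {
    "overall":    ["height", "weight"],
    "head":       ["head_circumference", "neck_circumference"],
    "upper_body": ["neck_circumference", "chest", "waist",
                   "arm_length", "back_length", "shoulder_width"],
    "lower_body": ["waist", "inner_leg_length", "thigh", "calf"],
    "foot":       ["foot_length", "ankle_circumference"],
}

def complete_sets(params):
    """Return the names of the parameter sets fully covered by a profile;
    per the text, the body parameters include at least one complete set."""
    return [name for name, keys in PARAMETER_SETS.items()
            if all(k in params for k in keys)]

print(complete_sets({"height": 175, "weight": 70}))  # ['overall']
```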
In the above embodiment, the manner in which the three-dimensional mannequin unit establishes the three-dimensional mannequin includes:
a first mode: according to the body parameters preset by the user, searching the nearest model from a database to be used as the human body three-dimensional model, and replacing the facial features of the model with the facial features of the user;
or a second mode: carrying out integral scanning on a human body and fine scanning on the body parameters through a 3D scanning technology to construct a three-dimensional model of the human body;
the three-dimensional model of the human body can be constructed by using a 3D modeling technology, and the 3D scanning technology can use systems such as Cyberware, loughborough, hamamatsu, image Twin and the like.
The three-dimensional human body model unit preferentially uses the first mode to build a three-dimensional human body model; the second approach is used when no model within the second predetermined deviation range is found.
When determining whether a model is within the second deviation range, the formula D = [(c1 - a1)² + (c2 - a2)² + … + (cn - an)²] / n may be used, wherein D represents the deviation value; n represents the number of body-parameter items; a1, a2, …, an represent the user's body parameters (e.g., chest circumference a1, waist circumference a2, inner-leg length a3, height a4 and weight a5); and c1, c2, …, cn represent the model parameters corresponding to those body parameters (e.g., the model's chest circumference c1, waist circumference c2, inner-leg length c3, height c4 and weight c5). For example, during model screening, the five parameters chest circumference, waist circumference, inner-leg length, height and weight are selected and substituted into the above formula; if D ≤ 9 cm, the model can be used for virtual fitting.
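The model-screening rule above can be sketched in Python. This is not the patent's implementation: the in-memory candidate list, the parameter names, and the fallback behavior are assumptions; only the D formula and the 9 cm threshold come from the worked example in the text.

```python
# Sketch of the model-screening step (assumption: values in the patent's
# own units, mixing cm and kg as the example does).

def deviation(user, model):
    """D = [(c1-a1)² + (c2-a2)² + … + (cn-an)²] / n over shared keys."""
    keys = user.keys() & model.keys()
    return sum((model[k] - user[k]) ** 2 for k in keys) / len(keys)

def pick_model(user, candidates, max_d=9.0):
    """Return the closest stored model, or None when no model lies within
    the second predetermined deviation range (the system would then fall
    back to 3D scanning, i.e., the second mode)."""
    best = min(candidates, key=lambda m: deviation(user, m))
    return best if deviation(user, best) <= max_d else None

user = {"chest": 92, "waist": 78, "inner_leg": 76, "height": 175, "weight": 70}
models = [
    {"chest": 90, "waist": 80, "inner_leg": 75, "height": 174, "weight": 68},
    {"chest": 100, "waist": 90, "inner_leg": 80, "height": 185, "weight": 85},
]
print(pick_model(user, models))  # first model: D = 14/5 = 2.8 ≤ 9
```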
In the above embodiment, the feature matching unit is further configured to select a matching mode from among a loose mode, a moderate mode and a close-fitting mode; the first predetermined deviation ranges of the three modes are different, and the specific ranges may be set manually.
In the above embodiment, during screening the feature matching unit compares the deviation value S between the object to be tested and the human body three-dimensional model with the first predetermined deviation range;
the deviation value S = [(a1 - b1)² + (a2 - b2)² + … + (an - bn)²] / n, wherein n represents the number of body-parameter items; a1, a2, …, an represent the body parameters; and b1, b2, …, bn represent the corresponding parameters of the object to be tested. For example, when screening a jacket, five parameters (chest circumference, height, weight, back length and shoulder width) are substituted into the formula to obtain S: if S ≤ 5 cm, the jacket is classified as close-fitting mode; if 5 cm < S ≤ 10 cm, moderate mode; if 10 cm < S ≤ 15 cm, loose mode.
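The fit-mode screening above can likewise be sketched; the 5/10/15 cm thresholds follow the jacket example, while the parameter names and the idea that garment parameters are directly comparable to body parameters are assumptions of this sketch:

```python
# Sketch of classifying a garment into the three matching modes described
# above (thresholds from the jacket example; key names are illustrative).

def deviation(body, garment):
    """S = [(a1-b1)² + (a2-b2)² + … + (an-bn)²] / n over shared keys."""
    keys = body.keys() & garment.keys()
    return sum((body[k] - garment[k]) ** 2 for k in keys) / len(keys)

def fit_mode(body, garment):
    s = deviation(body, garment)
    if s <= 5:
        return "close-fitting"
    if s <= 10:
        return "moderate"
    if s <= 15:
        return "loose"
    return None  # outside every first predetermined deviation range

body = {"chest": 92, "height": 175, "weight": 70, "back": 44, "shoulder": 46}
jacket = {"chest": 94, "height": 176, "weight": 71, "back": 45, "shoulder": 47}
print(fit_mode(body, jacket))  # S = 8/5 = 1.6 → "close-fitting"
```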
In the above embodiments, the real virtual scene includes a season part, a real field part, and/or a gesture part; the seasonal portion includes spring, summer, autumn and/or winter; the real site portion includes a campus, forest, mall, and/or wedding site; gestures include bending, stretching, jumping and/or walking normally.
The embodiment of the invention further provides electronic equipment based on the embodiment, which comprises the virtual fitting and fitting system in the embodiment.
In the above embodiment, when the user uses the electronic device, the device first obtains the user's body parameters, which may be entered manually by the user, taken from data pre-stored in the system, and/or collected on the spot by the human body information acquisition module; the three-dimensional human body model unit then builds the model. When selecting clothing, the user confirms a matching mode (loose mode, moderate mode and/or close-fitting mode) and thereby the first predetermined deviation range. After the user selects an object to be tested, the system screens out candidates whose sizes fall within the first predetermined deviation range according to the object's parameters and matches them with the human body three-dimensional model; if the user is dissatisfied with the try-on effect, the objects are screened again for a new match. The human body three-dimensional model wearing the matched object is then placed in various virtual reality environments for the user to judge; if the user is still dissatisfied, the object can be re-selected, and according to the user's shopping history and preferences, related objects in the online database are transferred into the existing database for the user to choose from. Finally, the chosen object is displayed enlarged so that the user can examine it carefully and decide whether to purchase it.
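The end-to-end flow of the embodiment above can be summarized as a hypothetical pipeline; every function here is a trivial stand-in for a module described in the text, not a real API, and the 15 cm limit assumes the loose-mode bound from the jacket example:

```python
# Hypothetical end-to-end sketch: build the body model, screen garments
# against the first predetermined deviation range, return the candidates
# that would go on to scene demonstration and the final purchase decision.

def build_body_model(params):              # three-dimensional human body model unit
    return {"body": params}

def screen_items(catalog, params, limit):  # feature matching unit
    return [g for g in catalog
            if sum((params[k] - g[k]) ** 2 for k in g) / len(g) <= limit]

def try_on_session(params, catalog, limit=15.0):
    model = build_body_model(params)
    fits = screen_items(catalog, params, limit)
    # The dressed model would then be rendered in the selected virtual
    # scenes and, once approved, displayed enlarged for purchase.
    return model, fits

body = {"chest": 92, "height": 175}
catalog = [{"chest": 94, "height": 176}, {"chest": 110, "height": 190}]
_, fits = try_on_session(body, catalog)
print(len(fits))  # 1: only the first garment is within the range
```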
The reader will appreciate that in the description of this specification, a description of terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.
Claims (9)
1. A virtual try-on and try-on system, characterized in that: the system comprises a virtual try-on module, a virtual scene demonstration module and a collocation module;
the matching module is used for matching the to-be-tested object with the human body three-dimensional model of the user;
the virtual try-on module is used for displaying the matching condition of the human body three-dimensional model and the object to be tested;
the virtual scene demonstration module is used for providing a plurality of real virtual scenes and showing the relationship between the human body three-dimensional model after try-on/try-on and the selected environment;
the virtual try-on module, the collocation module and the virtual scene demonstration module communicate through data;
the system also comprises a human body information acquisition module, wherein the human body information acquisition module is used for acquiring physical parameters of a user;
the virtual fitting module comprises a three-dimensional human body model unit, wherein the three-dimensional human body model unit is used for constructing the three-dimensional human body model according to the body parameters;
the method for establishing the three-dimensional model of the human body by the three-dimensional model unit comprises the following steps:
a first mode: according to the body parameters preset by the user, searching the nearest model from a database to be used as the human body three-dimensional model, and replacing the facial features of the model with the facial features of the user;
or a second mode: carrying out integral scanning on a human body and fine scanning on the body parameters through a 3D scanning technology to construct a three-dimensional model of the human body;
the three-dimensional human body model unit preferentially establishes the human body three-dimensional model using the first mode;
the second mode is used when, in the first mode, no model within a second predetermined deviation range is found;
wherein, for each model, the model is within the second predetermined deviation range if its deviation value D is less than or equal to the second predetermined deviation range; the deviation value D = [(c1 - a1)² + (c2 - a2)² + … + (cn - an)²] / n, wherein n represents the number of body-parameter items; a1, a2, …, an represent the body parameters; and c1, c2, …, cn represent the model parameters corresponding to the body parameters.
2. The virtual try-on and try-on system of claim 1, wherein: the virtual try-on module further comprises a feature matching unit;
the characteristic matching unit is used for screening out the to-be-tested objects with the parameters corresponding to the body parameters within a first preset deviation range.
3. The virtual try-on and try-on system of claim 1, wherein:
the physical parameter includes at least one of: an overall parameter set, a head parameter set, an upper body parameter set, a lower body parameter set, and a foot parameter set;
the overall parameter set includes height and/or weight;
the head parameter set comprises a head circumference and/or a neck circumference;
the upper body parameter set comprises at least one of the neck circumference, chest circumference, waist circumference, arm length, back length and shoulder width;
the lower body parameter set comprises at least one of the waist circumference, inner-leg length, thigh circumference and calf circumference;
the set of foot parameters includes foot length and/or ankle circumference.
4. A virtual try-on and try-on system according to claim 2, wherein:
the feature matching unit is further used for selecting a matching mode, including a loose mode, a moderate mode and/or a close-fitting mode;
the first predetermined deviation ranges of the loose mode, the moderate mode and/or the close-fitting mode are different.
5. A virtual try-on and try-on system according to claim 2, wherein:
during screening, the feature matching unit compares the deviation value S between the object to be tested and the human body three-dimensional model with the first predetermined deviation range;
the deviation value S = [(a1 - b1)² + (a2 - b2)² + … + (an - bn)²] / n, wherein n represents the number of body-parameter items; a1, a2, …, an represent the body parameters; and b1, b2, …, bn represent the corresponding parameters of the object to be tested.
6. The virtual try-on and try-on system of claim 1, wherein:
the real virtual scene includes a season portion, a real field portion, and/or a gesture portion;
the seasonal portion includes spring, summer, autumn and/or winter;
the real site portion includes a campus, a forest, a mall, and/or a wedding site;
the gestures include bending, bowing, stretching, jumping and/or normal walking.
7. The virtual try-on and try-on system of claim 1, wherein:
the objects to be tested comprise purchased objects and non-purchased objects;
the collocation module comprises an existing database storing the purchased test objects and an online database storing the non-purchased test objects.
8. The virtual try-on and try-on system of claim 7, wherein:
the analysis and prediction module is used for transferring the related non-purchased objects to the existing database according to the objects to be tested selected by the user and/or the selected real virtual scene.
9. An electronic device, characterized in that: comprising a virtual try-on and try-on system according to any of the preceding claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911409269.4A CN111179417B (en) | 2019-12-31 | 2019-12-31 | Virtual try-on and try-on system and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911409269.4A CN111179417B (en) | 2019-12-31 | 2019-12-31 | Virtual try-on and try-on system and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111179417A CN111179417A (en) | 2020-05-19 |
CN111179417B true CN111179417B (en) | 2023-11-24 |
Family
ID=70652371
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911409269.4A Active CN111179417B (en) | 2019-12-31 | 2019-12-31 | Virtual try-on and try-on system and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111179417B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112734515A (en) * | 2020-12-31 | 2021-04-30 | 江苏迅高智能科技有限公司 | Integrated intelligent assistant for overseas life and shopping |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104091269A (en) * | 2014-06-30 | 2014-10-08 | 京东方科技集团股份有限公司 | Virtual fitting method and virtual fitting system |
CN104318446A (en) * | 2014-10-17 | 2015-01-28 | 上海和鹰机电科技股份有限公司 | Virtual fitting method and system |
CN106504060A (en) * | 2016-10-22 | 2017-03-15 | 肇庆市联高电子商务有限公司 | Human body system for trying based on ecommerce |
CN106846113A (en) * | 2017-01-24 | 2017-06-13 | 上海衣佳网络科技有限公司 | A kind of virtual costume customization and system for trying |
CN107240007A (en) * | 2017-07-21 | 2017-10-10 | 陕西科技大学 | A kind of AR three-dimensional virtual fitting systems combined with 3D manikins |
CN109978643A (en) * | 2017-12-28 | 2019-07-05 | 南京工程学院 | A kind of method of virtually trying |
JP2019112732A (en) * | 2017-12-21 | 2019-07-11 | アビームコンサルティング株式会社 | Digital fitting support device, digital apparel support device and digital apparel support method |
KR20190108976A (en) * | 2018-03-16 | 2019-09-25 | (주)어반유니온 | Method of providing shopping information using 3D virtual fitting, and shopping system using the same |
CN110381270A (en) * | 2019-07-04 | 2019-10-25 | 浙江敦奴联合实业股份有限公司 | A kind of ready-made clothes custom-built system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9189886B2 (en) * | 2008-08-15 | 2015-11-17 | Brown University | Method and apparatus for estimating body shape |
CN102682211A (en) * | 2012-05-09 | 2012-09-19 | 晨星软件研发(深圳)有限公司 | Three-dimensional fitting method and device |
-
2019
- 2019-12-31 CN CN201911409269.4A patent/CN111179417B/en active Active
Non-Patent Citations (3)
Title |
---|
Implementation of the E-Fashion Virtual Online Fitting Room; Jiang Jiajin; Qian Feng; Journal of Hangzhou Dianzi University (04); full text *
oholow. "Virtualized Online Fitting System". 道客巴巴. 2014, pp. 1-54. *
Design and Implementation of a Kinect-Based Virtual Fitting System; Liu Jie et al.; Information Technology (No. 7); full text *
Also Published As
Publication number | Publication date |
---|---|
CN111179417A (en) | 2020-05-19 |
Similar Documents
Publication | Title |
---|---|
US6546309B1 (en) | Virtual fitting room |
US7242999B2 (en) | Method and apparatus for identifying virtual body profiles |
US20040078285A1 (en) | Production of made to order clothing |
ES2272346T3 (en) | System and method for visualizing personal appearance |
CN108734557A (en) | Methods, devices and systems for generating dress ornament recommendation information |
WO2020203656A1 (en) | Information processing device, information processing method, and program |
CN105989617A (en) | Virtual try-on apparatus and virtual try-on method |
Tsoli et al. | Model-based anthropometry: Predicting measurements from 3D human scans in multiple poses |
JP6242768B2 (en) | Virtual try-on device, virtual try-on method, and program |
CN105527946A (en) | Rapid garment system and method based on industrial Internet |
JP6320237B2 (en) | Virtual try-on device, virtual try-on method, and program |
JP6338966B2 (en) | Virtual try-on device, virtual try-on system, virtual try-on method, and program |
CN105069837B (en) | A kind of clothes trying analogy method and device |
JP2008504593A (en) | Method for acquiring and managing human morphological data over a computer network and device for performing said method |
CN111311373B (en) | Personalized clothing customizing method and device based on consumer social network |
CN104598012B (en) | A kind of interactive advertising equipment and its method of work |
Vitali et al. | Acquisition of customer's tailor measurements for 3D clothing design using virtual reality devices |
Brubacher et al. | Evaluation of the accuracy and practicability of predicting compression garment pressure using virtual fit technology |
CN109063755A (en) | Clothes recognition method and device |
JP2018106736A (en) | Virtual try-on apparatus, virtual try-on method and program |
CN111179417B (en) | Virtual try-on and try-on system and electronic equipment |
CN107851113A (en) | Frameworks, devices and methods configured to enable automated classification and/or searching of media data based on user performance attributes derived from performance sensor units |
WO2019033963A1 (en) | Transaction method and system based on physical store |
Sayem | Objective analysis of the drape behaviour of virtual shirt, part 1: avatar morphing and virtual stitching |
CN115908701A (en) | Virtual fitting method and system based on style3d |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||