CN112488779A - Three-dimensional fitting method - Google Patents
- Publication number: CN112488779A
- Application number: CN201910866160.7A
- Authority
- CN
- China
- Prior art keywords
- dimensional
- garment
- graph
- profile
- mannequin
- Prior art date: 2019-09-12
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
Abstract
The invention relates to a three-dimensional fitting method comprising the following steps: A. 3D-scanning a mannequin to form a first three-dimensional figure profile; B. dressing the mannequin in a garment to form a garment mannequin; C. 3D-scanning the garment mannequin to form a figure-and-garment profile; D. removing the portion of the figure-and-garment profile that belongs to the first three-dimensional figure profile, to form a first three-dimensional garment profile; E. 3D-scanning the actual figure of a user to form a second three-dimensional figure profile; F. stretching or cutting the first three-dimensional figure profile to match the second three-dimensional figure profile, the stretching or cutting proportion being defined as a 3D deformation; G. adjusting the first three-dimensional garment profile according to the 3D deformation to form a second three-dimensional garment profile. The invention is advantageous in that it can show, in a virtual manner, how the garment actually looks when worn on the user's own body.
Description
Technical Field
The invention relates to a three-dimensional fitting method, and in particular to a three-dimensional fitting method that applies a 3D deformation.
Background
With the popularization of electronic commerce, online shopping is favored for its convenience and its savings of time and money. It has become a habit for consumers and permeates daily life.
However, when purchasing clothes online, consumers cannot try the clothes on and cannot know how they will fit. Some apparel websites do provide a "virtual fitting room", but these features simply composite a two-dimensional image of the user's head onto the website's clothes to display a fitting effect; they cannot show how the clothes really look on the user's own body. As a result, users who try on clothes purchased online often find at home that the clothes do not fit, leading to a very high return rate.
It is therefore worthwhile for those skilled in the art to consider how to make the effect presented by virtual fitting approach the real appearance of the clothes on the user.
Disclosure of Invention
The invention aims to provide a three-dimensional fitting method that shows, in a virtual manner, how a garment actually looks when worn by a user.
The three-dimensional fitting method comprises the following steps:
A. 3D-scanning a mannequin to form a first three-dimensional figure profile;
B. dressing the mannequin in a garment to form a garment mannequin;
C. 3D-scanning the garment mannequin to form a figure-and-garment profile;
D. removing the portion of the figure-and-garment profile that belongs to the first three-dimensional figure profile, to form a first three-dimensional garment profile;
E. 3D-scanning the actual figure of a user to form a second three-dimensional figure profile;
F. stretching or cutting the first three-dimensional figure profile to match the second three-dimensional figure profile, the stretching or cutting proportion being defined as a 3D deformation; and
G. adjusting the first three-dimensional garment profile according to the 3D deformation to form a second three-dimensional garment profile.
The three-dimensional fitting method further comprises a step H: combining the second three-dimensional garment profile with the second three-dimensional figure profile to form a user figure-and-garment profile.
In the three-dimensional fitting method, in step A, the mannequin further includes a plurality of sensing marks, and the sensing marks are evenly distributed on the surface of the mannequin.
In the three-dimensional fitting method, in step A, the mannequin is scanned using a 3D scanner.
In the three-dimensional fitting method, in step C, the 3D scanner is used to scan the garment mannequin.
In the three-dimensional fitting method, in step E, the 3D scanner is used to scan the actual body shape of the user.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below. It should be noted that the components in the attached drawings are merely schematic and are not drawn to scale.
Drawings
Fig. 1 shows the three-dimensional fitting method of the present embodiment.
Fig. 2A is a schematic diagram of the 3D scanner 4 scanning the mannequin 5.
Fig. 2B is a schematic diagram of the first three-dimensional figure profile 51.
Fig. 3 is a schematic diagram of the garment mannequin 6.
Fig. 4A is a schematic diagram of the 3D scanner 4 scanning the garment mannequin 6.
Fig. 4B is a schematic diagram of the figure-and-garment profile 61.
Fig. 5 is a schematic diagram of the first three-dimensional garment profile 61A.
Fig. 6A is a schematic diagram of the 3D scanner 4 scanning the user 8.
Fig. 6B is a schematic diagram of the second three-dimensional figure profile 81.
Fig. 7 is a schematic diagram of the first three-dimensional figure profile 51 being stretched into the second three-dimensional figure profile 81.
Fig. 8A is a schematic diagram of the adjustment of the first three-dimensional garment profile 61A.
Fig. 8B is a schematic diagram of the second three-dimensional garment profile 61B.
Fig. 9 is a schematic diagram of the user figure-and-garment profile 91.
Detailed Description
Referring to fig. 1, the three-dimensional fitting method of the present embodiment includes the following steps:
First, referring to step S1 and figs. 2A and 2B (fig. 2A is a schematic diagram of the 3D scanner 4 scanning the mannequin 5, and fig. 2B is a schematic diagram of the first three-dimensional figure profile 51), the mannequin 5 is scanned with the 3D scanner 4 to form a first three-dimensional figure profile 51. The mannequin 5 further carries a plurality of sensing marks 52, evenly distributed over its surface. The marks facilitate scanning by the 3D scanner 4, and the resulting first three-dimensional figure profile 51 is closer to true scale.
Then, referring to step S2 and fig. 3 (fig. 3 is a schematic diagram of the garment mannequin 6), a garment 7 is put on the mannequin 5 to form a garment mannequin 6. Here, garment 7 is the garment to be tried on.
Next, referring to step S3 and figs. 4A and 4B (fig. 4A is a schematic diagram of the 3D scanner 4 scanning the garment mannequin 6, and fig. 4B is a schematic diagram of the figure-and-garment profile 61), the 3D scanner 4 is used to scan the garment mannequin 6 to form a figure-and-garment profile 61.
Thereafter, referring to step S4 and fig. 5 (fig. 5 is a schematic diagram of the first three-dimensional garment profile 61A), the portion of the figure-and-garment profile 61 that belongs to the first three-dimensional figure profile 51 is removed to form a first three-dimensional garment profile 61A. The first three-dimensional garment profile 61A thus corresponds to the three-dimensional profile of garment 7 alone.
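The removal in step S4 can be understood as a point-set difference between the clothed scan and the bare body scan. The sketch below is an illustrative assumption, not the patent's implementation: the function name `subtract_body`, the point-cloud representation, and the tolerance value are all hypothetical.

```python
import numpy as np

def subtract_body(clothed_points, body_points, tol=0.005):
    """Step S4 sketch: keep only the points of the clothed scan that lie
    farther than `tol` (in scan units) from every point of the bare body
    scan, leaving an approximation of the garment alone."""
    # Pairwise distances between each clothed point and each body point.
    d = np.linalg.norm(
        clothed_points[:, None, :] - body_points[None, :, :], axis=-1
    )
    # A point belongs to the garment if its nearest body point is far away.
    return clothed_points[d.min(axis=1) > tol]
```

A real system would operate on meshes rather than raw point clouds, but the same nearest-neighbour criterion applies.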
Then, referring to step S5 and figs. 6A and 6B (fig. 6A is a schematic diagram of the 3D scanner 4 scanning the user 8, and fig. 6B is a schematic diagram of the second three-dimensional figure profile 81), the 3D scanner 4 is used to scan the actual figure of the user 8 to form a second three-dimensional figure profile 81. Specifically, the second three-dimensional figure profile 81 reflects the actual figure of the user 8; in this embodiment its volume is larger than that of the first three-dimensional figure profile 51, and this case is taken as the example below.
Thereafter, referring to step S6 and fig. 7 (fig. 7 is a schematic diagram of the first three-dimensional figure profile 51 being stretched into the second three-dimensional figure profile 81), the first three-dimensional figure profile 51 is stretched or cut to match the second three-dimensional figure profile 81, and the stretching or cutting proportion is defined as a 3D deformation. In detail, when the volume of the first three-dimensional figure profile 51 is smaller than that of the second three-dimensional figure profile 81, the first three-dimensional figure profile 51 must be stretched; conversely, when its volume is larger, the first three-dimensional figure profile 51 must be cut down until it approximates the second three-dimensional figure profile 81. Either way, this process yields the 3D deformation.
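One minimal way to express the "3D deformation" of step S6 numerically is as per-axis scale factors between the two body scans. This is an assumed simplification: the patent does not define the deformation's representation, and a practical system would likely use a denser, region-wise deformation field rather than a single bounding-box ratio.

```python
import numpy as np

def deformation_scale(first_figure, second_figure):
    """Step S6 sketch: express the 3D deformation as per-axis scale factors
    computed from the bounding-box extents of the two figure profiles.
    A factor > 1 means the mannequin scan must be stretched along that
    axis; a factor < 1 means it must be cut down."""
    ext_first = first_figure.max(axis=0) - first_figure.min(axis=0)
    ext_second = second_figure.max(axis=0) - second_figure.min(axis=0)
    return ext_second / ext_first
```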
Thereafter, referring to step S7 and figs. 8A and 8B (fig. 8A is a schematic diagram of the adjustment of the first three-dimensional garment profile 61A, and fig. 8B is a schematic diagram of the second three-dimensional garment profile 61B), the first three-dimensional garment profile 61A is adjusted according to the 3D deformation to form a second three-dimensional garment profile 61B. In the present embodiment, since the volume of the first three-dimensional figure profile 51 is smaller than that of the second three-dimensional figure profile 81, the 3D deformation stretches the first three-dimensional garment profile 61A into the second three-dimensional garment profile 61B.
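Continuing the per-axis-scale assumption from above, step S7 can be sketched as applying the same scale factors to the garment's points about a chosen center. The function name, the choice of center, and the scale representation are illustrative assumptions, not details specified by the patent.

```python
import numpy as np

def apply_deformation(garment_points, scale, center):
    """Step S7 sketch: scale the garment profile about `center` by the
    per-axis factors `scale`, so the garment follows the same deformation
    that maps the mannequin's figure onto the user's figure."""
    garment_points = np.asarray(garment_points, dtype=float)
    return (garment_points - center) * scale + center
```

Using the body scan's centroid as `center` keeps the scaled garment positioned around the body rather than drifting toward the origin.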
Then, referring to step S8 and fig. 9 (fig. 9 is a schematic diagram of the user figure-and-garment profile 91), the second three-dimensional garment profile 61B is combined with the second three-dimensional figure profile 81 to form a user figure-and-garment profile 91. Specifically, the user figure-and-garment profile 91 is a three-dimensional view that simulates the user trying on garment 7. Compared with the traditional approach of splicing two-dimensional images to display a fitting effect, the three-dimensional fitting method of this embodiment can therefore show how the garment actually looks when worn by the user, so that after trying on clothes purchased online the user does not find a large gap between reality and the virtual fitting result, which reduces the subsequent return rate.
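The combination in step S8 can be sketched as merging the two point sets into one labelled set for display. This is a minimal illustrative assumption; the function name and the label scheme are hypothetical, and a rendering pipeline would normally merge meshes with materials rather than bare points.

```python
import numpy as np

def combine_profiles(garment_points, body_points):
    """Step S8 sketch: merge the deformed garment profile with the user's
    figure profile into one labelled point set for display
    (label 0 = body, label 1 = garment)."""
    points = np.vstack([body_points, garment_points])
    labels = np.concatenate([
        np.zeros(len(body_points), dtype=int),   # body points first
        np.ones(len(garment_points), dtype=int), # then garment points
    ])
    return points, labels
```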
In summary, the three-dimensional fitting method of the present embodiment can show, in a virtual manner, how a garment actually looks when worn by the user.
The above-described embodiments are merely exemplary for convenience of description; those skilled in the art may make various modifications without departing from the scope of the invention as claimed.
Claims (6)
1. A three-dimensional fitting method is characterized by comprising the following steps:
A. 3D-scanning a mannequin to form a first three-dimensional figure profile;
B. dressing the mannequin in a garment to form a garment mannequin;
C. 3D-scanning the garment mannequin to form a figure-and-garment profile;
D. removing the portion of the figure-and-garment profile that belongs to the first three-dimensional figure profile to form a first three-dimensional garment profile;
E. 3D-scanning the actual figure of a user to form a second three-dimensional figure profile;
F. stretching or cutting the first three-dimensional figure profile to match the second three-dimensional figure profile, the stretching or cutting proportion being defined as a 3D deformation; and
G. adjusting the first three-dimensional garment profile according to the 3D deformation to form a second three-dimensional garment profile.
2. A three-dimensional fitting method according to claim 1, further comprising a step H of: combining the second three-dimensional garment profile with the second three-dimensional figure profile to form a user figure-and-garment profile.
3. A three-dimensional fitting method according to claim 1, wherein in step A, the mannequin further comprises a plurality of sensing marks, and the sensing marks are evenly distributed on the surface of the mannequin.
4. A three-dimensional fitting method according to claim 1, wherein in step A, the mannequin is scanned using a 3D scanner.
5. A three-dimensional fitting method according to claim 3, wherein in step C, the 3D scanner is used to scan the garment mannequin.
6. A three-dimensional fitting method according to claim 3, wherein in step E, the 3D scanner is used to scan the actual figure of the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910866160.7A CN112488779A (en) | 2019-09-12 | 2019-09-12 | Three-dimensional fitting method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112488779A true CN112488779A (en) | 2021-03-12 |
Family
ID=74919934
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910866160.7A Pending CN112488779A (en) | 2019-09-12 | 2019-09-12 | Three-dimensional fitting method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112488779A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103810607A (en) * | 2014-03-03 | 2014-05-21 | 郑超 | Virtual fitting method |
CN106251200A (en) * | 2016-07-27 | 2016-12-21 | 华北电力大学 | The virtual fit method of Case-based Reasoning |
CN106920146A (en) * | 2017-02-20 | 2017-07-04 | 宁波大学 | Three-dimensional fitting method based on body-sensing characteristic parameter extraction |
CN106960463A (en) * | 2017-03-13 | 2017-07-18 | 东华大学 | Towards the quick fitting method of three-dimensional virtual garment of real scan human body |
CN107025688A (en) * | 2017-03-13 | 2017-08-08 | 东华大学 | A kind of 3-D scanning clothes reconstruct and method for reusing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |