CN114186299B - Method for generating and rendering three-dimensional clothing seam effect - Google Patents
- Publication number: CN114186299B (application CN202111499059.6A)
- Authority
- CN
- China
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2111/00—Details relating to CAD techniques
- G06F2111/16—Customisation or personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30124—Fabrics; Textile; Paper
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A10/00—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE at coastal zones; at river basins
- Y02A10/40—Controlling or monitoring, e.g. of flood or hurricane; Forecasting, e.g. risk assessment or mapping
Abstract
The invention discloses a method for generating and rendering a three-dimensional clothing seam effect, comprising: importing or drawing 2D garment pieces and generating a 3D garment model; constructing a 2D closed contour map of each piece; generating a normal map from the boundary relations of the 2D closed contour map; synthesizing a seam normal map from the stitching relations between the pieces; and rendering with the normal map in real time to obtain the final seam rendering effect. The invention simulates seams from the normal map and the stitching information of the garment pieces, producing the light-and-dark wrinkle effect at the seams of the three-dimensional garment; the edge information of each piece is completed by a filter closed-loop method, and the interior of the edge map is filled by a flood-fill ("water diffusion") method to obtain a filled normal map, ensuring that the final rendered image is complete and continuous.
Description
Technical Field
The invention belongs to the field of three-dimensional clothing design, and particularly relates to a method for generating and rendering a three-dimensional clothing seam effect.
Background
Garment CAD has been a field of research since the 1970s, and the technology is now widely used in garment manufacturers of all sizes. 3D garment design lets a designer see the effect of a design immediately and more intuitively; with the development of computer software and hardware, real-time simulation of garments reflects the design effect ever more realistically, even allowing real-time try-on and style changes, meeting design requirements to the greatest extent while saving time.
To present the design effect well, the rendering of details becomes particularly important, and the seam effect plays a major role in the final result. A 3D garment model is built from a mesh of triangles, whereas a real-world garment forms small wrinkles at its seams; such fine wrinkles are difficult to represent with triangle faces and are therefore better represented with an image.
In real-time rendering, normal mapping is a technique that represents surface detail well: it can reproduce the bump texture of garment fabric, and by superimposing wrinkles at a seam it can generate the stitch-wrinkle effect of a real sewn seam.
Disclosure of Invention
Purpose of the invention: the invention provides a method for generating and rendering a three-dimensional clothing seam effect, which produces a realistic seam-wrinkle effect on a three-dimensional clothing model.
The technical scheme is as follows: a method for generating and rendering a three-dimensional clothing seam effect comprises the following steps:
s1, importing or drawing a two-dimensional garment design piece in real time;
s2, generating three-dimensional garment piece model information and a two-dimensional closed contour map of the garment by scanning the pixels of the piece;
s3, checking whether the contour feature points of the resulting map are complete and continuous; if so, computing a normal map from the closed contour map of S2;
s4, obtaining, from the pixels in the closed region bounded by the garment contour points, the pixel values of the pixels at the positions to be stitched; stitching the pieces, and recording the texture coordinates of the pieces at the seam;
s5, superimposing the wrinkle information of the seam onto the normal map of S3 according to the texture coordinates of the pieces at the seam, and fusing this normal map with the bump texture of the original piece;
s6, completing the edge information of the piece by a filter closed-loop method: first segmenting the target piece; extracting the edge map of the target piece; extracting the outer contour of the edge map by boundary tracking; then, taking the outer contour as the fill boundary, filling the interior of the edge map by flood filling ("water diffusion") to obtain the filled normal map;
and S7, rendering the edge-filled piece in real time to obtain the final three-dimensional model surface effect and seam effect.
Specifically, the closed contour map in S2 is represented as a grayscale image, with pixels outside and inside the contour line given different grey values.
Specifically, in S3 a normal map is generated from the grayscale image. Let (i, j) be coordinates on the image and H the pixel value; tangent vectors are formed in the S and T directions (horizontal and vertical):
S(i,j) = <1, 0, H(i+1,j) - H(i-1,j)>
T(i,j) = <0, 1, H(i,j+1) - H(i,j-1)>
Then the normal at (i, j) is:
N(i,j) = S(i,j) × T(i,j) / |S(i,j) × T(i,j)|.
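As a non-authoritative sketch, the cross-product construction above can be written in NumPy (the function name and the clamped-border handling are illustrative choices, not specified by the patent):

```python
import numpy as np

def height_to_normal_map(H):
    """Tangent-space normal map from a grayscale height image H.

    Implements S(i,j) = <1, 0, H(i+1,j) - H(i-1,j)>,
               T(i,j) = <0, 1, H(i,j+1) - H(i,j-1)>,
               N = S x T / |S x T|.
    Border pixels are handled by clamping (an assumption; the patent
    does not specify border behaviour).
    """
    H = np.asarray(H, dtype=np.float64)
    Hp = np.pad(H, 1, mode="edge")
    dS = Hp[2:, 1:-1] - Hp[:-2, 1:-1]   # H(i+1, j) - H(i-1, j)
    dT = Hp[1:-1, 2:] - Hp[1:-1, :-2]   # H(i, j+1) - H(i, j-1)
    # Cross product of S = (1, 0, dS) and T = (0, 1, dT) is (-dS, -dT, 1).
    N = np.stack([-dS, -dT, np.ones_like(H)], axis=-1)
    return N / np.linalg.norm(N, axis=-1, keepdims=True)
```

For a flat image the result is the up vector (0, 0, 1) everywhere; to store the result as a normal-map texture, each component would then be remapped from [-1, 1] into the 0-255 channel range.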
Specifically, superimposing the wrinkle effect in S5 means writing the prepared wrinkle normals into the normal map generated in S3 at the texels given by the seam texture coordinates.
Specifically, S6 blends two normal maps. Because a normal map stores vectors rather than colour values, simple colour mixing cannot be used; a partial-derivative blending method is adopted instead, preserving both the normal map of the original garment piece and the newly superimposed wrinkle normals.
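One way to realise such a partial-derivative blend (sketched here in NumPy; the function name and the closed form are illustrative, since the patent does not spell them out) is to add the slopes encoded by the two unit normals and renormalise:

```python
import numpy as np

def blend_normals_pd(n1, n2):
    """Partial-derivative blend of two unit tangent-space normals.

    A normal (x, y, z) encodes the surface slopes (-x/z, -y/z); adding the
    slopes of both maps and rebuilding the normal keeps the detail of both,
    which plain per-channel colour averaging would flatten. The closed form
    (x1*z2 + x2*z1, y1*z2 + y2*z1, z1*z2), renormalised, is equivalent.
    """
    n1 = np.asarray(n1, dtype=np.float64)
    n2 = np.asarray(n2, dtype=np.float64)
    out = np.stack([
        n1[..., 0] * n2[..., 2] + n2[..., 0] * n1[..., 2],
        n1[..., 1] * n2[..., 2] + n2[..., 1] * n1[..., 2],
        n1[..., 2] * n2[..., 2],
    ], axis=-1)
    return out / np.linalg.norm(out, axis=-1, keepdims=True)
```

Blending with the flat normal (0, 0, 1) leaves the other map unchanged, so regions of the garment untouched by a seam keep their original bump detail.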
Beneficial effects: compared with the prior art, the invention has the following advantages:
1. the seam is simulated from the normal map and the stitching information of the garment pieces, achieving the light-and-dark wrinkle effect at the seams of the three-dimensional garment;
2. the edge information of each piece is completed by a filter closed-loop method, and the interior of the edge map is filled by flood filling to obtain a filled normal map, ensuring that the final rendered image is complete and continuous.
Drawings
FIG. 1 is a flow chart of the method for generating a seam effect on a three-dimensional garment model according to an embodiment of the present invention;
FIG. 2 is the contour map generated in S2 of FIG. 1;
FIG. 3 is the normal map generated in S3 of FIG. 1;
FIG. 4 is the normal map after the wrinkles are superimposed in S5 of FIG. 1;
FIG. 5 is a schematic diagram of the real-time rendering result, generated in S7 of FIG. 1, of a region without a seam;
FIG. 6 is a schematic diagram of the real-time rendering result of the seam generated in S7 of FIG. 1.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
A method for generating and rendering a three-dimensional clothing seam effect comprises the following steps:
s1, importing or drawing a two-dimensional garment design piece in real time;
s2, generating three-dimensional garment piece model information and a two-dimensional closed contour map of the garment by scanning the pixels of the piece; the closed contour map is represented as a grayscale image, with pixels outside and inside the contour line given different grey values;
s3, checking whether the contour feature points of the resulting map are complete and continuous; if so, generating a normal map from the grayscale closed contour map of S2. Let (i, j) be coordinates on the image and H the pixel value; tangent vectors are formed in the S and T directions (horizontal and vertical):
S(i,j) = <1, 0, H(i+1,j) - H(i-1,j)>
T(i,j) = <0, 1, H(i,j+1) - H(i,j-1)>
then the normal at (i, j) is:
N(i,j) = S(i,j) × T(i,j) / |S(i,j) × T(i,j)|;
s4, obtaining, from the pixels in the closed region bounded by the garment contour points, the pixel values of the pixels at the positions to be stitched; stitching the pieces, and recording the texture coordinates of the pieces at the seam;
s5, superimposing the wrinkle information of the seam onto the normal map of S3 according to the texture coordinates of the pieces at the seam, and fusing this normal map with the bump texture of the original piece;
s6, completing the edge information of the piece by a filter closed-loop method: first segmenting the target piece; extracting the edge map of the target piece; extracting the outer contour of the edge map by boundary tracking; then, taking the outer contour as the fill boundary, filling the interior of the edge map by flood filling to obtain the filled normal map;
and S7, rendering the edge-filled piece in real time to obtain the final three-dimensional model surface effect and seam effect.
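The "water diffusion" fill of S6 behaves like a breadth-first flood fill bounded by the traced outer contour. A minimal sketch in Python follows; the function name and the choice of 4-connectivity are assumptions, as the patent does not specify them:

```python
from collections import deque
import numpy as np

def flood_fill(boundary, seed, fill_value=1):
    """Fill the region of `boundary` connected to `seed`, like water
    spreading until it reaches the contour.

    `boundary` is a 2-D integer array whose non-zero cells are the traced
    outer-contour pixels; every cell equal to 0 that is 4-connected to
    `seed` is set to `fill_value`.
    """
    filled = np.array(boundary, copy=True)
    h, w = filled.shape
    queue = deque([seed])
    while queue:
        i, j = queue.popleft()
        if 0 <= i < h and 0 <= j < w and filled[i, j] == 0:
            filled[i, j] = fill_value
            # Spread to the four neighbours; already-filled and boundary
            # cells are skipped by the check above.
            queue.extend([(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)])
    return filled
```

Running this inside each traced contour marks the filled region into which the completed normal-map values can then be written.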
The superimposition of the wrinkle effect in S5 above means writing the prepared wrinkle normals into the normal map generated in S3 at the texels given by the seam texture coordinates.
S6 above blends two normal maps. Because a normal map stores vectors rather than colour values, simple colour mixing cannot be used; a partial-derivative blending method is adopted instead, preserving both the normal map of the original garment piece and the newly superimposed wrinkle normals.
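The texel replacement along the seam could be sketched as follows; the function name and the sample fold normal are hypothetical, and a real implementation would copy texels from a pre-authored fold normal strip rather than a single constant:

```python
import numpy as np

def stamp_seam_folds(normal_map, seam_texels, fold_normal=(0.0, 0.3, 0.95)):
    """Write a fold normal into the normal map at the recorded seam texels.

    `seam_texels` is the list of (row, col) texture coordinates recorded
    when the pieces were stitched; `fold_normal` is a single illustrative
    wrinkle normal standing in for a pre-authored fold map.
    """
    out = np.array(normal_map, dtype=np.float64, copy=True)
    n = np.asarray(fold_normal, dtype=np.float64)
    n = n / np.linalg.norm(n)  # ensure a unit normal before writing
    for i, j in seam_texels:
        out[i, j] = n
    return out
```

The stamped map can then be fused with the piece's original bump normals using the partial-derivative blend that the method prescribes.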
Example 1
The method is described below taking the stitching of a T-shirt as an example:
As shown in fig. 1, the user imports the two-dimensional pattern pieces designed by a designer;
In step S2, three-dimensional garment piece model data are generated and displayed in a three-dimensional software window; the three-dimensional piece data contain the vertex, normal and texture-coordinate data necessary for display, and are correlated with other internal data. A 2D contour map is then generated from the texture-coordinate information at the contour of the 2D garment piece, with different grey values inside and outside the contour line; the result is shown in fig. 2.
Step S3 is then performed: from the result of step S2, the normal for each pixel is computed from the neighbouring grey values along the texture S and T directions, and the stored result is shown in fig. 3.
In S4, a stitching tool is used to select two edges to stitch, generating seam marking data used later for superimposing the seam wrinkle effect.
In S5, based on the result of S4, the previously prepared wrinkle map is written in along the seam according to the texture coordinates; the result is shown in fig. 4.
In S6, if the garment piece is already rendered with a normal map, it is fused with the normals generated in S5 so that the details of both are retained; if not, the normals generated in S5 are used directly.
In S7, the final normals obtained in S6 are used and the real-time rendering stage is entered. Fig. 5 shows the effect without a seam, and fig. 6 shows the seam effect.
In summary, the method imports or draws 2D garment pieces to generate a 3D garment model; constructs a 2D closed contour map of each piece; generates a normal map from the boundary relations of the 2D closed contour map; synthesizes a seam normal map from the stitching relations between the pieces; and renders with the normal map in real time to obtain the final seam rendering effect. The invention simulates seams from the normal map and the stitching information of the garment pieces, achieving the light-and-dark wrinkle effect at the seams of the three-dimensional garment.
The preferred embodiments of the invention disclosed above are intended only to assist in the explanation of the invention. The preferred embodiments are not exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best understand and utilize the invention.
Claims (4)
1. A method for generating and rendering a three-dimensional clothing seam effect, characterized by comprising the following steps:
s1, importing or drawing a two-dimensional garment design piece in real time;
s2, generating three-dimensional garment piece model information and a two-dimensional closed contour map of the garment by scanning the pixels of the piece;
s3, checking whether the contour feature points of the resulting map are complete and continuous; if so, computing a normal map from the closed contour map of S2;
s4, obtaining, from the pixels in the closed region bounded by the garment contour points, the pixel values of the pixels at the positions to be stitched; stitching the pieces, and recording the texture coordinates of the pieces at the seam;
s5, superimposing the wrinkle information of the seam onto the normal map of S3 according to the texture coordinates of the pieces at the seam, and fusing this normal map with the bump texture of the original piece; S5 blends two normal maps: because a normal map stores vectors rather than colour values, simple colour mixing cannot be used, and a partial-derivative blending method is adopted instead, preserving both the normal map of the original garment piece and the newly superimposed wrinkle normals;
s6, completing the edge information of the piece by a filter closed-loop method: first segmenting the target piece; extracting the edge map of the target piece; extracting the outer contour of the edge map by boundary tracking; then, taking the outer contour as the fill boundary, filling the interior of the edge map by flood filling to obtain the filled normal map;
and S7, rendering the edge-filled piece in real time to obtain the final three-dimensional model surface effect and seam effect.
2. The method for generating and rendering a three-dimensional clothing seam effect according to claim 1, wherein: the closed contour map in S2 is represented as a grayscale image, with pixels outside and inside the contour line given different grey values.
3. The method for generating and rendering a three-dimensional clothing seam effect according to claim 1, wherein: in S3 a normal map is generated from the grayscale image; let (i, j) be coordinates on the image and H the pixel value; tangent vectors are formed in the S and T directions (horizontal and vertical):
S(i,j) = <1, 0, H(i+1,j) - H(i-1,j)>
T(i,j) = <0, 1, H(i,j+1) - H(i,j-1)>
then the normal at (i, j) is:
N(i,j) = S(i,j) × T(i,j) / |S(i,j) × T(i,j)|.
4. The method for generating and rendering a three-dimensional clothing seam effect according to claim 1, wherein: superimposing the wrinkle effect in S5 means writing the prepared wrinkle normals into the normal map generated in S3 at the texels given by the seam texture coordinates.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111499059.6A CN114186299B (en) | 2021-12-09 | 2021-12-09 | Method for generating and rendering three-dimensional clothing seam effect |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114186299A CN114186299A (en) | 2022-03-15 |
CN114186299B true CN114186299B (en) | 2023-12-15 |
Family
ID=80542935
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111499059.6A Active CN114186299B (en) | 2021-12-09 | 2021-12-09 | Method for generating and rendering three-dimensional clothing seam effect |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114186299B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107481095A (en) * | 2017-07-26 | 2017-12-15 | 深圳市盛世华服信息有限公司 | A kind of virtual Design of Popular Dress Ornaments method for customizing of 3D and custom-built system |
CN109598780A (en) * | 2018-08-30 | 2019-04-09 | 广州多维魔镜高新科技有限公司 | A kind of clothes 3D modeling method |
CN111028361A (en) * | 2019-11-18 | 2020-04-17 | 杭州群核信息技术有限公司 | Three-dimensional model and material merging method, device, terminal, storage medium and rendering method |
CN111191659A (en) * | 2019-12-26 | 2020-05-22 | 西安工程大学 | Multi-shape clothes hanger identification method for garment production system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080246765A1 (en) * | 2005-05-06 | 2008-10-09 | Desmond Grenfell | Method and apparatus for constraint-based texture generation |
US20110298897A1 (en) * | 2010-06-08 | 2011-12-08 | Iva Sareen | System and method for 3d virtual try-on of apparel on an avatar |
EP2721209A4 (en) * | 2011-06-15 | 2014-06-25 | Tietex Int Ltd | Stitch bonded creped fabric construction |
EP3335197A1 (en) * | 2015-08-14 | 2018-06-20 | Metail Limited | Method and system for generating an image file of a 3d garment model on a 3d body model |
- 2021-12-09: application CN202111499059.6A filed; patent granted, status Active
Non-Patent Citations (1)
Title |
---|
Design and Implementation of Virtual Hanfu Based on CLO 3D; Zhang Shasha et al.; Fashion Designer (服装设计师); pp. 120-125 *
Also Published As
Publication number | Publication date |
---|---|
CN114186299A (en) | 2022-03-15 |
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant