GB2341529A - Three-dimensional embroidery design simulator - Google Patents

Three-dimensional embroidery design simulator

Info

Publication number
GB2341529A
Authority
GB
United Kingdom
Prior art keywords
dimensional
design
line
points
dimension
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9819258A
Other versions
GB2341529B (en)
GB9819258D0 (en)
Inventor
Andrew Bennett Kaymer
Martin Bysh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Emnet Embroidery Networks Ltd
Original Assignee
Emnet Embroidery Networks Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Emnet Embroidery Networks Ltd filed Critical Emnet Embroidery Networks Ltd
Priority to GB9819258A priority Critical patent/GB2341529B/en
Publication of GB9819258D0 publication Critical patent/GB9819258D0/en
Publication of GB2341529A publication Critical patent/GB2341529A/en
Application granted granted Critical
Publication of GB2341529B publication Critical patent/GB2341529B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

A method of converting an embroidery design from a stitch data format to a format suitable for manipulation, translation and lighting within a three-dimensional mathematical universe and subsequent perspective projection onto a two-dimensional medium, the method comprising the steps of: (1) analysing the design and converting it to a plurality of points; (2) generating a curve along a third dimension between every two consecutive points by splitting the line defined by two points into smaller segments and adding a third dimension component; (3) generating line normals for each segment of each curve, along a third dimension; (4) manipulating the now three-dimensional data points using geometry to form a three-dimensional image of the design; and (5) displaying the image in perspective on a two-dimensional medium.

Description

Three-Dimensional Embroidery Design Simulator

The present invention relates to a three-dimensional embroidery design simulator. Furthermore, it relates to a method of simulating an embroidery pattern on a computer screen, to illustrate its appearance when it is stitched out into a fabric.
The current standard method of displaying an embroidery pattern on a computer screen involves storing stitch data within the computer memory, either as two-dimensional vector information or as absolute coordinates in a two-dimensional Cartesian space. As a result, the embroidery pattern can only be displayed in two-dimensional form. This does not give a true representation or indication of how the embroidery pattern will actually look when stitched into a fabric.
In accordance with the present invention, an embroidery pattern is displayed on a computer screen in perspective view, by converting the embroidery design data from two-dimensional vector information to three-dimensional line segments suitable for lighting and transformation within a three-dimensional mathematical universe.
Accordingly, the present invention provides a method of converting an embroidery design from a stitch data format to a format suitable for manipulation, translation, and lighting within a three-dimensional mathematical universe and subsequent perspective projection onto a two-dimensional medium, the method comprising the steps of:
(1) analysing the design and converting it to a plurality of points; (2) generating a curve along the third dimension between every two consecutive points by splitting the line defined by two points into smaller line segments; (3) generating line normals for each segment of each curve, along the third dimension; (4) manipulating the data points using three-dimensional geometry to form a three-dimensional image of the design; and (5) displaying the image in perspective on a two-dimensional medium.
In a preferred embodiment, the analysis and conversion of the design involves the steps of: (1) establishing the position of jump stitches in the design; and (2) linking all consecutive collinear jumps to form a single stitch vector.
Advantageously, the analysis and conversion of the design involves the steps of:
(1) establishing the position of embedded jumps between two stitches in the design; and (2) integrating the embedded jumps into the line of stitches.
Preferably, the positions of the embedded jumps are established by taking into account defining features of embedded jumps within a design.
The curve generation step may involve separating each stitch vector into a plurality of line segments, and mapping said line segments along a curve in a third dimension. In this case, the line normals may be generated so as to be perpendicular to a rectangular plane, the rectangular plane being defined by the line segment and a second line parallel to it, of equal length, and having the same third dimension components.
Alternatively, the curve generation step may involve separating each stitch vector into a plurality of polygonal line-shaped segments, and mapping said polygonal segments along a curve in a third dimension. In this case, the line normals may be generated so as to be perpendicular to a rectangular plane, the rectangular plane being defined by the polygonal segment and a second polygon parallel to it, having the same third dimension components as the polygonal segment.
Preferably, the two-dimensional medium is the visual display unit of a computer.
The present invention also provides a three-dimensional design simulator comprising means for inputting and storing an embroidery design in stitch data format on a computer, means for manipulating said stitch data to translate it into three dimensions, and means for displaying the three-dimensional format on a two-dimensional medium.
The invention further provides a three-dimensional design simulator for converting an embroidery design from a stitch data format to a format suitable for manipulation, translation, and lighting within a three-dimensional mathematical universe and subsequent perspective projection onto a two-dimensional medium, the simulator comprising:
(1) means for analysing the design and converting it to a plurality of points; (2) means for generating a curve along a third dimension between every two consecutive points, by splitting the line defined by two points into smaller segments; (3) means for generating line normals for each segment of each curve along the third dimension; (4) means for manipulating the data points using three-dimensional geometry to form a three-dimensional image of the design; and (5) means for displaying the image in perspective on a two-dimensional medium.
A three-dimensional design simulator constructed in accordance with the present invention will now be described in detail, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a side view of a curve section of an embroidery design generated by the simulator;
Figure 2 illustrates the method by which line normals are generated by the simulator;
Figure 3 shows an embroidery pattern simulated using standard methods of design simulation; and
Figure 4 shows an embroidery pattern simulated in three dimensions using the simulator.
The three-dimensional design simulator is designed to manipulate an embroidery pattern stored in a computer memory as two-dimensional vector information.
The method of simulation is carried out in three stages.

The first stage involves the pre-processing of jump stitch data and the conversion of the data into absolute co-ordinates. The original design, in two-dimensional vector format, is analysed so as to find jump stitches and embedded jumps. Embedded jumps are found by taking into account various defining features. Embedded jumps perform the function of lengthening a normal stitch by causing the needle to move, without inserting a stitch, after one normal stitch and before the next. They are necessary when a user wishes to exceed the maximum stitch length determined by the embroidery data. Embedded jumps, for example, tend to appear between normal stitches rather than between colour changes in the design. There also tend to be only one or two stitches between groups of collinear jumps. Furthermore, the stitch following the stitch after an embedded jump usually reverses the flow of the stitches. These embedded jump stitches are indications of an extended normal stitch, rather than a true jump, and are thus integrated into the stitch.
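By way of illustration only, the following Python sketch shows how such heuristics might be combined to flag an embedded jump. The stitch record layout (a tuple of relative movement and command) and the command names are assumptions made for the example, not the patented data format.

```python
# Illustrative sketch (not the patented implementation) of the embedded-jump
# heuristics described above.  A stitch is assumed here to be a tuple
# (dx, dy, cmd), where cmd is "stitch", "jump" or "colour_change".

def is_embedded_jump(stitches, i):
    """Heuristically decide whether the jump at index i merely lengthens a
    normal stitch rather than moving to a new part of the design."""
    dx, dy, cmd = stitches[i]
    if cmd != "jump":
        return False
    # Embedded jumps sit between normal stitches, not next to colour changes.
    if i == 0 or stitches[i - 1][2] != "stitch":
        return False
    if i + 1 >= len(stitches) or stitches[i + 1][2] != "stitch":
        return False
    # The stitch following the stitch after the jump usually reverses the flow.
    if i + 2 < len(stitches):
        nxt, after = stitches[i + 1], stitches[i + 2]
        return nxt[0] * after[0] + nxt[1] * after[1] < 0
    return True
```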
All jump stitches, having a vector control value "jump", are marked "invisible", such that they are not displayed as part of the design when simulated in three-dimensional form. Consecutive collinear jump stitches are linked to form a single vector relative to the last needle insertion instruction before the string of jumps. The colour information of blocks of consecutive stitches is also stored for use by a rendering engine. Each stitch making up a single vector is then converted from two-dimensional vector data to absolute co-ordinates in a two-dimensional co-ordinate space, with the first stitch, representing the start of the single vector, becoming the new origin.
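A minimal sketch of this pre-processing, again using the assumed (dx, dy, cmd) stitch layout from the example above, might link collinear jumps and accumulate the relative vectors into absolute co-ordinates as follows; the collinearity tolerance is an arbitrary example value.

```python
# Illustrative sketch of linking collinear jumps and converting relative
# stitch vectors to absolute co-ordinates.

def merge_collinear_jumps(stitches, tol=1e-6):
    """Link runs of consecutive collinear jumps into single jump vectors."""
    merged = []
    for dx, dy, cmd in stitches:
        if merged and cmd == "jump" and merged[-1][2] == "jump":
            pdx, pdy, _ = merged[-1]
            if abs(pdx * dy - pdy * dx) < tol:        # collinear with previous jump
                merged[-1] = (pdx + dx, pdy + dy, "jump")
                continue
        merged.append((dx, dy, cmd))
    return merged

def to_absolute(stitches):
    """Accumulate relative movements into absolute (x, y) points, taking the
    start of the block as the new origin."""
    points, x, y = [(0.0, 0.0)], 0.0, 0.0
    for dx, dy, _ in stitches:
        x, y = x + dx, y + dy
        points.append((x, y))
    return points
```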
The second stage involves the generation of curves, and is understood more clearly with reference to Figure 1. Each line segment/stitch, comprising two consecutive points of the new absolute stitch co-ordinate data, is converted into a curve 1 which, when processed by a three-dimensional rendering engine, will reflect light according to the angle and colour of each section of the curve. One curve is generated per line segment/stitch in the design, each line segment comprising two consecutive absolute points. The curve 1 comprises five straight line segments 2. The segments 2 map to the curve along a third dimension. Any number of line segments can be used to produce the curve, even a single segment. However, the quality of the image will decrease with decreasing numbers of segments, while the memory consumption and the time taken for generation will increase with increasing numbers of segments. The data for the segments, and indeed for the whole design, is stored as an interrupted series of points of the form x, y, z, nx, ny, nz, a segment being formed by any two consecutive points. The co-ordinates nx, ny, nz refer to the normal, created using the method described in stage three below.

The curve which the segments map to can be actual (as described above) or implied. In the latter case, the normals are generated, or pre-calculated, and are allocated to each two-dimensional segment as if it were mapped to the curve along the third dimension. When lit and displayed from the front, this would look identical to the actual curve mapping. If the design is then rotated around the horizontal or vertical axes (not possible in a two-dimensional universe), however, the design would appear flat.
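The following sketch illustrates one possible curve mapping, assuming a half-sine bulge along the third dimension and the five-segment arrangement shown in Figure 1; the actual curve shape and height used by the simulator are not prescribed by this description.

```python
# Illustrative sketch of stage two: split one absolute 2D stitch into several
# 3D segments whose z values follow a half-sine "thread bulge".  The bulge
# shape, height and default segment count are assumptions for the example.
import math

def stitch_to_curve(p0, p1, segments=5, height=0.3):
    """Return the (x, y, z) points of the curve between absolute points p0 and p1."""
    (x0, y0), (x1, y1) = p0, p1
    points = []
    for i in range(segments + 1):
        t = i / segments
        points.append((x0 + t * (x1 - x0),
                       y0 + t * (y1 - y0),
                       height * math.sin(math.pi * t)))  # zero at the ends, raised mid-stitch
    return points
```

As noted above, increasing the number of segments improves the rendered quality at the cost of memory and generation time.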
The third stage involves the generation of the line normals, and is more clearly understood with reference to Figure 2. A normal 3 is generated for each line segment 2 in the design.
The direction of the normal is used in the calculations for the angle of incidence of the light in the three-dimensional mathematical universe. Each normal 3 is generated so as to be perpendicular to a rectangular plane 4 created by the respective segment 2 and a line 5 parallel to, of equal length to, and having equal third dimension components to that line segment (the line 5 forming the opposite end of the rectangle).
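As an illustration, if the segment and its parallel copy are taken to be separated by an in-plane offset, the normal can be obtained as the cross product of the segment direction and that offset; the particular offset direction used below is an assumption made for the example.

```python
# Illustrative sketch of stage three: each segment's normal is taken
# perpendicular to the rectangle spanned by the segment and a parallel copy
# of it.  The offset spanning the rectangle is an assumption for the example.
import math

def segment_normal(a, b, offset=(0.0, 1.0, 0.0)):
    """Unit normal of segment a->b: cross product of the segment direction
    with the offset separating it from its parallel copy."""
    d = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    n = (d[1] * offset[2] - d[2] * offset[1],
         d[2] * offset[0] - d[0] * offset[2],
         d[0] * offset[1] - d[1] * offset[0])
    length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2) or 1.0
    return (n[0] / length, n[1] / length, n[2] / length)
```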
Following the three preparatory stages described above, the stitch data has been transformed into a series of line end co-ordinates and line normals. The resultant data can then be transformed by, for example, morphing, rotating or moving the data in three dimensions. The data can then be lit, before finally being rendered using a traditional three-dimensional rendering engine, such as, for example, OpenGL.

The data relating to the line normals is utilised by the light sourcing mathematics of the rendering engine. A point in the three-dimensional universe (world space) is designated as a source of light. This point is given light components, namely ambient reflectance, diffusion, specular colour and intensity values, together with a direction. Colour (extracted at the initial stage) and pre-defined ambient reflectance, diffusion, specular colour and intensity values are also given to each line segment. The colour of the line, after light processing, is calculated by a combination of the light components and the line components as scaled by the angle of incidence. The angle of incidence of each line segment is the angle between the normal of the line segment and the vector defined by the light point and the position of the line segment in the world space. The ambient component of a line is combined with the ambient reflectance component of each incoming light source. The diffusion component of the line is combined with the incoming light diffusion component. The specular component of the line is combined with the specular component of the incoming light. The resultant colour produced by the process is displayed as the colour of the endpoint of the line. The processed colour of the previous line is used as the line start point colour. After perspective projection, the colours are interpolated, pixel by pixel, across the line using Gouraud shading or Phong shading.
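As a simplified illustration of this combination, the sketch below applies a conventional ambient-plus-diffuse blend for a single light source; the exact scaling and the specular and intensity terms used by the simulator's rendering engine are not reproduced here.

```python
# Illustrative sketch of the lighting combination: a segment endpoint colour
# is a blend of the light and line components scaled by the cosine of the
# angle of incidence.  A simple ambient + diffuse model for one light source
# is assumed for the example.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def shade_endpoint(seg_pos, seg_normal, seg_colour, light_pos, light_colour,
                   ambient=0.2):
    """Return the RGB colour for the endpoint of a lit segment."""
    to_light = [l - p for l, p in zip(light_pos, seg_pos)]
    length = math.sqrt(dot(to_light, to_light)) or 1.0
    to_light = [c / length for c in to_light]
    incidence = max(0.0, dot(seg_normal, to_light))   # cos of angle of incidence
    return tuple(min(1.0, c * (ambient + incidence) * l)
                 for c, l in zip(seg_colour, light_colour))
```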
The quality of the rendered image may be further enhanced by, for example, adding a texture map, consisting of a bitmap of material or thread, to each line at rendering time. Also, shadows may be created during rendering, and more light sources may be added.
Furthermore, a more detailed curve map, for example a curved bump map, may be applied to the stitch vectors during production of the curve, which would produce a textured output when rendered, thus improving the quality of the displayed design. Curve generation may also consist of thin polygonal segments rather than line segments, or alternatively of a plurality of curves. The method of curve generation depends entirely on the graphics application program interface and its method of generating curves.
When displayed on a two-dimensional medium, such as a computer screen, the resultant image provides a more realistic view of what the embroidery pattern will look like when stitched into fabric, with each stitch having been curved according to the present method. This can be seen clearly with reference to Figure 4, which illustrates a design simulated in three dimensions, compared with Figure 3, illustrating the display of the same design, generated by standard methods, wherein the actual stitches are not manipulated to show their curved and lit structures.

Claims (13)

1. A method of converting an embroidery design from a stitch data format to a format suitable for manipulation, translation, and lighting within a three-dimensional mathematical universe and subsequent perspective projection onto a two-dimensional medium, the method comprising the steps of:
(1) analysing the design and converting it to a plurality of points; (2) generating a curve along the third dimension between every two consecutive points by splitting the line defined by two points into smaller line segments and adding a third dimension component;
(3) generating line normals for each segment of each curve, along a third dimension; (4) manipulating the now three-dimensional data points using geometry to form a three-dimensional image of the design; and (5) displaying the image in perspective on a two-dimensional medium.
2. A method according to claim 1, wherein the analysis and conversion of the design involves the steps of:
(1) establishing the position of collinear jump stitches in the design; (2) linking all consecutive collinear jump stitches to form a single stitch vector; and
(3) converting the single stitch vector to a line segment made up of absolute points in a two-dimensional Cartesian space.
3. A method according to claim 1 or claim 2, wherein the analysis and conversion of the design involves the steps of:
(1) establishing the position of embedded jumps within a line of stitches in the design; and (2) integrating the embedded jumps into the line of stitches.
4. A method according to claim 3, wherein the positions of the embedded jumps are established by taking into account defining features of embedded jumps within a design.
5. A method according to any of claims 1 to 4, wherein the design is converted into a plurality of points in a two-dimensional Cartesian space.
6. A method according to any one of claims 1 to 5, wherein the curve generation step involves separating each line segment into a plurality of line segments, and mapping said line segments along a curve in a third dimension.
7. A method according to claim 6, wherein the line normals are generated so as to be perpendicular to a rectangular plane, the rectangular plane being defined by the line segment and a second line parallel to, of equal length to, and with the same third dimension components as, the line segment.
8. A method according to any one of claims 1 to 5, wherein the curve generation step involves separating each line segment into a plurality of line-shaped polygonal segments, and mapping said polygonal segments along a curve in a third dimension.
9. A method according to claim 8, wherein the line normals are generated so as to be perpendicular to a rectangular plane, the rectangular plane being defined by the polygonal segment and a second polygon parallel to, of equal dimensions to, and with the same third dimension components as, the polygonal segment.
10. A method according to any one of the preceding claims, wherein the two-dimensional medium is the visual display unit of a computer.
11. A method according to any of claims 1 to 9, wherein the two-dimensional medium is a printer.
12. A three-dimensional design simulator comprising means for inputting and storing an embroidery design in stitch data format on a computer, means for manipulating said stitch data to translate it into three dimensions, and means for displaying the three-dimensional format on a two-dimensional medium.
13. A three-dimensional design simulator for converting an embroidery design from a stitch data format to a format suitable for manipulation, translation, and lighting within a three-dimensional mathematical universe and subsequent perspective projection onto a two-dimensional medium, the simulator comprising:
(1) means for analysing the design and converting it to a plurality of points; (2) means for generating a curve along a third dimension between every two consecutive points, by splitting the line defined by two points into smaller segments; (3) means for generating line normals for each segment of each curve along a third dimension; (4) means for manipulating the data points using three-dimensional geometry to form a three-dimensional image of the design; and (5) means for displaying the image in perspective on a two-dimensional medium.
14. A three-dimensional design simulator according to claim 13, comprising means for converting the embroidery design into a plurality of points in a two-dimensional Cartesian space.
GB9819258A 1998-09-03 1998-09-03 Three-Dimensional embroidery design simulator Expired - Fee Related GB2341529B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9819258A GB2341529B (en) 1998-09-03 1998-09-03 Three-Dimensional embroidery design simulator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9819258A GB2341529B (en) 1998-09-03 1998-09-03 Three-Dimensional embroidery design simulator

Publications (3)

Publication Number Publication Date
GB9819258D0 GB9819258D0 (en) 1998-10-28
GB2341529A true GB2341529A (en) 2000-03-15
GB2341529B GB2341529B (en) 2002-08-07

Family

ID=10838313

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9819258A Expired - Fee Related GB2341529B (en) 1998-09-03 1998-09-03 Three-Dimensional embroidery design simulator

Country Status (1)

Country Link
GB (1) GB2341529B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1500735A1 (en) * 2002-04-11 2005-01-26 Shima Seiki Manufacturing Limited Embroidery simulation method and apparatus, and program and recording medium
US7212880B2 (en) * 2005-07-12 2007-05-01 Brother Kogyo Kabushiki Kaisha Embroidery data processing device and computer program product
US7457683B2 (en) * 2006-02-08 2008-11-25 Bailie Brian D Adjustable embroidery design system and method
EP2131298A1 (en) * 2007-03-27 2009-12-09 Shima Seiki Manufacturing., Ltd. Simulation device and simulation method of knit product

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110097609B (en) * 2019-04-04 2022-11-29 浙江凌迪数字科技有限公司 Sample domain-based refined embroidery texture migration method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0218922A1 (en) * 1985-09-16 1987-04-22 Marcella M. Katz Flexible non distortable handcraft sheet material and method of applying printed designs thereto

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0218922A1 (en) * 1985-09-16 1987-04-22 Marcella M. Katz Flexible non distortable handcraft sheet material and method of applying printed designs thereto

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1500735A1 (en) * 2002-04-11 2005-01-26 Shima Seiki Manufacturing Limited Embroidery simulation method and apparatus, and program and recording medium
EP1500735A4 (en) * 2002-04-11 2008-09-24 Shima Seiki Mfg Embroidery simulation method and apparatus, and program and recording medium
CN1656273B (en) * 2002-04-11 2011-01-05 株式会社岛精机制作所 Embroidery simulation method and apparatus
US7212880B2 (en) * 2005-07-12 2007-05-01 Brother Kogyo Kabushiki Kaisha Embroidery data processing device and computer program product
US7457683B2 (en) * 2006-02-08 2008-11-25 Bailie Brian D Adjustable embroidery design system and method
EP2131298A1 (en) * 2007-03-27 2009-12-09 Shima Seiki Manufacturing., Ltd. Simulation device and simulation method of knit product
EP2131298A4 (en) * 2007-03-27 2010-08-11 Shima Seiki Mfg Simulation device and simulation method of knit product

Also Published As

Publication number Publication date
GB2341529B (en) 2002-08-07
GB9819258D0 (en) 1998-10-28


Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20090820 AND 20090826

PCNP Patent ceased through non-payment of renewal fee

Effective date: 20150903