US20020183886A1 - Automated embroidery stitching - Google Patents

Automated embroidery stitching

Info

Publication number
US20020183886A1
US20020183886A1 (Application No. US 10/038,046)
Authority
US
United States
Prior art keywords
regions
region
location
processor
representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/038,046
Inventor
Kuan-Lan Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Softfoundry Inc
Original Assignee
Softfoundry Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Softfoundry Inc filed Critical Softfoundry Inc
Priority to US10/038,046 priority Critical patent/US20020183886A1/en
Publication of US20020183886A1 publication Critical patent/US20020183886A1/en
Abandoned legal-status Critical Current

Classifications

    • DTEXTILES; PAPER
    • D05SEWING; EMBROIDERING; TUFTING
    • D05BSEWING
    • D05B19/00Programme-controlled sewing machines
    • D05B19/02Sewing machines having electronic memory or microprocessor control unit
    • D05B19/04Sewing machines having electronic memory or microprocessor control unit characterised by memory aspects
    • D05B19/08Arrangements for inputting stitch or pattern data to memory ; Editing stitch or pattern data
    • DTEXTILES; PAPER
    • D05SEWING; EMBROIDERING; TUFTING
    • D05BSEWING
    • D05B19/00Programme-controlled sewing machines
    • D05B19/02Sewing machines having electronic memory or microprocessor control unit
    • D05B19/12Sewing machines having electronic memory or microprocessor control unit characterised by control of operation of machine

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Textile Engineering (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

A method for embroidering includes receiving a digitized representation of an image, determining grain structures for a plurality of locations in the digitized representation, the plurality of locations including a first location and a second location, embroidering a representation of the first location using cross stitch patterns, when a grain structure for the first location indicates a bi-directional grain structure, and embroidering a representation of the first location using uni-directional stitch patterns, when the grain structure for the first location indicates a uni-directional grain structure.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application claims priority to U.S. Ser. No. 60/081,333 filed Apr. 10, 1998, incorporated by reference herein, for all purposes.[0001]
  • BACKGROUND OF THE INVENTION
  • With the advent of computer digitizers and embroidery machines, users of embroidery machines and embroidery software have become more conscious of the need to reduce the time, labor, cost and user experience and expertise needed to convert images into embroidery stitch sequences. An embroidery machine is a device which can accept as input a series of movements and other actions (e.g., X/Y move, color change, thread trim, etc.) which correspond to a stitch sequence and sew color stitches onto fabric in an orderly fashion. [0002]
  • Before the use of computer software for assistance with generating stitch sequences for input to embroidery machines, a user needed to simulate the process used by hand embroiderers and sewing machine operators. In particular, the user needed to visually select various spots to receive distinctive colors and then to approximate those desired colored spots by laying down series of stitch segments and sequencing the series of stitches. This process is extremely time-consuming and requires great experience. [0003]
  • Up to several hundred thousand stitches or more are required to represent an image. Because many differently colored threads are needed to cover many different areas many times, and because the embroidery machine's productivity must be improved, there is an additional requirement to organize the overall stitch sequence in an efficient way so as to minimize machine-stop events such as colored-thread changes and thread trims. Typically, several days of labor are required to manually describe a single, ordinary image as an adequate embroidery stitch sequence. [0004]
  • Since the days of purely manual stitch sequence construction, some workers have used computer digitization software to perform a tedious step-by-step task of inputting information from an image into a computer. These techniques involve the use of digitizer tablets or computer monitor screens for detailed, step-by-step tracing or selecting of distinctive borders from different areas of an image in order to identify the borders. Conventionally, a user must select these border edges in sequence to inform the computer that certain border curves form a region. This process is still time-consuming and inconvenient. Once regions are identified, simple computer software can be used to aid in generating certain types of stitch sequences for filling in the regions in a rudimentary manner. [0005]
  • Edge detecting algorithms are sometimes used to automate the process of edge identification from a computer image file. However, the edges detected by these approaches are usually broken and therefore do not form a closed region ready for stitch generation. These broken edges conventionally need to be corrected manually before individual embroidery stitches can be computed from these regions' borders and be placed within these regions following the user's instruction. [0006]
  • One approach to generating these embroidery stitch sequences is to divide the image into many smaller but fairly uniformly colored regions, if such regions can be found, and then, according to the individual geometry of these regions, to pick a different style of machine thread sequence to lay threads such that these regions are covered. The threads generated in each region are laid in a very orderly way, almost parallel to neighboring threads. There are several problems associated with this approach. First, the stitches generated in this way are too orderly and lack the fine appearance of most of the images. Second, very few images available today can be divided into regions of similar color in an obvious way. Third, even a simple image with clear borders of uniformly colored regions may have hundreds of smaller regions that need to be input into the computer for stitch generation, which is very labor-intensive and time-consuming, both in digitizing along the regions' borders and in selecting each of these small regions' borders to form the regions. [0007]
  • Many images (e.g., photographs) are represented by bitmaps of a vast number of differently colored pixels distributed in a very mixed and complex manner. There is no art today to automatically convert an arbitrary image bitmap into a series of embroidery stitches in an efficient manner. The only existing approach for generating embroidery stitches from this type of image is to hand-lay individual stitch threads, one by one, locally in reference to the local details of the image. Since this approach needs significant experience and vast amounts of labor for each task, it is not generally practical in the machine embroidery industry. The stitches resulting from such hand-placement and hand-sequencing usually have no fixed order, appear to be very random, and frequently cross one another, to the detriment of embroidery machine efficiency and in direct contrast to the types of stitches currently capable of being generated automatically or with machine assistance. [0008]
  • Images exist today in the form of photos, paintings, artworks, prints, fabrics, or even computer files. The methods to convert such images to bitmap data are well-known arts. A bitmap is an N by M array of color pixel data, residing either in a computer's volatile memory (e.g., random access memory, or RAM) or in nonvolatile memory (e.g., a hard disk). Bitmaps on one extreme could be highly organized, with a single color repeated in every pixel. Bitmaps on the other extreme could be completely random in pixel color, shade, and distribution. Thus, one can say that images as bitmaps exist in nearly infinite possibilities. The existing efforts to automate or partially automate stitch generation achieve only a limited degree of success, and for only a very limited range of image complexity/randomness. [0009]
  • Improved methods and systems are needed for automatic and/or machine-assisted embroidery stitch generation from an image. In particular, such improved methods and systems should reduce needed time and human labor, especially skilled labor, during the stitch-sequence generation phase. Furthermore, the stitch sequences generated from such improved methods and systems should allow for efficient stitching by embroidery machines. In addition, such improved methods and systems should be operable for a vastly greater range of image types. [0010]
  • SUMMARY OF THE INVENTION
  • An automated image processing method and system generate a representation of an input image. One application is reducing the time and labor of constructing, e.g., computer embroidery stitches from an input image. [0011]
  • According to another embodiment, a method for embroidering includes receiving a digitized representation of an image, determining a plurality of regions of the image in response to the digitized representation, and determining a geometric index for each region in the plurality of regions of the image. The method also includes determining a region type associated with each region from the plurality of regions in response to the geometric index for each region, the region types including a first region type and a second region type; embroidering representations of regions from the plurality of regions associated with the first region type; and thereafter embroidering representations of regions from the plurality of regions associated with the second region type. [0012]
  • According to an embodiment of the present invention, a method for embroidering includes receiving a digitized representation of an image and determining grain structures for a plurality of locations in the digitized representation, the plurality of locations including a first location and a second location. The method also includes embroidering a representation of the first location using cross stitch patterns when a grain structure for the first location indicates a bi-directional grain structure, and embroidering a representation of the first location using uni-directional stitch patterns when the grain structure for the first location indicates a uni-directional grain structure. [0013]
  • According to another embodiment, an article of manufacture having embroidery sewn thereon uses one of the methods described above. [0014]
  • According to another embodiment, an embroidery system includes a processor, and a processor readable memory. The processor readable memory includes code that directs the processor to retrieve a digitized representation of an image, and code that directs the processor to determine a plurality of regions of the image in response to the digitized representation of the image. The memory also includes code that directs the processor to determine a geometric index for each region in the plurality of regions, code that directs the processor to determine a region type associated with each region from the plurality of regions in response to the geometric index for each region, the region types including a first region type and a second region type, code that directs the processor to direct embroidering representations of regions from the plurality of regions associated with the first region type, and code that directs the processor to direct embroidering representations of regions from the plurality of regions associated with the second region type after the representations of regions from the plurality of regions associated with the first region type. [0015]
  • According to another embodiment, a computer program product for a computer system including a processor comprises code configured to direct the processor to receive a digitized representation of an image, and code configured to direct the processor to determine grain structures for a plurality of locations in the digitized representation, the plurality of locations including a first location and a second location. The computer program product also includes code configured to direct the processor to direct embroidering of a representation of the first location using cross stitch patterns, when a grain structure for the first location indicates a bi-directional grain structure, and code configured to direct the processor to direct embroidering of a representation of the first location using uni-directional stitch patterns, when the grain structure for the first location indicates a uni-directional grain structure. The codes reside on a tangible medium. [0016]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to more fully understand the present invention, reference is made to the accompanying drawings. [0017]
  • FIGS. 1A and 1B illustrate example input images. [0018]
  • FIG. 2 is a flowchart of a method according to a specific embodiment of the present invention. [0019]
  • FIGS. 3A-3C illustrate samples of different types of regions in coexistence. [0020]
  • FIGS. 4A-4C illustrate samples of different types of machined stitches. [0021]
  • FIG. 5 illustrates an example of subdivided Type 2 regions. [0022]
  • DESCRIPTION OF THE SPECIFIC EMBODIMENTS
  • A specific embodiment of the present invention is directed to a method for automatic computer embroidery stitch generation from an image, for reducing the time and labor required to digitize an image. For best results, input images should be clear and large, with fine resolution, for a meaningful representation with embroidery stitches. [0023]
  • FIGS. 1A and 1B illustrate example input images. In FIG. 1A, there is shown a cartoon-like image with qualitatively different features labeled 10, 20, 30, and 40 which, as described below, variously correspond to Type 1, Type 2, and Type 3 image regions. In FIG. 1B, there is shown a photograph-like image that also includes cartoon-type elements (bold text). FIG. 1B shows a scene with people in a boat, and textured water splashing all about. Any other photograph that shows detailed variety in color, texture, pattern, orientation, etc., such as of a cat, would also have made a fine example picture. [0024]
  • FIG. 2 is a flowchart of a method according to a specific embodiment of the present invention. FIG. 2 includes steps 50, 60, 70, 80, 90, 100, 110, 120, 130, 140, 150, 160, and 170. [0025]
  • FIGS. 3A-3C illustrate samples of different types of regions in coexistence. FIG. 3A includes a Type 3 region 200, a Type 2 region 180, and a Type 1 region 210 within a Type 1 region 190. FIG. 3B includes a Type 3 region 230 over and intercepting a Type 2 region 220 and over and intercepting a Type 1 region 240. FIG. 3B also includes a Type 2 region (220) intercepting a Type 1 region (240). FIG. 3C includes a region 250 that is then uncoupled into a Type 1 region 270 and a Type 2 region 280. [0026]
  • FIGS. 4A-4C illustrate samples of different types of machined stitches. FIG. 4A illustrates running stitches, which are suitable for Type 3 regions. FIG. 4B illustrates zigzag stitches, which are suitable for Type 2 regions. FIG. 4C illustrates various filled stitches, which are suitable for Type 1 regions. [0027]
  • FIG. 5 illustrates an example of Type 2 regions which have been further subdivided into smaller regions. [0028]
  • The Nature of an Image or Region
  • In general, an image can be as simple as a cartoon drawing (FIG. 1A) or as complicated as a mixture of artwork and photos (FIG. 1B). According to the present invention, the complexity/randomness, or "nature," or "composition," of an input image or of regions within the input image is automatically determined. This is shown in Step 50 of FIG. 2 for the specific embodiment. [0029]
  • According to the present invention, separate methods appropriate to each level of complexity/randomness are employed. In the specific embodiment of FIG. 2, two levels of complexity/randomness are handled by two branches of method steps, namely Steps 60-130 and Steps 140-170. [0030]
  • For determining the nature of an input image (e.g., Step 50), the measurement entropy used in the study of deterministic chaos can be adopted. See Reference [2]. [0031]
  • A large class of images result from photographing 3-dimensional real objects such as people, animals, plants, and scenery. These images are composed of mixtures of light signals from large numbers of randomly oriented surfaces which emit, reflect, absorb, and transparently or translucently transmit colors. In such images, the elements seem random but actually all have something in common, e.g. a same lighting source, a same nature of material in certain area(s), etc. The theory of deterministic chaos can be adopted. [0032]
  • The specific embodiment takes as its starting point an image to be analyzed to find the inherent properties of its different regions, in order to apply pertinent ways to generate computer stitches. In the specific embodiment, an image can be divided into chosen small regular regions for determining its properties, according to, e.g., known image processing techniques. If there is enough similarity in neighboring regions, they can be grouped into larger regions for further processing. If a certain region exhibits regularity, techniques described later in this description can be used to extract simple images out. [0033]
  • Consider each smaller region. According to the definition of measurement entropy for a nonlinear dynamic system, the K entropy is the most important measure of a chaotic system: it measures the average rate of loss of information. If K is 0, the system is fully deterministic; if K is infinite, the system is completely random; otherwise, the system exhibits deterministic chaos. [0034]
  • According to the definition: [0035]

    K = \lim_{\varepsilon \to 0} \; \lim_{t_2 \to \infty} \; \frac{1}{t_2 - t_1} \, I_\varepsilon[t_1, t_2]

    where

    I_\varepsilon[t_1, t_2] = -\sum_{(i)} P(i_1, i_2, \ldots, i_n) \, \ln P(i_1, i_2, \ldots, i_n)

  • and P(i_1, i_2, \ldots, i_n) is the joint probability that the state falls into these grid cells, after the state space has been reconstructed into an n-dimensional space with cell size \varepsilon and the time between t_1 and t_2 has been divided into n intervals. [0036]
  • For image processing, the data can be viewed as information from a dynamic system at different stages. Following Reference [3], approximate entropy is an efficient way to compute the measurement entropy. [0037]
  • Typically, the data making up a time series {x(t_i)}, i = 1, \ldots, N, are of lower dimension than the actual dynamics. [0038]
  • After the state space reconstruction using delay vectors of embedding dimension M, the series of M-dimensional vectors is: [0039]

    X(1), X(2), \ldots, X(N-(M-1)L)

  • where [0040]

    X(i) = (x(i), x(i+L), \ldots, x(i+(M-1)L)); \quad i = 1, 2, \ldots, N-(M-1)L

  • is defined as the delay vector. [0041]
  • N is the total sampling point count and L is the sampling interval count. [0042]
  • Define the distance between vectors X(i) and X(j) as: [0043]

    d[X(i), X(j)] = \max_{k=1,2,\ldots,M} \bigl| x(i+(k-1)L) - x(j+(k-1)L) \bigr|

  • Define: [0044]

    C_i^M(r) = \frac{1}{N-M+1} \times \bigl( \text{the total count of } j \text{, for varying } j \text{, such that } d[X(i), X(j)] \le r \bigr)

    and:

    \Phi^M(r) = \frac{1}{N-M+1} \sum_{i=1}^{N-M+1} \ln C_i^M(r)
  • Approximate entropy ApEn(M,r,N) can be defined as [0045]
  • \mathrm{ApEn}(M, r, N) = \Phi^M(r) - \Phi^{M+1}(r)
  • Values of M, r, N, and L can be chosen to adjust the sensitivity for detecting different image natures; for example, M = 2, r = 0 to 15, N = 500 to 1000, and L = 1 can yield satisfactory results for common image types. In other embodiments of the invention, other ways to approximate the entropy may be used. [0046]
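  • As an illustration of the approximate entropy computation defined above, the following is a minimal sketch assuming the sampled data is a one-dimensional sequence (for example, pixel values scanned from a sub-area of the image) and assuming NumPy is available; the function name approximate_entropy and its default arguments are illustrative and not taken from the patent.

```python
import numpy as np

def approximate_entropy(x, M=2, r=10.0, L=1):
    """ApEn(M, r, N) of a 1-D sample sequence x, using the delay-vector
    construction described above (M: embedding dimension, r: tolerance,
    L: sampling interval; with L=1 the count of vectors is N - M + 1)."""
    x = np.asarray(x, dtype=float)
    N = len(x)

    def phi(m):
        # Delay vectors X(i) = (x(i), x(i+L), ..., x(i+(m-1)L)).
        count = N - (m - 1) * L
        vectors = np.array([x[i:i + (m - 1) * L + 1:L] for i in range(count)])
        # d[X(i), X(j)]: maximum componentwise difference between vectors.
        dist = np.max(np.abs(vectors[:, None, :] - vectors[None, :, :]), axis=2)
        # C_i^m(r): fraction of vectors within tolerance r of X(i).
        C = np.mean(dist <= r, axis=1)
        return np.mean(np.log(C))

    return phi(M) - phi(M + 1)
```

  • Under these assumptions, values near zero indicate an orderly, cartoon-like sub-area, while larger values indicate the complex, photograph-like composition handled by the microscopic branch; the actual decision thresholds are left to per-system tuning as described above.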
  • The above expression is used to evaluate the local nature of images ranging from simple cartoon artwork to complicated photos. For example, values of the expression may be compared with predetermined threshold value(s), and values to one side of a threshold would indicate one nature while values to the other side would indicate another nature. Once the nature of the image is identified for many sub-areas of the image (Step 50 of FIG. 2), methods from the specific embodiment of the present invention, e.g., macroscopic stitch generation or microscopic stitch generation, can be applied to the designated areas of the image for automatic stitch generation. [0047]
  • Macroscopic Stitch Generation [0048]
  • In Step 60, the cartoon-like or simple artwork type of image is partitioned into regions of fairly uniform color, and a border is determined for each region. [0049]
  • For a cartoon-like or simple artwork type of image, well-known techniques such as quantification of each image pixel based on RGB or UVW space are used; examples of such well-known image processing techniques are discussed in Reference [1]. The result of such calculations is a quantized number assigned to each pixel. Scanned samples of already-embroidered work can be processed by this same method. [0050]
  • For every pixel in the image, compare its quantized number with that of all adjacent pixels. If the difference between these numbers is small enough, based, e.g., on comparison with a predetermined threshold, one can say that their colors are similar and the pixels can be merged into the same region. This process is repeated until all the pixels are used and all the regions are formed. It is then relatively easy to find all the pixels on the border of each region, so each region has pixels on the border and pixels not on the border. [0051]
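  • The following is a minimal sketch of this pixel-merging step, assuming the quantized numbers are held in a 2-D NumPy array and that 4-connected neighbors whose quantized values differ by no more than a threshold belong to the same region; the flood-fill formulation and the function name grow_regions are illustrative assumptions rather than the patent's exact procedure.

```python
from collections import deque

import numpy as np

def grow_regions(quantized, threshold=1):
    """Group adjacent pixels whose quantized color numbers differ by at most
    `threshold` into regions; returns an integer label map of the same shape."""
    h, w = quantized.shape
    labels = -np.ones((h, w), dtype=int)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            # Flood-fill a new region starting from this unlabeled pixel.
            labels[sy, sx] = next_label
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and abs(int(quantized[ny, nx]) - int(quantized[y, x])) <= threshold):
                        labels[ny, nx] = next_label
                        queue.append((ny, nx))
            next_label += 1
    return labels
```

  • With such a label map, a border pixel is simply any pixel with a 4-neighbor carrying a different label, or a pixel lying on the image edge.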
  • In Step 70, a metric (or multiple metrics, in other embodiments) is computed for each region that is indicative of its geometric quality. From this metric, each region can be classified according to a geometry classification system. In the specific embodiment, the metric measures the "thinness" of a region's shape. [0052]
  • One can compute the pixel count P_B of border pixels and the pixel count P_A of interior area pixels (including border pixels) for each region. A geometry index I can be formally defined as, e.g., [0053]

    I = \frac{P_B^2}{P_A}

  • This is a preferred formulation of the geometry index I. The higher the index, the more "thin" the region; the lower the index, the more "thick," or "full," the region. Other metrics for geometry indices may be used to determine "thinness." [0054]
  • For example, any expression that tends to get larger (or smaller) as P_B exceeds P_A would have some utility, for example: [0055]

    \frac{P_B}{P_A}
  • The macroscopic approach of the specific embodiment takes each region's geometry index I as its starting point. From these indices I, the regions are classified into some number of geometry types, e.g., by comparison with threshold values, along with other considerations. The thresholds may be predetermined by testing the specific system in question with a variety of inputs and finding the values that give good performance. [0056]
  • With illustrative reference to FIG. 1A: [0057]
  • 1. Regions labeled 10 (Type 3) are thin, curve-like regions with a very high index I value; [0058]
  • 2. Regions labeled 20 (Type 2) are thicker in local width compared with Type 3 regions, and therefore have a moderately high index I value; regions labeled 30 (Type 1) have a relatively large area with similar dimensions when measured on two perpendicular axes, so the value of index I is small; and [0059]
  • 3. A region labeled 40 (Type 4) has pixels on the exterior edge of the image and, upon further analysis, may be recognized as a background region. [0060]
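  • Continuing under the same assumptions as the region-growing sketch above, the following sketch computes the geometry index I = P_B^2 / P_A for each labeled region and bins it into the four types; the numeric thresholds and the edge-touching test for the background type are illustrative placeholders, since the patent leaves threshold selection to per-system tuning.

```python
import numpy as np

def classify_regions(labels, thin_threshold=60.0, moderate_threshold=25.0):
    """Classify each region of an integer label map into Type 1/2/3, flagging
    regions that touch the image edge as Type 4 (candidate background)."""
    h, w = labels.shape
    types = {}
    for region in np.unique(labels):
        mask = labels == region
        area = int(mask.sum())                     # P_A: all pixels of the region
        # A border pixel has at least one 4-neighbor outside the region.
        padded = np.pad(mask, 1, constant_values=False)
        interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                    padded[1:-1, :-2] & padded[1:-1, 2:]) & mask
        border = int((mask & ~interior).sum())     # P_B: border pixels only
        index = border ** 2 / area                 # geometry index I = P_B^2 / P_A
        if mask[0, :].any() or mask[-1, :].any() or mask[:, 0].any() or mask[:, -1].any():
            types[region] = 4                      # touches the exterior edge
        elif index >= thin_threshold:
            types[region] = 3                      # thin, curve-like
        elif index >= moderate_threshold:
            types[region] = 2                      # moderately thin
        else:
            types[region] = 1                      # large, "full" region
    return types
```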
  • FIG. 1A is an example of just one simple image for embodying the present invention. It will be readily apparent to one of ordinary skill in the art that a vast number of images are suitable for use in conjunction with the present invention. More specifically, as non-exhaustively illustrated in FIG. 3, the first three types of regions can be nested, intersected, or separated from one another in many ways and in many images; any combination of two of the region types mentioned previously is also considered suitable for the present invention. For ease of further operation, some regions can be split into two or more different Type 2 or Type 1 regions, judging by the geometry differences in different portions of the region. If all the regions' colors add up to exceed the colors allowed by the designated embroidery machine, similarly colored regions can be forced to share a same color, to thereby reduce the overall number of colors used. [0061]
  • In the specific embodiment, if a Type 3 (thin, curve-like) area locally coexists with Type 1 or Type 2 areas, the Type 3 area is always embroidered last and overlaid on the previously embroidered Type 1 or Type 2 areas. [0062]
  • In Step 80 of the specific embodiment of the present invention, all Type 3 regions are extracted first and fitted with spline curves. In later steps of stitch generation, these curves will be organized and embroidered last so that they can be overlaid on previously embroidered Type 1 and/or Type 2 areas. A representation of the actual embroidery stitch for Type 3 areas, called running stitches, is shown as 290 in FIG. 4A. The pixels belonging to an extracted Type 3 area (i.e., the void) can be changed to the color number of the adjacent region's pixels, ready for the next round of operation. [0063]
  • In the specific embodiment, slight color variations are allowed within Type 1 and/or Type 2 regions. The techniques cited later in this description can be used to obtain detailed image grain structures, orientations, and shading distribution directly from the original input image, which has not been quantized into regions of uniform color. This image nature information is then saved and later used for stitch generation within these areas. This is a powerful feature which greatly adds to the quality of the output embroidery for many types of input images. [0064]
  • In Step 90, once all the Type 3 regions have been identified and extracted, the next step in the specific embodiment is to operate on Type 2 regions. Many pairs of spline curves are derived and fitted across the local width (FIG. 5), and each Type 2 region is divided into many four-sided sub-regions 320. Once these four-sided sub-regions are adjusted along the border to reasonable geometry, zigzag or satin stitches (FIG. 4B) 300 are generated for each sub-region, and hence for the whole Type 2 region, according to known methods in the art. [0065]
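  • The following is a minimal sketch of generating zigzag (satin-type) needle points for one four-sided sub-region, assuming its two long edges are available as sampled polylines; the arc-length resampling and the simple alternation of needle points are illustrative assumptions and not the patent's spline-based construction.

```python
import numpy as np

def zigzag_stitches(rail_a, rail_b, num_stitches=20):
    """Generate needle points that alternate between two opposite sides
    (`rail_a`, `rail_b`: (K, 2) arrays of polyline points) of a four-sided
    sub-region, forming a zigzag covering."""
    def resample(rail, n):
        # Evenly resample a polyline to n points by arc length.
        rail = np.asarray(rail, dtype=float)
        seg = np.linalg.norm(np.diff(rail, axis=0), axis=1)
        t = np.concatenate([[0.0], np.cumsum(seg)])
        targets = np.linspace(0.0, t[-1], n)
        return np.column_stack([np.interp(targets, t, rail[:, k]) for k in (0, 1)])

    a = resample(rail_a, num_stitches)
    b = resample(rail_b, num_stitches)
    # Alternate needle points a[0], b[0], a[1], b[1], ... to form the zigzag.
    stitches = np.empty((2 * num_stitches, 2))
    stitches[0::2] = a
    stitches[1::2] = b
    return stitches
```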
  • In the specific embodiment of this invention, the following criteria can be used to evaluate a Type 2 region: [0066]
  • 1. The area of this region is relatively small. [0067]
  • 2. The geometry index I is relatively large. [0068]
  • 3. The region fits completely within another region. [0069]
  • 4. The region is in between two or more regions of the same color. [0070]
  • If the Type 2 region under consideration satisfies most of these criteria, this region can optionally be extracted out and later overlaid as stitches on top of stitches previously embroidered for other regions. In that case, the pixels which used to be in this Type 2 region (i.e., the void) can be changed to the color of the adjacent regions. Thus, there is a step of determining whether the Type 2 region should be overlaid on top of other stitches. [0071]
  • In the case where a Type 2 region lies between the borders of two or more Type 1 regions of the same color, and this Type 2 region is extracted out for generating overlaid stitches, the present invention takes these bordering Type 1 regions and groups them into one Type 1 region. [0072]
  • In the next Steps 110 and 120, embroidery stitches within each of the Type 1 regions (FIG. 4C) 310 are computed, with or without the use of previously computed image composition ("image nature") information. A Type 1 region may have one or more Type 2 regions embedded within it. [0073]
  • In Step 130, the specific embodiment provides a means to find the image composition information, which can be used as a reference for generating embroidery stitches directly from the border of this region. [0074]
  • The specific embodiment of this invention can find application in automatic stitch generation from a scanned sample of actual embroidery work. The local image grain directions described later, in conjunction with the finding of regions and borders, can be adopted. [0075]
  • As to the region of Type 4 (background region) 40, it is usually omitted and need not be of concern. If there is a need to also generate stitches for any or all of this type of region, the method used is identical to those described previously for Type 1 or Type 2 regions. [0076]
  • Microscopic Stitch Generation [0077]
  • As stated previously, a large fraction of the images in the form of computer bitmaps are taken from photos of complicated 3-D objects. Using the method described previously, such bitmaps can be found to be deterministic chaos; in this case, both the pixel distribution and the extracted edges are very fragmentary, and it is very difficult to use them to generate embroidery stitches. Even if some of these images can be approximated by identifying regions for stitch generation by the method stated previously, the resulting stitches could appear too orderly and plain. [0078]
  • The specific embodiment of the present invention uses a method that links local adjacent pixels to form the embroidery stitches directly, without the need to find the borders of regions containing these local pixels. Using a high-resolution computer monitor as an example, each pixel is about 0.25 mm, comparable to the embroidery thread width, so between 4 and 20 pixels can be linked into a line segment comparable to the machine embroidery stitch length, commonly between 1 mm and 5 mm. Thus the automatic steps needed are to find the distribution, stitch length, and sequence for all differently colored stitch segments. This is a difficult problem and has likely not been previously explored. [0079]
  • Image Grain Structure Determination [0080]
  • In Step 140, the image grain structure is determined. [0081]
  • Applying the well-known technique of the two-dimensional DFT (Reference [1]), a 2-dimensional discrete Fourier transform of a sampled 2-D signal h(k_1, k_2) is given by [0082]

    H(n_1, n_2) = \sum_{k_1=0}^{N-1} \sum_{k_2=0}^{N-1} h(k_1, k_2) \, e^{-j 2\pi (n_1 k_1 + n_2 k_2)/N} = \mathrm{DFT}\{h(k_1, k_2)\}

    where n_1 = 0, 1, 2, \ldots, N-1 and n_2 = 0, 1, 2, \ldots, N-1.
  • In discretized form, k_1, k_2 are the location numbers of the local pixels; in continuous form, using u, v in place of n_1, n_2, define a power measure T(u, v) as: [0083]

    T(u, v) = \log_{10}(1 + |H(u, v)|)

  • to slow down the decrease of the spectrum at increasing frequencies. [0084]
  • By repeatedly applying the DFT at predetermined locations of the image over their adjacent neighborhoods, T(u, v) values are computed for each predetermined location point within the image. [0085]
  • Transform T(u, v) into the polar coordinates (r, φ). [0086]
  • Define [0087]

    T(r) = \sum_{i=1}^{n} T(r, \varphi_i)

  • where [0088]

    \varphi_{i+1} - \varphi_i = \pi / n \quad \text{and} \quad r = r_1, r_2, \ldots, r_n

  • This means that, for each r value, one sums the T values around the circular path of fixed r. [0089]
  • Similarly, define [0090]

    T(\varphi) = \sum_{i=1}^{n} T(r_i, \varphi)

  • where [0091]

    r_{i+1} - r_i = w / 2n \quad \text{and} \quad \varphi = \varphi_1, \varphi_2, \ldots, \varphi_n

  • This means that, for each φ value, one sums the T values along the r direction. [0092]
  • One can now investigate the curve of T(φ) vs. φ and find the number of wave peaks, denoted N. The following results can be recorded: [0093]
  • a. if N is equal to 1 or 2, then it is clear that there is a uni-directional or bi-directional grain structure around this point; in this case, the grain directions can be easily determined; [0094]
  • b. if T(r) is constant, then it indicates that there is no directional grain existing at this point; [0095]
  • c. if T(r) fluctuates, it indicates that the image near this point is very fragmentary and spotty; and [0096]
  • d. finally, if N changes rapidly from point to point in a small region of the image, it indicates that there is a pattern in this region, and one can repeatedly use the same DFT technique over a refined grid to find the details of this pattern. [0097]
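  • As an illustration of this grain analysis, the following is a minimal sketch that computes T(u, v) with a 2-D FFT over a small grayscale patch, samples it over polar angles to obtain T(φ), and counts the wave peaks N; the bin count, the peak-detection rule, and the function name grain_structure are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def grain_structure(patch, n_bins=36):
    """Return T(phi) sampled over n_bins angles and the peak count N for a
    small grayscale patch, following the spectral analysis of Step 140."""
    patch = np.asarray(patch, dtype=float)
    H = np.fft.fftshift(np.fft.fft2(patch))      # center the zero frequency
    T = np.log10(1.0 + np.abs(H))                # T(u, v) = log10(1 + |H(u, v)|)

    # Polar sampling grid around the spectrum center.
    cy, cx = T.shape[0] // 2, T.shape[1] // 2
    radii = np.arange(1, min(cy, cx))
    phis = np.linspace(0.0, np.pi, n_bins, endpoint=False)

    # T(phi): sum of T along the r direction for each angle phi.
    t_phi = np.zeros(n_bins)
    for i, phi in enumerate(phis):
        ys = np.round(cy + radii * np.sin(phi)).astype(int)
        xs = np.round(cx + radii * np.cos(phi)).astype(int)
        t_phi[i] = T[ys, xs].sum()

    # Count wave peaks N, treating T(phi) as circular.
    left, right = np.roll(t_phi, 1), np.roll(t_phi, -1)
    N = int(np.sum((t_phi > left) & (t_phi >= right)))
    return t_phi, N
```

  • Under these assumptions, N equal to 1 or 2 corresponds to case a above (uni- or bi-directional grain, selecting parallel or cross stitches), while a flat or fluctuating spectrum falls to cases b and c.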
  • The specific embodiment of the present invention utilizes the previous findings a, b, c, and d as references to derive the pixel-linking and line-forming process in the following manner: [0098]
  • Case a: local pixels are linked parallel to the local grain direction found previously; this means that parallel stitches are created for a uni-directional grain and cross stitches are generated for a bi-directional grain. [0099]
  • Case b: local pixels are linked parallel to a fixed given direction. The result could be either one-direction stitches or bi-directional stitches. [0100]
  • Case c: a local least-squares method is adopted; for pixels distributed over the local (x, y) neighborhood, random stitches can be found. [0101]
  • Case d: pixels will be linked to closely follow the details of patterns. [0102]
  • Each line segment can be made short (using fewer pixels) or long (using more pixels) according to the local color intensity of those pixels. Thus, shorter stitches, having more densely spaced needle points, can contribute to a darker shade, while longer stitches, having fewer needle points, can contribute to a lighter shade. [0103]
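  • A minimal sketch of this shade-to-length mapping, assuming an 8-bit intensity value and a linear mapping onto the 1 mm to 5 mm stitch-length range mentioned earlier; the linear form of the mapping is an illustrative assumption.

```python
def stitch_length_mm(intensity, min_len=1.0, max_len=5.0):
    """Map a local color intensity in [0, 255] to a stitch length in mm:
    darker (low-intensity) areas get shorter, denser stitches, while lighter
    areas get longer stitches with fewer needle points."""
    shade = min(max(float(intensity), 0.0), 255.0) / 255.0
    return min_len + (max_len - min_len) * shade
```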
  • In the specific embodiment of this invention, a large number of individual, same-colored short line segments in the same area can be connected together by tying in pairs the most closely situated end points from each line segment (Step 150). Many of the tied end points can be merged into one point if the distance between them is very small. This line-linking process continues until the current end point cannot find a nearest end point within an acceptable distance. The process can then temporarily be halted, and the already connected line segments form a group (Step 160). The process continues until all the line segments have been used to form many groups. [0104]
  • One point worth noting is that the starting and ending points of any group of lines can be switched or made to coincide at one point (in which case one extra underlay line of stitches is generated under this group of lines). This property of a changeable starting and ending point in a group of lines is very important, as it provides more flexible ways to connect groups of lines. Interactive tests can be used to find the right sequence for connecting the various groups of lines of different colors, such that the occurrence of long exposed connecting lines between groups is minimized (Step 170). [0105]
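  • The following is a minimal sketch of linking same-colored line segments into groups (Steps 150 and 160), assuming each segment is given as a pair of 2-D end points; the greedy nearest-end-point chaining and the gap and merge tolerances are illustrative assumptions rather than the patent's exact procedure.

```python
import numpy as np

def link_segments(segments, max_gap=2.0, merge_tol=0.3):
    """Chain (start, end) point pairs of one color into ordered stitch groups
    by repeatedly tying the most closely situated end points."""
    remaining = [(np.asarray(a, float), np.asarray(b, float)) for a, b in segments]
    groups = []
    while remaining:
        # Start a new group from any unused segment.
        a, b = remaining.pop(0)
        chain = [a, b]
        extended = True
        while extended and remaining:
            extended = False
            tail = chain[-1]
            # Find the unused segment whose nearer end point is closest to the tail.
            dists = [min(np.linalg.norm(tail - s), np.linalg.norm(tail - e))
                     for s, e in remaining]
            k = int(np.argmin(dists))
            if dists[k] <= max_gap:
                s, e = remaining.pop(k)
                if np.linalg.norm(tail - s) <= np.linalg.norm(tail - e):
                    nxt, far = s, e
                else:
                    nxt, far = e, s
                # Merge nearly coincident end points instead of adding a tiny jump.
                if np.linalg.norm(tail - nxt) > merge_tol:
                    chain.append(nxt)
                chain.append(far)
                extended = True
        groups.append(np.array(chain))   # one group of connected stitch points
    return groups
```

  • Each returned group is an ordered list of needle points whose starting and ending points can be swapped or joined when sequencing the groups, as noted above, to minimize long exposed connecting lines between groups.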
  • The embroidery field can sometimes still be considered an art, and there is no fixed, unique way to prepare stitches. The system and method described herein can process a large class of input images. For certain images, a better final result can be achieved by slightly modifying either the stitch sequence generated by the specific embodiment or the image itself (for reprocessing by the present invention). [0106]
  • The invention has now been explained with reference to specific embodiments. The present invention relates to methods and systems for deriving information from images and generating command sequences from that information to create stylized renditions of those images. A particular application is machine embroidery, wherein the present invention generates embroidery stitch sequences and optionally operates an embroidery machine using the generated stitch sequences as input. [0107]
  • Other embodiments will be apparent to those of ordinary skill in the art in view of the foregoing description. For example, the "stitch sequences" could also correspond to a generic line-art representation of images, which has aesthetic appeal even when printed on other media or displayed on a computer display. [0108]
  • It is envisioned herein that the present invention will find its implementation in a software program on a computer-readable storage medium within a computer system, wherein the computer system may be coupled to send its output to an embroidery machine, and which computer system may form part of an automated embroidery system. Such an embodiment is therefore to be considered within the scope of the present invention. Other embodiments are described in the attached appendices. [0109]
  • The scope of the invention is therefore indicated by the appended claims rather than by the foregoing description, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced within their scope. It will be appreciated by those skilled in this particular art that the present invention can be embodied in other specific forms without departing from its spirit or essential characteristics. [0110]
    [Appendix drawing pages: Figures US20020183886A1-20021205-P00001 through US20020183886A1-20021205-P00170; images not reproduced here.]

Claims (26)

What is claimed is:
1. A method for embroidering comprises:
receiving a digitized representation of an image;
determining grain structures for a plurality of locations in the digitized representation, the plurality of locations including a first location and a second location;
embroidering a representation of the first location using cross stitch patterns, when a grain structure for the first location indicates a bi-directional grain structure; and
embroidering a representation of the first location using unidirectional stitch patterns, when the grain structure for the first location indicates a unidirectional grain structure.
2. The method of claim 1 further comprising:
embroidering a representation of the first location using pre-determined fill patterns, when the grain structure for the first location indicates a reduced directional grain structure.
3. The method of claim 1 wherein embroidering the representation of the first location using uni-directional stitch patterns, comprises:
determining a stitch length for the representation of the first location, in response to a color intensity at the first location; and
embroidering the representation of the first location using stitches having the stitch length.
4. The method of claim 1 further comprising:
embroidering a representation of the second location using unidirectional stitch patterns, a direction for the unidirectional stitch patterns associated with the first location different from a direction for the unidirectional stitch patterns associated with the second location.
5. The method of claim 1 wherein the image is a photograph.
6. The method of claim 1 wherein determining grain structures for a plurality of locations in the digitized representation comprises using a discrete Fourier transform.
7. A method for embroidering comprises:
receiving a digitized representation of an image;
determining a plurality of regions of the image in response to the digitized representation;
determining a geometric index for each region in the plurality of regions of the image;
determining a region type associated with each region from the plurality of regions in response to the geometric index for each region, the region types including a first region type and a second region type;
embroidering representations of regions from the plurality of regions associated with the first region type; and thereafter
embroidering representations of regions from the plurality of regions associated with the second region type.
8. The method of claim 7 wherein determining region types associated with regions from the plurality of regions includes determining a region associated with a first region type;
the method further comprising:
segmenting the region associated with the first region type into a first region associated with the first region type and a second region associated with the second region type.
9. The method of claim 7 wherein
determining the geometric index comprises determining a relative thickness indicator for each region in the plurality of regions of the image; and
determining the region type associated with each region from the plurality of regions is in response to the relative thickness indicator of each region in the plurality of regions.
10. The method of claim 7 wherein the representations of regions from the plurality of regions associated with the second region type are overlaid over representations of regions from the plurality of regions associated with the first region type.
11. The method of claim 7 wherein embroidering representations of regions from the plurality of regions associated with the first region type comprises:
determining a grain structure for a location in a region associated with the first region type; and
embroidering a representation of the location of the region using uni-directional stitch patterns, when the grain structure for the location indicates a uni-directional grain structure.
12. The method of claim 11 wherein embroidering the representation of the location using unidirectional stitch patterns, comprises:
determining a stitch length for the representation of the first location, in response to a color intensity at the first location; and
embroidering the representation of the first location using stitches having the stitch length.
13. A computer program product for a computer system including a processor comprises:
code configured to direct the processor to receive a digitized representation of an image;
code configured to direct the processor to determine grain structures for a plurality of locations in the digitized representation, the plurality of locations including a first location and a second location;
code configured to direct the processor to direct embroidering of a representation of the first location using cross stitch patterns, when a grain structure for the first location indicates a bi-directional grain structure; and
code configured to direct the processor to direct embroidering of a representation of the first location using unidirectional stitch patterns, when the grain structure for the first location indicates a unidirectional grain structure;
wherein the codes reside on a tangible media.
14. The computer program product of claim 13 further comprising:
code configured to direct the processor to direct embroidering of a representation of the first location using predetermined fill patterns, when the grain structure for the first location indicates a reduced directional grain structure.
15. The computer program product of claim 13 wherein the code configured to direct the processor to direct embroidering of the representation of the first location using uni-directional stitch patterns, comprises:
code configured to direct the processor to determine a stitch length for the representation of the first location, in response to a color intensity at the first location; and
code configured to direct the processor to direct embroidering of the representation of the first location using stitches having the stitch length.
16. The computer program product of claim 13 further comprising:
code configured to direct the processor to direct embroidering of a representation of the second location using uni-directional stitch patterns, a direction for the uni-directional stitch patterns associated with the first location different from a direction for the uni-directional stitch patterns associated with the second location.
17. The computer program product of claim 13 wherein the image is a photograph.
18. The computer program product of claim 13 wherein code configured to direct the processor to determine grain structures for a plurality of locations in the digitized representation comprises code configured to direct the processor to execute a discrete Fourier transform.
19. An embroidery system including:
a processor; and
a processor readable memory comprising:
code that directs the processor to retrieve a digitized representation of an image;
code that directs the processor to determine a plurality of regions of the image in response to the digitized representation of the image;
code that directs the processor to determine a geometric index for each region in the plurality of regions;
code that directs the processor to determine a region type associated with each region from the plurality of regions in response to the geometric index for each region, the region types including a first region type and a second region type;
code that directs the processor to direct embroidering representations of regions from the plurality of regions associated with the first region type; and
code that directs the processor to direct embroidering representations of regions from the plurality of regions associated with the second region type after the representations of regions from the plurality of regions associated with the first region type.
20. The embroidery system of claim 19 wherein the code that directs the processor to determine region types associated with regions from the plurality of regions includes code that directs the processor to determine a region associated with a first region type;
the processor readable memory further comprising:
code that directs the processor to segment the region associated with the first region type into a first region associated with the first region type and a second region associated with the second region type.
21. The embroidery system of claim 19 wherein
the code that directs the processor to determine a geometric index for each region in the plurality of regions of the image comprises code that directs the processor to determine a relative thickness index for each region in the plurality of regions; and
the code that directs the processor to determine a region type associated with each region from the plurality of regions is in response to the relative thickness index of each region from the plurality of regions.
22. The embroidery system of claim 19 wherein the representations of regions from the plurality of regions associated with the second region type are overlaid over representations of regions from the plurality of regions associated with the first region type.
23. The embroidery system of claim 19 wherein the code that directs the processor to direct embroidering of representations of regions from the plurality of regions associated with the first region type comprises:
code that directs the processor to determine a grain structure for a location in a region associated with the first region type; and
code that directs the processor to direct embroidering of a representation of the location of the region using uni-directional stitch patterns, when the grain structure for the location indicates a uni-directional grain structure.
24. The embroidery system of claim 23 wherein the code that directs the processor to direct embroidering of the representation of the location using uni-directional stitch patterns, comprises:
code that directs the processor to determine a stitch length for the representation of the first location, in response to a color intensity at the first location; and
code that directs the processor to direct embroidering of the representation of the first location using stitches having the stitch length.
25. An article of manufacture having embroidery embroidered thereon using the method described in claim 1.
26. An article of manufacture having embroidery embroidered thereon using the method described in claim 7.
US10/038,046 1998-04-10 2002-01-02 Automated embroidery stitching Abandoned US20020183886A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/038,046 US20020183886A1 (en) 1998-04-10 2002-01-02 Automated embroidery stitching

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US8133398P 1998-04-10 1998-04-10
US09/286,109 US6370442B1 (en) 1998-04-10 1999-04-02 Automated embroidery stitching
US10/038,046 US20020183886A1 (en) 1998-04-10 2002-01-02 Automated embroidery stitching

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/286,109 Continuation US6370442B1 (en) 1998-04-10 1999-04-02 Automated embroidery stitching

Publications (1)

Publication Number Publication Date
US20020183886A1 true US20020183886A1 (en) 2002-12-05

Family

ID=22163514

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/286,109 Expired - Fee Related US6370442B1 (en) 1998-04-10 1999-04-02 Automated embroidery stitching
US10/038,046 Abandoned US20020183886A1 (en) 1998-04-10 2002-01-02 Automated embroidery stitching

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/286,109 Expired - Fee Related US6370442B1 (en) 1998-04-10 1999-04-02 Automated embroidery stitching

Country Status (4)

Country Link
US (2) US6370442B1 (en)
EP (1) EP1102881A4 (en)
AU (1) AU3551699A (en)
WO (1) WO1999053128A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6836695B1 (en) * 1998-08-17 2004-12-28 Soft Sight Inc. Automatically generating embroidery designs from a scanned image
US6983081B2 (en) * 2002-08-23 2006-01-03 Ulead Systems, Inc. Method for integration of source object into base image
JP2007181607A (en) * 2006-01-10 2007-07-19 Juki Corp Sewing machine
US7920939B2 (en) * 2006-09-30 2011-04-05 Vistaprint Technologies Limited Method and system for creating and manipulating embroidery designs over a wide area network
US20080243298A1 (en) * 2007-03-28 2008-10-02 Hurd Deborah J Method and system for creating printable images of embroidered designs
US9702071B2 (en) * 2008-10-23 2017-07-11 Zazzle Inc. Embroidery system and method
US11157977B1 (en) 2007-10-26 2021-10-26 Zazzle Inc. Sales system using apparel modeling system and method
JP4798239B2 (en) * 2009-03-13 2011-10-19 ブラザー工業株式会社 Embroidery data creation device, embroidery data creation program, and computer-readable medium storing embroidery data creation program
US8955447B1 (en) * 2011-03-30 2015-02-17 Linda Susan Miksch Method for joining fabric
JP2013146366A (en) * 2012-01-19 2013-08-01 Brother Ind Ltd Embroidery data generating device and embroidery data generating program
JP2015093127A (en) * 2013-11-13 2015-05-18 ブラザー工業株式会社 Sewing machine

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3259089A (en) * 1962-09-13 1966-07-05 John T Rockholt Tufting machine
US4483265A (en) 1982-09-27 1984-11-20 Weidmann Kathy A Cross-stitch design process
JP2523346B2 (en) 1988-02-26 1996-08-07 蛇の目ミシン工業株式会社 Automatic device for creating embroidery data for computer embroidery machines
DE69030792T2 (en) * 1989-07-27 1997-09-11 Lsi Logic Corp Method and device for interaction emulation between an application-specific integrated circuit (ASIC) during development and a target system
JPH07136361A (en) * 1993-11-18 1995-05-30 Brother Ind Ltd Embroidery data generating device
JPH0844848A (en) * 1994-07-28 1996-02-16 Brother Ind Ltd Image processor and embroidery data preparing device
JPH09170158A (en) * 1995-12-20 1997-06-30 Brother Ind Ltd Embroidery data processor
US6004018A (en) * 1996-03-05 1999-12-21 Janome Sewing Machine Device for producing embroidery data on the basis of image data
JPH10179964A (en) * 1996-12-27 1998-07-07 Brother Ind Ltd Method and apparatus for processing embroidery data
JPH1176658A (en) * 1997-09-05 1999-03-23 Brother Ind Ltd Embroidery data processor, its sewing machine and recording medium

Also Published As

Publication number Publication date
EP1102881A1 (en) 2001-05-30
US6370442B1 (en) 2002-04-09
WO1999053128A1 (en) 1999-10-21
EP1102881A4 (en) 2004-11-10
WO1999053128A9 (en) 2000-05-25
AU3551699A (en) 1999-11-01


Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE