CN114657712B - Pattern optimization method based on edge recognition

Pattern optimization method based on edge recognition

Info

Publication number
CN114657712B
Authority
CN
China
Prior art keywords
pattern
product
frame
parameters
sewing
Prior art date
Legal status
Active
Application number
CN202210237014.XA
Other languages
Chinese (zh)
Other versions
CN114657712A (en)
Inventor
崔林涛
潘建国
徐仙国
王桔芬
Current Assignee
Jack Technology Co Ltd
Original Assignee
Jack Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jack Technology Co Ltd
Priority to CN202210237014.XA
Publication of CN114657712A
Application granted
Publication of CN114657712B


Classifications

    • D TEXTILES; PAPER
    • D05 SEWING; EMBROIDERING; TUFTING
    • D05B SEWING
    • D05B19/00 Programme-controlled sewing machines
    • D05B19/02 Sewing machines having electronic memory or microprocessor control unit
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Textile Engineering (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

The invention discloses a pattern optimization method based on edge recognition, addressing the problems that the open-loop or semi-closed-loop control of the prior art is not suitable for large-batch pattern sewing and that pattern deviations easily arise between products. The invention comprises the following steps: S1: acquiring product sewing parameters in batches and, after each product is sewn, capturing an image data stream of the sewn pattern; S2: constructing identification frames for the whole product and for the pattern, calculating their proportion and position relationships, comparing these with parameters in a database, and judging whether adjustment parameters need to be generated; S3: matrixing the pattern identification frame, acquiring the edge areas of the pattern, pixelating them, and comparing the result with set resolution and pixel-spacing thresholds to judge whether adjustment parameters need to be generated; S4: adjusting the parameters of the corresponding sewing equipment. Through the two-stage judgment and the corresponding feedback adjustment, consistency between the product and the template is maintained during mass production.

Description

Pattern optimization method based on edge recognition
Technical Field
The invention relates to the field of parameter optimization of sewing machines, in particular to a pattern optimization method based on edge recognition.
Background
In a conventional automatic sewing machine, a predetermined stitch pattern is generally formed by producing stitches while moving the cloth in the X and Y directions according to sewing pattern data. Open-loop or semi-closed-loop control is commonly used to drive the respective pulse motors.
However, an automatic sewing machine sewing under open-loop or semi-closed-loop control is liable to produce inconsistent sewing patterns during mass production. Hardware factors such as equipment wear can also skew the stitch positions, so that the pattern positions of products in the same batch are not uniform and deviations arise.
For example, the Chinese patent publication with bulletin number CN101100789B discloses an "automatic sewing machine" comprising: a positioning unit having an operation amount detecting unit for detecting the operation amount of the pulse motor that moves the cloth holding member to an arbitrary position; a needle position detection unit that detects the needle position in the up-down direction; a pattern storage unit for storing sewing pattern data; an operation control unit that outputs command pulses to the pulse motor one pulse at a time while maintaining a predetermined deviation between the command pulses and the actual operation; and a required-time storage unit that stores the time required for outputting the command pulses, the motion control unit starting to output the command pulses, with reference to the required time, at a timing such that the command pulse output is completed when the needle reaches the object to be sewn on the needle board.
This scheme is not suitable for large-batch pattern sewing, and pattern deviations easily arise among products of the same batch.
Disclosure of Invention
The invention mainly solves the problems that the open-loop or semi-closed-loop control of the prior art is not suitable for large-batch pattern sewing and that product pattern deviations easily arise, and provides a pattern optimization method based on edge recognition.
The technical problem of the invention is mainly solved by the following technical solution:
The pattern optimization method based on edge recognition is characterized by comprising the following steps:
s1: acquiring product sewing parameters in batches and, after each product is sewn, capturing an image data stream of the sewn pattern;
s2: constructing identification frames for the whole product and for the pattern, and matching and recognizing the pattern and its position within the product; comparing these with parameters in a database and judging whether adjustment parameters need to be generated;
s3: matrixing the pattern identification frame, acquiring the edge areas of the pattern, pixelating them, and comparing the result with set resolution and pixel-spacing thresholds to judge whether adjustment parameters need to be generated;
s4: acquiring the adjustment parameters from step S2 and step S3, and adjusting the parameters of the corresponding sewing equipment.
Through two-stage judgment, namely pattern position judgment and edge stitch position judgment, whether the sewn product deviates from the template is determined, and corresponding adjustments are fed back, so that consistency between the product and the template is maintained during mass production. Only the parameters of the corresponding sewing equipment are adjusted, providing individualized regulation.
Preferably, the step S2 specifically includes:
s201: respectively constructing a product identification frame and a pattern identification frame according to the extreme edge points of the product and of the pattern;
s202: matching in a database according to the pattern shape in the pattern recognition frame to obtain product sewing information corresponding to the pattern;
s203: obtaining the proportion and position data of the actual pattern and the product through the proportion relation and the position relation of the pattern recognition frame and the product recognition frame;
s204: comparing the ratio and position data of the actual pattern and the product with corresponding product sewing information in a database to obtain a ratio error and a position error; judging whether the proportional error and the position error are in the corresponding error domain, if so, entering the next step, otherwise, generating the adjustment parameters.
The product sewing information comprises the relative position of the pattern with respect to the product and the proportion of the pattern to the product; the relationship between the pattern and the product is calculated from the relationship between the identification frames, and whether parameter adjustment is needed is judged from the error between the pattern and the template. The adjusted parameters are the sewing parameters that influence the position and proportion of the pattern.
Preferably, a vertical product frame is formed using the extreme edge points on the two lateral sides of the product as references; a horizontal product frame is formed using the extreme edge points on the two vertical sides of the product as references; the closed area enclosed by the vertical product frame and the horizontal product frame is the product identification frame;
a vertical pattern frame is formed using the extreme edge points on the two lateral sides of the pattern as references; a horizontal pattern frame is formed using the extreme edge points on the two vertical sides of the pattern as references; the closed area enclosed by the vertical pattern frame and the horizontal pattern frame is the pattern identification frame.
By constructing the product identification frame and the pattern identification frame, the relationship between the pattern and the product is calculated from the relationship between the frames, which keeps the calculation simple and convenient.
Preferably, the step S3 specifically includes the following steps:
s301: uniformly dividing the pattern recognition frame into a plurality of matrix areas;
s302: traversing the matrix areas and judging, for each, whether an edge of the pattern is present; if yes, entering the next step; otherwise, eliminating the corresponding matrix area;
s303: pixelating each matrix area to obtain the minimum resolution at which a single pixel contains only pattern or only background;
s304: calculating the spacing between tip pixels along the pattern edge;
s305: comparing the resolution and the tip pixel spacing with preset thresholds respectively, and entering the next step if both are within the threshold ranges; otherwise, generating corresponding adjustment parameters.
The adjustment parameters generated in this step correspond to the stitch sewing precision, making the stitches denser and reducing burrs.
Preferably, if the area ratio of the pattern to the matrix area is 95% or more, or 5% or less, the matrix area is removed.
Matrix areas that have little bearing on the stitch edges are eliminated, which reduces interference, makes the calculation more accurate and improves calculation efficiency.
Preferably, the minimum resolution is determined as follows:
a. continuously reducing the size of a single pixel at a rated proportion;
b1. when the area ratio of the pattern within a single pixel is more than 75%, filling the entire pixel as pattern;
b2. when the area ratio of the pattern within a single pixel is less than 30%, filling the entire pixel as background;
b3. when the area ratio of the pattern within a single pixel is between 30% and 75% inclusive, returning to step a to further reduce the pixel size.
The single-pixel size is reduced continually until the minimum resolution is obtained at which each pixel contains only pattern or only background.
Preferably, if the generated adjustment parameters include locked controller parameters, the difference between the corresponding parameters is calculated and compared with the modification threshold of the corresponding controller parameter; if the threshold is exceeded, an alarm is issued to the corresponding user; if not, the locked controller parameter is maintained. This prevents parameters from being tampered with arbitrarily and guarantees data security.
The beneficial effects of the invention are as follows:
1. Through two-stage judgment, namely pattern position judgment and edge stitch position judgment, whether the sewn product deviates from the template is determined, and corresponding adjustments are fed back, so that consistency between the product and the template is maintained during mass production.
2. Sewing parameters are handled in batches while feedback adjustments are applied individually, so that only the parameters of the corresponding sewing equipment are adjusted; this achieves local optimization and improves efficiency.
3. By constructing the product identification frame and the pattern identification frame, the relationship between the pattern and the product is calculated from the relationship between the frames, which keeps the calculation simple and convenient.
Drawings
FIG. 1 is a flow chart of a pattern optimization method based on edge recognition.
Detailed Description
The technical scheme of the invention is further specifically described below through examples and with reference to the accompanying drawings.
Examples:
the pattern optimization method based on edge recognition in this embodiment, as shown in fig. 1, includes the following steps:
s1: and acquiring product sewing parameters in batches, and respectively shooting and acquiring image data streams after pattern sewing after the product sewing is finished.
And the operator inputs the sewing parameters of the products in batches, and carries out controller configuration operation, locking operation of the controller parameters and/or unlocking operation of the controller parameters according to the user permission.
And calling a controller parameter configuration interface according to the controller parameter configuration instruction, and executing configuration operation of the controller parameters through the controller parameter configuration interface.
And calling a configuration interface of the locking identification bit according to a locking instruction of the locking identification bit associated with each controller parameter, and triggering the locking of the locking identification bit associated with each controller parameter through the configuration interface of the locking identification bit so as to execute the locking operation of the controller parameter.
And according to the unlocking instruction of the user, the unlocking operation of the locked controller parameter is executed by modifying the locking identification bit of the locked controller parameter.
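The locking mechanism can be pictured as a lock flag attached to each controller parameter. The following Python sketch is illustrative only; the class name, field names and permission check are assumptions made for the example and are not prescribed by the patent.

```python
from dataclasses import dataclass

@dataclass
class ControllerParameter:
    """One controller parameter together with its lock flag (the locking identification bit)."""
    name: str
    value: float
    locked: bool = False

def configure(params, name, value):
    """Controller parameter configuration: only an unlocked parameter is changed."""
    parameter = params[name]
    if not parameter.locked:
        parameter.value = value
    return parameter

def set_lock(params, name, locked, user_may_unlock=True):
    """Lock or unlock a parameter by setting its lock flag, subject to the user's permission."""
    if not locked and not user_may_unlock:
        raise PermissionError(f"user is not permitted to unlock {name!r}")
    params[name].locked = locked

# Hypothetical usage:
params = {"stitch_pitch": ControllerParameter("stitch_pitch", 2.5)}
set_lock(params, "stitch_pitch", locked=True)    # locking operation
configure(params, "stitch_pitch", 3.0)           # ignored while the parameter is locked
set_lock(params, "stitch_pitch", locked=False)   # unlocking by modifying the lock flag
configure(params, "stitch_pitch", 3.0)           # now takes effect
```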
The products produced by each sewing device are photographed by a camera to obtain the corresponding product image data streams.
S2: constructing identification frames for the whole product and for the pattern, and matching and recognizing the pattern and its position within the product; comparing these with parameters in the database and judging whether adjustment parameters need to be generated.
S201: respectively constructing a product identification frame and a pattern identification frame according to the extreme edge points of the product and of the pattern.
A vertical product frame is formed using the extreme edge points on the two lateral sides of the product as references; a horizontal product frame is formed using the extreme edge points on the two vertical sides of the product as references; the closed area enclosed by the vertical product frame and the horizontal product frame is the product identification frame.
A vertical pattern frame is formed using the extreme edge points on the two lateral sides of the pattern as references; a horizontal pattern frame is formed using the extreme edge points on the two vertical sides of the pattern as references; the closed area enclosed by the vertical pattern frame and the horizontal pattern frame is the pattern identification frame.
By constructing the product identification frame and the pattern identification frame, the relationship between the pattern and the product is calculated from the relationship between the frames, which keeps the calculation simple and convenient.
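The identification frames are, in effect, axis-aligned bounding boxes built from the extreme edge points. A minimal Python sketch is given below; it assumes the product and the pattern have already been segmented into binary masks, which is an assumption for illustration rather than a step specified by the patent.

```python
import numpy as np

def identification_frame(mask: np.ndarray):
    """Axis-aligned bounding box (x_min, y_min, x_max, y_max) of the True pixels in a binary mask.

    The leftmost and rightmost edge points bound the vertical frame, the topmost and
    bottommost edge points bound the horizontal frame; their intersection is the frame.
    Coordinates are inclusive image coordinates.
    """
    ys, xs = np.nonzero(mask)                  # coordinates of all product/pattern pixels
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Hypothetical masks: the product fills most of the image, the pattern a region inside it.
product_mask = np.zeros((200, 300), dtype=bool)
product_mask[10:190, 20:280] = True
pattern_mask = np.zeros((200, 300), dtype=bool)
pattern_mask[60:120, 100:200] = True

product_frame = identification_frame(product_mask)   # product identification frame
pattern_frame = identification_frame(pattern_mask)   # pattern identification frame
```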
S202: matching the pattern shape within the pattern identification frame against the database to obtain the product sewing information corresponding to the pattern.
The product sewing information comprises the relative position of the pattern with respect to the product and the proportion of the pattern to the product.
S203: obtaining the proportion and position data of the actual pattern relative to the product from the proportion and position relationships between the pattern identification frame and the product identification frame.
In this embodiment, a coordinate system is established with the lower left corner of the product identification frame as the origin. The ratio of the pattern to the product is obtained from the lengths and widths of the product identification frame and the pattern identification frame.
The positional relationship between the pattern and the product is obtained from the coordinates of the center point of the pattern identification frame and the center point of the product identification frame.
S204: comparing the ratio and position data of the actual pattern and product with the corresponding product sewing information in the database to obtain a ratio error and a position error.
Whether the ratio error and the position error lie within the corresponding error ranges is judged; if so, the next step is entered; otherwise, adjustment parameters are generated. The adjusted parameters are the sewing parameters that influence the position and proportion of the pattern.
In this way the relationship between the pattern and the product is calculated from the relationship between the identification frames, and whether parameter adjustment is needed is judged from the error between the pattern and the template.
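A minimal sketch of steps S203 and S204 follows; the frame tuples correspond to the hypothetical masks of the previous sketch, and the template record format and tolerance values are illustrative assumptions, not values taken from the patent.

```python
def frame_metrics(pattern_frame, product_frame):
    """Scale ratio and center offset of the pattern frame relative to the product frame (S203).

    Frames are inclusive (x_min, y_min, x_max, y_max) tuples; the offset is taken between
    frame centers, so the choice of image origin does not affect the error magnitudes.
    """
    def size(frame):
        return frame[2] - frame[0], frame[3] - frame[1]

    def center(frame):
        return (frame[0] + frame[2]) / 2.0, (frame[1] + frame[3]) / 2.0

    (pat_w, pat_h), (prod_w, prod_h) = size(pattern_frame), size(product_frame)
    ratio = (pat_w / prod_w, pat_h / prod_h)                      # proportion of pattern to product
    (pat_cx, pat_cy), (prod_cx, prod_cy) = center(pattern_frame), center(product_frame)
    position = (pat_cx - prod_cx, pat_cy - prod_cy)               # center-to-center offset
    return ratio, position

def needs_adjustment(ratio, position, template, ratio_tol=0.02, offset_tol=3.0):
    """Compare the measured ratio/position with the stored template record (S204).

    Returns True when either error leaves its error range, i.e. when adjustment
    parameters should be generated. The tolerance defaults are illustrative only.
    """
    ratio_err = max(abs(r - t) for r, t in zip(ratio, template["ratio"]))
    position_err = max(abs(p - t) for p, t in zip(position, template["position"]))
    return ratio_err > ratio_tol or position_err > offset_tol

# Frame tuples matching the hypothetical masks above, and a hypothetical template record:
ratio, position = frame_metrics((100, 60, 199, 119), (20, 10, 279, 189))
template = {"ratio": (0.40, 0.33), "position": (0.0, -15.0)}
generate_adjustment = needs_adjustment(ratio, position, template)
```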
S3: matrixing the pattern identification frame, acquiring the edge regions of the pattern, pixelating them, and comparing the result with the set resolution and pixel-spacing thresholds to judge whether adjustment parameters need to be generated.
S301: the pattern identification frame is uniformly divided into a plurality of matrix areas.
S302: the matrix areas are traversed and, for each, it is judged whether an edge of the pattern is present; if yes, the next step is entered; otherwise, the corresponding matrix area is eliminated.
If the area ratio of the pattern to the matrix area is 95% or more, or 5% or less, the matrix area is eliminated.
Matrix areas that have little bearing on the stitch edges are eliminated, which reduces interference, makes the calculation more accurate and improves calculation efficiency.
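A minimal sketch of steps S301 and S302 under the same assumptions as above (a binary pattern mask and an inclusive frame tuple); the 8x8 grid size is an illustrative choice.

```python
import numpy as np

def edge_cells(pattern_mask, frame, grid=(8, 8)):
    """Split the pattern identification frame into grid cells and keep only edge cells (S301-S302).

    A cell whose pattern coverage is 95% or more (almost all pattern) or 5% or less
    (almost all background) is eliminated; the remaining cells straddle the stitch edge.
    `frame` is an inclusive (x_min, y_min, x_max, y_max) tuple.
    """
    x0, y0, x1, y1 = frame
    region = pattern_mask[y0:y1 + 1, x0:x1 + 1]
    row_blocks = np.array_split(np.arange(region.shape[0]), grid[0])
    col_blocks = np.array_split(np.arange(region.shape[1]), grid[1])

    kept = []
    for rows in row_blocks:
        for cols in col_blocks:
            cell = region[np.ix_(rows, cols)]
            coverage = cell.mean()                     # fraction of pattern pixels in the cell
            if 0.05 < coverage < 0.95:
                kept.append((y0 + int(rows[0]), x0 + int(cols[0]), cell))
    return kept
```

For a solid test rectangle every cell comes out pure and is rejected; for a real stitched pattern the kept cells are exactly those crossed by the stitch edge.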
S303: each matrix area is pixelated separately to obtain the minimum resolution at which a single pixel contains only pattern or only background.
The minimum resolution is determined as follows:
a. The size of a single pixel is continually reduced at a rated proportion. In this embodiment, a bisection (halving) method is used.
b1. When the area ratio of the pattern within a single pixel is more than 75%, the entire pixel is filled as pattern.
b2. When the area ratio of the pattern within a single pixel is less than 30%, the entire pixel is filled as background.
b3. When the area ratio of the pattern within a single pixel is between 30% and 75% inclusive, the procedure returns to step a to further reduce the pixel size.
The single-pixel size is reduced continually until the minimum resolution is obtained at which each pixel contains only pattern or only background.
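A minimal sketch of the pixel-size reduction in step S303; the starting super-pixel size and the halving step are illustrative assumptions consistent with the bisection described above.

```python
import numpy as np

def minimum_resolution(cell: np.ndarray, start_size=16, min_size=1):
    """Coarsest super-pixel size at which every super-pixel is purely pattern or background (S303).

    Starting from `start_size`, the super-pixel size is halved (step a) whenever some
    super-pixel has a pattern coverage between 30% and 75% inclusive (step b3); super-pixels
    above 75% count as pattern (b1) and below 30% as background (b2).
    """
    size = start_size
    while size >= min_size:
        ambiguous = False
        for y in range(0, cell.shape[0], size):
            for x in range(0, cell.shape[1], size):
                coverage = cell[y:y + size, x:x + size].mean()
                if 0.30 <= coverage <= 0.75:
                    ambiguous = True            # this super-pixel is a mix: resolution too coarse
                    break
            if ambiguous:
                break
        if not ambiguous:
            return size                         # every super-pixel is only pattern or only background
        size //= 2                              # step a: keep reducing the pixel size
    return min_size
```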
S304: the spacing between tip pixels along the pattern edge is calculated. The edge of the pattern is identified, the stitch tips on the edge are extracted as tip pixels, and the spacing between adjacent tip pixels is calculated.
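A minimal sketch of step S304; how the tip pixels are detected on the edge (for example, as local extrema of the edge contour) is not prescribed here, so the sketch simply takes an ordered list of tip coordinates as input.

```python
import numpy as np

def tip_spacings(tips):
    """Distances between adjacent stitch-tip pixels along the pattern edge (S304)."""
    points = np.asarray(tips, dtype=float)
    return np.linalg.norm(np.diff(points, axis=0), axis=1)

# Hypothetical tip pixels along one edge; each spacing is then compared with the preset threshold.
spacings = tip_spacings([(10, 4), (10, 9), (11, 15), (10, 21)])
```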
S305: the resolution and the tip pixel spacing are compared with their preset thresholds respectively; if both are within the threshold ranges, the next step is entered; otherwise, corresponding adjustment parameters are generated.
The adjustment parameters generated in this step correspond to the stitch sewing precision, making the stitches denser and reducing burrs.
S4: acquiring the adjustment parameters from step S2 and step S3, and adjusting the parameters of the corresponding sewing equipment.
If the generated adjustment parameters include locked controller parameters, the difference between the corresponding parameters is calculated and compared with the modification threshold of the corresponding controller parameter; if the threshold is exceeded, an alarm is issued to the corresponding user; if not, the locked controller parameter is maintained. This prevents parameters from being tampered with arbitrarily and guarantees data security.
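A minimal sketch of this locked-parameter check; the dictionary-based representation of parameters, locks and thresholds is an assumption made for the example.

```python
def apply_adjustments(current, proposed, locked, thresholds):
    """Apply adjustment parameters while respecting locked controller parameters (S4).

    For a locked parameter, the difference between the proposed and current value is
    compared with that parameter's modification threshold: above the threshold an alarm
    is raised for the user; otherwise the locked value is simply maintained.
    """
    alarms = []
    updated = dict(current)
    for name, new_value in proposed.items():
        if name in locked:
            if abs(new_value - current[name]) > thresholds[name]:
                alarms.append(name)          # alert the user: large change requested on a locked parameter
            # the locked parameter itself is kept unchanged
        else:
            updated[name] = new_value        # unlocked parameters are adjusted directly
    return updated, alarms

# Hypothetical usage:
current = {"stitch_pitch": 2.5, "feed_speed": 100.0}
proposed = {"stitch_pitch": 3.2, "feed_speed": 95.0}
updated, alarms = apply_adjustments(current, proposed,
                                    locked={"stitch_pitch"},
                                    thresholds={"stitch_pitch": 0.5})
```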
According to the scheme of this embodiment, whether the sewn product deviates from the template is determined through two-stage judgment, namely pattern position judgment and edge stitch position judgment, and corresponding adjustments are fed back, so that consistency between the product and the template is maintained during mass production. Only the parameters of the corresponding sewing equipment are adjusted, providing individualized regulation.
It should be understood that the examples are only for illustrating the present invention and are not intended to limit the scope of the present invention. Further, it is understood that various changes and modifications may be made by those skilled in the art after reading the teachings of the present invention, and such equivalents are intended to fall within the scope of the claims appended hereto.

Claims (5)

1. The pattern optimization method based on edge recognition is characterized by comprising the following steps:
s1: acquiring product sewing parameters in batches and, after each product is sewn, capturing an image data stream of the sewn pattern;
s2: constructing identification frames for the whole product and for the pattern, and matching and recognizing the pattern and its position within the product; comparing these with parameters in a database and judging whether adjustment parameters need to be generated;
s3: matrixing the pattern identification frame, acquiring the edge areas of the pattern, pixelating them, and comparing the result with set resolution and pixel-spacing thresholds to judge whether adjustment parameters need to be generated;
s4: acquiring the adjustment parameters from step S2 and step S3, and adjusting the parameters of the corresponding sewing equipment;
the step S2 specifically includes:
s201: respectively constructing a product identification frame and a pattern identification frame according to the extreme edge points of the product and of the pattern;
s202: matching in a database according to the pattern shape in the pattern recognition frame to obtain product sewing information corresponding to the pattern;
s203: obtaining the proportion and position data of the actual pattern and the product through the proportion relation and the position relation of the pattern recognition frame and the product recognition frame;
s204: comparing the ratio and position data of the actual pattern and the product with corresponding product sewing information in a database to obtain a ratio error and a position error; judging whether the proportional error and the position error are in the corresponding error domain, if so, entering the next step, otherwise, generating an adjustment parameter;
a vertical product frame is formed using the extreme edge points on the two lateral sides of the product as references; a horizontal product frame is formed using the extreme edge points on the two vertical sides of the product as references; the closed area enclosed by the vertical product frame and the horizontal product frame is the product identification frame;
a vertical pattern frame is formed using the extreme edge points on the two lateral sides of the pattern as references; a horizontal pattern frame is formed using the extreme edge points on the two vertical sides of the pattern as references; the closed area enclosed by the vertical pattern frame and the horizontal pattern frame is the pattern identification frame.
2. The pattern optimization method based on edge recognition according to claim 1, wherein the step S3 specifically comprises the following steps:
s301: uniformly dividing the pattern recognition frame into a plurality of matrix areas;
s302: traversing the matrix areas and judging, for each, whether an edge of the pattern is present; if yes, entering the next step; otherwise, eliminating the corresponding matrix area;
s303: pixelating each matrix area to obtain the minimum resolution at which a single pixel contains only pattern or only background;
s304: calculating the distance between the tip pixels of the edge of the pattern;
s305: comparing the resolution and the pixel spacing of the tip with preset thresholds respectively, and entering the next step if the resolution and the pixel spacing of the tip are both within the threshold range; otherwise, generating corresponding adjustment parameters.
3. The pattern optimization method based on edge recognition according to claim 2, wherein the matrix area is rejected if the area ratio of the pattern to the matrix area is 95% or more or 5% or less.
4. The pattern optimization method based on edge recognition according to claim 2, wherein the minimum resolution determination process is:
a. continuously reducing the size of a single pixel at a rated proportion;
b1. filling the entire pixel as pattern when the area ratio of the pattern within a single pixel is more than 75%;
b2. filling the entire pixel as background when the area ratio of the pattern within a single pixel is less than 30%;
b3. returning to step a to further reduce the pixel size when the area ratio of the pattern within a single pixel is between 30% and 75% inclusive.
5. The pattern optimization method based on edge recognition according to claim 1, wherein, if the generated adjustment parameters include locked controller parameters, the difference between the corresponding parameters is calculated and it is judged whether the difference exceeds the modification threshold of the corresponding controller parameter; if yes, an alarm is provided to the corresponding user; if not, the locked controller parameter is maintained.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210237014.XA 2022-03-11 2022-03-11 Pattern optimization method based on edge recognition (granted as CN114657712B)

Publications (2)

Publication Number Publication Date
CN114657712A (en) 2022-06-24
CN114657712B (en) 2023-08-04

Family

ID=82028882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210237014.XA 2022-03-11 2022-03-11 Pattern optimization method based on edge recognition (Active, granted as CN114657712B)

Country Status (1)

Country Link
CN (1) CN114657712B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104711785A (en) * 2015-04-01 2015-06-17 华中科技大学 Computerized pattern forming machine based on visual control
CN105447847A (en) * 2014-09-24 2016-03-30 Juki株式会社 Form detection means and sewing machine
CN206783926U (en) * 2017-03-20 2017-12-22 启翔股份有限公司 The vision alignment device of sewing machine
TWI646233B (en) * 2017-08-02 2019-01-01 伸興工業股份有限公司 Appliqué method based on image recognition
CN109355812A (en) * 2018-04-09 2019-02-19 深圳市诺德机器人有限公司 A kind of vision positioning automatic sewing system and method for sewing
CN110136155A (en) * 2019-05-19 2019-08-16 绵阳逢研科技有限公司 A kind of pattern edge track intelligent extract method and its application
CN111705434A (en) * 2020-06-19 2020-09-25 珠海运控瑞奇数控科技有限公司 Sewing method for intelligently and adaptively adjusting sewing patterns
CN113123022A (en) * 2021-04-16 2021-07-16 上海威士机械有限公司 Cuff sewing device and method based on visual detection processing

Also Published As

Publication number Publication date
CN114657712A (en) 2022-06-24

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant