CN114657712A - Pattern and pattern optimization method based on edge recognition - Google Patents

Pattern and pattern optimization method based on edge recognition

Info

Publication number
CN114657712A
CN114657712A
Authority
CN
China
Prior art keywords
pattern
product
frame
parameters
sewing
Prior art date
Legal status
Granted
Application number
CN202210237014.XA
Other languages
Chinese (zh)
Other versions
CN114657712B (en)
Inventor
崔林涛
潘建国
徐仙国
王桔芬
Current Assignee
Jack Technology Co Ltd
Original Assignee
Jack Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jack Technology Co Ltd filed Critical Jack Technology Co Ltd
Priority to CN202210237014.XA
Publication of CN114657712A
Application granted
Publication of CN114657712B
Active
Anticipated expiration

Classifications

    • D - TEXTILES; PAPER
    • D05 - SEWING; EMBROIDERING; TUFTING
    • D05B - SEWING
    • D05B 19/00 - Programme-controlled sewing machines
    • D05B 19/02 - Sewing machines having electronic memory or microprocessor control unit
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Textile Engineering (AREA)
  • Sewing Machines And Sewing (AREA)

Abstract

The invention discloses a pattern optimization method based on edge recognition. It addresses the problem that the open-loop or semi-closed-loop control of the prior art is unsuitable for large-batch pattern sewing and is prone to product pattern deviation. The method comprises the following steps. S1: obtaining product sewing parameters in batches and, after each product is sewn, capturing an image data stream of the sewn pattern. S2: establishing identification frames for the whole product and for the pattern, calculating their proportion and position relationship, comparing it with the parameters in a database, and judging whether adjustment parameters need to be generated. S3: matrixing the pattern identification frame, acquiring the edge areas of the pattern, pixelating them, comparing the result with a set resolution and pixel-spacing threshold, and judging whether adjustment parameters need to be generated. S4: adjusting the parameters of the corresponding sewing equipment. Two-stage judgment, feedback and corresponding adjustment keep the products consistent with the template during mass production.

Description

Pattern and pattern optimization method based on edge recognition
Technical Field
The invention relates to the field of parameter optimization of sewing machines, in particular to a pattern and pattern optimization method based on edge recognition.
Background
Current automatic sewing machines generally form a predetermined stitch pattern by moving the cloth in the X and Y directions according to sewing pattern data while forming stitches. The pulse motors that drive this motion mostly run under open-loop or semi-closed-loop control.
However, an automatic sewing machine using open-loop or semi-closed-loop control easily produces inconsistent sewing patterns during mass production. Hardware factors such as equipment wear can also shift the needle positions, so that the pattern positions on products of the same batch are not uniform and deviate from one another.
For example, the "automatic sewing machine" disclosed in Chinese patent publication No. CN101100789B has a sewing needle; a positioning unit with an operation-amount detection unit that detects the operation amount of the pulse motor moving the cloth holding member to an arbitrary position; a needle position detection unit that detects the needle position in the vertical direction; a pattern storage unit that stores sewing pattern data; an operation control means that outputs command pulses to the pulse motor one pulse at a time while maintaining a predetermined deviation between the command pulses and the actual operation; and a required-time storage unit that stores the time required to output the command pulses. The operation control means refers to this required time and times the start of the command-pulse output so that the output is completed when the sewing needle reaches the object being sewn on the needle plate.
This scheme is not suitable for large-batch pattern sewing, and pattern deviation readily occurs among products of the same batch.
Disclosure of Invention
The invention mainly solves the problem that the open-loop or semi-closed-loop control of the prior art is unsuitable for large-batch pattern sewing and is prone to product pattern deviation, and provides a pattern optimization method based on edge recognition.
The technical problem of the invention is mainly solved by the following technical scheme:
A pattern optimization method based on edge recognition is characterized by comprising the following steps:
S1: obtaining product sewing parameters in batches and, after each product is sewn, capturing an image data stream of the sewn pattern;
S2: establishing identification frames for the whole product and for the pattern, and matching and identifying the pattern and its position within the product; comparing them with the parameters in a database and judging whether adjustment parameters need to be generated;
S3: matrixing the pattern identification frame, acquiring the edge areas of the pattern, pixelating them, comparing the result with a set resolution and pixel-spacing threshold, and judging whether adjustment parameters need to be generated;
S4: acquiring the adjustment parameters from steps S2 and S3, and adjusting the parameters of the corresponding sewing equipment.
Two-stage judgment, first of the pattern position and then of the edge stitch position, determines whether a sewn product deviates from the template; the result is fed back and the corresponding adjustment is made, which keeps products consistent with the template during mass production. Only the parameters of the corresponding sewing equipment are adjusted, so the adjustment is individualized.
Preferably, the step S2 specifically includes:
S201: respectively constructing a product identification frame and a pattern identification frame from the outermost edge points of the product and of the pattern;
S202: matching the shape of the pattern within the pattern identification frame against a database to obtain the product sewing information corresponding to the pattern;
S203: obtaining the actual pattern-to-product proportion and position data from the proportion and position relationship between the pattern identification frame and the product identification frame;
S204: comparing the actual proportion and position data with the corresponding product sewing information in the database to obtain a proportion error and a position error; judging whether the proportion error and the position error lie within their error domains; if so, entering the next step, otherwise generating an adjustment parameter.
The product sewing information comprises the relative position of the pattern within the product and the proportion between the pattern and the product. The relationship between the pattern and the product is calculated from the relationship between the identification frames, and the error relative to the template determines whether the parameters need to be adjusted. The adjusted parameters are the sewing parameters that influence the pattern's position and proportion.
Preferably, the vertical edges of the product frame are set by the outermost points on the two transverse sides of the product, and the transverse edges by the outermost points on the two vertical sides; the closed area enclosed by the vertical and transverse product frame edges is the product identification frame.
Likewise, the vertical edges of the pattern frame are set by the outermost points on the two transverse sides of the pattern, and the transverse edges by the outermost points on the two vertical sides; the closed area they enclose is the pattern identification frame.
Constructing the product identification frame and the pattern identification frame and calculating the pattern-product relationship from the relationship between the frames makes the calculation simpler and more convenient.
Preferably, the step S3 specifically includes the following steps:
S301: uniformly dividing the pattern identification frame into a plurality of matrix areas;
S302: traversing the matrix areas and judging, for each area, whether it contains an edge of the pattern; if so, entering the next step; otherwise discarding that matrix area;
S303: pixelating each remaining matrix area to obtain the minimum resolution at which every single pixel contains only pattern or only background;
S304: calculating the spacing between the tip pixels along the pattern edge;
S305: comparing the resolution and the tip-pixel spacing with their preset thresholds; if both are within the thresholds, entering the next step; otherwise generating the corresponding adjustment parameters.
The adjustment parameters generated in this step correspond to the stitch sewing precision, making the stitches denser and reducing burrs.
Preferably, a matrix area is excluded when the ratio of the pattern area to the matrix area is 95% or more, or 5% or less.
Matrix areas that have little influence on the stitch edge are removed, which reduces interference, makes the calculation more accurate and improves calculation efficiency.
Preferably, the minimum resolution is determined by:
a. the size of a single pixel is reduced continuously by a rated proportion;
b1. when the pattern covers more than 75% of a single pixel's area, the pixel is filled entirely as pattern;
b2. when the pattern covers less than 30% of a single pixel's area, the pixel is filled entirely as background;
b3. when the pattern covers between 30% and 75% (inclusive) of a single pixel's area, return to step a and continue reducing the pixel size.
The single-pixel size is reduced until the minimum resolution is reached at which every single pixel is only pattern or only background.
Preferably, if the generated adjustment parameters include locked controller parameters, the difference between the corresponding parameters is computed and compared against the modification threshold of the locked controller parameter; if the threshold is exceeded, an alarm is issued to the corresponding user; if not, the locked controller parameter is maintained. This prevents parameters from being tampered with arbitrarily and keeps the data secure.
The beneficial effects of the invention are:
1. Two-stage judgment of the pattern position and of the edge stitch position determines whether a sewn product deviates from the template; the result is fed back and the corresponding adjustment is made, keeping products consistent with the template during mass production.
2. Sewing parameters are issued in batches while feedback data are applied individually: only the parameters of the corresponding sewing equipment are adjusted, which localizes the optimization and improves efficiency.
3. A product identification frame and a pattern identification frame are constructed, and the pattern-product relationship is calculated from the relationship between the frames, which makes the calculation simpler and more convenient.
Drawings
Fig. 1 is a flowchart of a pattern optimization method based on edge recognition according to the present invention.
Detailed Description
The technical scheme of the invention is further specifically described by the following embodiments and the accompanying drawings.
The embodiment is as follows:
The pattern optimization method based on edge recognition of this embodiment, as shown in Fig. 1, includes the following steps:
S1: product sewing parameters are acquired in batches, and after each product is sewn, an image data stream of the sewn pattern is captured.
The operator inputs the product sewing parameters in batches and performs controller configuration, controller parameter locking and/or controller parameter unlocking operations according to the user's authority.
According to a controller parameter configuration instruction, the controller parameter configuration interface is called and the configuration of the controller parameters is executed through it.
According to a lock instruction for the lock identification bit associated with each controller parameter, the configuration interface of the lock identification bit is called and the lock identification bit associated with that controller parameter is set, thereby executing the lock operation on the controller parameter.
According to a user unlock instruction, the unlock operation on a locked controller parameter is executed by modifying its lock identification bit.
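For illustration only, the following Python sketch shows one possible shape of this lock mechanism. The names (ControllerParameter, configure, lock, unlock) and the permission flag are assumptions for the sketch; the patent only specifies that each controller parameter carries a lock identification bit that is set or cleared through the configuration interface according to the user's authority.
```python
# Illustrative sketch of the lock-identification-bit mechanism described above.
# All names here are assumptions; the patent does not define a concrete API.
from dataclasses import dataclass

@dataclass
class ControllerParameter:
    name: str
    value: float
    locked: bool = False  # the lock identification bit associated with this parameter

def configure(param: ControllerParameter, new_value: float) -> bool:
    """Configuration interface: write a parameter value unless its lock bit is set."""
    if param.locked:
        return False          # locked parameters are not overwritten here
    param.value = new_value
    return True

def lock(param: ControllerParameter) -> None:
    """Lock operation: set the lock identification bit."""
    param.locked = True

def unlock(param: ControllerParameter, user_has_authority: bool) -> None:
    """Unlock operation: clear the lock identification bit, subject to user authority."""
    if user_has_authority:
        param.locked = False
```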
The products produced by each sewing device are photographed by a camera to obtain the corresponding product image data streams.
S2: identification frames are established for the whole product and for the pattern, and the pattern and its position within the product are matched and identified; these are compared with the parameters in the database to judge whether adjustment parameters need to be generated.
S201: a product identification frame and a pattern identification frame are constructed from the outermost edge points of the product and of the pattern, respectively.
The vertical edges of the product frame are set by the outermost points on the two transverse sides of the product, and the transverse edges by the outermost points on the two vertical sides; the closed area they enclose is the product identification frame.
Likewise, the vertical edges of the pattern frame are set by the outermost points on the two transverse sides of the pattern, and the transverse edges by the outermost points on the two vertical sides; the closed area they enclose is the pattern identification frame.
Constructing the two identification frames and calculating the pattern-product relationship from the relationship between the frames makes the calculation simpler and more convenient.
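In practice each identification frame reduces to an axis-aligned bounding box over the detected edge points. The minimal sketch below assumes the edge points have already been extracted from the image as (x, y) pixel coordinates; the function name is illustrative, not defined by the patent.
```python
# Sketch of S201: an identification frame as the axis-aligned bounding box of edge points.
from typing import Iterable, Tuple

Frame = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def build_identification_frame(edge_points: Iterable[Tuple[float, float]]) -> Frame:
    pts = list(edge_points)
    xs = [x for x, _ in pts]
    ys = [y for _, y in pts]
    # Vertical frame edges come from the outermost points in the transverse (x) direction,
    # transverse frame edges from the outermost points in the vertical (y) direction.
    return min(xs), min(ys), max(xs), max(ys)

# product_frame = build_identification_frame(product_edge_points)
# pattern_frame = build_identification_frame(pattern_edge_points)
```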
S202: the shape of the pattern within the pattern identification frame is matched against a database to obtain the product sewing information corresponding to that pattern.
The product sewing information comprises the relative position of the pattern within the product and the proportion between the pattern and the product.
S203: the actual pattern-to-product proportion and position data are obtained from the proportion and position relationship between the pattern identification frame and the product identification frame.
In this embodiment, a coordinate system is established with the lower left corner of the product identification frame as the origin. The pattern-to-product proportion is obtained from the lengths and widths of the product identification frame and the pattern identification frame.
The position relationship between the pattern and the product is obtained from the coordinates of the center points of the pattern identification frame and the product identification frame.
S204: the actual proportion and position data are compared with the corresponding product sewing information in the database to obtain a proportion error and a position error.
If the proportion error and the position error lie within their error domains, the method enters the next step; otherwise an adjustment parameter is generated. The adjusted parameters are the sewing parameters that influence the pattern's position and proportion.
The pattern-product relationship is calculated from the relationship between the identification frames, and the error relative to the template determines whether the parameters need to be adjusted.
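A minimal sketch of S203 and S204 under the same assumptions as above: the frames are the bounding boxes from S201, the template record is assumed to store the expected width/height ratios and the expected pattern-center offset from the product frame origin, and the error domains are passed in as tolerances. All field and parameter names are illustrative.
```python
# Sketch of S203-S204: proportion and position data derived from the two frames
# and compared against template values from the database.
def frame_size(frame):
    x_min, y_min, x_max, y_max = frame
    return x_max - x_min, y_max - y_min

def frame_center(frame):
    x_min, y_min, x_max, y_max = frame
    return (x_min + x_max) / 2.0, (y_min + y_max) / 2.0

def ratio_and_position_ok(pattern_frame, product_frame, template, ratio_tol, position_tol):
    """True if both errors lie inside their error domains; False means an
    adjustment parameter should be generated."""
    pw, ph = frame_size(pattern_frame)
    qw, qh = frame_size(product_frame)
    ratio = (pw / qw, ph / qh)                       # pattern-to-product proportion
    # Pattern center expressed in a coordinate system whose origin is the
    # lower-left corner of the product identification frame.
    cx, cy = frame_center(pattern_frame)
    ox, oy, _, _ = product_frame
    position = (cx - ox, cy - oy)

    ratio_err = max(abs(ratio[0] - template["ratio"][0]),
                    abs(ratio[1] - template["ratio"][1]))
    position_err = max(abs(position[0] - template["position"][0]),
                       abs(position[1] - template["position"][1]))
    return ratio_err <= ratio_tol and position_err <= position_tol
```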
S3: the pattern identification frame is matrixed, the edge areas of the pattern are acquired and pixelated, the result is compared with the set resolution and pixel-spacing threshold, and whether adjustment parameters need to be generated is judged.
S301: the pattern identification frame is uniformly divided into a plurality of matrix areas.
S302: the matrix areas are traversed and each is checked for whether it contains an edge of the pattern; if so, the method enters the next step; otherwise that matrix area is discarded.
A matrix area is excluded if the ratio of the pattern area to the matrix area is 95% or more, or 5% or less.
Matrix areas that have little influence on the stitch edge are removed, which reduces interference, makes the calculation more accurate and improves calculation efficiency.
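For illustration, S301 and S302 can be sketched on a binary mask of the pattern cropped to the pattern identification frame (1 = pattern, 0 = background). The grid dimensions and the 5%/95% exclusion bounds follow the text; the NumPy mask representation and the function name are assumptions.
```python
# Sketch of S301-S302: divide the pattern identification frame into a uniform grid
# and keep only the matrix areas that actually contain a pattern edge.
import numpy as np

def edge_matrix_areas(pattern_mask: np.ndarray, rows: int, cols: int):
    h, w = pattern_mask.shape
    kept = []
    for r in range(rows):
        for c in range(cols):
            cell = pattern_mask[r * h // rows:(r + 1) * h // rows,
                                c * w // cols:(c + 1) * w // cols]
            fill = cell.mean()  # fraction of the matrix area covered by the pattern
            # Areas that are almost all pattern (>= 95%) or almost all background
            # (<= 5%) contain no usable edge and are discarded.
            if 0.05 < fill < 0.95:
                kept.append((r, c, cell))
    return kept
```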
S303: each remaining matrix area is pixelated, and the minimum resolution at which every single pixel contains only pattern or only background is obtained.
The minimum resolution is determined by the following process:
a. The size of a single pixel is reduced continuously by a rated proportion. In this embodiment a minimum dichotomy is employed, i.e. the pixel size is halved at each reduction.
b1. When the pattern covers more than 75% of a single pixel's area, the pixel is filled entirely as pattern.
b2. When the pattern covers less than 30% of a single pixel's area, the pixel is filled entirely as background.
b3. When the pattern covers between 30% and 75% (inclusive) of a single pixel's area, return to step a and continue reducing the pixel size.
The single-pixel size is reduced until the minimum resolution is reached at which every single pixel is only pattern or only background.
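A sketch of steps a and b1 to b3, assuming each matrix area is available as a binary mask and that the rated proportion is the halving suggested by the dichotomy; the starting pixel size and the function name are illustrative assumptions.
```python
# Sketch of S303: shrink the sampling pixel until every pixel can be filled as pure
# pattern (> 75% coverage) or pure background (< 30% coverage). A coverage between
# 30% and 75% forces a further halving of the pixel size (steps a and b3).
import numpy as np

def minimum_resolution(area_mask: np.ndarray, initial_pixel_size: int) -> int:
    """Return the largest pixel size (i.e. the minimum resolution) at which
    every single pixel is only pattern or only background."""
    pixel = initial_pixel_size
    while pixel >= 1:
        ambiguous = False
        h, w = area_mask.shape
        for y in range(0, h, pixel):
            for x in range(0, w, pixel):
                coverage = area_mask[y:y + pixel, x:x + pixel].mean()
                if 0.30 <= coverage <= 0.75:
                    ambiguous = True      # neither pure pattern nor pure background
                    break
            if ambiguous:
                break
        if not ambiguous:
            return pixel                  # b1/b2 apply to every pixel: stop here
        pixel //= 2                       # step a: reduce the pixel size and retry
    return 1
```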
S304: the spacing between the tip pixels along the pattern edge is calculated. The edge of the pattern is identified, the stitch tips along the edge are acquired as tip pixels, and the spacing between adjacent tip pixels is calculated.
S305: the resolution and the tip-pixel spacing are compared with their preset thresholds; if both are within the thresholds, the method enters the next step; otherwise the corresponding adjustment parameters are generated.
The adjustment parameters generated in this step correspond to the stitch sewing precision, making the stitches denser and reducing burrs.
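A sketch of S304 and S305, assuming the tip pixels have already been extracted and are listed in the order they occur along the stitch edge (the patent does not fix the extraction method); the threshold names and the comparison direction for the resolution are assumptions.
```python
# Sketch of S304-S305: spacing between adjacent tip pixels along the edge, and the
# two-threshold check that decides whether adjustment parameters are generated.
import math

def tip_spacings(tip_pixels):
    """Euclidean distance between each pair of adjacent tip pixels (ordered list)."""
    return [math.dist(a, b) for a, b in zip(tip_pixels, tip_pixels[1:])]

def stitch_precision_ok(min_resolution, tip_pixels, resolution_threshold, pitch_threshold):
    """True if both quantities are inside their threshold ranges, so no adjustment
    parameter is needed; the direction of the resolution comparison is an assumption."""
    spacings = tip_spacings(tip_pixels)
    worst_pitch = max(spacings) if spacings else 0.0
    return min_resolution <= resolution_threshold and worst_pitch <= pitch_threshold
```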
S4: the adjustment parameters from steps S2 and S3 are acquired, and the parameters of the corresponding sewing equipment are adjusted.
If the generated adjustment parameters include locked controller parameters, the difference between the corresponding parameters is computed and compared against the modification threshold of the locked controller parameter; if the threshold is exceeded, an alarm is issued to the corresponding user; if not, the locked controller parameter is maintained. This prevents parameters from being tampered with arbitrarily and keeps the data secure.
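A sketch of this locked-parameter guard; the parameter names, the lock set, the thresholds and the alarm callback are illustrative assumptions, consistent with the lock sketch given under step S1.
```python
# Sketch of the S4 guard: locked controller parameters are never modified automatically.
# If the requested change exceeds the modification threshold an alarm is raised for the
# user; otherwise the locked value is simply maintained. All names are assumptions.
def apply_adjustment(current, adjustment, locked, thresholds, alarm):
    updated = dict(current)
    for name, new_value in adjustment.items():
        if name in locked:
            if abs(new_value - current[name]) > thresholds[name]:
                alarm(f"Locked parameter '{name}' requires a change beyond its modification threshold")
            continue  # the locked value is kept either way
        updated[name] = new_value
    return updated

# Example: apply_adjustment({"stitch_pitch": 2.0}, {"stitch_pitch": 2.6},
#                           locked={"stitch_pitch"}, thresholds={"stitch_pitch": 0.5},
#                           alarm=print)
```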
According to the scheme of this embodiment, two-stage judgment of the pattern position and of the edge stitch position determines whether a sewn product deviates from the template; the result is fed back and the corresponding adjustment is made, keeping products consistent with the template during mass production. Only the parameters of the corresponding sewing equipment are adjusted, so the adjustment is individualized.
It should be understood that the examples are for illustrative purposes only and are not intended to limit the scope of the present invention. Further, it should be understood that various changes or modifications of the present invention may be made by those skilled in the art after reading the teaching of the present invention, and such equivalents may fall within the scope of the present invention as defined in the appended claims.

Claims (7)

1. A pattern optimization method based on edge recognition, characterized by comprising the following steps:
S1: obtaining product sewing parameters in batches and, after each product is sewn, capturing an image data stream of the sewn pattern;
S2: establishing identification frames for the whole product and for the pattern, and matching and identifying the pattern and its position within the product; comparing them with the parameters in a database and judging whether adjustment parameters need to be generated;
S3: matrixing the pattern identification frame, acquiring the edge areas of the pattern, pixelating them, comparing the result with a set resolution and pixel-spacing threshold, and judging whether adjustment parameters need to be generated;
S4: acquiring the adjustment parameters from steps S2 and S3, and adjusting the parameters of the corresponding sewing equipment.
2. The pattern optimization method based on edge recognition as claimed in claim 1, wherein step S2 specifically comprises:
S201: respectively constructing a product identification frame and a pattern identification frame from the outermost edge points of the product and of the pattern;
S202: matching the shape of the pattern within the pattern identification frame against a database to obtain the product sewing information corresponding to the pattern;
S203: obtaining the actual pattern-to-product proportion and position data from the proportion and position relationship between the pattern identification frame and the product identification frame;
S204: comparing the actual proportion and position data with the corresponding product sewing information in the database to obtain a proportion error and a position error; judging whether the proportion error and the position error lie within their error domains; if so, entering the next step, otherwise generating an adjustment parameter.
3. The pattern optimization method based on edge recognition as claimed in claim 2, wherein the vertical edges of the product frame are set by the outermost points on the two transverse sides of the product, and the transverse edges by the outermost points on the two vertical sides; the closed area enclosed by the vertical and transverse product frame edges is the product identification frame;
the vertical edges of the pattern frame are set by the outermost points on the two transverse sides of the pattern, and the transverse edges by the outermost points on the two vertical sides; the closed area they enclose is the pattern identification frame.
4. The pattern optimization method based on edge recognition as claimed in claim 1, 2 or 3, wherein step S3 specifically comprises the following steps:
S301: uniformly dividing the pattern identification frame into a plurality of matrix areas;
S302: traversing the matrix areas and judging, for each area, whether it contains an edge of the pattern; if so, entering the next step; otherwise discarding that matrix area;
S303: pixelating each remaining matrix area to obtain the minimum resolution at which every single pixel contains only pattern or only background;
S304: calculating the spacing between the tip pixels along the pattern edge;
S305: comparing the resolution and the tip-pixel spacing with their preset thresholds; if both are within the thresholds, entering the next step; otherwise generating the corresponding adjustment parameters.
5. The pattern optimization method based on edge recognition as claimed in claim 4, wherein a matrix area is excluded if the ratio of the pattern area to the matrix area is 95% or more, or 5% or less.
6. The pattern optimization method based on edge recognition as claimed in claim 4, wherein the minimum resolution is determined by:
a. the size of a single pixel is reduced continuously by a rated proportion;
b1. when the pattern covers more than 75% of a single pixel's area, the pixel is filled entirely as pattern;
b2. when the pattern covers less than 30% of a single pixel's area, the pixel is filled entirely as background;
b3. when the pattern covers between 30% and 75% (inclusive) of a single pixel's area, return to step a and continue reducing the pixel size.
7. The pattern optimization method based on edge recognition as claimed in claim 1, wherein if the generated adjustment parameters include locked controller parameters, the difference between the corresponding parameters is computed and compared against the modification threshold of the locked controller parameter; if the threshold is exceeded, an alarm is issued to the corresponding user; if not, the locked controller parameter is maintained.
CN202210237014.XA 2022-03-11 2022-03-11 Pattern pattern optimization method based on edge recognition Active CN114657712B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210237014.XA CN114657712B (en) 2022-03-11 2022-03-11 Pattern pattern optimization method based on edge recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210237014.XA CN114657712B (en) 2022-03-11 2022-03-11 Pattern pattern optimization method based on edge recognition

Publications (2)

Publication Number Publication Date
CN114657712A 2022-06-24
CN114657712B CN114657712B (en) 2023-08-04

Family

ID=82028882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210237014.XA Active CN114657712B (en) 2022-03-11 2022-03-11 Pattern pattern optimization method based on edge recognition

Country Status (1)

Country Link
CN (1) CN114657712B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447847A (en) * 2014-09-24 2016-03-30 Juki株式会社 Form detection means and sewing machine
CN104711785A (en) * 2015-04-01 2015-06-17 华中科技大学 Computerized pattern forming machine based on visual control
CN206783926U (en) * 2017-03-20 2017-12-22 启翔股份有限公司 The vision alignment device of sewing machine
TWI646233B (en) * 2017-08-02 2019-01-01 伸興工業股份有限公司 Appliqué method based on image recognition
CN109355812A (en) * 2018-04-09 2019-02-19 深圳市诺德机器人有限公司 A kind of vision positioning automatic sewing system and method for sewing
CN110136155A (en) * 2019-05-19 2019-08-16 绵阳逢研科技有限公司 A kind of pattern edge track intelligent extract method and its application
CN111705434A (en) * 2020-06-19 2020-09-25 珠海运控瑞奇数控科技有限公司 Sewing method for intelligently and adaptively adjusting sewing patterns
CN113123022A (en) * 2021-04-16 2021-07-16 上海威士机械有限公司 Cuff sewing device and method based on visual detection processing

Also Published As

Publication number Publication date
CN114657712B (en) 2023-08-04

Similar Documents

Publication Publication Date Title
EP3263756B1 (en) Sewing quality control in sewing machine
US4204193A (en) Adaptive alignment for pattern recognition system
US9249533B2 (en) Sewing machine
US8335583B2 (en) Embroidery data generating device and computer-readable medium storing embroidery data generating program
US6256551B1 (en) Embroidery data production upon partitioning a large-size embroidery pattern into several regions
US5563795A (en) Embroidery stitch data producing apparatus and method
US8655474B2 (en) Embroidery data generating apparatus, embroidery data generating method, and non-transitory computer-readable medium storing embroidery data generating program
CN114657712B (en) Pattern pattern optimization method based on edge recognition
CN1036606C (en) Embroidery data producting apparatus
CN112513359A (en) Method for adjusting the position of a seam profile relative to the structure of a material to be sewn
JP2007175087A (en) Embroidery data preparation device and embroidery data preparation program
CN110273229B (en) Stitch inspection device
US8126584B2 (en) Embroidery data creation apparatus and storage medium storing embroidery data creation program
CN106894170A (en) Sewing machine, pin number management system and pin number management method
EP1308548B1 (en) Sewing and embroidering machine
US6247420B1 (en) Method of recognizing embroidery outline and conversion to a different data format
US6170413B1 (en) Correction apparatus for sewing data and correction method
CN111476763B (en) Device and method for correcting visual position
US7983783B2 (en) Embroidery data creation apparatus and embroidery data creation program
CN117779334B (en) Needle selection flower type transformation control system for computerized flat knitting machine
CN116149271B (en) Intelligent quality inspection and control method and system for different types of clothes production line production process
EP4283031A1 (en) Production method and production system for correction data for inverse plating
US6253695B1 (en) Method of changing the density of an embroidery stitch group
CN115578544A (en) Three-dimensional model generation method, system and device under limited condition and storage medium
CN115063355A (en) Cloth strip lattice identification method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant