CN116882271A - Embroidery design method based on deep neural network - Google Patents

Embroidery design method based on deep neural network

Info

Publication number
CN116882271A
CN116882271A (application CN202310776229.3A)
Authority
CN
China
Prior art keywords
embroidery
cloth
product
cutting
neural network
Prior art date
Legal status
Pending
Application number
CN202310776229.3A
Other languages
Chinese (zh)
Inventor
汪杭军
Current Assignee
Suzhou Zhichuang Information Technology Co ltd
Original Assignee
Suzhou Zhichuang Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Zhichuang Information Technology Co ltd filed Critical Suzhou Zhichuang Information Technology Co ltd
Priority to CN202310776229.3A
Publication of CN116882271A
Current legal status: Pending

Classifications

    • D - TEXTILES; PAPER
    • D05 - SEWING; EMBROIDERING; TUFTING
    • D05C - EMBROIDERING; TUFTING
    • D05C17/00 - Embroidered or tufted products; Base fabrics specially adapted for embroidered work; Inserts for producing surface irregularities in embroidered products
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/55 - Clustering; Classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 - Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content
    • G06F16/5838 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content using colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/10 - Geometric CAD
    • G06F30/12 - Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/20 - Design optimisation, verification or simulation
    • G06F30/27 - Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Abstract

The invention discloses an embroidery design method based on a deep neural network, comprising the following steps. S1, cloth cutting: the required cloth is cut precisely to the required embroidery size by cutting equipment to obtain an embroidery base fabric. S2, cloth fixing: the cut embroidery base fabric is placed on a sewing frame, tensioned, and fixed. S3, pattern determination: the pattern to be embroidered on the base fabric is determined, a priming operation is performed, and the primed base fabric is fed into the embroidery machine. S4, embroidery pattern forming. S5, equipment control: based on the deep neural network, driving data are generated from the embroidery pattern to be formed, and the embroidery machine is controlled through the driving data to carry out the embroidery work and produce an embroidery product. S6, the embroidery product is edited and stored. With the invention, the cloth can be cut automatically according to set parameters and conveyed automatically to the embroidery working area after cutting, which facilitates subsequent use, reduces manual participation, and lowers the labor required.

Description

Embroidery design method based on deep neural network
Technical Field
The invention belongs to the technical field of embroidery design, and particularly relates to an embroidery design method based on a deep neural network.
Background
Hand embroidery, also known as "needle embroidery", is commonly called "embroidery". An embroidery needle is used to guide colored threads (silk, velvet, or cotton thread), which are stitched into a fabric (silk or cloth) according to a designed pattern so that the stitch traces form a pattern or characters; it is one of China's outstanding traditional national crafts. Embroidery obtained by the traditional manual method has a poor three-dimensional effect, and densely stitched embroidery patterns cause the fabric to harden into a board-like sheet. In addition, traditional manual embroidery equipment is simple, inefficient, and costly. With the development of technology in this field, highly intelligent embroidery equipment has been developed, overcoming the drawbacks of low efficiency and high cost of traditional manual embroidery.
The nylon-embroidery positioned digital printing process disclosed in publication No. CN114934397A replaces polyester fibers with nylon fibers and embroiders with ordinary white nylon thread, so the designer does not need to distinguish the colors of the embroidery pattern, and the effect of the water-soluble process on the thread color need not be considered when the water-soluble paper is washed off the embroidery. However, existing embroidery design methods do not solve the following problems: during embroidery work, the base fabric must be cut manually, which easily leads to cutting errors, and it must be moved manually from the cutting area to the embroidery area, which increases labor; at the same time, the embroidery path must be generated from the pattern to be embroidered, which increases the difficulty of generating the embroidery pattern; in addition, embroidery products are inconvenient to edit and store, making it hard to edit and improve on the basis of an existing embroidery product or to reuse a generated product directly the next time. An embroidery design method based on a deep neural network is therefore proposed.
Disclosure of Invention
The invention aims to provide an embroidery design method based on a deep neural network so as to solve the problems mentioned in the background art.
In order to achieve the above purpose, the present invention provides the following technical solution: an embroidery design method based on a deep neural network, comprising the following steps:
S1, cloth cutting: the required cloth is cut precisely to the required embroidery size by cutting equipment to obtain an embroidery base fabric;
S2, cloth fixing: the cut embroidery base fabric is placed on a sewing frame, tensioned, and fixed;
S3, pattern determination: the pattern to be embroidered on the embroidery base fabric is determined, a priming operation is performed, and the primed base fabric is fed into the embroidery machine;
S4, embroidery pattern forming: the pattern to be embroidered on the embroidery base fabric is formed in a computer;
S5, equipment control: based on the deep neural network, driving data are generated from the embroidery pattern to be formed, and the embroidery machine is controlled through the driving data to carry out the embroidery work and produce an embroidery product;
S6, embroidery product editing and storage: the finished embroidery product is photographed to obtain a product picture, the picture is preprocessed and vectorized, an embroidery vector gallery and electronic documents of design elements are built by means of computer recognition technology and digital color-feature labeling, and samples of the original embroidery product are organized and stored.
Preferably, the specific steps of cloth cutting in S1 are as follows:
S101, the cloth is inspected; after passing inspection, the cloth is placed on a spreading machine and the spreading parameters of the spreading machine are set;
S102, the cloth is spread on a cutting workbench by the spreading machine, the cutting parameters of the cutting machine are set, and the cloth on the cutting workbench is cut;
S103, the cut cloth on the cutting workbench is conveyed to the embroidery working area for the subsequent embroidery work.
Preferably, the spreading parameters of the spreading machine comprise the tension force and the displacement of each spreading pass, and the cutting parameters of the cutting machine comprise the cutting speed and the cutting position.
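For illustration only, the spreading and cutting parameters named above can be grouped into simple configuration objects. The Python sketch below is a minimal example; the field names, units, and example values are assumptions and are not specified in this disclosure.

    from dataclasses import dataclass

    @dataclass
    class SpreadingParams:
        tension_n: float        # tension applied to the cloth, in newtons (assumed unit)
        spread_step_mm: float   # displacement of each spreading pass, in millimetres

    @dataclass
    class CuttingParams:
        speed_mm_s: float                  # cutting speed
        position_mm: tuple[float, float]   # cutting position on the workbench (x, y)

    # Example values only; real settings depend on the cloth and the machine.
    spread = SpreadingParams(tension_n=12.0, spread_step_mm=500.0)
    cut = CuttingParams(speed_mm_s=80.0, position_mm=(0.0, 0.0))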
Preferably, the embroidery machine includes a driving unit for performing the embroidery operation on the embroidery base fabric; the generated driving data control the driving unit and the driving of the X-axis motor and the Y-axis motor of the embroidery machine; the driving data that control the driving unit consist of 16-bit records, and the X/Y-axis motors are controlled in units finer than 0.01 mm.
Preferably, the driving data include design basic information, comprising the design name, needle count, and design size, recorded from address 00000000h; address 000000A0h is left blank; bitmap data of the actual picture of the design are recorded from address 00000100h to 00000FFEh; and the actual design data of the embroidery design are recorded from address 00001000h.
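A minimal sketch of reading a drive-data file laid out at the addresses described above. The block offsets come from this description; the field widths inside the header (a 32-byte name, 32-bit integers, sizes in 0.01 mm units) and the little-endian byte order are assumptions added for illustration.

    import struct

    BITMAP_START = 0x00000100   # bitmap of the actual picture of the design
    BITMAP_END   = 0x00000FFE
    DESIGN_START = 0x00001000   # actual design data (16-bit drive records)

    def read_drive_data(path):
        with open(path, "rb") as f:
            raw = f.read()

        # Design basic information from address 00000000h. Assumed layout:
        # 32-byte ASCII name, then three little-endian uint32 values for the
        # needle count, design width, and design height (in 0.01 mm units).
        name = raw[0x00:0x20].rstrip(b"\x00").decode("ascii", errors="replace")
        needle_count, width, height = struct.unpack_from("<III", raw, 0x20)

        # Address 000000A0h is left blank; the bitmap block follows at 00000100h.
        bitmap = raw[BITMAP_START:BITMAP_END]

        # Design data: a stream of 16-bit records from address 00001000h onward.
        body = raw[DESIGN_START:]
        n = len(body) // 2
        records = struct.unpack("<%dH" % n, body[:2 * n])

        return {"name": name, "needles": needle_count,
                "size_mm": (width / 100, height / 100),
                "bitmap": bitmap, "records": records}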
Preferably, the deep neural network in S5 is a multi-dimensional neural network having a sequence of deep neural networks (DNNs) comprising an inner DNN and an outer DNN. Each DNN comprises a layer sequence, and corresponding layers of different DNNs have the same parameters. Each DNN is configured to process the input data sequentially through its layer sequence along a first dimension of data propagation; the DNNs in the sequence are arranged along a second dimension of data propagation, starting from the inner DNN up to the outer DNN; and the DNNs in the sequence are connected such that the output of at least one layer of a DNN is combined with the input of at least one layer of a subsequent DNN in the sequence.
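The PyTorch sketch below shows one possible way to realize such a structure, assuming fully connected layers, three DNNs in the sequence, and element-wise addition as the combination rule; none of these choices are fixed by this description. Because the same layer objects are reused for every pass, corresponding layers of different DNNs share parameters by construction.

    import torch
    import torch.nn as nn

    class MultiDimDNN(nn.Module):
        """Sequence of DNNs (inner -> outer) whose corresponding layers share
        parameters; each layer's output is combined with the matching layer's
        input in the next DNN of the sequence."""

        def __init__(self, dim=64, n_layers=4, n_dnns=3):
            super().__init__()
            # One set of layers, reused by every DNN in the sequence.
            self.layers = nn.ModuleList(
                [nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(n_layers)]
            )
            self.n_dnns = n_dnns

        def forward(self, x):
            # prev[i] holds the output of layer i from the previous (more inner) DNN.
            prev = [None] * len(self.layers)
            for _ in range(self.n_dnns):                  # second dimension: inner -> outer
                h = x
                for i, layer in enumerate(self.layers):   # first dimension: layer sequence
                    if prev[i] is not None:
                        h = h + prev[i]                   # combine with the earlier DNN's output
                    h = layer(h)
                    prev[i] = h
            return h

    # Example: a batch of 8 feature vectors describing the embroidery pattern.
    out = MultiDimDNN()(torch.randn(8, 64))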
Preferably, the embroidery work in S5 further includes: edging the embroidered embroidery base fabric, wherein the edging thread is one half to three times the thickness of the embroidery thread.
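As a small worked example, the stated ratio can be expressed as a bounds check; interpreting the thread sizes as linear densities (dtex) is an assumption made here for illustration.

    def edging_thread_ok(edging_dtex: float, embroidery_dtex: float) -> bool:
        """True if the edging thread is between one half and three times the
        embroidery thread, per the ratio stated above."""
        return 0.5 * embroidery_dtex <= edging_dtex <= 3.0 * embroidery_dtex

    # Example: a 150 dtex edging thread against a 120 dtex embroidery thread.
    assert edging_thread_ok(150.0, 120.0)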
Preferably, the preprocessing in S6 includes the following steps:
s601, opening a product picture of an embroidery product by using drawing software, and adjusting the contrast of the product picture to enable the background of the product picture to become dark mirror surface;
s602, selecting a filter library option in filter options of drawing software, selecting texture options in the filter library, and adjusting the square size of the jigsaw and the highlighting parameters to obtain a product picture of the preprocessed embroidery product.
Preferably, the specific step of vectorization in S6 is: the preprocessed product picture of the embroidery product is opened in drawing software, the high-quality image option among the outline-tracing options of the drawing software is selected, and the detail, smoothness, and corner-smoothness parameters of that option are adjusted to obtain the vectorized product picture of the embroidery product.
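For illustration, an outline-tracing step of this kind can be approximated with OpenCV by thresholding the preprocessed picture and simplifying the resulting contours; the threshold and approximation tolerance below stand in for the detail and smoothness parameters and are assumptions, not parameters defined here.

    import cv2

    def trace_outlines(path, detail=127, smoothness=2.0):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, mask = cv2.threshold(gray, detail, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        # Simplify each contour so the traced outlines are smooth enough to edit.
        return [cv2.approxPolyDP(c, smoothness, True) for c in contours]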
Preferably, the specific steps after the vectorization processing in S6 include:
digital color-feature labeling: the CMYK mode of the drawing software is selected, the standard colors of the embroidery product are collected and labeled with CMYK values as color features, and the colors are classified according to the size of their color regions to obtain the color scheme of the embroidery product;
the vectorized embroidery product is decomposed, an artistic-feature database of embroidery products is established, and data extracted from the database are combined or redesigned to obtain a digital picture of a new embroidery product. A minimal sketch of the color-labeling step is given after this list.
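A minimal sketch of the color-labeling step: the picture is reduced to a handful of dominant colors, converted to CMYK, and the colors are ranked by the area they cover. The quantization to a fixed number of colors and Pillow's naive RGB-to-CMYK conversion are assumptions for illustration.

    from collections import Counter
    from PIL import Image

    def label_cmyk_scheme(path, top_n=8):
        img = Image.open(path).convert("RGB")
        # Reduce to the dominant colours, then convert to CMYK for labelling.
        img = img.quantize(colors=top_n).convert("RGB").convert("CMYK")
        counts = Counter(img.getdata())            # pixel count per CMYK tuple
        # Largest regions first: these define the main colours of the scheme.
        return [{"cmyk": c, "area_px": n} for c, n in counts.most_common(top_n)]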
Compared with the prior art, the invention has the beneficial effects that:
(1) With the invention, the cloth can be cut automatically according to set parameters and conveyed automatically to the embroidery work area after cutting, which facilitates subsequent use, reduces manual participation, and lowers the labor required;
(2) Based on the deep neural network, the invention can generate driving data from the embroidery pattern to be formed and generate the embroidery path through the driving data, thereby controlling the embroidery machine to carry out the embroidery work and ensuring that it is completed accurately;
(3) With the invention, the finished embroidery product is photographed to obtain a product picture, the picture is preprocessed and vectorized, an embroidery vector gallery and electronic documents of design elements are built by means of computer recognition technology and digital color-feature labeling, and samples of the original embroidery product are organized and stored, so that editing and improvement can be performed on the basis of existing embroidery products and a generated product can be reused directly the next time, which facilitates the study and analysis of embroidery products and increases production efficiency.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to FIG. 1, the present invention provides a technical solution: an embroidery design method based on a deep neural network, comprising the following steps:
S1, cloth cutting: the required cloth is cut precisely to the required embroidery size by cutting equipment to obtain an embroidery base fabric;
S2, cloth fixing: the cut embroidery base fabric is placed on a sewing frame, tensioned, and fixed;
S3, pattern determination: the pattern to be embroidered on the embroidery base fabric is determined, a priming operation is performed, and the primed base fabric is fed into the embroidery machine;
S4, embroidery pattern forming: the pattern to be embroidered on the embroidery base fabric is formed in a computer;
S5, equipment control: based on the deep neural network, driving data are generated from the embroidery pattern to be formed, and the embroidery machine is controlled through the driving data to carry out the embroidery work and produce an embroidery product;
S6, embroidery product editing and storage: the finished embroidery product is photographed to obtain a product picture, the picture is preprocessed and vectorized, an embroidery vector gallery and electronic documents of design elements are built by means of computer recognition technology and digital color-feature labeling, and samples of the original embroidery product are organized and stored.
In this embodiment, preferably, the specific steps of cloth cutting in S1 are as follows:
S101, the cloth is inspected; after passing inspection, the cloth is placed on a spreading machine and the spreading parameters of the spreading machine are set;
S102, the cloth is spread on a cutting workbench by the spreading machine, the cutting parameters of the cutting machine are set, and the cloth on the cutting workbench is cut;
S103, the cut cloth on the cutting workbench is conveyed to the embroidery working area for the subsequent embroidery work.
In this embodiment, preferably, the spreading parameters of the spreading machine include the tension force and the displacement of each spreading pass, and the cutting parameters of the cutting machine include the cutting speed and the cutting position.
In this embodiment, preferably, the embroidery machine includes a driving unit for performing the embroidery operation on the embroidery base fabric; the generated driving data control the driving unit and the driving of the X-axis motor and the Y-axis motor of the embroidery machine; the driving data that control the driving unit consist of 16-bit records, and the X/Y-axis motors are controlled in units finer than 0.01 mm.
In this embodiment, preferably, the driving data include design basic information, comprising the design name, needle count, and design size, recorded from address 00000000h; address 000000A0h is left blank; bitmap data of the actual picture of the design are recorded from address 00000100h to 00000FFEh; and the actual design data of the embroidery design are recorded from address 00001000h.
In this embodiment, preferably, the deep neural network in S5 is a multi-dimensional neural network having a sequence of deep neural networks (DNNs) comprising an inner DNN and an outer DNN. Each DNN comprises a layer sequence, and corresponding layers of different DNNs have the same parameters. Each DNN is configured to process the input data sequentially through its layer sequence along a first dimension of data propagation; the DNNs in the sequence are arranged along a second dimension of data propagation, starting from the inner DNN up to the outer DNN; and the DNNs in the sequence are connected such that the output of at least one layer of a DNN is combined with the input of at least one layer of a subsequent DNN in the sequence.
In this embodiment, preferably, the embroidery work in S5 further includes: edging the embroidered embroidery base fabric, wherein the edging thread is one half to three times the thickness of the embroidery thread.
In this embodiment, preferably, the preprocessing in S6 includes the following steps:
s601, opening a product picture of an embroidery product by using drawing software, and adjusting the contrast of the product picture to enable the background of the product picture to become dark mirror surface;
s602, selecting a filter library option in filter options of drawing software, selecting texture options in the filter library, and adjusting the square size of the jigsaw and the highlighting parameters to obtain a product picture of the preprocessed embroidery product.
In this embodiment, preferably, the specific step of vectorization in S6 is: the preprocessed product picture of the embroidery product is opened in drawing software, the high-quality image option among the outline-tracing options of the drawing software is selected, and the detail, smoothness, and corner-smoothness parameters of that option are adjusted to obtain the vectorized product picture of the embroidery product.
In this embodiment, preferably, the specific steps after the vectorization processing in S6 include:
digital color-feature labeling: the CMYK mode of the drawing software is selected, the standard colors of the embroidery product are collected and labeled with CMYK values as color features, and the colors are classified according to the size of their color regions to obtain the color scheme of the embroidery product;
the vectorized embroidery product is decomposed, an artistic-feature database of embroidery products is established, and data extracted from the database are combined or redesigned to obtain a digital picture of a new embroidery product.
The principles and advantages of the invention are as follows:
With the invention, the cloth can be cut automatically according to set parameters and conveyed automatically to the embroidery work area after cutting, which facilitates subsequent use, reduces manual participation, and lowers the labor required;
Based on the deep neural network, the invention can generate driving data from the embroidery pattern to be formed and generate the embroidery path through the driving data, thereby controlling the embroidery machine to carry out the embroidery work and ensuring that it is completed accurately;
With the invention, the finished embroidery product is photographed to obtain a product picture, the picture is preprocessed and vectorized, an embroidery vector gallery and electronic documents of design elements are built by means of computer recognition technology and digital color-feature labeling, and samples of the original embroidery product are organized and stored, so that editing and improvement can be performed on the basis of existing embroidery products and a generated product can be reused directly the next time, which facilitates the study and analysis of embroidery products and increases production efficiency.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (10)

1. An embroidery design method based on a deep neural network, characterized in that the method comprises the following steps:
S1, cloth cutting: the required cloth is cut precisely to the required embroidery size by cutting equipment to obtain an embroidery base fabric;
S2, cloth fixing: the cut embroidery base fabric is placed on a sewing frame, tensioned, and fixed;
S3, pattern determination: the pattern to be embroidered on the embroidery base fabric is determined, a priming operation is performed, and the primed base fabric is fed into the embroidery machine;
S4, embroidery pattern forming: the pattern to be embroidered on the embroidery base fabric is formed in a computer;
S5, equipment control: based on the deep neural network, driving data are generated from the embroidery pattern to be formed, and the embroidery machine is controlled through the driving data to carry out the embroidery work and produce an embroidery product;
S6, embroidery product editing and storage: the finished embroidery product is photographed to obtain a product picture, the picture is preprocessed and vectorized, an embroidery vector gallery and electronic documents of design elements are built by means of computer recognition technology and digital color-feature labeling, and samples of the original embroidery product are organized and stored.
2. The embroidery design method based on the deep neural network according to claim 1, wherein the specific steps of cloth cutting in S1 are as follows:
S101, the cloth is inspected; after passing inspection, the cloth is placed on a spreading machine and the spreading parameters of the spreading machine are set;
S102, the cloth is spread on a cutting workbench by the spreading machine, the cutting parameters of the cutting machine are set, and the cloth on the cutting workbench is cut;
S103, the cut cloth on the cutting workbench is conveyed to the embroidery working area for the subsequent embroidery work.
3. The embroidery design method based on the deep neural network according to claim 2, wherein the spreading parameters of the spreading machine comprise the tension force and the displacement of each spreading pass, and the cutting parameters of the cutting machine comprise the cutting speed and the cutting position.
4. The embroidery design method based on the deep neural network according to claim 1, wherein the embroidery machine comprises a driving unit for performing the embroidery operation on the embroidery base fabric; the generated driving data control the driving unit and the driving of the X-axis motor and the Y-axis motor of the embroidery machine; the driving data consist of 16-bit records, and through them the X/Y-axis motors are controlled in units finer than 0.01 mm.
5. The embroidery design method based on the deep neural network according to claim 4, wherein the driving data record design basic information, including the design name, needle count, and design size, from address 00000000h; address 000000A0h is left blank; bitmap data of the actual picture of the design are recorded from address 00000100h to 00000FFEh; and the actual design data of the embroidery design are recorded from address 00001000h.
6. The embroidery design method based on the deep neural network according to claim 1, wherein the deep neural network in S5 is a multi-dimensional neural network having a sequence of deep neural networks (DNNs) comprising an inner DNN and an outer DNN; each DNN comprises a layer sequence, and corresponding layers of different DNNs have the same parameters; each DNN is configured to process the input data sequentially through its layer sequence along a first dimension of data propagation; the DNNs in the sequence are arranged along a second dimension of data propagation, starting from the inner DNN up to the outer DNN; and the DNNs in the sequence are connected such that the output of at least one layer of a DNN is combined with the input of at least one layer of a subsequent DNN in the sequence.
7. The embroidery design method based on the deep neural network according to claim 1, wherein the embroidery work in S5 further comprises: edging the embroidered embroidery base fabric, wherein the edging thread is one half to three times the thickness of the embroidery thread.
8. The embroidery design method based on the deep neural network according to claim 1, wherein the preprocessing in S6 comprises the following steps:
S601, the product picture of the embroidery product is opened in drawing software, and the contrast of the picture is adjusted so that its background becomes a dark, mirror-like surface;
S602, the filter gallery option is selected among the filter options of the drawing software, the texture option is selected in the filter gallery, and the square size and highlight parameters of the patchwork (jigsaw) texture are adjusted to obtain the preprocessed product picture of the embroidery product.
9. The embroidery design method based on the deep neural network according to claim 1, wherein the specific step of vectorization in S6 is: the preprocessed product picture of the embroidery product is opened in drawing software, the high-quality image option among the outline-tracing options of the drawing software is selected, and the detail, smoothness, and corner-smoothness parameters of that option are adjusted to obtain the vectorized product picture of the embroidery product.
10. The embroidery design method based on the deep neural network according to claim 1, wherein the specific steps after the vectorization processing in S6 comprise:
digital color-feature labeling: the CMYK mode of the drawing software is selected, the standard colors of the embroidery product are collected and labeled with CMYK values as color features, and the colors are classified according to the size of their color regions to obtain the color scheme of the embroidery product;
the vectorized embroidery product is decomposed, an artistic-feature database of embroidery products is established, and data extracted from the database are combined or redesigned to obtain a digital picture of a new embroidery product.
CN202310776229.3A 2023-06-28 2023-06-28 Embroidery design method based on deep neural network Pending CN116882271A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310776229.3A CN116882271A 2023-06-28 2023-06-28 Embroidery design method based on deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310776229.3A CN116882271A 2023-06-28 2023-06-28 Embroidery design method based on deep neural network

Publications (1)

Publication Number Publication Date
CN116882271A true CN116882271A (en) 2023-10-13

Family

ID=88263525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310776229.3A Pending CN116882271A 2023-06-28 2023-06-28 Embroidery design method based on deep neural network

Country Status (1)

Country Link
CN (1) CN116882271A

Similar Documents

Publication Publication Date Title
EP3653391A1 (en) Precise digital printing method for fabric by utilizing digital printing machine and fabric
US11060220B2 (en) Sewing data for embroidery designs systems and methods
CN107012601B (en) High-resolution machine embroidery method
US5701830A (en) Embroidery data processing apparatus
US5270939A (en) Method for modifying embroidery design programs
CN104695139A (en) Industrial sewing machine system and cut part sewing processing method by same
JPH09170158A (en) Embroidery data processor
CN110714284B (en) Intelligent positioning embroidery method and intelligent positioning embroidery control device
CN110940670A (en) Flexible printing label printing head draft detection system based on machine vision and implementation method thereof
EP0859300B1 (en) Fashion design and production supporting method and system therefor
CN116882271A (en) Embroidery design method based on deep neural network
DE19506154A1 (en) Image data prodn. using image reader
US20100224112A1 (en) Automatic sizing of embroidery
CN1587482A (en) Computer machine embroidery technologic pocess
CN107488943A (en) A kind of dynamic of spininess quilting apparatus matches somebody with somebody needle method
CN111242966A (en) Image boundary correction method, device, electronic equipment and storage medium
CN101307530B (en) Embroidery machine needle rod sewing sequence designation device and its method
US5515289A (en) Stitch data producing system and method for determining a stitching method
CN1480579A (en) Computerized anglicanum
Wu Utilization of MultiLevel Drawing Technology for Computer-Based Embroidery Employing Computerized Digital Technology
Shariq et al. Image Processing Based Pattern Recognition and Computerized Embroidery Machine
CN112215793B (en) Method for drawing hand-embroidered carpet pattern detection texture and applying hand-embroidered carpet pattern detection texture to carpet weaving
TW202407628A (en) Fabric printing system and steps thereof including an editing device, a style pattern database, an inkjet printing device and a cutting device to enable fast and accurate fabric printing
CN116623375A (en) Unmanned operation method and system for carrying out traditional manual embroidery on cut pieces
CN112581419A (en) Method for forming embroidery by embroidery color analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination