CN108288272A - Yarn recognition methods and device - Google Patents
Yarn recognition methods and device
- Publication number
- CN108288272A CN108288272A CN201810127700.5A CN201810127700A CN108288272A CN 108288272 A CN108288272 A CN 108288272A CN 201810127700 A CN201810127700 A CN 201810127700A CN 108288272 A CN108288272 A CN 108288272A
- Authority
- CN
- China
- Prior art keywords
- yarn
- attribute
- image
- described image
- measured
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- D—TEXTILES; PAPER
- D03—WEAVING
- D03J—AUXILIARY WEAVING APPARATUS; WEAVERS' TOOLS; SHUTTLES
- D03J1/00—Auxiliary apparatus combined with or associated with looms
- D03J1/14—Apparatus for threading warp stop-motion droppers, healds, or reeds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30124—Fabrics; Textile; Paper
Abstract
The present invention provides a yarn recognition method and device, relating to the field of the textile industry. The method is applied to a vision detection module and includes: obtaining an image frame of the yarn under test acquired by a camera of an image acquisition device; extracting image features of the yarn under test from the image frame; determining image attributes of the yarn under test according to the image features; and, if the image attributes satisfy any preset attribute condition, determining the preset yarn count corresponding to that preset attribute condition as the recognized yarn count. The method and device use an image acquisition device to capture images of the yarn under test rapidly and, by means of stereoscopic vision detection, accurately determine the number of yarns, which facilitates fast and efficient automatic drawing-in and improves working efficiency.
Description
Technical field
The present invention relates to the field of the textile industry, and in particular to a yarn recognition method and device.
Background technology
In the textile industry, drawing-in (threading the warp) is a particularly important step. In traditional handicraft weaving, every needle and every thread may have to be drawn in by hand, which is very inefficient. With the development of science and technology, the modern textile industry uses machines instead of manual drawing-in, which can greatly improve working efficiency. Existing automated drawing-in machines rely mainly on sensors (such as pressure sensors) for detection. Because pressure sensors suffer from short service life, high cost, high failure rates, and susceptibility to aging, two or more threads may be fed into the drawing-in step as if they were a single thread, resulting in drawing-in errors. Since automated weaving is mass-produced on assembly lines, any need for manual intervention and debugging reduces working efficiency. Moreover, in such repetitive work operators easily become fatigued; if manual intervention is not timely, a whole batch of yarn may become unusable, wasting material.
Invention content
In view of this, the purpose of the present invention is to provide a yarn recognition method and device, so as to alleviate the technical problems in the prior art of high machine drawing-in error rates and time-consuming, labor-intensive debugging, which reduce working efficiency.
In a first aspect, an embodiment of the present invention provides a yarn recognition method, applied to a vision detection module, including:
obtaining an image frame of the yarn under test acquired by an image acquisition device;
extracting image features of the yarn under test from the image frame;
determining image attributes of the yarn under test according to the image features;
if the image attributes satisfy any preset attribute condition, determining the preset yarn count corresponding to that preset attribute condition as the recognized yarn count.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation of the first aspect, wherein the image attributes include: region area, region perimeter, linear features, hole Euler number, row-scan pulse count, and similar attributes.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation of the first aspect, wherein the preset attribute condition includes an attribute parameter range, and wherein, before obtaining the image frame of the yarn under test acquired by the image acquisition device, the method includes:
obtaining a sample image sequence of one or more yarns acquired by the image acquisition device;
performing morphological processing on the multiple frames of the sample image sequence to obtain multiple sample image frames satisfying a feature extraction condition;
determining multiple attribute parameters of the image attributes according to the image features extracted from each sample image frame;
for each attribute among the image attributes, applying a preset value-taking rule to the multiple attribute parameters of that attribute extracted from the multiple sample image frames, to obtain a threshold for that attribute;
determining preset attribute thresholds of the one or more yarns from the multiple attribute thresholds obtained under the value-taking rule;
determining the attribute parameter range according to the preset attribute thresholds and a preset deviation range.
With reference to the first aspect, an embodiment of the present invention provides a third possible implementation of the first aspect, wherein the method further includes:
judging whether the image attributes satisfy any preset attribute condition;
where judging whether the image attributes satisfy any preset attribute condition includes:
judging whether every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn;
if every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn, determining that the recognized yarn count is one;
if any attribute among the image attributes lies outside the attribute parameter range corresponding to a single yarn, judging whether every attribute among the image attributes lies within the attribute parameter range corresponding to multiple yarns;
if every attribute among the image attributes lies within the attribute parameter range corresponding to multiple yarns, determining that the recognized yarn count is more than one.
With reference to the first aspect, an embodiment of the present invention provides a fourth possible implementation of the first aspect, wherein judging whether every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn includes:
comparing, in order of priority, the attribute parameter of each attribute with the corresponding attribute parameter range of the single yarn.
With reference to the first aspect, an embodiment of the present invention provides a fifth possible implementation of the first aspect, wherein the image acquisition device includes a first camera and a second camera, the optical axis of the first camera forming a preset acute or right angle with the optical axis of the second camera, and wherein obtaining the image frame of the yarn under test acquired by the image acquisition device includes:
obtaining a first image frame acquired by the first camera and a second image frame acquired by the second camera.
With reference to the first aspect, an embodiment of the present invention provides a sixth possible implementation of the first aspect, wherein extracting the image features of the yarn under test from the image frame includes:
performing preprocessing, image segmentation, and morphological processing on the image frame;
extracting the image features of the yarn under test from the image frame after morphological processing.
In a second aspect, an embodiment of the present invention also provides a yarn recognition device, including:
an acquisition module, configured to obtain an image frame of the yarn under test acquired by an image acquisition device;
an extraction module, configured to extract image features of the yarn under test from the image frame;
a first determining module, configured to determine image attributes of the yarn under test according to the image features;
a second determining module, configured to, if the image attributes satisfy any preset attribute condition, determine the preset yarn count corresponding to that preset attribute condition as the recognized yarn count.
In a third aspect, an embodiment of the present invention also provides an electronic device, including a memory and a processor, the memory storing a computer program runnable on the processor, the program code causing the processor to execute the method described in the first aspect.
In a fourth aspect, an embodiment of the present invention also provides a computer-readable medium storing non-volatile program code executable by a processor, the program code causing the processor to execute the method described in the first aspect.
The embodiments of the present invention bring the following advantageous effects. The yarn recognition method and device provided by the present invention use an image acquisition device to capture images of the yarn under test rapidly and, by means of stereoscopic vision detection, accurately determine the number of yarns, which facilitates fast and efficient automatic drawing-in and improves working efficiency.
Other features and advantages of the present invention will be set forth in the following description and will in part become apparent from the description or be understood through practice of the invention. The objectives and other advantages of the invention are realized and attained by the structures particularly pointed out in the description, the claims, and the accompanying drawings.
To make the above objectives, features, and advantages of the present invention clearer and more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.
Description of the drawings
To describe the specific embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings required for the description of the specific embodiments or the prior art are briefly introduced below. Apparently, the drawings described below show some embodiments of the present invention, and persons of ordinary skill in the art may derive other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of a yarn recognition method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of a yarn recognition method provided by another embodiment of the present invention;
Fig. 3 is a flowchart of a yarn recognition method provided by another embodiment of the present invention;
Fig. 4 is a structural diagram of a yarn recognition device provided by another embodiment of the present invention.
Reference numerals:
11 - acquisition module; 12 - extraction module; 13 - first determining module; 14 - second determining module.
Specific implementation mode
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The textile industry has long been one of the pillar industries of the national economy, but owing to the impact of the world financial crisis, the industry itself also faces a crisis and must be continually advanced with modern science and technology. At present, the degree of automation in the textile industry is relatively low; even during machine operation, human assistance may still be needed, consuming considerable manpower at relatively low working efficiency. Technology based on computer vision is gradually being applied in the textile industry; the main current vision detection tasks are automatic fabric defect detection, yarn winding, and yarn hairiness removal. Yet many steps on textile machines still depend on human observation, lowering production efficiency and requiring a large amount of labor.
In traditional handicraft weaving, every needle and every thread may have to be drawn in by hand, which is very inefficient. With the development of technology, the modern textile industry installs machines to replace manual drawing-in; to avoid threading mistakes, a real-time monitoring and feedback system is needed to help the machine correct errors, or to enable manual intervention after an alarm is raised.
Existing automated weaving machines in China rely mainly on traditional sensors for detection. In automatic drawing-in, pressure-type sensors are generally used. Although their precision can meet production requirements, they have short service lives and high costs; and because mechanical sensors are prone to mechanical failure and aging and cannot support fast drawing-in, they have clear limitations. Detection failures often occur during use and require manual handling, and after long-term use detection easily becomes inaccurate, so that two or more threads may be fed into the drawing-in step as a single thread. Since automated weaving is mass-produced on assembly lines, any need for manual intervention and debugging reduces working efficiency. Moreover, in such repetitive work operators easily become fatigued; if manual intervention is not timely, a whole batch of yarn may become unusable, wasting material.
At present, machine drawing-in in the prior art has a high error rate and time-consuming, labor-intensive debugging, which reduce working efficiency. On this basis, the yarn recognition method and device provided by the present invention use an image acquisition device to capture images of the yarn under test rapidly and, by means of stereoscopic vision detection, accurately determine the number of yarns, facilitating fast and efficient automatic drawing-in and thereby improving working efficiency.
To facilitate understanding of this embodiment, the yarn recognition method disclosed in the embodiments of the present invention is first described in detail.
As shown in Fig. 1, one embodiment of the present invention provides a yarn recognition method applied to a vision detection module. The method is used to generate, in advance, a single-yarn or multi-yarn criterion for a specific type of yarn, and includes the following steps.
S101: obtain a sample image sequence of one or more yarns acquired by the image acquisition device.
Specifically, the image acquisition device includes, but is not limited to, a fixed-focus camera or an auto-focus camera; acquisition may use a single camera, or two cameras with identical performance parameters acquiring simultaneously. Because the collected image attributes differ for different yarn types, the criterion needs to be generated online according to production needs, and the criterion generation process is determined through human-machine interaction. For each new yarn type, 10-20 images are acquired interactively, features are extracted from the image sequence, statistics are computed over the image attributes, and the criterion is finally determined. To generate the criterion for a single yarn in advance, only a single yarn needs to be used as the collection sample; to generate the criterion for multiple yarns (two, three, or some other number) in advance, multiple yarns must be used as the collection sample.
In all examples illustrated and described herein, any specific value should be interpreted as merely exemplary and not as a limitation; other examples of the exemplary embodiments may therefore use different values.
S102: perform morphological processing on the multiple frames of the sample image sequence to obtain multiple sample image frames satisfying the feature extraction condition.
S103: determine multiple attribute parameters of the image attributes according to the image features extracted from each sample image frame.
Specifically, image feature extraction includes image preprocessing, segmentation of the yarn target in the image, post-processing of the target image region, and then attribute computation on the target image. The image attributes include the yarn target region area, the region perimeter, linear features, and the hole Euler number, as well as attribute features such as the number of signal transitions obtained by grid-line scanning. In addition, attribute features such as the moment features of the sample yarn, the moment features of the gray values of the whole image and of the target region, and the RGB component statistics of a colored yarn may also be computed.
Image preprocessing mainly includes image gray-scale conversion, image brightness adjustment, and various noise removal methods.
The target segmentation step completes the binarization of the image using a threshold segmentation algorithm.
Post-processing of the target image region mainly includes using morphological processing methods to remove image burrs and the small-region noise produced by segmentation; rough edges may be smoothed with methods such as median filtering, mean filtering, and Gaussian filtering.
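As one illustration of this post-processing step, the small speckle noise left by segmentation can be suppressed with a median filter. The following is a minimal Python sketch (not part of the patent; the 5×5 patch and the 3×3 filter size are assumptions for illustration):

```python
def median_filter_3x3(img):
    """Apply a 3x3 median filter to a binary image (list of rows of 0/1).

    Border pixels are kept unchanged for simplicity.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(
                img[y + dy][x + dx]
                for dy in (-1, 0, 1)
                for dx in (-1, 0, 1)
            )
            out[y][x] = window[4]  # median of the 9 neighborhood values
    return out


# A 5x5 binary patch: a two-pixel-wide yarn stripe in columns 1-2,
# plus one isolated speck of segmentation noise at (1, 3).
patch = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
]
smoothed = median_filter_3x3(patch)  # speck removed, stripe preserved
```

The isolated speck is removed while the stripe survives; on full frames a library routine (e.g. a median filter from an image processing package) would normally be used instead of this hand-rolled loop.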
Computing the attribute parameters of the image attributes: the area, perimeter, and hole Euler number of the sample yarn are counted directly over the target region of the binary image; the straight-line features of the target image are obtained by contour extraction and line fitting, or by the Hough transform; and the gray-level change information of the target region image is obtained by grid-line scanning. The detailed procedure is: scan the binary image row by row, incrementing a counter each time a 0-1 transition is found, and finally compute the pulse count of the whole image from the counter value.
For each frame, the above preprocessing, image segmentation, and morphological processing are carried out, and the attribute parameter corresponding to each attribute of that frame is computed.
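The per-frame attribute computation described above (binarization, region area, and the row-scan 0-1 transition count) can be sketched in Python as follows. This is an illustrative sketch, not the patent's implementation; the threshold value, the bright-yarn-on-dark-background assumption, and the toy frame are all assumptions:

```python
def binarize(gray, threshold=128):
    """Threshold segmentation: pixel >= threshold -> 1 (yarn), else 0.

    Whether the yarn is brighter or darker than the background depends
    on the lighting setup; a bright yarn on a dark background is assumed.
    """
    return [[1 if p >= threshold else 0 for p in row] for row in gray]


def region_area(binary):
    """Area of the yarn target region: the number of foreground pixels."""
    return sum(map(sum, binary))


def row_scan_pulse_count(binary):
    """Scan the binary image row by row and count 0->1 transitions."""
    count = 0
    for row in binary:
        prev = 0
        for p in row:
            if prev == 0 and p == 1:
                count += 1
            prev = p
    return count


# A toy 4x6 "grayscale frame": two bright vertical stripes (two yarns).
frame = [
    [10, 200, 12, 11, 210, 9],
    [11, 205, 10, 12, 220, 8],
    [9, 198, 220, 13, 215, 10],
    [12, 201, 11, 10, 208, 11],
]
binary = binarize(frame)
area = region_area(binary)             # 9 foreground pixels
pulses = row_scan_pulse_count(binary)  # two transitions per row, 8 total
```

Here the two stripes yield two transitions per row, whereas a single yarn would typically yield one; this is the intuition behind using the pulse count to separate single from multiple yarns.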
S104: for each attribute among the image attributes, apply the preset value-taking rule to the multiple attribute parameters of that attribute extracted from the multiple sample image frames, to obtain the threshold for that attribute.
Specifically, for any attribute among the image attributes, the multiple attribute parameters corresponding to that attribute are collected from the multiple sample image frames, and the attribute threshold is obtained by applying the preset value-taking rule to them. For example, for the region area attribute, the minimum of the multiple areas is taken as the threshold; for the row signal transition count, the minimum number of transitions is taken as the threshold; and so on.
S105: determine the preset attribute thresholds of the one or more yarns from the multiple attribute thresholds obtained under the value-taking rule.
Specifically, the multiple attribute thresholds, one per attribute, are taken as the preset attribute thresholds of the sample yarn (a single yarn or multiple yarns).
S106: determine the attribute parameter ranges according to the preset attribute thresholds and the preset deviation range.
Specifically, for each attribute, the attribute parameter range of that attribute is determined from its attribute threshold and a preset deviation value. The attribute parameter ranges of the multiple attributes together form the attribute parameter range of the sample yarn (a single yarn or multiple yarns), i.e. the criterion.
Depending on practical needs, a criterion may be generated in advance for a single yarn only; for a single yarn and a double yarn; or for a single yarn, a double yarn, and multiple yarns, and so on.
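Steps S104-S106 can be sketched as follows. This is a hypothetical Python illustration: the minimum is used as the value-taking rule (matching the area and transition-count examples given in the text), while the ±20% deviation fraction and the sample values are assumptions, not figures from the patent:

```python
def build_criterion(samples, deviation=0.20):
    """Build an attribute-parameter-range criterion from sample frames.

    samples: list of dicts, one per sample frame, mapping attribute
             name to its measured parameter, e.g. {"area": 150, ...}.
    deviation: assumed preset deviation fraction around each threshold.
    Returns {attribute: (low, high)}.
    """
    criterion = {}
    for attr in samples[0]:
        values = [s[attr] for s in samples]
        threshold = min(values)  # value-taking rule: take the minimum
        criterion[attr] = (threshold * (1 - deviation),
                           threshold * (1 + deviation))
    return criterion


# Three sample frames of a single yarn (hypothetical measured parameters).
single_samples = [
    {"area": 150, "pulses": 2},
    {"area": 160, "pulses": 2},
    {"area": 145, "pulses": 3},
]
single_criterion = build_criterion(single_samples)
# e.g. single_criterion["area"] spans 145 +/- 20%
```

In practice one criterion would be built per yarn count (single, double, ...), each from its own sample sequence, exactly as the surrounding text describes.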
As shown in Fig. 2, another embodiment of the present invention provides a yarn recognition method applied to a vision detection module. The method automatically detects the number of yarns according to the criterion generated in advance for a single yarn or multiple yarns, and includes the following steps.
S201: obtain an image frame of the yarn under test acquired by the image acquisition device.
In practical applications, a camera loses depth information when acquiring an image. If the yarn is judged from a single viewpoint only, the information is far from complete: when multiple yarns overlap, the image obtained from that viewpoint differs very little in appearance from that of a single yarn, and an accurate judgment cannot be made. The yarn image therefore needs to be detected from different viewpoints, using two cameras with identical performance parameters: a first camera and a second camera, whose optical axes form a preset acute or right angle of 30°-90°, and which must acquire synchronously. The specific mounting positions of the cameras can be adjusted; space should be arranged reasonably according to actual needs so as to optimize the overall mechanical layout.
In addition, the shooting background of the cameras may be black, a multicolored background board may be added, and a device that automatically adjusts the backlight intensity may also be applied, so as to adapt to the recognition of yarns of different colors and enhance the applicability of the recognition system.
The vision detection module may be a processing device such as a computer, connected to the two cameras in a wired or wireless manner. Since two cameras are provided and they acquire images synchronously, the vision detection module can simultaneously obtain the first image frame acquired by the first camera and the second image frame acquired by the second camera.
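The requirement, described later in the judgment steps, that both synchronized cameras must agree before a count is accepted can be captured by a trivial fusion rule. The sketch below is illustrative only; the function name and the "unknown" fallback are assumptions:

```python
def fuse_camera_judgments(judgment_cam1, judgment_cam2):
    """Combine the per-camera yarn-count judgments.

    Each judgment is "single", "multiple", or "unknown". A count is
    accepted only when both synchronized cameras give the same result;
    otherwise the frame pair is treated as undecided.
    """
    if judgment_cam1 == judgment_cam2:
        return judgment_cam1
    return "unknown"


# Two overlapping yarns may look single from one viewpoint but
# multiple from the other; the disagreement is left undecided.
result = fuse_camera_judgments("single", "multiple")  # -> "unknown"
```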
S202: extract the image features of the yarn under test from the image frame.
Specifically, the image frames (including the first image frame and the second image frame) are preprocessed, segmented, and morphologically processed, after which the image features of the yarn under test are extracted from the morphologically processed frames. The feature extraction process is identical to that used when generating the single-yarn or multi-yarn criterion in advance, and is not repeated here.
S203: determine the image attributes of the yarn under test according to the image features.
Specifically, after shooting, preprocessing, morphological processing, and data computation, the attribute parameter of each attribute among the image attributes of the yarn under test is obtained; it is then judged whether the image attributes satisfy any preset attribute condition (i.e. the single-yarn attribute condition or the multi-yarn attribute condition).
S204: if the image attributes satisfy any preset attribute condition, determine the preset yarn count corresponding to that preset attribute condition as the recognized yarn count.
Specifically, if criteria for both a single yarn and multiple yarns have been generated in advance, the judgment process first checks whether the yarn is single and then whether it is multiple. As shown in Fig. 3, the specific judgment method includes the following steps.
S301: judge whether every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn.
S302: if every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn, determine that the recognized yarn count is one.
Specifically, in order of priority, the attribute parameter of each attribute is compared with the corresponding attribute parameter range of the single yarn; if every attribute lies within its corresponding range, the recognized yarn count is judged to be one.
Different attributes among the image attributes differ in their sensitivity to yarn images of different counts, so comparing them in order of priority improves accuracy. Parameters with pronounced, characteristic changes, such as the yarn target region area and the row-scan pulse count, are therefore given higher priority, while the region perimeter, linear features, and hole Euler number have lower priority; the latter serve as auxiliary judgments and do not affect the high-priority result. If the judgment based on a high-priority feature conflicts with that based on an individual low-priority feature, the high-priority feature prevails.
Since two cameras with identical performance parameters acquire synchronously, the image attributes of the images obtained by the two cameras must be compared simultaneously against the attribute parameter range corresponding to the single yarn. Moreover, only when the judgment results for the two cameras' images are identical (i.e. the image attributes of each camera's image all lie within the single-yarn attribute parameter ranges) can the recognized yarn count be determined to be one.
S303: if any attribute among the image attributes lies outside the attribute parameter range corresponding to a single yarn, judge whether every attribute among the image attributes lies within the attribute parameter range corresponding to multiple yarns.
Specifically, during this comparison, the image attributes of the images obtained by the two cameras must likewise be compared simultaneously against the attribute parameter range corresponding to multiple yarns.
S304: if every attribute among the image attributes lies within the attribute parameter range corresponding to multiple yarns, determine that the recognized yarn count is more than one.
Specifically, also in order of priority, the attribute parameter of each attribute is compared with the corresponding attribute parameter range of the multiple yarns; if every attribute lies within its corresponding range, the recognized yarn count is determined to be more than one.
In practical applications, generating criteria for different yarn counts in advance not only makes the specific judgment process for the yarn count more precise, but also satisfies a variety of actual demands.
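The judgment flow of steps S301-S304, together with the priority ordering of attributes, can be sketched as follows. This Python sketch is illustrative: the attribute names, the priority order beyond area and pulse count, the example ranges, and the strict all-attributes-in-range rule (the patent additionally lets high-priority features override conflicting low-priority ones) are assumptions:

```python
# Attribute names in descending priority: area and pulse count first,
# then the auxiliary features (assumed ordering based on the description).
PRIORITY = ["area", "pulses", "perimeter", "linearity", "euler"]


def within(value, value_range):
    low, high = value_range
    return low <= value <= high


def matches_criterion(attributes, criterion, priority=PRIORITY):
    """True if every attribute present in the criterion lies in its range.

    Attributes are checked in priority order; in this simplified sketch
    a single out-of-range attribute rejects the criterion.
    """
    for attr in priority:
        if attr in criterion and not within(attributes[attr], criterion[attr]):
            return False
    return True


def classify_yarn(attributes, single_criterion, multi_criterion):
    """S301-S304: try the single-yarn criterion first, then multi-yarn."""
    if matches_criterion(attributes, single_criterion):
        return "single"
    if matches_criterion(attributes, multi_criterion):
        return "multiple"
    return "unknown"


# Example ranges (hypothetical values, not from the patent).
single_c = {"area": (120, 180), "pulses": (1, 2)}
multi_c = {"area": (250, 400), "pulses": (3, 6)}
measured = {"area": 300, "pulses": 4}
result = classify_yarn(measured, single_c, multi_c)  # -> "multiple"
```

With two cameras, this classification would be run on each camera's frame and the recognized count accepted only when both results agree, as the text above requires.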
The criterion for the yarn count determines the accuracy of the subsequent drawing-in work, and the attribute information of different yarns varies greatly. Therefore, when switching to a new type of yarn, the entire work system must be reconfigured; above all, the attribute parameter ranges must be reset for the new sample yarn. The criterion must be retrained interactively, and the new feature data obtained from the new sample yarn are used to set the new attribute parameter ranges.
The yarn recognition method provided by the embodiments of the present invention uses an image acquisition device to capture images of the yarn under test rapidly and, by means of stereoscopic vision detection, accurately determines the number of yarns, facilitating fast and efficient automatic drawing-in and improving working efficiency.
Compared with automatic drawing-in on traditional textile machines, using machine vision instead of human supervision of the working state not only improves working efficiency but also improves operating accuracy and saves human resources. Moreover, mechanized production replacing manual labor is an important measure of a country's level of scientific and technological development, so applying computer vision to automated machine production follows the trend of technological development. Machine vision is developing rapidly; it serves as the eyes of the machine, and if machines are to take over various complex tasks from people, the recognition capability of machine vision is especially important.
Advanced foreign textile machinery uses pressure-sensitive sensors to monitor the number of yarns. The main component of a pressure-sensitive sensor is a varistor, whose basic function is to convert the pressure applied to its surface into an electrical signal. However, this conversion is nonlinear, and the varistor itself suffers from drawbacks such as oxidation and susceptibility to environmental interference. Consequently, when pressure-sensitive sensors are deployed in different plant environments, debugging is complicated and maintenance is frequent, which greatly affects production efficiency and precision. A machine vision system, by contrast, adapts better to the environment, lasts longer, does not require frequent replacement of electronic components, and consumes very little power in operation. Therefore, whether compared in terms of work complexity or resource savings, a machine vision system is superior to pressure-sensitive sensors; it better suits current technological development and, with respect to later technical improvement and integration, has advantages that analog circuits cannot match.
The following example illustrates the detailed drawing-in process of the weaving machine provided in the embodiments of the present invention:
Under normal circumstances, a drawing-in machine requires that the eye of a needle pass only a single yarn during threading; therefore, the machine can work normally only when the yarn hooked by the crochet hook is a single yarn.
The basic steps of the drawing-in process are as follows:
1. Tighten a yarn on the fixing bracket, and use a brush to smooth the yarn burrs and texture;
2. Feed the laterally arranged yarns into the drawing-in machine;
3. The drawing-in machine picks up one of the yarns with a mechanical crochet hook;
4. After picking up this yarn, pull it to the designated position and judge the yarn count, to prevent multiple yarns from being moved;
5. If the vision detection module judges that the crochet hook has pulled up a single yarn, the threading feed mechanism performs the drawing-in operation.
If only the criterion for a single yarn is generated in advance, a fault alarm is issued when the vision detection module judges that multiple yarns have been pulled up, prompting the worker that the working state is abnormal.
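The step-5 decision together with this fault-alarm fallback can be sketched as follows; the function name and action strings are hypothetical, chosen only for illustration:

```python
# Hypothetical sketch of the step-5 decision: thread the needle when a
# single yarn is recognised, otherwise raise a fault alarm so the worker
# can correct the abnormal state.

SINGLE, MULTIPLE = "single", "multiple"

def drawing_in_step(recognized_count):
    """Return the action the machine should take for one hooked yarn."""
    if recognized_count == SINGLE:
        return "feed-and-thread"   # threading feed mechanism performs drawing-in
    # Only the single-yarn criterion exists: anything else is a fault.
    return "fault-alarm"           # prompt the worker: working state abnormal

print(drawing_in_step(SINGLE))    # feed-and-thread
print(drawing_in_step(MULTIPLE))  # fault-alarm
```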
During the drawing-in of the weaving machine, the yarn recognition method provided in the embodiments of the present invention detects in real time the yarn condition of the spinning machine after dividing, automatically detects the number of yarns to be measured, and thus replaces manual inspection, saving manpower and improving working efficiency.
The device provided in the embodiments of the present invention has the same realization principle and produces the same technical effect as the preceding method embodiments; for brevity, where the device embodiment omits a detail, reference may be made to the corresponding content of the preceding method embodiments. As shown in Fig. 4, another embodiment of the present invention further provides a yarn identification device, the device comprising: an acquisition module 11, an extraction module 12, a first determining module 13 and a second determining module 14.
The acquisition module 11 obtains the image frame of the yarn to be measured acquired by the image capture device;
the extraction module 12 extracts the image features of the yarn to be measured from the image frame;
the first determining module 13 determines the image attributes of the yarn to be measured according to the image features;
the second determining module 14, if the image attributes satisfy any preset attribute condition, determines the preset number of yarns corresponding to that preset attribute condition as the number of yarns recognized.
Another embodiment of the present invention further provides an electronic device comprising a memory and a processor, the memory storing a computer program runnable on the processor; the program code causes the processor to execute the yarn recognition method.
Another embodiment of the present invention further provides a computer-readable medium storing non-volatile program code executable by a processor, the program code causing the processor to execute the yarn recognition method.
Unless specifically stated otherwise, the components, relative arrangement of steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the invention.
It will be apparent to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems and devices described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
The flowcharts and block diagrams in the drawings show the possible architectures, functions and operations of the systems, methods and computer program products of multiple embodiments of the present invention. In this regard, each box in a flowchart or block diagram may represent a module, a program segment, or part of code, which contains one or more executable instructions for realizing the specified logic function. It should also be noted that, in some alternative implementations, the functions noted in the boxes may occur in an order different from that indicated in the drawings. For example, two consecutive boxes may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It is also noted that each box in the block diagrams and/or flowcharts, and combinations of boxes therein, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of special-purpose hardware and computer instructions.
The computer program product of the yarn recognition method and device provided by the embodiments of the present invention includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the method described in the preceding method embodiments. For specific implementation, refer to the method embodiments, which are not repeated here.
If the functions are realized in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention — in essence, the part that contributes over the prior art, or a part of the technical solution — may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disc, or other media capable of storing program code.
In the description of the present invention, it should be noted that terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner" and "outer" indicate orientations or positional relationships based on those shown in the drawings, merely to facilitate and simplify the description of the present invention; they do not indicate or imply that the indicated device or element must have a particular orientation or be constructed and operated in a particular orientation, and are therefore not to be construed as limiting the present invention. In addition, the terms "first", "second" and "third" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance.
Finally, it should be noted that the embodiments described above are merely specific implementations of the present invention, used to illustrate its technical solution rather than to limit it, and the scope of protection of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that any person skilled in the art may, within the technical scope disclosed by the present invention, still modify the technical solutions recorded in the foregoing embodiments, readily conceive of variations, or make equivalent replacements of some of the technical features; such modifications, variations or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be subject to the scope of protection of the claims.
Claims (10)
1. A yarn recognition method, characterized in that it is applied to a vision detection module and comprises:
obtaining an image frame of a yarn to be measured acquired by an image capture device;
extracting image features of the yarn to be measured from the image frame;
determining image attributes of the yarn to be measured according to the image features;
if the image attributes satisfy any preset attribute condition, determining the preset number of yarns corresponding to the preset attribute condition as the number of yarns recognized.
2. The yarn recognition method according to claim 1, characterized in that the image attributes include attributes such as region area, region perimeter, linear features, hole Euler number and row-scan pulse count.
3. The yarn recognition method according to claim 2, characterized in that the preset attribute condition includes an attribute parameter range, and before obtaining the image frame of the yarn to be measured acquired by the image capture device, the method includes:
obtaining a sample image sequence of one or more yarns acquired by the image capture device;
performing morphological processing on the multiple frames of the sample image sequence to obtain multiple sample image frames meeting the feature extraction condition;
determining multiple attribute parameters of the image attributes according to the image features extracted from each sample image frame;
for each attribute among the image attributes, selecting a value from the multiple corresponding attribute parameters extracted from the multiple sample image frames according to a preset value-selection rule, to obtain the attribute threshold of that attribute;
determining the preset attribute threshold of the one or more yarns according to the multiple attribute thresholds selected under the value-selection rule;
determining the attribute parameter range according to the preset attribute threshold and a predetermined deviation range.
4. The yarn recognition method according to claim 3, characterized in that the method further includes:
judging whether the image attributes satisfy any preset attribute condition;
the judging whether the image attributes satisfy any preset attribute condition includes:
judging whether every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn;
if every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn, determining that the number of yarns recognized is single;
if any attribute among the image attributes does not lie within the attribute parameter range corresponding to a single yarn, judging whether every attribute among the image attributes lies within the attribute parameter range corresponding to multiple yarns;
if every attribute among the image attributes lies within the attribute parameter range corresponding to multiple yarns, determining that the number of yarns recognized is multiple.
5. The yarn recognition method according to claim 4, characterized in that judging whether every attribute among the image attributes lies within the attribute parameter range corresponding to a single yarn includes:
comparing, in order of preset precedence information, the parameter of each attribute with the corresponding attribute parameter range within the attribute parameter range of a single yarn.
6. The yarn recognition method according to claim 5, characterized in that the image capture device includes a first camera and a second camera acquiring synchronously, the optical axis of the first camera and the optical axis of the second camera forming a predetermined acute or right angle, and obtaining the image frame of the yarn to be measured acquired by the image capture device includes:
obtaining a first image frame acquired by the first camera and a second image frame acquired by the second camera.
7. The yarn recognition method according to claim 6, characterized in that extracting the image features of the yarn to be measured from the image frame includes:
performing preprocessing, image segmentation and morphological processing on the image frame;
extracting the image features of the yarn to be measured from the morphologically processed image frame.
8. A warp yarn identification device, characterized in that it comprises:
an acquisition module for obtaining an image frame of a yarn to be measured acquired by an image capture device;
an extraction module for extracting image features of the yarn to be measured from the image frame;
a first determining module for determining image attributes of the yarn to be measured according to the image features;
a second determining module for, if the image attributes satisfy any preset attribute condition, determining the preset number of yarns corresponding to the preset attribute condition as the number of yarns recognized.
9. a kind of electronic equipment, including memory, processor, be stored in the memory to run on the processor
Computer program, which is characterized in that it is any described that said program code makes the processor execute the claim 1 to 7
Method.
10. a kind of computer-readable medium for the non-volatile program code that can perform with processor, which is characterized in that described
Program code makes the processor execute any method of the claim 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810127700.5A CN108288272A (en) | 2018-02-08 | 2018-02-08 | Yarn recognition methods and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108288272A true CN108288272A (en) | 2018-07-17 |
Family
ID=62832813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810127700.5A Pending CN108288272A (en) | 2018-02-08 | 2018-02-08 | Yarn recognition methods and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108288272A (en) |
2018-02-08: application CN201810127700.5A filed in China; published as CN108288272A (en); status: Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2832908A1 (en) * | 2013-07-30 | 2015-02-04 | Stäubli Sargans AG | Monitoring device for a weaving loom, weaving loom and method of monitoring |
CN105718989A (en) * | 2014-11-30 | 2016-06-29 | 中国科学院沈阳自动化研究所 | Bar counting method based on machine vision |
CN106596568A (en) * | 2016-12-13 | 2017-04-26 | 青岛大学 | Real-time non-contact yarn breakage detection method based on line laser |
Non-Patent Citations (3)
Title |
---|
Li Jinghua: "Operations Research: Theory, Models and Solving with Excel", 30 September 2012, Shanghai University of Finance and Economics Press *
Xu Jiayao, Sun Yuan: "Open-scene face recognition system and its applications", Command Information System and Technology *
Zhao Wenyun et al.: "Software Engineering: Methods and Practice", 31 December 2014 *
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11625822B2 (en) * | 2018-08-07 | 2023-04-11 | Ashok OSWAL | System and method for determining quality attributes of raw material of textile |
WO2020048248A1 (en) * | 2018-09-05 | 2020-03-12 | 深圳灵图慧视科技有限公司 | Textile defect detection method and apparatus, and computer device and computer-readable medium |
WO2020113773A1 (en) * | 2018-12-04 | 2020-06-11 | 深圳码隆科技有限公司 | Image recognition technology-based spinning machine fault monitoring system and method |
CN110210427A (en) * | 2019-06-06 | 2019-09-06 | 中国民航科学技术研究院 | A kind of shelter bridge working condition detection system and method based on image processing techniques |
CN110992358A (en) * | 2019-12-18 | 2020-04-10 | 北京机科国创轻量化科学研究院有限公司 | Method and device for positioning yarn rods of yarn cage, storage medium and processor |
CN110992358B (en) * | 2019-12-18 | 2023-10-20 | 北京机科国创轻量化科学研究院有限公司 | Method and device for positioning yarn rod of yarn cage, storage medium and processor |
CN111461142A (en) * | 2020-03-31 | 2020-07-28 | 广东溢达纺织有限公司 | Fabric simulation method and system based on yarn and storage medium |
CN112330595A (en) * | 2020-10-13 | 2021-02-05 | 浙江华睿科技有限公司 | Tripwire detection method and device, electronic equipment and storage medium |
CN112330595B (en) * | 2020-10-13 | 2024-04-02 | 浙江华睿科技股份有限公司 | Method and device for detecting stumbling wire, electronic equipment and storage medium |
CN112308028A (en) * | 2020-11-25 | 2021-02-02 | 四川省农业科学院蚕业研究所 | Intelligent counting method for silkworm larvae and application thereof |
CN112308028B (en) * | 2020-11-25 | 2023-07-14 | 四川省农业科学院蚕业研究所 | Intelligent silkworm larva counting method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108288272A (en) | Yarn recognition methods and device | |
CN106226325B (en) | A kind of seat surface defect detecting system and its method based on machine vision | |
CN107483014B (en) | A kind of photovoltaic panel failure automatic detection method | |
CN112837302B (en) | Method and device for monitoring state of die, industrial personal computer, storage medium and system | |
CN106529559A (en) | Pointer-type circular multi-dashboard real-time reading identification method | |
CN108982514A (en) | A kind of bionical vision detection system of casting surface defect | |
CN104949990A (en) | Online detecting method suitable for defects of woven textiles | |
CN205538710U (en) | Inductance quality automatic check out system based on machine vision | |
CN110096980A (en) | Character machining identifying system | |
US5834639A (en) | Method and apparatus for determining causes of faults in yarns, rovings and slivers | |
CN115201211A (en) | Quality control method and system for intelligent visual textile product | |
CN105839355A (en) | Washing machine and method, and device for recognizing colors of clothes in washing machine | |
CN115144399B (en) | Assembly quality detection method and device based on machine vision | |
CN108315852B (en) | Spinning machine threading method and device | |
CN115690693A (en) | Intelligent monitoring system and monitoring method for construction hanging basket | |
CN113936001B (en) | Textile surface flaw detection method based on image processing technology | |
CN105619741B (en) | A kind of mould intelligent detecting method based on Tegra K1 | |
CN110458809A (en) | A kind of yarn evenness detection method based on sub-pixel edge detection | |
CN109881356B (en) | Hosiery machine knitting needle online detection device and method based on SVM image classification | |
CN108133479B (en) | Automatic spinning machine drawing-in monitoring method and device | |
CN114782426A (en) | Knitted fabric broken yarn defect detection method based on artificial intelligence system | |
CN101598675B (en) | Method for identifying mutual information of cotton foreign fiber | |
Su et al. | Rebar automatically counting on the product line | |
CN109211937B (en) | Detection system and detection method for bending defect of elastic braid of underwear | |
Chong et al. | Fabric Defect Detection Method Based on Projection Location and Superpixel Segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180717 |