CN109300117A - Neural network system, electronic device and machine-readable medium - Google Patents

Neural network system, electronic device and machine-readable medium

Info

Publication number
CN109300117A
Authority
CN
China
Prior art keywords
network
fabric
fault
candidate region
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811036995.1A
Other languages
Chinese (zh)
Inventor
金玲玲
饶东升
何文玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Lingtu Huishi Technology Co Ltd
Original Assignee
Shenzhen Lingtu Huishi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Lingtu Huishi Technology Co Ltd
Priority to CN201811036995.1A
Publication of CN109300117A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30124 Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

This application discloses a neural network system, an electronic device and a machine-readable medium for fabric defect detection. The neural network system includes: a candidate region generation network, which identifies, locates and segments the objects on the fabric surface contained in an image to be detected, so as to generate candidate regions containing those objects; a fabric attribute feature acquisition network, which obtains the fabric attribute features of the objects in the candidate regions that belong to fabric attributes; and an attribute/defect classification network, which detects, based on the candidate regions and the fabric attribute features, whether a valid defect region exists in the candidate regions. The neural network system, electronic device and machine-readable medium can improve the accuracy of fabric defect detection.

Description

Neural network system, electronic device and machine-readable medium
Technical field
This application relates to the technical field of fabric inspection, and in particular to a neural network system and an electronic device for fabric defect detection.
Background art
On production lines for fabrics such as woven fabrics, knitted fabrics and non-woven fabrics, the produced fabric needs to be inspected for defects, for example spots, holes or fuzzing on the fabric.
The current detection method mainly relies on inspectors standing in front of cloth inspection machines, finding fabric defects with the naked eye and marking or recording them. When fabric output is very large, inspection by human inspectors is extremely labor-intensive; moreover, inspectors tire easily after working for a period of time, which makes false detections possible. As a result, the overall defect detection efficiency of manual inspection is low and its detection accuracy is not sufficiently stable.
In the related art, computer-based cloth inspection is mainly realized through machine vision and defect classification methods. That is, an image of the cloth to be inspected is captured by photography, a number of classes are preset, a detection model determines, from the features of the image, the probability that the image belongs to each of these classes, and the class with the highest probability is taken as the class of the image, thereby obtaining the class of the defect shown in the cloth image. However, current computer-based cloth inspection methods do not take the attribute features of the fabric itself (such as prints and textures) into account and often misdetect fabric attribute features as defects, resulting in very low detection accuracy.
Summary of the invention
In view of the above problems, embodiments of the present invention provide a neural network system and an electronic device for fabric defect detection, which can solve the technical problems mentioned in the background section above.
A neural network system for fabric defect detection according to an embodiment of the invention comprises: a candidate region generation network, which identifies, locates and segments the objects on the fabric surface contained in an image to be detected, so as to generate candidate regions containing those objects; a fabric attribute feature acquisition network, which obtains the fabric attribute features of the objects in the candidate regions that belong to fabric attributes; and an attribute/defect classification network, which detects, based on the candidate regions and the fabric attribute features, whether a valid defect region exists in the candidate regions.
In another embodiment of the above system of the present invention, the system further includes: a convolutional feature extraction network, which extracts feature maps of the candidate regions; and a target classification and regression network, which extracts region features from the feature maps to determine whether a defect is present in the valid defect region.
In another embodiment of the above system of the present invention, the target classification and regression network is further used to perform defect class discrimination and defect bounding-box regression refinement.
In another embodiment of the above system of the present invention, the convolutional feature extraction network includes ZFNet, AlexNet or GoogLeNet.
In another embodiment of the above system of the present invention, the candidate region generation network uses a region proposal network (RPN).
In another embodiment of the above system of the present invention, the attribute/defect classification network includes a sequentially connected ROI pooling layer, several hidden layers and a softmax layer.
In another embodiment of the above system of the present invention, the fabric attribute feature acquisition network includes a temporal neural network or a temporal-residual neural network; the temporal-residual neural network is a neural network formed by adding a residual network to each basic unit of a temporal neural network, and the residual network weights the output of the basic unit at the previous time step and superimposes it on the output of the basic unit at the current time step.
In another embodiment of the above system of the present invention, the temporal neural network model includes a recurrent neural network model, a long short-term memory model or a gated recurrent unit model; the temporal-residual neural network model includes a recurrent-residual neural network model, a long short-term-memory-residual neural network model or a gated-recurrent-unit-residual neural network model.
An electronic device according to an embodiment of the invention includes a processor and a memory storing an executable program that can run on the processor, wherein the processor, when executing the executable program, realizes the functions of the aforementioned system.
A machine-readable medium according to an embodiment of the invention stores an executable program, wherein the executable program, when executed, causes a machine to realize the functions of the aforementioned system.
It can be seen from the above that the neural network system and electronic device of the embodiments of the present invention use the candidate region generation network to generate candidate regions of objects on the fabric surface, use the fabric attribute feature acquisition network to obtain the fabric attribute features of the objects in the candidate regions that belong to fabric attributes, and use the attribute/defect classification network to classify the candidate regions, detecting the valid defect regions that belong to defects and excluding the invalid fabric-attribute regions that belong to fabric attributes, so that the accuracy of fabric defect detection can be improved.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of a neural network system for fabric defect detection according to an embodiment of the invention;
Fig. 2 is a schematic structural diagram of a basic neural network unit of the RNN-ResNet model according to an embodiment of the invention;
Fig. 3 is a schematic structural diagram of a basic neural network unit of the LSTM-ResNet model according to an embodiment of the invention;
Fig. 4 is a schematic structural diagram of the attribute/defect classification network according to an embodiment of the invention;
Fig. 5 is a schematic structural diagram of a neural network system for fabric defect detection according to another embodiment of the invention;
Fig. 6 is a schematic structural diagram of the target classification and regression network according to an embodiment of the invention;
Fig. 7 is a flowchart of a method for training the neural network system according to an embodiment of the invention;
Fig. 8 is a schematic diagram of an electronic device according to an embodiment of the invention.
Detailed description of the embodiments
The subject matter described herein is now discussed with reference to example embodiments. It should be understood that these embodiments are discussed only to enable those skilled in the art to better understand and implement the subject matter described herein, and are not intended to limit the protection scope, applicability or examples set forth in the claims. The functions and arrangements of the elements discussed can be changed without departing from the protection scope of the present disclosure. Various processes or components can be omitted, substituted or added in each example as needed. In addition, features described with respect to some examples can also be combined in other examples.
It should also be understood that, for ease of description, the sizes of the various parts shown in the accompanying drawings are not drawn according to actual proportional relationships.
It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further discussed in subsequent drawings.
It should also be understood that, in the example embodiments, "A is connected to B" can mean that A is directly connected to B, or that A and B are indirectly connected through one or more other units/components; the example embodiments of the present invention are not limited in this respect.
The embodiments of the present disclosure can be applied to computer systems/servers, which can operate together with numerous other general-purpose or special-purpose computing system environments or configurations. Examples of well-known computing systems, environments and/or configurations suitable for use with computer systems/servers include, but are not limited to: personal computer systems, server computer systems, thin clients, handheld or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments including any of the above systems, and the like.
Computer systems/servers can be described in the general context of computer-system-executable instructions (such as program modules) executed by a computer system. Generally, program modules may include routines, programs, target programs, components, logic, data structures and the like, which perform specific tasks or implement specific abstract data types. Computer systems/servers can be implemented in distributed cloud computing environments, where tasks are performed by remote processing devices linked through a communication network. In a distributed cloud computing environment, program modules may be located on local or remote computing system storage media including storage devices.
Fig. 1 shows a schematic structural diagram of a neural network system for fabric defect detection according to an embodiment of the invention. As shown in Fig. 1, the system includes a candidate region generation network 11, a fabric attribute feature acquisition network 12 and an attribute/defect classification network 13, where the output of the candidate region generation network 11 is connected to the inputs of the fabric attribute feature acquisition network 12 and the attribute/defect classification network 13, and the output of the fabric attribute feature acquisition network 12 is connected to another input of the attribute/defect classification network 13.
In one or more optional embodiments, the candidate region generation network 11 identifies and locates the objects on the fabric surface contained in the image to be detected, so as to generate candidate regions containing those objects.
Optionally, the image to be detected can be acquired by photographing the fabric surface to be inspected with a CCD industrial camera, where a CCD (charge-coupled device) is the semiconductor component used in a digital camera to record light variations.
Optionally, candidate regions can be generated using a region proposal network (RPN) or using recognition, localization and image segmentation algorithms. RPN and recognition, localization and image segmentation algorithms are prior art, and their description is omitted here.
It should be understood that the objects contained in the candidate regions may include defects and/or fabric attributes of the fabric surface. Fabric attributes are, for example but not limited to, attributes that the fabric itself has, such as prints, textures, jacquard and patterns. Defects include, for example but not limited to, spots, yarn defects, weaving defects, printing and dyeing defects, selvage defects, creases, weft skew, holes, snags, uneven sanding, blurring, fuzzing, abrasions, roll marks and stop marks. Among these, spots include oil stains, rust stains, color dots, stains, mildew and auxiliary-agent stains; yarn defects include dead cotton, slubs, fly, thick and thin yarn, soiled yarn and uneven yarn evenness; weaving defects include broken yarn, knots, dropped stitches, broken needles, uneven elastic, broken elastic yarn, unstable gauge, snaking of the cloth surface, exposed filling, colored fibers, needle lines, wrong yarn, yarn marks, elastic show-through and exposed elastic; printing and dyeing defects include color bleeding, print displacement, print staining, print strike-through, poor printing, uneven dyeing, two-tone color, color loss and color difference; selvage defects include pinholes, double pinholes, curled edges, damaged edges, narrow width and excessive width; creases include center creases, cloth-surface wrinkles and fold lines; weft skew includes categories such as twill skew and bowed weft.
The fabric attribute feature acquisition network 12 obtains the fabric attribute features of the objects in the candidate regions that belong to fabric attributes. Its input is the candidate regions generated by the candidate region generation network 11, and its output is the fabric attribute features.
Optionally, the fabric attribute feature acquisition network 12 can use a temporal neural network or a temporal-residual neural network; the temporal-residual neural network is a neural network formed by adding a residual network to each basic unit of a temporal neural network, and the residual network weights the output of the basic unit at the previous time step and superimposes it on the output of the basic unit at the current time step.
The temporal neural network includes a recurrent neural network (RNN), a long short-term memory network (LSTM) or a gated recurrent unit network (GRU). Correspondingly, the temporal-residual neural network includes a recurrent-residual neural network (RNN-ResNet: Recurrent Neural Network-Residual Network), a long short-term-memory-residual neural network (LSTM-ResNet) or a gated-recurrent-unit-residual neural network (GRU-ResNet). The temporal-residual neural network can solve the problem of vanishing gradients (diffusion of gradients) that occurs in temporal neural networks; RNN-ResNet is used as an example below.
Fig. 2 is a schematic structural diagram of the basic neural network unit of RNN-ResNet provided in this embodiment. After the residual connection is added, the basic neural network unit is computed as follows:
s_t = f(U·x_t + W·s_{t-1}) + α·s_{t-1}
o_t = softmax(V·s_t)
where x_t is the external input at time t, s_t is the memory output of the RNN-ResNet unit at time t, U, V and W are network parameters, f can be a function such as tanh, o_t is the output at time t, and α is the residual coefficient.
It can be understood that adding the residual coefficient α to the RNN basic unit adds a term α·s_{t-1} to the memory output s_t of the unit, so that the output of the RNN at the previous time step is weighted and superimposed on the current output. When α is 0, the unit is an ordinary RNN basic unit; when α is 1, f(U·x_t + W·s_{t-1}) in the RNN basic unit is equivalent to learning s_t − s_{t-1}, which introduces the residual mechanism; 0 < α < 1 is a compromise between the two cases.
This embodiment uses the RNN-ResNet model because, with an ordinary RNN model, when the number of layers is large, the magnitude of the back-propagated gradient (from the output layer back to the first few layers of the network) shrinks sharply as the number of layers increases when derivatives are computed by back-propagation. As a result, the derivative of the overall loss function with respect to the weights of the first few layers becomes very small, so that when gradient descent is used the weights of the first few layers change very slowly and cannot learn effectively from the training samples, which produces the vanishing gradient phenomenon. The RNN-ResNet model instead adds a ResNet connection to the RNN; the ResNet weights the output of the RNN at the previous time step and superimposes it on the current output, which makes deeper neural networks easier to train.
Similarly, Fig. 3 is a schematic structural diagram of the basic neural network unit of LSTM-ResNet provided in this embodiment. As shown in Fig. 3, adding ResNet essentially adds, for the LSTM basic unit, a term α·s_{t-1} to the output s_t, so that the output of the LSTM unit at the previous time step is weighted onto the output at the current time step. The principle of the GRU-ResNet model is the same as above and its description is omitted here.
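As an illustration of the unit described by the formulas above, the following is a minimal PyTorch sketch of a residual recurrent cell, i.e. s_t = f(U·x_t + W·s_{t-1}) + α·s_{t-1} followed by o_t = softmax(V·s_t). The class name, layer sizes, the choice of tanh for f and the value of α are assumptions made for the example, not values specified by the patent.

```python
import torch
import torch.nn as nn


class ResidualRNNCell(nn.Module):
    """Sketch of the RNN-ResNet basic unit: s_t = f(U x_t + W s_{t-1}) + alpha * s_{t-1}."""

    def __init__(self, input_size: int, hidden_size: int, output_size: int, alpha: float = 0.5):
        super().__init__()
        self.U = nn.Linear(input_size, hidden_size, bias=False)   # input-to-hidden weights
        self.W = nn.Linear(hidden_size, hidden_size, bias=False)  # hidden-to-hidden weights
        self.V = nn.Linear(hidden_size, output_size, bias=False)  # hidden-to-output weights
        self.alpha = alpha                                        # residual coefficient

    def forward(self, x_t: torch.Tensor, s_prev: torch.Tensor):
        # Residual memory update: the previous state is weighted and added back.
        s_t = torch.tanh(self.U(x_t) + self.W(s_prev)) + self.alpha * s_prev
        o_t = torch.softmax(self.V(s_t), dim=-1)
        return o_t, s_t


# Unrolling the cell over a sequence of candidate-region features (hypothetical sizes).
cell = ResidualRNNCell(input_size=64, hidden_size=128, output_size=32, alpha=0.5)
s = torch.zeros(1, 128)
for x in torch.randn(10, 1, 64):   # 10 time steps, batch size 1
    o, s = cell(x, s)
```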
The attribute/defect classification network 13 detects, based on the candidate regions and the fabric attribute features, whether a valid defect region exists in the candidate regions.
The inputs of the attribute/defect classification network 13 are the candidate regions generated by the candidate region generation network 11 and the fabric attribute features obtained by the fabric attribute feature acquisition network 12, and its output is the valid defect regions. The attribute/defect classification network 13 performs a binary attribute/defect classification of the candidate regions according to the fabric attribute features, then excludes the invalid candidate regions identified as fabric attributes and retains the valid candidate regions identified as defects. The attribute/defect classification network 13 may include a sequentially connected ROI pooling layer, several hidden layers and a softmax layer.
Fig. 4 is a schematic structural diagram of an embodiment of the attribute/defect classification network. As shown in Fig. 4, the attribute/defect classification network 13 may include a sequentially connected ROI pooling layer, three fully connected layers fc1, fc2 and fc3 (hidden layers) and a softmax layer. In a specific implementation, the network parameters of the attribute/defect classification network 13 can be as shown in Table 1.
Table 1. Network parameters of the attribute/defect classification network in this embodiment
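As a rough sketch of the structure in Fig. 4 (an ROI pooling layer followed by three fully connected layers and a softmax), the snippet below uses torchvision's roi_pool operator. The pooled size, layer widths and the two output classes (fabric attribute vs. defect) are assumptions for illustration; the actual parameters belong to Table 1, which is not reproduced here, and the way the fabric attribute features are fused into the head is not shown.

```python
import torch
import torch.nn as nn
from torchvision.ops import roi_pool


class AttributeDefectClassifier(nn.Module):
    """Sketch: ROI pooling -> fc1 -> fc2 -> fc3 -> softmax over {fabric attribute, defect}."""

    def __init__(self, in_channels: int = 256, pooled: int = 7):
        super().__init__()
        self.pooled = pooled
        self.fc1 = nn.Linear(in_channels * pooled * pooled, 1024)
        self.fc2 = nn.Linear(1024, 256)
        self.fc3 = nn.Linear(256, 2)  # two classes: fabric attribute vs. defect

    def forward(self, feature_map: torch.Tensor, boxes):
        # boxes: per-image candidate regions in (x1, y1, x2, y2) format.
        pooled = roi_pool(feature_map, boxes, output_size=(self.pooled, self.pooled))
        x = pooled.flatten(start_dim=1)
        x = torch.relu(self.fc1(x))
        x = torch.relu(self.fc2(x))
        return torch.softmax(self.fc3(x), dim=-1)


# Usage: one image, a 256-channel feature map and three candidate boxes (hypothetical).
head = AttributeDefectClassifier()
features = torch.randn(1, 256, 50, 50)
candidates = [torch.tensor([[0.0, 0.0, 20.0, 20.0],
                            [5.0, 5.0, 30.0, 30.0],
                            [10.0, 0.0, 40.0, 25.0]])]
scores = head(features, candidates)   # shape (3, 2)
```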
It can be seen from the above that the neural network system of the embodiment of the present invention uses the candidate region generation network to generate candidate regions of objects on the fabric surface, uses the fabric attribute feature acquisition network to obtain the fabric attribute features of the objects in the candidate regions that belong to fabric attributes, and uses the attribute/defect classification network to classify the candidate regions, detecting the valid defect regions that belong to defects and excluding the invalid fabric-attribute regions that belong to fabric attributes, so that the accuracy of fabric defect detection can be improved.
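To make the data flow of Fig. 1 concrete, the fragment below composes the three sub-networks as placeholder modules passed in by the caller; the module interfaces and names are illustrative assumptions rather than the patent's specification.

```python
import torch.nn as nn


class FabricDefectSystem(nn.Module):
    """Sketch of the Fig. 1 data flow: proposals -> attribute features -> attribute/defect decision."""

    def __init__(self, region_net: nn.Module, attribute_net: nn.Module, classifier: nn.Module):
        super().__init__()
        self.region_net = region_net        # candidate region generation network 11
        self.attribute_net = attribute_net  # fabric attribute feature acquisition network 12
        self.classifier = classifier        # attribute/defect classification network 13

    def forward(self, image):
        regions = self.region_net(image)               # candidate regions of surface objects
        attr_features = self.attribute_net(regions)    # features of objects that are fabric attributes
        return self.classifier(regions, attr_features) # valid defect regions
```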
Fig. 5 shows a schematic structural diagram of a neural network system for fabric defect detection according to another embodiment of the invention. As shown in Fig. 5, the system includes a candidate region generation network 11, a fabric attribute feature acquisition network 12, an attribute/defect classification network 13, a convolutional feature extraction network 14 and a target classification and regression network 15. The output of the candidate region generation network 11 is connected to the inputs of the fabric attribute feature acquisition network 12, the attribute/defect classification network 13 and the convolutional feature extraction network 14; the output of the fabric attribute feature acquisition network 12 is connected to another input of the attribute/defect classification network 13; and the two inputs of the target classification and regression network 15 are connected to the outputs of the attribute/defect classification network 13 and the convolutional feature extraction network 14, respectively.
In one or more optional embodiments, the candidate region generation network 11 identifies and locates the objects on the fabric surface contained in the image to be detected, so as to generate candidate regions containing those objects. The fabric attribute feature acquisition network 12 obtains the fabric attribute features of the objects in the candidate regions that belong to fabric attributes. The attribute/defect classification network 13 detects, based on the candidate regions and the fabric attribute features, whether a valid defect region exists in the candidate regions.
The convolutional feature extraction network 14 extracts feature maps of the candidate regions generated by the candidate region generation network. Optionally, the convolutional feature extraction network 14 can use ZFNet, AlexNet or GoogLeNet. Feature-map extraction by the convolutional feature extraction network 14 is prior art and its description is omitted here.
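One plausible way to realize such a feature extractor is to reuse the convolutional trunk of an existing backbone. The sketch below uses torchvision's AlexNet, since AlexNet is one of the options named above; the use of pretrained ImageNet weights, the truncation at `.features` and the 224x224 crop size are assumptions of this illustration (torchvision >= 0.13 is assumed for the weights API).

```python
import torch
import torch.nn as nn
from torchvision import models


def build_feature_extractor() -> nn.Module:
    """Return the convolutional layers of AlexNet as a feature-map extractor."""
    backbone = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
    return backbone.features  # drop the classifier head; keep the convolutional trunk


extractor = build_feature_extractor().eval()
with torch.no_grad():
    crop = torch.randn(1, 3, 224, 224)   # one candidate region resized to 224x224
    feature_map = extractor(crop)        # shape (1, 256, 6, 6) for AlexNet
```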
The target classification and regression network 15 extracts region features from the feature maps produced by the convolutional feature extraction network, according to the valid defect regions output by the attribute/defect classification network, and thereby determines whether a defect is present in the valid defect region.
Optionally, the target classification and regression network 15 is also used to perform defect class discrimination and defect bounding-box regression refinement, so that when a defect is determined to be present, the defect class can be discriminated and defect class information can be output.
Fig. 6 shows a schematic structural diagram of the target classification and regression network of an embodiment. In a specific implementation, the network parameters of the target classification and regression network 15 can be as shown in Table 2.
Table 2. Network parameters of the target classification and regression network in this embodiment
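A common shape for such a classification-and-regression head is a shared trunk with two parallel branches, one emitting per-class scores and one emitting bounding-box offsets; the sketch below follows that pattern. The layer sizes, the number of defect classes and the extra background class are assumptions for illustration; the actual parameters belong to Table 2, which is not reproduced here.

```python
import torch
import torch.nn as nn


class DefectClsRegHead(nn.Module):
    """Sketch: shared features feed a defect-class branch and a bounding-box regression branch."""

    def __init__(self, in_features: int = 1024, num_classes: int = 15):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_features, 512), nn.ReLU())
        self.cls_branch = nn.Linear(512, num_classes + 1)   # defect classes + background
        self.reg_branch = nn.Linear(512, 4 * num_classes)   # (dx, dy, dw, dh) per class

    def forward(self, region_features: torch.Tensor):
        shared = self.trunk(region_features)
        class_scores = self.cls_branch(shared)   # defect class discrimination
        box_deltas = self.reg_branch(shared)     # bounding-box regression refinement
        return class_scores, box_deltas


head = DefectClsRegHead()
scores, deltas = head(torch.randn(8, 1024))   # 8 valid defect regions (hypothetical)
```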
Fig. 7 shows the flow chart of the method for nerve network system training of one embodiment according to the invention. Method 100 shown in Fig. 7 corresponds to the training stage, obtains the nerve net for fabric defects detection using training data training Network system.Method 100 shown in Fig. 7 can by computer or other suitably there is the electronic equipment of computing capability to realize.
As shown in fig. 7, receiving the image of multiple original shootings in box 102.Wherein, the image of multiple original shooting Normal picture including multiple no faults and it is multiple have the problem of fault image, the multiple normal picture includes having continuity Multiple images and do not have successional multiple images.At least one fabric category is spliced to form with successional multiple images Property circulation, such as institutional framework circulation, stamp circulation.
In box 104, image labeling (Image Annotation) processing is executed to the image of multiple original shooting, with Obtain first sample image set SP1.Wherein, each of first sample image set SP1 sample image is to multiple original One of image of the image of shooting executes what image labeling was handled.Image labeling processing is known technology, herein Omit descriptions thereof.Each image can contain the markup information of one or more attributes, such as the mark about attribute is believed It ceases, about the markup information of fault.
In box 106, gray processing processing is executed to first sample image set SP1, it will be in first sample image set SP1 Each sample image is converted to gray level image.
In box 108, some or all sample images are chosen from the first sample image set SP1 that gray processing is handled and are made For drawing of seeds picture.
In box 110, one or many angularly rotations, mirror image are executed to each drawing of seeds picture and/or other are suitable Operation, with from obtaining one or more images derived from each drawing of seeds picture.Wherein, the first sample of gray processing processing Sample image in image set SP1 and the second sample graph image set is together to form from the image obtained derived from each drawing of seeds picture SP2。
By the operation of box 108 and 110, the quantity of training sample can be increased (for example, can be by 2500 sample graphs As obtaining the sample image more than 50000 or even 100000 after treatment), and with the increase of training samples, finally The neural network model that training obtains has higher accuracy in detection.
Box 106-110 constitutes the image preprocessing process (Image Preprocessing) of method 100.
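The sketch uses Pillow, assumes the seed images live in a hypothetical seeds/ directory, and takes 90-degree rotations plus a horizontal mirror as the derivation operations; the patent leaves the exact angles and operations open.

```python
from pathlib import Path
from PIL import Image


def derive_images(seed_path: str, angles=(90, 180, 270)):
    """Grayscale a seed image, then derive rotated and mirrored variants from it."""
    seed = Image.open(seed_path).convert("L")               # block 106: grayscale conversion
    derived = [seed]
    for angle in angles:                                    # block 110: equal-angle rotations
        derived.append(seed.rotate(angle, expand=True))
    derived.append(seed.transpose(Image.Transpose.FLIP_LEFT_RIGHT))  # block 110: mirroring
    return derived


# Hypothetical usage over a directory of seed images (block 108).
sp2 = []
for path in Path("seeds").glob("*.png"):
    sp2.extend(derive_images(str(path)))
```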
In block 112, attribute parameters of each grayscale image in the second sample image set SP2 are obtained, where the attribute parameters include but are not limited to the length, width, etc. of the image.
In block 114, a plurality of images whose attribute parameters satisfy each rule in a first rule set are selected from the second sample image set SP2 as a third sample image set SP3 for training the fabric attribute feature acquisition network. The first rule set defines the conditions that sample images suitable for training the fabric attribute feature acquisition network need to satisfy; for example, the first rule set defines the length limits, width limits, etc. that such sample images need to satisfy. The third sample image set SP3 includes a plurality of images with continuity.
Under normal conditions, part of the surface attributes of a fabric follow a cyclically repeating pattern. By training the fabric attribute feature acquisition network with a plurality of images with continuity, the memory capability of the fabric attribute feature acquisition network can be used to train the detection of regular fabric attributes.
In block 116, a plurality of images whose attribute parameters satisfy each rule in a second rule set are selected from the second sample image set SP2 as a fourth sample image set SP4 for training the attribute/defect classification network. The second rule set defines the conditions that sample images suitable for training the attribute/defect classification network need to satisfy; for example, the second rule set defines the length limits, width limits, etc. that such sample images need to satisfy. The fourth sample image set SP4 includes a plurality of normal images and a plurality of problem images.
In block 218, a plurality of images whose attribute parameters satisfy each rule in a third rule set are selected from the second sample image set SP2 as a fifth sample image set SP5 for training the convolutional feature extraction network and the target classification and regression network. The third rule set defines the conditions that sample images suitable for training the convolutional feature extraction network and the target classification and regression network need to satisfy; for example, the third rule set defines the length limits, width limits, etc. that such sample images need to satisfy. The fifth sample image set SP5 includes a plurality of problem images covering multiple defect classes.
In block 220, the fabric attribute feature acquisition network is trained using the third sample image set SP3 as training data.
In block 222, the attribute/defect classification network is trained using the fourth sample image set SP4 as training data.
In block 224, the convolutional feature extraction network and the target classification and regression network are trained using the fifth sample image set SP5 as training data.
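The three blocks above amount to three separate supervised training runs, one per sub-network and sample set. The generic loop below is a sketch of that idea; the datasets built from SP3, SP4 and SP5, the loss functions, the optimizer and the hyperparameters are all illustrative assumptions.

```python
import torch
from torch.utils.data import DataLoader


def train(model, dataset, loss_fn, epochs: int = 10, lr: float = 1e-3):
    """Generic supervised training loop shared by the three stages (blocks 220-224)."""
    loader = DataLoader(dataset, batch_size=16, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for inputs, targets in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), targets)
            loss.backward()
            optimizer.step()
    return model


# Hypothetical datasets and models (names reuse the earlier sketches):
# attribute_net = train(attribute_net, sp3_dataset, torch.nn.MSELoss())
# classifier    = train(classifier, sp4_dataset, torch.nn.CrossEntropyLoss())
# cls_reg_head  = train(cls_reg_head, sp5_dataset, torch.nn.CrossEntropyLoss())
```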
Fig. 8 shows a schematic diagram of an electronic device according to an embodiment of the invention. As shown in Fig. 8, the electronic device 200 may include a processor 202 and a memory 204, where the memory 204 stores an executable program that can run on the processor 202, and the processor 202, when executing the executable program, realizes the functions of the neural network system for fabric defect detection of any of the above embodiments of the present invention.
In one aspect, the processor 202 is arranged in a processing component 201, and the electronic device 200 may further include one or more of the following components: a power supply component 203, a multimedia component 205, an audio component 207, an input/output (I/O) interface 209, a sensor component 211 and a communication component 213.
The power supply component 203 provides power for the various components of the electronic device 200. The power supply component 203 may include a power management system and one or more power supplies.
The multimedia component 205 includes a display screen that provides an output interface between the electronic device 200 and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides and gestures on the touch panel. The touch sensor can not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide operation.
The audio component 207 is configured to output and/or input audio signals. For example, the audio component 207 includes a microphone (MIC) configured to receive external audio signals. In some embodiments, the audio component 207 further includes a loudspeaker for outputting audio signals.
The I/O interface 209 provides an interface between the processor 202 and peripheral interface modules. The peripheral interface modules can be a click wheel, buttons and the like. These buttons may include, but are not limited to: a volume button, a start button and a lock button.
The sensor component 211 includes one or more sensors. The sensor component 211 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. In some embodiments, the sensor component 211 may also include a camera and the like.
The communication component 213 is configured to facilitate wired or wireless communication between the electronic device 200 and other devices.
The electronic device 200 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the electronic device 200 may include a SIM card slot for inserting a SIM card, so that the electronic device 200 can log in to a GPRS network and establish communication with a server through the internet.
An embodiment of the present invention also provides a machine-readable medium storing an executable program, wherein the executable program, when executed, causes a machine to realize the functions of the neural network system for fabric defect detection of any of the above embodiments of the present invention.
Those skilled in the art will understand that the embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, etc.) containing computer-usable program code.
The specific embodiments described above with reference to the accompanying drawings describe exemplary embodiments, but do not represent all embodiments that may be implemented or that fall within the protection scope of the claims. The term "exemplary" used throughout this specification means "serving as an example, instance or illustration" and does not mean "preferred" or "advantageous" over other embodiments. The specific embodiments include specific details for the purpose of providing an understanding of the described technology; however, these technologies can be implemented without these details. In some instances, well-known structures and devices are shown in block diagram form in order to avoid obscuring the concepts of the described embodiments.
The foregoing description of the present disclosure is provided so that any person of ordinary skill in the art can implement or use the present disclosure. Various modifications to the present disclosure will be apparent to those skilled in the art, and the general principles defined herein can also be applied to other modifications without departing from the protection scope of the present disclosure. Therefore, the present disclosure is not limited to the examples and designs described herein, but conforms to the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A neural network system for fabric defect detection, comprising:
a candidate region generation network, which identifies and locates objects on the fabric surface contained in an image to be detected, so as to generate candidate regions containing the objects;
a fabric attribute feature acquisition network, which obtains the fabric attribute features of the objects contained in the candidate regions that belong to fabric attributes;
an attribute/defect classification network, which detects, according to the candidate regions and the fabric attribute features, whether a valid defect region exists in the candidate regions.
2. The system according to claim 1, wherein the system further comprises:
a convolutional feature extraction network, which extracts feature maps of the candidate regions;
a target classification and regression network, which extracts region features from the feature maps to determine whether a defect is present in the valid defect region.
3. The system according to claim 2, wherein the target classification and regression network is further configured to perform defect class discrimination and defect bounding-box regression refinement.
4. The system according to claim 2, wherein
the convolutional feature extraction network includes ZFNet, AlexNet or GoogLeNet.
5. The system according to any one of claims 1-4, wherein
the candidate region generation network uses a region proposal network (RPN).
6. The system according to any one of claims 1-4, wherein
the attribute/defect classification network includes a sequentially connected ROI pooling layer, several hidden layers and a softmax layer.
7. The system according to any one of claims 1-4, wherein
the fabric attribute feature acquisition network includes a temporal neural network or a temporal-residual neural network; wherein the temporal-residual neural network is a neural network formed by adding a residual network to each basic unit of a temporal neural network, and the residual network weights the output of the basic unit at the previous time step and superimposes it on the output of the basic unit at the current time step.
8. The system according to claim 7, wherein
the temporal neural network model includes a recurrent neural network model, a long short-term memory model or a gated recurrent unit model; the temporal-residual neural network model includes a recurrent-residual neural network model, a long short-term-memory-residual neural network model or a gated-recurrent-unit-residual neural network model.
9. An electronic device, comprising:
a processor, and
a memory storing an executable program that can run on the processor, wherein the processor, when executing the executable program, realizes the functions of the system according to any one of claims 1-8.
10. A machine-readable medium storing an executable program, wherein the executable program, when executed, causes a machine to realize the functions of the system according to any one of claims 1-8.
CN201811036995.1A 2018-09-05 2018-09-05 Neural network system, electronic device and machine-readable medium Pending CN109300117A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811036995.1A CN109300117A (en) 2018-09-05 2018-09-05 Nerve network system, electronic equipment and machine readable media

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811036995.1A CN109300117A (en) 2018-09-05 2018-09-05 Nerve network system, electronic equipment and machine readable media

Publications (1)

Publication Number Publication Date
CN109300117A true CN109300117A (en) 2019-02-01

Family

ID=65166293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811036995.1A Pending CN109300117A (en) 2018-09-05 2018-09-05 Nerve network system, electronic equipment and machine readable media

Country Status (1)

Country Link
CN (1) CN109300117A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020048248A1 (en) * 2018-09-05 2020-03-12 深圳灵图慧视科技有限公司 Textile defect detection method and apparatus, and computer device and computer-readable medium
CN112215791A (en) * 2019-07-12 2021-01-12 宝洁公司 System and method for providing textile information and visualizing the same
CN112465810A (en) * 2020-12-15 2021-03-09 华南农业大学 Method for detecting and classifying defects of textiles

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106996935A (en) * 2017-02-27 2017-08-01 华中科技大学 A kind of multi-level fuzzy judgment Fabric Defects Inspection detection method and system
CN107123114A (en) * 2017-04-21 2017-09-01 佛山市南海区广工大数控装备协同创新研究院 A kind of cloth defect inspection method and device based on machine learning
CN107870172A (en) * 2017-07-06 2018-04-03 黎明职业大学 A kind of Fabric Defects Inspection detection method based on image procossing

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106996935A (en) * 2017-02-27 2017-08-01 华中科技大学 A kind of multi-level fuzzy judgment Fabric Defects Inspection detection method and system
CN107123114A (en) * 2017-04-21 2017-09-01 佛山市南海区广工大数控装备协同创新研究院 A kind of cloth defect inspection method and device based on machine learning
CN107870172A (en) * 2017-07-06 2018-04-03 黎明职业大学 A kind of Fabric Defects Inspection detection method based on image procossing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘祥惠: "Research on defect region localization algorithms for fabric images based on deep learning", China Master's Theses Full-text Database, Engineering Science and Technology I *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020048248A1 (en) * 2018-09-05 2020-03-12 深圳灵图慧视科技有限公司 Textile defect detection method and apparatus, and computer device and computer-readable medium
CN112215791A (en) * 2019-07-12 2021-01-12 宝洁公司 System and method for providing textile information and visualizing the same
WO2021008464A1 (en) * 2019-07-12 2021-01-21 The Procter & Gamble Company System and method for providing textile information and visualizing same
JP2022536179A (en) * 2019-07-12 2022-08-12 ザ プロクター アンド ギャンブル カンパニー Systems and methods for providing and visualizing textile information
JP7451571B2 (en) 2019-07-12 2024-03-18 ザ プロクター アンド ギャンブル カンパニー System and method for providing textile information and visualizing it
CN112465810A (en) * 2020-12-15 2021-03-09 华南农业大学 Method for detecting and classifying defects of textiles

Similar Documents

Publication Publication Date Title
CN111028204B (en) Cloth defect detection method based on multi-mode fusion deep learning
CN105518709B (en) The method, system and computer program product of face for identification
CN109300117A (en) Nerve network system, electronic equipment and machine readable media
CN109187579A (en) Fabric defect detection method and device, computer equipment and computer-readable medium
Zhao et al. Combing rgb and depth map features for human activity recognition
CN109670452A (en) Method for detecting human face, device, electronic equipment and Face datection model
Kampouris et al. Fine-grained material classification using micro-geometry and reflectance
CN109359539A (en) Attention appraisal procedure, device, terminal device and computer readable storage medium
CN109598234A (en) Critical point detection method and apparatus
Qu et al. Defect detection on the fabric with complex texture via dual-scale over-complete dictionary
CN104200478B (en) Low-resolution touch screen image defect detection method based on sparse representation
CN107463965A (en) Fabric attribute picture collection and recognition methods and identifying system based on deep learning
CN109376631A (en) A kind of winding detection method and device neural network based
CN110008816A (en) A kind of method that real-time detection baby kicks quilt son
CN109035248A (en) Defect detection method, apparatus, terminal device, server and storage medium
CN109410192A (en) A kind of the fabric defect detection method and its device of multi-texturing level based adjustment
CN104461801B (en) A kind of method of testing and system of touch-screen susceptibility
CN111445426B (en) Target clothing image processing method based on generation of countermeasure network model
CN104850457B (en) The rapid loading display method and system of large nuber of images in a kind of associated diagram
CN108875331A (en) Face unlocking method, device and system and storage medium
CN109670517A (en) Object detection method, device, electronic equipment and target detection model
CN110245714A (en) Image-recognizing method, device and electronic equipment
CN109977832A (en) A kind of image processing method, device and storage medium
CN109325940A (en) Textile detecting method and device, computer equipment and computer-readable medium
CN109215022A (en) Cloth inspection method, device, terminal device, server, storage medium and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190201

RJ01 Rejection of invention patent application after publication