CN116596802A - Binding yarn extraction method, device, equipment and medium under complex background

Binding yarn extraction method, device, equipment and medium under complex background

Info

Publication number
CN116596802A
CN116596802A (Application CN202310677488.0A)
Authority
CN
China
Prior art keywords
image
edge
algorithm
yarn
smooth filtering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310677488.0A
Other languages
Chinese (zh)
Other versions
CN116596802B (en)
Inventor
刘琴杨
周腾
谈昆伦
刘时海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Hongfa Zongheng Advanced Material Technology Co Ltd
Original Assignee
Changzhou Hongfa Zongheng Advanced Material Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Hongfa Zongheng Advanced Material Technology Co Ltd filed Critical Changzhou Hongfa Zongheng Advanced Material Technology Co Ltd
Priority to CN202310677488.0A priority Critical patent/CN116596802B/en
Publication of CN116596802A publication Critical patent/CN116596802A/en
Application granted granted Critical
Publication of CN116596802B publication Critical patent/CN116596802B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The application relates to the technical field of composite material production, and in particular to a binding yarn extraction method, device, equipment and medium under a complex background, comprising the following steps: photographing the cloth cover and collecting an initial image; smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image; extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns; and judging whether the binding yarns are broken or doubled. In the application, the image is smoothed with an anisotropic smoothing filter algorithm and iterated several times to obtain the denoised image, so that the interference caused by the various noises in the complex background of the image is avoided when the binding yarns are extracted directly. Because the anisotropic smoothing filter preserves the edges well while removing noise, the subsequent extraction of the binding yarns is ensured.

Description

Binding yarn extraction method, device, equipment and medium under complex background
Technical Field
The application relates to the technical field of composite material production, in particular to a binding yarn extraction method, device, equipment and medium under a complex background.
Background
Glass fiber is made from glass balls or waste glass through high-temperature melting, drawing, winding, weaving and other processes. It is a very good substitute for metals, has broad application prospects in construction, shipbuilding, chemical pipelines, automobiles, aviation, wind power generation and other fields, and enjoys a huge global market whose range of applications is still expanding.
During the production of glass fiber cloth, the binding yarns woven into the cloth cover occasionally break, which seriously affects the production quality of the cloth. In the prior art, broken binding yarns are detected by a laser sensing device: after a binding yarn breaks, the broken yarn is blown up by the air flow beside it and blocks the laser, which achieves the detection. However, if a binding yarn is not broken but doubled, that is, two yarns run into one hole, the yarn does not float up, the laser sensing device cannot sense it, and laser-based detection of the binding yarns is therefore deficient.
Besides laser sensing, the binding yarns can also be extracted by a vision-based method, and yarn breakage or doubling can be judged from the actual number of binding yarns. As shown in fig. 1, however, the image background of the glass fiber cloth cover is often complex and contains a lot of noise, which makes extraction of the binding yarns difficult.
The information disclosed in this background section is only for enhancement of understanding of the general background of the application and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
The application provides a binding yarn extraction method, device, equipment and medium under a complex background, thereby effectively solving the problems described in the background art.
In order to achieve the above purpose, the technical solution adopted by the application is as follows: a binding yarn extraction method under a complex background, comprising the following steps:
photographing the cloth cover, and collecting an initial image;
smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image;
extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns;
judging whether the binding yarns are broken or doubled.
Further, the anisotropic smoothing filter algorithm includes:
calculating the edge degree of each pixel point in the image in the four-neighborhood directions;
calculating the smoothing coefficient of each pixel point in the four-neighborhood directions according to the edge degree, wherein the smoothing coefficient near an edge is smaller than the smoothing coefficient far from an edge;
and smoothing the image according to the smoothing coefficients.
Further, the calculating of the edge degree of each pixel point in the image in the four-neighborhood directions includes computing
∇_N I_{x,y} = I_{x,y-1} - I_{x,y}, ∇_S I_{x,y} = I_{x,y+1} - I_{x,y}, ∇_E I_{x,y} = I_{x+1,y} - I_{x,y}, ∇_W I_{x,y} = I_{x-1,y} - I_{x,y},
wherein I_{x,y} is the image and ∇_N I, ∇_S I, ∇_E I and ∇_W I are respectively the edge degrees in the four directions of the image.
Further, the calculating of the smoothing coefficient of each pixel point in the four-neighborhood directions according to the edge degree includes computing, for each direction, a coefficient that decreases as the corresponding edge degree increases,
where k is a coefficient: the larger its value, the stronger the smoothing.
Further, the smoothing filtering of the image according to the smoothing coefficients includes iteratively updating each pixel from its four-neighborhood edge degrees weighted by the corresponding smoothing coefficients,
wherein I_t is the image at the t-th iteration and λ is a coefficient: the larger its value, the stronger the smoothing.
Further, the edge extraction algorithm is the Canny edge extraction algorithm, and comprises the following steps:
calculating the magnitude and direction of the gradient in the denoised image;
non-maximum suppression: traversing the pixel points in the image and removing all non-edge points;
determining the edges, and determining the final edge information using a dual-threshold algorithm.
The application also comprises a binding yarn extraction device under a complex background, which comprises:
the acquisition module is used for photographing the cloth cover and acquiring an initial image;
the filtering module is used for smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image;
the extraction module is used for extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns;
and the judging module is used for judging whether the binding yarns are broken or doubled.
The application also includes a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as described above when executing the computer program.
The application also includes a storage medium having stored thereon a computer program which, when executed by a processor, implements a method as described above.
The beneficial effects of the application are as follows: the application smooths the image with an anisotropic smoothing filter algorithm and iterates several times to obtain a denoised image, thereby avoiding the interference caused by the various noises in the complex background of the image when the binding yarns are extracted directly. A general smoothing method also smooths the edges; in a complex background, excessive smoothing affects the subsequent extraction of the binding yarns, so the edges must be preserved while the noise is removed, and the smoothing coefficients cannot be the same in every direction of the image. Based on these characteristics, the smoothing is performed with an anisotropic smoothing filter algorithm, so that the edges are well preserved and the subsequent extraction of the binding yarns is ensured.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required for the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without inventive effort.
FIG. 1 is a cloth cover original image collected in the background art;
FIG. 2 is a flow chart of the method of the present application;
FIG. 3 is a smoothed noise-reduced image of FIG. 1;
FIG. 4 is a schematic view of the structure of the device of the present application;
fig. 5 is a schematic structural diagram of a computer device.
Detailed Description
The embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them.
As shown in fig. 2: the binding yarn extraction method under the complex background comprises the following steps:
photographing the cloth cover, and collecting an initial image;
smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image;
extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns;
judging whether the binding yarns are broken or doubled.
The image is smoothed with an anisotropic smoothing filter algorithm and iterated several times to obtain a denoised image, so that the interference caused by the various noises in the complex background of the image is avoided when the binding yarns are extracted directly. A general smoothing method also smooths the edges; in a complex background, excessive smoothing affects the subsequent extraction of the binding yarns, so the edges must be preserved while the noise is removed, and the smoothing coefficients cannot be the same in every direction of the image. Based on these characteristics, the smoothing is performed with an anisotropic smoothing filter algorithm, so that the edges are well preserved and the subsequent extraction of the binding yarns is ensured.
In popular terms, the image can be regarded as a thermal field and each pixel point as a heat flow. The flow of heat depends on the relation between the current pixel point and the surrounding pixel points, and the flow is blocked by edges: if a neighboring pixel is an edge pixel point, the flow smoothness coefficient towards it is small, i.e. the heat flow does not diffuse to that neighbor, or is weakened; if it is not an edge pixel, the diffusion coefficient increases in the flow direction and the region the flow passes through becomes smooth. In this way the noise regions are smoothed while the edges are preserved.
In this embodiment, the anisotropic smoothing filter algorithm includes:
calculating the edge degree of each pixel point in the image in the four-neighborhood directions;
calculating the smoothing coefficient of each pixel point in the four-neighborhood directions according to the edge degree, wherein the smoothing coefficient near an edge is smaller than the smoothing coefficient far from an edge;
and smoothing the image according to the smoothing coefficients.
Calculating the edge degree of each pixel point in the image in the four-neighborhood directions includes computing
∇_N I_{x,y} = I_{x,y-1} - I_{x,y}, ∇_S I_{x,y} = I_{x,y+1} - I_{x,y}, ∇_E I_{x,y} = I_{x+1,y} - I_{x,y}, ∇_W I_{x,y} = I_{x-1,y} - I_{x,y},
wherein I_{x,y} is the image and ∇_N I, ∇_S I, ∇_E I and ∇_W I are respectively the edge degrees in the four directions of the image. ∇ is the gradient operator and determines the edge information in the four directions of the image, i.e. the edge degree in the four directions. Because the image is a discrete function, differentiation must be replaced by differences when calculating the gradient; therefore, when determining the edge information around a pixel point, the edge information present in the image can be calculated simply by taking the difference between the gray values of the surrounding pixel points and the current pixel point.
Calculating the smoothing coefficient of each pixel point in the four-neighborhood directions according to the edge degree comprises computing, for each direction, a coefficient that decreases as the corresponding edge degree increases,
where k is a coefficient: the larger its value, the stronger the smoothing. Because the image is approximately regarded as a thermal flow field, k represents the heat-conduction coefficient of that field; the larger its value, the more easily heat is conducted, i.e. the smoother the result and the less the edges are preserved. As an embodiment, k may be 0.8.
Smoothing the image according to the smoothing coefficients comprises iteratively updating each pixel from its four-neighborhood edge degrees weighted by the corresponding smoothing coefficients,
wherein I_t is the image at the t-th iteration and λ is a coefficient: the larger its value, the stronger the smoothing.
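Putting the three steps together, the following NumPy sketch iterates the four-neighborhood smoothing. Since the update formula is given here only descriptively, the sketch assumes the standard explicit Perona-Malik scheme that matches the text (neighbor differences as edge degrees, edge-dependent coefficients, an update weighted by λ); the function name anisotropic_smooth, the exponential conduction function, λ = 0.2 and the normalization to [0, 1] are assumptions, while the default of 8 iterations follows the embodiment described next.

```python
import numpy as np

def anisotropic_smooth(img, iterations=8, k=0.8, lam=0.2):
    """Four-neighborhood anisotropic smoothing, repeated 'iterations' times.
    k and lam are the two coefficients described above (larger = smoother);
    lam <= 0.25 keeps the explicit four-neighbor update stable."""
    I = img.astype(np.float32)
    if I.max() > 1.0:
        I = I / 255.0                      # assumed normalization to [0, 1]
    for _ in range(iterations):
        P = np.pad(I, 1, mode="edge")      # replicate borders so every pixel has 4 neighbors
        dN = P[:-2, 1:-1] - I              # edge degrees in the four directions
        dS = P[2:, 1:-1] - I
        dE = P[1:-1, 2:] - I
        dW = P[1:-1, :-2] - I
        cN = np.exp(-(dN / k) ** 2)        # smoothing coefficients: small across edges
        cS = np.exp(-(dS / k) ** 2)
        cE = np.exp(-(dE / k) ** 2)
        cW = np.exp(-(dW / k) ** 2)
        I = I + lam * (cN * dN + cS * dS + cE * dE + cW * dW)   # diffusion update
    return I
```

Because the coefficients collapse towards zero wherever a large gray-value difference is found, the noisy cloth texture is flattened over the iterations while the yarn boundaries are left largely untouched.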
As an embodiment, the image in fig. 1 is smoothed, and after 8 iterations the denoised image shown in fig. 3 is obtained; the noise is removed well while the edges are retained, and the binding yarns, the white vertical lines, can be extracted clearly.
In this embodiment, the edge extraction algorithm is the Canny edge extraction algorithm, which includes the following steps:
Calculating the magnitude and direction of the gradient in the denoised image:
The direction of the gradient is always perpendicular to the direction of the edge and is usually quantized into 8 directions: horizontal (left, right), vertical (up, down) and diagonal (upper right, upper left, lower right, lower left).
Thus, when calculating the gradient we obtain both its magnitude and its angle (which represents the direction of the gradient).
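Canny does not prescribe a particular derivative operator; as a sketch, the magnitude and angle can be computed with the common 3x3 Sobel kernels (the function name and the kernel size are assumptions, not details taken from the application):

```python
import cv2
import numpy as np

def gradient_magnitude_direction(denoised):
    """Gradient magnitude and direction, with the angle folded into [0, 180) degrees."""
    gx = cv2.Sobel(denoised, cv2.CV_32F, 1, 0, ksize=3)   # horizontal derivative
    gy = cv2.Sobel(denoised, cv2.CV_32F, 0, 1, ksize=3)   # vertical derivative
    magnitude = np.hypot(gx, gy)
    angle = np.degrees(np.arctan2(gy, gx)) % 180.0
    return magnitude, angle
```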
Non-maximum suppression, traversing pixel points in an image, and removing all non-edge points:
after the magnitude and direction of the gradient are obtained, pixel points in the image are traversed, and all non-edge points are removed. In specific implementation, the pixel points are traversed one by one, whether the current pixel point is the maximum value with the same gradient direction in surrounding pixel points is judged, and whether the point is restrained or not is determined according to a judging result. As is apparent from the above description, this step is a process of edge refinement. For each pixel point:
if the point is a local maximum in the positive/negative gradient direction, the point is preserved;
if not, the point is suppressed (zeroed).
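A minimal sketch of this suppression step, assuming the gradient angle has been folded into [0, 180) degrees so that the 8 directions above collapse into four neighbor pairs (the function name is illustrative):

```python
import numpy as np

def non_max_suppression(magnitude, angle):
    """Keep a pixel only if it is the local maximum along its gradient direction."""
    h, w = magnitude.shape
    out = np.zeros_like(magnitude)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = angle[y, x]
            if a < 22.5 or a >= 157.5:        # roughly horizontal gradient
                n1, n2 = magnitude[y, x - 1], magnitude[y, x + 1]
            elif a < 67.5:                    # roughly 45 degrees
                n1, n2 = magnitude[y - 1, x + 1], magnitude[y + 1, x - 1]
            elif a < 112.5:                   # roughly vertical gradient
                n1, n2 = magnitude[y - 1, x], magnitude[y + 1, x]
            else:                             # roughly 135 degrees
                n1, n2 = magnitude[y - 1, x - 1], magnitude[y + 1, x + 1]
            if magnitude[y, x] >= n1 and magnitude[y, x] >= n2:
                out[y, x] = magnitude[y, x]   # local maximum: preserved
            # otherwise the point is suppressed (left at zero)
    return out
```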
Determining the edges, and determining the final edge information using a dual-threshold algorithm:
after the above steps are completed, the strong edges in the image are already in the currently acquired edge image. However, some virtual edges may also be within the edge image. These false edges may be generated by the real image or by noise. In the latter case, it must be rejected.
Two thresholds are set, a high threshold maxVal and a low threshold minVal. The attribute of an edge is judged from the relation between the gradient value (the gradient magnitude, the same below) of the current edge pixel and the two thresholds. The specific rules are as follows:
(1) If the gradient value of the current edge pixel is greater than or equal to maxVal, the current edge pixel is marked as a strong edge.
(2) If the gradient value of the current edge pixel is between maxVal and minVal, the current edge pixel is marked as a virtual edge (which needs to be preserved).
(3) If the gradient value of the current edge pixel is less than or equal to minVal, the current edge pixel is suppressed.
The virtual edges obtained above need further processing. Which case a virtual edge belongs to is generally determined by judging whether it is connected to a strong edge. Typically, if a virtual edge:
is connected to a strong edge, it is treated as an edge;
is not connected to a strong edge, it is a weak edge and is suppressed.
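A compact sketch of this dual-threshold decision keeps the strong edges and only those virtual edges whose connected component touches a strong edge (8-connectivity); the function name and the use of scipy.ndimage are choices made here for illustration. In practice, OpenCV's cv2.Canny(image_8u, minVal, maxVal) packages the gradient computation, non-maximum suppression and this hysteresis step in a single call on an 8-bit image.

```python
import numpy as np
from scipy import ndimage

def hysteresis_threshold(magnitude, min_val, max_val):
    """Strong edges (>= max_val) are kept; virtual edges (between the two
    thresholds) are kept only if connected to at least one strong edge pixel."""
    strong = magnitude >= max_val
    candidate = magnitude >= min_val                                  # strong + virtual edges
    labels, n = ndimage.label(candidate, structure=np.ones((3, 3)))   # 8-connected components
    keep = np.zeros(n + 1, dtype=bool)
    keep[np.unique(labels[strong])] = True                            # components containing a strong pixel
    keep[0] = False                                                   # background label stays off
    return keep[labels]
```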
As shown in fig. 4, the present embodiment further includes a binding yarn extraction device under a complex background, which includes:
the acquisition module is used for photographing the cloth cover and acquiring an initial image;
the filtering module is used for smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image;
the extraction module is used for extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns;
and the judging module is used for judging whether the binding yarns are broken or doubled.
Please refer to fig. 5, which illustrates a schematic structure of a computer device according to an embodiment of the present application. The computer device 400 provided in the embodiment of the present application includes: a processor 410 and a memory 420, the memory 420 storing a computer program executable by the processor 410, which when executed by the processor 410 performs the method as described above.
The embodiment of the present application also provides a storage medium 430, on which storage medium 430 a computer program is stored which, when executed by the processor 410, performs a method as above.
The storage medium 430 may be implemented by any type or combination of volatile or non-volatile memory devices, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
In the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. The meaning of "a plurality of" is two or more, unless specifically defined otherwise.
In the present application, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily for the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer readable medium may even be paper or other suitable medium on which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, may be implemented using any one or combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like. While embodiments of the present application have been shown and described above, it should be understood that the above embodiments are illustrative and are not to be construed as limiting the application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the application.

Claims (9)

1. A binding yarn extraction method under a complex background, characterized by comprising the following steps:
photographing the cloth cover, and collecting an initial image;
smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image;
extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns;
judging whether the binding yarns are broken or doubled.
2. The method of claim 1, wherein the anisotropic smoothing filter algorithm comprises:
calculating the edge degree of each pixel point in the image in the four-neighborhood directions;
calculating the smoothing coefficient of each pixel point in the four-neighborhood directions according to the edge degree, wherein the smoothing coefficient near an edge is smaller than the smoothing coefficient far from an edge;
and smoothing the image according to the smoothing coefficients.
3. The method for extracting binding yarns in a complex background according to claim 2, wherein the calculating of the edge degree of each pixel point in the image in the four-neighborhood directions includes computing
∇_N I_{x,y} = I_{x,y-1} - I_{x,y}, ∇_S I_{x,y} = I_{x,y+1} - I_{x,y}, ∇_E I_{x,y} = I_{x+1,y} - I_{x,y}, ∇_W I_{x,y} = I_{x-1,y} - I_{x,y},
wherein I_{x,y} is the image and ∇_N I, ∇_S I, ∇_E I and ∇_W I are respectively the edge degrees in the four directions of the image.
4. The method according to claim 3, wherein the calculating of the smoothing coefficient of each pixel point in the four-neighborhood directions based on the edge degree comprises computing, for each direction, a coefficient that decreases as the corresponding edge degree increases,
where k is a coefficient: the larger its value, the stronger the smoothing.
5. The method of claim 4, wherein the smoothing of the image according to the smoothing coefficients comprises iteratively updating each pixel from its four-neighborhood edge degrees weighted by the corresponding smoothing coefficients,
wherein I_t is the image at the t-th iteration and λ is a coefficient: the larger its value, the stronger the smoothing.
6. The method according to claim 1, wherein the edge extraction algorithm is the Canny edge extraction algorithm, comprising the following steps:
calculating the magnitude and direction of the gradient in the denoised image;
non-maximum suppression: traversing the pixel points in the image and removing all non-edge points;
determining the edges, and determining the final edge information using a dual-threshold algorithm.
7. A binding yarn extraction device under a complex background, characterized in that it uses the method according to any one of claims 1 to 6 and comprises:
the acquisition module is used for photographing the cloth cover and acquiring an initial image;
the filtering module is used for smoothing the initial image with an anisotropic smoothing filter algorithm, and repeating the smoothing process on the filtered image for a number of iterations to obtain a denoised image;
the extraction module is used for extracting the vertical contours in the denoised image with an edge extraction algorithm to obtain the binding yarns;
and the judging module is used for judging whether the binding yarns are broken or doubled.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-6 when the computer program is executed.
9. A storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of claims 1-6.
CN202310677488.0A 2023-06-08 2023-06-08 Binding yarn extraction method, device, equipment and medium under complex background Active CN116596802B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310677488.0A CN116596802B (en) 2023-06-08 2023-06-08 Binding yarn extraction method, device, equipment and medium under complex background

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310677488.0A CN116596802B (en) 2023-06-08 2023-06-08 Binding yarn extraction method, device, equipment and medium under complex background

Publications (2)

Publication Number Publication Date
CN116596802A true CN116596802A (en) 2023-08-15
CN116596802B CN116596802B (en) 2023-10-24

Family

ID=87600799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310677488.0A Active CN116596802B (en) 2023-06-08 2023-06-08 Binding yarn extraction method, device, equipment and medium under complex background

Country Status (1)

Country Link
CN (1) CN116596802B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117670843A (en) * 2023-12-07 2024-03-08 常州市宏发纵横新材料科技股份有限公司 Method, device, equipment and storage medium for detecting broken yarn of color yarn


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1512758A (en) * 2002-12-27 2004-07-14 Reducing image noise
CN103116874A (en) * 2013-02-01 2013-05-22 华中科技大学 Image decomposition method and system for maximum extraction of useful information of noise
CN104376564A (en) * 2014-11-24 2015-02-25 西安工程大学 Method for extracting rough image edge based on anisotropism Gaussian directional derivative filter
CN104766278A (en) * 2015-03-19 2015-07-08 天津大学 Anisotropism filtering method based on self-adaptive averaging factor
CN106911904A (en) * 2015-12-17 2017-06-30 通用电气公司 Image processing method, image processing system and imaging system
CN107248148A (en) * 2017-06-14 2017-10-13 上海晔芯电子科技有限公司 Image denoising method and system
CN113837204A (en) * 2021-09-28 2021-12-24 常州市宏发纵横新材料科技股份有限公司 Hole shape recognition method, computer equipment and storage medium
CN113870233A (en) * 2021-09-30 2021-12-31 常州市宏发纵横新材料科技股份有限公司 Binding yarn detection method, computer equipment and storage medium
CN114399522A (en) * 2022-01-14 2022-04-26 东南大学 High-low threshold-based Canny operator edge detection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG RUI: "Research on image analysis of vehicle-mounted detection of tunnel lining cracks", Master's Thesis Electronic Journal, pages 39-51 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117670843A (en) * 2023-12-07 2024-03-08 常州市宏发纵横新材料科技股份有限公司 Method, device, equipment and storage medium for detecting broken yarn of color yarn
CN117670843B (en) * 2023-12-07 2024-05-24 常州市宏发纵横新材料科技股份有限公司 Method, device, equipment and storage medium for detecting broken yarn of color yarn

Also Published As

Publication number Publication date
CN116596802B (en) 2023-10-24

Similar Documents

Publication Publication Date Title
CN116596802B (en) Binding yarn extraction method, device, equipment and medium under complex background
CN104217416B (en) Gray level image processing method and its device
CN111861930B (en) Image denoising method and device, electronic equipment and image super-resolution denoising method
WO2021217642A1 (en) Infrared image processing method and apparatus, and movable platform
CN105335947A (en) Image de-noising method and image de-noising apparatus
CN102968770A (en) Method and device for eliminating noise
CN115908415B (en) Edge-based defect detection method, device, equipment and storage medium
CN112669265B (en) Method for realizing surface defect detection based on Fourier transform and image gradient characteristics
CN115330770B (en) Cloth area type defect identification method
CN101499164A (en) Image interpolation reconstruction method based on single low-resolution image
CN112115287B (en) Method, system and medium based on AI quality inspection and industrial big data analysis
CN116912115A (en) Underwater image self-adaptive enhancement method, system, equipment and storage medium
CN114219740A (en) Edge perception guiding filtering method fusing superpixels and window migration
CN115424102A (en) Multi-focus image fusion method based on anisotropic guided filtering
CN102509265B (en) Digital image denoising method based on gray value difference and local energy
CN116823796A (en) Method, device, equipment and medium for detecting transverse strips of non-uniform glass fiber cloth
CN113516608B (en) Method and device for detecting defects of tire and tire detecting equipment
CN115908404B (en) Image stripe interference detection method and device, electronic equipment and medium
CN115035311A (en) Carrier roller detection method based on fusion of visible light and thermal infrared
CN118379317B (en) Glass fiber cloth edge identification method, equipment and storage medium
CN118628459A (en) Thin glass fiber cloth cover horizontal bar detection method, device and storage medium
CN117670842B (en) Cloth cover horizontal bar detection method, device, equipment and storage medium
Udomsiri et al. Design of FIR filter for water level detection
CN116485789B (en) Method, equipment and storage medium for detecting carbon fiber splitting defect
CN116523915B (en) Method, equipment and storage medium for detecting defects of carbon fiber joints

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant