CN115049665A - Fire hose surface quality detection method and system based on image processing - Google Patents

Info

Publication number
CN115049665A
CN115049665A (application CN202210977754.7A)
Authority
CN
China
Prior art keywords
fire hose
index value
domain
pixel block
surface quality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210977754.7A
Other languages
Chinese (zh)
Inventor
黎阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Sentian Fire Fighting Equipment Co ltd
Original Assignee
Nantong Sentian Fire Fighting Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Sentian Fire Fighting Equipment Co ltd filed Critical Nantong Sentian Fire Fighting Equipment Co ltd
Priority to CN202210977754.7A
Publication of CN115049665A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063 Operations research, analysis or management
    • G06Q 10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q 10/06395 Quality analysis or management
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Abstract

The invention relates to the technical field of image processing, in particular to a fire hose surface quality detection method and system based on image processing, which comprises the following steps: acquiring a surface image of a fire hose to be detected, and further acquiring a fastening area and a weaving area of the surface image; determining a distribution relation index value between any two adjacent connected domains in the fastening region, and further determining a quality evaluation index value of each connected domain in the fastening region; dividing a super pixel block in a weaving area, determining a distribution relation index value between each super pixel block and a neighbor super pixel block thereof, and further determining a quality evaluation index value of each super pixel block; and finally determining the quality evaluation result of the fire hose to be detected according to each quality evaluation index value. According to the invention, the quality of the current fire hose is obtained according to the quality evaluation result of the fire hose to be detected, so that the manual detection time is greatly saved, and the working efficiency of detecting the surface quality of the fire hose is improved.

Description

Fire hose surface quality detection method and system based on image processing
Technical Field
The invention relates to the technical field of image processing, in particular to a fire hose surface quality detection method and system based on image processing.
Background
As fire safety receives ever greater attention, fire-fighting work is increasingly valued by society, and every link of fire-fighting work affects the outcome at a fire scene and the severity of the disaster. When a fire occurs, the demand for water is very large: multiple water outlets are usually needed to supply water at the same time, and the water source and large fire engines often cannot reach the vicinity of the fire source directly, so fire hoses need to be laid over long distances for fire extinguishing.
The fire hose is common fire-fighting equipment, and strict requirements are placed on its surface quality during production and processing. A fire hose uses rubber as a lining, and its outer surface is wrapped with a linen woven fabric. The national standard for fire hoses requires that the fabric layer be woven uniformly and the surface be neat, with no skipped double warps, broken double warps, skipped wefts or scratches. The surface quality of the fire hose therefore needs to be inspected during production, but the existing inspection method usually relies on manual observation based on human experience, which is time-consuming, labor-intensive and inefficient.
Disclosure of Invention
The invention aims to provide a fire hose surface quality detection method and system based on image processing, which are used for solving the problem that manually detecting the surface quality of a fire hose is inefficient.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
the invention provides a fire hose surface quality detection method based on image processing, which comprises the following steps:
acquiring a surface image of a fire hose to be detected, and further acquiring a fastening area and a weaving area of the surface image;
calculating the feature vector of each contour point in each connected domain in the fastening region, and determining the index value of the distribution relationship between any two adjacent connected domains in the fastening region according to the feature vector of each contour point in each connected domain in the fastening region and the position of each pixel point in each connected domain in the fastening region;
determining a quality evaluation index value of each connected domain in the fastening region according to a distribution relation index value between any two adjacent connected domains in the fastening region;
dividing a super pixel block of the weaving area to obtain each super pixel block of the weaving area, calculating a characteristic vector of each contour point in each super pixel block, and determining a distribution relation index value between each super pixel block and a neighbor super pixel block thereof according to the characteristic vector of each contour point in each super pixel block and the position of each pixel point in each super pixel block;
determining a quality evaluation index value of each superpixel block according to a distribution relation index value between each superpixel block and a neighbor superpixel block of each superpixel block;
and determining the quality evaluation result of the fire hose to be detected according to the quality evaluation index values of all the connected domains in the fastening region and the quality evaluation index values of all the superpixel blocks.
Further, the step of determining a distribution relation index value between any two adjacent connected domains in the fastening region includes:
according to a position transformation matrix to be determined, performing translation, scaling and rotation processing on the position of each pixel point in a first connected domain of two adjacent connected domains in the fastening region to obtain the position of each pixel point in the processed first connected domain;
calculating the feature vectors of the contour points in the processed first connected domain, and constructing an objective function and constraint conditions according to the feature vectors of the contour points in the processed first connected domain and the feature vectors of the contour points in the second connected domain of the two adjacent connected domains;
and solving the position transformation matrix corresponding to the minimum value of the objective function under the constraint conditions, and determining the distribution relation index value between the two adjacent connected domains according to the solved position transformation matrix and the minimum value of the objective function.
Further, the objective function and the constraint conditions are as follows:

The objective function is:

F = \min \sum_{i=1}^{n_1} \sum_{j=1}^{n_2} d_{ij} \, r_{ij}

The constraint conditions are:

\sum_{i=1}^{n_1} r_{ij} = 1, \quad j = 1, 2, \ldots, n_2

r_{ij} \in \{0, 1\}

wherein d_{ij} is the Euclidean distance between the feature vector u_i of the i-th contour point in the processed first connected domain and the feature vector v_j of the j-th contour point in the second connected domain, i.e. d_{ij} = \lVert u_i - v_j \rVert_2; n_1 is the number of contour points in the processed first connected domain; n_2 is the number of contour points in the second connected domain; and r_{ij} is the correlation coefficient between the i-th contour point in the processed first connected domain and the j-th contour point in the second connected domain.
Further, the distribution relation index value b_{k,k+1} between the adjacent k-th connected domain and the (k+1)-th connected domain in the fastening region is calculated from the translation matrix T in the position transformation matrix, the scaling matrix S in the position transformation matrix, the rotation matrix R in the position transformation matrix, the natural constant e, and the minimum value F_{\min} of the objective function.
Further, the step of determining the quality evaluation index value for each connected domain in the fastening region includes:
obtaining the deviation degree of the distribution relation index value between any two adjacent connected domains in the fastening region according to the distribution relation index value between any two adjacent connected domains in the fastening region;
and obtaining the quality evaluation index value of each connected domain in the fastening area according to the deviation degree of the distribution relation index value between any two adjacent connected domains in the fastening area.
Further, the quality evaluation index value Q_k of the k-th connected domain in the fastening region is calculated from the deviation degree c_{k-1,k} of the distribution relation index value between the adjacent (k-1)-th connected domain and the k-th connected domain in the fastening region and the deviation degree c_{k,k+1} of the distribution relation index value between the adjacent k-th connected domain and the (k+1)-th connected domain in the fastening region.
Further, the step of determining a quality assessment indicator value for each superpixel block comprises:
obtaining a characteristic value of each super-pixel block according to a distribution relation index value between each super-pixel block and a neighbor super-pixel block;
counting the characteristic values of all the superpixel blocks, determining the number corresponding to the same characteristic value, and obtaining the characteristic value of a normal superpixel block according to the number corresponding to the same characteristic value;
and obtaining the quality evaluation index value of each super pixel block according to the characteristic value of each super pixel block and the characteristic value of the normal super pixel block.
Further, the quality evaluation index value q_m of the m-th super pixel block is calculated from the characteristic value g_m of the m-th super pixel block and the characteristic value g_0 of a normal super pixel block.
Further, the step of determining the quality evaluation result of the fire hose to be detected comprises:
obtaining a surface quality result graph of the fastening area according to the quality evaluation index value of each connected domain in the fastening area;
obtaining a surface quality result graph of the weaving area according to the quality evaluation index value of each super pixel block;
obtaining a final surface quality result graph of the fire hose according to the surface quality result graph of the fastening area and the surface quality result graph of the weaving area;
and determining the quality evaluation result of the fire hose to be detected according to the final surface quality result graph of the fire hose.
The invention also provides a fire hose surface quality detection system based on image processing, which comprises a processor and a memory, wherein the processor is used for processing instructions stored in the memory to realize the fire hose surface quality detection method based on image processing.
The invention has the following beneficial effects:
the invention obtains the fastening area and the weaving area of the surface of the fire hose by obtaining the surface image of the fire hose to be detected and utilizing color segmentation, determines the index value of the distribution relationship between any two adjacent communication areas in the fastening area, the distribution relation index value can reflect the similarity of the connected domains and reflect the spatial position relation, determining the quality evaluation index value of each connected domain in the fastening area according to the distribution relation index value between any two adjacent connected domains in the fastening area, dividing a super pixel block in a weaving area, determining a distribution relation index value between each super pixel block and a neighbor super pixel block thereof according to the characteristic vector and the position of each contour point in each super pixel block, and further determining the quality evaluation index value of each superpixel block, and finally determining the quality evaluation result of the fire hose to be detected. According to the method, the similarity and the spatial position relation of the adjacent connected domains and the adjacent super-pixel blocks of the fastening region and the weaving region of the fire hose to be detected are analyzed, so that the surface quality detection result of the fire hose can be accurately obtained.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions and advantages of the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of the steps of the method for detecting the surface quality of a fire hose based on image processing according to the present invention;
fig. 2 is a schematic surface view of the fire hose of the present invention.
Detailed Description
To further explain the technical means adopted by the present invention to achieve the intended objects and their effects, the embodiments, structures, features and effects of the technical solutions according to the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different instances of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the fire hose surface quality detection method and system based on image processing, which is provided by the invention, with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of a fire hose surface quality detection method based on image processing according to an embodiment of the present invention is shown, where the method includes the following steps:
step 1: and acquiring a surface image of the fire hose to be detected, and further acquiring a fastening area and a weaving area of the surface image.
This embodiment divides the fire hose into two areas: a fastening area G and a weaving area H, as shown in fig. 2. The weaving area is the main area of the fire hose and occupies most of the surface of the whole hose, while the fastening area is the area, in the middle of or on both sides of the weaving area, that fastens the lining. Because the fastening area and the weaving area of the fire hose differ obviously in color, an RGB camera is arranged to photograph the surface of the produced fire hose to obtain a surface image of the fire hose. The acquired RGB surface image of the fire hose is converted into the HSV color space, and the weaving area and the fastening area on the surface of the fire hose are obtained according to a three-channel threshold set a priori. The conversion of an RGB image into the HSV color space is a known technique and is not repeated here.
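As an illustration, a minimal sketch of this HSV-based separation is given below; the threshold values are assumptions for illustration only, since the patent only states that a three-channel threshold is set a priori:

```python
import cv2

def split_hose_regions(image_bgr):
    """Split a fire-hose surface image into fastening and weaving masks.

    The HSV ranges below are placeholders; in practice they are tuned a
    priori to the actual hose colors, as the patent describes.
    """
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)

    # Hypothetical color range of the fastening strips (e.g. colored bands).
    fasten_mask = cv2.inRange(hsv, (0, 80, 80), (10, 255, 255))

    # Hypothetical color range of the pale, low-saturation linen weaving area.
    weave_mask = cv2.inRange(hsv, (0, 0, 120), (180, 60, 255))

    return fasten_mask, weave_mask
```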
Step 2: calculating the feature vector of each contour point in each connected domain in the fastening region, and determining the distribution relation index value between any two adjacent connected domains in the fastening region according to the feature vector of each contour point in each connected domain in the fastening region and the position of each pixel point in each connected domain in the fastening region.
Because the fastening area of the fire hose is distributed as discontinuous, separate connected domains, all connected domains of the fastening area are obtained, and the contour information of each connected domain of the fastening area is obtained by using a contour detection algorithm; the contour detection algorithm is a known technique and is not repeated here. The contour information of each connected domain of the fastening area consists of a series of contour points, and each contour point is encoded to obtain its feature vector. Taking a contour point p as an example, the encoding process is described in detail: the gradient direction and gradient magnitude of all contour points within a neighborhood range of the contour point p are obtained (the acquisition process belongs to the prior art and is not repeated here). The value range of the gradient direction is equally divided into 9 subintervals, arranged from small to large and recorded as the first subinterval, the second subinterval, and so on up to the ninth subinterval, each subinterval covering one ninth of the gradient direction range. For each contour point in the neighborhood range, the corresponding subinterval is determined according to its gradient direction, and its gradient magnitude is placed in that subinterval. For example, if a contour point a in the neighborhood of the contour point p has a gradient direction falling in the first subinterval, the gradient magnitude of the contour point a is placed in the first subinterval; if a contour point b in the neighborhood of the contour point p has a gradient direction falling in the sixth subinterval, the gradient magnitude of the contour point b is placed in the sixth subinterval. All contour points in the neighborhood are traversed, and the sum of the gradient magnitudes in each subinterval is obtained to form a 1-by-9 feature vector; this vector is normalized to obtain the final feature vector of the contour point p. The feature vector of every contour point is obtained in the same way, so that the feature vector of each contour point in each connected domain of the fastening region is obtained.
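A minimal sketch of this HOG-like 9-bin descriptor is shown below; the 5x5 neighborhood size and the use of full 0 to 2*pi gradient directions are assumptions of the sketch, not values stated in the text:

```python
import numpy as np

def contour_point_descriptor(grad_dir, grad_mag, contour_pts, center, half=2, bins=9):
    """9-bin gradient-orientation histogram around one contour point.

    grad_dir, grad_mag : per-pixel gradient direction (radians in [0, 2*pi)) and magnitude.
    contour_pts        : set of (row, col) contour coordinates of the connected domain.
    center             : (row, col) of the contour point being encoded.
    half               : half-width of the neighborhood window (assumed 5x5 -> half=2).
    """
    hist = np.zeros(bins)
    r0, c0 = center
    for r in range(r0 - half, r0 + half + 1):
        for c in range(c0 - half, c0 + half + 1):
            if (r, c) in contour_pts:
                b = int(grad_dir[r, c] / (2 * np.pi) * bins) % bins
                hist[b] += grad_mag[r, c]
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist
```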
According to priori knowledge, a fastening area of the fire hose is distributed in discontinuous connected domains, the distribution among the connected domains follows a certain rule, so that quality evaluation index values of all positions of the fastening area can be obtained by analyzing the relative relation among the connected domains, and the method specifically comprises the following steps:
and (2-1) according to the position transformation matrix to be determined, carrying out translation, scaling and rotation processing on the position of each pixel point in a first communication domain of two adjacent communication domains in the fastening region to obtain the position of each pixel point in the processed first communication domain.
According to each pixel point in each connected domain of the fastening region, the position of each pixel point in each connected domain of the fastening region is obtained, after the position of each pixel point in each connected domain of the fastening region is subjected to translation, scaling and rotation operations, each pixel point in each processed connected domain can be aligned with each pixel point of the adjacent connected domain, the position transformation matrix of each connected domain after alignment is subjected to translation, scaling and rotation and the degree of association after alignment, and the distribution relation between each connected domain and the adjacent connected domain can be reflected.
A group of two adjacent connected domains in the fastening area is recorded as a first connected domain and a second connected domain. Let the translation parameters of the first connected domain along the x direction and the y direction be t_x and t_y; the translation matrix is:

T = \begin{bmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 \end{bmatrix}

Let the scaling parameters of the first connected domain along the x direction and the y direction be s_x and s_y; the scaling matrix is:

S = \begin{bmatrix} s_x & 0 & 0 \\ 0 & s_y & 0 \\ 0 & 0 & 1 \end{bmatrix}

Let the clockwise rotation angle of the first connected domain be \theta; the rotation matrix is:

R = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}

The position information of the first connected domain after transformation is obtained as:

P' = T \cdot S \cdot R \cdot P

wherein P denotes the homogeneous coordinates of the pixel points in the first connected domain and P' denotes the homogeneous coordinates of the pixel points in the processed first connected domain; P and P' have the same size. The positions of the processed first connected domain and the second connected domain are made to overlap to the largest extent, and the position of each pixel point in the processed first connected domain is obtained.
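For illustration, a small sketch of building this homogeneous transformation and applying it to the pixel coordinates of a connected domain is given below; the composition order translation-scaling-rotation is an assumption, since only the family of similarity transforms matters for the alignment:

```python
import numpy as np

def transform_points(points_xy, tx, ty, sx, sy, theta):
    """Apply translation, scaling and clockwise rotation to N x 2 pixel coordinates."""
    T = np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1]], dtype=float)
    S = np.array([[sx, 0, 0], [0, sy, 0], [0, 0, 1]], dtype=float)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]], dtype=float)

    # Homogeneous coordinates of the connected-domain pixels, shape 3 x N.
    P = np.hstack([points_xy, np.ones((len(points_xy), 1))]).T
    P_new = T @ S @ R @ P
    return P_new[:2].T
```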
And (2-2) calculating the feature vectors of the contour points in the processed first connected domain, and constructing an objective function and constraint conditions according to the feature vectors of the contour points in the processed first connected domain and the feature vectors of the contour points in the second connected domain of the two adjacent connected domains.
The encoding process of step (2) is repeated for each contour point in the processed first connected domain and each contour point in the second connected domain to obtain the feature vector of each contour point in the processed first connected domain and the feature vector of each contour point in the second connected domain, from which the contour feature matrix Y1 of the processed first connected domain and the contour feature matrix Y2 of the second connected domain are obtained. An objective function is constructed so that the feature vectors of the contour points of the processed first connected domain have the maximum similarity with the feature vectors of the contour points of the second connected domain, and the position transformation matrix corresponding to the maximum similarity and the degree of association of the contour points in the two adjacent connected domains are thereby determined. Record the feature vector of the i-th contour point in the contour feature matrix Y1 of the processed first connected domain as u_i and the feature vector of the j-th contour point in the contour feature matrix Y2 of the second connected domain as v_j; the Euclidean distance between these two contour points is d_{ij} = \lVert u_i - v_j \rVert_2. It should be noted that the number of contour points in the contour feature matrix Y1 of the processed first connected domain and the number of contour points in the contour feature matrix Y2 of the second connected domain may not be equal; this embodiment assumes that the number of contour points in the first connected domain is less than or equal to the number of contour points in the second connected domain. A transfer matrix r is constructed to represent the degree of association between the contour points of the processed first connected domain Y1 and the contour points of the second connected domain Y2. The transfer matrix r has size n_1 × n_2, wherein n_1 is the number of contour points in the processed first connected domain and n_2 is the number of contour points in the second connected domain. The value in the i-th row and j-th column of the transfer matrix r is the correlation coefficient r_{ij} of the two contour points: if the i-th contour point of the processed first connected domain Y1 is aligned with the j-th contour point of the second connected domain Y2, the corresponding correlation coefficient in the transfer matrix r is set to 1.
The objective function is:

F = \min \sum_{i=1}^{n_1} \sum_{j=1}^{n_2} d_{ij} \, r_{ij}

The constraint conditions are:

\sum_{i=1}^{n_1} r_{ij} = 1, \quad j = 1, 2, \ldots, n_2

r_{ij} \in \{0, 1\}

wherein d_{ij} is the Euclidean distance between the i-th contour point in the processed first connected domain and the j-th contour point in the second connected domain, n_1 is the number of contour points in the processed first connected domain, n_2 is the number of contour points in the second connected domain, and r_{ij} is the correlation coefficient between the i-th contour point in the processed first connected domain and the j-th contour point in the second connected domain.
The first constraint requires that every contour point in the contour feature matrix Y2 of the second connected domain has an aligned contour point in the contour feature matrix Y1 of the processed first connected domain. The second constraint restricts the values taken at the corresponding positions of the transfer matrix r by the point pairs that have a degree of association, i.e. every contour point in the contour feature matrix Y2 of the second connected domain has exactly one associated contour point in the contour feature matrix Y1 of the processed first connected domain.
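For a fixed candidate position transformation, the inner matching problem described by these constraints reduces to associating every contour point of the second domain with its nearest descriptor in the processed first domain. A short sketch under that reading (function and variable names are illustrative):

```python
import numpy as np
from scipy.spatial.distance import cdist

def match_cost(feats_first, feats_second):
    """Objective value F for one candidate transformation.

    feats_first  : n1 x 9 contour descriptors of the processed first domain.
    feats_second : n2 x 9 contour descriptors of the second domain (n1 <= n2).
    Under the reconstructed constraints, every second-domain point is
    associated with exactly one first-domain point, so the minimum is reached
    by assigning each second-domain descriptor to its nearest first-domain one.
    """
    d = cdist(feats_first, feats_second)        # d[i, j] = Euclidean distance
    nearest = d.argmin(axis=0)                  # best first-domain point for each j
    F_min = d[nearest, np.arange(d.shape[1])].sum()
    return F_min, nearest
```

The outer search over the translation, scaling and rotation parameters is what the embodiment hands to a genetic or ant-colony optimizer, as described next.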
And (2-3) solving a position transformation matrix corresponding to the objective function when the minimum value is obtained according to the constraint condition, and determining a distribution relation index value between the two adjacent connected domains according to the solved position transformation matrix and the minimum value of the objective function.
The objective function is solved under the constraint conditions to obtain the minimum value of the objective function and the corresponding position transformation matrix, for which the similarity between the contour feature matrix of the position-transformed first connected domain and the contour feature matrix of the second connected domain is maximal. This embodiment converts the actual problem into a linear programming problem and obtains the optimal solution by using an optimization algorithm such as a genetic algorithm or an ant colony algorithm, thereby obtaining the position transformation matrix of the first connected domain and the minimum value F_{\min} of the objective function. F_{\min} indicates the minimum distance to the second connected domain that the first connected domain can reach after position transformation, and can reflect the distribution relation between the two adjacent connected domains. The distribution relation index value b_{k,k+1} between the adjacent k-th connected domain and the (k+1)-th connected domain in the fastening region is then calculated from the translation matrix T, the scaling matrix S and the rotation matrix R in the position transformation matrix, the natural constant e, and the minimum value F_{\min} of the objective function.
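Only the variables of the published formula are recoverable from this text; purely as an illustrative assumption, one scalar combination consistent with them would be

b_{k,k+1} = \left( \lVert T \rVert_F + \lVert S \rVert_F + \lVert R \rVert_F \right) \cdot e^{-F_{\min}}

which grows with the magnitude of the required transformation and shrinks when the contours match poorly after alignment.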
And step 3: and determining the quality evaluation index value of each connected domain in the fastening region according to the distribution relation index value between any two adjacent connected domains in the fastening region.
And (3-1) obtaining the deviation degree of the distribution relation index value between any two adjacent connected domains in the fastening region according to the distribution relation index value between any two adjacent connected domains in the fastening region.
A plurality of adjacent connected domains of the fastening area are analyzed. The center point coordinate of each connected domain in the fastening area is obtained, and the connected domains are numbered along the length direction of the fire hose according to their center point coordinates, so that connected domains with adjacent numbers have an adjacency relation. The distribution relation index value of each pair of adjacent connected domains is obtained according to step (2). A distribution relation curve of adjacent connected domains in the fastening area is then drawn, wherein the abscissa is the number pair of the adjacent connected domains, such as (1,2), (2,3), (3,4), and so on, and the ordinate is the distribution relation index value of the corresponding pair: the point for (1,2) represents the distribution relation between the first connected domain and the second connected domain, the point for (2,3) represents the distribution relation between the second connected domain and the third connected domain, the point for (3,4) represents the distribution relation between the third connected domain and the fourth connected domain, and so on. When the quality of the fastening area meets the requirement, the distribution relation curve is a smooth straight line. The line equation of this straight line is obtained by the least square method, and the deviation degree of each distribution relation index value is obtained as c_{k,k+1} = \lvert \hat{b}_{k,k+1} - b_{k,k+1} \rvert, wherein \hat{b}_{k,k+1} is the ordinate value on the least-square fitted straight line corresponding to the adjacent k-th connected domain and (k+1)-th connected domain in the fastening region, and b_{k,k+1} is the distribution relation index value between the adjacent k-th connected domain and the (k+1)-th connected domain in the fastening region. The larger the deviation degree between adjacent connected domains, the worse the surface quality between the adjacent k-th connected domain and the (k+1)-th connected domain.
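A short sketch of this step (least-squares line fit over the ordered pair indices and per-pair deviation) might look as follows; the absolute-difference form of the deviation degree is an assumption consistent with the description above:

```python
import numpy as np

def deviation_degrees(b_values):
    """Deviation degree of each adjacent-pair distribution relation index.

    b_values[k] is the distribution relation index between the (k+1)-th and
    (k+2)-th connected domains, after the domains have been numbered along
    the hose length by their centroid coordinates. A straight line is fitted
    by least squares; the deviation degree is taken here as the absolute
    difference from the fitted line (an assumption).
    """
    x = np.arange(len(b_values))
    slope, intercept = np.polyfit(x, b_values, deg=1)   # least-squares line
    fitted = slope * x + intercept
    return np.abs(np.asarray(b_values) - fitted)
```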
And (3-2) obtaining the quality evaluation index value of each connected domain in the fastening area according to the deviation degree of the distribution relation index value between any two adjacent connected domains in the fastening area.
The quality evaluation index value of each connected domain of the fastening area is calculated from these deviation degrees: the quality evaluation index value Q_k of the k-th connected domain in the fastening region is determined by the deviation degree c_{k-1,k} of the distribution relation index value between the adjacent (k-1)-th connected domain and the k-th connected domain in the fastening region and the deviation degree c_{k,k+1} of the distribution relation index value between the adjacent k-th connected domain and the (k+1)-th connected domain in the fastening region. The value range of Q_k is [0,1]; the larger the quality evaluation index value of a connected domain in the fastening region, the higher the surface quality of that connected domain. The quality evaluation index value of each connected domain in the fastening region is thus obtained.
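The published formula image is not reproduced in this text; a form consistent with the listed variables and the [0,1] range, stated purely as an illustrative assumption, is

Q_k = e^{-\left( c_{k-1,k} + c_{k,k+1} \right)}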
And 4, step 4: dividing a super pixel block in the weaving area to obtain each super pixel block of the weaving area, calculating the characteristic vector of each contour point in each super pixel block, and determining the index value of the distribution relationship between each super pixel block and the super pixel block in the neighborhood of the super pixel block according to the characteristic vector of each contour point in each super pixel block and the position of each pixel point in each super pixel block.
The purpose of this step is to analyze the weaving area of the fire hose and obtain the quality evaluation result of each part of the weaving area. It should be noted that, because the texture features of the weaving area differ from those of the fastening area, the surface quality of the weaving area and that of the fastening area need to be analyzed separately to ensure the accuracy of the detection result. The specific steps are as follows:
According to the surface image of the fire hose obtained in step (1), the pixel values of the weaving area are set to 1 and the pixel values of the other areas are set to 0, so that a mask image of the weaving area is obtained. Further, in order to prevent the fastening area from affecting the detection of defects in this area, the fastening area is filled by using an interpolation algorithm to obtain a complete surface area of the fire hose.
According to the texture characteristics of the weaving area of the fire hose, the weaving area is divided into a plurality of super pixel blocks by using a super pixel segmentation algorithm; the conventional simple linear iterative clustering (SLIC) algorithm can be adopted. In order to obtain a fine super pixel segmentation result, the average area of the connected domains of the fastening region whose quality evaluation index value is greater than 0.6 is used as the size of the initial super pixel block. Because the distance between the camera and the fire hose directly influences the texture information contained in each pixel point, using the area of the fastening-region connected domains obtained from the image itself eliminates the super pixel segmentation error that would be caused by an inaccurate, manually set initial area, and ensures the accuracy of the quality evaluation result of the weaving area.
The texture characteristics of the weaving area of the fire hose follow a certain distribution rule. Simple linear iterative clustering (SLIC) can automatically adjust the area of the initial super pixel block according to the texture characteristics of the weaving area, divide the weaving area into a plurality of super pixel blocks, and obtain the area S of each super pixel block in the weaving area; the super pixel blocks in the weaving area are uniform in shape and equal in size. The actual area of each super pixel block is obtained from the super pixel segmentation result, and the neighborhood range of a super pixel block is set to its 3 × 3 super pixel neighborhood, which contains 9 super pixel blocks in total. Because the texture distribution characteristics of the fastening area and the weaving area of the fire hose follow the same rule, each super pixel block is regarded as a connected domain, and the distribution relation index values between a super pixel block and its eight neighboring super pixel blocks are obtained according to the method in step 2, i.e. the eight distribution relation index values between the eight neighboring super pixel blocks and the central super pixel block are obtained, so that the distribution relation index value between each super pixel block and its neighboring super pixel blocks is obtained.
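A sketch of this superpixel step using scikit-image's SLIC is shown below; the conversion from an initial block area to a segment count is an assumption of the sketch:

```python
from skimage.segmentation import slic

def segment_weaving_area(image_rgb, weave_mask, init_block_area):
    """Divide the weaving area into superpixels of roughly init_block_area pixels each."""
    area_pixels = int(weave_mask.astype(bool).sum())
    n_segments = max(1, area_pixels // int(init_block_area))
    labels = slic(image_rgb, n_segments=n_segments,
                  mask=weave_mask.astype(bool), start_label=1)
    return labels   # 0 outside the mask, superpixel index elsewhere
```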
Step 5: determining the quality evaluation index value of each super pixel block according to the distribution relation index value between each super pixel block and its neighboring super pixel blocks.
According to the distribution relation index value between each super pixel block and the adjacent super pixel block, the surface quality evaluation index value of each super pixel block in the weaving area can be judged by analyzing the distribution relation between each super pixel block and the adjacent super pixel block, and the specific step of determining the quality evaluation index value of each super pixel block is as follows:
and (5-1) obtaining the characteristic value of each superpixel block according to the index value of the distribution relationship between each superpixel block and the neighbor superpixel block.
The area S of each super pixel block of the weaving area is obtained according to step (4). For each super pixel block, the mean value of all distribution relation index values between the central super pixel block and its eight neighbors within the 3 × 3 neighborhood range is taken as the characteristic value of that central super pixel block, so that the characteristic value of each super pixel block is obtained; the characteristic value reflects the distribution characteristics between a super pixel block and the super pixel blocks in its eight-neighborhood.
(5-2) counting the characteristic values of the super pixel blocks, determining the number corresponding to the same characteristic value, and obtaining the characteristic value of the normal super pixel block according to the number corresponding to the same characteristic value.
And counting the number of the same characteristic value according to the characteristic value of each superpixel block, wherein the superpixel blocks with poor surface quality are a few regions according to the priori knowledge, so that a characteristic histogram is drawn according to the number of the characteristic values of each superpixel block, and the characteristic value with the largest number in the characteristic histogram is the characteristic value corresponding to the normal superpixel block, namely the characteristic value of the normal superpixel block.
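A minimal sketch of selecting the normal characteristic value as the most frequent feature value is given below; binning the continuous characteristic values into a histogram is an assumption of this sketch:

```python
import numpy as np

def normal_feature_value(feature_values, bins=50):
    """Characteristic value of a 'normal' superpixel block, taken as the
    histogram bin with the largest count (defective blocks are assumed rare)."""
    counts, edges = np.histogram(feature_values, bins=bins)
    k = int(np.argmax(counts))
    return 0.5 * (edges[k] + edges[k + 1])   # center of the most populated bin
```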
(5-3) obtaining the quality evaluation index value of each super pixel block according to the characteristic value of each super pixel block and the characteristic value of the normal super pixel block: the quality evaluation index value q_m of the m-th super pixel block is determined by the characteristic value g_m of the m-th super pixel block and the characteristic value g_0 of the normal super pixel block. The value range of q_m is [0,1]; the larger the quality evaluation index value of a super pixel block in the weaving area, the higher the surface quality of that super pixel block. The quality evaluation index value of each super pixel block in the weaving area is thus obtained.
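As with the earlier index, only the variables are recoverable here; a form consistent with them and with the [0,1] range, stated purely as an assumption, is

q_m = e^{-\lvert g_m - g_0 \rvert}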
Step 6: determining the quality evaluation result of the fire hose to be detected according to the quality evaluation index values of all the connected domains in the fastening region and the quality evaluation index values of all the super pixel blocks.
(6-1) obtaining a surface quality result graph of the fastening region according to the quality evaluation index values of the respective connected domains in the fastening region.
And setting the gray value of each pixel point in each connected domain as a corresponding surface quality evaluation index value according to the quality evaluation index value of each connected domain in the fastening region to obtain a surface quality evaluation result graph of the fastening region, wherein the gray value reflects the surface quality detection result of each pixel point in the fastening region.
And (6-2) obtaining a surface quality result graph of the weaving area according to the quality evaluation index value of each super pixel block.
And setting the gray value of each pixel point in each super pixel block as a corresponding surface quality evaluation index value according to the quality evaluation index value of each super pixel block in the weaving region to obtain a surface quality evaluation result graph of the weaving region, wherein the gray value reflects the surface quality detection result of each pixel point in the weaving region.
And (6-3) obtaining a final surface quality result graph of the fire hose according to the surface quality result graph of the fastening area and the surface quality result graph of the weaving area.
The final surface quality result graph of the fire hose is calculated from the mask image of the weaving area of the fire hose obtained in step (4), the surface quality result graph of the fastening area obtained in step (6-1) and the surface quality result graph of the weaving area obtained in step (6-2). Denote the final surface quality result graph of the fire hose by I, the surface quality result graph of the fastening area of the fire hose by I_G, the surface quality result graph of the weaving area of the fire hose by I_H, and the mask image of the weaving area of the fire hose by M; the final result graph is obtained pixel by pixel as

I = M \cdot I_H + (1 - M) \cdot I_G

The larger the value in the final surface quality result graph of the fire hose, the better the surface quality of the fire hose.
And (6-4) determining the quality evaluation result of the fire hose to be detected according to the final surface quality result graph of the fire hose.
An empirical threshold t is set for the gray values in the final surface quality result graph of the fire hose. The areas whose gray value in the final surface quality result graph of the fire hose is less than t are surface defect areas of the fire hose, and the areas whose gray value in the final surface quality result graph of the fire hose is greater than or equal to t are normal areas of the fire hose surface. It should be noted that the empirical threshold in this embodiment may be adjusted according to production requirements; the higher the empirical threshold is set, the higher the requirement on the produced fire hose.
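A brief sketch of steps (6-3) and (6-4), under the mask-weighted combination given above and with an illustrative threshold value:

```python
import numpy as np

def final_quality_map(fasten_result, weave_result, weave_mask):
    """Merge the two per-region quality maps into one surface quality map."""
    m = weave_mask.astype(float)
    return m * weave_result + (1.0 - m) * fasten_result

def defect_mask(quality_map, threshold=0.5):
    """Pixels below the empirical threshold are flagged as surface defects."""
    return quality_map < threshold
```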
In this embodiment, the connected domain information and the super pixel block information are obtained according to the texture distribution characteristics of the fastening area and the weaving area of the fire hose, and the distribution relation between two adjacent connected domains and between adjacent super pixel blocks is obtained through an optimization algorithm; this distribution relation reflects both the similarity and the spatial position relation of the connected domains and super pixel blocks. An accurate surface quality detection result of the fire hose is obtained from the distribution relations between the connected domains of the fastening area and between the super pixel blocks of the weaving area, combined with the texture distribution rule.
The embodiment also provides a fire hose surface quality detection system based on image processing, which includes a processor and a memory, where the processor is configured to process instructions stored in the memory to implement the fire hose surface quality detection method based on image processing, and since the fire hose surface quality detection method based on image processing is described in detail above, details are not repeated here.
It should be noted that: the sequence of the above embodiments of the present invention is only for description, and does not represent the advantages or disadvantages of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A fire hose surface quality detection method based on image processing is characterized by comprising the following steps:
acquiring a surface image of a fire hose to be detected, and further acquiring a fastening area and a weaving area of the surface image;
calculating the feature vector of each contour point in each connected domain in the fastening region, and determining the index value of the distribution relationship between any two adjacent connected domains in the fastening region according to the feature vector of each contour point in each connected domain in the fastening region and the position of each pixel point in each connected domain in the fastening region;
determining a quality evaluation index value of each connected domain in the fastening region according to a distribution relation index value between any two adjacent connected domains in the fastening region;
dividing a super pixel block of the weaving area to obtain each super pixel block of the weaving area, calculating a characteristic vector of each contour point in each super pixel block, and determining a distribution relation index value between each super pixel block and a neighbor super pixel block thereof according to the characteristic vector of each contour point in each super pixel block and the position of each pixel point in each super pixel block;
determining a quality evaluation index value of each superpixel block according to a distribution relation index value between each superpixel block and a neighbor superpixel block of each superpixel block;
and determining a quality evaluation result of the fire hose to be detected according to the quality evaluation index values of all the connected domains in the fastening region and the quality evaluation index values of all the superpixel blocks.
2. The image processing-based fire hose surface quality detection method according to claim 1, wherein the step of determining a distribution relationship index value between any two adjacent connected domains in the fastening region includes:
according to a position transformation matrix to be determined, performing translation, scaling and rotation processing on the position of each pixel point in a first connected domain of two adjacent connected domains in the fastening region to obtain the position of each pixel point in the processed first connected domain;
calculating the feature vectors of the contour points in the processed first connected domain, and constructing an objective function and constraint conditions according to the feature vectors of the contour points in the processed first connected domain and the feature vectors of the contour points in the second connected domain of the two adjacent connected domains;
and solving a position transformation matrix corresponding to the minimum value obtained by the objective function according to the constraint condition, and determining a distribution relation index value between the two adjacent connected domains according to the solved position transformation matrix and the minimum value of the objective function.
3. The fire hose surface quality detection method based on image processing according to claim 2, wherein the objective function and the constraint conditions are as follows:
the objective function is:
\min \sum_{i=1}^{N_1} \sum_{j=1}^{N_2} w_{ij} \, d_{ij}
the constraint conditions are:
\sum_{j=1}^{N_2} w_{ij} = 1, \quad i = 1, 2, \dots, N_1
w_{ij} \in \{0, 1\}
wherein d_{ij} is the Euclidean distance between the feature vector of the i-th contour point in the processed first connected domain and the feature vector of the j-th contour point in the second connected domain, namely d_{ij} = \lVert F_i - F_j \rVert_2; F_i is the feature vector of the i-th contour point in the processed first connected domain; F_j is the feature vector of the j-th contour point in the second connected domain; N_1 is the number of contour points in the processed first connected domain; N_2 is the number of contour points in the second connected domain; and w_{ij} is the correlation coefficient between the i-th contour point in the processed first connected domain and the j-th contour point in the second connected domain.
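The published equations are reproduced only as images; as a hedged sketch, if the correlation coefficients w_{ij} are assumed to encode a one-to-one matching between contour points, the minimum of the weighted distance sum can be computed with the Hungarian algorithm:

```python
# Hedged sketch of the claim-3 objective under the assumption that w_ij selects
# a one-to-one matching between contour points of the two connected domains.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def matching_cost(features_a: np.ndarray, features_b: np.ndarray) -> float:
    """Minimum total feature distance over one-to-one contour-point matchings."""
    d = cdist(features_a, features_b)      # d[i, j] = ||F_i - F_j||_2
    rows, cols = linear_sum_assignment(d)  # Hungarian algorithm
    return float(d[rows, cols].sum())
```

In practice this minimisation would be nested inside a search over the transformation parameters of claim 2, for example with a general-purpose optimiser such as scipy.optimize.minimize.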
4. The fire hose surface quality detection method based on image processing according to claim 2, wherein the calculation formula corresponding to the distribution relation index value is as follows:
Q_{i,i+1} = e^{-\left( \lVert B \rVert + \lVert S - I \rVert + \lVert R - I \rVert + D_{\min} \right)}
wherein Q_{i,i+1} is the distribution relation index value between the adjacent i-th connected domain and (i+1)-th connected domain in the fastening region; B is the translation matrix in the position transformation matrix; S is the scaling matrix in the position transformation matrix; R is the rotation matrix in the position transformation matrix; I is the identity matrix; e is the natural constant; and D_{\min} is the minimum value of the objective function.
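Under the assumption, stated in the comments below, that the index decays exponentially with the deviation of the solved transform from the identity plus the residual matching cost, a possible computation is:

```python
# Illustrative distribution-relation index in the spirit of claim 4: the farther
# the solved transform is from the identity and the larger the residual matching
# cost, the smaller the index. The exponential form is an assumption, since the
# published formula is only available as an image.
import numpy as np

def distribution_relation_index(tx: float, ty: float, s: float,
                                theta: float, d_min: float) -> float:
    rotation = np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
    deviation = (np.hypot(tx, ty)                        # translation magnitude
                 + abs(s - 1.0)                          # deviation from unit scale
                 + np.linalg.norm(rotation - np.eye(2))  # deviation from no rotation
                 + d_min)                                # residual matching cost
    return float(np.exp(-deviation))
```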
5. The fire hose surface quality detection method based on image processing according to claim 1, wherein the step of determining the quality evaluation index value of each connected domain in the fastening region comprises:
obtaining the deviation degree of the distribution relation index value between any two adjacent connected domains in the fastening region according to the distribution relation index values between any two adjacent connected domains in the fastening region;
and obtaining the quality evaluation index value of each connected domain in the fastening region according to the deviation degrees of the distribution relation index values between any two adjacent connected domains in the fastening region.
6. The fire hose surface quality detection method based on image processing according to claim 5, wherein the calculation formula of the quality evaluation index value of each connected domain in the fastening region is as follows:
P_i = e^{-\left( \delta_{i,i-1} + \delta_{i,i+1} \right)}
wherein P_i is the quality evaluation index value of the i-th connected domain in the fastening region; \delta_{i,i-1} is the deviation degree of the distribution relation index value between the adjacent i-th connected domain and (i-1)-th connected domain in the fastening region; and \delta_{i,i+1} is the deviation degree of the distribution relation index value between the adjacent i-th connected domain and (i+1)-th connected domain in the fastening region.
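A hedged sketch of claims 5-6 follows: the deviation degree is taken here as the absolute difference of each pairwise index value from the mean, and each connected domain is scored from the deviations of the pairs that touch it; both choices are assumptions, since the published formula is an image.

```python
# Score each connected domain of the fastening region from the deviation degrees
# of the pairwise distribution-relation index values that involve it (assumed form).
import numpy as np

def fastening_quality_indices(pair_indices: np.ndarray) -> np.ndarray:
    """pair_indices[i] is the index value between connected domain i and i + 1."""
    deviation = np.abs(pair_indices - pair_indices.mean())   # deviation degree per pair
    scores = np.empty(len(pair_indices) + 1)
    for i in range(len(scores)):
        touching = deviation[max(i - 1, 0):i + 1]             # pairs that involve domain i
        scores[i] = np.exp(-touching.mean())                  # larger deviation -> lower score
    return scores
```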
7. The fire hose surface quality detection method based on image processing according to claim 1, wherein the step of determining the quality evaluation index value of each superpixel block comprises:
obtaining a characteristic value of each superpixel block according to the distribution relation index values between the superpixel block and its neighboring superpixel blocks;
counting the characteristic values of all the superpixel blocks to determine the number of superpixel blocks corresponding to each characteristic value, and obtaining the characteristic value of a normal superpixel block according to these numbers;
and obtaining the quality evaluation index value of each superpixel block according to the characteristic value of the superpixel block and the characteristic value of the normal superpixel block.
8. The fire hose surface quality detection method based on image processing according to claim 7, wherein the calculation formula of the quality evaluation index value of each superpixel block is as follows:
G_k = e^{-\left| c_k - c_0 \right|}
wherein G_k is the quality evaluation index value of the k-th superpixel block; c_k is the characteristic value of the k-th superpixel block; and c_0 is the characteristic value of the normal superpixel block.
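As a hedged reading of claims 7-8, the characteristic value shared by the largest number of superpixel blocks (estimated here with a histogram, an added assumption for continuous values) is treated as the normal value, and each block is scored by how far its own value departs from it:

```python
# Estimate the "normal" characteristic value as the most populated histogram bin
# and map each block's deviation from it to a quality score in (0, 1].
import numpy as np

def superpixel_quality_indices(char_values: np.ndarray, n_bins: int = 50) -> np.ndarray:
    counts, edges = np.histogram(char_values, bins=n_bins)
    peak = int(np.argmax(counts))                         # bin holding the most blocks
    normal_value = 0.5 * (edges[peak] + edges[peak + 1])  # its centre = normal value
    return np.exp(-np.abs(char_values - normal_value))    # 1 = normal, -> 0 = abnormal
```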
9. The fire hose surface quality detection method based on image processing according to claim 1, wherein the step of determining the quality evaluation result of the fire hose to be detected comprises:
obtaining a surface quality result map of the fastening region according to the quality evaluation index values of the connected domains in the fastening region;
obtaining a surface quality result map of the weaving region according to the quality evaluation index values of the superpixel blocks;
obtaining a final surface quality result map of the fire hose according to the surface quality result map of the fastening region and the surface quality result map of the weaving region;
and determining the quality evaluation result of the fire hose to be detected according to the final surface quality result map of the fire hose.
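A minimal sketch of claim 9, assuming the result maps are formed by painting each region's quality score onto its pixels and that a fixed threshold (0.5 below, purely illustrative) flags suspect areas:

```python
# Build per-region surface-quality maps, merge them, and flag low-quality pixels.
import numpy as np

def quality_map(label_image: np.ndarray, scores: np.ndarray) -> np.ndarray:
    """label_image holds labels 1..K (0 = background); scores[k-1] belongs to label k."""
    out = np.zeros(label_image.shape, dtype=float)
    for k in range(1, int(label_image.max()) + 1):
        out[label_image == k] = scores[k - 1]
    return out

def final_result(fastening_map: np.ndarray, weaving_map: np.ndarray,
                 fastening_mask: np.ndarray, threshold: float = 0.5):
    merged = np.where(fastening_mask > 0, fastening_map, weaving_map)
    return merged, bool((merged < threshold).any())       # True = possible surface defect
```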
10. A fire hose surface quality detection system based on image processing, characterized by comprising a processor and a memory, wherein the processor is configured to execute instructions stored in the memory so as to implement the fire hose surface quality detection method based on image processing according to any one of claims 1-9.
CN202210977754.7A 2022-08-16 2022-08-16 Fire hose surface quality detection method and system based on image processing Withdrawn CN115049665A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210977754.7A CN115049665A (en) 2022-08-16 2022-08-16 Fire hose surface quality detection method and system based on image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210977754.7A CN115049665A (en) 2022-08-16 2022-08-16 Fire hose surface quality detection method and system based on image processing

Publications (1)

Publication Number Publication Date
CN115049665A true CN115049665A (en) 2022-09-13

Family

ID=83167629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210977754.7A Withdrawn CN115049665A (en) 2022-08-16 2022-08-16 Fire hose surface quality detection method and system based on image processing

Country Status (1)

Country Link
CN (1) CN115049665A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115294131A (en) * 2022-10-08 2022-11-04 南通海发水处理工程有限公司 Sewage treatment quality detection method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140119604A1 (en) * 2012-10-30 2014-05-01 Canon Kabushiki Kaisha Method, apparatus and system for detecting a supporting surface region in an image
CN113496490A (en) * 2021-09-06 2021-10-12 南通弈驰新型建材科技有限公司 Wood board surface defect detection method and system based on computer vision
CN114549529A (en) * 2022-04-26 2022-05-27 武汉福旺家包装有限公司 Carton indentation quality detection method and system based on computer vision

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140119604A1 (en) * 2012-10-30 2014-05-01 Canon Kabushiki Kaisha Method, apparatus and system for detecting a supporting surface region in an image
CN113496490A (en) * 2021-09-06 2021-10-12 南通弈驰新型建材科技有限公司 Wood board surface defect detection method and system based on computer vision
CN114549529A (en) * 2022-04-26 2022-05-27 武汉福旺家包装有限公司 Carton indentation quality detection method and system based on computer vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220913