CN112435217A - Algorithm for recognizing rough surface of prefabricated reinforced concrete by image - Google Patents

Algorithm for recognizing rough surface of prefabricated reinforced concrete by image

Info

Publication number
CN112435217A
Authority
CN
China
Prior art keywords
concrete
image
rough surface
roughness
algorithm
Prior art date
Legal status
Granted
Application number
CN202011201327.7A
Other languages
Chinese (zh)
Other versions
CN112435217B (en)
Inventor
赵强
姚竝
Current Assignee
Youyun Information Technology Nantong Co.,Ltd.
Original Assignee
Shanghai Ucloudy Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Ucloudy Information Technology Co., Ltd.
Priority to CN202011201327.7A
Publication of CN112435217A
Application granted
Publication of CN112435217B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0004 - Industrial image inspection
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/30 - Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F 18/24147 - Distances to closest patterns, e.g. nearest neighbour classification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20081 - Training; Learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30108 - Industrial image inspection
    • G06T 2207/30132 - Masonry; Concrete

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention discloses an algorithm for recognizing the rough surface of prefabricated reinforced concrete from images. First, concrete images with known roughness grades (determined by the traditional sand piling method) are collected and processed, the gray level co-occurrence matrix statistics are stored by grade, and a training set of concrete rough surfaces is built. Second, a rough surface image of a unit area of concrete is acquired and processed to obtain its gray level co-occurrence matrix statistics. Third, the roughness grade of the concrete is classified with the k-nearest-neighbor method. Finally, the predicted roughness grade is compared with the result of the traditional sand piling method, and if it is correct the sample is added to the training set. The method acquires a rough surface image of a unit area of concrete with machine vision equipment and uses an image recognition algorithm to identify the roughness grade quickly, giving on-site construction personnel clear operating guidance. The algorithm software is integrated into a handheld device, so the roughness grade of the concrete rough surface can be obtained immediately after photographing.

Description

Algorithm for recognizing rough surface of prefabricated reinforced concrete by image
Technical Field
The invention belongs to the field of image processing, and particularly relates to an algorithm for recognizing a rough surface of prefabricated reinforced concrete through an image.
Background
The quality of the connection between a reinforced concrete prefabricated part and the cast-in-place concrete is key to the performance of the whole structure, and the roughness of the joint surface of the prefabricated part is both a critical technical point and an important evaluation index. In the various joints of precast concrete members, the roughness of the joint surface has a significant influence on the load-bearing performance of the joint. The European code treats joint surface roughness as an important parameter for calculating the shear resistance of the joint surface, and current Chinese codes likewise set out design and construction requirements for the joint surface roughness of different members. It is therefore necessary to evaluate the roughness of the bonding surface reliably and quantitatively.
The unevenness of a concrete rough surface is generally measured by the sand piling method. Its basic principle is as follows: a known volume V of fine sand is measured out, poured onto the dry rough surface to be tested, and spread evenly until its upper plane just covers the highest points of the rough surface; the area S covered by the sand is then measured, giving the concrete roughness index V/S.
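As a purely illustrative example with assumed numbers (not taken from the patent): if V = 500 cm³ of sand just covers S = 1000 cm², the roughness index is

\[ \frac{V}{S} = \frac{500\ \mathrm{cm}^3}{1000\ \mathrm{cm}^2} = 0.5\ \mathrm{cm} = 5\ \mathrm{mm}, \]

i.e. the index is the average depth of the sand layer, and hence a measure of the average depth of the surface unevenness, over the covered area.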
The traditional detection method can indeed produce an accurate value, and its practical purpose is to assign the concrete rough surface a grade that guides the operator in bonding concrete members. The sand piling method, however, is time-consuming and labor-intensive: when hundreds of precast concrete members must be inspected, the roughness grade cannot be obtained quickly and correctly. So although the sand piling method calculates the roughness accurately, its low efficiency makes it poorly suited to inspecting large amounts of concrete on construction sites.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to provide an algorithm for recognizing the rough surface of prefabricated reinforced concrete from an image, solving the prior-art problem that the sand piling method is inefficient, time-consuming and labor-intensive when the rough surface of concrete is inspected.
The invention adopts the following technical scheme for solving the technical problems:
The algorithm for recognizing the rough surface of prefabricated reinforced concrete from an image comprises the following steps:
step one, acquiring concrete images with known roughness grades, performing image recognition, storing the gray level co-occurrence matrix statistics by grade, and building a training set of concrete rough surfaces, wherein the roughness grades are obtained by the traditional sand piling method;
step two, acquiring a rough surface image of a unit area of concrete and processing the image to obtain its gray level co-occurrence matrix statistics;
step three, classifying the roughness grade of the concrete with the k-nearest-neighbor method;
step four, comparing the obtained concrete roughness grade with the result of the traditional sand piling method and, if it is correct, adding the sample to the training set.
In step two, when the concrete rough surface image is collected, a standard frame of unit area is first manufactured, and the rough surface image of the concrete is then collected along the outer edge of the standard frame.
The area of the standard frame is not more than 0.1 square meter.
The image processing in step two proceeds as follows:
step 1, image cropping: irrelevant image content is cut off, using the maximum gray difference between two adjacent columns or rows of the image as the cutting line;
step 2, constructing the gray level co-occurrence matrix: each point in the image and the point at a given offset from it form a point pair; the gray values of the point pairs are read, the number of occurrences of each gray-value combination over the whole image is counted, the counts are arranged into a square matrix, and the matrix is normalized by the total number of occurrences into occurrence probabilities; the resulting matrix is the gray level co-occurrence matrix;
step 3, calculating 6 statistics from the gray level co-occurrence matrix, namely the inverse difference moment, second moment, entropy, contrast, dissimilarity and correlation.
The classification of the concrete roughness grade by the k-nearest-neighbor method in step three proceeds as follows:
step a, calculating the distance between each point of the training set, whose category is known, and the current point;
step b, sorting the points in order of increasing distance;
step c, selecting the k points closest to the current point;
step d, counting how often each category occurs among these k points;
step e, taking the category that occurs most frequently among the k points as the predicted roughness category of the current point, where k is an integer greater than 0 and is a tunable parameter.
The k value is selected as follows:
starting with k = 1, the error rate of the classifier is estimated on a test set; the process is repeated, incrementing k by 1 each time (adding one more neighbor), and the k that produces the smallest error rate is chosen.
The distance between points is calculated using the Euclidean distance.
Compared with the prior art, the invention has the following beneficial effects:
1. The method acquires a rough surface image of a unit area of concrete with machine vision equipment and uses an image recognition algorithm to identify the roughness grade of the concrete quickly, giving on-site construction personnel clear operating guidance.
2. The only equipment required is a handheld device with photographing and computing functions; the algorithm software is integrated into the handheld device, and the roughness grade of the concrete rough surface within a standard-size field of view is obtained immediately after photographing.
Drawings
FIG. 1 is a flowchart of the algorithm for recognizing the rough surface of prefabricated reinforced concrete from an image.
FIG. 2 is an image of the standard frame of unit area on the concrete rough surface according to the present invention.
FIG. 3 is a collected image of a unit area of the concrete rough surface according to the present invention.
FIG. 4 is a schematic diagram illustrating the calculation of the gray level co-occurrence matrix according to the present invention.
FIG. 5 is a schematic diagram of the k-nearest-neighbor algorithm of the present invention.
FIG. 6 is a schematic diagram of the Euclidean distance according to the present invention.
Detailed Description
The structure and operation of the present invention will be further described with reference to the accompanying drawings.
A standard-size field of view on the surface of the precast concrete is photographed with machine vision equipment, gray level co-occurrence matrix statistics are calculated from the captured image to obtain 6 statistics, and the k-nearest-neighbor algorithm is used to classify them and obtain the roughness grade of the precast concrete.
An algorithm for recognizing a rough surface of prefabricated reinforced concrete by an image comprises the following steps:
step one, acquiring concrete images with known roughness grades, performing image recognition, storing the gray level co-occurrence matrix statistics by grade, and building a training set of concrete rough surfaces, wherein the roughness grades are obtained by the traditional sand piling method;
step two, acquiring a rough surface image of a unit area of concrete and processing the image to obtain its gray level co-occurrence matrix statistics;
step three, classifying the roughness grade of the concrete with the k-nearest-neighbor method;
step four, comparing the obtained concrete roughness grade with the result of the traditional sand piling method and, if it is correct, adding the sample to the training set.
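The four steps above can be illustrated with a minimal end-to-end sketch in Python. It is an orientation aid only, not the patent's implementation: it assumes scikit-image (0.19 or later) and scikit-learn are available, and train_images, train_grades and new_image are placeholder variables standing for the sand-piling-labelled training photographs and a newly captured unit-area photograph.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.neighbors import KNeighborsClassifier

    def glcm_features(gray_img, levels=16, distances=(1,), angles=(0.0,)):
        """Quantize an 8-bit grayscale image to `levels` gray levels and return 6 GLCM statistics."""
        q = (gray_img.astype(np.uint16) * levels // 256).astype(np.uint8)
        glcm = graycomatrix(q, distances=distances, angles=angles,
                            levels=levels, symmetric=True, normed=True)
        feats = [graycoprops(glcm, prop).mean()
                 for prop in ("homogeneity", "ASM", "contrast",
                              "dissimilarity", "correlation")]
        p = glcm.mean(axis=(2, 3))                 # average matrix over offsets
        nz = p[p > 0]
        feats.append(-np.sum(nz * np.log2(nz)))    # entropy, computed by hand
        return np.array(feats)

    # Training set: images whose grade is known from the sand piling method.
    X_train = np.array([glcm_features(img) for img in train_images])
    knn = KNeighborsClassifier(n_neighbors=5, metric="euclidean")
    knn.fit(X_train, train_grades)

    # Inference: grade a newly photographed unit-area rough surface image.
    predicted_grade = knn.predict([glcm_features(new_image)])[0]

The "ASM" property corresponds to the second moment named in the text; entropy is computed directly from the normalized matrix here rather than through graycoprops.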
In a specific embodiment, as shown in FIGS. 1 to 5:
an algorithm for recognizing a rough surface of prefabricated reinforced concrete by an image comprises the following steps:
step one, acquiring concrete images with known roughness grades, performing image recognition, storing the gray level co-occurrence matrix statistics by grade, and building a training set of concrete rough surfaces, wherein the roughness grades are obtained by the traditional sand piling method;
step two, acquiring a rough surface image of a unit area of concrete and processing the image to obtain its gray level co-occurrence matrix statistics; this step comprises the following three parts:
Firstly, image acquisition: since roughness is defined per unit area, a standard frame with an area of 0.1 square meter is manufactured (the area of the standard frame does not exceed 0.1 square meter), and the rough surface image of this 0.1 square meter unit area of concrete is collected along the outer edge of the standard frame, as shown in FIG. 2.
Secondly, image processing
1. Image cropping: the region to be identified is the area inside the standard frame, so the image is cropped and irrelevant content is cut off, using the maximum gray difference between two adjacent columns or rows of the image as the cutting line. In this embodiment the standard frame is black, so its gray value is the lowest in the grayscale image and the difference between the two adjacent rows or columns at its edge is the largest; this difference locates the cutting line. The inter-row and inter-column differences are calculated as follows:
top inter-row difference(r) = (sum of the gray values of all pixels in row r-1) - (sum of the gray values of all pixels in row r);
top cut point = the row r at which the top inter-row difference is maximal;
bottom inter-row difference(r) = (sum of the gray values of all pixels in row r) - (sum of the gray values of all pixels in row r-1);
bottom cut point = the row r at which the bottom inter-row difference is maximal;
left inter-column difference(c) = (sum of the gray values of all pixels in column c-1) - (sum of the gray values of all pixels in column c);
left cut point = the column c at which the left inter-column difference is maximal;
right inter-column difference(c) = (sum of the gray values of all pixels in column c) - (sum of the gray values of all pixels in column c-1);
right cut point = the column c at which the right inter-column difference is maximal;
where r indexes the rows and c indexes the columns of the image.
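A minimal numpy sketch of this cropping rule is given below. It is an illustration under stated assumptions, not the patent's code: the image is an 8-bit grayscale array in which the standard frame is the darkest structure, the function name crop_to_frame is invented for the example, and restricting the search for each cut to the corresponding half of the image is an added assumption.

    import numpy as np

    def crop_to_frame(gray):
        """Crop a grayscale image at the dark standard frame.

        Each cut line is placed at the pair of adjacent rows/columns whose
        gray-value sums differ the most, following the rule in the text.
        """
        h, w = gray.shape
        row_sums = gray.sum(axis=1).astype(np.int64)
        col_sums = gray.sum(axis=0).astype(np.int64)

        # diff[i] = sum(row i) - sum(row i+1): large where brightness drops into the frame
        row_diff = row_sums[:-1] - row_sums[1:]
        col_diff = col_sums[:-1] - col_sums[1:]

        top = int(np.argmax(row_diff[: h // 2])) + 1               # brightness drops: frame starts
        bottom = int(np.argmax(-row_diff[h // 2:])) + h // 2 + 1   # brightness rises: frame ends
        left = int(np.argmax(col_diff[: w // 2])) + 1
        right = int(np.argmax(-col_diff[w // 2:])) + w // 2 + 1

        return gray[top:bottom, left:right]

The returned region still includes the thin frame bars themselves; they can be trimmed by a fixed margin if required.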
Thirdly, construction of the gray level co-occurrence matrix.
Any point (x, y) in the image and the point (x + a, y + b) offset from it form a point pair (where a and b are integers defining a user-chosen offset; a and b have the same meaning in the formulas that follow). Let the gray values of the pair be (f1, f2). If the image has L gray levels, there are L × L possible combinations of f1 and f2. The number of occurrences of each (f1, f2) combination is counted over the whole image, the counts are arranged into a square matrix, and the matrix is normalized by the total number of occurrences of (f1, f2) into occurrence probabilities P(f1, f2); the resulting matrix is the gray level co-occurrence matrix.
In this embodiment, FIG. 4 is used as an example: panel (a) is the original image, with a maximum of 16 gray levels; for convenience of representation the number of gray levels is reduced to 4, turning panel (a) into panel (b), so that f1 and f2 take values in [0, 3]. Counting the occurrences of the various (f1, f2) combinations at different offsets yields the gray level co-occurrence matrices shown in panels (e) to (g).
Panel (e) shows the counts of (f1, f2) for point pairs formed by a point (x, y) in panel (b) and the point (x+1, y+0) offset from it (the highlighted part of panel (b) shows that the combination f1 = 0, f2 = 1 occurs 10 times).
Similarly, panels (f) and (g) show the (f1, f2) counts for point pairs formed by a point (x, y) in panels (c) and (d) with the points (x+1, y+1) and (x+2, y+0), respectively (the highlighted part of panel (c) marks the diagonal point pairs (0,0), with f1 = 0 and f2 = 0; the highlighted part of panel (d) shows that the horizontal point pair (0,2), with f1 = 0 and f2 = 2, occurs 9 times).
For example, for a = 1 and b = 0, the combination (0, 1) occurs 10 times in total. Comparison shows that the combinations (0, 1), (1, 2), (2, 3) and (3, 0) occur more frequently, which indicates that the image in panel (b) has a pronounced texture running from top left to bottom right.
Different offsets (a, b) give different values in the gray level co-occurrence matrix. The values of a and b are chosen according to the periodicity of the texture; for finer textures, small offsets such as (1, 0), (1, 1) or (2, 0) should be used. For a slowly varying texture, a small (a, b) gives a co-occurrence matrix with large values on the diagonal; the faster the texture changes, the smaller the values on the diagonal and the larger the values on either side of it.
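The construction just described can be written compactly; the sketch below is a straightforward, unoptimized numpy implementation, assuming an image of integer gray levels in [0, levels) and an integer offset (a, b) (the function name glcm is chosen for the example).

    import numpy as np

    def glcm(img, a, b, levels):
        """Gray level co-occurrence matrix for offset (a, b).

        Entry (f1, f2) is the probability that a pixel with gray level f1 has a
        pixel with gray level f2 at offset (a, b), normalized by the total
        number of point pairs found in the image.
        """
        h, w = img.shape
        counts = np.zeros((levels, levels), dtype=np.int64)
        for y in range(h):
            for x in range(w):
                x2, y2 = x + a, y + b
                if 0 <= x2 < w and 0 <= y2 < h:
                    counts[img[y, x], img[y2, x2]] += 1
        return counts / counts.sum()

For instance, glcm(quantized, 1, 0, levels=4) gives the matrix for horizontally adjacent pixels of a 4-level image, matching the offset used for panel (e).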
Six gray level co-occurrence matrix statistics are then calculated from the matrix, namely the inverse difference moment, second moment, entropy, contrast, dissimilarity and correlation.
1) The inverse difference moment (homogeneity) measures the uniformity of the local gray-level intensity of the image:

\[ \mathrm{IDM} = \sum_{f_1}\sum_{f_2} \frac{\delta_{\varphi,d}(f_1,f_2)}{1+(f_1-f_2)^2} \]

where \( \delta_{\varphi,d}(f_1,f_2) \) is the probability value of the gray level co-occurrence matrix for direction \( \varphi \) and distance \( d \); the same notation is used in the formulas below.

2) The second moment (ASM), also called energy, measures the uniformity of the image: the more uniform the gray distribution of the image, the larger the ASM value, and conversely the smaller the ASM:

\[ \mathrm{ASM} = \sum_{f_1}\sum_{f_2} \delta_{\varphi,d}(f_1,f_2)^2 \]

3) Entropy measures the amount of information in the target image: when the elements of the image are more dispersed the entropy is larger, and vice versa. The entropy reflects the uniformity or complexity of the texture of the target image:

\[ \mathrm{ENT} = -\sum_{f_1}\sum_{f_2} \delta_{\varphi,d}(f_1,f_2)\,\log\delta_{\varphi,d}(f_1,f_2) \]

4) Contrast reflects the local gray-level variation of the image: the larger the differences between gray values in the image, the sharper the image edges and the larger the contrast:

\[ \mathrm{CON} = \sum_{f_1}\sum_{f_2} (f_1-f_2)^2\,\delta_{\varphi,d}(f_1,f_2) \]

5) Dissimilarity is similar to contrast but is a better measure of local features; when the local contrast increases, the dissimilarity also increases:

\[ \mathrm{DIS} = \sum_{f_1}\sum_{f_2} |f_1-f_2|\,\delta_{\varphi,d}(f_1,f_2) \]

6) Correlation expresses the degree of linear dependence between the gray values of the target image and reflects the similarity between the row and column gray levels of the co-occurrence matrix:

\[ \mathrm{COR} = \sum_{f_1}\sum_{f_2} \frac{(f_1-\mu_1)(f_2-\mu_2)\,\delta_{\varphi,d}(f_1,f_2)}{\sigma_1\,\sigma_2} \]

where \( \mu_1, \mu_2 \) and \( \sigma_1, \sigma_2 \) are the means and standard deviations of the row and column marginal distributions of the co-occurrence matrix.
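These six statistics can be computed directly from a normalized co-occurrence matrix. The following is a small numpy sketch of the standard definitions above; the function name glcm_statistics is chosen for the example.

    import numpy as np

    def glcm_statistics(P):
        """Return the 6 texture statistics of a normalized GLCM P (entries sum to 1)."""
        L = P.shape[0]
        i, j = np.indices((L, L))

        homogeneity = np.sum(P / (1.0 + (i - j) ** 2))   # inverse difference moment
        asm = np.sum(P ** 2)                             # second moment (energy)
        nz = P[P > 0]
        entropy = -np.sum(nz * np.log2(nz))
        contrast = np.sum(((i - j) ** 2) * P)
        dissimilarity = np.sum(np.abs(i - j) * P)

        mu_i, mu_j = np.sum(i * P), np.sum(j * P)
        sigma_i = np.sqrt(np.sum(((i - mu_i) ** 2) * P))
        sigma_j = np.sqrt(np.sum(((j - mu_j) ** 2) * P))
        correlation = np.sum((i - mu_i) * (j - mu_j) * P) / (sigma_i * sigma_j)

        return {"inverse difference moment": homogeneity, "second moment": asm,
                "entropy": entropy, "contrast": contrast,
                "dissimilarity": dissimilarity, "correlation": correlation}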
thirdly, classifying the roughness grade of the concrete by using a K value adjacency method (KNN algorithm); the method comprises the following specific steps:
KNN algorithm flow
Step a, calculating the distance between a point in a known training set category and a current point;
step b, arranging according to the increasing order of the distances;
c, selecting front k points with the minimum distance from the current point;
d, determining the occurrence frequency of the category where the front k points are located;
and e, taking the category with the highest occurrence frequency of the first k points as the roughness grade prediction category of the current point.
The k value is selected as follows:
starting with k = 1, the error rate of the classifier is estimated on a test set; the process is repeated, incrementing k by 1 each time (adding one more neighbor), and the k that produces the smallest error rate is chosen. In this embodiment, the KNN algorithm compares the 6 gray level co-occurrence matrix statistics of the sample with those of concrete roughness images of known grade, and the sample is assigned the grade to which most of its k = 5 nearest neighbors belong. The classifier is evaluated on the test set with different values of k for the same group of inspected objects, and the value giving the lowest error rate is taken as the final k.
The distance between points is calculated using the Euclidean distance, defined as follows: the Euclidean distance d between two points P1 = (x1, y1) and P2 = (x2, y2) is

\[ d(P_1,P_2) = \sqrt{(x_2-x_1)^2 + (y_2-y_1)^2} \]

and the Euclidean distance between two n-dimensional data vectors \( X = (x_1,\dots,x_n) \) and \( Y = (y_1,\dots,y_n) \) (here, the vectors of 6 co-occurrence statistics) is

\[ d(X,Y) = \sqrt{\sum_{i=1}^{n} (x_i-y_i)^2}. \]
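A compact numpy sketch of this classification step and of the k-selection procedure is given below. The function names and the train/test arrays are illustrative placeholders, not the patent's code; the features are assumed to be the 6 statistics described above, stored as numpy arrays.

    import numpy as np

    def knn_predict(train_X, train_y, x, k):
        """Predict the grade of feature vector x by majority vote of its k nearest neighbors."""
        dists = np.sqrt(((train_X - x) ** 2).sum(axis=1))  # Euclidean distances (step a)
        nearest = np.argsort(dists)[:k]                    # steps b and c
        labels, counts = np.unique(np.asarray(train_y)[nearest], return_counts=True)
        return labels[np.argmax(counts)]                   # steps d and e

    def choose_k(train_X, train_y, test_X, test_y, k_max=15):
        """Pick the k with the lowest error rate on a held-out test set."""
        best_k, best_err = 1, 1.0
        for k in range(1, k_max + 1):
            preds = np.array([knn_predict(train_X, train_y, x, k) for x in test_X])
            err = np.mean(preds != np.asarray(test_y))
            if err < best_err:
                best_k, best_err = k, err
        return best_k

The embodiment described above arrives at k = 5 by this kind of procedure.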
and fourthly, comparing the obtained concrete roughness grade with the detection result of the traditional sand piling method, and if the obtained concrete roughness grade is correct, entering a training set.
The invention and its features, aspects and advantages will become more apparent from the foregoing detailed description of non-limiting embodiments, read in conjunction with the accompanying drawings. Like reference symbols in the various drawings indicate like elements. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
The above is a description of preferred embodiments of the invention. It should be understood that the invention is not limited to the particular embodiments described above; elements not described in detail may be implemented in any manner common in the art. Those skilled in the art may, using the methods and techniques disclosed above, make many possible variations and modifications to the disclosed embodiments, or modify them into equivalent embodiments, without departing from the spirit of the invention. Any simple modification, equivalent change or modification made to the above embodiments according to the technical essence of the present invention therefore still falls within the scope of protection of the technical solution of the present invention, provided it does not depart from the content of that technical solution.

Claims (7)

1. An algorithm for recognizing the rough surface of prefabricated reinforced concrete from an image, characterized in that it comprises the following steps:
step one, acquiring concrete images with known roughness grades, performing image recognition, storing the gray level co-occurrence matrix statistics by grade, and building a training set of concrete rough surfaces, wherein the roughness grades are obtained by the traditional sand piling method;
step two, acquiring a rough surface image of a unit area of concrete and processing the image to obtain its gray level co-occurrence matrix statistics;
step three, classifying the roughness grade of the concrete with the k-nearest-neighbor method;
step four, comparing the obtained concrete roughness grade with the result of the traditional sand piling method and, if it is correct, adding the sample to the training set.
2. The algorithm for image recognition of a precast reinforced concrete rough surface according to claim 1, wherein: in step two, when the concrete rough surface image is collected, a standard frame of unit area is first manufactured, and the rough surface image of the concrete is then collected along the outer edge of the standard frame.
3. The algorithm for image recognition of a precast reinforced concrete rough surface according to claim 2, wherein: the area of the standard frame is not more than 0.1 square meter.
4. The algorithm for image recognition of a precast reinforced concrete rough surface according to claim 1, wherein the image processing in step two proceeds as follows:
step 1, image cropping: irrelevant image content is cut off, using the maximum gray difference between two adjacent columns or rows of the image as the cutting line;
step 2, constructing the gray level co-occurrence matrix: each point in the image and the point at a given offset from it form a point pair; the gray values of the point pairs are read, the number of occurrences of each gray-value combination over the whole image is counted, the counts are arranged into a square matrix, and the matrix is normalized by the total number of occurrences into occurrence probabilities; the resulting matrix is the gray level co-occurrence matrix;
step 3, calculating 6 statistics from the gray level co-occurrence matrix, namely the inverse difference moment, second moment, entropy, contrast, dissimilarity and correlation.
5. The algorithm for image recognition of a precast reinforced concrete rough surface according to claim 4, wherein the classification of the concrete roughness grade by the k-nearest-neighbor method in step three proceeds as follows:
step a, calculating the distance between each point of the training set, whose category is known, and the current point;
step b, sorting the points in order of increasing distance;
step c, selecting the k points closest to the current point;
step d, counting how often each category occurs among these k points;
step e, taking the category that occurs most frequently among the k points as the predicted roughness category of the current point, where k is an integer greater than 0 and is a tunable parameter.
6. The algorithm for image recognition of a precast reinforced concrete rough surface according to claim 5, wherein the k value is selected as follows:
starting with k = 1, the error rate of the classifier is estimated on a test set; the process is repeated, incrementing k by 1 each time (adding one more neighbor), and the k that produces the smallest error rate is chosen.
7. The algorithm for image recognition of a precast reinforced concrete rough surface according to claim 5, wherein the distance between points is calculated using the Euclidean distance.
CN202011201327.7A 2020-11-02 2020-11-02 Method for recognizing rough surface of prefabricated reinforced concrete through image Active CN112435217B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011201327.7A CN112435217B (en) 2020-11-02 2020-11-02 Method for recognizing rough surface of prefabricated reinforced concrete through image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011201327.7A CN112435217B (en) 2020-11-02 2020-11-02 Method for recognizing rough surface of prefabricated reinforced concrete through image

Publications (2)

Publication Number Publication Date
CN112435217A true CN112435217A (en) 2021-03-02
CN112435217B CN112435217B (en) 2022-01-11

Family

ID=74695081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011201327.7A Active CN112435217B (en) 2020-11-02 2020-11-02 Method for recognizing rough surface of prefabricated reinforced concrete through image

Country Status (1)

Country Link
CN (1) CN112435217B (en)



Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101358837A (en) * 2008-09-24 2009-02-04 重庆交通大学 Method for determine surface structure depth of exposed aggregate concrete by curved surface fitting
US20130082856A1 (en) * 2010-08-26 2013-04-04 David W. Paglieroni Real-time system for imaging and object detection with a multistatic gpr array
CN108665431A (en) * 2018-05-16 2018-10-16 南京信息工程大学 Fractional order image texture Enhancement Method based on K- mean clusters
CN109584286A (en) * 2019-01-22 2019-04-05 东南大学 A kind of bituminous pavement construction depth calculation method based on generalized regression nerve networks
CN209512762U (en) * 2019-05-06 2019-10-18 河北建设集团股份有限公司 A kind of concrete rough surface instruments of inspection
CN110415167A (en) * 2019-08-02 2019-11-05 山东科技大学 A kind of rough surface crack generation method and pilot system based on Digital image technology
CN110415241A (en) * 2019-08-02 2019-11-05 同济大学 A kind of surface of concrete structure quality determining method based on computer vision
CN211306752U (en) * 2019-12-17 2020-08-21 江西省科森建筑科技有限公司 Concrete rough surface processing apparatus is used in prefabricated component production

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113034482A (en) * 2021-04-07 2021-06-25 山东大学 Surface roughness detection method based on machine vision and machine learning
CN116819050A (en) * 2023-06-26 2023-09-29 河南安祐中路桥工程有限公司 Concrete stirring uniformity detection system and detection method thereof
CN116819050B (en) * 2023-06-26 2024-04-09 河南安祐中路桥工程有限公司 Concrete stirring uniformity detection system and detection method thereof

Also Published As

Publication number Publication date
CN112435217B (en) 2022-01-11

Similar Documents

Publication Publication Date Title
CN112435217B (en) Method for recognizing rough surface of prefabricated reinforced concrete through image
Hu et al. Automatic pavement crack detection using texture and shape descriptors
CN115082467B (en) Building material welding surface defect detection method based on computer vision
CN105510195B (en) A kind of granularity particle shape online test method for stacking aggregate
CN105427298B (en) Remote sensing image registration method based on anisotropic gradient metric space
CN107248159A (en) A kind of metal works defect inspection method based on binocular vision
CN110084241B (en) Automatic ammeter reading method based on image recognition
CN109284786B (en) SAR image terrain classification method for generating countermeasure network based on distribution and structure matching
CN116664559B (en) Machine vision-based memory bank damage rapid detection method
CN101710387A (en) Intelligent method for classifying high-resolution remote sensing images
CN103578110A (en) Multi-band high-resolution remote sensing image segmentation method based on gray scale co-occurrence matrix
CN106937109B (en) The method that low cost judges resolution ratio of camera head level
CN107248172A (en) A kind of remote sensing image variation detection method based on CVA and samples selection
CN101114337A (en) Ground buildings recognition positioning method
CN105335952A (en) Matching cost calculation method and apparatus, and parallax value calculation method and equipment
Liang et al. An extraction and classification algorithm for concrete cracks based on machine vision
CN115063620B (en) Bit layering based Roots blower bearing wear detection method
CN109993202A (en) A kind of line chirotype shape similarity judgment method, electronic equipment and storage medium
CN110533713A (en) Bridge Crack width high-precision measuring method and measuring device
CN108805896B (en) Distance image segmentation method applied to urban environment
CN104143191A (en) Remote sensing image change detection method based on texton
CN116205884A (en) Concrete dam crack identification method and device
CN104616264B (en) The automatic contrast enhancement method of gene-chip Image
CN111797679A (en) Remote sensing texture information processing method and device, terminal and storage medium
Yang et al. An intelligent defect detection method of small sized ceramic tile using machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220822

Address after: Room 1303, No. 999, Guangzhou Road, Haimen Economic and Technological Development Zone, Nantong City, Jiangsu Province, 226116

Patentee after: Youyun Information Technology Nantong Co.,Ltd.

Address before: Floor 3, building 19, building 8, No. 498, GuoShouJing Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201802

Patentee before: SHANGHAI UCLOUDY INFORMATION TECHNOLOGY Co.,Ltd.