CN113705702A - Embedded part detection method and system - Google Patents

Embedded part detection method and system

Info

Publication number
CN113705702A
Authority
CN
China
Prior art keywords
image
embedded part
position coordinates
type
embedment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111020703.7A
Other languages
Chinese (zh)
Inventor
邓凯文
彭英辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sany Construction Robot Xian Research Institute Co Ltd
Original Assignee
Sany Construction Robot Xian Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sany Construction Robot Xian Research Institute Co Ltd
Priority to CN202111020703.7A
Publication of CN113705702A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24137Distances to cluster centroïds
    • G06F18/2414Smoothing the distance, e.g. radial basis function networks [RBFN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method and a system for detecting an embedded part, belonging to the technical field of prefabricated buildings. The method comprises the following steps: step S0: obtaining the standard position coordinates and type of the embedded part; step S1: photographing the prefabricated component and the embedded part to obtain an actual assembly image; step S2: correcting distortion of the actual assembly image to obtain a corrected assembly image; step S3: inputting the corrected assembly image into a deep learning classification network and identifying the position coordinates and type of the embedded part; step S4: comparing the identified position coordinates and type of the embedded part with the known standard position coordinates and type. The embedded part detection method provided by the invention requires no manual measurement, saves time and labor, and gives accurate detection results. In addition, photographing shortens the detection time and captures the actual image of the embedded part, which facilitates accurate identification, and no standardized image template needs to be established for a new prefabricated component, which simplifies the operation process.

Description

Embedded part detection method and system
Technical Field
The invention relates to the technical field of prefabricated buildings, and in particular to a method and a system for detecting an embedded part.
Background
Prefabricated construction requires various prefabricated components to be produced in advance. In the automated production of these components, after the reinforcement cage has been assembled, embedded parts must be placed at designated positions on the production mold table. The type and installation position of each embedded part must strictly follow the drawings, so after the embedded parts are placed in the prefabricated component it is necessary to check whether their types are correct and whether they are installed in the right positions. In the prior art this check is generally performed by manual measurement, which is time-consuming and labor-intensive, and the detection results are inaccurate.
Disclosure of Invention
The invention therefore aims to overcome the defect that the embedded part detection method in the prior art is time-consuming and labor-intensive and yields inaccurate detection results.
In order to solve the technical problem, the invention provides an embedded part detection method, which comprises the following steps:
step S0: obtaining the standard position coordinates and types of the embedded parts on the current mold table through a central control system;
step S1: photographing the prefabricated part and the embedded part arranged on the mold table to obtain an actual assembly image;
step S2: correcting the distortion of the actual assembly image through a calibration algorithm to obtain a corrected assembly image;
step S3: inputting the corrected assembly image into a deep learning classification network, and identifying the position coordinates and the type of the embedded part through the deep learning classification network,
wherein the deep learning classification network is obtained by arranging embedded parts of various types on the prefabricated component, photographing them at various angles, and inputting the resulting samples into a deep learning network for training;
step S4: comparing the position coordinates and the type of the embedded part obtained in step S3 with the standard position coordinates and the type of the embedded part known from step S0.
In step S2, the calibration algorithm refers to calibration parameters obtained by photographing the mold table to obtain a mold table image and correcting the mold table image by a nine-point calibration method and a calibration plate calibration method; the calibration parameters are then applied to correct the actual assembly image.
For the same set of prefabricated component and embedded parts, steps S3 to S4 are repeated a plurality of times for comparison.
The method further comprises step S5: displaying and/or alarming the comparison result of step S4.
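For orientation only, the sketch below shows how steps S0 to S5 could be chained together in software. Every function name and the dictionary-based part format are assumptions made for illustration; they are not part of the disclosure.

```python
# Minimal sketch of steps S0-S5; the callables and data format are assumed, not disclosed.
from typing import Any, Callable, Dict, List

Part = Dict[str, Any]  # e.g. {"type": "lifting_socket", "x_mm": 812.0, "y_mm": 440.5}

def run_detection(get_standard_layout: Callable[[], List[Part]],   # step S0
                  capture_image: Callable[[], Any],                # step S1
                  correct_image: Callable[[Any], Any],             # step S2
                  classify_parts: Callable[[Any], List[Part]],     # step S3
                  compare_layout: Callable[[List[Part], List[Part]], List[str]],  # step S4
                  report: Callable[[List[str]], None]) -> List[str]:              # step S5
    standard = get_standard_layout()
    corrected = correct_image(capture_image())
    detected = classify_parts(corrected)   # position coordinates and type per embedded part
    errors = compare_layout(detected, standard)
    report(errors)                         # display and/or alarm when errors are present
    return errors
```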
The invention also provides an embedded part detection system, which comprises:
an image acquisition unit for photographing the prefabricated component, the embedded part and the mold table to obtain an actual assembly image or a mold table image;
an image processing unit, which comprises image calibration correction and a deep learning classification network and obtains the actual position coordinates and the actual type of the embedded part;
and an image comparison unit for comparing the actual position coordinates and the actual type of the embedded part with the known standard position coordinates and type of the embedded part.
The system further comprises a central control unit electrically connected with the image acquisition unit; after the prefabricated component and the embedded part are arranged on the mold table, the central control unit sends a shooting signal to the image acquisition unit to control the image acquisition unit to take a photograph.
The central control unit is also electrically connected with the image comparison unit; it sends the standard position coordinates and type of the embedded part to the image comparison unit and receives the comparison result from the image comparison unit.
The central control unit automatically extracts the standard position coordinates and type of the embedded part from a database file of the prefabricated component and embedded part assembly.
The mold table is arranged on a contrast surface; the periphery of the contrast surface extends beyond the periphery of the mold table, and the contrast surface and the mold table have different colors.
A prompt unit is electrically connected with the central control unit to show the comparison result of the image comparison unit.
The technical scheme of the invention has the following advantages:
1. The embedded part detection method provided by the invention photographs the prefabricated component and the embedded part on the mold table to obtain an actual assembly image and corrects that image to eliminate the distortion introduced by photographing, so that the position coordinates of the embedded part can be obtained accurately; the type of the embedded part is then identified by a deep learning classification network, and the obtained position coordinates and type are compared with the known standard position coordinates and type to determine whether the installation type and installation position of the embedded part are correct. No manual measurement is needed, which saves time and labor, and the detection result is accurate. In addition, photographing shortens the detection time and captures the actual image of the embedded part, which facilitates accurate identification; no standardized image template needs to be established for a new prefabricated component, which simplifies the operation process.
2. According to the embedded part detection method provided by the invention, the mold table is photographed and the mold table image is corrected by a nine-point calibration method and a calibration plate calibration method to obtain calibration parameters; these parameters can serve as general calibration parameters of the photographing equipment, so any actual assembly image taken by that equipment can be corrected with them.
3. According to the embedded part detection method provided by the invention, the same group of prefabricated components and embedded parts is photographed and identified multiple times, making the detection result more accurate.
4. According to the embedded part detection method provided by the invention, the comparison result is presented through a display prompt and/or an alarm prompt, so that the detection result is clearer to the operator and misjudgment is prevented.
5. The embedded part detection system of the invention detects whether the installation type and installation position of the embedded part are correct by means of the image acquisition unit, the image processing unit and the image comparison unit; no manual measurement is needed, which saves time and labor, and the detection result is accurate. In addition, photographing captures the actual image of the embedded part, which facilitates accurate identification.
6. In the embedded part detection system of the invention, the central control unit controls the image acquisition unit, which makes operation more convenient.
7. In the embedded part detection system of the invention, the central control unit is electrically connected with the image comparison unit, so the central control unit can receive the comparison result automatically.
8. In the embedded part detection system of the invention, the central control unit automatically extracts the standard position coordinates and type of the embedded part from the database file of the prefabricated component and embedded part assembly; no manual input is needed, which is more convenient and accurate.
9. In the embedded part detection system of the invention, a contrast surface is provided, and the contrast surface and the mold table have different colors, so the mold table stands out when photographed and later correction and identification of the image are more convenient.
10. In the embedded part detection system of the invention, a prompt unit clearly and accurately informs the operator of the comparison result, preventing misjudgment.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of a method for embedded part detection according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an embedded part detection system according to an embodiment of the present invention.
Description of reference numerals:
10. an image acquisition unit; 20. a mold table; 30. a central control unit; 40. a contrast surface; 50. a bracket.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
One specific embodiment of the embedded part detection method shown in fig. 1 includes the following steps:
step S0: obtaining the standard position coordinates and types of the embedded parts on the current mold table through a central control system;
the central control system automatically extracts the standard position coordinates and the types of the embedded parts from a database file assembled by the prefabricated parts and the embedded parts, and stores the standard position coordinates and the types in the central control system without manual input, so that the embedded parts are more convenient and accurate.
Step S1: photographing the prefabricated part and the embedded part arranged on the mold table to obtain an actual assembly image;
the camera is arranged to photograph the prefabricated part and the embedded part, can be selected according to different precision requirements and sizes of the die table, and is required to be guaranteed to photograph a complete die table picture at a time.
Step S2: correcting distortion of the actual assembly image through a calibration algorithm to obtain a corrected assembly image;
the calibration algorithm is calibration parameters for photographing a mold to obtain a mold image, and correcting the mold image by a nine-point calibration method and a calibration plate calibration method, wherein the calibration parameters can be used as general calibration parameters of the camera, and actual assembly images photographed by the camera can be corrected by using the calibration parameters.
Step S3: inputting the corrected assembly image into the deep learning classification network, and identifying the position coordinates and the types of the embedded parts through the deep learning classification network;
The deep learning classification network is obtained by arranging embedded parts of various types on the prefabricated component, photographing them at various angles, and training the network on the resulting samples.
Step S4: comparing the position coordinates and the type of the embedded part obtained in step S3 with the standard position coordinates and the type of the embedded part known from step S0;
Step S5: presenting the comparison result of step S4 through a display prompt and an alarm prompt;
The comparison result is shown on the display interface; when the position coordinates or the type of an embedded part is wrong, error information is displayed on the interface and the alarm is triggered, so that the operator is informed of the comparison result clearly and accurately and misjudgment is prevented.
In this embodiment, steps S3 to S4 are repeated multiple times for the same set of prefabricated component and embedded parts, so that the detection result is more accurate.
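One way to implement the comparison of step S4 and the repeated runs described here is sketched below. The 10 mm position tolerance, the matching of each standard position to the nearest detection, and the majority vote over repeated shots are illustrative assumptions rather than values taken from the patent.

```python
# Sketch of step S4 plus repetition; tolerance and voting scheme are assumed.
import math
from collections import Counter

def compare_once(detections, standard, tol_mm=10.0):
    errors = []
    for std in standard:
        nearest = min(detections,
                      key=lambda d: math.hypot(d["x_mm"] - std["x_mm"], d["y_mm"] - std["y_mm"]),
                      default=None)
        if nearest is None:
            errors.append(("missing", std["part_id"]))
            continue
        offset = math.hypot(nearest["x_mm"] - std["x_mm"], nearest["y_mm"] - std["y_mm"])
        if nearest["type"] != std["type"]:
            errors.append(("wrong_type", std["part_id"]))
        elif offset > tol_mm:
            errors.append(("out_of_position", std["part_id"]))
    return errors

def compare_repeated(detection_runs, standard, tol_mm=10.0):
    # Repeat steps S3-S4 several times and keep only errors seen in a majority of runs.
    votes = Counter(err for run in detection_runs for err in compare_once(run, standard, tol_mm))
    return [err for err, n in votes.items() if n > len(detection_runs) // 2]
```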
An embodiment of the embedded part detection system, shown in fig. 2, is also provided, including: an image acquisition unit 10, an image processing unit and an image comparison unit. The image acquisition unit 10 is used for photographing the prefabricated component, the embedded part and the mold table 20 to obtain an actual assembly image or a mold table image; the image processing unit comprises image calibration correction and a deep learning classification network, and obtains the actual position coordinates and actual type of the embedded part; the image comparison unit is used for comparing the actual position coordinates and actual type of the embedded part with the known standard position coordinates and type of the embedded part.
By arranging the image acquisition unit 10, the image processing unit and the image comparison unit, the system can detect whether the installation type and installation position of the embedded part are correct. No manual measurement is needed, which saves time and labor, and the detection result is accurate. In addition, photographing shortens the detection time and captures the actual image of the embedded part, which facilitates accurate identification; no standardized image template needs to be established for a new prefabricated component, which simplifies the operation process.
In the present embodiment, a central control unit 30 is further provided and is electrically connected to the image acquisition unit 10 and the image comparison unit. Specifically, after the prefabricated component and the embedded part are arranged on the mold table 20, the central control unit 30 sends a shooting signal to the image acquisition unit 10 to control it to take a photograph and obtain an actual assembly image; the image acquisition unit 10 sends the actual assembly image to the image processing unit, which calibrates and corrects it to obtain a corrected assembly image and identifies the corrected assembly image through the deep learning classification network to obtain the actual position coordinates and actual type of the embedded part; the central control unit 30 sends the standard position coordinates and type of the embedded part to the image comparison unit, which compares them with the actual position coordinates and actual type of the embedded part and sends the comparison result back to the central control unit 30.
In this embodiment, the central control unit 30 automatically extracts the standard position coordinates and types of the embedded parts from the database file of the prefabricated component and embedded part assembly.
In this embodiment, a prompt unit is further provided and is electrically connected to the central control unit 30; the prompt unit includes a display interface and an alarm. The comparison result is shown on the display interface; when the position coordinates or the type of an embedded part is wrong, error information is displayed on the interface and the alarm is triggered, so that the operator is informed of the comparison result clearly and accurately and misjudgment is prevented.
In this embodiment, the mold table 20 is disposed on a contrast surface 40; the periphery of the contrast surface 40 extends beyond the periphery of the mold table 20, and the contrast surface 40 and the mold table 20 have different colors. Specifically, the contrast surface 40 may be paint applied to the floor at the predetermined position of the mold table 20. This makes the mold table 20 stand out in the photograph, so later correction and identification of the image are more convenient.
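As a purely illustrative aside, the color difference between the contrast surface 40 and the mold table 20 can be exploited to segment the table region before correction and identification. The HSV threshold values below are assumed examples, not values from the patent.

```python
# Hypothetical segmentation of the mold table against the contrast surface by color.
import cv2
import numpy as np

def locate_mold_table(image_bgr, lower_hsv=(90, 40, 40), upper_hsv=(130, 255, 255)):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return (x, y, w, h)  # pixel bounding box of the mold table region
```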
In the present embodiment, the image acquisition unit 10 is a camera supported by a bracket 50 at one side of the mold table 20. Of course, the image acquisition unit 10 may also be a line scan camera for reading 3D information of the embedded part.
In summary, the embedded part detection method provided in this embodiment photographs the prefabricated component and the embedded part on the mold table to obtain an actual assembly image and corrects that image to eliminate the distortion caused by photographing, so that the position coordinates of the embedded part can be obtained accurately; the type of the embedded part is then identified by the deep learning classification network, and the obtained position coordinates and type are compared with the known standard position coordinates and type to determine whether the installation type and installation position of the embedded part are correct. No manual measurement is needed, which saves time and labor, and the detection result is accurate; photographing shortens the detection time and captures the actual image of the embedded part, which facilitates accurate identification, and no standardized image template needs to be established for a new prefabricated component, which simplifies the operation process.
It should be understood that the above examples are given only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to list all embodiments exhaustively here, and obvious variations or modifications derived therefrom remain within the protection scope of the invention.

Claims (10)

1. An embedded part detection method is characterized by comprising the following steps:
step S0: obtaining the standard position coordinates and types of the embedded parts on the current mold table through a central control system;
step S1: photographing the prefabricated part and the embedded part arranged on the mold table to obtain an actual assembly image;
step S2: correcting the distortion of the actual assembly image through a calibration algorithm to obtain a corrected assembly image;
step S3: inputting the corrected assembly image into a deep learning classification network, and identifying the position coordinates and the type of the embedded part through the deep learning classification network,
wherein the deep learning classification network is obtained by arranging embedded parts of various types on the prefabricated component, photographing them at various angles, and inputting the resulting samples into a deep learning network for training;
step S4: comparing the position coordinates and the type of the embedded part obtained in step S3 with the standard position coordinates and the type of the embedded part known from step S0.
2. The embedded part detection method according to claim 1, wherein in step S2, the calibration algorithm is calibration parameters that are obtained by taking a picture of the mold table to obtain a mold table image, and the mold table image is corrected by a nine-point calibration method and a calibration plate calibration method, and the calibration parameters are applied to the actual assembly image for correction.
3. The embedment detection method of claim 1 or 2, wherein steps S3 to S4 are repeated a plurality of times for the same set of the prefabricated component and the embedment for comparison.
4. The embedment detection method of claim 1 or 2, further comprising step S5: and displaying and/or alarming the comparison result in the step S4.
5. An embedment detection system, comprising:
an image acquisition unit (10) for photographing the prefabricated component, the embedded part and the mold table (20) to obtain an actual assembly image or a mold table image;
an image processing unit comprising image calibration correction and a deep learning classification network, to obtain the actual position coordinates and the actual type of the embedded part;
and an image comparison unit for comparing the actual position coordinates and the actual type of the embedded part with the known standard position coordinates and type of the embedded part.
6. The embedded part detection system according to claim 5, further comprising a central control unit (30), wherein the central control unit (30) is electrically connected to the image acquisition unit (10), and after the prefabricated part and the embedded part are arranged on the mold table (20), the central control unit (30) sends a shooting signal to the image acquisition unit (10) to control the image acquisition unit (10) to shoot.
7. The embedment detection system of claim 6, wherein the central control unit (30) is electrically connected to the image comparison unit, the central control unit (30) transmitting the standard position coordinates and the type of the embedment to the image comparison unit and receiving the comparison result of the image comparison unit.
8. The embedment detection system of claim 7, wherein the central control unit (30) automatically extracts the standard position coordinates and the type of embedment from a database file of prefabricated components and embedment assemblies.
9. An embedment detection system according to any one of claims 5-8, wherein the mold table (20) is disposed on a contrast surface (40), a periphery of the contrast surface (40) extends beyond a periphery of the mold table (20), and the contrast surface (40) and the mold table (20) have different colors.
10. The embedded part detection system as claimed in claim 7, further comprising a prompt unit electrically connected to the central control unit (30) for showing the comparison result of the image comparison unit.
CN202111020703.7A 2021-09-01 2021-09-01 Embedded part detection method and system Pending CN113705702A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111020703.7A CN113705702A (en) 2021-09-01 2021-09-01 Embedded part detection method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111020703.7A CN113705702A (en) 2021-09-01 2021-09-01 Embedded part detection method and system

Publications (1)

Publication Number Publication Date
CN113705702A 2021-11-26

Family

ID=78658719

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111020703.7A Pending CN113705702A (en) 2021-09-01 2021-09-01 Embedded part detection method and system

Country Status (1)

Country Link
CN (1) CN113705702A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115126266A (en) * 2022-07-04 2022-09-30 中铁二十局集团第二工程有限公司 Construction method of embedded part
CN117784160A (en) * 2023-12-21 2024-03-29 中国核工业华兴建设有限公司 Deep learning-based embedded part position checking method and checking equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination