CN108226164B - Robot polishing method and system based on visual detection - Google Patents


Info

Publication number
CN108226164B
Authority
CN
China
Prior art keywords
workpiece
defects
robot
polishing
area
Prior art date
Legal status
Active
Application number
CN201711470467.2A
Other languages
Chinese (zh)
Other versions
CN108226164A (en)
Inventor
潘才锦
方思雯
陈和平
Current Assignee
Shenzhen Academy Of Robotics
Original Assignee
Shenzhen Academy Of Robotics
Priority date
Filing date
Publication date
Application filed by Shenzhen Academy Of Robotics filed Critical Shenzhen Academy Of Robotics
Priority to CN201711470467.2A priority Critical patent/CN108226164B/en
Publication of CN108226164A publication Critical patent/CN108226164A/en
Application granted granted Critical
Publication of CN108226164B publication Critical patent/CN108226164B/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24B MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B 1/00 Processes of grinding or polishing; Use of auxiliary equipment in connection with such processes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24B MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B 21/00 Machines or devices using grinding or polishing belts; Accessories therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24B MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B 21/00 Machines or devices using grinding or polishing belts; Accessories therefor
    • B24B 21/18 Accessories
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24B MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B 49/00 Measuring or gauging equipment for controlling the feed movement of the grinding tool or work; Arrangements of indicating or measuring equipment, e.g. for indicating the start of the grinding operation
    • B24B 49/12 Measuring or gauging equipment for controlling the feed movement of the grinding tool or work; Arrangements of indicating or measuring equipment, e.g. for indicating the start of the grinding operation involving optical means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24B MACHINES, DEVICES, OR PROCESSES FOR GRINDING OR POLISHING; DRESSING OR CONDITIONING OF ABRADING SURFACES; FEEDING OF GRINDING, POLISHING, OR LAPPING AGENTS
    • B24B 51/00 Arrangements for automatic control of a series of individual steps in grinding a workpiece
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8854 Grading and classifying of flaws
    • G01N 2021/8861 Determining coordinates of flaws
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8854 Grading and classifying of flaws
    • G01N 2021/8874 Taking dimensions of defect into account
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Abstract

The invention discloses a robot polishing method and system based on visual detection. The system comprises a robot, a shielding case, a polishing abrasive belt machine, a computer, a tail end clamp for clamping a workpiece, and a camera for collecting images; the camera is installed in the shielding case, the robot is connected with the tail end clamp, and the computer is connected with the robot, the camera and the polishing abrasive belt machine respectively. The computer comprises a polishing control module, a workpiece moving module, a defect detection module, a defect processing module and a secondary detection processing module. By integrating workpiece defect detection and workpiece polishing, the invention greatly improves production efficiency, and as a robot polishing method and system based on visual detection with excellent performance, it can be widely applied in the technical field of workpiece processing.

Description

Robot polishing method and system based on visual detection
Technical Field
The invention relates to the technical field of workpiece processing, in particular to a robot polishing method and system based on visual detection.
Background
In the manufacturing industry, grinding and polishing are key processes, and the quality of grinding often determines the grade of a product. Traditional polishing takes three forms: manual polishing, polishing on a special-purpose machine tool, and polishing on a numerical control machine tool. Manual polishing remains the most common, but it has many disadvantages: dust and noise expose workers to occupational diseases such as pneumoconiosis and threaten their health; efficiency is low and production cycles are long; hand work leaves many flaws; and the inspection steps are cumbersome. Special-purpose machine tools lack versatility and suit only batch production, while numerical control machine tools have high processing costs.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a robot polishing method based on visual inspection, and another object of the present invention is to provide a robot polishing system based on visual inspection.
The technical scheme adopted by the invention is as follows:
a robot polishing method based on visual inspection comprises the following steps:
s1, controlling the robot to move the workpiece to a polishing area for polishing;
s2, controlling the robot to move the workpiece to a visual detection area;
s3, controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece, processing the images, judging whether the workpiece has defects, if not, placing the workpiece in a qualified product area, otherwise, executing the step S4;
s4, acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects, and further controlling the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
and S5, acquiring images of the workpieces subjected to the grading polishing treatment, judging whether the workpieces have defects according to the acquired images, and finally controlling the robot to place the workpieces qualified in polishing into a qualified product area.
Further, the method also comprises the following steps:
and S0, establishing a corresponding relation between the tool coordinate of the robot and the coordinates of the grinding belt sander and the camera.
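The correspondence established in step S0 amounts to fixed rigid-body transforms between the robot base frame and the camera and belt-sander frames. A minimal sketch in Python, assuming a hand-eye-style calibration has already produced the camera transform (the numeric values and names below are placeholders for illustration, not from the patent):

```python
import numpy as np

# Hypothetical fixed transform established once during setup (step S0):
# T_base_cam is the pose of the camera frame in the robot base frame.
# In practice it comes from a hand-eye / tool calibration; values are placeholders.
T_base_cam = np.array([
    [1.0, 0.0, 0.0, 500.0],
    [0.0, 1.0, 0.0, 200.0],
    [0.0, 0.0, 1.0, 300.0],
    [0.0, 0.0, 0.0,   1.0],
])

def camera_to_base(p_cam):
    """Map a 3-D point expressed in the camera frame into the robot base frame."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coordinates
    return (T_base_cam @ p)[:3]

# A defect seen at (200, 0, 0) in the camera frame (the detection position used
# later in the embodiment) maps to base-frame coordinates:
print(camera_to_base([200.0, 0.0, 0.0]))  # → [700. 200. 300.]
```

A second transform of the same form would relate the base frame to the belt sander's contact point, so defect coordinates found by the camera can be turned into grinding poses.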
Further, the step of acquiring images of all to-be-detected portions of the workpiece in step S3, performing image processing on the images, and determining whether the workpiece has a defect includes:
s31, acquiring images of all parts to be detected of the workpiece;
s32, performing traversal scanning on the image of the part to be detected;
s33, carrying out binarization processing on the image, and calibrating the coordinate value of each pixel point in the image;
s34, performing edge detection and contour extraction on the image according to the coordinate value;
and S35, comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
Further, the step S4 includes the steps of:
s41, acquiring position coordinates of the defects;
s42, calculating the surface area of the defect;
s43, generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
and S44, controlling the robot to carry out grading grinding treatment on the workpiece according to the defect position coordinates and the grading treatment command.
Further, the hierarchical processing instruction corresponding to the defect in step S43 is generated by:
in response to the surface area of the defect being 0 to 1.0 mm², a primary processing instruction is generated;
in response to the surface area of the defect being 1.1 mm² to 2.0 mm², a secondary processing instruction is generated;
in response to the surface area of the defect being 2.1 mm² to 3.0 mm², a three-level processing instruction is generated;
and in response to the surface area of the defect being greater than 3 mm², a four-level processing instruction is generated.
Further, the hierarchical processing instructions are configured to:
the primary processing instruction is as follows: offsetting the central point of the robot tool by 0.5mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the secondary processing instruction is as follows: offsetting the central point of the robot tool by 1.0mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the three-stage processing instruction is as follows: offsetting the central point of the robot tool by 1.5mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the four-level processing instruction is as follows: and placing the workpiece in a defective product area.
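The four-level scheme above reduces to a small lookup from defect surface area to an instruction. A sketch in Python (the function name and dictionary layout are illustrative; the level boundaries follow the 0.1 mm² steps in the text, closed here for continuity):

```python
def grading_instruction(area_mm2):
    """Map a defect's surface area (mm^2) to a grading instruction.

    Levels 1-3 offset the robot tool centre point (TCP) by 0.5/1.0/1.5 mm
    and regrind the defect on the belt sander; level 4 routes the workpiece
    to the defective-product area, mirroring the patent's four-level scheme.
    """
    if area_mm2 <= 1.0:
        return {"level": 1, "tcp_offset_mm": 0.5, "action": "regrind"}
    elif area_mm2 <= 2.0:
        return {"level": 2, "tcp_offset_mm": 1.0, "action": "regrind"}
    elif area_mm2 <= 3.0:
        return {"level": 3, "tcp_offset_mm": 1.5, "action": "regrind"}
    else:
        return {"level": 4, "tcp_offset_mm": None, "action": "reject"}
```

For example, a 2.5 mm² scratch yields a three-level instruction with a 1.5 mm TCP offset, while anything above 3 mm² is rejected outright.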
Further, step S5 specifically includes: after the robot is controlled to move the workpiece to the visual detection area, the camera is controlled to sequentially acquire images of the parts to be detected of the workpiece, and whether the workpiece has defects is judged after the images of all the parts to be detected have been acquired and processed. If a defect exists and its surface area is greater than or equal to 0.5 mm², the workpiece fails polishing and the robot is controlled to place it in the unqualified product area; otherwise the workpiece passes polishing and the robot is controlled to place it in the qualified product area.
A robot polishing system based on visual detection comprises a robot, a shielding case, a polishing abrasive belt machine, a computer, a tail end clamp for clamping a workpiece and a camera for collecting images; the camera is installed in the shielding case, the robot is connected with the tail end clamp, and the computer is connected with the robot, the camera and the polishing abrasive belt machine respectively. The computer comprises:
the polishing control module is used for controlling the robot to move the workpiece to the polishing area for polishing;
the workpiece moving module is used for controlling the robot to move the workpiece to the visual detection area;
the defect detection module is used for controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece and processing the images, judging whether the workpiece has defects or not, if not, placing the workpiece in a qualified product area, otherwise, executing the defect processing module;
the defect processing module is used for acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects so as to control the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
and the secondary detection processing module is used for acquiring images of the workpieces subjected to the grading polishing processing, judging whether the workpieces have defects according to the acquired images, and finally controlling the robot to place the workpieces qualified for polishing in a qualified product area.
Further, the defect detection module includes:
the image acquisition unit is used for acquiring images of all parts to be detected of the workpiece;
the scanning unit is used for performing traversal scanning on the image of the part to be detected;
the calibration unit is used for carrying out binarization processing on the image and calibrating the coordinate value of each pixel point in the image;
the extracting unit is used for carrying out edge detection and contour extraction on the image according to the coordinate value;
and the judging unit is used for comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
Further, the defect processing module includes:
an acquisition unit configured to acquire position coordinates of the defect;
a calculation unit for calculating a surface area of the defect;
the generating unit is used for generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
and the processing unit is used for controlling the robot to carry out grading polishing treatment on the workpiece according to the defect position coordinates and the grading treatment instruction.
The invention has the beneficial effects that: the robot polishing method based on visual inspection integrates workpiece defect detection and workpiece polishing, which greatly improves production efficiency; by controlling the camera to sequentially acquire images of the parts to be detected of the workpiece and processing the images of all parts to be detected to judge whether the workpiece has defects, it can detect fine defects on the workpiece; and by controlling the robot to treat workpiece defects according to the received defect position coordinates and grading processing instructions, polishing precision is guaranteed.
The invention has the following beneficial effects: the robot polishing system based on visual detection comprises a robot, a shielding case, a polishing abrasive belt machine, a computer, a tail end clamp for clamping a workpiece and a camera for collecting images; the camera is installed in the shielding case, the robot is connected with the tail end clamp, and the computer is connected with the robot, the camera and the polishing abrasive belt machine respectively. The computer comprises a polishing control module, a workpiece moving module, a defect detection module, a defect processing module and a secondary detection processing module. By integrating workpiece defect detection and workpiece polishing, the system greatly improves production efficiency; the defect detection module and the defect processing module ensure polishing precision, and the secondary detection processing module further ensures the qualified rate of workpieces.
Drawings
FIG. 1 is a schematic flow chart of an embodiment of a robot polishing method based on visual inspection according to the present invention;
FIG. 2 is a flowchart illustrating a step S3 of an embodiment of a method for polishing a robot based on visual inspection according to the present invention;
FIG. 3 is a flowchart illustrating a step S4 of an exemplary embodiment of a method for robotic polishing based on visual inspection according to the present invention;
FIG. 4 is a schematic structural diagram of an embodiment of a robotic polishing system based on visual inspection according to the present invention;
FIG. 5 is a block diagram of an embodiment of a robotic polishing system based on visual inspection according to the present invention;
FIG. 6 is a schematic structural diagram of a defect detection module of an embodiment of a vision inspection based robotic polishing system of the present invention;
fig. 7 is a schematic structural diagram of a defect processing module of an embodiment of a robot polishing system based on visual inspection according to the present invention.
Detailed Description
Referring to fig. 1, a robot polishing method based on visual inspection includes the following steps:
s1, controlling the robot to move the workpiece to a polishing area for polishing;
s2, controlling the robot to move the workpiece to a visual detection area;
s3, controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece, processing the images, judging whether the workpiece has defects, if not, placing the workpiece in a qualified product area, otherwise, executing the step S4;
s4, acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects, and further controlling the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
and S5, acquiring images of the workpieces subjected to the grading polishing treatment, judging whether the workpieces have defects according to the acquired images, and finally controlling the robot to place the workpieces qualified in polishing into a qualified product area.
Further as a preferred embodiment, the method further comprises the following steps:
and S0, establishing a corresponding relation between the tool coordinate of the robot and the coordinates of the grinding belt sander and the camera.
Referring to fig. 2, as a further preferred embodiment, after acquiring images of all to-be-detected portions of the workpiece and performing image processing in step S3, the step of determining whether the workpiece has a defect specifically includes:
s31, acquiring images of all parts to be detected of the workpiece;
s32, performing traversal scanning on the image of the part to be detected;
s33, carrying out binarization processing on the image, and calibrating the coordinate value of each pixel point in the image;
s34, performing edge detection and contour extraction on the image according to the coordinate value;
and S35, comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
Referring to fig. 3, further as a preferred embodiment, the step S4 includes the steps of:
s41, acquiring position coordinates of the defects;
s42, calculating the surface area of the defect;
s43, generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
and S44, controlling the robot to carry out grading grinding treatment on the workpiece according to the defect position coordinates and the grading processing instruction.
Further preferably, the classification processing instruction corresponding to the defect in step S43 is generated by:
in response to the surface area of the defect being 0 to 1.0 mm², a primary processing instruction is generated;
in response to the surface area of the defect being 1.1 mm² to 2.0 mm², a secondary processing instruction is generated;
in response to the surface area of the defect being 2.1 mm² to 3.0 mm², a three-level processing instruction is generated;
and in response to the surface area of the defect being greater than 3 mm², a four-level processing instruction is generated.
Further as a preferred embodiment, the hierarchical processing instructions are configured to:
the primary processing instruction is as follows: offsetting the central point of the robot tool by 0.5mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the secondary processing instruction is as follows: offsetting the central point of the robot tool by 1.0mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the three-stage processing instruction is as follows: offsetting the central point of the robot tool by 1.5mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the four-level processing instruction is as follows: and placing the workpiece in a defective product area.
Further, as a preferred embodiment, step S5 is specifically: after the robot is controlled to move the workpiece to the visual detection area, the camera is controlled to sequentially acquire images of the parts to be detected of the workpiece, and whether the workpiece has defects is judged after the images of all the parts to be detected have been acquired and processed. If a defect exists and its surface area is greater than or equal to 0.5 mm², the workpiece fails polishing and the robot is controlled to place it in the unqualified product area; otherwise the workpiece passes polishing and the robot is controlled to place it in the qualified product area.
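The re-inspection decision in step S5 is a single threshold on the remaining defect areas. A minimal sketch, assuming the areas have already been measured in mm² (the function and constant names are illustrative):

```python
REQUALIFY_THRESHOLD_MM2 = 0.5  # pass/fail threshold from step S5

def reinspect(defect_areas_mm2):
    """Decide pass/fail after regrinding.

    The workpiece fails if any remaining defect has a surface area of at
    least 0.5 mm^2; otherwise it passes and goes to the qualified area.
    """
    if any(a >= REQUALIFY_THRESHOLD_MM2 for a in defect_areas_mm2):
        return "unqualified"  # robot places workpiece in the reject area
    return "qualified"        # robot places workpiece in the qualified area
```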
Referring to fig. 4 and 5, a robot system of polishing based on visual inspection, includes robot, shield cover, the abrasive band machine of polishing, computer, the terminal anchor clamps that are used for centre gripping work piece and the camera that is used for gathering the image, the camera is installed in the shield cover, the robot is connected with terminal anchor clamps, the computer is connected with robot, camera and the abrasive band machine of polishing respectively, the computer includes:
the polishing control module is used for controlling the robot to move the workpiece to the polishing area for polishing;
the workpiece moving module is used for controlling the robot to move the workpiece to the visual detection area;
the defect detection module is used for controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece and processing the images, judging whether the workpiece has defects or not, if not, placing the workpiece in a qualified product area, otherwise, executing the defect processing module;
the defect processing module is used for acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects so as to control the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
and the secondary detection processing module is used for acquiring images of the workpieces subjected to the grading polishing processing, judging whether the workpieces have defects according to the acquired images, and finally controlling the robot to place the workpieces qualified for polishing in a qualified product area.
Referring to fig. 6, further as a preferred embodiment, the defect detecting module includes:
the image acquisition unit is used for acquiring images of all parts to be detected of the workpiece;
the scanning unit is used for performing traversal scanning on the image of the part to be detected;
the calibration unit is used for carrying out binarization processing on the image and calibrating the coordinate value of each pixel point in the image;
the extracting unit is used for carrying out edge detection and contour extraction on the image according to the coordinate value;
and the judging unit is used for comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
Referring to fig. 7, further as a preferred embodiment, the defect processing module includes:
an acquisition unit configured to acquire position coordinates of the defect;
a calculation unit for calculating a surface area of the defect;
the generating unit is used for generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
and the processing unit is used for controlling the robot to carry out grading polishing treatment on the workpiece according to the defect position coordinates and the grading treatment instruction.
The present invention will be further described with reference to the following specific examples, which take the polishing of the surface of a faucet as an example.
Referring to fig. 1, a robot polishing method based on visual inspection includes the following steps:
and S0, establishing a corresponding relation between the tool coordinate of the robot and the coordinates of the grinding belt sander and the camera.
S1, controlling the robot to move the workpiece to a polishing area for polishing;
s2, controlling the robot to move the water faucet to a visual detection area, namely below the light source and in the camera visual field;
in the present embodiment, the detection position is set at coordinates (200, 0, 0) in a three-dimensional XYZ coordinate system with the camera as the origin.
S3, controlling the camera to sequentially acquire images of the to-be-detected parts of the faucet, acquiring images of all the to-be-detected parts of the faucet, processing the images, judging whether the faucet is defective, if not, executing a step S5, otherwise, executing a step S4;
in the embodiment, the faucet has 26 parts to be detected, and the control camera acquires images of the 26 parts to be detected and sequentially transmits the images to the computer.
Referring to fig. 2, the step of acquiring images of all to-be-detected parts of the faucet in step S3, performing image processing on the images, and determining whether the faucet is defective includes steps S31 to S35:
s31, acquiring images of all parts to be detected of the faucet;
s32, performing traversal scanning on the image of the part to be detected;
s33, carrying out binarization processing on the image, and calibrating the coordinate value of each pixel point in the image;
s34, performing edge detection and contour extraction on the image according to the coordinate value;
and S35, comparing the extracted contour with a preset defect contour database, judging that the water faucet has a defect if a corresponding defect is obtained, and otherwise, judging that the water faucet does not have a defect.
In this embodiment, the image is traversal-scanned using the src.rows and src.cols members to obtain the length and width of the whole image, binarization is performed on the image with the threshold function, edge detection is performed with the Canny operator, and contour extraction is performed with the findContours function.
The defect outline database records various defects and outline data associated with the defects, and the outline data of the various defects are extracted in the following way: acquiring images of various defects; traversing and scanning the image of the defect; carrying out binarization processing on the image, and calibrating the coordinate value of each pixel point in the image; performing edge detection and contour extraction on the image according to the coordinate value; and extracting the profile data of the defect. The defect contour data extraction step is the same as the aforementioned steps S31 to S34.
S4, acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects, and further controlling the robot to process the water faucet according to the position coordinates of the defects and the grading processing instructions;
specifically, referring to fig. 3, the step S4 includes steps S41 to S44:
s41, acquiring position coordinates of the defects;
s42, calculating the surface area of the defect;
in this embodiment, the surface area of the defect is calculated with the contourArea function, and after the depth of the defect is calculated using the cvPerspectiveTransform and reprojectImageTo3D functions, a corresponding grading processing instruction is generated according to the range into which the depth falls.
S43, generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
wherein, in response to the surface area of the defect being 0–1.0 mm², a primary processing instruction is generated, namely controlling the robot to offset its tool center point (TCP) by 0.5 mm, move the water faucet to the polishing abrasive belt machine, and polish the defect; in response to the surface area of the defect being 1.1 mm²–2.0 mm², a secondary processing instruction is generated, namely controlling the robot to offset the TCP by 1.0 mm, move the water faucet to the polishing abrasive belt machine, and polish the defect; in response to the surface area of the defect being 2.1 mm²–3.0 mm², a three-level processing instruction is generated, namely controlling the robot to offset the TCP by 1.5 mm, move the water faucet to the polishing abrasive belt machine, and polish the defect; and in response to the surface area of the defect being greater than 3 mm², a four-level processing instruction is generated, namely controlling the robot to place the water faucet in the defective product area.
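The area-to-grade mapping above amounts to a small lookup. The sketch below assumes areas in mm²; the `grade_defect` name and the tuple layout are illustrative, and the handling of areas falling between the stated ranges (e.g. 1.05 mm²) is an assumption, since the embodiment leaves those gaps unspecified:

```python
def grade_defect(area_mm2):
    """Map a defect surface area (mm²) to (grade, TCP offset in mm, action),
    following the four ranges of the embodiment."""
    if area_mm2 <= 1.0:
        return (1, 0.5, "polish")   # primary instruction: offset TCP 0.5 mm, polish
    if area_mm2 <= 2.0:
        return (2, 1.0, "polish")   # secondary instruction: offset TCP 1.0 mm, polish
    if area_mm2 <= 3.0:
        return (3, 1.5, "polish")   # three-level instruction: offset TCP 1.5 mm, polish
    return (4, None, "reject")      # four-level instruction: defective product area

grade, tcp_offset_mm, action = grade_defect(2.4)
```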
And S44, controlling the robot to process the water faucet according to the defect position coordinates and the grading processing instruction.
S5, controlling the robot to move the water faucet to the visual detection area, controlling the camera to sequentially acquire images of the 26 parts of the water faucet to be detected, processing the images of all parts to be detected, and judging whether the water faucet has defects; if a defect exists and its surface area is greater than or equal to 0.5 mm², the water faucet fails polishing and the robot is controlled to place it in the unqualified product area; otherwise the water faucet passes polishing and the robot is controlled to place it in the qualified product area.
Wherein, the images of all the parts to be detected of the water faucet are obtained by executing the steps S31-S35, and the water faucet is judged whether to have defects or not after the images are processed.
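The pass/fail rule of step S5 reduces to a simple predicate over the remaining defect areas. This sketch is illustrative (the `passes_reinspection` name is an assumption) and takes areas in mm²:

```python
REINSPECT_LIMIT_MM2 = 0.5  # from the embodiment: fail if any defect is >= 0.5 mm²

def passes_reinspection(defect_areas_mm2):
    """Return True if the re-polished faucet is acceptable: either no defects
    remain, or every remaining defect is below the 0.5 mm² limit."""
    return all(a < REINSPECT_LIMIT_MM2 for a in defect_areas_mm2)
```

A faucet that passes is placed in the qualified product area; otherwise it goes to the unqualified product area.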
Referring to fig. 4 and 5, the robot polishing system based on visual inspection for performing the above polishing method includes a robot 1, a shielding case 3, a polishing abrasive belt machine 4, a computer, a terminal fixture 2 for clamping a workpiece, a camera for collecting an image, and a light source for providing illumination for the camera, wherein the camera lens adopts a 16mm megapixel industrial lens, the light source adopts a white arc light source with a width of 100mm and a length of 150mm, the light source and the camera are both installed in the shielding case 3, the robot 1 is connected with the terminal fixture 2, the computer is respectively connected with the robot 1, the camera and the polishing abrasive belt machine 4, and the computer includes:
and the coordinate matching module is used for establishing a corresponding relation between the tool coordinate of the robot and the coordinates of the grinding belt sander and the camera.
The polishing control module is used for controlling the robot to move the workpiece to the polishing area for polishing;
the workpiece moving module is used for controlling the robot to move the workpiece to the visual detection area;
the defect detection module is used for controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece and processing the images, judging whether the workpiece has defects or not, if not, placing the workpiece in a qualified product area, otherwise, executing the defect processing module;
the defect processing module is used for acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects so as to control the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
and the secondary detection processing module is used for acquiring images of the workpieces subjected to the grading polishing processing, judging whether the workpieces have defects according to the acquired images, and finally controlling the robot to place the workpieces qualified for polishing in a qualified product area.
Referring to fig. 6, the defect detecting module includes:
the image acquisition unit is used for acquiring images of all parts to be detected of the workpiece;
the scanning unit is used for performing traversal scanning on the image of the part to be detected;
the calibration unit is used for carrying out binarization processing on the image and calibrating the coordinate value of each pixel point in the image;
the extracting unit is used for carrying out edge detection and contour extraction on the image according to the coordinate value;
and the judging unit is used for comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
Referring to fig. 7, the defect processing module includes:
an acquisition unit configured to acquire position coordinates of the defect;
a calculation unit for calculating a surface area of the defect;
the generating unit is used for generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
and the processing unit is used for controlling the robot to carry out grading polishing treatment on the workpiece according to the defect position coordinates and the grading treatment instruction.
Wherein, in response to the surface area of the defect being 0–1.0 mm², a primary processing instruction is generated, namely controlling the robot to offset its tool center point (TCP) by 0.5 mm, move the water faucet to the polishing abrasive belt machine, and polish the defect; in response to the surface area of the defect being 1.1 mm²–2.0 mm², a secondary processing instruction is generated, namely controlling the robot to offset the TCP by 1.0 mm, move the water faucet to the polishing abrasive belt machine, and polish the defect; in response to the surface area of the defect being 2.1 mm²–3.0 mm², a three-level processing instruction is generated, namely controlling the robot to offset the TCP by 1.5 mm, move the water faucet to the polishing abrasive belt machine, and polish the defect; and in response to the surface area of the defect being greater than 3 mm², a four-level processing instruction is generated, namely controlling the robot to place the water faucet in the defective product area.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (4)

1. A robot polishing method based on visual inspection is characterized by comprising the following steps:
s1, controlling the robot to move the workpiece to a polishing area for polishing;
s2, controlling the robot to move the workpiece to a visual detection area;
s3, controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece, processing the images, judging whether the workpiece has defects, if not, placing the workpiece in a qualified product area, otherwise, executing S4;
s4, acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects, and further controlling the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
s5, acquiring images of the workpieces subjected to the grading polishing treatment, judging whether the workpieces have defects according to the acquired images, and finally controlling the robot to place the workpieces qualified in polishing into a qualified product area;
the S4 includes the steps of:
s41, acquiring position coordinates of the defects;
s42, calculating the surface area of the defect;
s43, generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
s44, controlling the robot to polish the workpiece in a grading manner according to the defect position coordinates and the grading processing instruction;
the hierarchical processing instruction corresponding to the defect in the S43 is generated by:
in response to the surface area of the defect being 0–1.0 mm², generating a primary processing instruction;
in response to the surface area of the defect being 1.1 mm²–2.0 mm², generating a secondary processing instruction;
in response to the surface area of the defect being 2.1 mm²–3.0 mm², generating a three-level processing instruction;
in response to the surface area of the defect being greater than 3 mm², generating a four-level processing instruction;
the hierarchical processing instructions are configured to:
the primary processing instruction is as follows: offsetting the central point of the robot tool by 0.5mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the secondary processing instruction is as follows: offsetting the central point of the robot tool by 1.0mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the three-stage processing instruction is as follows: offsetting the central point of the robot tool by 1.5mm, moving the workpiece to the position of a polishing abrasive belt machine, and polishing the defects;
the four-level processing instruction is as follows: placing the workpiece in an unqualified product area;
and a step of judging whether the workpiece has defects or not after acquiring images of all parts to be detected of the workpiece and performing image processing in the step S3, specifically including:
s31, acquiring images of all parts to be detected of the workpiece;
s32, performing traversal scanning on the image of the part to be detected;
s33, carrying out binarization processing on the image, and calibrating the coordinate value of each pixel point in the image;
s34, performing edge detection and contour extraction on the image according to the coordinate value;
and S35, comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
2. The robot grinding method based on the visual inspection as claimed in claim 1, further comprising the steps of:
and S0, establishing a corresponding relation between the tool coordinate of the robot and the coordinates of the grinding belt sander and the camera.
3. The robot grinding method based on visual inspection as claimed in claim 1, wherein the S5 specifically comprises: after controlling the robot to move the workpiece to the visual detection area, controlling the camera to sequentially acquire images of the parts of the workpiece to be detected, processing the images of all parts to be detected, and judging whether the workpiece has defects; if a defect exists and its surface area is greater than or equal to 0.5 mm², the workpiece fails polishing and the robot is controlled to place it in the unqualified product area; otherwise the workpiece passes polishing and the robot is controlled to place it in the qualified product area.
4. A vision inspection based robot grinding system for implementing the vision inspection based robot grinding method according to any one of claims 1 to 3, comprising a robot, a shield case, a grinding abrasive belt machine, a computer, an end clamp for clamping a workpiece and a camera for acquiring images, the camera being installed in the shield case, the robot being connected with the end clamp, the computer being connected with the robot, the camera and the grinding abrasive belt machine, respectively, the computer comprising:
the polishing control module is used for controlling the robot to move the workpiece to the polishing area for polishing;
the workpiece moving module is used for controlling the robot to move the workpiece to the visual detection area;
the defect detection module is used for controlling the camera to sequentially acquire images of the parts to be detected of the workpiece, acquiring images of all the parts to be detected of the workpiece and processing the images, judging whether the workpiece has defects or not, if not, placing the workpiece in a qualified product area, otherwise, executing the defect processing module;
the defect processing module is used for acquiring the position coordinates of the defects and generating grading processing instructions corresponding to the defects so as to control the robot to carry out grading polishing processing on the workpiece according to the position coordinates of the defects and the grading processing instructions;
the secondary detection processing module is used for acquiring images of the workpieces subjected to the graded polishing processing, judging whether the workpieces have defects according to the acquired images and finally controlling the robot to place the workpieces qualified for polishing in a qualified product area;
the defect processing module comprises:
an acquisition unit configured to acquire position coordinates of the defect;
a calculation unit for calculating a surface area of the defect;
the generating unit is used for generating a grading processing instruction corresponding to the defect according to the surface area of the defect;
the processing unit is used for controlling the robot to carry out grading polishing treatment on the workpiece according to the defect position coordinates and the grading treatment instruction;
the defect detection module includes:
the image acquisition unit is used for acquiring images of all parts to be detected of the workpiece;
the scanning unit is used for performing traversal scanning on the image of the part to be detected;
the calibration unit is used for carrying out binarization processing on the image and calibrating the coordinate value of each pixel point in the image;
the extracting unit is used for carrying out edge detection and contour extraction on the image according to the coordinate value;
and the judging unit is used for comparing the extracted contour with a preset defect contour database, judging that the workpiece has defects if corresponding defects are obtained, and otherwise, judging that the workpiece does not have defects.
CN201711470467.2A 2017-12-29 2017-12-29 Robot polishing method and system based on visual detection Active CN108226164B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711470467.2A CN108226164B (en) 2017-12-29 2017-12-29 Robot polishing method and system based on visual detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711470467.2A CN108226164B (en) 2017-12-29 2017-12-29 Robot polishing method and system based on visual detection

Publications (2)

Publication Number Publication Date
CN108226164A CN108226164A (en) 2018-06-29
CN108226164B true CN108226164B (en) 2021-05-07

Family

ID=62646832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711470467.2A Active CN108226164B (en) 2017-12-29 2017-12-29 Robot polishing method and system based on visual detection

Country Status (1)

Country Link
CN (1) CN108226164B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109030384B (en) * 2018-07-04 2020-12-29 中国航空制造技术研究院 Method for monitoring polishing quality on line
CN111195843A (en) * 2018-11-19 2020-05-26 中冶宝钢技术服务有限公司 Device and method for removing surface defects of steel billet
CN109986447A (en) * 2019-05-14 2019-07-09 中铁隧道局集团有限公司 Pipeline outer wall sander
CN110170922B (en) * 2019-06-03 2020-12-04 北京石油化工学院 Automatic grinding method, device and equipment
CN110625491A (en) * 2019-08-29 2019-12-31 中车青岛四方机车车辆股份有限公司 Polishing apparatus and polishing method
CN110977767B (en) * 2019-11-12 2021-07-02 长沙长泰机器人有限公司 Casting defect distribution detection method and casting polishing method
CN111203805A (en) * 2020-01-08 2020-05-29 苏州德龙激光股份有限公司 Full-automatic glass scratch repairing method
CN111558854A (en) * 2020-05-20 2020-08-21 山西太钢不锈钢股份有限公司 Surface grinding method for high-grade stainless steel plate for vehicle
CN111922496A (en) * 2020-08-11 2020-11-13 四川工程职业技术学院 Workpiece defect eliminating method and system based on plasma air gouging

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102120307B (en) * 2010-12-23 2012-07-04 中国科学院自动化研究所 System and method for grinding industrial robot on basis of visual information
CN106041946B (en) * 2016-05-23 2017-02-22 广东工业大学 Image-processing-based robot polishing production method and production system applying same
CN106770331B (en) * 2017-02-10 2020-03-10 南京航空航天大学 Workpiece defect detection system based on machine vision
CN107052950B (en) * 2017-05-25 2018-10-12 上海莫亭机器人科技有限公司 A kind of complex-curved sanding and polishing system and method

Also Published As

Publication number Publication date
CN108226164A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108226164B (en) Robot polishing method and system based on visual detection
Xu et al. Real-time image processing for vision-based weld seam tracking in robotic GMAW
CN103206949B (en) Self-detecting self-positioning universal screw machine and positioning method thereof
CN102539443B (en) Bottle body defect automatic detection method based on machine vision
CN109060836B (en) Machine vision-based high-pressure oil pipe joint external thread detection method
CN109324056B (en) Sewing thread trace measuring method based on machine vision
CN102529019B (en) Method for mould detection and protection as well as part detection and picking
Princely et al. Vision assisted robotic deburring of edge burrs in cast parts
CN106770321B (en) A kind of plastic part defect inspection method based on multi-threshold section
CN109692889B (en) Umbrella rod station adjusting system and method based on machine vision
CN104315977A (en) Rubber plug quality detection device and method
CN106370673A (en) Automatic lens flaw detection method
CN108144873B (en) Visual detection system and method
CN205184511U (en) Robot burring device
Hosseininia et al. Flexible automation in porcelain edge polishing using machine vision
CN106248694A (en) A kind of articulated cross shaft defect detecting device and detection method thereof
CN106645185A (en) Method and device for intelligently detecting surface quality of industrial parts
CN105904344A (en) On-machine detection and finishing device for diamond roller
CN109345500B (en) Machine vision-based method for calculating position of tool nose point of machine tool cutter
CN210090325U (en) Screwdriver bit detection device
Pfeifer et al. Measuring drill wear with digital image processing
CN109622404B (en) Automatic sorting system and method for micro-workpieces based on machine vision
TWI407242B (en) Multi - axis tool grinding machine tool grinding image detection system
CN109238165B (en) 3C product profile tolerance detection method
CN208125630U (en) Workpiece, defect detects hand-held device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant