CN109579766B - Automatic product appearance detection method and system - Google Patents

Automatic product appearance detection method and system

Info

Publication number
CN109579766B
CN109579766B (application CN201811583374.5A)
Authority
CN
China
Prior art keywords
coordinate system
workpiece
distance sensor
industrial robot
pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811583374.5A
Other languages
Chinese (zh)
Other versions
CN109579766A (en)
Inventor
李振瀚
杨帆
颜昌亚
汪敏敏
张子龙
黄坤涛
刘磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Hanhua Zhizao Intelligent Technology Co ltd
Original Assignee
Suzhou Hanhua Zhizao Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Hanhua Zhizao Intelligent Technology Co ltd filed Critical Suzhou Hanhua Zhizao Intelligent Technology Co ltd
Priority to CN201811583374.5A priority Critical patent/CN109579766B/en
Publication of CN109579766A publication Critical patent/CN109579766A/en
Application granted granted Critical
Publication of CN109579766B publication Critical patent/CN109579766B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00: Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/20: Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring contours or curvatures, e.g. determining profile
    • G01B21/02: Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring length, width, or thickness
    • G01B21/04: Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring length, width, or thickness by measuring coordinates of points
    • G01B5/00: Measuring arrangements characterised by the use of mechanical techniques
    • G01B5/0025: Measuring of vehicle parts

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a method and a system for automatically detecting the appearance of a product. The method comprises the following steps: S1: controlling an industrial robot to drive a reference ball and a distance sensor mounted on its end to move along a preset motion trajectory so that the reference ball and the distance sensor detect the target detection points on the surface of the workpiece to be detected; S2: acquiring the coordinates of the target points on the reference ball captured by the tracking of a three-dimensional tracker, and calculating the pose of the reference ball in the three-dimensional tracker coordinate system; S3: calculating the coordinates of each target detection point from the pose of the reference ball in the three-dimensional tracker coordinate system and the distance data measured by the distance sensor. By performing the appearance detection with an industrial robot, the invention can detect workpieces of various shapes and sizes without replacing the detection equipment, achieves high detection accuracy, adapts to complex and changing working environments, and avoids friction damage to the workpiece during detection.

Description

Automatic product appearance detection method and system
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a method and a system for automatically detecting the appearance of a product based on pose tracking and distance measurement.
Background
The moving-target pose tracking technology is an important research topic in the field of computer vision and has attracted wide attention from organizations in many countries. One implementation of pose tracking is as follows: target points made of special materials and having regular shapes are attached to the surface of the tracked object; because the target points reflect near-infrared light of a specific wavelength, interference from the environment is eliminated. The target tracker uses a high-precision industrial camera to receive the near-infrared light of that specific wavelength and then obtains the spatial pose of the target object in a reference coordinate system through algorithmic analysis. However, achieving stable and accurate target tracking under complex conditions still faces a series of problems, such as the complexity of the work site, changes in illumination, and occlusion of the targets.
An automobile is assembled from many parts and assemblies; assembly is the production process of putting the various parts and assemblies of the automobile together into a complete product according to specified technical conditions and quality requirements. In the assembly process, the quality of the whole vehicle can be guaranteed only when the assembly accuracy requirements are met, namely the relative position accuracy of each part and assembly and the fit accuracy between parts in relative sliding motion. The shape of each produced part must therefore meet the design requirements so that it can be matched smoothly and accurately with other products. However, owing to limitations of the production process, the shape of a produced product cannot be guaranteed to meet the design requirements, so the shape of each produced part needs to be inspected; workpiece appearance detection thus plays an important role in product quality inspection.
The existing appearance detection method for automobile accessory products mainly relies on customized inspection fixtures. When a curved automotive glass product is detected, it is clamped on a customized fixture, a digital dial indicator is installed at each detection point on the fixture, and multi-point detection is carried out with the dial indicators. This method requires a fixture specially customized for the glass product, and each detection point needs its own dial indicator, so the detection cost is high and the efficiency is low; moreover, contact between the dial indicator and the product surface may itself cause quality problems in the production of automobile parts.
Disclosure of Invention
The invention provides a method and a system for automatically detecting the appearance of a product, aiming to solve the problems that existing detection methods require specially customized inspection fixtures, incur high detection cost, and easily affect product quality through contact detection.
To achieve the above object, according to one aspect of the present invention, there is provided a method for automatically detecting a product outer shape, comprising the steps of:
s1: controlling an industrial robot to drive a reference ball and a distance sensor on the tail end of the industrial robot to move according to a preset motion track so that the reference ball and the distance sensor detect a target detection point on the surface of a workpiece to be detected;
s2: acquiring coordinates of a target point on a reference ball shot by the tracking of the three-dimensional tracker, and calculating pose data of the reference ball in a three-dimensional tracker coordinate system TrCS;
s3: calculating the coordinates of the target detection point according to the pose data and the distance data measured by the distance sensor; the calculation formula is as follows:
T_i + R_i·Q - d_i·R_i·U = Ps_i
where Ps_i is the coordinate of the i-th target detection point; d_i is the distance to the i-th target detection point measured by the distance sensor; {T_i, R_i} denotes the pose of the reference ball in the three-dimensional tracker coordinate system TrCS when the i-th target detection point is detected, T_i being the coordinate (x_i, y_i, z_i) of the origin of the reference spherical coordinate system TaCS in the three-dimensional tracker coordinate system TrCS, and R_i being the 3×3 rotation transformation matrix [x, y, z] from the three-dimensional tracker coordinate system TrCS to the reference spherical coordinate system TaCS; Q denotes the coordinates of the origin of the distance sensor coordinate system TCS in the reference spherical coordinate system TaCS, and U denotes the optical axis direction of the distance sensor in the reference spherical coordinate system TaCS.
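As an illustration only (not part of the patent text), the formula above can be evaluated directly in a few lines of Python with NumPy; the function name target_point and the argument names below are hypothetical and simply mirror the symbols defined in the preceding paragraph.

    import numpy as np

    def target_point(T_i, R_i, d_i, Q, U):
        """Coordinate Ps_i of the i-th target detection point in the tracker frame TrCS.

        T_i : (3,) origin of the reference spherical coordinate system TaCS in TrCS
        R_i : (3, 3) rotation transformation matrix between TrCS and TaCS, as used in the formula
        d_i : distance measured by the distance sensor for the i-th point
        Q   : (3,) origin of the distance sensor coordinate system TCS, expressed in TaCS
        U   : (3,) optical axis direction of the distance sensor, expressed in TaCS
        """
        T_i, Q, U = (np.asarray(v, dtype=float) for v in (T_i, Q, U))
        R_i = np.asarray(R_i, dtype=float)
        # Ps_i = T_i + R_i*Q - d_i*R_i*U
        return T_i + R_i @ Q - d_i * (R_i @ U)
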
Preferably, the method for automatically detecting the outer shape of the product further includes, before step S1, the steps of:
s01: calibrating the positions of each detection device and the workpiece, wherein the calibration comprises the calibration of the pose of a reference spherical coordinate system TaCS in a three-dimensional tracker coordinate system TrCS, the calibration of the pose of a distance sensor coordinate system TCS in an industrial robot flange coordinate system ECS, the calibration of the pose of a reference coordinate system WCS of a workpiece clamp to be detected in the three-dimensional tracker coordinate system TrCS, and the calibration of the pose of the reference coordinate system WCS of the workpiece clamp to be detected in the industrial robot reference coordinate system RCS;
s02: and determining the position relationship among the detection devices and between the detection devices and the workpiece according to the calibrated pose, generating the motion trail of the industrial robot according to the position relationship and the target detection point on the workpiece to be detected, and converting the motion trail into codes executable by the industrial robot.
Preferably, the method for automatically detecting the outer shape of the product is characterized in that the step S02 specifically includes the following substeps:
s021: respectively creating three-dimensional models of an industrial robot, a reference ball, a distance sensor, a connecting piece between the reference ball and the distance sensor, a three-dimensional tracker and a workpiece to be detected, assembling the three-dimensional models according to the position relation between each detection device and the workpiece, and building a virtual detection device;
s022: establishing a detection process step according to a target detection point on a workpiece to be detected, and performing simulation on the virtual detection equipment according to the detection process step to generate a motion track of the industrial robot;
s023: and converting the motion track into a track code format which can be recognized by an industrial robot.
Preferably, the method for automatically detecting the outer shape of the product further includes, after step S3, the steps of:
s4: and calculating the deviation between the coordinate data of each target detection point and the corresponding theoretical coordinate value, and judging that the workpiece to be detected meets the qualified standard when the deviation is smaller than a preset deviation threshold value.
According to another aspect of the present invention, there is also provided an automatic product appearance detection system, comprising a processor, a memory, and a computer program stored in the memory and executable in the processor, the computer program, when executed by the processor, implementing the steps of any of the above methods.
Preferably, the processor of the automatic product shape detection system comprises a control unit, a first calculation unit and a second calculation unit;
the control unit is used for controlling the industrial robot to drive the reference ball and the distance sensor on the tail end to move according to a preset motion track so that the reference ball and the distance sensor can detect a target detection point on the surface of the workpiece to be detected;
the first calculation unit is used for acquiring coordinates of a target point on a reference ball shot by the three-dimensional tracker in a tracking manner, and calculating pose data of the reference ball in a three-dimensional tracker coordinate system TrCS;
the second calculation unit is used for calculating the coordinates of the target detection point according to the pose data and the distance data measured by the distance sensor; the coordinate calculation formula of the ith target detection point is as follows:
T_i + R_i·Q - d_i·R_i·U = Ps_i
where Ps_i is the coordinate of the i-th target detection point; d_i is the distance to the i-th target detection point measured by the distance sensor; {T_i, R_i} denotes the pose of the reference ball in the three-dimensional tracker coordinate system TrCS when the i-th target detection point is detected; Q denotes the coordinates of the origin of the distance sensor coordinate system TCS in the reference spherical coordinate system TaCS, and U denotes the optical axis direction of the distance sensor in the reference spherical coordinate system TaCS.
Preferably, in the automatic product appearance detection system, in the pose of the reference ball in the three-dimensional tracker coordinate system TrCS, T_i is the coordinate T_i(x_i, y_i, z_i) of the origin of the reference spherical coordinate system TaCS in the three-dimensional tracker coordinate system TrCS, and R_i is the 3×3 rotation transformation matrix [x, y, z] from the three-dimensional tracker coordinate system TrCS to the reference spherical coordinate system TaCS.
Preferably, the automatic product shape detection system further comprises a calibration unit and a trajectory generation unit;
the calibration unit is used for calibrating the positions of each detection device and the workpiece, including calibrating the pose of the reference spherical coordinate system TaCS in the three-dimensional tracker coordinate system TrCS, the pose of the distance sensor coordinate system TCS in the industrial robot flange coordinate system ECS, the pose of the reference coordinate system WCS of the fixture of the workpiece to be detected in the three-dimensional tracker coordinate system TrCS, and the pose of the reference coordinate system WCS of the fixture of the workpiece to be detected in the industrial robot base coordinate system RCS;
the track generating unit is used for determining the position relation among the detection devices and between the detection devices and the workpiece according to the calibrated pose, generating the motion track of the industrial robot according to the position relation and the target detection point on the workpiece to be detected, and converting the motion track into codes executable by the industrial robot.
Preferably, in the automatic product shape detection system, the trajectory generation unit includes a model building module, a simulation module and a post-processing module;
the model establishing module is used for respectively establishing three-dimensional models of an industrial robot, a reference ball, a distance sensor, a connecting piece between the reference ball and the distance sensor, a three-dimensional tracker and a workpiece to be detected, and assembling the three-dimensional models according to the position relation between each detection device and the workpiece to obtain virtual detection devices;
the simulation module is used for creating a detection working step according to a target detection point on a workpiece to be detected, and performing simulation on the virtual detection equipment according to the detection working step to generate a motion trail of the industrial robot;
and the post-processing module is used for converting the motion trail into a trail code format which can be recognized by the industrial robot.
Preferably, the automatic product shape detection system further includes a judgment unit, where the judgment unit is configured to calculate a deviation between the coordinate data of each target detection point and a corresponding theoretical coordinate value, and when the deviation is smaller than a preset deviation threshold, judge that the workpiece to be detected meets the qualification standard, and visually present a judgment result.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
(1) in the method and system for automatically detecting the product appearance, an industrial robot performs the appearance detection: a motion trajectory is first generated for each product size or shape, and the industrial robot is then controlled to drive the reference ball and the distance sensor along that trajectory to detect the product surface. Compared with conventional customized inspection fixtures, the invention can detect products of various shapes and sizes without replacing the detection equipment, adapts to complex and changing working environments, and fully exploits the high flexibility and programmability of the industrial robot to achieve flexible detection of product appearance. In addition, the reference ball and the distance sensor never contact the workpiece surface during detection, so friction damage to the workpiece is avoided;
(2) in the method and system for automatically detecting the product appearance, the industrial robot drives the distance sensor along the theoretical trajectory while the three-dimensional tracker performs high-precision tracking measurement of the pose of the reference ball fixed to the robot end. The detection accuracy therefore does not depend directly on the spatial accuracy of the industrial robot, so the accuracy of the workpiece appearance detection result can be guaranteed to exceed the spatial accuracy of the industrial robot.
Drawings
FIG. 1 is a flow chart of a method for automatically detecting the shape of a product according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of the assembled whole detection device according to the embodiment of the present invention;
FIG. 3 is a schematic diagram of the relationship between the reference ball pose TaCS_i and the distance d_i measured by the distance sensor when the i-th target detection point is detected, according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a logical structure of a processor according to an embodiment of the present invention;
in all the figures, the same reference numerals denote the same features, in particular: ① three-dimensional tracker; ② industrial robot; ③ flange; ④ reference ball; ⑤ distance sensor; ⑥ workpiece to be detected; ⑦ clamp.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Fig. 1 is a flowchart of a method for automatically detecting a product appearance based on pose tracking and distance measurement according to this embodiment, and as shown in fig. 1, the method for automatically detecting a product appearance includes the following steps:
s1: arranging detection equipment and workpieces to be detected: the embodiment uses a HyperScan optical tracking 3D scanner as a pose tracking device, and comprises a three-dimensional tracker (I) and a reference ball (IV); after being fixedly connected with a distance sensor through a connecting piece, a reference ball (IV) and the distance sensor (IV) are arranged on a flange (III) of an industrial robot (IV) for testing, a workpiece to be tested is arranged on a clamp (VII) and is arranged in the working space range of the industrial robot (IV), a three-dimensional tracker (IV) is arranged on one side of the whole device and can be fixed on a bracket or a wall, the coordinate system of the three-dimensional tracker is ensured to be fixed, and the detection range of a lens of the three-dimensional tracker covers the working space of the robot; fig. 2 is a schematic structural diagram of the whole detection device after assembly is completed.
S2: after the assembly is completed, the positions of each detection device and the workpiece to be detected are calibrated, and the method comprises the following steps: calibrating the pose of a reference spherical coordinate system TaCS in a three-dimensional tracker coordinate system TrCS, calibrating the pose of a distance sensor coordinate system TCS under an industrial robot flange coordinate system ECS, calibrating the pose of a reference coordinate system WCS of a workpiece clamp to be tested in the three-dimensional tracker coordinate system TrCS, calibrating the pose of the reference coordinate system WCS of the workpiece clamp to be tested under the industrial robot base coordinate system RCS, calibrating an origin Q of the distance sensor coordinate system TCS in the reference spherical coordinate system TaCS, and calibrating the optical axis direction U of a distance sensor under the reference spherical coordinate system TaCS;
s3: and determining the position relations among the detection devices and between the detection devices and the workpiece according to the calibrated pose, generating the motion trail of the industrial robot according to the position relations, and converting the motion trail into codes executable by the industrial robot.
The method specifically comprises the following substeps:
s31: respectively creating three-dimensional models of an industrial robot, a reference ball, a distance sensor, a connecting piece between the reference ball and the distance sensor, a three-dimensional tracker and a workpiece to be detected, assembling the three-dimensional models of the industrial robot, the reference ball, the distance sensor, the connecting piece between the reference ball and the distance sensor and the three-dimensional tracker according to the position relationship among detection devices and between the detection devices and the workpiece, and building virtual detection devices;
s32: establishing a detection process step according to a target detection point on a workpiece to be detected, and performing simulation on virtual detection equipment according to the detection process step to generate a motion track of the industrial robot;
s33: carrying out post-processing on the motion track, and converting the motion track into a track code format which can be recognized by an industrial robot; the track code is used for controlling the industrial robot to move according to the motion track;
in the embodiment, the industrial robot is used for detecting the appearance of the workpiece, firstly, motion tracks are generated for products with different sizes or appearances, and then the industrial robot is controlled to drive the reference ball and the distance sensor to move according to the motion tracks so as to detect the surface of the product; compared with the conventional universal customized detection tool, the invention can detect glass products with various shapes and sizes without replacing detection equipment, can also adapt to complicated and changeable working environments, fully exerts the characteristics of high flexibility and programmability of an industrial robot and realizes flexible detection of product appearance; in addition, the reference ball and the distance sensor do not need to be in contact with the surface of the workpiece in the detection process, so that the workpiece is prevented from being damaged by friction in the detection process.
S4: starting a HyperScan optical tracking 3D scanner to track the pose, and simultaneously starting a distance sensor to detect the appearance of the product; the industrial robot drives the reference ball and the distance sensor on the tail end of the industrial robot to move according to the motion trail generated in the step S32, so that the reference ball and the distance sensor detect a target detection point on the surface of the workpiece to be detected;
s5: acquiring the coordinates of the target points on the reference ball captured by the tracking of the three-dimensional tracker, and calculating the pose data of the reference ball in the three-dimensional tracker coordinate system TrCS; the reference ball serves as the tracking target of the three-dimensional tracker, and the fixed three-dimensional tracker tracks and photographs the target points on the reference ball in real time to obtain the pose data of the reference ball throughout the detection process, thereby obtaining the detection pose trajectory of the reference ball in the three-dimensional tracker coordinate system; the pose data of the reference ball and the distance data measured by the distance sensor correspond to each other in time and space;
s6: calculating the coordinates of each target detection point according to the pose data of the reference ball in the three-dimensional tracker coordinate system TrCS and the distance data measured by the distance sensor; as shown in fig. 3, from the relationship between the reference ball pose TaCS_i and the measured distance d_i at the i-th target detection point, the coordinate calculation formula of the i-th target detection point is:
T_i + R_i·Q - d_i·R_i·U = Ps_i
where Ps_i is the coordinate of the i-th target detection point; d_i is the distance to the i-th target detection point measured by the distance sensor; {T_i, R_i} denotes the pose of the reference ball in the three-dimensional tracker coordinate system TrCS when the i-th target detection point is detected; Q denotes the coordinates of the origin of the distance sensor coordinate system TCS in the reference spherical coordinate system TaCS, and U denotes the optical axis direction of the distance sensor in the reference spherical coordinate system TaCS.
In the pose data of the reference ball in the three-dimensional tracker coordinate system TrCS, T_i is the coordinate T_i(x_i, y_i, z_i) of the origin of the reference spherical coordinate system TaCS in the three-dimensional tracker coordinate system TrCS, and R_i is the 3×3 rotation transformation matrix [x, y, z] from the three-dimensional tracker coordinate system TrCS to the reference spherical coordinate system TaCS.
S7: processing and analyzing the coordinate data of each target detection point obtained by detection according to a prestored judgment standard, and judging whether the appearance of the workpiece to be detected is qualified;
firstly, calculating the deviation between the coordinate data of each target detection point and the corresponding theoretical coordinate value, judging that the workpiece to be detected meets the qualified standard when the deviation of each target detection point is smaller than a set deviation threshold, and otherwise, judging that the workpiece to be detected is unqualified; the size of the deviation threshold is set according to the process requirement on the product shape, and this embodiment is not particularly limited.
The embodiment also provides an automatic product appearance detection system, which comprises a processor, a memory, and a computer program that is stored in the memory and can be executed by the processor; the processor instantiates a plurality of functional units, including a calibration unit, a trajectory generation unit, a control unit, a first calculation unit, a second calculation unit and a judgment unit; when the computer program is executed by the processor, these functional units implement steps S2-S7 of the above method.
The calibration unit is used for calibrating the positions of each detection device and the workpiece, including calibrating the pose of the reference spherical coordinate system TaCS in the three-dimensional tracker coordinate system TrCS, the pose of the distance sensor coordinate system TCS in the industrial robot flange coordinate system ECS, the pose of the reference coordinate system WCS of the fixture of the workpiece to be detected in the three-dimensional tracker coordinate system TrCS, the pose of that reference coordinate system WCS in the industrial robot base coordinate system RCS, the origin Q of the distance sensor coordinate system TCS in the reference spherical coordinate system TaCS, and the optical axis direction U of the distance sensor in the reference spherical coordinate system TaCS.
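The patent does not state how the calibrated poses are combined; a common implementation choice, shown here only as an assumption, is to store each calibrated pose as a 4×4 homogeneous transform and chain or invert the transforms to obtain any required relative pose (for example, the fixture frame WCS expressed in the reference spherical coordinate system TaCS). All function names and numeric values below are placeholders.

    import numpy as np

    def make_transform(R, t):
        """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
        T = np.eye(4)
        T[:3, :3] = np.asarray(R, dtype=float)
        T[:3, 3] = np.asarray(t, dtype=float)
        return T

    # Calibrated poses (placeholder values): X_from_Y maps coordinates given in frame Y into frame X.
    TrCS_from_TaCS = make_transform(np.eye(3), [0.50, 0.00, 1.20])  # reference ball frame in tracker frame
    ECS_from_TCS   = make_transform(np.eye(3), [0.00, 0.00, 0.15])  # sensor frame in robot flange frame
    TrCS_from_WCS  = make_transform(np.eye(3), [1.00, 0.30, 0.00])  # fixture frame in tracker frame
    RCS_from_WCS   = make_transform(np.eye(3), [1.20, 0.00, 0.00])  # fixture frame in robot base frame

    # Example: fixture frame WCS expressed in the reference spherical coordinate system TaCS,
    # obtained by chaining TaCS <- TrCS <- WCS.
    TaCS_from_WCS = np.linalg.inv(TrCS_from_TaCS) @ TrCS_from_WCS
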
The track generating unit is used for determining the position relation among the detection devices and between the detection devices and the workpiece according to the calibrated pose, generating a motion track of the industrial robot according to the position relation, converting the motion track into a track code which can be executed by the industrial robot and sending the track code to the control unit.
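As a sketch of the "track code" conversion mentioned here, the snippet below turns a list of simulated waypoints into a simple text program; the MOVEL-style command format is invented for illustration only, since the real output must follow the syntax of the specific robot controller.

    def waypoints_to_program(waypoints, speed_mm_s=100):
        """Convert (x, y, z, a, b, c) waypoints in the robot base frame RCS into a text program.

        The command format is hypothetical; a real post-processor targets the robot brand's own language.
        """
        lines = []
        for i, (x, y, z, a, b, c) in enumerate(waypoints):
            lines.append(
                f"MOVEL P{i} X={x:.3f} Y={y:.3f} Z={z:.3f} "
                f"A={a:.3f} B={b:.3f} C={c:.3f} V={speed_mm_s}"
            )
        return "\n".join(lines)

    # Example: two waypoints produce two move commands.
    print(waypoints_to_program([(800, 0, 500, 0, 90, 0), (820, 5, 498, 0, 90, 0)]))
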
Further, the track generation unit comprises a model building module, a simulation module and a post-processing module;
the model building module is used for respectively building three-dimensional models of an industrial robot, a reference ball, a distance sensor, a connecting piece between the reference ball and the distance sensor, a three-dimensional tracker and a workpiece to be detected, and assembling the three-dimensional models according to the position relation between each detection device and the workpiece to obtain virtual detection devices;
the simulation module is used for creating a detection working step according to a target detection point on a workpiece to be detected, and performing simulation on the virtual detection equipment according to the detection working step to generate a motion trail of the industrial robot;
and the post-processing module is used for converting the motion track generated by the simulation module into a track code format which can be recognized by the industrial robot and sending the track code format to the control unit.
The control unit, according to the received track code, controls the industrial robot to drive the reference ball and the distance sensor on its end to move along the preset motion trajectory, so that the reference ball and the distance sensor detect the target detection points on the surface of the workpiece to be detected.
The first calculation unit is used for acquiring coordinates of a target point on a reference ball shot by the three-dimensional tracker in a tracking mode, and calculating pose data of the reference ball in a three-dimensional tracker coordinate system TrCS.
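The patent does not specify how the first calculation unit turns the tracked target-point coordinates into the ball pose {T_i, R_i}; one standard approach, shown here purely as an assumption, is SVD-based rigid registration (the Kabsch algorithm) between the target points' known positions in the reference spherical coordinate system TaCS and their measured positions in TrCS.

    import numpy as np

    def estimate_ball_pose(pts_tacs, pts_trcs):
        """Return (R, T) such that pts_trcs ≈ R @ pts_tacs + T, for two (N, 3) point sets."""
        A = np.asarray(pts_tacs, dtype=float)
        B = np.asarray(pts_trcs, dtype=float)
        ca, cb = A.mean(axis=0), B.mean(axis=0)
        H = (A - ca).T @ (B - cb)                      # 3x3 cross-covariance of centered points
        U_svd, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U_svd.T))])  # guard against reflections
        R = Vt.T @ D @ U_svd.T
        T = cb - R @ ca
        return R, T
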
The second calculating unit is used for calculating the coordinates of each target detection point according to the pose data of the reference ball in the three-dimensional tracker coordinate system TrCS and the distance data measured by the distance sensor; the coordinate calculation formula of the ith target detection point is as follows:
T_i + R_i·Q - d_i·R_i·U = Ps_i
where Ps_i is the coordinate of the i-th target detection point; d_i is the distance to the i-th target detection point measured by the distance sensor; {T_i, R_i} denotes the pose of the reference ball in the three-dimensional tracker coordinate system TrCS when the i-th target detection point is detected; Q denotes the coordinates of the origin of the distance sensor coordinate system TCS in the reference spherical coordinate system TaCS, and U denotes the optical axis direction of the distance sensor in the reference spherical coordinate system TaCS.
After the second calculation unit obtains the coordinate data of each target detection point, the judgment unit first calculates the deviation between the coordinate data of each target detection point and the corresponding theoretical coordinate value, and then compares each deviation with the set deviation threshold; when every deviation is smaller than the threshold, the workpiece to be detected is judged to be a qualified product, otherwise it is judged to be unqualified; the judgment result is then presented visually so that the operator can directly see the detection result.
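A minimal sketch of the judgment rule described above, in Python; the threshold value is a placeholder, since the patent leaves it to the process requirements on the product shape.

    import numpy as np

    def inspect_workpiece(measured_pts, nominal_pts, threshold=0.5):
        """Return (qualified, deviations) for corresponding (N, 3) measured and theoretical points."""
        deviations = np.linalg.norm(
            np.asarray(measured_pts, dtype=float) - np.asarray(nominal_pts, dtype=float), axis=1
        )
        qualified = bool(np.all(deviations < threshold))  # qualified only if every point is within tolerance
        return qualified, deviations
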
In the above embodiment, all functions of the processor may be implemented by a single computer program stored in the memory. It should be noted that these functions may also be divided among several computer programs loaded on different hardware devices and implemented cooperatively; for example, the functions of the control unit may be implemented by a program loaded on the control module of the industrial robot, and the functions of the first calculation unit may be implemented by a software program provided in the three-dimensional tracker.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (9)

1. A method for automatically detecting the appearance of a product is characterized by comprising the following steps:
s1: controlling an industrial robot to drive a reference ball and a distance sensor on the tail end of the industrial robot to move according to a preset motion track so that the reference ball and the distance sensor detect a target detection point on the surface of a workpiece to be detected;
s2: acquiring coordinates of a target point on a reference ball shot by tracking through a three-dimensional tracker, and calculating pose data of the reference ball in a three-dimensional tracker coordinate system;
s3: calculating the coordinates of each target detection point according to the pose data and the distance data measured by the distance sensor; the coordinate calculation formula of the ith target detection point is as follows:
T_i + R_i·Q - d_i·R_i·U = Ps_i
where Ps_i is the coordinate of the i-th target detection point; d_i is the distance to the i-th target detection point measured by the distance sensor; {T_i, R_i} denotes the pose of the reference ball in the three-dimensional tracker coordinate system when the i-th target detection point is detected, T_i being the coordinate T_i(x_i, y_i, z_i) of the origin of the reference spherical coordinate system in the three-dimensional tracker coordinate system, and R_i being the 3×3 rotation transformation matrix [x, y, z] from the three-dimensional tracker coordinate system to the reference spherical coordinate system; Q denotes the coordinate of the origin of the distance sensor coordinate system in the reference spherical coordinate system; and U denotes the optical axis direction of the distance sensor in the reference spherical coordinate system.
2. The method for automatically detecting the external shape of a product according to claim 1, wherein the step S1 is preceded by the steps of:
s01: calibrating the positions of each detection device and the workpiece, wherein the calibration comprises calibrating the pose of a reference spherical coordinate system in a three-dimensional tracker coordinate system, calibrating the pose of a distance sensor coordinate system under an industrial robot flange coordinate system, calibrating the pose of a reference coordinate system of a workpiece clamp to be detected in the three-dimensional tracker coordinate system, and calibrating the pose of the reference coordinate system of the workpiece clamp to be detected under the industrial robot base coordinate system;
s02: and determining the position relation among the detection devices and between the detection devices and the workpiece according to the calibrated pose, and generating the motion trail of the industrial robot according to the position relation and the target detection point on the workpiece to be detected.
3. The method for automatically detecting the external shape of the product according to claim 2, wherein the step S02 specifically comprises the following sub-steps:
s021: respectively creating an industrial robot, a reference ball, a distance sensor, a connecting piece between the reference ball and the distance sensor, a three-dimensional tracker and a three-dimensional model of a workpiece to be detected, assembling the three-dimensional models according to the pose relationship between each detection device and the workpiece, and building a virtual detection device;
s022: establishing a detection process step according to a target detection point on a workpiece to be detected, and performing simulation on the virtual detection equipment according to the detection process step to generate a motion track of the industrial robot;
s023: and converting the motion track into a track code format which can be recognized by an industrial robot.
4. The method for automatically detecting the outer shape of the product according to claim 1 or 3, wherein the step S3 is followed by the steps of:
s4: and calculating the deviation between the coordinate data of each target detection point and the corresponding theoretical coordinate value, and judging that the workpiece to be detected meets the qualified standard when the deviation is smaller than a preset deviation threshold value.
5. An automatic detection system for product appearance, comprising a processor, a memory and a computer program stored in the memory and executable in the processor, wherein the computer program when executed by the processor implements the steps of any one of claims 1 to 4.
6. The automatic product appearance inspection system of claim 5, wherein the processor comprises a control unit, a first computing unit, and a second computing unit;
the control unit is used for controlling the industrial robot to drive the reference ball and the distance sensor on the tail end to move according to a preset motion track so that the reference ball and the distance sensor can detect a target detection point on the surface of the workpiece to be detected;
the first calculation unit is used for acquiring coordinates of a target point on a reference ball shot by the three-dimensional tracker in a tracking manner, and calculating pose data of the reference ball in a three-dimensional tracker coordinate system;
the second calculation unit is used for calculating the coordinates of the target detection point according to the pose data and the distance data measured by the distance sensor; the coordinate calculation formula of the ith target detection point is as follows:
T_i + R_i·Q - d_i·R_i·U = Ps_i
where Ps_i is the coordinate of the i-th target detection point; d_i is the distance to the i-th target detection point measured by the distance sensor; {T_i, R_i} denotes the pose of the reference ball in the three-dimensional tracker coordinate system when the i-th target detection point is detected; Q denotes the coordinate of the origin of the distance sensor coordinate system in the reference spherical coordinate system, and U denotes the optical axis direction of the distance sensor in the reference spherical coordinate system.
7. The automatic product appearance inspection system of claim 6, further comprising a calibration unit and a trajectory generation unit;
the calibration unit is used for calibrating the positions of each detection device and the workpiece, and comprises a position of a calibration reference spherical coordinate system in a three-dimensional tracker coordinate system, a position of a calibration distance sensor coordinate system in an industrial robot flange coordinate system, a position of a calibration workpiece fixture reference coordinate system in the three-dimensional tracker coordinate system, and a position of the calibration workpiece fixture reference coordinate system in an industrial robot base coordinate system;
the track generating unit is used for determining the position relation among the detection devices and between the detection devices and the workpiece according to the calibrated pose, and generating the motion track of the industrial robot according to the position relation and the target detection point on the workpiece to be detected.
8. The automatic product appearance inspection system of claim 7, wherein the trajectory generation unit comprises a model building module, a simulation module, and a post-processing module;
the model building module is used for respectively building an industrial robot, a reference ball, a distance sensor, a connecting piece between the reference ball and the distance sensor, a three-dimensional tracker and a three-dimensional model of a workpiece to be detected, assembling the three-dimensional models according to the pose relation between each detection device and the workpiece, and building a virtual detection device;
the simulation module is used for creating a detection working step according to a target detection point on a workpiece to be detected, and performing simulation on the virtual detection equipment according to the detection working step to generate a motion trail of the industrial robot;
and the post-processing module is used for converting the motion trail into a trail code format which can be recognized by the industrial robot.
9. The automatic product appearance inspection system according to claim 7 or 8, further comprising a judgment unit for calculating a deviation amount between the coordinate data of each target inspection point and the corresponding theoretical coordinate value, and judging that the workpiece to be inspected satisfies the qualification standard when the deviation amounts are less than a preset deviation threshold.
CN201811583374.5A 2018-12-24 2018-12-24 Automatic product appearance detection method and system Active CN109579766B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811583374.5A CN109579766B (en) 2018-12-24 2018-12-24 Automatic product appearance detection method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811583374.5A CN109579766B (en) 2018-12-24 2018-12-24 Automatic product appearance detection method and system

Publications (2)

Publication Number Publication Date
CN109579766A CN109579766A (en) 2019-04-05
CN109579766B true CN109579766B (en) 2020-08-11

Family

ID=65931535

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811583374.5A Active CN109579766B (en) 2018-12-24 2018-12-24 Automatic product appearance detection method and system

Country Status (1)

Country Link
CN (1) CN109579766B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107214692B (en) * 2016-03-22 2020-04-03 泰科电子(上海)有限公司 Automatic calibration method of robot system
CN110053048B (en) * 2019-04-22 2023-06-16 青岛科技大学 Remote control method
CN110202575B (en) * 2019-06-05 2022-07-19 武汉工程大学 Robot target track precision compensation method for industrial measurement
CN110260786B (en) * 2019-06-26 2020-07-10 华中科技大学 Robot vision measurement system based on external tracking and calibration method thereof
CN110987966A (en) * 2019-12-27 2020-04-10 上海天马微电子有限公司 Detection method and detection system for curved substrate
CN112288823B (en) * 2020-10-15 2022-12-06 武汉工程大学 Calibration method of standard cylinder curved surface point measuring equipment
CN113483716B (en) * 2021-05-21 2022-08-30 重庆大学 Curved surface part machining method based on flexible sensor
CN115598128A (en) * 2021-07-09 2023-01-13 宝山钢铁股份有限公司(Cn) Surface defect imaging device and method for real-time tracking of large-span cold-rolled strip steel curved surface
CN113298194B (en) * 2021-07-26 2021-10-19 中大检测(湖南)股份有限公司 Data fusion method and system based on multiple sensors and storage medium
CN114894116B (en) * 2022-04-08 2024-02-23 苏州瀚华智造智能技术有限公司 Measurement data fusion method and non-contact measurement equipment
CN116124081B (en) * 2023-04-18 2023-06-27 菲特(天津)检测技术有限公司 Non-contact workpiece detection method and device, electronic equipment and medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0504755A3 (en) * 1991-03-22 1993-03-17 Firma Carl Zeiss Method and arrangement to detect edges and bores with an optical sensing head
CN102087096A (en) * 2010-11-12 2011-06-08 浙江大学 Automatic calibration apparatus for robot tool coordinate system based on laser tracking measurement and method thereof
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense
CN103895023A (en) * 2014-04-04 2014-07-02 中国民航大学 Mechanical arm tail end tracking and measuring system and method based on coding azimuth device
CN103895042A (en) * 2014-02-28 2014-07-02 华南理工大学 Industrial robot workpiece positioning grabbing method and system based on visual guidance
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
CN104858870A (en) * 2015-05-15 2015-08-26 江南大学 Industrial robot measurement method based on tail end numbered tool
CN106272424A (en) * 2016-09-07 2017-01-04 华中科技大学 A kind of industrial robot grasping means based on monocular camera and three-dimensional force sensor
EP3147086A1 (en) * 2015-09-22 2017-03-29 Airbus Defence and Space GmbH Automation of robot operations
CN107543495A (en) * 2017-02-17 2018-01-05 北京卫星环境工程研究所 Spacecraft equipment autocollimation measuring system, alignment method and measuring method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3733364B2 (en) * 2003-11-18 2006-01-11 ファナック株式会社 Teaching position correction method
CN108286949B (en) * 2017-12-29 2020-07-14 北京卫星制造厂 Movable three-dimensional detection robot system
CN209085582U (en) * 2018-12-24 2019-07-09 苏州瀚华智造智能技术有限公司 A kind of product shape detection device

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0504755A3 (en) * 1991-03-22 1993-03-17 Firma Carl Zeiss Method and arrangement to detect edges and bores with an optical sensing head
CN102087096A (en) * 2010-11-12 2011-06-08 浙江大学 Automatic calibration apparatus for robot tool coordinate system based on laser tracking measurement and method thereof
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense
CN103895042A (en) * 2014-02-28 2014-07-02 华南理工大学 Industrial robot workpiece positioning grabbing method and system based on visual guidance
CN103895023A (en) * 2014-04-04 2014-07-02 中国民航大学 Mechanical arm tail end tracking and measuring system and method based on coding azimuth device
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
CN104858870A (en) * 2015-05-15 2015-08-26 江南大学 Industrial robot measurement method based on tail end numbered tool
EP3147086A1 (en) * 2015-09-22 2017-03-29 Airbus Defence and Space GmbH Automation of robot operations
CN106272424A (en) * 2016-09-07 2017-01-04 华中科技大学 A kind of industrial robot grasping means based on monocular camera and three-dimensional force sensor
CN107543495A (en) * 2017-02-17 2018-01-05 北京卫星环境工程研究所 Spacecraft equipment autocollimation measuring system, alignment method and measuring method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Verification of an articulated arm coordinate measuring machine using a laser tracker as reference equipment and an indexed metrology platform"; R. Acero et al.; Measurement; 2015; entire document *
"Robot motion trajectory tracking based on a laser sensor" (基于激光传感器的机器人运动轨迹跟踪); Yu Sheng (于生) et al.; Laser Journal (激光杂志); 2016; entire document *
"Research progress in videometric measurement of deformation and topography of large structures" (大型结构变形及形貌摄像测量技术研究进展); Yu Qifeng (于起峰); Journal of Experimental Mechanics (实验力学); 2011; entire document *

Also Published As

Publication number Publication date
CN109579766A (en) 2019-04-05

Similar Documents

Publication Publication Date Title
CN109579766B (en) Automatic product appearance detection method and system
CN108818536B (en) Online offset correction method and device for robot hand-eye calibration
CN107972070B (en) Method and system for testing performance of robot and computer readable storage medium
Zhuang et al. Simultaneous calibration of a robot and a hand-mounted camera
CN103759635B (en) The scanning survey robot detection method that a kind of precision is unrelated with robot
CN102706277B (en) Industrial robot online zero position calibration device based on all-dimensional point constraint and method
CN109794963B (en) Robot rapid positioning method facing curved surface component
TWI517101B (en) Calibration system and method for 3d scanner
JP2021527220A (en) Methods and equipment for identifying points on complex surfaces in space
CN107053216A (en) The automatic calibration method and system of robot and end effector
CN107726980B (en) Calibration method of linear laser displacement sensor based on four-axis measuring machine
Rémy et al. Hand-eye calibration
CN113681559A (en) Line laser scanning robot hand-eye calibration method based on standard cylinder
CN114459345B (en) Aircraft fuselage position and posture detection system and method based on visual space positioning
Evangelista et al. An unified iterative hand-eye calibration method for eye-on-base and eye-in-hand setups
US9804252B2 (en) System and method for measuring tracker system accuracy
CN109773589A (en) Method and device, the equipment of on-line measurement and processing guiding are carried out to workpiece surface
CN105571491A (en) Binocular vision-based automobile chassis data measuring system and method thereof
Rüther et al. The narcissistic robot: Robot calibration using a mirror
Antonelli et al. Training by demonstration for welding robots by optical trajectory tracking
CN108195354A (en) A kind of vehicle positioning method and vehicle positioning system based on robot
CN112037190A (en) Intelligent welding spot leakage detecting system and method for electric appliance refrigerant pipeline
CN106931879B (en) Binocular error measurement method, device and system
Wang et al. A Pose Estimation and Optimization Method for Mobile Manipulator’s End-effectors Based on Stereo Vision and ICoP Algorithm
Yang et al. Beam orientation of EAST visible optical diagnostic using a robot-camera system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant