CN116810359A - Visual positioning-based three-dimensional screwing sequence fool-proofing control method and system


Info

Publication number: CN116810359A
Application number: CN202310719355.5A
Authority: CN (China)
Prior art keywords: coordinate system, tool, point set, workpiece, dimensional
Legal status: Pending
Other languages: Chinese (zh)
Inventor: Lu Min (鲁敏)
Assignee (current and original): SUZHOU CHAOJI INFORMATION TECHNOLOGY CO LTD

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application provides a visual positioning-based three-dimensional screwing sequence fool-proofing control method and system, relating to the technical field of screw tightening. The method comprises: constructing two world coordinate systems from a workpiece and a tool; determining a feature point set, the three-dimensional coordinates of all feature points in the feature point set, the three-dimensional coordinates of all screws in the workpiece coordinate system, and the three-dimensional coordinates of the tool execution end in the tool coordinate system; detecting the two-dimensional image coordinates of the workpiece feature point set and the tool feature point set; obtaining a homogeneous transformation matrix from the tool coordinate system to the workpiece coordinate system; calculating the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system; and obtaining the screw number corresponding to the current tool execution end vertex, thereby controlling the tightening sequence. The application solves the problems that, during the assembly of complex workpieces, the existing tightening process makes spatial tightening-sequence control difficult to implement and easily introduces hidden quality risks.

Description

Visual positioning-based three-dimensional screwing sequence fool-proofing control method and system
Technical Field
The application relates to the technical field of screw tightening, in particular to a three-dimensional tightening sequence fool-proof control method and system based on visual positioning.
Background
In the assembly of complex workpieces, scenes with requirements on the tightening sequence must be managed and controlled in real time. However, because the tightening points of a complex workpiece may lie on different planes and at different angles, some fall into the camera's blind areas, so the existing vision system cannot locate these points by direct line of sight. As a result, during complex workpiece assembly the existing tightening process makes spatial tightening-sequence control difficult to implement and easily introduces hidden quality risks.
Disclosure of Invention
Therefore, the embodiments of the application provide a three-dimensional tightening sequence fool-proof control method and system based on visual positioning, which solve the problems that, in the prior art, the existing tightening process makes spatial tightening-sequence control during complex workpiece assembly difficult to implement and easily introduces hidden quality risks.
In order to solve the above problems, an embodiment of the present application provides a three-dimensional tightening sequence fool-proof control method based on visual positioning, the method including:
s1: constructing two world coordinate systems by using a workpiece and a tool, wherein the two world coordinate systems are respectively a workpiece coordinate system and a tool coordinate system;
s2: determining a feature point set A on a workpiece, three-dimensional coordinates of all feature points in the feature point set A and three-dimensional coordinates of all screws on the workpiece coordinate system based on the workpiece coordinate system, automatically generating screw numbers according to the three-dimensional coordinates of the screws, and arranging screw installation sequences; determining a feature point set B on the tool, three-dimensional coordinates of all feature points in the feature point set B and three-dimensional coordinates of a tool execution end on the tool coordinate system based on the tool coordinate system;
s3: acquiring a picture of a product by using a camera, and detecting two-dimensional image coordinates of a workpiece feature point set A and a tool feature point set B;
s4: obtaining a homogeneous transformation matrix T from a tool coordinate system to a workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system;
s5: according to the three-dimensional coordinates of the vertex of the tool execution end in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system, calculating to obtain the three-dimensional coordinates of the vertex of the tool execution end in the workpiece coordinate system;
s6: calculating, in the workpiece coordinate system, the distances between the tool execution end vertex and all screws, and obtaining the screw number corresponding to the current tool execution end vertex; if the screw number is consistent with the screw number specified by the prescribed installation sequence, enabling the tool and setting the corresponding tightening force for the screw, otherwise not enabling the tool; and repeating steps S3-S6 until all screws are tightened.
Preferably, the method for determining the feature point set A on the workpiece or the feature point set B on the tool is as follows:
if the feature points are in the same plane, the number of the feature points is not less than 4; if the feature points are not in one plane, the number of feature points is not less than 6.
The method for acquiring a picture of the product with a camera and detecting the two-dimensional image coordinates of the workpiece feature point set A and the tool feature point set B is as follows:
A picture of the product is acquired with the camera, and the two-dimensional image coordinates of the workpiece feature set A and the tool feature set B on the picture are detected by deep learning or traditional vision algorithms.
Preferably, the camera intrinsic matrix and the camera distortion coefficients are obtained through camera calibration:

cameraMatrix = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]

distCoeffs = [k1, k2, p1, p2, k3]

wherein cameraMatrix denotes the camera intrinsic matrix, distCoeffs denotes the camera distortion coefficients, k1, k2, k3 denote the radial distortion coefficients, p1, p2 denote the tangential distortion coefficients, and the intrinsic matrix contains the focal lengths (fx, fy) and the optical center (cx, cy).
Preferably, the method for obtaining the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system comprises the following steps:
obtaining a homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A and the three-dimensional coordinates of the feature point set A in the workpiece coordinate system; obtaining a homogeneous transformation matrix T2 of the camera coordinate system relative to the tool coordinate system according to the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system; the two homogeneous transformation matrices T1 and T2 are used for obtaining the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system.
Preferably, the method for obtaining the homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set a, the three-dimensional coordinates of the feature point set a in the workpiece coordinate system, the camera internal parameters and the camera distortion parameters comprises the following steps:
according to the corresponding relation between the two-dimensional image coordinates of the feature point set A and the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the solvePnP function in opencv is used for obtaining initial pose estimation of the camera pose, and the rotation vector and the translation vector of the pose are used for obtaining a homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece.
Preferably, the initial pose estimate is used as the starting value for iterative optimization with a nonlinear optimization algorithm such as Levenberg-Marquardt or Gauss-Newton, which iteratively adjusts the pose parameters so as to minimize the reprojection error.
Preferably, the method for calculating the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system according to the three-dimensional coordinates of the tool execution end vertex in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system is as follows:
p1 = T * p2

wherein p2 is the homogeneous form [x, y, z, 1] of the tool execution end vertex, with x, y, z being its three-dimensional coordinates in the tool coordinate system; p1 = [x', y', z', w] denotes the homogeneous coordinates of the tool execution end vertex in the workpiece coordinate system; dividing the homogeneous coordinates by their last element gives [x'/w, y'/w, z'/w], i.e. the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system.
Preferably, the method for calculating the distance between the end vertex of the tool execution and all the screws in the workpiece coordinate system and obtaining the screw number corresponding to the end vertex of the current tool execution is as follows:
In the workpiece coordinate system, the distances between the tool execution end vertex and all screws are calculated, and the minimum distance is compared with a set threshold; if the minimum distance is smaller than the threshold, the screw number corresponding to the current tool execution end vertex is judged to be the screw number corresponding to the minimum distance; otherwise, the current tool execution end vertex is judged not to be at any screw position.
The embodiment of the application also provides a three-dimensional tightening sequence fool-proof control system based on visual positioning, which comprises:
the coordinate system construction module is used for constructing two world coordinate systems by using the workpiece and the tool, namely a workpiece coordinate system and a tool coordinate system;
the three-dimensional coordinate acquisition module is used for determining a characteristic point set A on a workpiece, three-dimensional coordinates of all characteristic points in the characteristic point set A and three-dimensional coordinates of all screws on the workpiece coordinate system based on the workpiece coordinate system, automatically generating screw numbers according to the three-dimensional coordinates of the screws, and arranging screw installation sequences; determining a feature point set B on the tool, three-dimensional coordinates of all feature points in the feature point set B and three-dimensional coordinates of a tool execution end on the tool coordinate system based on the tool coordinate system;
the detection module is used for acquiring pictures of products by using a camera and detecting two-dimensional image coordinates of the workpiece feature point set A and the tool feature point set B;
the transformation matrix acquisition module is used for obtaining a homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system;
the transformation module is used for calculating to obtain the three-dimensional coordinates of the tool execution terminal vertex in the workpiece coordinate system according to the three-dimensional coordinates of the tool execution terminal vertex in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system;
the control module is used for calculating, in the workpiece coordinate system, the distances between the tool execution end vertex and all screws and obtaining the screw number corresponding to the current tool execution end vertex; if the screw number is consistent with the screw number specified by the prescribed installation sequence, the tool is enabled and the corresponding tightening force for the screw is set, otherwise the tool is not enabled; the detection module, the transformation matrix acquisition module, the transformation module and the control module are executed repeatedly until all screws are tightened.
From the above technical scheme, the application has the following advantages:
the application provides a visual positioning-based three-dimensional screwing sequence fool-proof control method and a visual positioning-based three-dimensional screwing sequence fool-proof control system, which automatically generate screw numbers according to three-dimensional coordinates of screws and arrange screw installation sequences; the detection accuracy of the feature point set in the two-dimensional image is improved by using deep learning; the homogeneous transformation matrix of the tool coordinate system and the workpiece coordinate system is obtained through the homogeneous transformation matrix of the workpiece coordinate system and the camera coordinate system and the homogeneous transformation matrix of the tool coordinate system and the camera coordinate system, and then the three-dimensional coordinate of the vertex of the execution end of the tool in the workpiece coordinate system can be obtained; finally, the distance between the vertex of the execution end of the tool and all screws is calculated in a workpiece coordinate system, so that the screw number corresponding to the vertex of the execution end of the current tool is obtained, and the screwing order is controlled; the application can realize real-time MPI guidance and management and control of the installation sequence. And can assist force sensor to carry out torsion control to every screw, each screw is screwed down to more accurate.
Drawings
In order to describe the embodiments of the application or the solutions in the prior art more clearly, the accompanying drawings used in the embodiments are briefly introduced below. They illustrate the features and advantages of the application by way of example and are not to be interpreted as limiting the application in any way; from these drawings, a person skilled in the art can obtain other figures without inventive effort. Wherein:
fig. 1 is a flow chart of a visual positioning-based three-dimensional tightening sequence fool-proof control method provided in an embodiment;
FIG. 2 is a schematic diagram of a workpiece coordinate system established by a workpiece in an embodiment;
FIG. 3 is a schematic diagram of a tool coordinate system established by a tool in an embodiment;
FIG. 4 is a schematic view of feature points and screw coordinates in a workpiece coordinate system according to an embodiment;
FIG. 5 is a schematic diagram of the coordinates of the feature points and the end vertices of the tool in the tool coordinate system according to the embodiment;
fig. 6 is a block diagram of a three-dimensional tightening sequence fool-proof control system based on visual localization provided in accordance with an embodiment.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the application clearer, the technical solutions of the embodiments will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, embodiments of the application. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
Example 1
As shown in fig. 1, an embodiment of the present application provides a three-dimensional tightening sequence fool-proof control method based on visual positioning, which includes:
s1: constructing two world coordinate systems by using a workpiece and a tool, wherein the two world coordinate systems are respectively a workpiece coordinate system and a tool coordinate system;
s2: determining a feature point set A on a workpiece, three-dimensional coordinates of all feature points in the feature point set A and three-dimensional coordinates of all screws on the workpiece coordinate system based on the workpiece coordinate system, automatically generating screw numbers according to the three-dimensional coordinates of the screws, and arranging screw installation sequences; determining a feature point set B on the tool, three-dimensional coordinates of all feature points in the feature point set B and three-dimensional coordinates of a tool execution end on the tool coordinate system based on the tool coordinate system;
s3: acquiring a picture of a product by using a camera, and detecting two-dimensional image coordinates of a workpiece feature point set A and a tool feature point set B;
s4: obtaining a homogeneous transformation matrix T from a tool coordinate system to a workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system;
s5: according to the three-dimensional coordinates of the vertex of the tool execution end in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system, calculating to obtain the three-dimensional coordinates of the vertex of the tool execution end in the workpiece coordinate system;
s6: calculating, in the workpiece coordinate system, the distances between the tool execution end vertex and all screws, and obtaining the screw number corresponding to the current tool execution end vertex; if the screw number is consistent with the screw number specified by the prescribed installation sequence, enabling the tool and setting the corresponding tightening force for the screw, otherwise not enabling the tool; and repeating steps S3-S6 until all screws are tightened.
The application provides a visual positioning-based three-dimensional screwing sequence fool-proof control method, which automatically generates screw numbers from the three-dimensional coordinates of the screws and arranges the screw installation sequence; deep learning is used to improve the detection accuracy of the feature point sets in the two-dimensional image; the homogeneous transformation matrix between the tool coordinate system and the workpiece coordinate system is obtained from the homogeneous transformation matrix between the workpiece coordinate system and the camera coordinate system together with that between the tool coordinate system and the camera coordinate system, from which the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system can be obtained; finally, the distances between the tool execution end vertex and all screws are calculated in the workpiece coordinate system, so that the screw number corresponding to the current tool execution end vertex is obtained and the tightening order is controlled. The application can realize real-time MPI guidance and management and control of the installation sequence. In addition, a force sensor can assist in torque control for each screw, so that every screw is tightened more accurately.
Further, in step S1, the origin of a world coordinate system is selected on the workpiece and its x, y, z directions are determined; this coordinate system is referred to as the workpiece coordinate system. As shown in fig. 2, the coordinate system need not be placed at the edge of the workpiece and is selected according to the actual situation. Likewise, a world coordinate system is selected on the tool and its x, y, z directions are determined; this coordinate system is referred to as the tool coordinate system, as shown in fig. 3, and again it need not be placed at the tool edge. Both the workpiece coordinate system and the tool coordinate system are right-handed coordinate systems.
Further, in step S2, the feature point set A on the workpiece is determined. The number of feature points is not less than 4; if there are exactly 4 feature points, they must lie in the same plane, and if the feature points do not lie in one plane, at least 6 are required. The feature points may be corner points or any other identifiable distinctive points; the 4 corner points in fig. 4, for example, constitute feature point set A. The three-dimensional coordinates of all feature points in set A and of all screws in the workpiece coordinate system are then determined. Screw numbers are generated automatically from the three-dimensional coordinates of the screws, e.g. the 6 screws numbered 1, 2, 3, 4, 5, 6 indicated by grey dots in fig. 4, and the MPI orchestrator arranges the screw installation sequence. Automatic screw numbering works as follows: the screws are first grouped into major groups by their z coordinates, then into subgroups by their y coordinates, and the specific number of the current screw is calculated from its x coordinate together with the two group indices. The three-dimensional coordinates of the screws in the workpiece coordinate system in fig. 4 are [[x01, y01, z01], [x02, y02, z02], ...].
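The grouping rule described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the patent does not give grouping tolerances, so the quantization step used to decide when two screws share a z or y group is an assumption.

```python
import numpy as np

def number_screws(coords, z_tol=1.0, y_tol=1.0):
    """Assign 1-based screw numbers from 3-D coordinates: group by z
    (major group), then by y (subgroup), then order by x within each
    subgroup. z_tol / y_tol are assumed quantization steps deciding
    when two coordinates fall into the same group."""
    coords = np.asarray(coords, dtype=float)
    # Quantize z and y so screws on the same plane / row share an index.
    zg = np.round(coords[:, 2] / z_tol).astype(int)
    yg = np.round(coords[:, 1] / y_tol).astype(int)
    # Lexicographic order: z group first, then y group, then x coordinate.
    order = np.lexsort((coords[:, 0], yg, zg))
    numbers = np.empty(len(coords), dtype=int)
    numbers[order] = np.arange(1, len(coords) + 1)
    return numbers
```

For four coplanar screws at the corners of a rectangle, this numbers them row by row in x order, matching the "major group by z, subgroup by y, number by x" rule.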
The three-dimensional coordinates of the four feature points A1, A2, A3, A4 of feature set A in the workpiece coordinate system in fig. 4 are: [[x11, y11, z11], [x12, y12, z12], [x13, y13, z13], [x14, y14, z14]].
The feature point set B on the tool is determined in the same way: the number of feature points is not less than 4; if there are exactly 4, they must lie in the same plane, and at least 6 are required if they do not lie in one plane; the feature points may be corner points or any other identifiable distinctive points. The 4 corner points within the red circle in fig. 5 constitute feature point set B. The three-dimensional coordinates of all feature points in set B and of the tool execution end in the tool coordinate system are then determined. In fig. 5, the three-dimensional coordinates of the tool execution end vertex in the tool coordinate system are: [x0, y0, z0].
The three-dimensional coordinates of the four feature points B1, B2, B3, B4 of feature set B in fig. 5 in the tool coordinate system are: [[x21, y21, z21], [x22, y22, z22], [x23, y23, z23], [x24, y24, z24]].
Further, in step S3, a picture of the product is acquired with the camera, and the camera intrinsic matrix and distortion coefficients are obtained through camera calibration:

cameraMatrix = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]

distCoeffs = [k1, k2, p1, p2, k3]

wherein cameraMatrix denotes the camera intrinsic matrix, distCoeffs denotes the camera distortion coefficients, k1, k2, k3 denote the radial distortion coefficients, p1, p2 denote the tangential distortion coefficients, and the intrinsic matrix contains the focal lengths (fx, fy) and the optical center (cx, cy).
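As a concrete illustration, the calibration outputs can be represented as follows. This is a sketch with placeholder numbers only; real values come from calibrating the actual camera, not from the patent.

```python
import numpy as np

# Placeholder intrinsics for illustration (assumed, not calibrated):
fx, fy = 1200.0, 1150.0   # focal lengths in pixels
cx, cy = 960.0, 540.0     # optical center for a 1920x1080 image

# Camera intrinsic matrix in the conventional pinhole layout.
cameraMatrix = np.array([
    [fx, 0.0, cx],
    [0.0, fy, cy],
    [0.0, 0.0, 1.0],
])

# Distortion coefficients in the order [k1, k2, p1, p2, k3]:
# k1, k2, k3 are radial, p1, p2 are tangential (placeholder values).
k1, k2, p1, p2, k3 = -0.12, 0.05, 0.001, -0.0005, 0.02
distCoeffs = np.array([k1, k2, p1, p2, k3])
```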
Two-dimensional image coordinates of the workpiece feature set A and the tool feature set B on the picture are detected by deep learning or traditional vision. The coordinates of the four feature points A1, A2, A3, A4 of feature set A on the two-dimensional image are: [[x11', y11'], [x12', y12'], [x13', y13'], [x14', y14']].
The corresponding coordinates of the four feature points B1, B2, B3, B4 of feature set B on the two-dimensional image are: [[x21', y21'], [x22', y22'], [x23', y23'], [x24', y24']].
Further, in step S4, a homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system is obtained according to the camera internal parameters, the camera distortion coefficients, the two-dimensional image coordinates of the feature point set a, the three-dimensional coordinates of the feature point set a in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B, and the three-dimensional coordinates of the feature point set B in the tool coordinate system.
Specifically, according to the corresponding relation between the two-dimensional image coordinates of the feature point set A and the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the solvePnP function in opencv is used for obtaining initial pose estimation of the camera pose, and the rotation vector and the translation vector of the pose are used for obtaining a homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece coordinate system.
The initial pose estimate is then used as the starting value for iterative optimization with a nonlinear optimization algorithm such as Levenberg-Marquardt or Gauss-Newton, which iteratively adjusts the pose parameters so as to minimize the reprojection error. Here, reprojection means projecting the three-dimensional points onto the image using the estimated pose. The reprojection error between each projected point and its corresponding image point is taken as the objective function of the optimization; minimizing it adjusts the pose parameters so that the projected points agree better with the detected image points.
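The objective being minimized can be made concrete with a small sketch. For brevity this ignores lens distortion and assumes the pose maps object coordinates into the camera frame; it is an illustration of the error term, not the patent's optimizer.

```python
import numpy as np

def reprojection_error(T, points_3d, points_2d, K):
    """Mean reprojection error of an ideal pinhole camera (distortion
    ignored for brevity). T is the 4x4 homogeneous pose mapping object
    coordinates into the camera frame, points_3d the Nx3 object points,
    points_2d the Nx2 detected image points, K the 3x3 intrinsic
    matrix. This quantity is what the Levenberg-Marquardt or
    Gauss-Newton refinement drives toward zero."""
    pts = np.asarray(points_3d, dtype=float)
    # Transform the object points into the camera frame.
    cam = (T[:3, :3] @ pts.T).T + T[:3, 3]
    # Pinhole projection: normalize by depth, then apply intrinsics.
    proj = (K @ (cam / cam[:, 2:3]).T).T[:, :2]
    return float(np.mean(np.linalg.norm(proj - np.asarray(points_2d), axis=1)))
```

A perfectly consistent pose gives an error of zero; the optimizer perturbs the six pose parameters until this mean pixel error is minimal.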
In this process, the three-dimensional feature points and their corresponding image points form point pairs, and the detection accuracy of the feature points on the image has a considerable influence on the result; deep learning or traditional visual detection is chosen according to the actual workpiece. In addition, screening the feature point pairs is very important: the RANSAC (Random Sample Consensus) algorithm can eliminate outliers and noise by iteratively selecting suitable point pairs from all pairs and filtering out mismatches, which improves the accuracy of the pose calculation.
Depending on the actual workpiece, increasing the number of point pairs where conditions allow provides more constraints and improves the accuracy of the pose calculation. The spatial distribution of the points on the workpiece also affects the result: the points should be distributed over the whole workpiece as far as possible rather than confined to a small region of it.
Specifically, this step yields the pose of the camera in the workpiece coordinate system: the rotation vector r1 and translation vector t1 from the workpiece coordinate system to the camera coordinate system are obtained, the rotation vector yields the rotation matrix R1, and the homogeneous transformation matrix T1 from the workpiece coordinate system to the camera coordinate system is constructed from R1 and t1. Likewise, the pose of the camera in the tool coordinate system yields the rotation vector r2 and translation vector t2 from the tool coordinate system to the camera coordinate system, the rotation matrix R2 follows from r2, and the homogeneous transformation matrix T2 from the tool coordinate system to the camera coordinate system is constructed from R2 and t2. The homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system is then obtained from T1 and T2.
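A minimal sketch of this composition, under the solvePnP convention assumed here: T1 maps workpiece coordinates into the camera frame and T2 maps tool coordinates into the camera frame, so the tool-to-workpiece transform is T = T1^-1 * T2.

```python
import numpy as np

def invert_homogeneous(T):
    """Invert a rigid 4x4 homogeneous transform using the transpose of
    the rotation block, avoiding a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def tool_to_workpiece(T1, T2):
    """Compose tool -> camera (T2) with camera -> workpiece (inverse of
    workpiece -> camera, T1) to get tool -> workpiece."""
    return invert_homogeneous(T1) @ T2
```

If the patent's T1 and T2 instead denote the inverse conventions, the same composition applies with the inversion moved accordingly; the point is that T is recovered purely from the two camera-relative poses.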
Further, in step S5, three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system are calculated according to the three-dimensional coordinates of the tool execution end vertex in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system:
p1 = T * p2
wherein p2 is the homogeneous form [x, y, z, 1] of the tool execution end vertex, with x, y, z its three-dimensional coordinates in the tool coordinate system; p1 = [x', y', z', w] is the homogeneous coordinate of the vertex in the workpiece coordinate system. Dividing the homogeneous coordinate by its last element gives [x'/w, y'/w, z'/w], the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system.
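The homogeneous multiplication and the final division by w can be sketched as follows (hypothetical transform and point values; for a rigid transform the last row is [0, 0, 0, 1], so w stays 1 and the division is a safe no-op):

```python
import numpy as np

def transform_point(T, p):
    """Map a 3D point through a 4x4 homogeneous transform, dividing by w."""
    p_h = np.append(p, 1.0)            # [x, y, z, 1]
    q = T @ p_h                        # [x', y', z', w]
    return q[:3] / q[3]                # [x'/w, y'/w, z'/w]

# Assumed tool -> workpiece transform: pure translation for illustration.
T = np.array([[1, 0, 0, 0.5],
              [0, 1, 0, 0.0],
              [0, 0, 1, 0.1],
              [0, 0, 0, 1.0]])
p_tool_tip = np.array([0.0, 0.0, 0.05])   # tip vertex in the tool frame
p_workpiece = transform_point(T, p_tool_tip)
print(p_workpiece)                         # tip vertex in the workpiece frame
```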
Further, in step S6, the distances between the tool execution end vertex and all screws are calculated in the workpiece coordinate system. The minimum distance is compared with a set threshold: if it is smaller than the threshold, the screw number corresponding to the current tool execution end vertex is judged to be the number of the screw at that minimum distance; otherwise, the current tool execution end vertex is judged not to be at any screw position. If the screw number matches the screw specified by the prescribed installation sequence, the tool is enabled and the corresponding tightening torque for that screw is set; otherwise, the tool is disabled. Steps S3-S6 are repeated until all screws are tightened in sequence.
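The nearest-screw matching and enable decision of step S6 can be sketched as follows (a minimal sketch; the screw table, threshold, and sequence values are hypothetical examples, not values from the patent):

```python
import numpy as np

def match_screw(tip, screws, thresh=0.005):
    """Return the index of the nearest screw, or None if the tool tip is not
    within thresh (metres here) of any screw position."""
    d = np.linalg.norm(screws - tip, axis=1)
    i = int(np.argmin(d))
    return i if d[i] < thresh else None

# Hypothetical screw positions (workpiece frame) and prescribed tightening order.
screws = np.array([[0.00, 0.00, 0.0],
                   [0.05, 0.00, 0.0],
                   [0.05, 0.05, 0.0]])
sequence = [0, 1, 2]
next_expected = sequence[0]

tip = np.array([0.001, -0.002, 0.0])       # measured tool-tip position
hit = match_screw(tip, screws)
# Enable the tool only when the tip sits at the screw the sequence expects next.
enable_tool = (hit is not None and hit == next_expected)
print(hit, enable_tool)
```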
Example two
As shown in fig. 6, the present application provides a three-dimensional tightening sequence fool-proof control system based on visual positioning, the system comprising:
a coordinate system construction module 10 for constructing two world coordinate systems, namely a workpiece coordinate system and a tool coordinate system, with the workpiece and the tool;
the three-dimensional coordinate acquisition module 20 is used for determining a feature point set A on a workpiece, three-dimensional coordinates of all feature points in the feature point set A and three-dimensional coordinates of all screws on the workpiece coordinate system based on the workpiece coordinate system, automatically generating screw numbers according to the three-dimensional coordinates of the screws, and arranging screw installation sequences; determining a feature point set B on the tool, three-dimensional coordinates of all feature points in the feature point set B and three-dimensional coordinates of a tool execution end on the tool coordinate system based on the tool coordinate system;
the detection module 30 is used for acquiring pictures of products by using a camera and detecting two-dimensional image coordinates of the workpiece feature point set A and the tool feature point set B;
a transformation matrix obtaining module 40, configured to obtain a homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set a, the three-dimensional coordinates of the feature point set a in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B, and the three-dimensional coordinates of the feature point set B in the tool coordinate system;
a transformation module 50, configured to calculate, according to the three-dimensional coordinates of the tool execution end vertex in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system, the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system;
the control module 60 is configured to calculate, in the workpiece coordinate system, the distances between the tool execution end vertex and all screws and obtain the screw number corresponding to the current tool execution end vertex; if that number matches the screw specified by the prescribed installation sequence, the tool is enabled and the corresponding tightening torque is set, otherwise the tool is disabled; the detection module, the transformation matrix acquisition module, the transformation module and the control module are executed repeatedly until all screws are tightened.
The system is used for realizing the three-dimensional screwing sequence fool-proof control method based on visual positioning, and in order to avoid redundancy, the description is omitted here.
Note that the above is only a preferred embodiment of the present application and the technical principle applied. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, the application is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the application, which is set forth in the following claims.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It is apparent that the above examples are given by way of illustration only and are not limiting of the embodiments. Other variations and modifications of the present application will be apparent to those of ordinary skill in the art in light of the foregoing description. It is not necessary here nor is it exhaustive of all embodiments. And obvious variations or modifications thereof are contemplated as falling within the scope of the present application.

Claims (10)

1. A three-dimensional screwing sequence fool-proof control method based on visual positioning is characterized by comprising the following steps:
s1: constructing two world coordinate systems by using a workpiece and a tool, wherein the two world coordinate systems are respectively a workpiece coordinate system and a tool coordinate system;
s2: determining a feature point set A on a workpiece, three-dimensional coordinates of all feature points in the feature point set A and three-dimensional coordinates of all screws on the workpiece coordinate system based on the workpiece coordinate system, automatically generating screw numbers according to the three-dimensional coordinates of the screws, and arranging screw installation sequences; determining a feature point set B on the tool, three-dimensional coordinates of all feature points in the feature point set B and three-dimensional coordinates of a tool execution end on the tool coordinate system based on the tool coordinate system;
s3: acquiring a picture of a product by using a camera, and detecting two-dimensional image coordinates of a workpiece feature point set A and a tool feature point set B;
s4: obtaining a homogeneous transformation matrix T from a tool coordinate system to a workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system;
s5: according to the three-dimensional coordinates of the vertex of the tool execution end in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system, calculating to obtain the three-dimensional coordinates of the vertex of the tool execution end in the workpiece coordinate system;
s6: and calculating the distance between the vertex of the execution end of the tool and all screws in a workpiece coordinate system, obtaining a screw number corresponding to the vertex of the execution end of the current tool, enabling the tool if the screw number is consistent with the specified screw number of the specified installation sequence, setting the corresponding screwing force of the screws, otherwise, not enabling the tool, and repeating the steps S3-S6 until all screws are screwed.
2. The visual positioning-based three-dimensional screwing sequence fool-proofing control method according to claim 1, wherein the method for determining the characteristic point set A on the workpiece or the characteristic point set B on the tool is as follows:
if the feature points are in the same plane, the number of the feature points is not less than 4; if the feature points are not in one plane, the number of feature points is not less than 6.
3. The visual positioning-based three-dimensional screwing sequence fool-proofing control method according to claim 1, wherein the method for detecting two-dimensional image coordinates of a workpiece feature point set A and a tool feature point set B by utilizing a camera to collect pictures of products is as follows:
and acquiring a picture of the product by using a camera, and detecting two-dimensional image coordinates of the workpiece feature set A and the tool feature set B on the picture through deep learning or traditional vision.
4. The visual positioning-based three-dimensional screwing sequence fool-proofing control method according to claim 3, wherein a camera intrinsic matrix and camera distortion coefficients are obtained through camera calibration:
cameraMatrix = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]
distCoeffs = [k1, k2, p1, p2, k3]
wherein cameraMatrix represents the camera intrinsic matrix; distCoeffs represents the camera distortion coefficients; k1, k2, k3 are the radial distortion coefficients; p1, p2 are the tangential distortion coefficients; (fx, fy) is the focal length and (cx, cy) is the optical center.
5. The visual positioning-based three-dimensional screwing sequence fool-proofing control method according to claim 1, wherein the method for obtaining the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set a, the three-dimensional coordinates of the feature point set a in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system is as follows:
obtaining a homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the camera internal parameters and the camera distortion parameters; obtaining a homogeneous transformation matrix T2 of the camera coordinate system relative to the tool coordinate system according to the two-dimensional image coordinates of the feature point set B, the three-dimensional coordinates of the feature point set B in the tool coordinate system, the camera internal parameters and the camera distortion parameters; the two homogeneous transformation matrices T1 and T2 are used for obtaining the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system.
6. The visual positioning-based three-dimensional screwing sequence fool-proofing control method according to claim 5, wherein the method for obtaining the homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the camera internal parameters and the camera distortion parameters is as follows:
according to the corresponding relation between the two-dimensional image coordinates of the feature point set A and the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the solvePnP function in opencv is used for obtaining initial pose estimation of the camera pose, the rotation vector and the translation vector of the pose and the homogeneous transformation matrix T1 of the camera coordinate system relative to the workpiece coordinate system.
7. The visual positioning-based three-dimensional tightening sequence fool-proofing control method according to claim 6, wherein the initial pose estimate is used as a starting value and iteratively refined with a nonlinear optimization algorithm, the optimization algorithm comprising Levenberg-Marquardt and Gauss-Newton, the pose parameters being iteratively adjusted so as to minimize the reprojection error.
8. The visual positioning-based three-dimensional tightening sequence fool-proofing control method according to claim 1, wherein the method for calculating the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system according to the three-dimensional coordinates of the tool execution end vertex in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system comprises the following steps:
p1 = T * p2
wherein p2 is the homogeneous form [x, y, z, 1] of the tool execution end vertex, with x, y, z its three-dimensional coordinates in the tool coordinate system; p1 = [x', y', z', w] is the homogeneous coordinate of the vertex in the workpiece coordinate system. Dividing the homogeneous coordinate by its last element gives [x'/w, y'/w, z'/w], the three-dimensional coordinates of the tool execution end vertex in the workpiece coordinate system.
9. The visual positioning-based three-dimensional tightening sequence fool-proofing control method according to claim 1, wherein the method for calculating the distance between the tip vertex of the tool execution and all screws in the workpiece coordinate system to obtain the screw number corresponding to the tip vertex of the current tool execution comprises the following steps:
in a workpiece coordinate system, calculating the distance between the end vertex of the tool execution and all screws, taking a minimum distance value and a set threshold value for judgment, and if the minimum distance value is smaller than the set threshold value, judging that the screw number corresponding to the end vertex of the current tool execution is the screw number corresponding to the minimum distance value; otherwise, it is determined that the current tool execution end vertex is not at any screw position.
10. A visual positioning-based three-dimensional tightening sequence fool-proofing control system, the system comprising:
the coordinate system construction module is used for constructing two world coordinate systems by using the workpiece and the tool, namely a workpiece coordinate system and a tool coordinate system;
the three-dimensional coordinate acquisition module is used for determining a characteristic point set A on a workpiece, three-dimensional coordinates of all characteristic points in the characteristic point set A and three-dimensional coordinates of all screws on the workpiece coordinate system based on the workpiece coordinate system, automatically generating screw numbers according to the three-dimensional coordinates of the screws, and arranging screw installation sequences; determining a feature point set B on the tool, three-dimensional coordinates of all feature points in the feature point set B and three-dimensional coordinates of a tool execution end on the tool coordinate system based on the tool coordinate system;
the detection module is used for acquiring pictures of products by using a camera and detecting two-dimensional image coordinates of the workpiece feature point set A and the tool feature point set B;
the transformation matrix acquisition module is used for obtaining a homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system according to the two-dimensional image coordinates of the feature point set A, the three-dimensional coordinates of the feature point set A in the workpiece coordinate system, the two-dimensional image coordinates of the feature point set B and the three-dimensional coordinates of the feature point set B in the tool coordinate system;
the transformation module is used for calculating to obtain the three-dimensional coordinates of the tool execution terminal vertex in the workpiece coordinate system according to the three-dimensional coordinates of the tool execution terminal vertex in the tool coordinate system and the homogeneous transformation matrix T from the tool coordinate system to the workpiece coordinate system;
the control module is used for calculating the distance between the vertex of the execution end of the tool and all screws in the workpiece coordinate system, obtaining the screw number corresponding to the vertex of the execution end of the current tool, enabling the tool if the screw number is consistent with the screw number appointed by the specified installation sequence, setting the corresponding screwing force of the screws, otherwise, not enabling the tool, and repeatedly executing the detection module, the transformation matrix acquisition module, the transformation module and the control module until all screws are screwed.
CN202310719355.5A 2023-06-16 2023-06-16 Visual positioning-based three-dimensional screwing sequence fool-proofing control method and system Pending CN116810359A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310719355.5A CN116810359A (en) 2023-06-16 2023-06-16 Visual positioning-based three-dimensional screwing sequence fool-proofing control method and system

Publications (1)

Publication Number Publication Date
CN116810359A true CN116810359A (en) 2023-09-29

Family

ID=88121492



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination