CN116758160B - Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method

Info

Publication number: CN116758160B (granted publication of application CN202310735104.6A)
Other versions: CN116758160A (application publication)
Legal status: Active
Inventors: Chen Guanhua (陈冠华), Liu Guodong (刘国栋), Chen Fengdong (陈凤东)
Assignee: Harbin Institute of Technology

Classifications

    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration)

Abstract

The invention discloses a pose detection method for the optical element assembly process and an assembly method based on an orthogonal vision system, belongs to the field of industrial assembly vision detection, and aims to solve the problem that, in conventional optical element assembly, the edge of the optical element easily collides with the assembly frame, causing assembly errors. The method comprises the following steps: S1, constructing a vision detection system comprising a global camera and two side-view cameras; S2, jointly calibrating the three cameras of the vision detection system, taking the camera coordinate system of the global camera as the unified global coordinate system; S3, computing the pose of the optical element: the three cameras synchronously acquire a top view and two side views of the optical element, analytic expressions of the optical element edges are extracted from the three images in each camera's pixel coordinate system, and the edges from the different pixel coordinate systems are aligned into the unified global coordinate system using the joint calibration data; the pose of the optical element is determined from the analytic expressions of these edges.

Description

Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method
Technical Field
The invention relates to the field of industrial assembly visual detection, in particular to a method for solving the six-degree-of-freedom pose of an optical element in industrially automated optical element assembly.
Background
In the automated assembly of optical elements, a manipulator holds the optical element and places it into a mechanical frame. This process imposes high accuracy requirements on detecting the relative positions and attitudes of the optical element, the manipulator and the assembly frame. Existing direct visual detection methods locate transparent optical elements unreliably and with limited precision, and both the optical element and the assembly frame carry non-negligible manufacturing tolerances; as a result, during optical element assembly on the FOA (Final Optics Assembly) platform, the edge of the optical element easily collides with the assembly frame, causing assembly errors.
Disclosure of Invention
Aiming at the problem that assembly errors are caused by collisions between the edge of the optical element and the assembly frame in conventional optical element assembly, the invention provides a visual detection method and an assembly method for optical element assembly based on an orthogonal visual detection system.
The pose detection method for the optical element assembly process based on an orthogonal vision system according to the invention comprises the following steps:
S1, constructing a vision detection system;
the vision detection system comprises a global camera and two side-view cameras; strip light sources in front of the two side-view cameras provide grazing illumination of the optical element;
S2, jointly calibrating the three cameras of the vision detection system;
the camera coordinate system of the global camera is taken as the unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, establishing the spatial mapping from each camera's pixel coordinate system to the unified global coordinate system;
S3, computing the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; analytic expressions of the optical element edges are extracted from the three images in each camera's pixel coordinate system, and the edges from the different pixel coordinate systems are aligned into the unified global coordinate system using the joint calibration data; the pose of the optical element is determined from the analytic expressions of these edges.
Preferably, the optical element to be detected is placed horizontally on the FOA assembly platform, the global camera is arranged above the FOA assembly platform, and the two cameras located to the side of the optical element during detection are the No. 1 side-view camera and the No. 2 side-view camera.
The three cameras are mutually orthogonal in orientation, their optical axes intersect at one point on the upper surface of the optical element, and the following coordinate systems are established:
the global coordinate system of the global camera is O_G-X_G Y_G Z_G, and the optical axis of the global camera is perpendicular to the horizontal plane;
the No. 1 side-view camera coordinate system is O_C1-X_C1 Y_C1 Z_C1, the No. 2 side-view camera coordinate system is O_C2-X_C2 Y_C2 Z_C2, and the optical axes of the two cameras are parallel to the horizontal plane;
the positive X_G direction of the global camera coordinate system O_G coincides with the positive Z_C2 direction of the No. 2 side-view camera coordinate system O_C2;
the model coordinate system of the optical element is O_M-X_M Y_M Z_M, with its origin at the vertex A at the upper-left corner of the upper surface of the optical element; the Z_M axis is perpendicular to the upper surface of the optical element with its positive direction upward, and the X_M axis points from the upper-left vertex to the lower-left vertex of the upper surface.
Preferably, the joint calibration procedure of step S2 is:
first, the intrinsic matrices of the three cameras are calibrated separately using Zhang Zhengyou's calibration method, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
then, taking the camera coordinate system of the global camera as the unified global coordinate system, the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side-view cameras;
the extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the No. 1 side-view camera coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 the translation vector from O_C1 to O_G;
and the transformation T_C2 = (R_C2, t_C2) from the No. 2 side-view camera coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 the translation vector from O_C2 to O_G.
Preferably, a three-dimensional calibration block combined with the AprilTag algorithm is used to calibrate the extrinsic matrices of the two side-view cameras. The three-dimensional calibration block is a cuboid whose four side faces and top face are defined in turn as face 0 to face 4; faces 0-4 carry 36h11 AprilTag patterns with ids 0-4 in turn. The origin of the coordinate system O_B of the three-dimensional calibration block is defined at the center of the AprilTag pattern (id = 0) on face 0; the X and Y axes are parallel to that face, with positive directions running from the upper-left vertex of the pattern to its upper-right vertex and from its upper-left vertex to its lower-left vertex respectively, and the Z axis points into the block;
the procedure for calibrating the extrinsic matrices with the AprilTag algorithm is:
keep face 4 of the three-dimensional calibration block horizontal and select any one of faces 0-3; place the block so that the AprilTag pattern on the selected face is 280 mm from the optical center of the No. 1 side-view camera along its optical axis and the AprilTag pattern on face 4 is 1200 mm from the optical center of the global camera along its optical axis, ensuring that the corresponding AprilTag patterns appear completely within the fields of view of both cameras;
once the three-dimensional calibration block, the global camera and the No. 1 side-view camera satisfy these requirements, external triggering is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm computes the position and attitude of the calibration pattern in each camera's field of view relative to that camera; since the transformations between the coordinate systems of all patterns on the three-dimensional calibration block and the block coordinate system O_B are known from CAD data, the pose of the calibration block at the capture instant relative to the global camera and to the No. 1 side-view camera is deduced, and the transformation between these two poses, T_C1 = (R_C1, t_C1), i.e. the transformation from O_C1 to O_G, is computed, giving the extrinsic matrix of the No. 1 side-view camera under O_G;
the extrinsic matrix of the No. 2 side-view camera under O_G is obtained in the same way.
Preferably, the pose solving process of the optical element in step S3 is:
S31, placing the optical element at a predetermined position on the FOA platform using a clamp; the three cameras synchronously acquire a top-view image I_M and two side-view images I_SL, I_SF of the optical element, and the three images are preprocessed into grayscale images;
S32, extracting line segments from the three grayscale images with the LSD algorithm (Line Segment Detector) and processing them to obtain the optical element edges of the three grayscale images in the respective camera pixel coordinate systems;
S33, obtaining analytic expressions of the optical element edges under the three camera coordinate systems; specifically, according to the matching result of step S32, when an edge in the top-view image I_M and an edge in the side-view image I_SL or I_SF both correspond to the same edge of the CAD model, the edges in the two images are considered to correspond to the same edge of the optical element; following this principle, the analytic expressions of the edges in the three grayscale images are obtained in the respective camera coordinate systems;
S34, obtaining the analytic expressions of the optical element edges in the global coordinate system from the camera intrinsic matrices, the extrinsic matrices and the analytic expressions under the three camera coordinate systems;
S35, from the analytic expressions of the optical element edges in the global coordinate system, computing the position and attitude of the optical element and the size of its upper surface from the correspondences using a PnP algorithm.
Preferably, the process of obtaining the optical element edges in step S32 is:
all line segments are extracted from the three grayscale images with the LSD algorithm and processed as follows:
step one, eliminate over-short segments with L < 25 pixels, where the segment length L is computed as
L = sqrt((x_2 - x_1)^2 + (y_2 - y_1)^2)
where (x_1, y_1), (x_2, y_2) are the pixel coordinates of the two endpoints of the segment;
step two, denote the set of not-yet-fused segments by l_Ori and the set of fused segments by l_meg; the segments remaining after the short segments are removed in step one are first placed in the set l_Ori;
step three, fuse segments:
find the segment with the largest L in set l_Ori, denoted l_0, and search set l_Ori for segments satisfying the condition: the Euclidean distance dis from the segment's midpoint to the line through l_0 is smaller than 3 pixels, and at the same time the included angle ang between the segment and l_0 is smaller than 0.5 degrees; remove l_0 and the qualifying segments from set l_Ori;
the Euclidean distance dis and the included angle ang of the two segments are computed as
dis = |k·x - y + b_1 - k·a_1| / sqrt(k^2 + 1)
ang = arctan( |Δx·Δb - Δy·Δa| / |Δx·Δa + Δy·Δb| )
where:
(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,
k is the slope of l_0,
x = (x_1 + x_2)/2
y = (y_1 + y_2)/2
Δx = x_2 - x_1
Δy = y_2 - y_1
Δa = a_2 - a_1
Δb = b_2 - b_1
step four, for l_0 and the qualifying segments, compute the smallest minimum bounding rectangle of the segment endpoints with the SMBR algorithm; take the midpoint pixel coordinates of the two short sides of that rectangle as the pixel coordinates of the two endpoints of the new fused segment, and place the fused segment into the fused-segment set l_meg;
repeat steps three and four until the set l_Ori is empty, then execute step five;
step five, delete erroneous segments from the set l_meg:
first, delete segments with L < 1200 pixels;
next, identify the segments belonging to the optical element edges using prior knowledge of the CAD model of the optical element: specifically, the optical element held by the clamp is reset to the predetermined position, and the CAD model of the optical element at that moment is projected into the acquired image. Since the extracted segments should come from the optical element edges, the segments representing the edges are screened by the following conditions: the Euclidean distance dis from the segment's midpoint to some model edge segment extracted from the CAD projection is smaller than 15 pixels, and the included angle ang is smaller than 2 degrees; segments in set l_meg that do not meet these conditions are deleted, and the segments remaining in set l_meg are taken as the optical element edges.
Preferably, the process by which S34 obtains the analytic expressions of the optical element edges in the global coordinate system is:
according to the known camera intrinsic and extrinsic matrices, the straight lines in the two images that correspond to the same edge of the optical element are each back-projected into a plane in the camera coordinate system, giving the three-dimensional parameters of the plane containing the straight line; this plane passes through the camera optical center O; the two images are a top view and a side view;
assume that an edge segment l_1 from the global camera and an edge segment l_2 from one side-view camera correspond to the same straight line in space; l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side-view camera determine a plane P_2; transforming P_1 and P_2 into the global coordinate system according to the intrinsic and extrinsic matrices gives their analytic expressions in the global coordinate system:
P_1: a_1·X + b_1·Y + c_1·Z + d_1 = 0
P_2: a_2·X + b_2·Y + c_2·Z + d_2 = 0
where the vector (a_1, b_1, c_1) is the normal vector of plane P_1 and d_1 is the constant term satisfied by all points on plane P_1; the vector (a_2, b_2, c_2) is the normal vector of plane P_2 and d_2 is the constant term satisfied by all points on plane P_2; c_1 = c_2 = 1;
because the matched straight lines in the two cameras' images correspond to the same straight line in three-dimensional space, the planes P_1 and P_2 necessarily intersect, and their intersection line is the target line, i.e. the line containing the optical element edge, with analytic expression:
P = p_0 + t·v
where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, t is the unit direction vector of the target line of the form t = (i, j, k) with coefficients satisfying i^2 + j^2 + k^2 = 1, and v is a scalar parameter;
repeating this method yields the analytic expressions, in the global coordinate system, of the lines containing the two adjacent edges of the optical element's upper surface nearest to the two side-view cameras:
L_1: P = p_1 + t_1·v
L_2: P = p_2 + t_2·v
where p_1 is a known point on line L_1, t_1 is the unit direction vector of line L_1, p_2 is a known point on line L_2, and t_2 is the unit direction vector of line L_2;
the point A = (X_M, Y_M, Z_M) with the smallest sum of distances to L_1 and L_2 is computed as the upper-left vertex of the upper surface of the optical element; then points B and C at a distance of 410 mm from A are taken on L_1 and L_2 respectively, and A, B and C together determine a plane:
P_0: a_0·X + b_0·Y + Z + c_0 = 0
which is the analytic expression of the upper surface of the optical element in the global coordinate system; the intersection lines of this upper surface with the planes of the other two edges are then solved in the same way, i.e. the spatial analytic expressions in the global coordinate system of the lines containing the remaining two edges:
L_3: P = p_3 + t_3·v
L_4: P = p_4 + t_4·v
where p_3 is a known point on line L_3, t_3 is the unit direction vector of line L_3, p_4 is a known point on line L_4, and t_4 is the unit direction vector of line L_4.
Preferably, the process by which S35 computes the position and attitude of the optical element and the size of its upper surface is:
for the pose of the optical element model coordinate system O_M relative to the global coordinate system O_G, the positive Z-axis direction is defined as t_2 × t_1 and the positive X-axis direction as t_2, and the PnP correspondences are constructed:
two-dimensional point (0, 0) corresponds to three-dimensional point A = (X_M, Y_M, Z_M);
two-dimensional point (1, 0) corresponds to three-dimensional point A + t_2;
two-dimensional point (0, 1) corresponds to three-dimensional point A + Norm(t_2 × t_1) × t_2;
two-dimensional point (1, 1) corresponds to three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;
where Norm(t) is the operation normalizing t, and × between vectors denotes the cross product;
by establishing the PnP relations of these four point pairs, the transformation (R, t) from the model coordinate system O_M of the optical element to the global coordinate system O_G can be solved as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix and t the translation vector from O_M to O_G;
the position vector t = A^T = (X_M, Y_M, Z_M)^T;
the coordinates of the remaining vertices of the optical element's upper surface in the global coordinate system are obtained by the same method used to solve the coordinates of point A; the remaining vertices are, in clockwise order, B, C and D,
and the length of the optical element is L = ||((A + B) - (D + C))/2||_2 and the width is W = ||((A + D) - (B + C))/2||_2.
The invention also provides another technical scheme: an optical element assembly method based on an orthogonal vision system, in which, during assembly, the pose of the optical element is acquired in real time using the pose detection method for the optical element assembly process based on an orthogonal vision system according to claims 1-8, the deviation between the current pose of the optical element and the ideal pose is calculated, and the manipulator is guided to adjust the pose.
The invention has the following beneficial effects:
(1) High-precision positioning: the vision detection system composed of three cameras uses grazing illumination and, by imaging the optical element edges specifically and sharply, effectively improves the positioning accuracy for transparent optical elements.
(2) Reduced influence of manufacturing tolerances: by re-measuring the length and width of the upper surface of the optical element, the influence of manufacturing tolerances on the assembly process is reduced.
(3) Accurate six-degree-of-freedom pose measurement: based on the images acquired by the three cameras, the six-degree-of-freedom pose of the optical element is measured accurately using computer vision algorithms, providing path guidance for the subsequent assembly process.
Drawings
Fig. 1 is a schematic diagram of the method for detecting the pose of the optical element assembly process based on an orthogonal vision system according to the invention;
Fig. 2 is a schematic diagram of the three-dimensional calibration block, in which fig. 2(a) shows face 4, fig. 2(b) shows the three-dimensional structure of the block, fig. 2(c) shows face 3, fig. 2(d) shows face 0, fig. 2(e) shows face 1, and fig. 2(f) shows face 2;
Fig. 3 is a schematic diagram of the back-projection of line segments into three-dimensional space.
Description of the components in the drawings: 1, global camera; 2, No. 1 side-view camera; 3, No. 2 side-view camera; 4, FOA assembly platform; 5, optical element; 6, top view; 7, side view captured by the No. 1 side-view camera; 8, side view captured by the No. 2 side-view camera.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of the present invention.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
The invention is further described below with reference to the drawings and specific examples, which are not intended to be limiting.
Embodiment 1: referring to fig. 1 to 3, the method for detecting the pose of the optical element assembly process based on an orthogonal vision system according to this embodiment comprises the following steps:
S1, constructing a vision detection system;
the vision detection system comprises a global camera and two side-view cameras; strip light sources in front of the two side-view cameras provide grazing illumination of the optical element;
S2, jointly calibrating the three cameras of the vision detection system;
the camera coordinate system of the global camera is taken as the unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, establishing the spatial mapping from each camera's pixel coordinate system to the unified global coordinate system;
S3, computing the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; analytic expressions of the optical element edges are extracted from the three images in each camera's pixel coordinate system, and the edges from the different pixel coordinate systems are aligned into the unified global coordinate system using the joint calibration data; the pose of the optical element is determined from the analytic expressions of these edges.
Regarding step S1, referring to fig. 1: the method of this embodiment is implemented on an orthogonal vision system. To image the optical element clearly from different angles at the same time and provide image information for the subsequent pose detection, the vision detection system is designed to consist of three cameras, a global camera and two side-view cameras, with strip light sources in front of the two side-view cameras providing grazing illumination of the optical element.
The optical element to be detected is placed horizontally on the FOA assembly platform, and the global camera is arranged above the FOA assembly platform; the two cameras located to the side of the optical element during detection are the No. 1 side-view camera and the No. 2 side-view camera. The working distance of the global camera is configured as 1200 mm, and the working distances of the No. 1 and No. 2 side-view cameras are 280 mm.
The three cameras are mutually orthogonal in orientation, their optical axes intersect at one point on the upper surface of the optical element, and the following coordinate systems are established:
the global coordinate system of the global camera is O_G-X_G Y_G Z_G, and the optical axis of the global camera is perpendicular to the horizontal plane;
the No. 1 side-view camera coordinate system is O_C1-X_C1 Y_C1 Z_C1, the No. 2 side-view camera coordinate system is O_C2-X_C2 Y_C2 Z_C2, and the optical axes of the two cameras are parallel to the horizontal plane;
the positive X_G direction of the global camera coordinate system O_G coincides with the positive Z_C2 direction of the No. 2 side-view camera coordinate system O_C2;
the model coordinate system of the optical element is O_M-X_M Y_M Z_M, with its origin at the vertex A at the upper-left corner of the upper surface of the optical element; the Z_M axis is perpendicular to the upper surface of the optical element with its positive direction upward, and the X_M axis points from the upper-left vertex to the lower-left vertex of the upper surface.
Regarding the joint calibration of step S2: to calibrate the intrinsic and extrinsic parameters of the three cameras of the vision detection system and establish the spatial mapping from each pixel coordinate system to a unified global three-dimensional coordinate system, the intrinsic parameters of the three cameras are first calibrated separately, establishing the spatial mapping from each pixel coordinate system to the corresponding camera coordinate system; the camera coordinate system of the global camera is then defined as the unified global coordinate system, and the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated separately, i.e. the extrinsic parameters of the two side-view cameras are calibrated.
For the intrinsic matrices, Zhang Zhengyou's calibration method is used to calibrate the intrinsic matrices of the three cameras, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
for the extrinsic matrices, the camera coordinate system of the global camera is taken as the unified global coordinate system, and the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side-view cameras;
the extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the No. 1 side-view camera coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 the translation vector from O_C1 to O_G;
and the transformation T_C2 = (R_C2, t_C2) from the No. 2 side-view camera coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 the translation vector from O_C2 to O_G.
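The intrinsic step above is the classic Zhang Zhengyou procedure, which OpenCV implements directly. The sketch below is illustrative only and not part of the original disclosure; the checkerboard dimensions, square size and image list are assumptions, since the patent does not specify the intrinsic calibration target:

    import cv2
    import numpy as np

    def calibrate_intrinsics(images, board_size=(9, 6), square_mm=10.0):
        """Zhang-style intrinsic calibration from checkerboard views (sketch)."""
        # 3-D corner positions of the checkerboard in its own plane (Z = 0).
        obj = np.zeros((board_size[0] * board_size[1], 3), np.float32)
        obj[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm

        obj_pts, img_pts, size = [], [], None
        for img in images:
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            size = gray.shape[::-1]
            found, corners = cv2.findChessboardCorners(gray, board_size)
            if found:
                obj_pts.append(obj)
                img_pts.append(corners)

        # K is the 3x3 intrinsic matrix, dist the lens distortion coefficients.
        _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
        return K, dist

Each of the three cameras would be calibrated this way independently before the extrinsic step.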
Referring to fig. 2, for the extrinsic calibration: since the camera coordinate system of the global camera is defined as the unified global coordinate system, only the extrinsic parameters of the two side-view cameras need to be calibrated. To calibrate the extrinsics of the orthogonal visual detection system, a three-dimensional calibration block is purpose-designed and used together with the AprilTag algorithm. The designed calibration block is 3D-printed from resin, is cuboid in shape, has the dimensions shown in fig. 2, and has a machining accuracy of ±0.05 mm. The four side faces and the top face of the cuboid are defined in turn as face 0 to face 4; faces 0-4 carry 36h11 AprilTag patterns with ids 0-4 in turn. The origin of the coordinate system O_B of the three-dimensional calibration block is defined at the center of the AprilTag pattern (id = 0) on face 0; the X and Y axes are parallel to that face, with positive directions running from the upper-left vertex of the pattern to its upper-right vertex and from its upper-left vertex to its lower-left vertex respectively, and the Z axis points into the block;
the procedure for calibrating the extrinsic matrices with the AprilTag algorithm is:
when calibrating the extrinsics of a side-view camera, the cuboid calibration block is positioned so as to satisfy the working-distance requirements of the cameras when detecting the optical element. Keep face 4 of the three-dimensional calibration block horizontal and select any one of faces 0-3; place the block so that the AprilTag pattern on the selected face is 280 mm from the optical center of the No. 1 side-view camera along its optical axis and the AprilTag pattern on face 4 is 1200 mm from the optical center of the global camera along its optical axis, ensuring that the corresponding AprilTag patterns appear completely within the fields of view of both cameras;
once the three-dimensional calibration block, the global camera and the No. 1 side-view camera satisfy these requirements, external triggering is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm computes the position and attitude of the calibration pattern in each camera's field of view relative to that camera; since the transformations between the coordinate systems of all patterns on the three-dimensional calibration block and the block coordinate system O_B are known from CAD data, the pose of the calibration block at the capture instant relative to the global camera and to the No. 1 side-view camera is deduced, and the transformation between these two poses, T_C1 = (R_C1, t_C1), i.e. the transformation from O_C1 to O_G, is computed, giving the extrinsic matrix of the No. 1 side-view camera under O_G;
the extrinsic matrix of the No. 2 side-view camera under O_G is obtained in the same way; the transformation from the No. 2 side-view camera coordinate system O_C2 to the global coordinate system O_G is T_C2 = (R_C2, t_C2).
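Under these conventions the extrinsic step reduces to chaining rigid transforms. The sketch below is an assumption-laden illustration, not the patent's implementation: the tag side length, the corner ordering and the CAD transforms T_B_tag are placeholders, and the tag corners are taken as already detected by an AprilTag detector. Each camera's pose relative to the block frame O_B is recovered from the four corners of the tag it sees, and T_C1 follows as the relative transform between the two camera-to-block poses:

    import cv2
    import numpy as np

    def tag_pose(corners_px, K, dist, tag_mm):
        """Pose of one AprilTag (tag frame -> camera frame) from its 4 corners."""
        s = tag_mm / 2.0
        obj = np.array([[-s, -s, 0], [s, -s, 0], [s, s, 0], [-s, s, 0]], np.float32)
        _, rvec, tvec = cv2.solvePnP(obj, corners_px.astype(np.float32), K, dist)
        T = np.eye(4)
        T[:3, :3], _ = cv2.Rodrigues(rvec)
        T[:3, 3] = tvec.ravel()
        return T

    def side_camera_extrinsics(T_G_tag4, T_C1_tag0, T_B_tag4, T_B_tag0):
        """Transform T_C1 from side-camera frame O_C1 to global frame O_G.

        T_G_tag4, T_C1_tag0: tag poses seen by the global / side-view camera.
        T_B_tag4, T_B_tag0: CAD transforms from each tag frame to block frame O_B.
        """
        T_G_B = T_G_tag4 @ np.linalg.inv(T_B_tag4)    # block pose in the global camera
        T_C1_B = T_C1_tag0 @ np.linalg.inv(T_B_tag0)  # block pose in the side camera
        return T_G_B @ np.linalg.inv(T_C1_B)          # O_C1 -> O_G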
Regarding step S3, the pose solving process of the optical element is:
S31, placing the optical element at a predetermined position on the FOA platform using a clamp; the three cameras synchronously acquire a top-view image I_M and two side-view images I_SL, I_SF of the optical element, and the three images are preprocessed into grayscale images;
specifically, the manipulator first clamps the optical element at the predetermined position, the grazing light sources are turned on, and the illumination angle is adjusted so that the optical element edges are lit within the cameras' fields of view; the camera exposure time is adjusted according to a preset average pixel gray value of the RoI region so that the pixel gray values corresponding to the optical element edges lie between 220 and 240 (on a 0-255 scale); the three cameras are externally triggered to acquire images synchronously within 150 ms; the acquired images are the global camera image I_M, with a resolution of 5120 × 5120, and the side-view camera images I_SL and I_SF, each with a resolution of 4200 × 2160.
S32, extracting line segments from the three grayscale images with the LSD algorithm and processing them to obtain the optical element edges of the three grayscale images in the respective camera pixel coordinate systems;
the process of obtaining the optical element edges is:
all line segments are extracted from the three grayscale images with the LSD algorithm and processed as follows:
step one, eliminate over-short segments with L < 25 pixels, where the segment length L is computed as
L = sqrt((x_2 - x_1)^2 + (y_2 - y_1)^2)
where (x_1, y_1), (x_2, y_2) are the pixel coordinates of the two endpoints of the segment;
step two, denote the set of not-yet-fused segments by l_Ori and the set of fused segments by l_meg; the segments remaining after the short segments are removed in step one are first placed in the set l_Ori;
step three, fuse segments:
find the segment with the largest L in set l_Ori, denoted l_0, and search set l_Ori for segments satisfying the condition: the Euclidean distance dis from the segment's midpoint to the line through l_0 is smaller than 3 pixels, and at the same time the included angle ang between the segment and l_0 is smaller than 0.5 degrees; remove l_0 and the qualifying segments from set l_Ori;
the Euclidean distance dis and the included angle ang of the two segments are computed as
dis = |k·x - y + b_1 - k·a_1| / sqrt(k^2 + 1)
ang = arctan( |Δx·Δb - Δy·Δa| / |Δx·Δa + Δy·Δb| )
where:
(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,
k is the slope of l_0,
x = (x_1 + x_2)/2
y = (y_1 + y_2)/2
Δx = x_2 - x_1
Δy = y_2 - y_1
Δa = a_2 - a_1
Δb = b_2 - b_1
step four, for l_0 and the qualifying segments, compute the smallest minimum bounding rectangle of the segment endpoints with the SMBR algorithm; take the midpoint pixel coordinates of the two short sides of that rectangle as the pixel coordinates of the two endpoints of the new fused segment, and place the fused segment into the fused-segment set l_meg;
repeat steps three and four until the set l_Ori is empty, then execute step five;
step five, delete erroneous segments from the set l_meg:
first, delete segments with L < 1200 pixels; since the segments at the optical element edges in a real capture are normally longer than 1200 pixels, segments with L < 1200 pixels, i.e. over-short segments, are removed.
Next, identify the segments belonging to the optical element edges using prior knowledge of the CAD model of the optical element: specifically, the optical element held by the clamp is reset to the predetermined position, and the CAD model of the optical element at that moment is projected into the acquired image. Since the extracted segments should come from the optical element edges, a segment that correctly represents an edge must satisfy the following two conditions simultaneously (i.e. the segments representing the optical element edges are screened by these conditions): the Euclidean distance dis from the segment's midpoint to some model edge segment extracted from the CAD projection is smaller than 15 pixels, and the included angle ang is smaller than 2 degrees; segments in set l_meg that do not meet these conditions are deleted, and the segments remaining in set l_meg are taken as the optical element edges; a condensed sketch of this pipeline is given below.
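The sketch is an illustration only, with several stand-ins: LSD is taken from OpenCV's createLineSegmentDetector (available in recent OpenCV builds), the SMBR step is approximated with cv2.minAreaRect, and cad_edges stands for the projected CAD edge segments of step five:

    import cv2
    import numpy as np

    def seg_len(s):
        (x1, y1), (x2, y2) = s
        return np.hypot(x2 - x1, y2 - y1)

    def dis_ang(s, s0):
        """Midpoint-to-line distance (px) and included angle (deg) of s w.r.t. s0."""
        (x1, y1), (x2, y2) = s
        (a1, b1), (a2, b2) = s0
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        n = np.array([b2 - b1, a1 - a2], float)            # normal of the line through s0
        dis = abs(n @ np.array([mx - a1, my - b1])) / np.linalg.norm(n)
        v = np.array([x2 - x1, y2 - y1], float)
        w = np.array([a2 - a1, b2 - b1], float)
        cosang = abs(v @ w) / (np.linalg.norm(v) * np.linalg.norm(w))
        return dis, np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    def extract_edges(gray, cad_edges):
        lines = cv2.createLineSegmentDetector().detect(gray)[0].reshape(-1, 4)
        segs = [((x1, y1), (x2, y2)) for x1, y1, x2, y2 in lines]
        l_ori = [s for s in segs if seg_len(s) >= 25]       # step one: drop short segments
        l_meg = []
        while l_ori:                                        # steps three/four: fuse
            l0 = max(l_ori, key=seg_len)
            group = [s for s in l_ori
                     if dis_ang(s, l0)[0] < 3 and dis_ang(s, l0)[1] < 0.5]
            l_ori = [s for s in l_ori if s not in group]
            pts = np.array([p for s in group for p in s], np.float32)
            box = cv2.boxPoints(cv2.minAreaRect(pts))       # SMBR of all endpoints
            sides = [(np.linalg.norm(box[i] - box[(i + 1) % 4]),
                      (box[i] + box[(i + 1) % 4]) / 2.0) for i in range(4)]
            sides.sort(key=lambda t: t[0])                  # two short sides first
            l_meg.append((tuple(sides[0][1]), tuple(sides[1][1])))
        # step five: keep long segments consistent with the projected CAD edges
        return [s for s in l_meg if seg_len(s) >= 1200 and
                any(dis_ang(s, m)[0] < 15 and dis_ang(s, m)[1] < 2 for m in cad_edges)]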
S33, obtaining analytic expressions of the optical element edges under the three camera coordinate systems; specifically, according to the matching result of step S32, when an edge in the top-view image I_M and an edge in the side-view image I_SL or I_SF both correspond to the same edge of the CAD model, the edges in the two images are considered to correspond to the same edge of the optical element; following this principle, the analytic expressions of the edges in the three grayscale images are obtained in the respective camera coordinate systems;
S34, obtaining the analytic expressions of the optical element edges in the global coordinate system from the camera intrinsic matrices, the extrinsic matrices and the analytic expressions under the three camera coordinate systems;
the process of obtaining the analytic expressions of the optical element edges in the global coordinate system is:
referring to fig. 3, the principle of back-projecting an edge from a pixel coordinate system into three-dimensional space is first described. According to the known camera intrinsic and extrinsic matrices, the straight lines in the two images (a top view and a side view) that correspond to the same edge of the optical element under detection are each back-projected into a plane in the camera coordinate system, giving the three-dimensional parameters of the plane containing the straight line; this plane passes through the camera optical center O. As shown in fig. 3, x_1 and x_2 are two points on a straight line in the image and are known quantities; X_1 and X_2 are the corresponding points in three-dimensional space, whose spatial coordinates are unknown and cannot be recovered even with the camera calibration data. However, the spatial points X_1 and X_2 lie on the plane through O, x_1 and x_2; that is, by projection according to the camera intrinsics, a segment (straight line) on the camera's pixel plane can be made to correspond to one plane in three-dimensional space.
Assume that an edge segment l_1 from the global camera and an edge segment l_2 from one side-view camera correspond to the same straight line in space; l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side-view camera determine a plane P_2; transforming P_1 and P_2 into the global coordinate system according to the intrinsic and extrinsic matrices gives their analytic expressions in the global coordinate system:
P_1: a_1·X + b_1·Y + c_1·Z + d_1 = 0
P_2: a_2·X + b_2·Y + c_2·Z + d_2 = 0
where the vector (a_1, b_1, c_1) is the normal vector of plane P_1 and d_1 is the constant term satisfied by all points on plane P_1; the vector (a_2, b_2, c_2) is the normal vector of plane P_2 and d_2 is the constant term satisfied by all points on plane P_2; c_1 = c_2 = 1;
because the matched straight lines in the two cameras' images correspond to the same straight line in three-dimensional space, the planes P_1 and P_2 necessarily intersect, and their intersection line is the target line, i.e. the line containing the optical element edge, with analytic expression:
P = p_0 + t·v
where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, t is the unit direction vector of the target line of the form t = (i, j, k) with coefficients satisfying i^2 + j^2 + k^2 = 1, and v is a scalar parameter.
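This back-projection and intersection has a direct closed form. The following sketch is illustrative (K is a camera intrinsic matrix, T the 4x4 camera-to-global transform built from (R, t)): it lifts two pixels of an image line to viewing rays, expresses the plane through them and the optical center in the global frame, and intersects two such planes to recover the 3-D edge line P = p_0 + t·v:

    import numpy as np

    def image_line_to_plane(px1, px2, K, T):
        """Plane (a, b, c, d), in the global frame, through the camera optical
        center and the image line defined by pixel points px1 and px2."""
        Kinv = np.linalg.inv(K)
        r1 = Kinv @ np.array([px1[0], px1[1], 1.0])   # viewing rays in the camera frame
        r2 = Kinv @ np.array([px2[0], px2[1], 1.0])
        n_cam = np.cross(r1, r2)                      # plane normal in the camera frame
        R, t = T[:3, :3], T[:3, 3]                    # camera -> global rotation, translation
        n = R @ n_cam                                 # normal in the global frame
        return np.append(n, -n @ t)                   # plane passes through the optical center t

    def intersect_planes(P1, P2):
        """Intersection line p0 + v * t_dir of two planes aX + bY + cZ + d = 0."""
        n1, n2 = P1[:3], P2[:3]
        t_dir = np.cross(n1, n2)
        t_dir /= np.linalg.norm(t_dir)                # unit direction vector of the edge
        # particular point: both plane equations plus t_dir . p0 = 0
        A = np.vstack([n1, n2, t_dir])
        p0 = np.linalg.solve(A, np.array([-P1[3], -P2[3], 0.0]))
        return p0, t_dir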
Following this principle, the analytic expressions, in the global coordinate system, of the lines containing the two adjacent edges of the optical element's upper surface nearest to the two side-view cameras are obtained:
L_1: P = p_1 + t_1·v
L_2: P = p_2 + t_2·v
where p_1 is a known point on line L_1, t_1 is the unit direction vector of line L_1, p_2 is a known point on line L_2, and t_2 is the unit direction vector of line L_2;
the point A = (X_M, Y_M, Z_M) with the smallest sum of distances to L_1 and L_2 is computed as the upper-left vertex of the upper surface of the optical element; then points B and C at a distance of 410 mm from A are taken on L_1 and L_2 respectively, and A, B and C together determine a plane:
P_0: a_0·X + b_0·Y + Z + c_0 = 0
which is the analytic expression of the upper surface of the optical element in the global coordinate system; the intersection lines of this upper surface with the planes of the other two edges are then solved in the same way, i.e. the spatial analytic expressions in the global coordinate system of the lines containing the remaining two edges:
L_3: P = p_3 + t_3·v
L_4: P = p_4 + t_4·v
where p_3 is a known point on line L_3, t_3 is the unit direction vector of line L_3, p_4 is a known point on line L_4, and t_4 is the unit direction vector of line L_4.
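For two reconstructed lines that nearly intersect, the point with minimal distance to both is, under a least-squares reading of the condition, the midpoint of their common perpendicular, which has a closed form. A sketch (p1, t1, p2, t2 as defined above):

    import numpy as np

    def vertex_from_two_lines(p1, t1, p2, t2):
        """Midpoint of the common perpendicular of lines p1 + s*t1 and p2 + s*t2,
        used as the estimate of the upper-left vertex A."""
        w = p1 - p2
        a, b, c = t1 @ t1, t1 @ t2, t2 @ t2
        d, e = t1 @ w, t2 @ w
        denom = a * c - b * b                  # zero only if the lines are parallel
        s1 = (b * e - c * d) / denom           # parameter of the foot on L_1
        s2 = (a * e - b * d) / denom           # parameter of the foot on L_2
        return ((p1 + s1 * t1) + (p2 + s2 * t2)) / 2.0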
S35, from the analytic expressions of the optical element edges in the global coordinate system, computing the position and attitude of the optical element and the size of its upper surface from the correspondences using a PnP algorithm (Perspective-n-Point, an algorithm that solves pose from 3D-2D point correspondences).
The process of computing the position and attitude of the optical element and the size of its upper surface is:
for the pose of the optical element model coordinate system O_M relative to the global coordinate system O_G, the positive Z-axis direction is defined as t_2 × t_1 and the positive X-axis direction as t_2, and the PnP correspondences are constructed:
two-dimensional point (0, 0) corresponds to three-dimensional point A = (X_M, Y_M, Z_M);
two-dimensional point (1, 0) corresponds to three-dimensional point A + t_2;
two-dimensional point (0, 1) corresponds to three-dimensional point A + Norm(t_2 × t_1) × t_2;
two-dimensional point (1, 1) corresponds to three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;
where Norm(t) is the operation normalizing t, and × between vectors denotes the cross product;
by establishing the PnP relations of these four point pairs, the transformation (R, t) from the model coordinate system O_M of the optical element to the global coordinate system O_G can be solved as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix and t the translation vector from O_M to O_G;
the position vector t = A^T = (X_M, Y_M, Z_M)^T;
the coordinates of the remaining vertices of the optical element's upper surface in the global coordinate system are obtained by the same method used to solve the coordinates of point A; the remaining vertices are, in clockwise order, B, C and D,
and the length of the optical element is L = ||((A + B) - (D + C))/2||_2 and the width is W = ||((A + D) - (B + C))/2||_2.
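Because the four 2-D points above form an exact planar square, the PnP step amounts to assembling a rotation from the measured edge directions; an equivalent direct construction can be sketched as follows (illustrative, using the axis definitions of this step):

    import numpy as np

    def pose_from_edges(A, t1, t2):
        """6-DoF pose (R, t) of the model frame O_M in the global frame O_G.

        A      : upper-left vertex of the upper surface (origin of O_M)
        t1, t2 : unit directions of edges L_1 and L_2 (t2 is the X_M axis)
        """
        x = t2 / np.linalg.norm(t2)            # X_M axis
        z = np.cross(t2, t1)
        z /= np.linalg.norm(z)                 # Z_M axis = Norm(t2 x t1)
        y = np.cross(z, x)                     # Y_M completes a right-handed frame
        R = np.column_stack([x, y, z])         # columns are the model axes in O_G
        return R, np.asarray(A, float)         # translation is vertex A itself

    def surface_size(A, B, C, D):
        """Length and width of the upper surface from its clockwise vertices."""
        A, B, C, D = (np.asarray(p, float) for p in (A, B, C, D))
        return (np.linalg.norm((A + B - D - C) / 2.0),
                np.linalg.norm((A + D - B - C) / 2.0))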
Embodiment 2: described with reference to fig. 1, the optical element assembly method based on an orthogonal vision system according to this embodiment is implemented on the detection method of embodiment 1, applying the six-degree-of-freedom pose of the optical element computed in embodiment 1 to the subsequent assembly process. During assembly of the optical element, its pose is acquired in real time using the pose detection method of embodiment 1; combining the hand-eye calibration information of the manipulator with the ideal pose the optical element should reach, the deviation between the current pose of the optical element and the ideal pose is calculated, and this deviation is used to guide the manipulator in adjusting the pose.
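For the guidance step, the correction handed to the manipulator is just the relative rigid transform between the measured and ideal poses; a minimal sketch, assuming both poses are already expressed in the same frame via the hand-eye calibration:

    import numpy as np

    def pose_deviation(R_cur, t_cur, R_ideal, t_ideal):
        """Rigid transform that maps the current element pose onto the ideal one."""
        T_cur, T_ideal = np.eye(4), np.eye(4)
        T_cur[:3, :3], T_cur[:3, 3] = R_cur, t_cur
        T_ideal[:3, :3], T_ideal[:3, 3] = R_ideal, t_ideal
        return T_ideal @ np.linalg.inv(T_cur)   # apply to the element to reach the ideal pose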
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that the different dependent claims and the features described herein may be combined in ways other than as described in the original claims. It is also to be understood that features described in connection with separate embodiments may be used in other described embodiments.

Claims (7)

1. A method for detecting the pose of the optical element assembly process based on an orthogonal vision system, characterized by comprising the following steps:
S1, constructing a vision detection system;
the vision detection system comprises a global camera and two side-view cameras; strip light sources in front of the two side-view cameras provide grazing illumination of the optical element;
S2, jointly calibrating the three cameras of the vision detection system;
the camera coordinate system of the global camera is taken as the unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, establishing the spatial mapping from each camera's pixel coordinate system to the unified global coordinate system;
the optical element to be detected is placed horizontally on the FOA assembly platform, the global camera is arranged above the FOA assembly platform, and the two cameras located to the side of the optical element during detection are the No. 1 side-view camera and the No. 2 side-view camera;
the three cameras are mutually orthogonal in orientation, their optical axes intersect at one point on the upper surface of the optical element, and the following coordinate systems are established:
the global coordinate system of the global camera is O_G-X_G Y_G Z_G, and the optical axis of the global camera is perpendicular to the horizontal plane;
the No. 1 side-view camera coordinate system is O_C1-X_C1 Y_C1 Z_C1, the No. 2 side-view camera coordinate system is O_C2-X_C2 Y_C2 Z_C2, and the optical axes of the two cameras are parallel to the horizontal plane;
the positive X_G direction of the global camera coordinate system O_G coincides with the positive Z_C2 direction of the No. 2 side-view camera coordinate system O_C2;
the model coordinate system of the optical element is O_M-X_M Y_M Z_M, with its origin at the vertex A at the upper-left corner of the upper surface of the optical element; the Z_M axis is perpendicular to the upper surface of the optical element with its positive direction upward, and the X_M axis points from the upper-left vertex to the lower-left vertex of the upper surface;
S3, computing the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; analytic expressions of the optical element edges are extracted from the three images in each camera's pixel coordinate system, and the edges from the different pixel coordinate systems are aligned into the unified global coordinate system using the joint calibration data; the pose of the optical element is determined from the analytic expressions of these edges and is computed by the following steps:
S31, placing the optical element at a predetermined position on the FOA assembly platform using a clamp; the three cameras synchronously acquire a top-view image I_M and two side-view images I_SL, I_SF of the optical element, and the three images are preprocessed into grayscale images;
S32, extracting line segments from the three grayscale images with the LSD algorithm and processing them to obtain the optical element edges of the three grayscale images in the respective camera pixel coordinate systems;
S33, obtaining analytic expressions of the optical element edges under the three camera coordinate systems; specifically, according to the matching result of step S32, when an edge in the top-view image I_M and an edge in the side-view image I_SL or I_SF both correspond to the same edge of the CAD model, the edges in the two images are considered to correspond to the same edge of the optical element; following this principle, the analytic expressions of the edges in the three grayscale images are obtained in the respective camera coordinate systems;
S34, obtaining the analytic expressions of the optical element edges in the global coordinate system from the camera intrinsic matrices, the extrinsic matrices and the analytic expressions under the three camera coordinate systems;
S35, from the analytic expressions of the optical element edges in the global coordinate system, computing the position and attitude of the optical element and the size of its upper surface from the correspondences using a PnP algorithm.
2. The method for detecting the pose of the optical element assembly process based on an orthogonal vision system according to claim 1, characterized in that the joint calibration procedure of step S2 is:
first, the intrinsic matrices of the three cameras are calibrated separately using Zhang Zhengyou's calibration method, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
then, taking the camera coordinate system of the global camera as the unified global coordinate system, the coordinate transformations from the camera coordinate systems of the two side-view cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side-view cameras;
the extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the No. 1 side-view camera coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 the translation vector from O_C1 to O_G;
and the transformation T_C2 = (R_C2, t_C2) from the No. 2 side-view camera coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 the translation vector from O_C2 to O_G.
3. The method for detecting the pose of the optical element assembly process based on the orthogonal vision system according to claim 2, wherein a three-dimensional calibration block is adopted and is combined with AprilTag algorithm to calibrate the external reference matrix of two side view angle cameras, the three-dimensional calibration block is a cuboid, four side surfaces and the top surface are sequentially defined as surface 0-surface 4 of the cuboid, the surface 0-surface 4 is sequentially defined by 36H11 april tag patterns with id of 0-4, the origin of a coordinate system O B of the three-dimensional calibration block is defined in the center of AprilTag patterns (id=0) on the surface 0, the X axis and the Y axis are parallel to the surface, the positive directions are the directions from the top left vertex to the top right vertex of the patterns, the top left vertex to the bottom left vertex of the patterns, and the Z axis direction is directed to the inner side of the model;
The process of calibrating the extrinsic matrices with the AprilTag algorithm comprises the following steps:
Surface 4 of the calibration block is kept horizontal and one of surfaces 0 to 3 is selected arbitrarily; the block is placed so that the AprilTag pattern on the selected surface is 280 mm from the optical center of the No. 1 side-view camera along its optical axis, and the AprilTag pattern on surface 4 is 1200 mm from the optical center of the global camera along its optical axis, ensuring that the corresponding AprilTag patterns appear completely within the fields of view of both cameras;
When the calibration block, the global camera and the No. 1 side-view camera satisfy these requirements, an external trigger is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm computes the position and attitude of the calibration pattern in each camera's field of view relative to that camera; because the transformations between the coordinate systems of all patterns on the calibration block and the block coordinate system O_B are known from CAD data, the pose of the block at the capture instant relative to the global camera and to the No. 1 side-view camera is deduced, and the transformation T_C1 = (R_C1, t_C1) between the two poses is computed, i.e. the transformation from O_C1 to O_G, which gives the extrinsic matrix of the No. 1 side-view camera in O_G;
The extrinsic matrix of the No. 2 side-view camera in O_G is obtained in the same way.
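A hedged sketch of this chaining follows, written against pupil_apriltags, one common Python wrapper for the tag36h11 family; the patent does not specify an implementation, and the CAD-derived transforms T_B_tag0 and T_B_tag4 (tag frames expressed in the block frame O_B) are assumed inputs:

    import numpy as np
    from pupil_apriltags import Detector

    def to_T(R, t):
        """Pack a rotation matrix and translation vector into a 4x4 homogeneous transform."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = np.asarray(t).ravel()
        return T

    detector = Detector(families="tag36h11")

    def tag_pose(gray, fx, fy, cx, cy, tag_size_m):
        """Pose (4x4) of the first detected tag in the camera frame; gray is a uint8 image."""
        det = detector.detect(gray, estimate_tag_pose=True,
                              camera_params=(fx, fy, cx, cy),
                              tag_size=tag_size_m)[0]
        return to_T(det.pose_R, det.pose_t)

    def extrinsic_C1_to_G(T_G_tag4, T_C1_tag0, T_B_tag4, T_B_tag0):
        """Chain tag observations through the block frame O_B to get T_C1 = (R_C1, t_C1)."""
        T_G_B = T_G_tag4 @ np.linalg.inv(T_B_tag4)    # block pose in the global frame O_G
        T_C1_B = T_C1_tag0 @ np.linalg.inv(T_B_tag0)  # block pose in the side-camera frame O_C1
        return T_G_B @ np.linalg.inv(T_C1_B)          # transform from O_C1 to O_G

Chaining through the block frame works because both cameras observe the same rigid block at effectively the same instant, which is why the claim bounds the trigger skew at 15 ms.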
4. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 1, wherein the process of obtaining the edge of the optical element in step S32 is as follows:
All line segments are extracted from the three gray images using the LSD algorithm and then processed as follows:
Step one, eliminating over-short line segments with L < 25 pixels, where the length L of a line segment is obtained according to the following formula:
L = sqrt((x_2 - x_1)^2 + (y_2 - y_1)^2)
where (x_1, y_1) and (x_2, y_2) are the pixel coordinates of the two endpoints of the segment;
Step two, denoting the set of not-yet-fused line segments as l_Ori and the set of fused line segments as l_meg; the segments remaining after the removal of short segments in step one are first placed in set l_Ori;
Step three, fusing line segments:
Searching for the segment with the largest L in set l_Ori, denoted l_0, and searching set l_Ori for segments satisfying the condition that the Euclidean distance dis from the segment midpoint to the straight line containing l_0 is less than 3 pixels and the angle ang between the segment and l_0 is less than 0.5 degrees; l_0 and the qualifying segments are removed from set l_Ori;
The Euclidean distance dis and the angle ang between the two segments are calculated according to the following formulas:
dis = |k(x - a_1) - (y - b_1)| / sqrt(k^2 + 1)
ang = |arctan(Δy/Δx) - arctan(Δb/Δa)|
wherein:
(a_1, b_1) and (a_2, b_2) are the endpoint coordinates of l_0,
k = Δb/Δa is the slope of l_0,
x = (x_1 + x_2)/2
y = (y_1 + y_2)/2
Δx = x_2 - x_1
Δy = y_2 - y_1
Δa = a_2 - a_1
Δb = b_2 - b_1
Step four, for l_0 and the qualifying segments, calculating the minimum bounding rectangle of their endpoints using the SMBR algorithm, taking the midpoint pixel coordinates of the two short sides of the rectangle as the pixel coordinates of the two endpoints of the new fused segment, and placing the fused segment into the fused set l_meg;
Steps three and four are repeated until set l_Ori is empty, after which step five is executed;
Step five, deleting erroneous segments from set l_meg:
first, segments with L < 1200 pixels are deleted;
secondly, prior knowledge of the CAD model of the optical element is used to identify the segments belonging to the optical element edges; specifically, the optical element held by the clamp is reset to a preset position, and the CAD model of the optical element at that moment is projected into the captured image; since the extracted segments should come from the optical element edges, the segments representing those edges are screened as follows: the Euclidean distance dis from the midpoint of a segment to some model edge segment extracted from the CAD is less than 15 pixels and the angle ang is less than 2 degrees; segments in set l_meg that do not satisfy these conditions are deleted, and the remaining segments in l_meg are taken as the optical element edges.
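A condensed, non-authoritative Python sketch of this segment pipeline (LSD extraction, short-segment rejection, fusion, length filtering) follows; OpenCV's minAreaRect stands in for the SMBR step, the availability of createLineSegmentDetector depends on the OpenCV build, and the image filename is a placeholder:

    import cv2
    import numpy as np

    def seg_len(s):
        x1, y1, x2, y2 = s
        return float(np.hypot(x2 - x1, y2 - y1))

    def dis_ang(l0, s):
        """Midpoint-to-line distance (pixels) and angle (degrees) between segment s and l0.
        The cross-product distance form is equivalent to the slope form but handles vertical l0."""
        a1, b1, a2, b2 = l0
        x1, y1, x2, y2 = s
        x, y = (x1 + x2) / 2, (y1 + y2) / 2
        dis = abs((b2 - b1) * (x - a1) - (a2 - a1) * (y - b1)) / np.hypot(a2 - a1, b2 - b1)
        ang = abs(np.arctan2(y2 - y1, x2 - x1) - np.arctan2(b2 - b1, a2 - a1)) % np.pi
        return dis, np.degrees(min(ang, np.pi - ang))

    gray = cv2.imread("top_view.png", cv2.IMREAD_GRAYSCALE)   # placeholder image
    lsd = cv2.createLineSegmentDetector()
    l_ori = [s for s in lsd.detect(gray)[0].reshape(-1, 4) if seg_len(s) >= 25]  # step one
    l_meg = []
    while l_ori:                                              # steps three and four
        l_ori.sort(key=seg_len, reverse=True)
        l0, rest = l_ori[0], l_ori[1:]
        group, l_ori = [l0], []
        for s in rest:
            d, a = dis_ang(l0, s)
            (group if d < 3 and a < 0.5 else l_ori).append(s)
        pts = np.asarray(group, dtype=np.float32).reshape(-1, 2)
        box = cv2.boxPoints(cv2.minAreaRect(pts))             # minimum bounding rectangle
        if np.linalg.norm(box[0] - box[1]) <= np.linalg.norm(box[1] - box[2]):
            ends = (box[0] + box[1]) / 2, (box[2] + box[3]) / 2  # midpoints of the short sides
        else:
            ends = (box[1] + box[2]) / 2, (box[3] + box[0]) / 2
        l_meg.append(np.concatenate(ends))
    l_meg = [s for s in l_meg if seg_len(s) >= 1200]          # step five, first filter

The CAD-based screening in the second part of step five would reuse the same dis/ang test against the projected model edges.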
5. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 4, wherein the process of step S34 for obtaining the analytical expressions of the optical element edges in the global coordinate system is as follows:
According to the known camera intrinsic and extrinsic matrices, the straight lines corresponding to the same optical element edge in the two images are back-projected into planes in the camera coordinate systems, giving the three-dimensional parameters of the plane containing each line, where each plane passes through the corresponding camera optical center O; the two images are a top view and a side view;
Assuming that an edge segment l_1 from the global camera and an edge segment l_2 from one side-view camera correspond to the same straight line in space, l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side-view camera determine a plane P_2; transforming P_1 and P_2 into the global coordinate system according to the intrinsic and extrinsic matrices gives their analytical expressions in the global coordinate system:
P_1: a_1·X_G + b_1·Y_G + c_1·Z_G + d_1 = 0
P_2: a_2·X_G + b_2·Y_G + c_2·Z_G + d_2 = 0
where the vector (a_1, b_1, c_1) is the normal vector of plane P_1 and d_1 is the constant term satisfied by all points on P_1; the vector (a_2, b_2, c_2) is the normal vector of plane P_2 and d_2 is its constant term; the equations are normalized so that c_1 = c_2 = 1;
Because the matched lines in the two cameras' images correspond to the same straight line in three-dimensional space, planes P_1 and P_2 necessarily intersect, and their intersection is the target line, i.e. the line containing the optical element edge, whose analytical expression is:
P = p_0 + tV,
where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, t is the unit direction vector of the target line, of the form t = i·X_G + j·Y_G + k·Z_G with coefficients satisfying i^2 + j^2 + k^2 = 1, and V is a free scalar parameter;
By repeating this method, the analytical expressions, in the global coordinate system, of the lines containing the two adjacent edges of the optical element's upper surface nearest the two side-view cameras are obtained:
L_1: P = p_1 + t_1·V
L_2: P = p_2 + t_2·V
where p_1 is a known point on line L_1, t_1 is the unit direction vector of L_1, p_2 is a known point on line L_2, and t_2 is the unit direction vector of L_2;
The point A = (X_M, Y_M, Z_M) that minimizes the sum of distances to L_1 and L_2 is calculated and taken as the top-left vertex of the upper surface of the optical element; then points B and C at a distance of 410 mm from A are taken on L_1 and L_2 respectively, and A, B and C together determine a plane:
P_0: a_0·X + b_0·Y + Z + c_0 = 0
which is the analytical expression of the upper surface of the optical element in the global coordinate system; the intersections of this surface with the planes of the other two edges are solved in the same way, giving the spatial analytical expressions of the lines containing the remaining two edges in the global coordinate system:
L_3: P = p_3 + t_3·V
L_4: P = p_4 + t_4·V
where p_3 is a known point on line L_3, t_3 is the unit direction vector of L_3, p_4 is a known point on line L_4, and t_4 is the unit direction vector of L_4.
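By way of a non-authoritative illustration, the two geometric primitives used in claim 5 — intersecting two back-projected planes into a line, and finding the point nearest two such lines — might be sketched as follows; the helper names are assumptions, and the least-squares point minimizes the summed squared distance as a closed-form stand-in for the summed distance in the claim:

    import numpy as np

    def plane_intersection(p1, p2):
        """Intersection line of two planes given as coefficient vectors (a, b, c, d),
        returned as a point p0 and a unit direction t (the claim's form P = p_0 + tV)."""
        n1, d1 = np.asarray(p1[:3], float), float(p1[3])
        n2, d2 = np.asarray(p2[:3], float), float(p2[3])
        t = np.cross(n1, n2)
        t = t / np.linalg.norm(t)              # unit direction of the target line
        # A point on the line: satisfy both plane equations plus t . p = 0.
        A = np.vstack([n1, n2, t])
        p0 = np.linalg.solve(A, np.array([-d1, -d2, 0.0]))
        return p0, t

    def nearest_point_to_two_lines(p1, t1, p2, t2):
        """Point minimizing the summed squared distance to lines (p1, t1) and (p2, t2)."""
        M = np.zeros((3, 3))
        b = np.zeros(3)
        for p, t in ((p1, t1), (p2, t2)):
            P = np.eye(3) - np.outer(t, t)     # projector onto the plane normal to t
            M += P
            b += P @ np.asarray(p, float)
        return np.linalg.solve(M, b)           # vertex A = (X_M, Y_M, Z_M)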
6. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 5, wherein the process of step S35 for calculating the position and attitude of the optical element and the size of its upper surface is as follows:
The positive Z-axis direction of the attitude from the optical element model coordinate system O_M to the global coordinate system O_G is defined as t_2 × t_1, and the positive X-axis direction as t_2; the PnP correspondence is constructed as follows:
the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (X_M, Y_M, Z_M);
the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t_2;
the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2;
the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;
where Norm(t) denotes the normalization of t;
By establishing the PnP correspondence of these four point pairs, the transformation (R, t) from the optical element model coordinate system O_M to the global coordinate system O_G can be solved; (R, t) is taken as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix and t is the translation vector from O_M to O_G;
the position vector is t = A^T = (X_M, Y_M, Z_M)^T;
The coordinates of the remaining vertices of the upper surface of the optical element in the global coordinate system are obtained by the same method used to solve for the coordinates of point A; clockwise, the remaining vertices are in turn B, C and D, and the length of the optical element is l = ||((A + B) - (D + C))/2||_2 and its width is w = ||((A + D) - (B + C))/2||_2.
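For illustration, since claim 6 fixes the axes explicitly (Z along t_2 × t_1, X along t_2), the (R, t) that the four-point PnP correspondence yields can equally be written in closed form; the sketch below does so under that reading, and the helper names are assumptions:

    import numpy as np

    def norm(v):
        return v / np.linalg.norm(v)

    def element_pose(A, t1, t2):
        """6-DoF pose (R, t) of the model frame O_M in the global frame O_G."""
        x = norm(np.asarray(t2, float))          # X axis along t_2
        z = norm(np.cross(t2, t1))               # Z axis along t_2 x t_1, as in the claim
        y = np.cross(z, x)                       # Y axis completes a right-handed frame
        R = np.column_stack([x, y, z])           # rotation O_M -> O_G
        return R, np.asarray(A, float)           # translation is the vertex A

    def upper_surface_size(A, B, C, D):
        """Length and width of the upper surface from its four vertices (clockwise A, B, C, D)."""
        A, B, C, D = (np.asarray(v, float) for v in (A, B, C, D))
        length = np.linalg.norm((A + B - D - C) / 2)
        width = np.linalg.norm((A + D - B - C) / 2)
        return length, width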
7. An optical element assembling method based on an orthogonal vision system, characterised in that during assembly the pose of the optical element is obtained in real time using the method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to any one of claims 1 to 6, the deviation between the current pose of the optical element and the ideal pose is calculated, and a manipulator is guided to adjust the pose accordingly.
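As a hedged sketch of this closing step, the deviation between the measured pose (R, t) and an assumed ideal pose (R_ideal, t_ideal) can be expressed as a rotation-vector and translation correction for the manipulator; the deviation convention here is an illustrative choice, not the patent's specification:

    import numpy as np
    import cv2

    def pose_deviation(R, t, R_ideal, t_ideal):
        """Rotation-vector and translation corrections from the measured to the ideal pose."""
        dR = R_ideal @ R.T                # rotation still to be applied by the manipulator
        rvec, _ = cv2.Rodrigues(dR)       # axis-angle form, in radians
        return rvec.ravel(), t_ideal - t  # (rotation correction, translation correction)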
CN202310735104.6A 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method Active CN116758160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310735104.6A CN116758160B (en) 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method

Publications (2)

Publication Number Publication Date
CN116758160A (en) 2023-09-15
CN116758160B (en) 2024-04-26

Family

ID=87958590

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108470370A (en) * 2018-03-27 2018-08-31 北京建筑大学 The method that three-dimensional laser scanner external camera joint obtains three-dimensional colour point clouds
JP2018153910A (en) * 2016-12-22 2018-10-04 セイコーエプソン株式会社 Control device, robot, and robot system
CN111009014A (en) * 2019-11-25 2020-04-14 天津大学 Calibration method of orthogonal spectral imaging pose sensor of general imaging model
CN113870358A (en) * 2021-09-17 2021-12-31 聚好看科技股份有限公司 Method and equipment for joint calibration of multiple 3D cameras
CN115953483A (en) * 2022-12-30 2023-04-11 深圳云天励飞技术股份有限公司 Parameter calibration method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application of a Vision System in the Assembly of Toy Cars; An Ning; Zhai Xiaotong; Zhao Huadong; Xu Bofan; Machinery Design & Manufacture; 2020-10-08 (10); pp. 256-260 *

Similar Documents

Publication Publication Date Title
US9124873B2 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
US8600192B2 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
US11488322B2 (en) System and method for training a model in a plurality of non-perspective cameras and determining 3D pose of an object at runtime with the same
Li et al. A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern
Kim et al. A camera calibration method using concentric circles for vision applications
Yan et al. Joint camera intrinsic and lidar-camera extrinsic calibration
JP7486740B2 (en) System and method for efficient 3D reconstruction of an object using a telecentric line scan camera - Patents.com
JP5156601B2 (en) Shape measuring apparatus and program
CN114029946A (en) Method, device and equipment for guiding robot to position and grab based on 3D grating
JP5297779B2 (en) Shape measuring apparatus and program
US11986955B2 (en) In-hand pose refinement for pick and place automation
WO2010071139A1 (en) Shape measurement device and program
CN107850425B (en) Method for measuring an article
CN112529856A (en) Method for determining the position of an operating object, robot and automation system
So et al. Calibration of a dual-laser triangulation system for assembly line completeness inspection
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
Siddique et al. 3d object localization using 2d estimates for computer vision applications
CN116758160B (en) Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method
Tran et al. Extrinsic calibration of a camera and structured multi-line light using a rectangle
JPH09329440A (en) Coordinating method for measuring points on plural images
WO2012076979A1 (en) Model-based pose estimation using a non-perspective camera
Li et al. Method for horizontal alignment deviation measurement using binocular camera without common target
Tehrani et al. A practical method for fully automatic intrinsic camera calibration using directionally encoded light
Abdelhamid et al. Extracting depth information using a correlation matching algorithm
Ribeiro et al. Photogrammetric multi-camera calibration using an industrial programmable robotic arm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant