CN116758160A - Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method

Info

Publication number: CN116758160A (application CN202310735104.6A)
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN116758160B (granted)
Inventors: 陈冠华, 刘国栋, 陈凤东
Applicant and current assignee: Harbin Institute of Technology
Legal status: Active (granted)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a pose detection method for the optical element assembly process, and an assembly method, based on an orthogonal vision system, belonging to the field of industrial assembly vision detection. It aims to solve the problem that in conventional optical element assembly the edge of the optical element and the assembly frame collide easily, causing assembly errors. The method comprises the following steps: S1, constructing a vision detection system comprising a global camera and two side view angle cameras; S2, jointly calibrating the three cameras of the vision detection system, taking the camera coordinate system of the global camera as the unified global coordinate system; S3, solving the pose of the optical element: the three cameras synchronously acquire a top view and two side views of the optical element, analytical expressions of the optical element edges in the respective pixel coordinate systems are extracted from the three images, the edges in the pixel coordinate systems of the different cameras are aligned into the unified global coordinate system using the joint calibration data, and the pose of the optical element is determined from the analytical expressions of these edges.

Description

Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method
Technical Field
The invention relates to the field of industrial assembly visual detection, and in particular to a method for solving the six-degree-of-freedom pose of an optical element in industrial automated optical element assembly.
Background
In the automated assembly of an optical element, a manipulator holds the optical element and places it into a mechanical frame, realizing automatic assembly. The assembly process places high accuracy requirements on detecting the relative position and attitude among the optical element, the manipulator, and the assembly frame. Existing direct visual detection methods locate transparent optical elements unreliably and with limited precision, and the manufacturing tolerances of the optical element and the assembly frame are not negligible; as a result, during optical element assembly on the FOA (Final Optics Assembly) platform, the edge of the optical element and the assembly frame collide easily, causing assembly errors.
Disclosure of Invention
Aiming at the problem that in conventional optical element assembly the edge of the optical element and the assembly frame collide easily and cause assembly errors, the invention provides a visual detection method for optical element assembly and an assembly method based on an orthogonal visual detection system.
The invention relates to a pose detection method for an optical element assembly process based on an orthogonal vision system, which comprises the following steps:
S1, constructing a visual detection system;
the visual detection system comprises a global camera and two side view angle cameras, and a strip light source in front of each side view angle camera provides grazing illumination of the optical element;
S2, jointly calibrating the three cameras of the visual detection system;
the camera coordinate system of the global camera is taken as the unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, and the spatial mapping from the pixel coordinate system of each camera to the unified global coordinate system is established;
S3, solving the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; the analytical expressions of the optical element edges in the respective pixel coordinate systems are extracted from the images of the three cameras; combined with the joint calibration data, the edges in the pixel coordinate systems of the different cameras are aligned into the unified global coordinate system; and the pose of the optical element is determined from the analytical expressions of these edges.
Preferably, the optical element to be detected is placed horizontally on the FOA assembly platform, the global camera is arranged above the FOA assembly platform, and the two cameras located beside the optical element during detection are the No. 1 side view angle camera and the No. 2 side view angle camera.
The three cameras are mutually orthogonal in attitude and their optical axes intersect at one point on the upper surface of the optical element; the following coordinate systems are established:
the global coordinate system of the global camera is O_G-X_G Y_G Z_G, and the optical axis direction of the global camera is perpendicular to the horizontal plane;
the No. 1 side view angle camera coordinate system is O_C1-X_C1 Y_C1 Z_C1 and the No. 2 side view angle camera coordinate system is O_C2-X_C2 Y_C2 Z_C2; the optical axis directions of the two cameras are parallel to the horizontal plane;
the positive X-axis direction of the global camera coordinate system O_G coincides with the positive Z_C2-axis direction of the No. 2 side view angle camera coordinate system O_C2;
the model coordinate system of the optical element is O_M-X_M Y_M Z_M; its origin is defined at the vertex A at the upper left corner of the upper surface of the optical element, the Z_M axis is perpendicular to the upper surface of the optical element with its positive direction pointing upward, and the positive X_M-axis direction is the direction from the upper left vertex of the upper surface toward the lower left vertex of the upper surface.
Preferably, the joint calibration procedure of step S2 is:
First, the intrinsic matrices of the three cameras are calibrated separately using the Zhang Zhengyou calibration method, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
then, taking the camera coordinate system of the global camera as the unified global coordinate system, the coordinate transformations from the camera coordinate systems of the two side view angle cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side view angle cameras;
the extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the No. 1 side view angle camera coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 is the translation vector from O_C1 to O_G;
and the transformation T_C2 = (R_C2, t_C2) from the No. 2 side view angle camera coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 is the translation vector from O_C2 to O_G.
Preferably, a three-dimensional calibration block combined with the AprilTag algorithm is used to calibrate the extrinsic matrices of the two side view angle cameras. The three-dimensional calibration block is a cuboid; its four side surfaces and top surface are defined in turn as surfaces 0-4 of the cuboid, and surfaces 0-4 in turn carry 36H11 AprilTag patterns with ids 0-4. The origin of the three-dimensional calibration block coordinate system O_B is defined at the center of the AprilTag pattern (id = 0) on surface 0; the X and Y axes are parallel to that surface, with positive directions from the pattern's upper left vertex to its upper right vertex and from its upper left vertex to its lower left vertex, respectively; and the Z-axis direction points toward the inside of the block;
the process of calibrating the extrinsic matrices using the AprilTag algorithm comprises the following steps:
surface 4 of the three-dimensional calibration block is kept horizontal, one surface is arbitrarily selected from surfaces 0-3, the calibration block is placed so that the distance along the optical axis from the AprilTag pattern of the selected surface to the optical center of the No. 1 side view angle camera is 280 mm and the distance along the optical axis from the AprilTag pattern of surface 4 to the optical center of the global camera is 1200 mm, and it is ensured that the corresponding AprilTag patterns appear completely within the fields of view of the two cameras;
when the three-dimensional calibration block, the global camera, and the No. 1 side view angle camera meet these requirements, external triggering is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm is used to compute the position and attitude of each calibration pattern in the field of view relative to each camera; the transformations between the coordinate systems of all patterns on the three-dimensional calibration block and the calibration block coordinate system O_B are known from CAD data, so the poses of the calibration block at the capture moment relative to the global camera and relative to the No. 1 side view angle camera can be deduced; the transformation T_C1 = (R_C1, t_C1) between the two poses, i.e., the transformation from O_C1 to O_G, is then computed, yielding the extrinsic matrix of the No. 1 side view angle camera in O_G;
the extrinsic matrix of the No. 2 side view angle camera in O_G is acquired by the same process.
Preferably, the pose solving process of the optical element in the step S3 is as follows:
S31, the optical element is placed at the preset position of the FOA platform by a clamp, the three cameras synchronously acquire a top view image I_M and two side view images I_SL and I_SF of the optical element, and the three images are preprocessed into grayscale images;
S32, line segments are extracted from the three grayscale images using the LSD algorithm (Line Segment Detector), and the segments are processed to obtain the optical element edges of the three grayscale images in the respective camera pixel coordinate systems;
S33, the analytical expressions of the optical element edges in the three camera coordinate systems are acquired; specifically, according to the matching result of step S32, when one edge in the top view image I_M and one edge in the side view image I_SL or I_SF simultaneously correspond to the same edge of the CAD model, the edges in the two images are considered to correspond to the same edge of the optical element; on this principle, the analytical expressions of the edges of the three grayscale images in the respective camera coordinate systems are obtained;
S34, from the camera intrinsic matrices, the extrinsic matrices, and the analytical expressions in the three camera coordinate systems, the analytical expressions of the optical element edges in the global coordinate system are acquired;
S35, from the analytical expressions of the optical element edges in the global coordinate system, the position and attitude of the optical element and the dimensions of its upper surface are computed from the correspondences using a PnP algorithm.
Preferably, the process of obtaining the edge of the optical element in step S32 is:
All line segments are extracted from the three grayscale images using the LSD algorithm and processed as follows:
Step one, excessively short segments with L < 25 pixels are eliminated, the segment length L being obtained according to:

L = √((x_2 - x_1)² + (y_2 - y_1)²)

where (x_1, y_1) and (x_2, y_2) are the pixel coordinates of the two endpoints of the segment;
Step two, the set of unfused segments is denoted l_Ori and the set of fused segments is denoted l_meg; the segments remaining after the short segments are eliminated in step one are first placed in the set l_Ori;
Step three, segment fusion:
in the set l_Ori, the segment with the largest L is found and denoted l_0; segments meeting the following conditions are then searched for in l_Ori: the Euclidean distance dis from the segment midpoint to the line through l_0 is smaller than 3 pixels, and the included angle ang between the segment and l_0 is smaller than 0.5 degrees; l_0 and the segments meeting the conditions are removed from the set l_Ori;
the Euclidean distance dis and the included angle ang between the two segments are computed according to:

dis = |k·x - y + (b_1 - k·a_1)| / √(k² + 1)
ang = |arctan(Δy/Δx) - arctan(Δb/Δa)|

wherein:
(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,
k is the slope of l_0,
x = (x_1 + x_2)/2
y = (y_1 + y_2)/2
Δx = x_2 - x_1
Δy = y_2 - y_1
Δa = a_2 - a_1
Δb = b_2 - b_1
Step four, for l_0 and the segments meeting the conditions, the minimum bounding rectangle of the segment endpoints is computed using the SMBR algorithm; the pixel coordinates of the midpoints of the two short sides of the bounding rectangle are taken as the pixel coordinates of the two endpoints of the newly fused segment, and the fused segment is placed in the fused segment set l_meg;
steps three and four are repeated until the set l_Ori is empty, and then step five is executed;
Step five, erroneous segments in the set l_meg are deleted:
first, segments with L < 1200 pixels are deleted;
next, prior knowledge of the CAD model of the optical element is used to identify the segments belonging to the optical element edges; specifically, the optical element held by the clamp is reset to the predetermined position, and the CAD model of the optical element at that moment is projected into the acquired image. Since the extracted segments should come from the optical element edges, the segments representing the edges are screened by the following conditions: the Euclidean distance dis from the segment midpoint to a model edge segment extracted from the CAD projection is smaller than 15 pixels, and the included angle ang is smaller than 2 degrees; segments in the set l_meg not meeting these conditions are deleted, and the remaining segments in l_meg are taken as the optical element edges.
Preferably, the process in S34 of acquiring the analytical expressions of the optical element edges in the global coordinate system is:
according to the known camera intrinsic and extrinsic matrices, the straight line corresponding to the same optical element edge in each of the two images is converted by back projection into a plane in the camera coordinate system, yielding the three-dimensional parameters of the plane containing the line; the plane passes through the camera optical center O; the two images are a top view and a side view;
assume that an edge segment l_1 from the global camera and an edge segment l_2 from a side view angle camera correspond to the same straight line in space; l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side view angle camera determine a plane P_2; using the intrinsic and extrinsic matrices, P_1 and P_2 are transformed into the global coordinate system, in which their analytical expressions are:

P_1: a_1·X + b_1·Y + c_1·Z + d_1 = 0
P_2: a_2·X + b_2·Y + c_2·Z + d_2 = 0

where the vector (a_1, b_1, c_1) is the normal vector of plane P_1 and d_1 is the constant term that all points of plane P_1 satisfy; the vector (a_2, b_2, c_2) is the normal vector of plane P_2 and d_2 is the constant term that all points of plane P_2 satisfy; c_1 = c_2 = 1;
Because the matched lines in the images of the two cameras correspond to the same straight line in three-dimensional space, the intersection line of planes P_1 and P_2 is the target line; that is, the analytical expression of the line containing the optical element edge is:

P = p_0 + t·V

where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, t is the unit direction vector of the target line of the form t = i·X + j·Y + k·Z with coefficients satisfying i² + j² + k² = 1, and V is the scalar parameter of the line;
By repeating this method, the analytical expressions in the global coordinate system of the lines containing the two adjacent edges of the optical element upper surface nearest the two side view angle cameras are obtained:

L_1: P = p_1 + t_1·V
L_2: P = p_2 + t_2·V

where p_1 is a known point on line L_1, t_1 is the unit direction vector of L_1, p_2 is a known point on line L_2, and t_2 is the unit direction vector of L_2;
the point A = (X_M, Y_M, Z_M) minimizing the sum of distances to L_1 and L_2 is solved for; it is the upper left vertex of the optical element upper surface. On L_1 and L_2, a point B and a point C are taken at a distance of 410 mm from A, respectively, and A, B, and C jointly determine a plane:

P_0: a_0·X + b_0·Y + Z + c_0 = 0
which is the analytical expression of the optical element upper surface in the global coordinate system. The intersection lines of the upper surface with the planes of the other two edges are solved by the same method, giving the spatial analytical expressions of the lines of the remaining two edges in the global coordinate system:

L_3: P = p_3 + t_3·V
L_4: P = p_4 + t_4·V

where p_3 is a known point on line L_3, t_3 is the unit direction vector of L_3, p_4 is a known point on line L_4, and t_4 is the unit direction vector of L_4.
Preferably, the process in S35 of computing the position and attitude of the optical element and the dimensions of its upper surface is:
the positive Z-axis direction of the optical element model coordinate system O_M, expressed in the global coordinate system O_G, is defined as t_2 × t_1, and the positive X-axis direction is defined as t_2; the PnP correspondences are constructed as:
the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (X_M, Y_M, Z_M);
the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t_2;
the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2;
the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;
where Norm(t) denotes the normalization of t;
by establishing the PnP relation of these four groups of points, the transformation (R, t) from the optical element model coordinate system O_M to the global coordinate system O_G can be solved; (R, t) is taken as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix and t is the translation vector from O_M to O_G;
the position vector is t = A^T = (X_M, Y_M, Z_M)^T.
Acquiring the coordinates of the residual vertex of the upper surface of the optical element under the global coordinate system according to the solving method of the coordinates of the point A under the global coordinate system, wherein the residual vertex is B, C, D in turn clockwise,
the length of the optical element is l= | ((a+b) - (d+c))/2|| 2 The width is w= | ((a+d) - (b+c))/2|| 2
The invention also provides another technical scheme: an optical element assembly method based on an orthogonal vision system. During assembly, the pose of the optical element is acquired in real time using the pose detection method for the optical element assembly process based on an orthogonal vision system according to claims 1-8; the deviation between the current pose of the optical element and the ideal pose is computed, and the manipulator is guided to adjust the pose.
The invention has the following beneficial effects:
(1) High-precision positioning: the vision detection system composed of three cameras uses grazing light source illumination to image the optical element edges sharply and specifically, effectively improving the positioning accuracy of the transparent optical element.
(2) Reduced influence of manufacturing tolerances: by re-measuring the length and width of the optical element upper surface, the influence of manufacturing tolerances on the assembly process is reduced.
(3) Accurate six-degree-of-freedom pose measurement: based on the images acquired by the three cameras, the six-degree-of-freedom pose of the optical element is measured accurately using computer vision algorithms, providing path guidance for the subsequent assembly process.
Drawings
FIG. 1 is a schematic diagram of a method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to the present invention;
FIG. 2 is a schematic diagram of the three-dimensional calibration block, wherein FIG. 2(a) shows surface 4, FIG. 2(b) shows the three-dimensional structure of the calibration block, FIG. 2(c) shows surface 3, FIG. 2(d) shows surface 0, FIG. 2(e) shows surface 1, and FIG. 2(f) shows surface 2;
FIG. 3 is a schematic diagram of the back projection of line segments into three-dimensional space.
Description of reference numerals: 1. global camera; 2. No. 1 side view angle camera; 3. No. 2 side view angle camera; 4. FOA assembly platform; 5. optical element; 6. top view; 7. side view captured by the No. 1 side view angle camera; 8. side view captured by the No. 2 side view angle camera.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of the present invention.
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other.
The invention is further described below with reference to the drawings and specific examples, which are not intended to be limiting.
Embodiment 1: referring to fig. 1 to 3, the method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to this embodiment comprises the following steps:
S1, constructing a visual detection system;
the visual detection system comprises a global camera and two side view angle cameras, and a strip light source in front of each side view angle camera provides grazing illumination of the optical element;
S2, jointly calibrating the three cameras of the visual detection system;
the camera coordinate system of the global camera is taken as the unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, and the spatial mapping from the pixel coordinate system of each camera to the unified global coordinate system is established;
S3, solving the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; the analytical expressions of the optical element edges in the respective pixel coordinate systems are extracted from the images of the three cameras; combined with the joint calibration data, the edges in the pixel coordinate systems of the different cameras are aligned into the unified global coordinate system; and the pose of the optical element is determined from the analytical expressions of these edges.
With respect to step S1 and referring to fig. 1: the method of this embodiment is implemented on an orthogonal vision system. To image the optical element clearly and simultaneously from different angles and to provide image information for the subsequent pose detection of the optical element, the vision detection system is designed with three cameras, comprising a global camera and two side view angle cameras, with a strip light source in front of each side view angle camera providing grazing illumination of the optical element.
The optical element to be detected is placed horizontally on the FOA assembly platform, the global camera is arranged above the FOA assembly platform, and the two cameras located beside the optical element during detection are the No. 1 side view angle camera and the No. 2 side view angle camera. The working distance of the global camera is configured as 1200 mm, and the working distances of the No. 1 and No. 2 side view angle cameras are 280 mm.
The three cameras are orthogonal to each other in terms of gestures, the optical axes of the three cameras intersect at one point on the upper surface of the optical element, and the following coordinate system is established:
the global coordinate system of the global camera is O G -X G Y G Z G The optical axis direction of the global camera is vertical to the horizontal plane;
no. 1 side view camera coordinate system O C1 -X C1 Y C1 Z C1 No. 2 side view camera coordinate system O C2 -X C2 Y C2 Z C2 The optical axis directions of the two cameras are parallel to the horizontal plane;
global camera coordinate system O G X-axis forward direction and side view angle camera coordinate system O number 2 C2 Z of (2) C2 The positive directions of the axes are the same;
model coordinate system O of optical element M Is O M -X M Y M Z M And the origin is defined at the vertex A of the upper left corner of the upper surface of the optical element, Z M The axis is perpendicular to the upper surface of the optical element, Z M Upward in the positive direction of axis, X M The positive axis direction is the direction in which the vertex of the upper left corner of the upper surface of the optical element points to the vertex of the lower left corner of the upper surface of the optical element.
Regarding the joint calibration of step S2: to calibrate the intrinsic and extrinsic parameters of the three cameras of the visual detection system and establish the spatial mapping from each pixel coordinate system to a unified global three-dimensional coordinate system, the intrinsic parameters of the three cameras are first calibrated separately, establishing the spatial mapping from each pixel coordinate system to the corresponding camera coordinate system; then the camera coordinate system of the global camera is defined as the unified global coordinate system, and the coordinate transformations from the camera coordinate systems of the two side view angle cameras to the global coordinate system are calibrated separately, i.e., the extrinsic parameters of the two cameras are calibrated.
Intrinsic calibration: the intrinsic matrices of the three cameras are calibrated using the Zhang Zhengyou calibration method, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
Extrinsic calibration: taking the camera coordinate system of the global camera as the unified global coordinate system, the coordinate transformations from the camera coordinate systems of the two side view angle cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side view angle cameras;
the extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the No. 1 side view angle camera coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 is the translation vector from O_C1 to O_G;
and the transformation T_C2 = (R_C2, t_C2) from the No. 2 side view angle camera coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 is the translation vector from O_C2 to O_G.
Referring to fig. 2: in the extrinsic calibration, since the camera coordinate system of the global camera is defined as the unified global coordinate system, only the extrinsic parameters of the two side view angle cameras need to be calibrated. For the extrinsic calibration of the orthogonal visual detection system, a three-dimensional calibration block is designed specifically and combined with the AprilTag algorithm. The designed calibration block is 3D printed from resin, its shape is a cuboid, its dimensional specifications are shown in fig. 2, and its machining accuracy is ±0.05 mm. The four side surfaces and the top surface are defined in turn as surfaces 0-4 of the cuboid, and surfaces 0-4 in turn carry 36H11 AprilTag patterns with ids 0-4. The origin of the three-dimensional calibration block coordinate system O_B is defined at the center of the AprilTag pattern (id = 0) on surface 0; the X and Y axes are parallel to that surface, with positive directions from the pattern's upper left vertex to its upper right vertex and from its upper left vertex to its lower left vertex, respectively; and the Z-axis direction points toward the inside of the block;
the process of calibrating the extrinsic matrices using the AprilTag algorithm comprises the following steps:
For the extrinsic calibration of a side view angle camera, the cuboid calibration block is arranged so as to satisfy the working distance requirements of the cameras when detecting the optical element. Surface 4 of the three-dimensional calibration block is kept horizontal, one surface is arbitrarily selected from surfaces 0-3, the calibration block is placed so that the distance along the optical axis from the AprilTag pattern of the selected surface to the optical center of the No. 1 side view angle camera is 280 mm and the distance along the optical axis from the AprilTag pattern of surface 4 to the optical center of the global camera is 1200 mm, and it is ensured that the corresponding AprilTag patterns appear completely within the fields of view of the two cameras;
when the three-dimensional calibration block, the global camera, and the No. 1 side view angle camera meet these requirements, external triggering is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm is used to compute the position and attitude of each calibration pattern in the field of view relative to each camera; the transformations between the coordinate systems of all patterns on the three-dimensional calibration block and the calibration block coordinate system O_B are known from CAD data, so the poses of the calibration block at the capture moment relative to the global camera and relative to the No. 1 side view angle camera can be deduced; the transformation T_C1 = (R_C1, t_C1) between the two poses, i.e., the transformation from O_C1 to O_G, is then computed, yielding the extrinsic matrix of the No. 1 side view angle camera in O_G.
The extrinsic matrix of the No. 2 side view angle camera in O_G is acquired by the same process, giving the transformation T_C2 = (R_C2, t_C2) from the No. 2 side view angle camera coordinate system O_C2 to the global coordinate system O_G.
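The pose chaining just described can be sketched as follows; the pupil_apriltags detector stands in for whatever AprilTag implementation is actually used, and the tag size, camera intrinsics (fx, fy, cx, cy), and CAD transforms T_B_TAG are placeholder assumptions.

```python
# Sketch of the extrinsic calibration: detect AprilTags in the synchronized
# global and side images, recover the block pose per camera, and chain the
# poses. pupil_apriltags stands in for the AprilTag implementation; tag
# size, intrinsics, and the CAD transforms are assumed placeholders.
import numpy as np
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")

def tag_pose_in_camera(gray, cam_params, tag_size_m):
    """4x4 pose of the first detected tag in the camera frame."""
    det = detector.detect(gray, estimate_tag_pose=True,
                          camera_params=cam_params, tag_size=tag_size_m)[0]
    T = np.eye(4)
    T[:3, :3] = det.pose_R
    T[:3, 3] = det.pose_t.ravel()
    return T, det.tag_id

def block_pose_in_camera(gray, cam_params, tag_size_m, T_B_TAG):
    """T_B_TAG[id] maps tag frame -> block frame O_B (known from CAD)."""
    T_cam_tag, tag_id = tag_pose_in_camera(gray, cam_params, tag_size_m)
    return T_cam_tag @ np.linalg.inv(T_B_TAG[tag_id])   # camera <- block

def side_camera_extrinsics(T_G_B, T_C1_B):
    """Chain the block poses seen by the two cameras: the transform
    T_C1 = (R_C1, t_C1) from O_C1 to O_G is T_G_B @ inv(T_C1_B)."""
    return T_G_B @ np.linalg.inv(T_C1_B)
```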
Regarding step S3, the optical element pose solving process is:
S31, the optical element is placed at the preset position of the FOA platform by a clamp, the three cameras synchronously acquire a top view image I_M and two side view images I_SL and I_SF of the optical element, and the three images are preprocessed into grayscale images;
specifically, the manipulator clamps the optical element to the preset position, the grazing light source is turned on, and the illumination angle is adjusted so that the optical element edges are illuminated within the camera fields of view; the camera exposure times are adjusted according to the preset average pixel gray value of the RoI region so that the pixel gray values corresponding to the optical element edges (value range 0-255) lie between 220 and 240; the three cameras are controlled by external triggering to acquire images synchronously within 150 ms; the acquired images are the global camera image I_M with a resolution of 5120×5120 and the side view angle camera images I_SL and I_SF with a resolution of 4200×2160.
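A small sketch of this acquisition-side preprocessing is given below; the RoI coordinates and the simple proportional exposure correction are assumptions for illustration, since the text only specifies the target gray band.

```python
# Sketch of the acquisition-side preprocessing: grayscale conversion and an
# exposure correction that keeps the mean gray value of the edge RoI inside
# the 220-240 band described above. RoI coordinates and the proportional
# update rule are assumptions for illustration.
import cv2
import numpy as np

TARGET_GRAY = 230.0   # middle of the 220-240 band

def to_gray(img):
    return cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) if img.ndim == 3 else img

def exposure_scale(gray, roi):
    """Multiplicative correction for the exposure time so the RoI mean
    gray value approaches TARGET_GRAY (clipped to a sane range)."""
    x, y, w, h = roi
    mean = float(np.mean(gray[y:y + h, x:x + w]))
    return float(np.clip(TARGET_GRAY / max(mean, 1.0), 0.5, 2.0))

# The three synchronized captures (paths are placeholders)
I_M = to_gray(cv2.imread("capture/global.png"))    # 5120 x 5120
I_SL = to_gray(cv2.imread("capture/side1.png"))    # 4200 x 2160
I_SF = to_gray(cv2.imread("capture/side2.png"))    # 4200 x 2160
```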
S32, line segments are extracted from the three grayscale images using the LSD algorithm, and the segments are processed to obtain the optical element edges of the three grayscale images in the respective camera pixel coordinate systems;
The acquisition process of the optical element edges is as follows.
All line segments are extracted from the three grayscale images using the LSD algorithm and processed as follows:
Step one, excessively short segments with L < 25 pixels are eliminated, the segment length L being obtained according to:

L = √((x_2 - x_1)² + (y_2 - y_1)²)

where (x_1, y_1) and (x_2, y_2) are the pixel coordinates of the two endpoints of the segment;
Step two, the set of unfused segments is denoted l_Ori and the set of fused segments is denoted l_meg; the segments remaining after the short segments are eliminated in step one are first placed in the set l_Ori;
Step three, segment fusion:
in the set l_Ori, the segment with the largest L is found and denoted l_0; segments meeting the following conditions are then searched for in l_Ori: the Euclidean distance dis from the segment midpoint to the line through l_0 is smaller than 3 pixels, and the included angle ang between the segment and l_0 is smaller than 0.5 degrees; l_0 and the segments meeting the conditions are removed from the set l_Ori;
the Euclidean distance dis and the included angle ang between the two segments are computed according to:

dis = |k·x - y + (b_1 - k·a_1)| / √(k² + 1)
ang = |arctan(Δy/Δx) - arctan(Δb/Δa)|

wherein:
(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,
k is the slope of l_0,
x = (x_1 + x_2)/2
y = (y_1 + y_2)/2
Δx = x_2 - x_1
Δy = y_2 - y_1
Δa = a_2 - a_1
Δb = b_2 - b_1
Step four, for l_0 and the segments meeting the conditions, the minimum bounding rectangle of the segment endpoints is computed using the SMBR algorithm; the pixel coordinates of the midpoints of the two short sides of the bounding rectangle are taken as the pixel coordinates of the two endpoints of the newly fused segment, and the fused segment is placed in the fused segment set l_meg;
steps three and four are repeated until the set l_Ori is empty, and then step five is executed;
Step five, erroneous segments in the set l_meg are deleted:
first, segments with L < 1200 pixels are deleted; since the typical length of segments on the optical element edges in real captures exceeds 1200 pixels, removing segments with L < 1200 pixels deletes the excessively short segments.
Next, prior knowledge of the CAD model of the optical element is used to identify the segments belonging to the optical element edges; specifically, the optical element held by the clamp is reset to the predetermined position, and the CAD model of the optical element at that moment is projected into the acquired image. Since the extracted segments should come from the optical element edges, a segment that correctly represents an edge must simultaneously satisfy the following two conditions: the Euclidean distance dis from the segment midpoint to a model edge segment extracted from the CAD projection is smaller than 15 pixels, and the included angle ang is smaller than 2 degrees; segments in the set l_meg not meeting these conditions are deleted, and the remaining segments in l_meg are taken as the optical element edges.
S33, acquiring the analytic expressions of the edges of the optical element under three camera coordinate systems, specifically, according to the matching result in the step S32, when looking down the image I M One edge of the image I SL Or side view image I SF One edge of the two images corresponds to the same edge of the optical element, and the edges of the two images correspond to the same edge of the optical element; according to the principle, obtaining analytic formulas of three gray images under respective camera coordinate systems;
s34, according to the camera internal reference matrix, the external reference matrix and the analysis formula under the three camera coordinate systems, the analysis formula of the edge of the optical element under the global coordinate system is obtained;
the process of acquiring the analytical expressions of the optical element edges in the global coordinate system is:
referring to fig. 3, the principle of back projecting an edge in a pixel coordinate system into three-dimensional space is first described. According to the known camera intrinsic and extrinsic matrices, the straight line corresponding to the same edge of the optical element in each of the two images is converted by back projection into a plane in the camera coordinate system, yielding the three-dimensional parameters of the plane containing the line; the plane passes through the camera optical center O. The two images are a top view and a side view. As shown in fig. 3, x_1 and x_2 are two points on a line in the image and are known quantities, and X_1 and X_2 are the corresponding points in three-dimensional space. The positions of X_1 and X_2 are unknown, even with the camera calibration data; however, from the camera intrinsic parameters it can be determined by projection that the spatial points lie on the plane through O, x_1, and x_2, i.e., a line segment (straight line) on the camera pixel plane corresponds to a plane in three-dimensional space.
Assume an edge line segment l from a global camera 1 And an edge line segment l of a side view camera 2 Corresponding to the same straight line in the space, l 1 And the optical center determination plane P of the global camera 1 ,l 2 And a light center determination plane P of a side view angle camera 2 The plane P is divided into an internal reference matrix and an external reference matrix 1 、P 2 Transforming to global coordinate system to obtain P 1 、P 2 The analytical formula under the global coordinate system is:
P 1 :a 1 X+b 1 Y+c 1 Z+d 1 =0
P 2 :a 2 X+b 2 Y+c 2 Z+d 2 =0
wherein vector (a) 1 ,b 1 ,c 1 ) Is a plane P 1 Normal vector d of (d) 1 Is to satisfy all points in plane P 1 Adding a suffix; vector (a) 2 ,b 2 ,c 2 ) Is a plane P 2 Normal vector d of (d) 2 Is to satisfy all points in plane P 2 Adding a suffix; c 1 =c 2 =1;
Because the matched lines in the images of the two cameras correspond to the same straight line in three-dimensional space, the intersection line of planes P_1 and P_2 is the target line; that is, the analytical expression of the line containing the optical element edge is:

P = p_0 + t·V

where p_0 = (X_0, Y_0, Z_0) is a known point on the target line, t is the unit direction vector of the target line of the form t = i·X + j·Y + k·Z with coefficients satisfying i² + j² + k² = 1, and V is the scalar parameter of the line;
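The two geometric operations above, back projecting an image line to a plane through the optical center and intersecting two such planes, admit a compact sketch; the homogeneous-line construction and the least-squares point selection below are standard textbook forms, assumed rather than taken from the patent.

```python
# Sketch of step S34's geometry. An image line with homogeneous
# coefficients l = (a, b, c) back-projects through the projection matrix
# P = K [R | t] to the plane pi = P^T l, which contains the optical
# center. The 3D edge is then the intersection of the two planes.
import numpy as np

def line_through(p1, p2):
    """Homogeneous line through two pixel points."""
    return np.cross([p1[0], p1[1], 1.0], [p2[0], p2[1], 1.0])

def backproject_plane(K, R, t, l):
    """Plane (a, b, c, d) in the global frame containing the image line l.
    Here R, t map global coordinates into the camera frame (the inverse
    of the patent's camera-to-global extrinsics)."""
    P = K @ np.hstack([R, t.reshape(3, 1)])   # 3x4 projection matrix
    return P.T @ l                            # 4-vector (a, b, c, d)

def plane_intersection(pi1, pi2):
    """Known point p0 and unit direction v of the line pi1 intersect pi2."""
    n1, n2 = pi1[:3], pi2[:3]
    v = np.cross(n1, n2)
    v = v / np.linalg.norm(v)
    # p0: least-squares solution of n1.p = -d1 and n2.p = -d2
    A = np.vstack([n1, n2])
    b = -np.array([pi1[3], pi2[3]])
    p0 = np.linalg.lstsq(A, b, rcond=None)[0]
    return p0, v
```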
According to this principle, the analytical expressions in the global coordinate system of the lines containing the two adjacent edges of the optical element upper surface nearest the two side view angle cameras can be obtained:

L_1: P = p_1 + t_1·V
L_2: P = p_2 + t_2·V

where p_1 is a known point on line L_1, t_1 is the unit direction vector of L_1, p_2 is a known point on line L_2, and t_2 is the unit direction vector of L_2;
the point A = (X_M, Y_M, Z_M) minimizing the sum of distances to L_1 and L_2 is solved for; it is the upper left vertex of the optical element upper surface. On L_1 and L_2, a point B and a point C are taken at a distance of 410 mm from A, respectively, and A, B, and C jointly determine a plane:

P_0: a_0·X + b_0·Y + Z + c_0 = 0
which is the analytical expression of the optical element upper surface in the global coordinate system. The intersection lines of the upper surface with the planes of the other two edges are solved by the same method, giving the spatial analytical expressions of the lines of the remaining two edges in the global coordinate system:

L_3: P = p_3 + t_3·V
L_4: P = p_4 + t_4·V

where p_3 is a known point on line L_3, t_3 is the unit direction vector of L_3, p_4 is a known point on line L_4, and t_4 is the unit direction vector of L_4.
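The vertex A defined above admits a closed-form solution when the criterion is taken as least squares; the sketch below makes that assumption, with squared distance standing in for the patent's minimum sum of distances.

```python
# Sketch of solving for vertex A: the point minimizing the squared
# distances to lines L1 and L2, via the normal equations of the stacked
# point-to-line projectors.
import numpy as np

def closest_point_to_lines(points, dirs):
    """points[i], dirs[i]: a known point on line i and its unit direction."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, t in zip(points, dirs):
        M = np.eye(3) - np.outer(t, t)   # projector orthogonal to the line
        A += M
        b += M @ p
    return np.linalg.solve(A, b)

# A = closest_point_to_lines([p1, p2], [t1, t2])
# B and C then lie on L1 and L2 at 410 mm from A, e.g. B = A + 410 * t1.
```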
S35, from the analytical expressions of the optical element edges in the global coordinate system, the position and attitude of the optical element and the dimensions of its upper surface are computed from the correspondences using a PnP algorithm (an algorithm for solving pose from 3D-to-2D point correspondences).
The process of computing the position and attitude of the optical element and the dimensions of its upper surface is:
the positive Z-axis direction of the optical element model coordinate system O_M, expressed in the global coordinate system O_G, is defined as t_2 × t_1, and the positive X-axis direction is defined as t_2; the PnP correspondences are constructed as:
the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (X_M, Y_M, Z_M);
the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t_2;
the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2;
the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;
where Norm(t) denotes the normalization of t;
by establishing the PnP relation of these four groups of points, the transformation (R, t) from the optical element model coordinate system O_M to the global coordinate system O_G can be solved; (R, t) is taken as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix and t is the translation vector from O_M to O_G;
the position vector is t = A^T = (X_M, Y_M, Z_M)^T.
The coordinates of the remaining vertices of the optical element upper surface in the global coordinate system are acquired by the same method used for the coordinates of point A; in clockwise order the remaining vertices are B, C, D.
The length of the optical element is l = ‖((A + B) - (D + C))/2‖_2 and its width is w = ‖((A + D) - (B + C))/2‖_2.
Embodiment 2: described with reference to fig. 1, the method for assembling an optical element based on an orthogonal vision system according to this embodiment is implemented on the basis of the detection method of Embodiment 1, applying the six-degree-of-freedom pose of the optical element solved in Embodiment 1 to the subsequent assembly process. During assembly, the pose of the optical element is acquired in real time using the pose detection method for the optical element assembly process based on an orthogonal vision system of Embodiment 1; combining the hand-eye calibration information of the manipulator with the ideal pose the optical element should reach, the deviation between the current pose of the optical element and the ideal pose is computed, and this deviation is used to guide the manipulator to adjust the pose.
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that the different dependent claims and the features described herein may be combined in ways other than as described in the original claims. It is also to be understood that features described in connection with separate embodiments may be used in other described embodiments.

Claims (9)

1. A method for detecting the pose of an optical element assembly process based on an orthogonal vision system, characterized by comprising the following steps:
S1, constructing a visual detection system;
the visual detection system comprises a global camera and two side view angle cameras, and a strip light source in front of each side view angle camera provides grazing illumination of the optical element;
S2, jointly calibrating the three cameras of the visual detection system;
the camera coordinate system of the global camera is taken as the unified global coordinate system; the intrinsic and extrinsic parameters of the three cameras are calibrated, and the spatial mapping from the pixel coordinate system of each camera to the unified global coordinate system is established;
S3, solving the pose of the optical element;
the three cameras synchronously acquire a top view and two side views of the optical element; the analytical expressions of the optical element edges in the respective pixel coordinate systems are extracted from the images of the three cameras; combined with the joint calibration data, the edges in the pixel coordinate systems of the different cameras are aligned into the unified global coordinate system; and the pose of the optical element is determined from the analytical expressions of these edges.
2. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 1, characterized in that the optical element to be detected is placed horizontally on an FOA assembly platform, the global camera is arranged above the FOA assembly platform, and the two cameras located beside the optical element during detection are a No. 1 side view angle camera and a No. 2 side view angle camera;
the three cameras are mutually orthogonal in attitude and their optical axes intersect at one point on the upper surface of the optical element; the following coordinate systems are established:
the global coordinate system of the global camera is O_G-X_G Y_G Z_G, and the optical axis direction of the global camera is perpendicular to the horizontal plane;
the No. 1 side view angle camera coordinate system is O_C1-X_C1 Y_C1 Z_C1 and the No. 2 side view angle camera coordinate system is O_C2-X_C2 Y_C2 Z_C2; the optical axis directions of the two cameras are parallel to the horizontal plane;
the positive X-axis direction of the global camera coordinate system O_G coincides with the positive Z_C2-axis direction of the No. 2 side view angle camera coordinate system O_C2;
the model coordinate system of the optical element is O_M-X_M Y_M Z_M; its origin is defined at the vertex A at the upper left corner of the upper surface of the optical element, the Z_M axis is perpendicular to the upper surface of the optical element with its positive direction pointing upward, and the positive X_M-axis direction is the direction from the upper left vertex of the upper surface toward the lower left vertex of the upper surface.
3. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 2, wherein the joint calibration process of step S2 is as follows:
first, the intrinsic matrices of the three cameras are calibrated separately using the Zhang Zhengyou calibration method, establishing the mapping from each two-dimensional pixel coordinate system to the corresponding three-dimensional camera coordinate system;
then, taking the camera coordinate system of the global camera as the unified global coordinate system, the coordinate transformations from the camera coordinate systems of the two side view angle cameras to the global coordinate system are calibrated separately, yielding the extrinsic matrices of the two side view angle cameras;
the extrinsic matrices comprise: the transformation T_C1 = (R_C1, t_C1) from the No. 1 side view angle camera coordinate system O_C1 to the global coordinate system O_G, where R_C1 is the rotation matrix and t_C1 is the translation vector from O_C1 to O_G;
and the transformation T_C2 = (R_C2, t_C2) from the No. 2 side view angle camera coordinate system O_C2 to the global coordinate system O_G, where R_C2 is the rotation matrix and t_C2 is the translation vector from O_C2 to O_G.
4. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 2 or 3, characterized in that a three-dimensional calibration block combined with the AprilTag algorithm is used to calibrate the extrinsic matrices of the two side view angle cameras; the three-dimensional calibration block is a cuboid, its four side surfaces and top surface are defined in turn as surfaces 0-4 of the cuboid, and surfaces 0-4 in turn carry 36H11 AprilTag patterns with ids 0-4; the origin of the three-dimensional calibration block coordinate system O_B is defined at the center of the AprilTag pattern (id = 0) on surface 0, the X and Y axes are parallel to that surface with positive directions from the pattern's upper left vertex to its upper right vertex and from its upper left vertex to its lower left vertex, respectively, and the Z-axis direction points toward the inside of the block;
the process of calibrating the extrinsic matrices using the AprilTag algorithm comprises the following steps:
surface 4 of the three-dimensional calibration block is kept horizontal, one surface is arbitrarily selected from surfaces 0-3, the calibration block is placed so that the distance along the optical axis from the AprilTag pattern of the selected surface to the optical center of the No. 1 side view angle camera is 280 mm and the distance along the optical axis from the AprilTag pattern of surface 4 to the optical center of the global camera is 1200 mm, and it is ensured that the corresponding AprilTag patterns appear completely within the fields of view of the two cameras;
when the three-dimensional calibration block, the global camera, and the No. 1 side view angle camera meet these requirements, external triggering is used to make the two cameras capture images within a time difference of no more than 15 ms, and the AprilTag algorithm is used to compute the position and attitude of each calibration pattern in the field of view relative to each camera; the transformations between the coordinate systems of all patterns on the three-dimensional calibration block and the calibration block coordinate system O_B are known from CAD data, so the poses of the calibration block at the capture moment relative to the global camera and relative to the No. 1 side view angle camera can be deduced; the transformation T_C1 = (R_C1, t_C1) between the two poses, i.e., the transformation from O_C1 to O_G, is then computed, yielding the extrinsic matrix of the No. 1 side view angle camera in O_G;
the extrinsic matrix of the No. 2 side view angle camera in O_G is acquired by the same process.
5. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 3, wherein the pose calculation process of the optical element in the step S3 is as follows:
s31, placing the optical element at a preset position of the FOA assembly platform by using a clamp, so that three cameras synchronously acquire overlook images I of the optical element M And two side view images I SL 、I SF Preprocessing the three images to generate a gray level image;
s32, line segments are extracted from the three grayscale images with the LSD algorithm and processed to obtain the optical element edges of the three images in their respective camera pixel coordinate systems;
s33, the analytical expressions of the optical element edges in the three camera coordinate systems are acquired; specifically, according to the matching result of step S32, when an edge in the top-view image I_M and an edge in side-view image I_SL or I_SF both correspond to the same edge of the CAD model, the edges in the two images are considered to correspond to the same edge of the optical element; by this principle, the analytical expressions for the three grayscale images in their respective camera coordinate systems are obtained;
s34, from the camera intrinsic matrices, the extrinsic matrices, and the analytical expressions in the three camera coordinate systems, the analytical expressions of the optical element edges in the global coordinate system are acquired;
s35, from the analytical expressions of the optical element edges in the global coordinate system, the position and attitude of the optical element and the dimensions of its upper surface are calculated from the established correspondences using a PnP algorithm.
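As a minimal illustration of the S31 preprocessing (assuming OpenCV and placeholder file names), the three synchronously captured views are simply loaded as grayscale images; the subsequent steps S32-S35 are sketched after claims 6-8 below.

    import cv2

    # S31: synchronized top view and two side views, loaded directly as
    # grayscale images (file names are hypothetical placeholders).
    I_M = cv2.imread("top.png", cv2.IMREAD_GRAYSCALE)
    I_SL = cv2.imread("side_left.png", cv2.IMREAD_GRAYSCALE)
    I_SF = cv2.imread("side_front.png", cv2.IMREAD_GRAYSCALE)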
6. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 5, wherein the process of obtaining the optical element edges in step S32 is as follows:
all line segments are extracted from the three grayscale images with the LSD algorithm and processed as follows:
step one, eliminate overly short segments with L < 25 pixels, where the length L of a segment is obtained as:

L = √((x_2 − x_1)^2 + (y_2 − y_1)^2)

where (x_1, y_1), (x_2, y_2) are the pixel coordinates of the segment's two endpoints;
step two, denote the set of unfused segments l_Ori and the set of fused segments l_meg; the segments remaining after the short segments are removed in step one are first placed in set l_Ori;
step three, fusing line segments:
search set l_Ori for the segment with the largest L and denote it l_0; then find in set l_Ori the segments meeting the conditions: the Euclidean distance dis from the segment midpoint to the line containing l_0 is less than 3 pixels, and the angle ang between the segment and l_0 is less than 0.5°; remove l_0 and the segments meeting the conditions from set l_Ori;
the Euclidean distance dis and the angle ang between the two segments are calculated as:

dis = |k(x − a_1) − (y − b_1)| / √(k^2 + 1)

ang = |arctan(Δy/Δx) − arctan(Δb/Δa)|

wherein:
(a_1, b_1), (a_2, b_2) are the endpoint coordinates of l_0,
k = Δb/Δa is the slope of l_0,
x = (x_1 + x_2)/2
y = (y_1 + y_2)/2
Δx = x_2 − x_1
Δy = y_2 − y_1
Δa = a_2 − a_1
Δb = b_2 − b_1
step four, for l_0 and the segments meeting the conditions, compute the minimum bounding rectangle of their endpoints with the SMBR algorithm; take the midpoint pixel coordinates of the rectangle's two short sides as the pixel coordinates of the two endpoints of the new fused segment, and place the fused segment into the fused set l_meg;
repeat steps three and four until set l_Ori is empty, then execute step five;
step five, delete erroneous segments from set l_meg:

first, delete segments with L < 1200 pixels;

second, use prior knowledge of the optical element's CAD model to identify segments belonging to the element's edges; specifically, with the optical element clamped at its preset position, project the CAD model of the element at that moment into the acquired image, and test whether each extracted segment comes from an optical element edge using the conditions: the Euclidean distance dis from the segment midpoint to some model edge segment projected from the CAD is less than 15 pixels, and the angle ang is less than 2°; delete from set l_meg the segments not meeting these conditions, and take the remaining segments of l_meg as the optical element edges.
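A minimal Python sketch of this claim-6 segment processing is given below, assuming OpenCV's LSD detector (cv2.createLineSegmentDetector, which is absent from some OpenCV builds for licensing reasons) and using cv2.minAreaRect as the minimum-bounding-rectangle (SMBR) step; the thresholds are the ones stated in the claim, and cad_edges_px stands in for the projected CAD model edges.

    import cv2
    import numpy as np

    def seg_len(s):
        return float(np.hypot(s[2] - s[0], s[3] - s[1]))

    def dis_ang(seg, ref):
        """Midpoint-to-line distance (px) and angle (deg) between seg and ref."""
        x, y = (seg[0] + seg[2]) / 2.0, (seg[1] + seg[3]) / 2.0
        a1, b1, a2, b2 = ref
        da, db = a2 - a1, b2 - b1
        dis = abs(db * (x - a1) - da * (y - b1)) / np.hypot(da, db)
        ang = abs(np.arctan2(seg[3] - seg[1], seg[2] - seg[0])
                  - np.arctan2(db, da)) % np.pi
        return dis, np.degrees(min(ang, np.pi - ang))

    def fuse_segments(gray):
        lsd = cv2.createLineSegmentDetector()
        l_ori = [s for s in lsd.detect(gray)[0].reshape(-1, 4)
                 if seg_len(s) >= 25]                        # steps 1 and 2
        l_meg = []
        while l_ori:                                         # steps 3 and 4
            l_ori.sort(key=seg_len)
            l0 = l_ori.pop()                                 # largest L
            group, rest = [l0], []
            for s in l_ori:
                d, a = dis_ang(s, l0)
                (group if d < 3 and a < 0.5 else rest).append(s)
            l_ori = rest
            pts = np.concatenate(group).reshape(-1, 2).astype(np.float32)
            rect = cv2.minAreaRect(pts)                      # SMBR of all endpoints
            box = cv2.boxPoints(rect)                        # 4 rectangle corners
            c = np.array(rect[0])
            mids = sorted([(box[i] + box[(i + 1) % 4]) / 2 for i in range(4)],
                          key=lambda m: -np.linalg.norm(m - c))
            l_meg.append(np.hstack(mids[:2]))                # short-side midpoints
        return [s for s in l_meg if seg_len(s) >= 1200]      # step 5, length test

    def filter_by_cad(segs, cad_edges_px):
        """Step 5, CAD test: keep segments within 15 px / 2 deg of a model edge."""
        return [s for s in segs
                if any(dis_ang(s, e)[0] < 15 and dis_ang(s, e)[1] < 2
                       for e in cad_edges_px)]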
7. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 6, wherein the process in step S34 of obtaining the analytical expressions of the optical element edges in the global coordinate system is:
from the known camera intrinsic and extrinsic matrices, the straight lines corresponding to the same optical element edge in the two images are back-projected into planes in the respective camera coordinate systems, giving the three-dimensional parameters of the plane containing each line; each plane passes through the corresponding camera optical center O; here the two images are a top view and a side view;
suppose an edge segment l_1 from the global camera and an edge segment l_2 from a side-view camera correspond to the same straight line in space; l_1 and the optical center of the global camera determine a plane P_1, and l_2 and the optical center of the side-view camera determine a plane P_2; P_1 and P_2 are transformed into the global coordinate system via the intrinsic and extrinsic matrices, giving their analytical expressions in the global coordinate system:
P_1: a_1·X + b_1·Y + c_1·Z + d_1 = 0
P_2: a_2·X + b_2·Y + c_2·Z + d_2 = 0
where the vector (a_1, b_1, c_1) is the normal vector of plane P_1 and d_1 is the constant term satisfied by all points on plane P_1; the vector (a_2, b_2, c_2) is the normal vector of plane P_2 and d_2 is the constant term satisfied by all points on plane P_2; c_1 = c_2 = 1;
because the matched lines in the two cameras' images correspond to the same straight line in three-dimensional space, the intersection of planes P_1 and P_2 is the target line, i.e. the analytical expression of the optical element edge is:
P = p_0 + t·V
where p_0 = (X_0, Y_0, Z_0) is a known point on the target line and t is the unit direction vector of the target line, of the form t = iX + jY + kZ with coefficients satisfying i^2 + j^2 + k^2 = 1;
repeating this method yields, in the global coordinate system, the analytical expressions of the straight lines containing the two adjacent edges of the optical element's upper surface nearest the two side-view cameras:
L_1: P = p_1 + t_1·V
L_2: P = p_2 + t_2·V
where p_1 is a known point on line L_1, t_1 is the unit direction vector of L_1, p_2 is a known point on line L_2, and t_2 is the unit direction vector of L_2;
solve for the point A = (X_M, Y_M, Z_M) that minimizes the sum of distances to L_1 and L_2; A is the top-left vertex of the optical element's upper surface; on L_1 and L_2 respectively, take a point B and a point C at a distance of 410 mm from A, and let A, B, C jointly determine a plane:
P_0: a_0·X + b_0·Y + Z + c_0 = 0
which is the analytical expression of the optical element's upper surface in the global coordinate system; solving by the same method for the intersection lines of the upper surface with the planes of the other two edges gives the spatial analytical expressions of the straight lines containing the remaining two edges in the global coordinate system:
L_3: P = p_3 + t_3·V
L_4: P = p_4 + t_4·V
where p_3 is a known point on line L_3, t_3 is the unit direction vector of L_3, p_4 is a known point on line L_4, and t_4 is the unit direction vector of L_4.
8. The method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to claim 7, wherein the process in step S35 of calculating the position and attitude of the optical element and the dimensions of the upper surface is:
for the transformation from the optical element model coordinate system O_M to the global coordinate system O_G, the positive Z-axis direction is defined as t_2 × t_1 and the positive X-axis direction as t_2; the PnP correspondences are constructed as:
the two-dimensional point (0, 0) corresponds to the three-dimensional point A = (X_M, Y_M, Z_M);
the two-dimensional point (1, 0) corresponds to the three-dimensional point A + t_2;
the two-dimensional point (0, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2;
the two-dimensional point (1, 1) corresponds to the three-dimensional point A + Norm(t_2 × t_1) × t_2 + t_2;
where Norm(t) denotes the normalization of t;
by establishing the PnP correspondences of these four point pairs, the transformation (R, t) from the optical element model coordinate system O_M to the global coordinate system O_G can be solved; (R, t) is taken as the 6-degree-of-freedom pose of the optical element, where R is the rotation matrix from O_M to O_G and t is the translation vector from O_M to O_G;
the position vector t = A^T = (X_M, Y_M, Z_M)^T;
the coordinates of the remaining vertices of the optical element's upper surface in the global coordinate system are acquired by the same method used to solve for the coordinates of point A; the remaining vertices are, in clockwise order, B, C, D;
the length of the optical element is l = ‖((A + B) − (D + C))/2‖_2 and the width is w = ‖((A + D) − (B + C))/2‖_2.
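A compact Python sketch of this claim-8 result follows; rather than literally solving the four synthetic PnP correspondences, it assembles the equivalent closed-form pose they encode (X axis t_2, Z axis Norm(t_2 × t_1), Y completing the right-handed frame, translation A).

    import numpy as np

    def element_pose(A, t1, t2):
        """(R, t): model frame O_M -> global frame O_G, from vertex A and the
        unit edge directions t1, t2 recovered in the previous step."""
        x = t2 / np.linalg.norm(t2)
        z = np.cross(t2, t1)
        z = z / np.linalg.norm(z)            # Norm(t2 x t1)
        y = np.cross(z, x)                   # completes a right-handed frame
        R = np.column_stack([x, y, z])
        return R, np.asarray(A, dtype=float)

    def length_width(A, B, C, D):
        """Upper-surface dimensions from the vertices A, B, C, D (clockwise)."""
        l = np.linalg.norm(((A + B) - (D + C)) / 2.0)
        w = np.linalg.norm(((A + D) - (B + C)) / 2.0)
        return l, w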
9. A method for assembling an optical element based on an orthogonal vision system, characterized in that, during assembly, the pose of the optical element is obtained in real time using the method for detecting the pose of an optical element assembly process based on an orthogonal vision system according to any one of claims 1-8, the deviation between the current pose of the optical element and the ideal pose is calculated, and a manipulator is guided to adjust the pose accordingly.
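For claim 9, a schematic visual-servoing loop might look as follows; detect_pose and move_relative are hypothetical stand-ins for the detection method above and for the manipulator interface, and the tolerances are illustrative.

    import numpy as np

    def assemble(ideal_R, ideal_t, detect_pose, move_relative,
                 tol_t=1e-4, tol_r=1e-3, max_iter=50):
        """Measure the element pose, compute the deviation from the ideal pose,
        and guide the manipulator until the deviation is within tolerance."""
        for _ in range(max_iter):
            R, t = detect_pose()                     # current 6-DOF pose (R, t)
            dR = ideal_R @ R.T                       # residual rotation
            dt = ideal_t - t                         # residual translation
            ang = np.arccos(np.clip((np.trace(dR) - 1) / 2, -1.0, 1.0))
            if np.linalg.norm(dt) < tol_t and ang < tol_r:
                return True                          # pose reached
            move_relative(dR, dt)                    # adjustment command
        return False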
CN202310735104.6A 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method Active CN116758160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310735104.6A CN116758160B (en) 2023-06-20 2023-06-20 Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method

Publications (2)

Publication Number Publication Date
CN116758160A true CN116758160A (en) 2023-09-15
CN116758160B CN116758160B (en) 2024-04-26

Family

ID=87958590

Country Status (1)

Country Link
CN (1) CN116758160B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018153910A (en) * 2016-12-22 2018-10-04 セイコーエプソン株式会社 Control device, robot, and robot system
CN108470370A (en) * 2018-03-27 2018-08-31 北京建筑大学 The method that three-dimensional laser scanner external camera joint obtains three-dimensional colour point clouds
CN111009014A (en) * 2019-11-25 2020-04-14 天津大学 Calibration method of orthogonal spectral imaging pose sensor of general imaging model
CN113870358A (en) * 2021-09-17 2021-12-31 聚好看科技股份有限公司 Method and equipment for joint calibration of multiple 3D cameras
CN115953483A (en) * 2022-12-30 2023-04-11 深圳云天励飞技术股份有限公司 Parameter calibration method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
AN NING; ZHAI XIAOTONG; ZHAO HUADONG; XU BOFAN: "Application of a vision system in toy car assembly", Machinery Design & Manufacture, no. 10, 8 October 2020 (2020-10-08), pages 256-260 *

Also Published As

Publication number Publication date
CN116758160B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
US9124873B2 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
US8600192B2 (en) System and method for finding correspondence between cameras in a three-dimensional vision system
CN108109174B (en) Robot monocular guidance method and system for randomly sorting scattered parts
JP6685199B2 (en) System and method for combining machine vision coordinate spaces in a guided assembly environment
EP3067861B1 (en) Determination of a coordinate conversion parameter
US11488322B2 (en) System and method for training a model in a plurality of non-perspective cameras and determining 3D pose of an object at runtime with the same
Wöhler 3D computer vision: efficient methods and applications
Li et al. A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern
Kim et al. A camera calibration method using concentric circles for vision applications
CN112541946A (en) Real-time pose detection method of mechanical arm based on perspective multi-point projection
US11446822B2 (en) Simulation device that simulates operation of robot
CN114714356A (en) Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision
Wang et al. Phocal: A multi-modal dataset for category-level object pose estimation with photometrically challenging objects
CN107478203A (en) A kind of 3D imaging devices and imaging method based on laser scanning
Yan et al. Joint camera intrinsic and lidar-camera extrinsic calibration
CN107850425B (en) Method for measuring an article
CN114299156A (en) Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area
CN115797461A (en) Flame space positioning system calibration and correction method based on binocular vision
CN114998448A (en) Method for calibrating multi-constraint binocular fisheye camera and positioning space point
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
CN112529856A (en) Method for determining the position of an operating object, robot and automation system
CN112070844A (en) Calibration method and device of structured light system, calibration tool diagram, equipment and medium
CN116758160B (en) Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method
Tran et al. Extrinsic calibration of a camera and structured multi-line light using a rectangle
Kim et al. A new camera calibration method for robotic applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant