CN112967343B - Algorithm for guiding 2.5D mounting by 2D camera - Google Patents


Info

Publication number
CN112967343B
CN112967343B (application CN202110076028.3A)
Authority
CN
China
Prior art keywords
coordinate system
camera
user
angle
target
Prior art date
Legal status: Active
Application number
CN202110076028.3A
Other languages
Chinese (zh)
Other versions
CN112967343A (en
Inventor
石冲
何伟
丁少华
Current Assignee: Shenzhen Vision Dragon Intelligent Sensor Co ltd
Original Assignee
Shenzhen Vision Dragon Intelligent Sensor Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Vision Dragon Intelligent Sensor Co ltd filed Critical Shenzhen Vision Dragon Intelligent Sensor Co ltd
Priority to CN202110076028.3A priority Critical patent/CN112967343B/en
Publication of CN112967343A publication Critical patent/CN112967343A/en
Application granted granted Critical
Publication of CN112967343B publication Critical patent/CN112967343B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention is an algorithm realizing 2D-camera-guided 2.5D mounting. The algorithm uses 2 cameras: camera 1 photographs the object and camera 2 photographs the target. The algorithm involves an image coordinate system, an output coordinate system, a world coordinate system, a user coordinate system and the conversion of data among the four coordinate systems; the image coordinate system and the output coordinate system are plane rectangular coordinate systems, and the world coordinate system and the user coordinate system are spatial rectangular coordinate systems. The invention is suitable for cases where the object and the target are not in the same plane, provided the photographing plane and the attachment plane do not change once determined; it converts 3D mounting into 2D mounting, replacing an expensive 3D camera with an ordinary 2D industrial camera.

Description

Algorithm for guiding 2.5D mounting by 2D camera
Technical Field
The invention belongs to the technical field of cameras, and particularly relates to a method in which a 2D camera guides the mounting calibration of two objects on different planes and calculates the attachment coordinates in a spatial rectangular coordinate system.
Background
Cameras are widely used in daily life, especially for attaching and calibrating objects whose coordinate systems differ.
Generally, 3D stereoscopic targets are expensive to manufacture and limited in machining accuracy, so camera calibration based on a 2D planar target is currently the common approach.
In this calibration mode, the camera must photograph the same planar target from two or more different directions, and the camera may move freely without known motion parameters. The resulting calibration is inaccurate and easily disturbed by external factors.
Therefore, it is necessary to design a mounting and calibration method capable of guiding two objects of different planes by a 2D camera.
Disclosure of Invention
In order to solve the above problems, the primary object of the present invention is to provide an algorithm realizing 2D-camera-guided 2.5D mounting, which uses a 2D camera to guide the mounting calibration of two objects on different planes and to calculate the attachment coordinates in a spatial rectangular coordinate system.
Another object of the present invention is to provide an algorithm for implementing 2D camera guided 2.5D mounting, wherein the mounting apparatus has a simple structure, low cost and is convenient to use.
In order to achieve the above object, the technical solution of the present invention is as follows.
An algorithm realizing 2D-camera-guided 2.5D mounting uses 2 cameras: camera 1 photographs the object and camera 2 photographs the target. The algorithm involves an image coordinate system, an output coordinate system, a world coordinate system, a user coordinate system and the conversion of data among the four coordinate systems; the image coordinate system and the output coordinate system are plane rectangular coordinate systems, and the world coordinate system and the user coordinate system are spatial rectangular coordinate systems;
the algorithm comprises two parts, calibration and fitting, wherein:
the calibration algorithm is divided into three steps, wherein the internal parameters and the mounting angle of the camera 1 are calibrated in the first step;
secondly, calculating the position of the rotation center in the output coordinate system of the camera 1;
thirdly, completing mapping calibration of the camera 1 and the camera 2 through a calibration plate;
the fitting algorithm comprises the following steps:
in the first step, camera 1 photographs the object and calculates its coordinates (XB_w, YB_w) and angle RB_w in the world coordinate system;
in the second step, camera 2 photographs the target and calculates its values (XT_w, YT_w) and angle RT_w in the world coordinate system;
in the third step, the rotational offset of the object and the target is completed in the user coordinate system, with rotation angle α = RT_u - RB_u; the values of the object in the user coordinate system after rotation are (XB'_u, YB'_u);
in the fourth step, the coordinate offset (ΔX_u, ΔY_u) of the target and the rotated object in the user coordinate system is calculated;
in the fifth step, the absolute coordinates (XF_u, YF_u) and angle RF_u of the attachment position in the user coordinate system are calculated.
In summary, the present invention is applicable to a case where the fit object and the fit target are not in one plane, and the target plane and the fit plane are not changed once determined.
The algorithm relates to two poses of a 6-axis robot. When the camera 1 takes a picture, the robot uses the posture a. The robot uses pose B during mounting.
Specifically, the calibration algorithm is as follows:
in the first step, 9-point calibration is adopted. The 6-axis robot drives the characteristic points in a world coordinate system O w X w Y w The plane moves by 9 positions, the positions of the characteristic points in the world coordinate system and the image coordinate system of the camera 1 are recorded in sequence, and an output coordinate system O of the camera 1 is generated according to a 9-point calibration algorithm 1w X 1W Y 1W
In the second step, the position (XC_1w, YC_1w) of the rotation center (XC_w, YC_w) in the output coordinate system of camera 1 is calculated. The inputs are a known point before rotation (X1_1w, Y1_1w), the same point after rotation (X2_1w, Y2_1w), and the rotation angle β.
Rotation-center formula:
[Rotation-center formula: rendered as an image in the original; not reproduced here]
Any point (X0_1w, Y0_1w) in the camera 1 output coordinate system can now be converted to its coordinates (X0_w, Y0_w) in the world coordinate system:
[Formula 2: rendered as an image in the original; not reproduced here]
In the third step, cameras 1 and 2 complete the mapping calibration. First, camera 1 locates the 9 marks on a calibration plate and computes their values (X1_1w, Y1_1w), (X2_1w, Y2_1w) … (X9_1w, Y9_1w) in the camera 1 output coordinate system, and formula 2 converts these to world-coordinate values (X1_w, Y1_w), (X2_w, Y2_w) … (X9_w, Y9_w). The calibration plate is then moved by (ΔX_w, ΔY_w) to point P so that it appears in the field of view of camera 2. The world-coordinate values (X1'_w, Y1'_w), (X2'_w, Y2'_w) … (X9'_w, Y9'_w) of the 9 marks at this position are computed by formula 3. Camera 2 then locates the 9 marks in its image coordinate system and completes its own 9-point calibration from the 9 pairs of image and world coordinates, generating the output coordinate system O_2wX_2wY_2w of camera 2.
[Formula 3: rendered as an image in the original; not reproduced here]
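Formula 3 survives only as an image in this text. From the surrounding description, it is the pure translation applied when the calibration plate moves by (ΔX_w, ΔY_w); a hedged reconstruction for mark i is:

```latex
Xi'_{W} = Xi_{W} + \Delta X_{w}, \qquad Yi'_{W} = Yi_{W} + \Delta Y_{w}, \qquad i = 1, \dots, 9
```

This assumes the plate translates without rotation in the world frame, which matches the note that only the X and Y offsets are recorded.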
The specific implementation steps of the fitting algorithm are as follows:
In the first step, camera 1 photographs the object and calculates its coordinates (XB_w, YB_w) and angle RB_w in the world coordinate system.
The world coordinates (XB'_w, YB'_w) and angle RB'_w of the object at point P can be calculated by formula 3:
[Formula 3: rendered as an image in the original; not reproduced here]
The user coordinates (XB_u, YB_u) and angle RB_u of the object at point P can be calculated by formula 4:
[Formula 4: rendered as an image in the original; not reproduced here]
In the second step, camera 2 photographs the target and calculates its values (XT_w, YT_w) and angle RT_w in the world coordinate system.
The user coordinates (XT_u, YT_u) and angle RT_u of the target can be calculated by formula 4:
[Formula 4: rendered as an image in the original; not reproduced here]
The position of both the target and the object is now converted to values in the user coordinate system.
In the third step, the rotational offset of the object is completed in the user coordinate system. The rotation angle is α = RT_u - RB_u; the values of the object in the user coordinate system after rotation are (XB'_u, YB'_u):
[Formula: rendered as an image in the original; not reproduced here]
In the fourth step, the coordinate offset (ΔX_u, ΔY_u) of the target and the rotated object in the user coordinate system is calculated:
[Formula: rendered as an image in the original; not reproduced here]
In the fifth step, the absolute coordinates (XF_u, YF_u) and angle RF_u of the attachment position in the user coordinate system are calculated:
[Formula: rendered as an image in the original; not reproduced here]
The invention has the beneficial effects that:
the invention is suitable for the condition that the fit object and the fit target are not in the same plane, and the target plane and the fit plane are not changed once being determined. The invention can convert 3D mounting into 2D mounting, and replace expensive 3D camera with common 2D industrial camera. The 2D camera has higher photographing speed and higher operation speed than the 3D camera, so that the production efficiency is improved.
Drawings
Fig. 1 is a schematic diagram of a camera position implemented by the present invention.
Fig. 2 is a schematic structural diagram of a robot position realized by the invention.
Fig. 3 is a schematic diagram of a world coordinate system in which the present invention is implemented.
Fig. 4 is a schematic diagram of generating the output coordinate system O_1wX_1wY_1w according to the invention.
Fig. 5 is a schematic diagram of generating the output coordinate system O_2wX_2wY_2w according to the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, a schematic diagram of the camera positions when the invention is implemented: camera 1 and camera 2 are 2D cameras. Camera 1 is mounted vertically, facing upward, to photograph the object; camera 2 is mounted obliquely, perpendicular to the attachment plane, to photograph the target.
As shown in fig. 2, the present algorithm involves two poses of a 6-axis robot. When the camera 1 takes a picture, the robot uses the posture a. The robot uses pose B during mounting.
World coordinate system: a spatial rectangular coordinate system whose positive axis directions follow the right-hand rule (a right-handed system), e.g. O_w-X_wY_wZ_w in fig. 3.
User coordinate system: the world coordinate system rotated about the X axis (or Y axis) and translated; a spatial rectangular coordinate system whose positive axis directions follow the right-hand rule (a right-handed system), e.g. O_u-X_uY_uZ_u in fig. 3.
Image coordinate system: the native two-dimensional plane rectangular coordinate system of a single camera's image, corresponding to the physical position of the camera's photosensitive chip. Its origin is at the lower-left corner of the image, with the X axis horizontal to the right and the Y axis vertical upward, in units of pixels; e.g. O_1i-X_1iY_1i is the camera 1 image coordinate system and O_2i-X_2iY_2i the camera 2 image coordinate system in fig. 3.
Output coordinate system: the two-dimensional plane rectangular data coordinate system produced by camera calibration. Its origin depends on the point locations entered during calibration; its X and Y axes point in the same directions as those of the world coordinate system.
The calibration is divided into three steps. The first step calibrates the internal parameters and mounting angle of camera 1. The second step calculates the position of the rotation center in the output coordinate system of camera 1. The third step completes the mapping calibration of camera 1 and camera 2 through a calibration plate.
In the first step, 9-point calibration is adopted. The 6-axis robot moves the feature point to 9 positions in the O_wX_wY_w plane of the world coordinate system; the feature point's position in the world coordinate system and in the camera 1 image coordinate system is recorded at each position, and the output coordinate system O_1wX_1wY_1w of camera 1 is generated by the 9-point calibration algorithm, as shown in fig. 4.
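The 9-point calibration computation itself is not spelled out in the text. A common realization, sketched below under the assumption that the image-to-output mapping is affine (the function names are illustrative, not the patent's), fits the map to the 9 recorded correspondences by least squares:

```python
import numpy as np

def nine_point_calibration(img_pts, world_pts):
    """Fit a 2D affine map world = A @ img + t from the 9 recorded
    (image, world) point correspondences by linear least squares."""
    img_pts = np.asarray(img_pts, dtype=float)      # N x 2 image coordinates
    world_pts = np.asarray(world_pts, dtype=float)  # N x 2 world coordinates
    # Design matrix [x y 1] so that world = M @ P, with P a 3x2 parameter block.
    M = np.hstack([img_pts, np.ones((len(img_pts), 1))])
    P, *_ = np.linalg.lstsq(M, world_pts, rcond=None)
    A = P[:2].T  # 2x2 linear part (scale, rotation, shear)
    t = P[2]     # translation
    return A, t

def to_output_coords(A, t, pts):
    """Map image-coordinate points into the calibrated output coordinate system."""
    return np.asarray(pts, dtype=float) @ A.T + t
```

With the 9 robot-driven positions as `world_pts` and the corresponding image detections as `img_pts`, `to_output_coords` then plays the role of the output coordinate system O_1wX_1wY_1w.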
In the second step, the position (XC_1w, YC_1w) of the rotation center (XC_w, YC_w) in the output coordinate system of camera 1 is calculated. The inputs are a known point before rotation (X1_1w, Y1_1w), the same point after rotation (X2_1w, Y2_1w), and the rotation angle β.
Rotation-center formula:
[Rotation-center formula: rendered as an image in the original; not reproduced here]
Any point (X0_1w, Y0_1w) in the camera 1 output coordinate system can now be converted to its coordinates (X0_w, Y0_w) in the world coordinate system:
[Formula 2: rendered as an image in the original; not reproduced here]
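The rotation-center formula is only an image in this text. From standard plane geometry, the center lies on the perpendicular bisector of the chord joining the point before rotation to the point after it, at a distance of half the chord times cot(β/2); a hedged reconstruction for a counterclockwise rotation by β is:

```latex
XC_{1w} = \frac{X1_{1w} + X2_{1w}}{2} - \frac{Y2_{1w} - Y1_{1w}}{2}\cot\frac{\beta}{2}, \qquad
YC_{1w} = \frac{Y1_{1w} + Y2_{1w}}{2} + \frac{X2_{1w} - X1_{1w}}{2}\cot\frac{\beta}{2}
```

The two cot terms change sign for a clockwise rotation; the patent's own image may use a different but equivalent form.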
In the third step, cameras 1 and 2 complete the mapping calibration. First, camera 1 locates the 9 marks on the calibration plate and computes their values (X1_1w, Y1_1w), (X2_1w, Y2_1w) … (X9_1w, Y9_1w) in the camera 1 output coordinate system, and formula 2 converts these to world-coordinate values (X1_w, Y1_w), (X2_w, Y2_w) … (X9_w, Y9_w). The calibration plate is then moved by (ΔX_w, ΔY_w) to point P so that it appears in the field of view of camera 2 (because of the angle between camera 1 and camera 2, the robot pose changes while the plate moves, i.e. U and V rotate, but only the X and Y offsets need to be recorded here). The world-coordinate values (X1'_w, Y1'_w), (X2'_w, Y2'_w) … (X9'_w, Y9'_w) of the 9 marks at this position are computed by formula 3. Camera 2 then locates the 9 marks in its image coordinate system and completes its own 9-point calibration from the 9 pairs of image and world coordinates, generating the output coordinate system O_2wX_2wY_2w of camera 2, as shown in fig. 5.
[Formula 3: rendered as an image in the original; not reproduced here]
The image coordinate system O_2iX_2iY_2i of camera 2 is a plane rectangular coordinate system that does not lie in the O_wX_wY_w plane of the spatial coordinate system; it is rotated about both the X and Y axes. Because the X and Y changes caused by the robot pose change are ignored when recording (ΔX_w, ΔY_w) in the third calibration step, the plane of the camera 2 output coordinate system is assumed parallel to the O_wX_wY_w plane. As a result of the nine-point calibration, the output coordinate system of camera 2 coincides with the world coordinate system O_wX_wY_w.
Since the output coordinate system of camera 2 is virtual, its values are not coordinates under robot pose B, and the attachment position must be converted using the relative positional relationship between coordinate systems. The robot user coordinate system O_u-X_uY_uZ_u is therefore introduced: its O_uX_uY_u plane is parallel to the attachment surface, and its Z axis is perpendicular to the attachment surface.
When the calibration plate is moved to point P in the third calibration step, the value (XP_w, YP_w) of point P in the world coordinate system, its value (XP_u, YP_u) and angle RP_u in the user coordinate system, and the angle θ between the positive X axis of the world coordinate system and the positive X axis of the user coordinate system must be recorded.
Any point (XA_w, YA_w) in the world coordinate system can then be converted to its value (XA_u, YA_u) in the user coordinate system:
[Formula 4: rendered as an image in the original; not reproduced here]
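The formula 4 image is likewise not reproduced. Assuming θ measures the rotation from the world X axis to the user X axis and that point P anchors the two frames, a standard planar change of frame gives:

```latex
XA_{u} = (XA_{w} - XP_{w})\cos\theta + (YA_{w} - YP_{w})\sin\theta + XP_{u}, \qquad
YA_{u} = -(XA_{w} - XP_{w})\sin\theta + (YA_{w} - YP_{w})\cos\theta + YP_{u}
```

The sign of the sin terms depends on the sense in which θ is measured; this is a sketch of the expected form, not the patent's exact equation.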
Thereby completing the calibration step.
The fitting algorithm is described below.
In the first step, camera 1 photographs the object and calculates its coordinates (XB_w, YB_w) and angle RB_w in the world coordinate system.
The world coordinates (XB'_w, YB'_w) and angle RB'_w of the object at point P can be calculated by formula 3:
[Formula 3: rendered as an image in the original; not reproduced here]
The user coordinates (XB_u, YB_u) and angle RB_u of the object at point P can be calculated by formula 4:
[Formula 4: rendered as an image in the original; not reproduced here]
In the second step, camera 2 photographs the target and calculates its values (XT_w, YT_w) and angle RT_w in the world coordinate system.
The user coordinates (XT_u, YT_u) and angle RT_u of the target can be calculated by formula 4:
[Formula 4: rendered as an image in the original; not reproduced here]
The positions of both the target and the object are now converted into values in the user coordinate system.
In the third step, the rotational offset of the object is completed in the user coordinate system. The rotation angle is α = RT_u - RB_u; the values of the object in the user coordinate system after rotation are (XB'_u, YB'_u):
[Formula: rendered as an image in the original; not reproduced here]
In the fourth step, the coordinate offset (ΔX_u, ΔY_u) of the target and the rotated object in the user coordinate system is calculated:
[Formula: rendered as an image in the original; not reproduced here]
In the fifth step, the absolute coordinates (XF_u, YF_u) and angle RF_u of the attachment position in the user coordinate system are calculated:
[Formula: rendered as an image in the original; not reproduced here]
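The fitting-step formulas are image-only in this text. The sketch below implements a standard planar version of steps three to five, under the assumption that the robot rotates the object by α about a rotation center expressed in user coordinates (all names are illustrative, not the patent's):

```python
import math

def align(xb_u, yb_u, rb_u, xt_u, yt_u, rt_u, xc_u, yc_u):
    """Planar alignment sketch: rotate the object pose about the rotation
    center (xc_u, yc_u) by alpha, then translate it onto the target."""
    alpha = rt_u - rb_u                        # step three: rotation angle
    ca, sa = math.cos(alpha), math.sin(alpha)
    # object position after the rotation about the center
    xb2 = xc_u + (xb_u - xc_u) * ca - (yb_u - yc_u) * sa
    yb2 = yc_u + (xb_u - xc_u) * sa + (yb_u - yc_u) * ca
    dx, dy = xt_u - xb2, yt_u - yb2            # step four: offset to the target
    xf, yf, rf = xb2 + dx, yb2 + dy, rb_u + alpha  # step five: attach pose
    return alpha, (dx, dy), (xf, yf, rf)
```

In this sketch the final position coincides with the target by construction; how the rotation center and robot pose B enter the patent's exact formulas cannot be recovered from the images.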
In summary, the present invention is applicable to a case where the fit object and the fit target are not in one plane, and the target plane and the fit plane are not changed once determined.
The invention can convert 3D mounting into 2D mounting, and replace expensive 3D camera with common 2D industrial camera. The 2D camera has higher photographing speed and higher operation speed than the 3D camera, so that the production efficiency is improved.
The invention introduces the user coordinate system as an intermediate coordinate system and uses relative positional relationships, avoiding the complex X, Y and Z conversions when the robot pose changes.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions and improvements made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (3)

1. An algorithm for guiding 2.5D mounting by a 2D camera, characterized in that the algorithm uses 2 cameras, wherein camera 1 photographs the object and camera 2 photographs the target; the algorithm involves an image coordinate system, an output coordinate system, a world coordinate system, a user coordinate system and the conversion of data among the four coordinate systems, wherein the image coordinate system and the output coordinate system are plane rectangular coordinate systems, and the world coordinate system and the user coordinate system are spatial rectangular coordinate systems;
the algorithm comprises two parts, calibration and fitting, wherein:
the calibration algorithm is divided into three steps: in the first step, the internal parameters and mounting angle of camera 1 are calibrated;
in the second step, the position of the rotation center in the output coordinate system of camera 1 is calculated;
in the third step, camera 1 and camera 2 complete mapping calibration through a calibration plate;
the fitting algorithm comprises the following steps:
in the first step, camera 1 photographs the object and calculates its coordinates (XB_w, YB_w) and angle RB_w in the world coordinate system;
in the second step, camera 2 photographs the target and calculates its values (XT_w, YT_w) and angle RT_w in the world coordinate system;
in the third step, the rotational offset of the object and the target is completed in the user coordinate system, with rotation angle α = RT_u - RB_u; the values of the object in the user coordinate system after rotation are (XB'_u, YB'_u);
in the fourth step, the coordinate offset (ΔX_u, ΔY_u) of the target and the rotated object in the user coordinate system is calculated;
in the fifth step, the absolute coordinates (XF_u, YF_u) and angle RF_u of the attachment position in the user coordinate system are calculated.
2. The algorithm for guiding 2.5D mounting by a 2D camera according to claim 1, wherein the calibration algorithm is as follows:
in the first step, 9-point calibration is adopted; the 6-axis robot moves the feature point to 9 positions in the O_wX_wY_w plane of the world coordinate system, the feature point's positions in the world coordinate system and in the camera 1 image coordinate system are recorded in sequence, and the output coordinate system O_1wX_1wY_1w of camera 1 is generated by the 9-point calibration algorithm;
in the second step, the position (XC_1w, YC_1w) of the rotation center (XC_w, YC_w) in the output coordinate system of camera 1 is calculated; the inputs are a known point before rotation (X1_1w, Y1_1w), the same point after rotation (X2_1w, Y2_1w), and the rotation angle β;
rotation-center formula:
[Rotation-center formula: rendered as an image in the original; not reproduced here]
wherein (XC_1w, YC_1w) is the position in the camera 1 output coordinate system, (X1_1w, Y1_1w) is the known position before rotation, and (X2_1w, Y2_1w) is the known position after rotation; any point (X0_1w, Y0_1w) in the camera 1 output coordinate system can then be converted to coordinates (X0_w, Y0_w) in the world coordinate system:
[Formula: rendered as an image in the original; not reproduced here]
wherein (X0_1w, Y0_1w) is the position of an arbitrary point in the camera 1 output coordinate system and (X0_w, Y0_w) is the corresponding position in the world coordinate system.
3. The algorithm for guiding 2.5D mounting by a 2D camera according to claim 1, wherein the fitting algorithm is implemented as follows:
in the first step, camera 1 photographs the object and calculates its coordinates (XB_w, YB_w) and angle RB_w in the world coordinate system;
the world coordinates (XB'_w, YB'_w) and angle RB'_w of the object at point P are calculated by the following formula:
[Formula: rendered as an image in the original; not reproduced here]
wherein (XB'_w, YB'_w) are the world coordinates of the object at point P and RB'_w is the angle of the object at point P; the user coordinates (XB_u, YB_u) and angle RB_u of the object at point P are then calculated:
[Formula: rendered as an image in the original; not reproduced here]
wherein (XB_u, YB_u) are the user coordinates of the object at point P and RB_u is the angle of the object at point P;
in the second step, camera 2 photographs the target and calculates its values (XT_w, YT_w) and angle RT_w in the world coordinate system;
the user coordinates (XT_u, YT_u) and angle RT_u of the target are calculated:
[Formula: rendered as an image in the original; not reproduced here]
wherein (XT_u, YT_u) are the user coordinates of the target, RT_u is the user angle of the target, (XP_w, YP_w) is the value of point P in the world coordinate system, and (XP_u, YP_u) is the value of point P in the user coordinate system;
at this point, the positions of the target and the object have been converted into values in the user coordinate system;
in the third step, the rotational offset of the object is completed in the user coordinate system, with rotation angle α = RT_u - RB_u; the values of the object in the user coordinate system after rotation are (XB'_u, YB'_u):
[Formula: rendered as an image in the original; not reproduced here]
wherein (XB'_u, YB'_u) are the values of the object in the user coordinate system after rotation;
in the fourth step, the coordinate offset (ΔX_u, ΔY_u) of the target and the rotated object in the user coordinate system is calculated:
[Formula: rendered as an image in the original; not reproduced here]
wherein (ΔX_u, ΔY_u) is the coordinate offset after rotation;
in the fifth step, the absolute coordinates (XF_u, YF_u) and angle RF_u of the attachment position in the user coordinate system are calculated:
[Formula: rendered as an image in the original; not reproduced here]
CN202110076028.3A 2021-01-20 2021-01-20 Algorithm for guiding 2.5D mounting by 2D camera Active CN112967343B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110076028.3A CN112967343B (en) 2021-01-20 2021-01-20 Algorithm for guiding 2.5D mounting by 2D camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110076028.3A CN112967343B (en) 2021-01-20 2021-01-20 Algorithm for guiding 2.5D mounting by 2D camera

Publications (2)

Publication Number Publication Date
CN112967343A CN112967343A (en) 2021-06-15
CN112967343B true CN112967343B (en) 2023-01-06

Family

ID=76271233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110076028.3A Active CN112967343B (en) 2021-01-20 2021-01-20 Algorithm for guiding 2.5D mounting by 2D camera

Country Status (1)

Country Link
CN (1) CN112967343B (en)

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9188973B2 (en) * 2011-07-08 2015-11-17 Restoration Robotics, Inc. Calibration and transformation of a camera system's coordinate system
EP2645701A1 (en) * 2012-03-29 2013-10-02 Axis AB Method for calibrating a camera
WO2017092631A1 (en) * 2015-11-30 2017-06-08 宁波舜宇光电信息有限公司 Image distortion correction method for fisheye image, and calibration method for fisheye camera
CN106127722B (en) * 2016-05-03 2019-02-19 深圳视觉龙智能传感器有限公司 The calibration of polyphaser and contraposition applying method
US20170344112A1 (en) * 2016-05-31 2017-11-30 Fove, Inc. Gaze detection device
CN106683138B (en) * 2016-12-28 2018-01-26 华中科技大学 A kind of scaling method of stencil printer camera
JP7038345B2 (en) * 2017-04-20 2022-03-18 パナソニックIpマネジメント株式会社 Camera parameter set calculation method, camera parameter set calculation program and camera parameter set calculation device
CN107160380B (en) * 2017-07-04 2021-01-19 华南理工大学 Camera calibration and coordinate transformation method based on SCARA manipulator
CN108100353B (en) * 2018-01-29 2020-11-10 广东工业大学 Diaphragm positioning and laminating method and device
CN109685857B (en) * 2018-12-28 2023-11-24 深圳视觉龙智能传感器有限公司 Full-automatic screen printer vision calibration and alignment laminating algorithm
CN113400662B (en) * 2019-09-05 2022-08-12 深圳市巨力方视觉技术有限公司 Method and device for attaching electronic element on PCB (printed Circuit Board) and storage medium
CN111735487B (en) * 2020-05-18 2023-01-10 清华大学深圳国际研究生院 Sensor, sensor calibration method and device, and storage medium

Also Published As

Publication number Publication date
CN112967343A (en) 2021-06-15

Similar Documents

Publication Publication Date Title
CN112689135B (en) Projection correction method, projection correction device, storage medium and electronic equipment
WO2021103347A1 (en) Projector keystone correction method, apparatus, and system, and readable storage medium
Kumar et al. Simple calibration of non-overlapping cameras with a mirror
CN103115613B (en) Three-dimensional space positioning method
JP5999615B2 (en) Camera calibration information generating apparatus, camera calibration information generating method, and camera calibration information generating program
JP7486740B2 (en) System and method for efficient 3D reconstruction of an object using a telecentric line scan camera - Patents.com
CN110070598B (en) Mobile terminal for 3D scanning reconstruction and 3D scanning reconstruction method thereof
Martynov et al. Projector calibration by “inverse camera calibration”
US9418435B2 (en) Three-dimensional measurement method
CN106709865B (en) Depth image synthesis method and device
CN102472609A (en) Position and orientation calibration method and apparatus
TWI709062B (en) Virtuality reality overlapping method and system
Yamazoe et al. Easy depth sensor calibration
CN114745529A (en) Projector single TOF trapezoidal correction method and projector
WO2018032841A1 (en) Method, device and system for drawing three-dimensional image
CN116363226A (en) Real-time multi-camera multi-projector 3D imaging processing method and device
CN110136068A (en) Sound film top dome assembly system based on location position between bilateral telecentric lens camera
CN112967343B (en) Algorithm for guiding 2.5D mounting by 2D camera
Avetisyan et al. Calibration of Depth Camera Arrays.
Nagy et al. Development of an omnidirectional stereo vision system
CN113593050A (en) Binocular vision guided robot intelligent assembly method, system and device
CN207215015U (en) A kind of stereoscopic vision camera homonymy target location caliberating device
JP6697150B1 (en) Orbit calculation device, orbit calculation method, orbit calculation program
WO2011115635A1 (en) Object tracking with opposing image capture devices
Arnold et al. Formulating a Moving Camera Solution for Non-Planar Scene Estimation and Projector Calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant