CN113744343A - Hand-eye calibration method and system based on structured light sensor and storage medium - Google Patents

Hand-eye calibration method and system based on structured light sensor and storage medium Download PDF

Info

Publication number
CN113744343A
CN113744343A (application CN202110906192.2A)
Authority
CN
China
Prior art keywords
calibration
light sensor
structured light
coordinates
demonstrator
Prior art date
Legal status
Granted
Application number
CN202110906192.2A
Other languages
Chinese (zh)
Other versions
CN113744343B (en)
Inventor
刘起阳
高萌
孔德良
钟家明
陈思敏
Current Assignee
Foshan Institute Of Intelligent Equipment Technology
Original Assignee
Foshan Institute Of Intelligent Equipment Technology
Priority date
Filing date
Publication date
Application filed by Foshan Institute Of Intelligent Equipment Technology filed Critical Foshan Institute Of Intelligent Equipment Technology
Priority to CN202110906192.2A
Publication of CN113744343A
Application granted
Publication of CN113744343B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30204 Marker
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention relates to the technical field of visual calibration, and in particular to a hand-eye calibration method, system and storage medium based on a structured light sensor. The method comprises the following steps: respectively determining a tool coordinate system of the end tool and a camera coordinate system of the structured light sensor; adjusting the action part of the robot through a demonstrator and carrying out hand-eye calibration on the robot to obtain the world coordinates and a plurality of calibration coordinates of a calibration point; and determining a hand-eye calibration transformation matrix according to the world coordinates and calibration coordinates of the calibration point. The invention shortens the calibration time while ensuring the calibration precision.

Description

Hand-eye calibration method and system based on structured light sensor and storage medium
Technical Field
The invention relates to the technical field of visual calibration, in particular to a hand-eye calibration method and system based on a structured light sensor and a storage medium.
Background
Vision systems have become a hotspot in robot research: their good detection and positioning performance gives a robot the ability to sense its surroundings, and visual sensing offers rich information, high sensitivity, high precision and non-contact measurement, which is why researchers value it. At present, the images collected by visual sensing include images based on natural or ordinary artificial light, and structured light images in which structured light serves as the active illumination. Active vision can complete tasks effectively even in harsh industrial environments: line structured light projected onto an object's surface reveals fine features, an image processing method extracts those features, and accurate calibration then yields their three-dimensional coordinates. The mapping between the visual coordinate system and the robot tool coordinate system must be established by hand-eye calibration, and the calibration precision to a great extent determines the operation precision of the robot.
To improve the working accuracy of a robot vision system, the prior art offers a few methods for hand-eye calibration of a structured light ranging sensor installed at the end of a robot, mainly the following:
1) using an external high-precision three-dimensional coordinate measuring instrument;
2) first calibrating a certain point on the sensor housing, and then calibrating indirectly using the geometric dimension relationships of the sensor;
3) solving a closed-loop kinematic chain equation of the robot under a specific geometric constraint;
4) adopting a planar-template calibration method: the robot measures at different positions and postures, records the corresponding robot pose parameters, and solves the calibration problem by fitting a plane with the least squares method.
Method 1 is costly and inconvenient to operate; method 2 struggles to calibrate the posture and loses precision as the sensor wears; method 3 involves complex computation and is time-consuming because of the choice of geometric constraint; and in method 4 the parameter distribution strongly influences the result, many point positions are required for the fitting, and the procedure is time-consuming.
Disclosure of Invention
The present invention is directed to a hand-eye calibration method, system and storage medium based on a structured light sensor, so as to solve one or more technical problems in the prior art and at least to provide a beneficial alternative or to create useful conditions.
In order to achieve the purpose, the invention provides the following technical scheme:
a hand-eye calibration method based on a structured light sensor, the method comprising the following steps:
step S100, respectively determining a tool coordinate system of the end tool and a camera coordinate system of the structured light sensor;
the structured light sensor is arranged on the end tool, and the Z axis of the structured light sensor is parallel to the Z axis of the end tool; the end tool is arranged at the end of the action part of the robot; the end tool is used for clamping the structured light sensor;
s200, adjusting an action part of the robot through a demonstrator, and carrying out hand-eye calibration on the robot to obtain a world coordinate and a plurality of calibration coordinates of a calibration point;
the calibration coordinates comprise camera coordinates and a conversion matrix corresponding to the camera coordinates; the conversion matrix is a conversion matrix from a base coordinate system of the robot to a tool coordinate system; the world coordinate is the coordinate of the calibration point in a world coordinate system, and the camera coordinate is the coordinate of the calibration point in a camera coordinate system;
and S300, determining a hand-eye calibration transformation matrix according to the world coordinates of the calibration points and the plurality of calibration coordinates.
Further, the step S100 includes:
step S110, adjusting the end tool through a demonstrator so that the end tool is aligned to the tip of the cone-shaped calibration block; wherein the conical calibration block is arranged on the calibration plate;
step S120, controlling the operation of an action part of the robot through a demonstrator so as to drive the tail end tool and the structured light sensor to rotate around the tip of the conical calibration block;
step S130, in the process that the tail end tool and the structured light sensor rotate around the tip of the conical calibration block, calibrating the tail end tool and the structured light sensor respectively by using a demonstrator to obtain a plurality of point position coordinates of the tail end tool and a plurality of point position coordinates of the structured light sensor;
step S140, determining a tool coordinate system of the end tool according to the plurality of point location coordinates of the end tool, and determining a camera coordinate system according to the plurality of point location coordinates detected by the structured light sensor.
Further, the calibration plate is provided with a vertical calibration surface, a first horizontal calibration surface and a second horizontal calibration surface; the first and second horizontal calibration surfaces are parallel to each other, and the first horizontal calibration surface is higher than the second; the vertical calibration surface is perpendicular to both the first and the second horizontal calibration surfaces; the first horizontal calibration surface is provided with scale marks perpendicular to the vertical calibration surface, and the intersection points of the scale marks with the vertical calibration surface are the calibration points;
the step S200 includes:
step S210, adjusting an action part of the robot through a demonstrator so as to enable the tail end of the tool to align with the calibration point and obtain world coordinates of the calibration point;
step S220, controlling the operation of the action part of the robot through a demonstrator so as to align the structured light of the structured light sensor with the scale marks; on the premise of keeping the structured light of the structured light sensor aligned with the scale marks, obtaining camera coordinates of the structured light sensor at N different heights by adjusting the posture of the action part of the robot, and recording them as N calibration coordinates, wherein N ≥ 4 and the variation range of the structured light emitted by the structured light sensor lies in different picture areas under different postures;
and step S230, selecting 4 calibration coordinates from the N calibration coordinates.
Further, the step S220 includes:
step S221, determining the visual area of the visual sensor, dividing the visual area into a rectangular nine-square (3×3) grid, taking the central cell of the grid as the second picture, and taking the four corner cells of the grid as the third picture, the fourth picture, the fifth picture and the sixth picture respectively;
step S222, adjusting the action part of the robot through the demonstrator so that the end tool is aligned with the calibration point, and recording the tool coordinate displacement shown on the demonstrator, namely the world coordinates of the calibration point, as Pw1(x_w1, y_w1, z_w1);
step S223, controlling the operation of the action part of the robot through the demonstrator so as to align the structured light of the structured light sensor with the scale mark, with the variation range of the structured light emitted by the structured light sensor in the second picture area, to obtain a first camera coordinate P1(x_c1, y_c1, z_c1) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S224, controlling the end tool through the demonstrator to rise by a first height along the Z-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, and then controlling the end tool to adjust along the Y-axis direction so that the variation range of the structured light emitted by the structured light sensor is in the third picture area, to obtain a second camera coordinate P2(x_c2, y_c2, z_c2) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S225, controlling the end tool through the demonstrator to adjust along the Y-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, with the variation range of the structured light emitted by the structured light sensor in the fourth picture area, to obtain a third camera coordinate P3(x_c3, y_c3, z_c3) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S226, controlling the end tool through the demonstrator to descend by a second height along the Z-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, and then controlling the end tool to adjust along the Y-axis direction so that the variation range of the structured light emitted by the structured light sensor is in the fifth picture area, to obtain a fourth camera coordinate P4(x_c4, y_c4, z_c4) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S227, controlling the operation of the action part of the robot through the demonstrator so that, with the height of the end tool kept unchanged, the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in the sixth picture area, to obtain a fifth camera coordinate P5(x_c5, y_c5, z_c5) of the calibration point and the conversion matrix currently displayed by the demonstrator.
Further, after step S227, the method further includes:
controlling the operation of the action part of the robot through the demonstrator so that, with the height of the end tool kept unchanged, the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in any one picture area, to obtain a sixth camera coordinate P6(x_c6, y_c6, z_c6) and the conversion matrix currently displayed by the demonstrator.
Further, the step S200 further includes:
setting M scale marks on the first horizontal calibration surface, and collecting N calibration coordinates on each scale mark, obtaining M×N calibration coordinates in total;
and selecting 4 calibration coordinates from the M×N calibration coordinates.
Further, the step S300 includes:
setting the hand-eye transformation matrix ${}^{t}T_{c}$ as:

$$
{}^{t}T_{c}=\begin{bmatrix} a_x & b_x & c_x & p_x \\ a_y & b_y & c_y & p_y \\ a_z & b_z & c_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}
$$

according to the coordinate conversion relationship of the calibration point in the world coordinate system and the camera coordinate system, the coordinate conversion formula is:

$$
\begin{bmatrix} x_t \\ y_t \\ z_t \\ 1 \end{bmatrix}={}^{t}T_{c}\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}
$$

wherein $(x_t, y_t, z_t)$ are the coordinates of the calibration point in the tool coordinate system;

selecting 4 calibration coordinates from the N calibration coordinates and establishing, according to the coordinate conversion formula, the following equation set for each calibration coordinate $i = 1, 2, 3, 4$:

$$
\begin{cases} a_x x_{ci}+b_x y_{ci}+c_x z_{ci}+p_x=x_{ti} \\ a_y x_{ci}+b_y y_{ci}+c_y z_{ci}+p_y=y_{ti} \\ a_z x_{ci}+b_z y_{ci}+c_z z_{ci}+p_z=z_{ti} \end{cases}
$$

then combining the first lines of the 4 equation sets (and likewise the second and third lines) yields:

$$
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_x \\ b_x \\ c_x \\ p_x \end{bmatrix}=\begin{bmatrix} x_{t1} \\ x_{t2} \\ x_{t3} \\ x_{t4} \end{bmatrix},\quad
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_y \\ b_y \\ c_y \\ p_y \end{bmatrix}=\begin{bmatrix} y_{t1} \\ y_{t2} \\ y_{t3} \\ y_{t4} \end{bmatrix},\quad
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_z \\ b_z \\ c_z \\ p_z \end{bmatrix}=\begin{bmatrix} z_{t1} \\ z_{t2} \\ z_{t3} \\ z_{t4} \end{bmatrix}
$$

so that $a_x, b_x, c_x$ and $p_x$; $a_y, b_y, c_y$ and $p_y$; $a_z, b_z, c_z$ and $p_z$ can be solved;

unitizing the column vectors of the rotation matrix

$$
R=\begin{bmatrix} a_x & b_x & c_x \\ a_y & b_y & c_y \\ a_z & b_z & c_z \end{bmatrix}
$$

gives $\hat{R}$; together with the displacement offset $p=(p_x, p_y, p_z)^{T}$, the standardized hand-eye calibration matrix is obtained:

$$
{}^{t}\hat{T}_{c}=\begin{bmatrix} \hat{R} & p \\ 0 & 1 \end{bmatrix}
$$

and the standardized hand-eye calibration matrix is taken as the hand-eye calibration conversion matrix.
Further, the coordinate conversion relationship of the calibration point in the world coordinate system and the camera coordinate system is determined according to the following formula:

$$
P_w={}^{b}T_{f}\,{}^{f}T_{t}\,{}^{t}T_{c}\,P_c
$$

wherein $P_w$ is the coordinates $(x_w, y_w, z_w)$ of the calibration point in the world coordinate system, $P_c$ is the coordinates $(x_c, y_c, z_c)$ of the calibration point in the camera coordinate system, ${}^{b}T_{f}$ is the transformation matrix from the base coordinate system to the flange, ${}^{f}T_{t}$ is the transformation matrix from the flange to the tool coordinate system, and ${}^{t}T_{c}$ is the hand-eye transformation matrix.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the above-described methods for hand-eye calibration based on a structured light sensor.
A hand-eye calibration system based on a structured light sensor, the system comprising:
at least one processor;
at least one memory for storing at least one program;
when executed by the at least one processor, the at least one program causes the at least one processor to implement any of the above-described methods for hand-eye calibration based on a structured light sensor.
The invention has the following beneficial effects: the invention discloses a hand-eye calibration method, system and storage medium based on a structured light sensor, and provides a simple hand-eye calibration method for improving the working precision of a robot vision system; the method can be applied to a structured light vision sensor mounted at the robot end. In addition, because robot hand-eye calibration has redundant degrees of freedom, during operation the structured light is aligned with the standard line while intersecting the deep-groove straight line, and a dedicated algorithm acquires the feature point, which simplifies the feature point acquisition process; meanwhile, by collecting coordinate points of high linear independence, a large amount of calibration data need not be acquired. The invention shortens the calibration time while ensuring the precision. The method is simple and easy to implement: with the built-in algorithm, only the acquisition point positions need to be taught; position and posture control is accurate, efficiency is high and quality is good, so the method has high practical value.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a schematic flow chart of a hand-eye calibration method based on a structured light sensor according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of hand-eye calibration according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of the position of a sensor in an embodiment of the present invention;
FIG. 4 is a schematic view of the visual area division of the visual sensor in an embodiment of the present invention;
FIG. 5 is a schematic illustration of a calibration plate, scale lines and calibration points in an embodiment of the present invention.
Detailed Description
The conception, specific structure and technical effects of the present application will be described clearly and completely with reference to the following embodiments and the accompanying drawings, so that the purpose, scheme and effects of the present application can be fully understood. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Referring to fig. 1, a method for calibrating a hand-eye based on a structured light sensor according to an embodiment of the present application includes the following steps:
step S100, respectively determining a tool coordinate system of the end tool and a camera coordinate system of the structured light sensor; the structured light sensor is arranged on the end tool, and the Z axis of the structured light sensor is parallel to the Z axis of the end tool; the end tool is arranged at the end of the action part of the robot; the end tool is used for clamping the structured light sensor;
s200, adjusting an action part of the robot through a demonstrator, and carrying out hand-eye calibration on the robot to obtain a world coordinate and a plurality of calibration coordinates of a calibration point;
the calibration coordinates comprise camera coordinates and a conversion matrix corresponding to the camera coordinates; the conversion matrix is a conversion matrix from a base coordinate system of the robot to a tool coordinate system; the world coordinate is the coordinate of the calibration point in a world coordinate system, and the camera coordinate is the coordinate of the calibration point in a camera coordinate system;
and S300, determining a hand-eye calibration transformation matrix according to the world coordinates of the calibration points and the plurality of calibration coordinates.
It should be noted that, with the eye-in-hand mounting manner, the sensor moves together with the robot, so the feature point information detected by the vision sensor can be converted into the world coordinate system once the transformation matrix from the tool coordinate system to the camera coordinate system is known.
In the embodiment provided by the invention, the structured light sensor is a vision sensor combining a camera with line structured light: from the change of the line structured light irradiating the object surface, an image algorithm extracts the camera coordinates of the feature points. Hand-eye calibration is an important step in controlling a robot with visual information: it helps the robot convert the visual information of an identified target into the pose of the target in world coordinates, so that the subsequent control work can be completed.
The coordinates of the calibration point in the camera coordinate system are detected by the structured light sensor and serve as the camera coordinates. The end tool is controlled to touch the calibration point and, after the end tool has been calibrated, the coordinates of the end tool in the world coordinate system displayed on the demonstrator are read; these are the coordinates Pw(x_w, y_w, z_w) of the calibration point in the world coordinate system and serve as the world coordinates.
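For concreteness, the data gathered in this step can be held in a small record; a minimal sketch in Python, assuming numpy and field names of our own choosing (they are illustrative, not taken from the patent):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class CalibrationRecord:
    """One calibration coordinate: a camera-frame detection plus the conversion
    matrix the demonstrator displayed at the same instant."""
    p_cam: np.ndarray        # (3,) calibration point in the camera coordinate system
    T_base_tool: np.ndarray  # (4, 4) conversion matrix from base to tool coordinates

# The world coordinates are read once from the demonstrator after the end tool
# touches the calibration point (example values only):
p_world = np.array([412.3, -87.6, 55.1])
```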
referring to fig. 2, in some embodiments, the step S100 includes:
step S110, adjusting the end tool through a demonstrator so that the end tool is aligned to the tip of the cone-shaped calibration block; wherein the conical calibration block is arranged on the calibration plate;
step S120, controlling the operation of an action part of the robot through a demonstrator so as to drive the tail end tool and the structured light sensor to rotate around the tip of the conical calibration block;
step S130, in the process that the tail end tool and the structured light sensor rotate around the tip of the conical calibration block, calibrating the tail end tool and the structured light sensor respectively by using a demonstrator to obtain a plurality of point position coordinates of the tail end tool and a plurality of point position coordinates of the structured light sensor;
step S140, determining a tool coordinate system of the end tool according to the plurality of point location coordinates of the end tool, and determining a camera coordinate system according to the plurality of point location coordinates detected by the structured light sensor.
In this embodiment, the structured light sensor is first clamped on the end tool by a clamp, keeping the Z axis of the structured light sensor as parallel as possible to the Z axis of the tool coordinate system; the demonstrator is then used to adjust the end tool so that it is aligned with the tip of the cone calibration block and to rotate the end tool around the tip; finally, a calibration method is used to determine the tool coordinate system of the end tool and the camera coordinate system O_cX_cY_cZ_c of the structured light sensor respectively. The tool coordinates can be calibrated with the demonstrator by a three-point, six-point or twenty-point calibration method; it will be appreciated that enough point coordinates must be calibrated through the demonstrator to ensure the accuracy of the tool and camera coordinate systems.
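The rotation around the cone tip gives several flange poses that all touch the same fixed point, which is exactly the data a least-squares tool-centre-point (TCP) calibration needs. The sketch below shows one standard formulation of that computation; it is an assumption that the demonstrator's built-in three-/six-/twenty-point routines reduce to something of this form.

```python
import numpy as np

def calibrate_tcp(flange_poses):
    """Estimate the tool offset t (tool tip expressed in the flange frame) from
    base->flange poses that all touch the same fixed tip: R_i @ t + p_i = q.

    flange_poses: list of (4, 4) homogeneous base->flange transforms.
    Returns (t, q): the tool offset and the fixed tip point in the base frame.
    """
    A_rows, b_rows = [], []
    for i in range(len(flange_poses)):
        for j in range(i + 1, len(flange_poses)):
            Ri, pi = flange_poses[i][:3, :3], flange_poses[i][:3, 3]
            Rj, pj = flange_poses[j][:3, :3], flange_poses[j][:3, 3]
            # Subtracting the touch equations of two poses eliminates q:
            # (R_i - R_j) @ t = p_j - p_i
            A_rows.append(Ri - Rj)
            b_rows.append(pj - pi)
    t, *_ = np.linalg.lstsq(np.vstack(A_rows), np.concatenate(b_rows), rcond=None)
    # Recover the fixed tip point as the average of R_i @ t + p_i
    q = np.mean([T[:3, :3] @ t + T[:3, 3] for T in flange_poses], axis=0)
    return t, q
```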
With reference to fig. 3, as a further improvement of the above embodiment, the calibration plate has a vertical calibration surface, a first horizontal calibration surface and a second horizontal calibration surface; the first and second horizontal calibration surfaces are parallel to each other, and the first horizontal calibration surface is higher than the second; the vertical calibration surface is perpendicular to both the first and the second horizontal calibration surfaces; the first horizontal calibration surface is provided with scale marks perpendicular to the vertical calibration surface, and the intersection points of the scale marks with the vertical calibration surface are the calibration points;
the step S200 includes:
step S210, adjusting an action part of the robot through a demonstrator so as to enable the tail end of the tool to align with the calibration point and obtain world coordinates of the calibration point;
step S220, controlling the operation of the action part of the robot through a demonstrator so as to align the structured light of the structured light sensor with the scale marks; on the premise of keeping the structured light of the structured light sensor aligned with the scale marks, obtaining camera coordinates of the structured light sensor at N different heights by adjusting the posture of the action part of the robot, and recording them as N calibration coordinates, wherein N ≥ 4 and the variation range of the structured light emitted by the structured light sensor lies in different picture areas under different postures;
and step S230, selecting 4 calibration coordinates from the N calibration coordinates.
Referring to fig. 4 and 5, as a further modification of the above embodiment, the step S220 includes:
step S221, determining the visual area of the visual sensor, dividing the visual area into a rectangular nine-square (3×3) grid, taking the central cell as the second picture (the cell marked 2 in figure 4), and taking the four corner cells as the third, fourth, fifth and sixth pictures (the cells marked 3, 4, 5 and 6 in figure 4);
step S222, adjusting the action part of the robot through the demonstrator so that the end tool is aligned with the calibration point, and recording the tool coordinate displacement shown on the demonstrator, namely the world coordinates of the calibration point, as Pw1(x_w1, y_w1, z_w1);
step S223, controlling the operation of the action part of the robot through the demonstrator so as to align the structured light of the structured light sensor with the scale mark, with the variation range of the structured light emitted by the structured light sensor in the second picture area, to obtain a first camera coordinate P1(x_c1, y_c1, z_c1) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S224, controlling the end tool through the demonstrator to rise by a first height along the Z-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, and then controlling the end tool to adjust along the Y-axis direction so that the variation range of the structured light emitted by the structured light sensor is in the third picture area, to obtain a second camera coordinate P2(x_c2, y_c2, z_c2) of the calibration point and the conversion matrix currently displayed by the demonstrator;
It should be noted that the conversion matrix is the transformation matrix from the base coordinates of the robot to the end tool; it reflects the posture of the robot, can be read directly from the demonstrator, and changes whenever the robot moves.
In some embodiments, the end tool is disposed on a flange at the end of the action part of the robot, and the conversion matrix is determined from the conversion matrix from the base coordinate system of the robot to the flange and the conversion matrix from the flange to the tool coordinate system. For the transformation matrix ${}^{b}T_{t}$: when no tool is mounted at the robot end, the demonstrator displays the conversion matrix from the base coordinates to the flange, ${}^{b}T_{f}$, which a program built into the demonstrator calculates from the feedback data of the encoders of the robot's joint axes and displays as the pose of the end; the tool coordinate system is then calibrated to obtain ${}^{f}T_{t}$; after the calibration is finished, the built-in program of the demonstrator automatically multiplies the two and displays the conversion matrix from the base coordinate system to the tool, ${}^{b}T_{t}={}^{b}T_{f}\,{}^{f}T_{t}$.
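In matrix terms, the multiplication the built-in program performs, and the inverse mapping needed later to express a world-frame point in the tool frame, are one-liners; a sketch assuming the world and base coordinate systems coincide, as the patent's usage suggests:

```python
import numpy as np

def base_to_tool(T_base_flange, T_flange_tool):
    """The conversion matrix the demonstrator displays after tool calibration."""
    return T_base_flange @ T_flange_tool

def world_to_tool(p_world, T_base_tool):
    """Coordinates (x_t, y_t, z_t) of a point in the tool frame, given its world
    (taken here as base) coordinates and the demonstrator's conversion matrix."""
    p_h = np.append(p_world, 1.0)              # homogeneous coordinates
    return (np.linalg.inv(T_base_tool) @ p_h)[:3]
```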
In some embodiments, the demonstrator controls the action part of the robot to translate horizontally and rise by the first height along the Z-axis direction, so that the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in the third picture area; the second camera coordinate P2(x_c2, y_c2, z_c2) of the calibration point is detected by the structured light sensor, and the demonstrator displays it together with the transformation matrix from the base coordinate system to the flange. The first height is determined according to the field-of-view range of the structured light sensor; in this embodiment it ranges from 5 mm to 15 mm.
step S225, controlling the end tool through the demonstrator to adjust along the Y-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, with the variation range of the structured light emitted by the structured light sensor in the fourth picture area, to obtain a third camera coordinate P3(x_c3, y_c3, z_c3) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S226, controlling the end tool through the demonstrator to descend by a second height along the Z-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, and then controlling the end tool to adjust along the Y-axis direction so that the variation range of the structured light emitted by the structured light sensor is in the fifth picture area, to obtain a fourth camera coordinate P4(x_c4, y_c4, z_c4) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S227, controlling the operation of the action part of the robot through the demonstrator so that, with the height of the end tool kept unchanged, the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in the sixth picture area, to obtain a fifth camera coordinate P5(x_c5, y_c5, z_c5) of the calibration point and the conversion matrix currently displayed by the demonstrator.
In some improved embodiments, after step S227, the method further includes:
controlling the operation of the action part of the robot through the demonstrator so that, with the height of the end tool kept unchanged, the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in any one picture area, to obtain a sixth camera coordinate P6(x_c6, y_c6, z_c6) and the conversion matrix currently displayed by the demonstrator.
In some improved embodiments, the step S200 further includes:
setting M scale marks on the first horizontal calibration surface, and collecting N calibration coordinates on each scale mark, obtaining M×N calibration coordinates in total;
and selecting 4 calibration coordinates from the M×N calibration coordinates.
In some embodiments, two scale marks are designed on the calibration block and the same method is adopted: when the 6 points are collected, the world coordinates of the two calibration points on the two corresponding scale marks are used simultaneously. This adds steps but improves the calibration accuracy, because it increases the linear independence of the point position selection.
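The nine-square division of step S221 can also be checked in software. Below is a sketch of one way to decide which picture area the structured-light stripe currently occupies; the function, its arguments and the row-major cell numbering are our own illustration, not the patent's:

```python
def grid_cell(u, v, width, height):
    """Map a pixel (u, v) in a width x height visual area to a cell 1..9 of the
    3x3 (nine-square) grid, numbered row by row from the top-left corner."""
    col = min(int(3 * u / width), 2)
    row = min(int(3 * v / height), 2)
    return 3 * row + col + 1

# Under this numbering, cell 5 is the centre (the patent's "second picture") and
# cells 1, 3, 7, 9 are the corners (the "third" to "sixth" pictures).
```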
As a further improvement of the above embodiment, the step S300 includes:
setting the hand-eye transformation matrix ${}^{t}T_{c}$ as:

$$
{}^{t}T_{c}=\begin{bmatrix} a_x & b_x & c_x & p_x \\ a_y & b_y & c_y & p_y \\ a_z & b_z & c_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}
$$

according to the coordinate conversion relationship of the calibration point in the world coordinate system and the camera coordinate system, the coordinate conversion formula is:

$$
\begin{bmatrix} x_t \\ y_t \\ z_t \\ 1 \end{bmatrix}={}^{t}T_{c}\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}
$$

wherein $(x_t, y_t, z_t)$ are the coordinates of the calibration point in the tool coordinate system;

selecting 4 calibration coordinates from the N calibration coordinates and establishing, according to the coordinate conversion formula, the following equation set for each calibration coordinate $i = 1, 2, 3, 4$:

$$
\begin{cases} a_x x_{ci}+b_x y_{ci}+c_x z_{ci}+p_x=x_{ti} \\ a_y x_{ci}+b_y y_{ci}+c_y z_{ci}+p_y=y_{ti} \\ a_z x_{ci}+b_z y_{ci}+c_z z_{ci}+p_z=z_{ti} \end{cases}
$$

then combining the first lines of the 4 equation sets (and likewise the second and third lines) yields:

$$
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_x \\ b_x \\ c_x \\ p_x \end{bmatrix}=\begin{bmatrix} x_{t1} \\ x_{t2} \\ x_{t3} \\ x_{t4} \end{bmatrix},\quad
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_y \\ b_y \\ c_y \\ p_y \end{bmatrix}=\begin{bmatrix} y_{t1} \\ y_{t2} \\ y_{t3} \\ y_{t4} \end{bmatrix},\quad
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_z \\ b_z \\ c_z \\ p_z \end{bmatrix}=\begin{bmatrix} z_{t1} \\ z_{t2} \\ z_{t3} \\ z_{t4} \end{bmatrix}
$$

so that $a_x, b_x, c_x$ and $p_x$; $a_y, b_y, c_y$ and $p_y$; $a_z, b_z, c_z$ and $p_z$ can be solved.

Unitizing the column vectors of the rotation matrix

$$
R=\begin{bmatrix} a_x & b_x & c_x \\ a_y & b_y & c_y \\ a_z & b_z & c_z \end{bmatrix}
$$

gives $\hat{R}$; substituting the variables into the hand-eye transformation matrix and reversely solving the displacement offset $p=(p_x, p_y, p_z)^{T}$ then gives the standardized hand-eye calibration matrix

$$
{}^{t}\hat{T}_{c}=\begin{bmatrix} \hat{R} & p \\ 0 & 1 \end{bmatrix}
$$

which is taken as the hand-eye calibration conversion matrix. The variables refer to the world coordinates of the calibration point and any one camera coordinate.
It should be noted that, according to the definition of the hand-eye transformation matrix, the column vectors of the rotation matrix $R$ are mutually orthogonal unit vectors; therefore, after the parameters are obtained, the rotation matrix must be unitized to give $\hat{R}$. It can be seen that solving ${}^{t}T_{c}$ requires only four points given in the vision camera coordinates. To improve the precision of the calibration algorithm, N (N ≥ 4) different points are selected in space; for example, when 4 points are chosen out of N = 5 different points for calibration, there are 5 possible combinations. If N is too large, the improvement in the accuracy of the calibration result is limited and computing power is wasted; experiments show that taking 6 points (N = 6) lets the calibration precision meet the use standard.
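Putting the equations above together, the hand-eye matrix follows from four (camera, tool) point pairs by solving the three 4×4 linear systems, unitizing the rotation columns and reverse-solving the offset. A minimal numpy sketch of that computation, written against the stated equations rather than as the patent's verbatim procedure:

```python
import numpy as np

def solve_hand_eye(p_cam, p_tool):
    """Solve the tool->camera hand-eye matrix from 4 point pairs.

    p_cam:  (4, 3) camera coordinates of the calibration point at 4 poses.
    p_tool: (4, 3) the same point in the tool frame at those poses (world
            coordinates pushed through the inverse base->tool matrix).
    """
    A = np.hstack([p_cam, np.ones((4, 1))])  # rows [x_ci, y_ci, z_ci, 1]
    # Column k of X solves A @ X[:, k] = p_tool[:, k], so X.T stacks the
    # rows [a b c p] of the hand-eye matrix for the x_t, y_t, z_t equations.
    X = np.linalg.solve(A, p_tool)
    T = np.eye(4)
    T[:3, :4] = X.T
    R = T[:3, :3]
    T[:3, :3] = R_hat = R / np.linalg.norm(R, axis=0)  # unitize the columns
    # Reverse-solve the displacement offset from one known pair, as the text does
    T[:3, 3] = p_tool[0] - R_hat @ p_cam[0]
    return T
```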
As a further improvement of the above embodiment, the coordinate conversion relationship of the calibration point in the world coordinate system and in the camera coordinate system is determined according to the following formula:

$$
P_w={}^{b}T_{f}\,{}^{f}T_{t}\,{}^{t}T_{c}\,P_c
$$

wherein $P_w$ is the coordinates $(x_w, y_w, z_w)$ of the calibration point in the world coordinate system, $P_c$ is the coordinates $(x_c, y_c, z_c)$ of the calibration point in the camera coordinate system, ${}^{b}T_{f}$ is the transformation matrix from the base coordinate system to the flange, ${}^{f}T_{t}$ is the transformation matrix from the flange to the tool coordinate system, and ${}^{t}T_{c}$ is the hand-eye transformation matrix.

It should be noted that the purpose of calibrating against the world coordinate system and the camera coordinate system is to determine the known quantities $P_w$, $P_c$ and ${}^{b}T_{f}\,{}^{f}T_{t}$, and from them to find the hand-eye transformation matrix ${}^{t}T_{c}$.
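With ${}^{t}T_{c}$ in hand, mapping a camera-frame detection into world coordinates is a single chain of multiplications; a short usage sketch:

```python
import numpy as np

def camera_to_world(p_cam, T_base_flange, T_flange_tool, T_tool_cam):
    """P_w = bTf @ fTt @ tTc @ P_c, with P_c given in camera coordinates."""
    p_h = np.append(p_cam, 1.0)
    return (T_base_flange @ T_flange_tool @ T_tool_cam @ p_h)[:3]
```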
For the combination selection of point positions, the invention establishes a linear independence measurement standard: when the 4 selected points are close to each other, their linear independence is insufficient, so that the matrix cannot be inverted, or significant digits are lost to finite computer precision after inversion; a combination with higher linear independence therefore gives higher calibration precision. Specifically:

Let $\alpha_1, \alpha_2, \ldots, \alpha_m$ be m column vectors in n-dimensional Euclidean space and let $A=[\alpha_1, \alpha_2, \ldots, \alpha_m]$. The Gram matrix of the vector set is:

$$
G(\alpha_1,\ldots,\alpha_m)=A^{T}A=\begin{bmatrix} \langle\alpha_1,\alpha_1\rangle & \cdots & \langle\alpha_1,\alpha_m\rangle \\ \vdots & \ddots & \vdots \\ \langle\alpha_m,\alpha_1\rangle & \cdots & \langle\alpha_m,\alpha_m\rangle \end{bmatrix}
$$

The linear independence of $\alpha_1, \ldots, \alpha_m$ is then expressed by the normalized Gram determinant:

$$
\lambda=\frac{\det(G)}{\prod_{i=1}^{m}\lVert\alpha_i\rVert^{2}}
$$

and in actual calibration the group with the largest $\lambda$ is taken as the calibration variables.
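A sketch of the combination selection under this measure. We assume the measured vectors α_i are the homogeneous camera-coordinate rows [x_c, y_c, z_c, 1] of the candidate points (the rows of the very matrix the solver must invert); the patent's own figure for the expression is not reproduced, so this reading is an assumption:

```python
import numpy as np
from itertools import combinations

def independence(vectors):
    """Normalized Gram determinant det(G) / prod(||alpha_i||^2) of the given
    row vectors: 1 for mutually orthogonal sets, 0 for linearly dependent ones."""
    A = np.asarray(vectors, dtype=float).T   # columns are the alpha_i
    G = A.T @ A                              # Gram matrix
    return np.linalg.det(G) / np.prod(np.sum(A * A, axis=0))

def best_combination(p_cam_all):
    """Indices of the 4 calibration coordinates with the largest independence."""
    hom = np.hstack([p_cam_all, np.ones((len(p_cam_all), 1))])
    return max(combinations(range(len(p_cam_all)), 4),
               key=lambda idx: independence(hom[list(idx)]))
```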
When the 6 points are collected so that they are dispersed as much as possible over the effective measuring range of the sensor and the working space of the robot, the problem of insufficient linear independence of the matrix during computation is effectively avoided.
Compared with the prior art, the method provided by the invention, applied to a structured light vision sensor mounted at the robot end, has the following advantages:
1. The calibration block used by the invention calibrates the tool coordinate system, after which the structured light vision sensor obtains the camera coordinates of the target using the scale marks and the deep groove on the block.
2. With this method, the hand-eye calibration matrix can be solved by teaching only 1 or 2 world coordinates together with the 6 corresponding camera coordinates in total.
3. The specific implementation is innovative in how the postures of the sensor and the TCP are maintained, how the coordinates of the 6 points are acquired, and how the point positions are distributed on the image plane.
4. To guarantee precision, the invention measures the point position combinations obtained after acquisition by their linear independence and uses the combination with the greatest linear independence to solve the hand-eye calibration matrix.
Corresponding to the method in fig. 1, an embodiment of the present invention further provides a computer-readable storage medium, where a hand-eye calibration program based on a structured light sensor is stored on the computer-readable storage medium, and when executed by a processor, the hand-eye calibration program based on a structured light sensor implements the steps of the hand-eye calibration method based on a structured light sensor according to any of the foregoing embodiments.
Corresponding to the method in fig. 1, an embodiment of the present invention further provides a hand-eye calibration system based on a structured light sensor, where the system includes:
at least one processor;
at least one memory for storing at least one program;
when the at least one program is executed by the at least one processor, the at least one processor is enabled to implement the method for calibrating a hand-eye based on a structured light sensor according to any of the above embodiments.
The contents in the above method embodiments are all applicable to the present system embodiment, the functions specifically implemented by the present system embodiment are the same as those in the above method embodiment, and the beneficial effects achieved by the present system embodiment are also the same as those achieved by the above method embodiment.
The processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the hand-eye calibration system based on the structured light sensor, and various interfaces and lines connect the parts of the whole device on which the hand-eye calibration system based on the structured light sensor runs.
The memory can be used to store the computer program and/or modules, and the processor implements the various functions of the hand-eye calibration system based on the structured light sensor by running or executing the computer program and/or modules stored in the memory and calling the data stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to use (such as audio data or a phonebook). In addition, the memory may include high-speed random access memory and may also include non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
While the description of the present application has been made in considerable detail and with particular reference to a few illustrated embodiments, it is not intended to be limited to any such details or embodiments or any particular embodiments, but it is to be construed that the present application effectively covers the intended scope of the application by reference to the appended claims, which are interpreted in view of the broad potential of the prior art. Further, the foregoing describes the present application in terms of embodiments foreseen by the inventor for which an enabling description was available, notwithstanding that insubstantial changes from the present application, not presently foreseen, may nonetheless represent equivalents thereto.

Claims (10)

1. A hand-eye calibration method based on a structured light sensor is characterized by comprising the following steps:
step S100, respectively determining a tool coordinate system of the end tool and a camera coordinate system of the structured light sensor; the structured light sensor is arranged on the end tool, and the Z axis of the structured light sensor is parallel to the Z axis of the end tool; the end tool is arranged at the end of the action part of the robot; the end tool is used for clamping the structured light sensor;
s200, adjusting an action part of the robot through a demonstrator, and carrying out hand-eye calibration on the robot to obtain a world coordinate and a plurality of calibration coordinates of a calibration point;
the calibration coordinates comprise camera coordinates and a conversion matrix corresponding to the camera coordinates; the conversion matrix is a conversion matrix from a base coordinate system of the robot to a tool coordinate system; the world coordinate is the coordinate of the calibration point in a world coordinate system, and the camera coordinate is the coordinate of the calibration point in a camera coordinate system;
and S300, determining a hand-eye calibration transformation matrix according to the world coordinates of the calibration points and the plurality of calibration coordinates.
2. The method for calibrating a hand-eye based on a structured light sensor according to claim 1, wherein the step S100 comprises:
step S110, adjusting the end tool through a demonstrator so that the end tool is aligned to the tip of the cone-shaped calibration block; wherein the conical calibration block is arranged on the calibration plate;
step S120, controlling the operation of an action part of the robot through a demonstrator so as to drive the tail end tool and the structured light sensor to rotate around the tip of the conical calibration block;
step S130, in the process that the tail end tool and the structured light sensor rotate around the tip of the conical calibration block, calibrating the tail end tool and the structured light sensor respectively by using a demonstrator to obtain a plurality of point position coordinates of the tail end tool and a plurality of point position coordinates of the structured light sensor;
step S140, determining a tool coordinate system of the end tool according to the plurality of point location coordinates of the end tool, and determining a camera coordinate system according to the plurality of point location coordinates detected by the structured light sensor.
3. The hand-eye calibration method based on the structured light sensor as claimed in claim 2, wherein the calibration plate has a vertical calibration surface, a first horizontal calibration surface and a second horizontal calibration surface; the first and second horizontal calibration surfaces are parallel to each other, and the first horizontal calibration surface is higher than the second; the vertical calibration surface is perpendicular to both the first and the second horizontal calibration surfaces; the first horizontal calibration surface is provided with scale marks perpendicular to the vertical calibration surface, and the intersection points of the scale marks with the vertical calibration surface are the calibration points;
the step S200 includes:
step S210, adjusting an action part of the robot through a demonstrator so as to enable the tail end of the tool to align with the calibration point and obtain world coordinates of the calibration point;
step S220, controlling the operation of the action part of the robot through a demonstrator so as to align the structured light of the structured light sensor with the scale marks; on the premise of keeping the structured light of the structured light sensor aligned with the scale marks, obtaining camera coordinates of the structured light sensor at N different heights by adjusting the posture of the action part of the robot, and recording them as N calibration coordinates, wherein N ≥ 4 and the variation range of the structured light emitted by the structured light sensor lies in different picture areas under different postures;
and step S230, selecting 4 calibration coordinates from the N calibration coordinates.
4. The method for calibrating a hand-eye based on a structured light sensor as claimed in claim 3, wherein the step S220 comprises:
step S221, determining the visual area of the visual sensor, dividing the visual area into a rectangular nine-square (3×3) grid, taking the central cell of the grid as the second picture, and taking the four corner cells of the grid as the third picture, the fourth picture, the fifth picture and the sixth picture respectively;
step S222, adjusting the action part of the robot through the demonstrator so that the end tool is aligned with the calibration point, and recording the tool coordinate displacement shown on the demonstrator, namely the world coordinates of the calibration point, as Pw1(x_w1, y_w1, z_w1);
step S223, controlling the operation of the action part of the robot through the demonstrator so as to align the structured light of the structured light sensor with the scale mark, with the variation range of the structured light emitted by the structured light sensor in the second picture area, to obtain a first camera coordinate P1(x_c1, y_c1, z_c1) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S224, controlling the end tool through the demonstrator to rise by a first height along the Z-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, and then controlling the end tool to adjust along the Y-axis direction so that the variation range of the structured light emitted by the structured light sensor is in the third picture area, to obtain a second camera coordinate P2(x_c2, y_c2, z_c2) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S225, controlling the end tool through the demonstrator to adjust along the Y-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, with the variation range of the structured light emitted by the structured light sensor in the fourth picture area, to obtain a third camera coordinate P3(x_c3, y_c3, z_c3) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S226, controlling the end tool through the demonstrator to descend by a second height along the Z-axis direction so that the structured light of the structured light sensor is aligned with the scale mark, and then controlling the end tool to adjust along the Y-axis direction so that the variation range of the structured light emitted by the structured light sensor is in the fifth picture area, to obtain a fourth camera coordinate P4(x_c4, y_c4, z_c4) of the calibration point and the conversion matrix currently displayed by the demonstrator;
step S227, controlling the operation of the action part of the robot through the demonstrator so that, with the height of the end tool kept unchanged, the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in the sixth picture area, to obtain a fifth camera coordinate P5(x_c5, y_c5, z_c5) of the calibration point and the conversion matrix currently displayed by the demonstrator.
5. The method for calibrating a hand-eye based on a structured light sensor as claimed in claim 4, wherein after step S227, the method further comprises:
controlling the operation of the action part of the robot through the demonstrator so that, with the height of the end tool kept unchanged, the structured light of the structured light sensor is aligned with the scale mark and the variation range of the structured light emitted by the structured light sensor is in any one picture area, to obtain a sixth camera coordinate P6(x_c6, y_c6, z_c6) and the conversion matrix currently displayed by the demonstrator.
6. The method for calibrating a hand-eye based on a structured light sensor as claimed in claim 5, wherein the step S200 further comprises:
setting M scale marks on the first horizontal calibration surface, and collecting N calibration coordinates on each scale mark, obtaining M×N calibration coordinates in total;
and selecting 4 calibration coordinates from the M×N calibration coordinates.
7. The method for calibrating a hand-eye based on a structured light sensor according to claim 6, wherein the step S300 comprises:
setting the hand-eye transformation matrix ${}^{t}T_{c}$ as:

$$
{}^{t}T_{c}=\begin{bmatrix} a_x & b_x & c_x & p_x \\ a_y & b_y & c_y & p_y \\ a_z & b_z & c_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}
$$

according to the coordinate conversion relationship of the calibration point in the world coordinate system and the camera coordinate system, the coordinate conversion formula is:

$$
\begin{bmatrix} x_t \\ y_t \\ z_t \\ 1 \end{bmatrix}={}^{t}T_{c}\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix}
$$

wherein $(x_t, y_t, z_t)$ are the coordinates of the calibration point in the tool coordinate system;

selecting 4 calibration coordinates from the N calibration coordinates and establishing, according to the coordinate conversion formula, the following equation set for each calibration coordinate $i = 1, 2, 3, 4$:

$$
\begin{cases} a_x x_{ci}+b_x y_{ci}+c_x z_{ci}+p_x=x_{ti} \\ a_y x_{ci}+b_y y_{ci}+c_y z_{ci}+p_y=y_{ti} \\ a_z x_{ci}+b_z y_{ci}+c_z z_{ci}+p_z=z_{ti} \end{cases}
$$

then combining the first lines of the 4 equation sets (and likewise the second and third lines) yields:

$$
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_x \\ b_x \\ c_x \\ p_x \end{bmatrix}=\begin{bmatrix} x_{t1} \\ x_{t2} \\ x_{t3} \\ x_{t4} \end{bmatrix},\quad
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_y \\ b_y \\ c_y \\ p_y \end{bmatrix}=\begin{bmatrix} y_{t1} \\ y_{t2} \\ y_{t3} \\ y_{t4} \end{bmatrix},\quad
\begin{bmatrix} x_{c1} & y_{c1} & z_{c1} & 1 \\ x_{c2} & y_{c2} & z_{c2} & 1 \\ x_{c3} & y_{c3} & z_{c3} & 1 \\ x_{c4} & y_{c4} & z_{c4} & 1 \end{bmatrix}\begin{bmatrix} a_z \\ b_z \\ c_z \\ p_z \end{bmatrix}=\begin{bmatrix} z_{t1} \\ z_{t2} \\ z_{t3} \\ z_{t4} \end{bmatrix}
$$

so that $a_x, b_x, c_x$ and $p_x$; $a_y, b_y, c_y$ and $p_y$; $a_z, b_z, c_z$ and $p_z$ can be solved;

unitizing the column vectors of the rotation matrix

$$
R=\begin{bmatrix} a_x & b_x & c_x \\ a_y & b_y & c_y \\ a_z & b_z & c_z \end{bmatrix}
$$

gives $\hat{R}$; together with the displacement offset $p=(p_x, p_y, p_z)^{T}$, the standardized hand-eye calibration matrix is obtained:

$$
{}^{t}\hat{T}_{c}=\begin{bmatrix} \hat{R} & p \\ 0 & 1 \end{bmatrix}
$$

and the standardized hand-eye calibration matrix is taken as the hand-eye calibration conversion matrix.
8. The method of claim 7, wherein the coordinate transformation relationship between the calibration point in the world coordinate system and the calibration point in the camera coordinate system is determined according to the following formula:
$$
P_w={}^{b}T_{f}\,{}^{f}T_{t}\,{}^{t}T_{c}\,P_c
$$

wherein $P_w$ is the coordinates $(x_w, y_w, z_w)$ of the calibration point in the world coordinate system, $P_c$ is the coordinates $(x_c, y_c, z_c)$ of the calibration point in the camera coordinate system, ${}^{b}T_{f}$ is the transformation matrix from the base coordinate system to the flange, ${}^{f}T_{t}$ is the transformation matrix from the flange to the tool coordinate system, and ${}^{t}T_{c}$ is the hand-eye transformation matrix.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for hand-eye calibration based on a structured light sensor according to any one of claims 1 to 8.
10. A hand-eye calibration system based on a structured light sensor, the system comprising:
at least one processor;
at least one memory for storing at least one program;
wherein the at least one program, when executed by the at least one processor, causes the at least one processor to implement the method for structured light sensor-based hand-eye calibration of any one of claims 1 to 8.
CN202110906192.2A 2021-08-09 2021-08-09 Hand-eye calibration method, system and storage medium based on structured light sensor Active CN113744343B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110906192.2A CN113744343B (en) 2021-08-09 2021-08-09 Hand-eye calibration method, system and storage medium based on structured light sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110906192.2A CN113744343B (en) 2021-08-09 2021-08-09 Hand-eye calibration method, system and storage medium based on structured light sensor

Publications (2)

Publication Number Publication Date
CN113744343A true CN113744343A (en) 2021-12-03
CN113744343B CN113744343B (en) 2023-12-05

Family

ID=78730601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110906192.2A Active CN113744343B (en) 2021-08-09 2021-08-09 Hand-eye calibration method, system and storage medium based on structured light sensor

Country Status (1)

Country Link
CN (1) CN113744343B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114505864A (en) * 2022-03-11 2022-05-17 上海柏楚电子科技股份有限公司 Hand-eye calibration method, device, equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3134232A1 (en) * 2014-04-22 2017-03-01 Lappeenrannan Teknillinen Yliopisto A method and a system for generating data for calibrating a robot
CN110039523A (en) * 2019-05-20 2019-07-23 北京无远弗届科技有限公司 A kind of industrial robot vision's servo-system, servo method and device
WO2019205299A1 (en) * 2018-04-27 2019-10-31 中国农业大学 Vision measurement system structure parameter calibration and affine coordinate system construction method and system
CN110906863A (en) * 2019-10-30 2020-03-24 成都绝影智能科技有限公司 Hand-eye calibration system and calibration method for line-structured light sensor
CN111986271A (en) * 2020-09-04 2020-11-24 廊坊和易生活网络科技股份有限公司 Robot direction and hand-eye relation simultaneous calibration method based on light beam adjustment
CN112629499A (en) * 2020-12-03 2021-04-09 合肥富煌君达高科信息技术有限公司 Hand-eye calibration repeated positioning precision measuring method and device based on line scanner
CN113118604A (en) * 2021-04-23 2021-07-16 上海交通大学 High-precision projection welding error compensation system based on robot hand-eye visual feedback
US20210241491A1 (en) * 2020-02-04 2021-08-05 Mujin, Inc. Method and system for performing automatic camera calibration

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3134232A1 (en) * 2014-04-22 2017-03-01 Lappeenrannan Teknillinen Yliopisto A method and a system for generating data for calibrating a robot
WO2019205299A1 (en) * 2018-04-27 2019-10-31 中国农业大学 Vision measurement system structure parameter calibration and affine coordinate system construction method and system
CN110039523A (en) * 2019-05-20 2019-07-23 北京无远弗届科技有限公司 A kind of industrial robot vision's servo-system, servo method and device
CN110906863A (en) * 2019-10-30 2020-03-24 成都绝影智能科技有限公司 Hand-eye calibration system and calibration method for line-structured light sensor
US20210241491A1 (en) * 2020-02-04 2021-08-05 Mujin, Inc. Method and system for performing automatic camera calibration
CN111986271A (en) * 2020-09-04 2020-11-24 廊坊和易生活网络科技股份有限公司 Robot direction and hand-eye relation simultaneous calibration method based on light beam adjustment
CN112629499A (en) * 2020-12-03 2021-04-09 合肥富煌君达高科信息技术有限公司 Hand-eye calibration repeated positioning precision measuring method and device based on line scanner
CN113118604A (en) * 2021-04-23 2021-07-16 上海交通大学 High-precision projection welding error compensation system based on robot hand-eye visual feedback

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114505864A (en) * 2022-03-11 2022-05-17 上海柏楚电子科技股份有限公司 Hand-eye calibration method, device, equipment and storage medium
CN114505864B (en) * 2022-03-11 2024-02-09 上海柏楚电子科技股份有限公司 Hand-eye calibration method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113744343B (en) 2023-12-05

Similar Documents

Publication Publication Date Title
CN110555889B (en) CALTag and point cloud information-based depth camera hand-eye calibration method
JP4021413B2 (en) Measuring device
JP2690603B2 (en) Vision sensor calibration method
JP4191080B2 (en) Measuring device
CN111604598B (en) Tool setting method of mechanical arm feeding type laser etching system
CN101582165B (en) Camera array calibration algorithm based on gray level image and spatial depth data
CN109794963B (en) Robot rapid positioning method facing curved surface component
CN107256568B (en) High-precision mechanical arm hand-eye camera calibration method and calibration system
US20030090483A1 (en) Simulation apparatus for working machine
US20140188274A1 (en) Robot system display device
CN106524912B (en) Light target cursor position scaling method based on the mobile light pen of three coordinate measuring machine
CN107073719A (en) Robot and robot system
US20190176335A1 (en) Calibration and operation of vision-based manipulation systems
KR20020060570A (en) Image processing method and apparatus
CN106737674A (en) Instrument board non-linear scale visible detection method and picture write system and device
CN113744343A (en) Hand-eye calibration method and system based on structured light sensor and storage medium
CN109556510A (en) Position detecting device and computer readable storage medium
CN110640303A (en) High-precision vision positioning system and positioning calibration method thereof
CN113781558B (en) Robot vision locating method with decoupling gesture and position
CN114677429A (en) Positioning method and device of manipulator, computer equipment and storage medium
CN110842917B (en) Method for calibrating mechanical parameters of series-parallel connection machinery, electronic device and storage medium
CN115122333A (en) Robot calibration method and device, electronic equipment and storage medium
CN111028298A (en) Convergent binocular system for rigid coordinate system space transformation calibration
WO2023053395A1 (en) Position and posture measurement system
CN106979749A (en) A kind of fuzzy self-adaption method of adjustment of optical strip image imaging parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant