WO2020132924A1 - External parameter calibration method and device for a robot sensor, robot, and storage medium - Google Patents

External parameter calibration method and device for a robot sensor, robot, and storage medium

Info

Publication number
WO2020132924A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
sensor data
data
robot
coordinate
Prior art date
Application number
PCT/CN2018/123793
Other languages
English (en)
French (fr)
Inventor
熊友军
胡旭
聂鹏
Original Assignee
深圳市优必选科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市优必选科技有限公司 filed Critical 深圳市优必选科技有限公司
Priority to US16/611,475 priority Critical patent/US11590655B2/en
Publication of WO2020132924A1 publication Critical patent/WO2020132924A1/zh

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1653: Programme controls characterised by the control loop; parameters identification, estimation, stiffness, accuracy, error analysis
    • B25J9/1692: Calibration of manipulator
    • B25J9/1697: Vision controlled systems
    • B25J13/088: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices, with position, velocity or acceleration sensors
    • B25J13/089: Determining the position of the robot with reference to its environment
    • B25J19/02: Sensing devices
    • G: PHYSICS
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B9/0207: Error reduction by correction of the measurement signal based on independently determined error sources
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B21/04: Measuring arrangements for measuring length, width, or thickness by measuring coordinates of points
    • G05B2219/39024: Calibration of manipulator

Definitions

  • the present invention relates to the field of computer processing, and in particular to an external parameter calibration method and device for a robot sensor, a robot, and a storage medium.
  • a robot system usually needs to fuse the data of multiple sensors to work together. These sensors have different perception characteristics and are mounted at different positions on the robot body, so each sensor perceives environmental information differently.
  • when the sensor data are converted to the same coordinate system, this deviation produces a positional offset between the data sets, and the same object in the scene is mapped to different positions. Such deviation causes the robot to perceive the environment incorrectly and degrades the robot's performance.
  • an embodiment of the present invention provides a calibration method for external parameters of a robot sensor.
  • the method includes:
  • when the relative positional relationship between the robot and the calibration reference object changes, the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot collecting the position information of the calibration reference object is repeated, until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
  • the external parameter between the first sensor and the second sensor is calculated according to the N sets of coordinate data, and the external parameter is a relative positional relationship parameter between the first sensor and the second sensor.
  • an embodiment of the present invention provides an external parameter calibration device for a robot sensor.
  • the device includes:
  • the collection module is used to obtain the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot collecting the position information of the calibration reference object;
  • the conversion module is used to convert the first sensor data and the second sensor data to the same coordinate system, correspondingly obtaining the first converted sensor data and the second converted sensor data;
  • the determining module is configured to determine the first coordinate position of the reference point in the calibration reference object according to the first converted sensor data, determine the second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and use the first coordinate position and the second coordinate position as a set of coordinate data;
  • the loop module is used to, when the relative positional relationship between the robot and the calibration reference object changes, repeat the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot collecting the position information of the calibration reference object, until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter;
  • the calculation module is used to calculate the external parameter between the first sensor and the second sensor based on the N sets of coordinate data, the external parameter being the relative positional relationship parameter between the first sensor and the second sensor.
  • an embodiment of the present invention provides a robot including a memory and a processor.
  • the memory stores a computer program, and when the computer program is executed by the processor, the processor is caused to perform the following steps:
  • when the relative positional relationship between the robot and the calibration reference object changes, the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot collecting the position information of the calibration reference object is repeated, until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
  • the external parameter between the first sensor and the second sensor is calculated according to the N sets of coordinate data, and the external parameter is a relative positional relationship parameter between the first sensor and the second sensor.
  • an embodiment of the present invention provides a computer-readable storage medium that stores a computer program.
  • when the computer program is executed by a processor, the processor is caused to perform the following steps:
  • when the relative positional relationship between the robot and the calibration reference object changes, the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot collecting the position information of the calibration reference object is repeated, until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
  • the external parameter between the first sensor and the second sensor is calculated according to the N sets of coordinate data, and the external parameter is a relative positional relationship parameter between the first sensor and the second sensor.
  • the external parameter calibration method, device, robot and storage medium of the robot sensor described above collect the position information of the calibration reference object through the first sensor and the second sensor in the robot to obtain the first sensor data and the second sensor data, which are converted to the same coordinate system to obtain the first converted sensor data and the second converted sensor data. The first coordinate position of the reference point in the calibration reference object is then obtained according to the first converted sensor data, and the second coordinate position according to the second converted sensor data; the first coordinate position and the second coordinate position are used as a set of coordinate data.
  • by changing the relative positional relationship between the robot and the calibration reference object, N sets of coordinate data are collected, where N is a positive integer not less than the number of unknown degrees of freedom contained in the external parameter, and the relative positional relationship parameter between the first sensor and the second sensor is calculated according to the N sets of coordinate data. It is only necessary to move the calibration reference object and collect N sets of coordinate data to solve for the positional relationship parameters of the first sensor and the second sensor. Not only is the calibration simple, but the deviation is greatly reduced, and the consistency of the first sensor and the second sensor in the robot coordinate system is improved, thereby improving the performance of the robot.
  • FIG. 1 is an application environment diagram of an external parameter calibration method of a robot sensor in an embodiment
  • FIG. 2 is a flowchart of an external parameter calibration method of a robot sensor in an embodiment
  • FIG. 3 is a schematic diagram of determining the first and second coordinate positions of a reference point in an embodiment
  • FIG. 4 is a schematic diagram of the result of mapping the first sensor data and the second sensor data to the robot coordinate system in an embodiment
  • FIG. 5 is a schematic diagram of determining the position of a reference point by fitting two triangle sides in an embodiment
  • FIG. 6 is a schematic flowchart of an external parameter calibration method of a robot sensor in an embodiment
  • FIG. 7 is a frame structure diagram of an external parameter calibration device of a robot sensor in an embodiment
  • FIG. 8 is an internal structure diagram of the robot in an embodiment.
  • FIG. 1 is an application environment diagram of an external parameter calibration method of a robot sensor in an embodiment.
  • the external parameter calibration method of the robot sensor is applied to the external parameter calibration system of the robot sensor.
  • the external reference calibration system of the robot sensor includes a robot 110 and a calibration reference 120.
  • the robot 110 collects the position of the calibration reference object 120 through the first sensor (for example, a single-line radar) and the second sensor (for example, a camera) to obtain the first sensor data and the second sensor data, and converts the first sensor data and the second sensor data to the same coordinate system, correspondingly obtaining the first converted sensor data and the second converted sensor data.
  • the first coordinate position of the reference point in the calibration reference object is determined according to the first converted sensor data, and the second coordinate position of the reference point is determined according to the second converted sensor data; the first coordinate position and the second coordinate position are used as a set of coordinate data. The calibration reference object 120 is then moved and the above process repeated until N sets of coordinate data are obtained, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter. Finally, the external parameter between the first sensor and the second sensor is calculated based on the N sets of coordinate data.
  • the external parameter is the relative positional relationship parameter between the first sensor and the second sensor.
  • an external parameter calibration method of a robot sensor is proposed.
  • the external parameter calibration method of the robot sensor is applied to a robot, and specifically includes the following steps:
  • Step 202 Obtain first sensor data and second sensor data obtained by collecting the position information of the calibration reference object by the first sensor and the second sensor in the robot.
  • the first sensor and the second sensor are two different sensors in the robot.
  • the first sensor may be a radar
  • the second sensor may be a visual sensor (camera).
  • Both the first sensor and the second sensor can be used to collect the position information of the calibration reference object.
  • the calibration reference object refers to a reference object used to assist calibration. The first sensor data and the second sensor data give the positions of a series of detected points; in order to clearly identify a particular point among them, an object with an obvious turning point is generally selected as the calibration reference object.
  • the calibration reference object may use a triangular cylinder, and the triangular cylinder includes an angle. The vertex of the angle is an obvious turning point, so that the position of the turning point can be quickly determined later.
  • other objects with obvious turning points can also be used, such as triangular cones or cubes.
  • the first sensor and the second sensor acquire the first sensor data and the second sensor data by collecting the position of the same calibration reference object at the same location, which facilitates subsequent calibration of the external parameter according to the difference between the first sensor data and the second sensor data.
  • external parameters, i.e. extrinsic parameters, refer to the relative positional relationship between sensors.
  • specifically, the robot and the calibration reference object are placed on the same level ground; the robot is kept stationary, the calibration reference object is placed in front of the robot, and the calibration reference object is measured by the first sensor and the second sensor to collect the first sensor data and the second sensor data.
  • when placing the calibration reference object, a distance range and an area range in which both the first sensor and the second sensor have good measurement accuracy are selected. If the first sensor is a radar and the second sensor is a camera, the closer the measurement distance, the higher the accuracy, so a short-range distance, for example 0.3 m to 2.0 m, is preferably selected. Since the edge of the camera's field of view is distorted, the middle area of the field of view can be selected as the area range.
  • Step 204 Convert the first sensor data and the second sensor data to the same coordinate system, and obtain the first converted sensor data and the second converted sensor data accordingly.
  • the first sensor data collected by the first sensor is data in the first sensor coordinate system
  • the second sensor data collected by the second sensor is data in the second sensor coordinate system.
  • the first sensor data and the second sensor data may be uniformly converted to the first sensor coordinate system or the second sensor coordinate system, so that the two are in the same coordinate system, obtaining the first converted sensor data and the second converted sensor data.
  • in this embodiment, the first sensor data and the second sensor data are each converted to the robot coordinate system, correspondingly obtaining the first converted sensor data and the second converted sensor data.
  • the robot coordinate system is XOY, where the X axis points directly in front of the robot, the Y axis points to the left side of the robot and is perpendicular to the X axis, and the XOY plane is parallel to the ground.
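As an illustration of this conversion step, a minimal planar sensor-to-robot transform can be sketched as follows. The function and parameter names are illustrative (they do not come from the patent), and the mounting pose is assumed to be known from the robot's design values, as the detailed description later notes:

```python
import numpy as np

def sensor_to_robot(points, mount_x, mount_y, mount_yaw):
    """Convert 2D points from a sensor frame into the robot frame XOY.

    (mount_x, mount_y, mount_yaw) is the sensor's planar mounting pose on
    the robot body, taken from the robot's design values.
    """
    pts = np.asarray(points, dtype=float)           # shape (N, 2)
    c, s = np.cos(mount_yaw), np.sin(mount_yaw)
    R = np.array([[c, -s], [s, c]])                 # planar rotation
    return pts @ R.T + np.array([mount_x, mount_y])

# A point 1 m ahead of a sensor mounted 0.2 m forward of the robot origin
# (with no rotation) lies 1.2 m ahead in the robot frame.
p_robot = sensor_to_robot([[1.0, 0.0]], 0.2, 0.0, 0.0)
```

Mapping both sensors' data through their respective mounting poses makes the two data sets directly comparable in the robot frame; any residual offset between them is the deviation that the calibration then corrects.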
  • Step 206: Determine the first coordinate position of the reference point in the calibration reference object according to the first converted sensor data, determine the second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and use the first coordinate position and the second coordinate position as a set of coordinate data.
  • the reference point is a point in the calibration reference object used as a reference for calculation.
  • the reference point is generally a point that is easy to identify and distinguish.
  • for a triangular cylinder, the data curve obtained by projection is a triangle, and the vertex of the triangle is the most easily recognized point, so the vertex of the included angle of the triangular cylinder can be used as the reference point.
  • the first coordinate position of the reference point is determined based on the first converted sensor data
  • the second coordinate position of the reference point is determined based on the second converted sensor data.
  • FIG. 3 is a schematic diagram of determining the first coordinate position and the second coordinate position of the reference point in an embodiment.
  • since the reference-point coordinates determined by the two sensors have a certain error, the black points are the coordinates of the reference point determined by each sensor.
  • the recorded first coordinate position is (x0, y0), and the second coordinate position is (x'0, y'0).
  • the first coordinate position and the second coordinate position correspond to the coordinates of the same reference point, and the first coordinate position and the second coordinate position constitute a set of coordinate data.
  • in step 208, when the relative positional relationship between the robot and the calibration reference object changes, the process returns to step 202 and repeats.
  • in step 210, N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter.
  • N should be chosen as a positive integer not less than the number of unknown degrees of freedom. For example, in three-dimensional space the external parameter contains 6 degrees of freedom [x, y, z, roll, pitch, yaw], where roll is rotation about the x axis, pitch is rotation about the y axis, and yaw is rotation about the z axis; N should then be greater than or equal to 6.
  • for a 2D navigation application, the z axis is ignored; the x axis points directly in front of the robot, the y axis points to the left of the robot, perpendicular to the x axis, and the xoy plane is parallel to the ground.
  • the degrees of freedom of the external parameter are then reduced to 3, [x, y, yaw], and N should be greater than or equal to 3.
  • step 212 external parameters between the first sensor and the second sensor are calculated according to the N sets of coordinate data, and the external parameters are relative positional relationship parameters between the first sensor and the second sensor.
  • external parameters, i.e. extrinsic parameters, refer to the relative positional relationship between sensors.
  • the external parameter between the first sensor and the second sensor is calculated by acquiring the conversion relationship between the first sensor and the second sensor.
  • specifically, the relative positional relationship between the first sensor and the second sensor can be calculated using the least squares method. For a robot 2D navigation application there are three unknown degrees of freedom, so when there are at least three sets of coordinate data, the corresponding values of the three degrees of freedom can be calculated to obtain the relative positional relationship parameters between the first sensor and the second sensor.
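The least-squares solve for the three planar degrees of freedom can be sketched as follows. This is a minimal illustration, not the patent's own implementation: it assumes the planar model x' = x·cos(Δyaw) − y·sin(Δyaw) + Δx, y' = x·sin(Δyaw) + y·cos(Δyaw) + Δy, linearized by treating (cos Δyaw, sin Δyaw) as two unknowns, and all names are chosen for illustration:

```python
import numpy as np

def solve_extrinsic(first_pts, second_pts):
    """Least-squares estimate of (dx, dy, dyaw) relating matched
    reference-point coordinates from the two sensors.

    first_pts, second_pts: (N, 2) arrays of matched coordinate pairs,
    one pair per placement of the calibration reference object, N >= 3.
    """
    P = np.asarray(first_pts, dtype=float)
    Q = np.asarray(second_pts, dtype=float)
    n = len(P)
    # Unknown vector [a, b, dx, dy] with a = cos(dyaw), b = sin(dyaw);
    # each coordinate pair contributes two linear equations.
    A = np.zeros((2 * n, 4))
    A[0::2] = np.column_stack([P[:, 0], -P[:, 1], np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([P[:, 1], P[:, 0], np.zeros(n), np.ones(n)])
    a, b, dx, dy = np.linalg.lstsq(A, Q.reshape(-1), rcond=None)[0]
    return dx, dy, np.arctan2(b, a)
```

With exact data, three coordinate pairs determine the three unknowns; collecting more pairs averages out measurement noise in the least-squares sense.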
  • a robot model is constructed according to the obtained external parameters, and the result of mapping the first sensor data and the second sensor data to the robot coordinate system is shown in FIG. 4; the consistency of the first sensor and the second sensor in the robot coordinate system is improved.
  • the external parameter calibration method of the above robot sensor only needs the calibration reference object to be moved and several sets of calibration reference object data to be collected in order to solve the external parameters between the sensors.
  • the obtained calibration result is accurate, which greatly improves the consistency of the external description of different sensors.
  • the external parameter calibration method of the robot sensor mentioned above obtains the first sensor data and the second sensor data by collecting the position information of the calibration reference object through the first sensor and the second sensor in the robot, and converts the two to the same coordinate system to obtain the first converted sensor data and the second converted sensor data. The first coordinate position of the reference point in the calibration reference object is then obtained according to the first converted sensor data, and the second coordinate position according to the second converted sensor data; the first coordinate position and the second coordinate position are used as a set of coordinate data. By changing the relative positional relationship between the robot and the calibration reference object, N sets of coordinate data are collected, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter, and the relative positional relationship parameter between the first sensor and the second sensor is calculated according to the N sets of coordinate data.
  • in one embodiment, the reference point is a turning point associated with two sides in the calibration reference object. Determining the first coordinate position of the reference point in the calibration reference object according to the first converted sensor data and determining the second coordinate position of the reference point according to the second converted sensor data includes: fitting the two sides associated with the turning point of the collected calibration reference object according to the first converted sensor data, and determining the first coordinate position of the turning point according to the fitting result; and fitting the two sides associated with the turning point of the collected calibration reference object according to the second converted sensor data, and determining the second coordinate position of the turning point according to the fitting result.
  • the two edges associated with the turning point are respectively fitted with corresponding lines, and then the position of the intersection point of the two edges after fitting is used as the position of the turning point.
  • the calibration reference object is a triangular cylinder
  • the turning point is the vertex corresponding to the angle of the triangular cylinder.
  • the vertex is associated with two triangle sides, and the data collected by the sensor consists of discrete points, so corresponding straight-line fitting needs to be carried out; the position of the intersection point (the hollow circle in the figure) of the two triangle sides is calculated by fitting the two triangle sides, so that the vertex can be accurately located.
  • in one embodiment, the two sides associated with the turning point of the calibration reference object are straight lines. Fitting the two sides associated with the turning point of the collected calibration reference object according to the first converted sensor data and determining the first coordinate position of the turning point according to the fitting result includes: performing straight-line fitting on the two sides associated with the turning point respectively, and using the coordinates of the intersection point of the two fitted straight lines as the first coordinate position of the turning point. Fitting the two sides associated with the turning point according to the second converted sensor data and determining the second coordinate position of the turning point according to the fitting result likewise includes: performing straight-line fitting on the two sides respectively, and using the coordinates of the intersection point of the two fitted straight lines as the second coordinate position of the turning point.
  • since the two sides associated with the turning point in the calibration reference object are straight lines, when calculating the coordinates of the turning point, straight-line fitting can first be performed on the two associated sides, and the coordinates of the intersection point of the two fitted straight lines then used as the coordinates of the turning point. In this way, inaccurate measurement of the turning point's coordinate position due to sensor resolution is avoided, and the accuracy of the coordinate position measurement is improved.
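The fit-then-intersect idea above can be sketched minimally as follows. The names are illustrative, and the total-least-squares line fit via SVD is one possible choice; the patent does not prescribe a particular fitting method:

```python
import numpy as np

def fit_line(points):
    """Total-least-squares line fit: returns (n, c) with n . p = c, |n| = 1."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The line normal is the right singular vector for the smallest
    # singular value of the centered points.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, n @ centroid

def line_intersection(l1, l2):
    """Intersection of two lines given as (normal, offset) pairs."""
    A = np.array([l1[0], l2[0]])
    b = np.array([l1[1], l2[1]])
    return np.linalg.solve(A, b)

# Points sampled on the two sides of the included angle; the fitted
# intersection recovers the vertex even if no sample lies exactly on it.
side1 = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]      # on the line y = x
side2 = [[0.0, 4.0], [1.0, 3.0], [3.0, 1.0]]      # on the line y = 4 - x
vertex = line_intersection(fit_line(side1), fit_line(side2))
```

Because the vertex position is computed from all sampled points on both sides rather than from the single nearest sample, its accuracy is not limited to the sensor's point spacing.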
  • the converting the first sensor data and the second sensor data to the same coordinate system, correspondingly obtaining the first converted sensor data and the second converted sensor data includes: converting the first sensor data The sensor data is converted from the first sensor coordinate system to the robot coordinate system to obtain first converted sensor data; the second sensor data is converted from the second sensor coordinate system to the robot coordinate system to obtain the first Two conversion sensor data.
  • specifically, the first sensor data is converted from the first sensor coordinate system to the robot coordinate system, and the second sensor data is converted from the second sensor coordinate system to the robot coordinate system.
  • from the structure of the robot, there is a design value for the mounting position of every sensor on the robot, so the conversion relationship between each sensor coordinate system and the robot coordinate system can be obtained, and the coordinates can be converted according to this coordinate system conversion relationship.
  • in one embodiment, calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being the relative positional relationship parameter between the first sensor and the second sensor, includes: obtaining the conversion relationship between the positions of the first sensor and the second sensor; and calculating the relative positional relationship parameter between the first sensor and the second sensor according to the N sets of coordinate data and the conversion relationship.
  • the conversion relationship between the positions of the first sensor and the second sensor can be expressed by the parameter to be solved.
  • let the data set of the first sensor be (xi, yi) and the data set of the second sensor be (x'i, y'i).
  • the parameter to be solved is (Δx, Δy, Δyaw).
  • the position conversion relationship between the first sensor and the second sensor can be expressed as follows:
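The formula itself appears only as an image in the original publication and does not survive in this text. Under the definitions above (data sets (xi, yi) and (x'i, y'i), parameter (Δx, Δy, Δyaw)), the standard planar rigid-body transform consistent with the surrounding text (offered here as an assumed reconstruction, not a quotation of the patent's own formula) would be:

x'i = xi·cos(Δyaw) − yi·sin(Δyaw) + Δx
y'i = xi·sin(Δyaw) + yi·cos(Δyaw) + Δy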
  • the first sensor is a single-line radar
  • the second sensor is a visual sensor
  • acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot collecting the position information of the calibration reference object includes: acquiring the visual sensor data obtained by the visual sensor collecting the position information of the calibration reference object, and extracting, according to the measurement height of the single-line radar, the corresponding line data from the visual sensor data as the second sensor data.
  • the first sensor is a single-line radar
  • the second sensor is a visual sensor.
  • the position obtained by the single-line radar measuring the calibration reference object is the position of each point on one line of the calibration reference object. Since the data of the visual sensor covers a relatively large area, it measures the position of each point on a surface. To maintain consistency of the data reference, the line data whose height value is the same as or close to that of the single-line radar is selected from the visual sensor data as the second sensor data.
  • the first sensor data and the second sensor data obtained in this way are both one-dimensional data. In another embodiment, if the height of the single-line radar is not within the measurement range of the visual sensor, the line data closest in height is selected as the second sensor data.
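The height-based line extraction could be sketched as follows, assuming the camera's measurements are already expressed as (x, y, z) points with z as height; the function name and tolerance value are illustrative, not from the patent:

```python
import numpy as np

def extract_line_at_height(points, radar_height, tol=0.01):
    """Select, from 3D camera measurements (x, y, z), the line of points
    whose height matches the single-line radar's scan height; if no point
    falls within tol of that height, fall back to the closest points.
    Returns (x, y) data directly comparable with the radar's planar scan.
    """
    pts = np.asarray(points, dtype=float)
    dz = np.abs(pts[:, 2] - radar_height)
    keep = dz <= tol if dz.min() <= tol else dz == dz.min()
    return pts[keep, :2]
```

The fallback branch mirrors the "closest line" case described above for when the radar's height lies outside the camera's measurement range.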
  • the calibration reference object is a triangular cylinder
  • the reference point is the vertex of the included angle of the triangular cylinder.
  • an isosceles triangular cylinder is selected as the calibration reference object.
  • the isosceles included angle of the triangular cylinder can be selected between 90° and 160°.
  • the cylinder has high verticality when placed on the ground, so that the triangular cylinder can be measured effectively.
  • the surface material of the triangular cylinder is one that both the radar and the camera measure with high accuracy, such as wood or white frosted plastic.
  • the first sensor is a single-line radar
  • the second sensor is a 3D camera
  • the calibration reference object is a triangular cylinder.
  • FIG. 6 is a schematic flowchart of the external parameter calibration method for robot sensors. First, a relative position between the robot and the calibration reference object is determined. Then, radar data is acquired through the single-line radar and camera data through the 3D camera, and the line of data corresponding to the radar height is selected from the camera data.
  • the corner position in the calibration reference object is determined, and the corner position under the radar data and the corner position under the camera data are obtained as one set of corner position coordinates.
  • by changing the relative positional relationship between the robot and the calibration reference object and repeating the above process, another set of corner position coordinates is obtained; this loops in turn until N sets of corner position coordinates are obtained, and the external parameters are calculated from the N sets of corner position coordinates.
  • an external parameter calibration device for a robot sensor includes:
  • the collection module 702 is used to acquire the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object;
  • the conversion module 704 is configured to convert the first sensor data and the second sensor data to the same coordinate system, and obtain the first converted sensor data and the second converted sensor data accordingly;
  • the determining module 706 is configured to determine the first coordinate position of the reference point in the calibration reference according to the first converted sensor data, and determine the reference point in the calibration reference according to the second converted sensor data The second coordinate position, using the first coordinate position and the second coordinate position as a set of coordinate data;
  • the loop module 708 is used to, when the relative positional relationship between the robot and the calibration reference object changes, return to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, looping in turn until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter;
  • the calculation module 710 is configured to calculate the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, and the external parameter is a relative value between the first sensor and the second sensor Position relationship parameters.
  • the reference point is a turning point associated with two sides in the calibration reference
  • the determining module is further configured to fit the two edges associated with the turning point of the collected calibration reference object according to the first conversion sensor data, and determine the first coordinate position of the turning point according to the fitting result, according to the The second conversion sensor data fits the collected two sides associated with the turning point of the calibration reference object, and determines the second coordinate position of the turning point according to the fitting result.
  • the two sides associated with the turning point of the calibration reference object are straight lines; the determining module is also used to perform straight line fitting on the two sides associated with the turning point of the collected calibration reference object, respectively, according to The coordinate of the intersection point of the two straight lines obtained is used as the first coordinate position of the turning point, and the two sides associated with the turning point of the collected calibration reference object are respectively subjected to straight line fitting, and the intersection point of the two straight lines obtained by fitting The coordinate of is used as the second coordinate position of the turning point.
  • the conversion module is further configured to convert the first sensor data from the first sensor coordinate system to the robot coordinate system to obtain first converted sensor data; convert the second sensor data Transform from the second sensor coordinate system to the robot coordinate system to obtain second converted sensor data.
  • the calculation module is further used to obtain the conversion relationship between the positions of the first sensor and the second sensor, and to calculate the relative positional relationship parameter between the first sensor and the second sensor according to the N sets of coordinate data and the conversion relationship.
  • the first sensor is a single-line radar
  • the second sensor is a visual sensor
  • the collection module is further used to obtain visual sensor data obtained by the visual sensor by collecting the position information of the calibrated reference object, According to the measured height of the single-line radar, corresponding line data is extracted from the visual sensor data as the second sensor data.
  • the calibration reference object is a triangular cylinder
  • the reference point is the vertex of the included angle of the triangular cylinder.
  • FIG. 8 shows an internal structure diagram of the robot in an embodiment.
  • the computer may be a server.
  • the robot includes a processor, a memory, a first sensor, a second sensor, and a network interface connected through a system bus.
  • the memory includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium of the robot stores an operating system and may also store a computer program.
  • when the computer program is executed by the processor, it enables the processor to implement the external parameter calibration method of the robot sensor.
  • a computer program may also be stored in the internal memory.
  • the processor may be caused to execute the external parameter calibration method of the robot sensor.
  • the network interface is used to communicate with the outside world.
  • FIG. 8 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the robot to which the solution of the present application is applied.
  • the specific robot may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
  • the external parameter calibration method for robot sensors provided in this application may be implemented in the form of a computer program, and the computer program may run on the robot shown in FIG. 8.
  • Various program templates constituting the external parameter calibration device of the robot sensor can be stored in the memory of the robot.
  • a robot includes a memory and a processor.
  • the memory stores a computer program.
  • the processor is caused to perform the following steps: acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object; converting the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data; determining the first coordinate position of the reference point in the calibration reference object according to the first converted sensor data, and determining the second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, taking the first coordinate position and the second coordinate position as a set of coordinate data; when the relative positional relationship between the robot and the calibration reference object changes, returning to the acquiring step and looping in turn until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter; and calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data.
  • a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the following steps: acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object; converting the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
  • determining the first coordinate position of the reference point in the calibration reference object according to the first converted sensor data;
  • determining the second coordinate position of the reference point in the calibration reference object according to the second converted sensor data;
  • taking the first coordinate position and the second coordinate position as a set of coordinate data; when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the two sensors by collecting the position information of the calibration reference object, looping in turn until N sets of coordinate data are accumulated, where N is a positive integer not less than the number of unknown degrees of freedom included in the external parameter; and calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data.
  • Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • As illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Abstract

An external parameter calibration method for robot sensors, the method including: acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot (110) by collecting position information of a calibration reference object (120), and converting them to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data; then determining a first coordinate position (x₀, y₀) and a second coordinate position (x′₀, y′₀) of a reference point in the calibration reference object (120), and taking the first coordinate position (x₀, y₀) and the second coordinate position (x′₀, y′₀) as one set of coordinate data; when the relative positional relationship between the robot (110) and the calibration reference object (120) changes, repeating the above steps to obtain N sets of coordinate data; and then calculating the external parameters between the first sensor and the second sensor. The method improves the performance of the robot (110). An external parameter calibration apparatus for robot sensors, a robot (110), and a storage medium are also provided.

Description

External Parameter Calibration Method and Apparatus for Robot Sensors, Robot, and Storage Medium — Technical Field
The present invention relates to the field of computer processing, and in particular to an external parameter calibration method and apparatus for robot sensors, a robot, and a storage medium.
Background
A robot system usually needs to fuse data from multiple sensors to work together. These sensors differ in their sensing characteristics and are placed at different positions on the robot body, so each sensor perceives environmental information differently. When applying these sensor data, the robot system needs to convert them into a unified description.
Although a design value exists for the positions of all of a robot's sensors when the robot structure is designed, the true position of a sensor usually deviates from the design value due to workpiece machining errors, assembly errors, and the sensor's own errors. As a result, when the sensor data are converted to the same coordinate system, positional deviations exist between them and the same object in the scene is mapped to different positions. Such deviation causes the robot to perceive the environment incorrectly and degrades the robot's performance.
Summary
On this basis, it is necessary to address the above problem by providing an external parameter calibration method and apparatus for robot sensors, a robot, and a storage medium that can reduce the deviation and improve robot performance.
In a first aspect, an embodiment of the present invention provides an external parameter calibration method for robot sensors, the method including:
acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object;
converting the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
determining a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determining a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and taking the first coordinate position and the second coordinate position as a set of coordinate data;
when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, and looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
In a second aspect, an embodiment of the present invention provides an external parameter calibration apparatus for robot sensors, the apparatus including:
an acquisition module, configured to acquire first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object;
a conversion module, configured to convert the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
a determination module, configured to determine a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determine a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and take the first coordinate position and the second coordinate position as a set of coordinate data;
a loop module, configured to, when the relative positional relationship between the robot and the calibration reference object changes, return to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
a calculation module, configured to calculate the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
In a third aspect, an embodiment of the present invention provides a robot including a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the following steps:
acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in the robot by collecting position information of a calibration reference object;
converting the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
determining a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determining a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and taking the first coordinate position and the second coordinate position as a set of coordinate data;
when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, and looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the following steps:
acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object;
converting the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
determining a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determining a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and taking the first coordinate position and the second coordinate position as a set of coordinate data;
when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, and looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
In the above external parameter calibration method and apparatus for robot sensors, robot, and storage medium, the first sensor and the second sensor in the robot collect the position information of the calibration reference object to obtain first sensor data and second sensor data, which are converted to the same coordinate system to obtain first converted sensor data and second converted sensor data. The first coordinate position of a reference point in the calibration reference object is then obtained from the first converted sensor data, and the second coordinate position of the reference point from the second converted sensor data; the first coordinate position and the second coordinate position form one set of coordinate data. By changing the relative positional relationship between the robot and the calibration reference object, N sets of coordinate data are collected, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameters, and the relative positional relationship parameters between the first sensor and the second sensor are calculated from the N sets of coordinate data. The positional relationship parameters of the first sensor and the second sensor can be solved simply by moving the calibration reference object and collecting N sets of coordinate data. Not only is calibration simple, but the deviation is greatly reduced and the consistency of the first sensor and the second sensor in the robot coordinate system is improved, thereby improving the performance of the robot.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art may obtain other drawings from them without creative effort.
In the drawings:
FIG. 1 is a diagram of the application environment of the external parameter calibration method for robot sensors in one embodiment;
FIG. 2 is a flowchart of the external parameter calibration method for robot sensors in one embodiment;
FIG. 3 is a schematic diagram of determining the first and second coordinate positions of the reference point in one embodiment;
FIG. 4 is a schematic diagram of mapping into the robot coordinate system after the external parameters are obtained in one embodiment;
FIG. 5 is a schematic diagram of fitting two triangle edges to determine the reference point position in one embodiment;
FIG. 6 is a schematic flowchart of the external parameter calibration method for robot sensors in one embodiment;
FIG. 7 is a structural block diagram of the external parameter calibration apparatus for robot sensors in one embodiment;
FIG. 8 is a diagram of the internal structure of the robot in one embodiment.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and not to limit it.
FIG. 1 is a diagram of the application environment of the external parameter calibration method for robot sensors in one embodiment. Referring to FIG. 1, the method is applied to an external parameter calibration system for robot sensors, which includes a robot 110 and a calibration reference object 120. The robot 110 collects the position of the calibration reference object 120 through a first sensor (for example, a single-line radar) and a second sensor (for example, a camera) to obtain first sensor data and second sensor data, and then converts the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data. The first coordinate position of a reference point in the calibration reference object is determined according to the first converted sensor data, the second coordinate position of the reference point is determined according to the second converted sensor data, and the first coordinate position and the second coordinate position are taken as one set of coordinate data. The position of the calibration reference object 120 is then moved and the above process repeated until N sets of coordinate data are obtained, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameters. Finally, the external parameters between the first sensor and the second sensor, i.e., the relative positional relationship parameters between the two sensors, are calculated from the N sets of coordinate data.
As shown in FIG. 2, an external parameter calibration method for robot sensors is proposed. The method is applied to a robot and specifically includes the following steps:
Step 202: acquire first sensor data and second sensor data obtained by a first sensor and a second sensor in the robot by collecting position information of a calibration reference object.
Here, the first sensor and the second sensor are two different sensors in the robot. In one embodiment, the first sensor may be a radar and the second sensor a visual sensor (camera). Both can collect the position information of the calibration reference object. The calibration reference object is a reference object used to assist calibration. The first sensor data and the second sensor data are the detected positions of a series of points; in order to identify a specific point unambiguously, an object with an obvious turning point is generally chosen as the calibration reference object. In one embodiment, a triangular cylinder may be used: it contains an included angle whose vertex is an obvious turning point, so the position of that turning point can subsequently be determined quickly. Other objects with obvious turning points, such as a triangular pyramid or a cube, may of course also be used.
The first sensor and the second sensor collect the position of the same calibration reference object at the same location, respectively obtaining the first sensor data and the second sensor data, which facilitates subsequent calibration of the external parameters according to the difference between them. The external parameters, i.e., extrinsic parameters, refer to the relative positional relationship between sensors.
In a practical application scenario, the robot and the calibration reference object are placed on the same reasonably level ground, the robot is kept still, and the calibration reference object is placed directly in front of the robot. The calibration reference object is measured by the first sensor and the second sensor to collect the first sensor data and the second sensor data. When choosing the distance between the calibration reference object and the robot, a distance range and a region range in which both the first sensor and the second sensor have good measurement accuracy are selected. If the first sensor is a radar and the second sensor is a camera, the closer the measurement distance, the higher the accuracy, so a short distance range, for example 0.3 m to 2.0 m, is preferably chosen. For the camera, distortion exists at the edges of the field of view, so the middle region of the field of view can be taken as the region range.
Step 204: convert the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data.
Since the first sensor data collected by the first sensor are in the first sensor's coordinate system and the second sensor data are in the second sensor's coordinate system, they need to be converted into the same coordinate system to be compared conveniently. In one embodiment, the first sensor data and the second sensor data may be uniformly converted into the first sensor coordinate system or the second sensor coordinate system, so that both are in the same coordinate system, yielding the first converted sensor data and the second converted sensor data. In another embodiment, the first sensor data and the second sensor data are each converted into the robot coordinate system, correspondingly yielding the converted first and second sensor data. In one embodiment, the robot coordinate system is XOY, where the X axis points directly ahead of the robot, the Y axis points to the robot's left and is perpendicular to the X axis, and XOY is parallel to the ground.
Step 206: determine the first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determine the second coordinate position of the reference point according to the second converted sensor data, and take the first coordinate position and the second coordinate position as a set of coordinate data.
The reference point is a point in the calibration reference object used for reference calculation. A point that is easy to identify and distinguish is generally chosen; for example, for a triangular cylinder, the data curve obtained by projection is a triangle, and the triangle's vertex is the most easily identified point, so the vertex of the included angle of the triangular cylinder can be taken as the reference point. The first coordinate position of the reference point is determined from the first converted sensor data, and the second coordinate position from the second converted sensor data. FIG. 3 is a schematic diagram of determining the first and second coordinate positions of the reference point in one embodiment; it can be seen from the figure that there is a certain error between the reference point coordinates determined by the two sensors, the black dots being the determined reference point coordinates. In a two-dimensional plan view, the recorded first coordinate position is (x₀, y₀) and the second coordinate position is (x′₀, y′₀). The first and second coordinate positions correspond to the coordinates of the same reference point, and together they constitute one set of coordinate data.
Step 208: when the relative positional relationship between the robot and the calibration reference object changes, return to step 202 and loop in turn until step 210 is reached.
Step 210: accumulate N sets of coordinate data, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameters.
By changing the relative positional relationship between the robot and the calibration reference object, the above process is repeated to obtain another set of coordinate data, and so on, until N sets of coordinate data are obtained. The value of N is determined by the number of unknown degrees of freedom contained in the external parameters, and N is chosen as a positive integer not less than that number. For example, three-dimensional space includes 6 degrees of freedom [x, y, z, roll, pitch, yaw], where roll denotes rotation about the z axis, pitch denotes rotation about the x axis, and yaw denotes rotation about the y axis; then N must be greater than or equal to 6. For a robot 2D navigation application, the z axis is ignored, the x axis points directly ahead of the robot, the y axis points to the robot's left and is perpendicular to the x axis, and xoy is parallel to the ground; the degrees of freedom of the external parameters then simplify to 3 degrees of freedom [x, y, yaw], and N must be greater than or equal to 3.
Step 212: calculate the external parameters between the first sensor and the second sensor from the N sets of coordinate data, the external parameters being the relative positional relationship parameters between the first sensor and the second sensor.
The external parameters, i.e., extrinsic parameters, refer to the relative positional relationship between sensors. Given N sets of coordinate data corresponding to the first sensor and the second sensor, the external parameters between the two sensors are calculated by obtaining the conversion relationship between them. In one embodiment, the relative positional relationship between the first sensor and the second sensor can be computed by the least squares method. For a robot 2D navigation application there are three unknown degrees of freedom, so when at least 3 sets of coordinate data exist, the values corresponding to these three degrees of freedom can be computed, yielding the relative positional relationship parameters between the first sensor and the second sensor.
In one embodiment, after the external parameters between the first sensor and the second sensor are obtained, a robot model is constructed from the solved external parameters; the result of mapping the first sensor data and the second sensor data into the robot coordinate system is shown in FIG. 4, which improves the consistency of the two sensors in the robot coordinate system. With the above external parameter calibration method, the external parameters between the sensors can be solved simply by moving the calibration reference object and collecting several sets of its data; the solved calibration result is accurate and greatly improves the consistency of the external descriptions of different sensors.
In the above external parameter calibration method for robot sensors, the first sensor and the second sensor in the robot collect the position information of the calibration reference object to obtain first sensor data and second sensor data, which are then converted to the same coordinate system to obtain first converted sensor data and second converted sensor data. The first coordinate position of a reference point in the calibration reference object is then obtained from the first converted sensor data, and the second coordinate position of the reference point from the second converted sensor data; the two positions form one set of coordinate data. By changing the relative positional relationship between the robot and the calibration reference object, N sets of coordinate data are collected, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameters, and the relative positional relationship parameters between the first sensor and the second sensor are calculated from the N sets of coordinate data. The positional relationship parameters of the two sensors can be solved simply by moving the calibration reference object and collecting N sets of coordinate data. Not only is calibration simple, but the deviation is greatly reduced and the consistency of the first sensor and the second sensor in the robot coordinate system is improved, thereby improving the performance of the robot.
In one embodiment, the reference point is a turning point associated with two edges in the calibration reference object; determining the first coordinate position of the reference point in the calibration reference object according to the first converted sensor data and determining the second coordinate position of the reference point according to the second converted sensor data includes: fitting the two edges associated with the turning point of the collected calibration reference object according to the first converted sensor data and determining the first coordinate position of the turning point according to the fitting result; and fitting the two edges associated with the turning point of the collected calibration reference object according to the second converted sensor data and determining the second coordinate position of the turning point according to the fitting result.
Since different sensors may have inconsistent resolutions, directly measuring the position of the turning point in the calibration reference object would cause a large data error. In this embodiment, the two edges associated with the turning point are each fitted with a corresponding line, and the position of the intersection of the two fitted edges is taken as the position of the turning point. As shown in FIG. 5, in one embodiment the calibration reference object is a triangular cylinder and the turning point is the vertex corresponding to its included angle. As can be seen from the figure, the vertex is associated with two triangle edges, and since the data collected by a sensor consist of discrete points, corresponding straight-line fitting is needed: by fitting the two triangle edges, the position of their intersection (the hollow circle in the figure) is computed, allowing accurate localization.
In one embodiment, the two edges associated with the turning point of the calibration reference object are straight lines. Fitting the two edges associated with the turning point of the collected calibration reference object according to the first converted sensor data and determining the first coordinate position of the turning point according to the fitting result includes: performing straight-line fitting on each of the two edges associated with the turning point of the collected calibration reference object, and taking the coordinates of the intersection of the two fitted straight lines as the first coordinate position of the turning point. Fitting the two edges associated with the turning point of the collected calibration reference object according to the second converted sensor data and determining the second coordinate position of the turning point according to the fitting result includes: performing straight-line fitting on each of the two edges associated with the turning point of the collected calibration reference object, and taking the coordinates of the intersection of the two fitted straight lines as the second coordinate position of the turning point.
Since the two edges associated with the turning point in the calibration reference object are straight lines, when computing the coordinates of the turning point the two associated edges can first be fitted with straight lines, and the coordinates of the intersection of the two fitted lines then taken as the coordinates of the turning point. This avoids inaccurate measurement of the turning point's coordinate position caused by resolution problems and improves the accuracy of the coordinate position measurement.
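The edge-fitting step above can be sketched in a few lines of numpy. This is a minimal illustration, not the patent's implementation: it assumes the scan points of each edge have already been segmented, fits each edge as y = kx + b with `numpy.polyfit`, and intersects the two fitted lines to locate the turning point (a near-vertical edge would need a different parameterization).

```python
import numpy as np

def fit_corner(edge1_pts, edge2_pts):
    """Fit a straight line y = k*x + b to each edge's sample points,
    then return the intersection of the two fitted lines as the
    turning-point (corner) position."""
    k1, b1 = np.polyfit(edge1_pts[:, 0], edge1_pts[:, 1], 1)
    k2, b2 = np.polyfit(edge2_pts[:, 0], edge2_pts[:, 1], 1)
    # Intersection of y = k1*x + b1 and y = k2*x + b2
    x = (b2 - b1) / (k1 - k2)
    y = k1 * x + b1
    return x, y
```

For example, noisy radar points sampled along two prism edges that meet at (1, 1) would yield a fitted corner close to (1, 1) even though no scan point falls exactly on the vertex.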
In one embodiment, converting the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data includes: converting the first sensor data from the first sensor coordinate system to the robot coordinate system to obtain the first converted sensor data; and converting the second sensor data from the second sensor coordinate system to the robot coordinate system to obtain the second converted sensor data.
To convert the first sensor data and the second sensor data into the same coordinate system, the first sensor data are converted from the first sensor coordinate system to the robot coordinate system, and the second sensor data from the second sensor coordinate system to the robot coordinate system. Since a design value exists for the positions of all of a robot's sensors when the robot structure is designed, the conversion relationship between each sensor coordinate system and the robot coordinate system can be obtained, and the coordinates are converted according to that relationship.
In one embodiment, calculating the external parameters between the first sensor and the second sensor from the N sets of coordinate data, the external parameters being the relative positional relationship parameters between the two sensors, includes: obtaining the conversion relationship between the positions of the first sensor and the second sensor; and calculating the relative positional relationship parameters between the first sensor and the second sensor according to the N sets of coordinate data and the conversion relationship.
The conversion relationship between the positions of the first sensor and the second sensor can be expressed by the parameters to be solved. In one embodiment, let the data set of the first sensor be (x_i, y_i) and the data set of the second sensor be (x′_i, y′_i); for a robot 2D navigation application, the parameters to be solved are (Δx, Δy, Δyaw). Specifically, in the standard planar rigid-body form, the position conversion relationship between the first sensor and the second sensor can be expressed as follows:

$$\begin{bmatrix} x_i \\ y_i \end{bmatrix} = \begin{bmatrix} \cos\Delta yaw & -\sin\Delta yaw \\ \sin\Delta yaw & \cos\Delta yaw \end{bmatrix} \begin{bmatrix} x'_i \\ y'_i \end{bmatrix} + \begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix}$$

Solving the matrix equation gives:

$$x_i = x'_i \cos\Delta yaw - y'_i \sin\Delta yaw + \Delta x, \qquad y_i = x'_i \sin\Delta yaw + y'_i \cos\Delta yaw + \Delta y$$

Further converting, with cos Δyaw and sin Δyaw treated as linear unknowns, gives:

$$\begin{bmatrix} x'_i & -y'_i & 1 & 0 \\ y'_i & x'_i & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\Delta yaw \\ \sin\Delta yaw \\ \Delta x \\ \Delta y \end{bmatrix} = \begin{bmatrix} x_i \\ y_i \end{bmatrix}$$

Therefore, the problem can be described as solving a linear system of equations Ax = b, with

$$A = \begin{bmatrix} x'_1 & -y'_1 & 1 & 0 \\ y'_1 & x'_1 & 0 & 1 \\ \vdots & \vdots & \vdots & \vdots \\ x'_n & -y'_n & 1 & 0 \\ y'_n & x'_n & 0 & 1 \end{bmatrix}, \quad x = \begin{bmatrix} \cos\Delta yaw \\ \sin\Delta yaw \\ \Delta x \\ \Delta y \end{bmatrix}, \quad b = \begin{bmatrix} x_1 \\ y_1 \\ \vdots \\ x_n \\ y_n \end{bmatrix}$$

which further gives the least-squares solution:

$$x = (A^{\mathsf T} A)^{-1} A^{\mathsf T} b, \qquad \Delta yaw = \operatorname{atan2}(\sin\Delta yaw,\ \cos\Delta yaw)$$

Finally, each value in the external parameters (Δx, Δy, Δyaw) is obtained, where n denotes the n sets of coordinate data.
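The Ax = b system above maps directly onto an ordinary least-squares routine. The sketch below is one possible implementation under the linearization just described (cos Δyaw and sin Δyaw as two unknowns, Δyaw recovered with atan2); the function name and data layout are illustrative, not from the patent.

```python
import numpy as np

def solve_extrinsics(p1, p2):
    """Estimate (dx, dy, dyaw) mapping second-sensor corner positions p2
    onto first-sensor corner positions p1.  Both arrays have shape (N, 2)
    with N >= 3 corner pairs, matching the N sets of coordinate data."""
    n = len(p1)
    A = np.zeros((2 * n, 4))
    b = np.zeros(2 * n)
    for i, ((x, y), (xp, yp)) in enumerate(zip(p1, p2)):
        A[2 * i] = [xp, -yp, 1, 0]      # x = xp*cos - yp*sin + dx
        A[2 * i + 1] = [yp, xp, 0, 1]   # y = xp*sin + yp*cos + dy
        b[2 * i], b[2 * i + 1] = x, y
    c, s, dx, dy = np.linalg.lstsq(A, b, rcond=None)[0]
    return dx, dy, np.arctan2(s, c)
```

Note that this linearization does not enforce cos²Δyaw + sin²Δyaw = 1; with noisy data the recovered pair can be normalized, or a constrained 2-D registration method used instead.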
In one embodiment, the first sensor is a single-line radar and the second sensor is a visual sensor; acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object includes: acquiring visual sensor data obtained by the visual sensor by collecting the position information of the calibration reference object, and extracting the corresponding line of data from the visual sensor data according to the measurement height of the single-line radar as the second sensor data.
In a specific scenario, the first sensor is a single-line radar and the second sensor is a visual sensor. The position obtained by the single-line radar measuring the calibration reference object is the position of each point on one line of the object, whereas the visual sensor covers a relatively large area and measures the position of each point on a surface. To keep the data references consistent, the line of data whose height value is the same as or close to that of the single-line radar is selected from the visual sensor data as the second sensor data, so that the first sensor data and the second sensor data obtained in this way are both one-dimensional. In another embodiment, if the height of the single-line radar is not within the measurement range of the visual sensor, the line of data closest in height is selected as the second sensor data.
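The height-matched row selection described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it assumes an organized point cloud of shape (H, W, 3) from the 3D camera, already expressed in a frame where the z component is height, and it picks the image row whose mean height is closest to the radar's measurement plane.

```python
import numpy as np

def extract_row_at_height(cloud, radar_height):
    """From an organized point cloud of shape (H, W, 3) with z as height,
    pick the image row whose mean height is closest to the single-line
    radar's measurement plane, and return its (x, y) points as 1-D
    line data comparable with the radar scan."""
    row_heights = np.nanmean(cloud[:, :, 2], axis=1)
    row = int(np.argmin(np.abs(row_heights - radar_height)))
    return cloud[row, :, :2]
```

When no row matches the radar height exactly (the radar plane falls outside the camera's coverage), `argmin` still returns the closest row, mirroring the fallback described in the text.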
In one embodiment, the calibration reference object is a triangular cylinder, and the reference point is the vertex of the included angle of the triangular cylinder.
To improve measurement accuracy, an isosceles triangular cylinder is selected as the calibration reference object, and to improve the accuracy of measuring the triangular cylinder, its isosceles included angle can be chosen between 90° and 160°. In addition, when placed on the ground, the cylinder should have high verticality so that the triangular cylinder can be measured effectively. In one embodiment, when the first sensor is a single-line radar and the second sensor is a camera, the surface material of the triangular cylinder is one that both the radar and the camera measure with high accuracy, such as wood or white frosted plastic.
In one embodiment, the first sensor is a single-line radar, the second sensor is a 3D camera, and the calibration reference object is a triangular cylinder. FIG. 6 is a schematic flowchart of the external parameter calibration method for robot sensors. First, a relative position between the robot and the calibration reference object is determined. Then, radar data are collected through the single-line radar and camera data through the 3D camera, after which the line of data corresponding to the radar height is selected from the camera data. Next, the data of two edges of the calibration reference object are selected from the radar data and the camera data respectively, the selected data are converted into the robot coordinate system, and straight-line fitting is performed on the two converted edges. The corner position in the calibration reference object is determined from the fitting result, yielding the corner position under the radar data and the corner position under the camera data, i.e., one set of corner position coordinates. Then, by changing the relative positional relationship between the robot and the calibration reference object and repeating the above process, another set of corner position coordinates is obtained; this loops in turn until N sets of corner position coordinates are obtained, and the external parameters are calculated from the N sets of corner position coordinates.
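The loop of FIG. 6 can be exercised end to end on simulated data: fit the prism corner in each sensor's view, pair the corners, and solve the least-squares system. Everything below is a self-contained sketch with made-up poses and edge directions, not data or APIs from the patent.

```python
import numpy as np

def corner_from_edges(e1, e2):
    # Fit y = k*x + b to each edge and intersect the two fitted lines.
    (k1, b1), (k2, b2) = (np.polyfit(e[:, 0], e[:, 1], 1) for e in (e1, e2))
    x = (b2 - b1) / (k1 - k2)
    return np.array([x, k1 * x + b1])

def calibrate(corner_pairs):
    # corner_pairs: list of (first-sensor corner, second-sensor corner).
    A, b = [], []
    for (x, y), (xp, yp) in corner_pairs:
        A += [[xp, -yp, 1, 0], [yp, xp, 0, 1]]
        b += [x, y]
    c, s, dx, dy = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)[0]
    return dx, dy, np.arctan2(s, c)

# Simulated demo: the second-sensor frame is offset from the first by
# (dx, dy, dyaw) = (0.06, -0.04, 0.1); three prism placements are observed.
true = (0.06, -0.04, 0.1)
R = np.array([[np.cos(true[2]), -np.sin(true[2])],
              [np.sin(true[2]),  np.cos(true[2])]])
pairs = []
for apex in [np.array(p) for p in ([1.0, 0.2], [1.4, -0.3], [0.8, 0.5])]:
    t = np.linspace(0.05, 0.4, 8)[:, None]
    e1 = apex + t * np.array([1.0, 0.7])    # sample points along one edge
    e2 = apex + t * np.array([1.0, -0.6])   # and along the other edge
    cam_corner = corner_from_edges(e1, e2)  # corner in second-sensor frame
    radar_corner = R @ cam_corner + np.array(true[:2])
    pairs.append((radar_corner, cam_corner))
print(calibrate(pairs))  # dx, dy, dyaw ≈ 0.06, -0.04, 0.1
```

With three prism placements the 6 × 4 system is already overdetermined, matching the requirement N ≥ 3 for the three unknown degrees of freedom.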
As shown in FIG. 7, an external parameter calibration apparatus for robot sensors is proposed, the apparatus including:
an acquisition module 702, configured to acquire first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object;
a conversion module 704, configured to convert the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
a determination module 706, configured to determine a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determine a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and take the first coordinate position and the second coordinate position as a set of coordinate data;
a loop module 708, configured to, when the relative positional relationship between the robot and the calibration reference object changes, return to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
a calculation module 710, configured to calculate the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
In one embodiment, the reference point is a turning point associated with two edges in the calibration reference object;
the determination module is further configured to fit the two edges associated with the turning point of the collected calibration reference object according to the first converted sensor data and determine the first coordinate position of the turning point according to the fitting result, and to fit the two edges associated with the turning point of the collected calibration reference object according to the second converted sensor data and determine the second coordinate position of the turning point according to the fitting result.
In one embodiment, the two edges associated with the turning point of the calibration reference object are straight lines; the determination module is further configured to perform straight-line fitting on each of the two edges associated with the turning point of the collected calibration reference object and take the coordinates of the intersection of the two fitted straight lines as the first coordinate position of the turning point, and to perform straight-line fitting on each of the two edges associated with the turning point of the collected calibration reference object and take the coordinates of the intersection of the two fitted straight lines as the second coordinate position of the turning point.
In one embodiment, the conversion module is further configured to convert the first sensor data from the first sensor coordinate system to the robot coordinate system to obtain the first converted sensor data, and to convert the second sensor data from the second sensor coordinate system to the robot coordinate system to obtain the second converted sensor data.
In one embodiment, the calculation module is further configured to obtain the conversion relationship between the positions of the first sensor and the second sensor, and to calculate the relative positional relationship parameters between the first sensor and the second sensor according to the N sets of coordinate data and the conversion relationship.
In one embodiment, the first sensor is a single-line radar and the second sensor is a visual sensor; the acquisition module is further configured to acquire visual sensor data obtained by the visual sensor by collecting the position information of the calibration reference object, and to extract the corresponding line of data from the visual sensor data according to the measurement height of the single-line radar as the second sensor data.
In one embodiment, the calibration reference object is a triangular cylinder, and the reference point is the vertex of the included angle of the triangular cylinder.
FIG. 8 shows the internal structure of the robot in one embodiment. The computer may be a server. As shown in FIG. 8, the robot includes a processor, a memory, a first sensor, a second sensor, and a network interface connected through a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the robot stores an operating system and may also store a computer program which, when executed by the processor, enables the processor to implement the external parameter calibration method for robot sensors. A computer program may also be stored in the internal memory, and when executed by the processor, it causes the processor to perform the external parameter calibration method for robot sensors. The network interface is used to communicate with the outside. Those skilled in the art can understand that the structure shown in FIG. 8 is only a block diagram of part of the structure related to the solution of the present application and does not constitute a limitation on the robot to which the solution is applied; a specific robot may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, the external parameter calibration method for robot sensors provided in the present application may be implemented in the form of a computer program, and the computer program may run on the robot shown in FIG. 8. The program templates constituting the external parameter calibration apparatus for robot sensors may be stored in the memory of the robot, for example, the acquisition module 702, the conversion module 704, the determination module 706, the loop module 708, and the calculation module 710.
A robot includes a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the following steps: acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in the robot by collecting position information of a calibration reference object; converting the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data; determining a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determining a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and taking the first coordinate position and the second coordinate position as a set of coordinate data; when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, and looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter; and calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
A computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the following steps: acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object; converting the first sensor data and the second sensor data to the same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data; determining a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determining a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and taking the first coordinate position and the second coordinate position as a set of coordinate data; when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, and looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter; and calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be completed by instructing the related hardware through a computer program, and the program may be stored in a non-volatile computer-readable storage medium; when executed, it may include the processes of the embodiments of the above methods. Any reference to memory, storage, database, or other media used in the embodiments provided in the present application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. As illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. To keep the description concise, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The above embodiments only express several implementations of the present application, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the patent scope of the present application. It should be noted that, for those of ordinary skill in the art, several variations and improvements can be made without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (10)

  1. An external parameter calibration method for robot sensors, characterized in that the method comprises:
    acquiring first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object;
    converting the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
    determining a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determining a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and taking the first coordinate position and the second coordinate position as a set of coordinate data;
    when the relative positional relationship between the robot and the calibration reference object changes, returning to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, and looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
    calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
  2. The method according to claim 1, characterized in that the reference point is a turning point associated with two edges in the calibration reference object;
    the determining a first coordinate position of the reference point in the calibration reference object according to the first converted sensor data and determining a second coordinate position of the reference point according to the second converted sensor data comprises:
    fitting the two edges associated with the turning point of the collected calibration reference object according to the first converted sensor data, and determining the first coordinate position of the turning point according to the fitting result;
    fitting the two edges associated with the turning point of the collected calibration reference object according to the second converted sensor data, and determining the second coordinate position of the turning point according to the fitting result.
  3. The method according to claim 2, characterized in that the two edges associated with the turning point of the calibration reference object are straight lines;
    the fitting the two edges associated with the turning point of the collected calibration reference object according to the first converted sensor data and determining the first coordinate position of the turning point according to the fitting result comprises:
    performing straight-line fitting on each of the two edges associated with the turning point of the collected calibration reference object, and taking the coordinates of the intersection of the two fitted straight lines as the first coordinate position of the turning point;
    the fitting the two edges associated with the turning point of the collected calibration reference object according to the second converted sensor data and determining the second coordinate position of the turning point according to the fitting result comprises:
    performing straight-line fitting on each of the two edges associated with the turning point of the collected calibration reference object, and taking the coordinates of the intersection of the two fitted straight lines as the second coordinate position of the turning point.
  4. The method according to claim 1, characterized in that the converting the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data comprises:
    converting the first sensor data from the first sensor coordinate system to the robot coordinate system to obtain the first converted sensor data;
    converting the second sensor data from the second sensor coordinate system to the robot coordinate system to obtain the second converted sensor data.
  5. The method according to claim 1, characterized in that the calculating the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor, comprises:
    obtaining the conversion relationship between the positions of the first sensor and the second sensor;
    calculating the relative positional relationship parameter between the first sensor and the second sensor according to the N sets of coordinate data and the conversion relationship.
  6. The method according to claim 1, characterized in that the first sensor is a single-line radar and the second sensor is a visual sensor;
    the acquiring first sensor data and second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object comprises:
    acquiring visual sensor data obtained by the visual sensor by collecting the position information of the calibration reference object, and extracting the corresponding line of data from the visual sensor data according to the measurement height of the single-line radar as the second sensor data.
  7. The method according to claim 1, characterized in that the calibration reference object is a triangular cylinder, and the reference point is the vertex of the included angle of the triangular cylinder.
  8. An external parameter calibration apparatus for robot sensors, characterized in that the apparatus comprises:
    an acquisition module, configured to acquire first sensor data and second sensor data obtained by a first sensor and a second sensor in a robot by collecting position information of a calibration reference object;
    a conversion module, configured to convert the first sensor data and the second sensor data to a same coordinate system to correspondingly obtain first converted sensor data and second converted sensor data;
    a determination module, configured to determine a first coordinate position of a reference point in the calibration reference object according to the first converted sensor data, determine a second coordinate position of the reference point in the calibration reference object according to the second converted sensor data, and take the first coordinate position and the second coordinate position as a set of coordinate data;
    a loop module, configured to, when the relative positional relationship between the robot and the calibration reference object changes, return to the step of acquiring the first sensor data and the second sensor data obtained by the first sensor and the second sensor in the robot by collecting the position information of the calibration reference object, looping in turn until N sets of coordinate data are accumulated, N being a positive integer not less than the number of unknown degrees of freedom contained in the external parameter;
    a calculation module, configured to calculate the external parameter between the first sensor and the second sensor according to the N sets of coordinate data, the external parameter being a relative positional relationship parameter between the first sensor and the second sensor.
  9. A robot, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 7.
  10. A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the method according to any one of claims 1 to 7.
PCT/CN2018/123793 2018-12-25 2018-12-26 External parameter calibration method and apparatus for robot sensors, robot, and storage medium WO2020132924A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/611,475 US11590655B2 (en) 2018-12-25 2018-12-26 External parameter calibration method for robot sensors and apparatus and robot with the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811594120.3 2018-12-25
CN201811594120.3A CN111360810A (zh) 2018-12-25 External parameter calibration method and apparatus for robot sensors, robot, and storage medium

Publications (1)

Publication Number Publication Date
WO2020132924A1 true WO2020132924A1 (zh) 2020-07-02

Family

ID=71126872

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/123793 WO2020132924A1 (zh) 2018-12-26 External parameter calibration method and apparatus for robot sensors, robot, and storage medium

Country Status (3)

Country Link
US (1) US11590655B2 (zh)
CN (1) CN111360810A (zh)
WO (1) WO2020132924A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020133288A1 (zh) * 2018-12-28 2020-07-02 深圳市优必选科技有限公司 Gait control method for a biped robot, and biped robot
WO2020215198A1 (zh) * 2019-04-23 2020-10-29 深圳市大疆创新科技有限公司 Data processing method, apparatus, and device, and movable platform
GB2585969B (en) * 2020-03-12 2022-04-06 Seechange Tech Limited Data processing
CN113319833B (zh) * 2021-05-19 2022-09-02 三一建筑机器人(西安)研究院有限公司 Calibration method for a Cartesian coordinate robot, and assembly system
CN113643358B (zh) * 2021-08-10 2023-07-07 追觅创新科技(苏州)有限公司 External parameter calibration method and apparatus for a camera, storage medium, and system
CN115439561B (zh) * 2022-10-25 2023-03-10 杭州华橙软件技术有限公司 Sensor calibration method for a robot, robot, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060271332A1 (en) * 2005-05-18 2006-11-30 Perceptron, Inc. Method for calibrating a non-contact sensor using a robot
CN101053953A (zh) * 2004-07-15 2007-10-17 上海交通大学 Fast hand-eye calibration method for the monocular vision sensor of a welding robot
CN101660903A (zh) * 2009-09-22 2010-03-03 大连海事大学 External parameter calculation method for a measuring robot
CN102294695A (zh) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN105758426A (zh) * 2016-02-19 2016-07-13 深圳杉川科技有限公司 Joint calibration method for the multiple sensors of a mobile robot
CN108226906A (zh) * 2017-11-29 2018-06-29 深圳市易成自动驾驶技术有限公司 Calibration method and apparatus, and computer-readable storage medium
CN108399643A (zh) * 2018-03-15 2018-08-14 南京大学 External parameter calibration system and method between a lidar and a camera
CN108717715A (zh) * 2018-06-11 2018-10-30 华南理工大学 Automatic calibration method for the line-structured-light vision system of an arc welding robot

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010055069A1 (en) * 2000-03-10 2001-12-27 Hudson Edison T. One camera system for component to substrate registration
US6822748B2 (en) * 2002-10-29 2004-11-23 Metron Systems, Inc. Calibration for 3D measurement system
JP3946716B2 (ja) * 2004-07-28 2007-07-18 ファナック株式会社 Method and apparatus for recalibrating a three-dimensional visual sensor in a robot system
KR20090066776A (ko) * 2007-12-20 2009-06-24 한국전자통신연구원 Positioning service framework apparatus for robot position estimation, and method thereof
WO2009132703A1 (en) * 2008-04-30 2009-11-05 Abb Technology Ab A method and a system for determining the relation between a robot coordinate system and a local coordinate system located in the working range of the robot
JP2011115877A (ja) * 2009-12-02 2011-06-16 Canon Inc Dual-arm robot
CN101882313B (zh) * 2010-07-14 2011-12-21 中国人民解放军国防科学技术大学 Calibration method for the mutual relationship between a single-line laser radar and a CCD camera
JP2012187651A (ja) * 2011-03-09 2012-10-04 Omron Corp Image processing apparatus, image processing system, and guidance apparatus directed to them
JP6021533B2 (ja) * 2012-09-03 2016-11-09 キヤノン株式会社 Information processing system, apparatus, method, and program
CN103049912B (zh) * 2012-12-21 2015-03-11 浙江大学 External parameter calibration method for a radar-camera system based on an arbitrary trihedron
SG10201609857SA (en) * 2012-12-27 2017-01-27 Panasonic Ip Corp America Information communication method
CN203657746U (zh) * 2013-11-27 2014-06-18 深圳市智信精密仪器有限公司 Position calibration apparatus applied to a laser and a camera
CN104827480A (zh) * 2014-02-11 2015-08-12 泰科电子(上海)有限公司 Automatic calibration method for a robot system
ES2876449T3 (es) * 2014-09-05 2021-11-12 Sz Dji Technology Co Ltd Multi-sensor environment mapping
DE102015219332A1 (de) * 2015-10-07 2017-04-13 Robert Bosch Gmbh Sensor device and robot arrangement with the sensor device
CN105678785B (zh) * 2016-02-01 2018-03-02 西安交通大学 Calibration method for the relative pose relationship between a laser and a camera
EP3531151B1 (en) * 2018-02-27 2020-04-22 Melexis Technologies SA Redundant sensor error reduction
CN108596979A (zh) * 2018-03-27 2018-09-28 深圳市智能机器人研究院 Calibration apparatus and method for a lidar and a depth camera
WO2020010043A1 (en) * 2018-07-06 2020-01-09 Brain Corporation Systems, methods and apparatuses for calibrating sensors mounted on a device
KR102561103B1 (ko) * 2018-11-16 2023-07-31 삼성전자주식회사 Robot calibration system and calibration method thereof
US11707842B2 (en) * 2018-11-27 2023-07-25 Fanuc Corporation Robot system and coordinate conversion method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101053953A (zh) * 2004-07-15 2007-10-17 上海交通大学 Fast hand-eye calibration method for the monocular vision sensor of a welding robot
US20060271332A1 (en) * 2005-05-18 2006-11-30 Perceptron, Inc. Method for calibrating a non-contact sensor using a robot
CN101660903A (zh) * 2009-09-22 2010-03-03 大连海事大学 External parameter calculation method for a measuring robot
CN102294695A (zh) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN105758426A (zh) * 2016-02-19 2016-07-13 深圳杉川科技有限公司 Joint calibration method for the multiple sensors of a mobile robot
CN108226906A (zh) * 2017-11-29 2018-06-29 深圳市易成自动驾驶技术有限公司 Calibration method and apparatus, and computer-readable storage medium
CN108399643A (zh) * 2018-03-15 2018-08-14 南京大学 External parameter calibration system and method between a lidar and a camera
CN108717715A (zh) * 2018-06-11 2018-10-30 华南理工大学 Automatic calibration method for the line-structured-light vision system of an arc welding robot

Also Published As

Publication number Publication date
US20210354299A1 (en) 2021-11-18
US11590655B2 (en) 2023-02-28
CN111360810A (zh) 2020-07-03

Similar Documents

Publication Publication Date Title
WO2020132924A1 (zh) External parameter calibration method and apparatus for robot sensors, robot, and storage medium
CN109961468B (zh) Volume measurement method and apparatus based on binocular vision, and storage medium
EP2990828B1 (en) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and program therefor
JP6573419B1 (ja) Positioning method, robot, and computer storage medium
CN111123242B (zh) Joint calibration method based on lidar and camera, and computer-readable storage medium
CN107229043B (zh) Distance sensor external parameter calibration method and system
EP4083917A1 (en) Depth image processing method, small obstacle detection method and system, robot, and medium
CN110675440B (zh) Confidence evaluation method and apparatus for three-dimensional depth data, and computer device
WO2022179094A1 (zh) Joint calibration method, system, medium, and device for vehicle-mounted lidar external parameters
CN112907681A (zh) Joint calibration method and system based on millimeter-wave radar and binocular camera
CN116433737A (zh) Method and apparatus for registering a lidar point cloud and an image, and intelligent terminal
CN111080682A (zh) Point cloud data registration method and apparatus
CN112558043A (zh) Lidar calibration method and electronic device
CN102982552B (zh) Surface registration method based on Ricci flow
Duran et al. Accuracy comparison of interior orientation parameters from different photogrammetric software and direct linear transformation method
CN111915681A (zh) External parameter calibration method and apparatus for multiple groups of 3D cameras, storage medium, and device
CN109785388B (zh) Short-range precise relative positioning method based on binocular cameras
CN113959362B (zh) Calibration method for a structured-light three-dimensional measurement system, and inspection data processing method
CN114004949A (zh) Method and system for calibrating the placement parameters of a mobile measurement system assisted by an airborne point cloud
CN114638789A (zh) Method and system for hole position detection
CN114266835A (zh) Deformation monitoring control method and system for a non-metric camera
CN114063024A (zh) Sensor calibration method and apparatus, electronic device, and storage medium
Guerreiro et al. Automatic 2-D LiDAR geometric calibration of installation bias
CN116449393B (zh) Multi-sensor measurement method and system for large and medium stockpiles
TWI784754B (zh) Electronic device and object detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18945151

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18945151

Country of ref document: EP

Kind code of ref document: A1