WO2021068723A1 - Sensor calibration method and sensor calibration device


Info

Publication number: WO2021068723A1
Authority: WIPO (PCT)
Prior art keywords: calibration, camera, point, lidar, coordinate system
Application number: PCT/CN2020/115879
Inventors: 陈亦伦, 李涵
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Description

  • This application relates to the field of artificial intelligence, and more specifically, to a sensor calibration method and a sensor calibration device.
  • Multi-line lidar emits multiple laser scanning lines at the same time to meet the needs of quickly collecting large-scale environmental information.
  • lidar has gradually been used for perception of the surrounding environment of vehicles.
  • the multi-line lidar installed on the smart car can provide the smart car with richer, comprehensive and accurate information about the surrounding environment of the vehicle.
  • the original data obtained by multi-line lidar scanning mainly includes distance information and angle information.
  • The distance information indicates the distance from the scan point to the multi-line lidar, and the angle information indicates the elevation angle of the scan line on which the scan point lies.
  • Converting the raw data obtained by the multi-line lidar scan to the coordinate system of the smart device to which the multi-line lidar belongs usually involves the following two steps: converting the raw data (i.e., the distance information and angle information) obtained by the multi-line lidar scan into three-dimensional coordinates in the multi-line lidar's own coordinate system; and converting these three-dimensional coordinates in the lidar coordinate system into three-dimensional coordinates in the coordinate system of the smart device.
  • In the second step, the external parameters of the multi-line lidar need to be used, so the position of the multi-line lidar in the vehicle body coordinate system needs to be determined first. Determining the position of the multi-line lidar in the vehicle body coordinate system can be called the external parameter calibration (extrinsic calibration) of the multi-line lidar. Both conversion steps are sketched below.
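  • The following is a minimal sketch (not from the patent) of the two conversion steps, assuming the raw measurement of a scan point is a distance plus an azimuth angle and an elevation angle, and that the lidar extrinsics are given as a rotation matrix R and translation vector t from the lidar frame to the vehicle frame; all names here are illustrative.

```python
import numpy as np

def raw_to_lidar_xyz(r, azimuth, elevation):
    """Step 1: raw (distance, angle) data -> 3D point in the lidar's own frame."""
    x = r * np.cos(elevation) * np.cos(azimuth)
    y = r * np.cos(elevation) * np.sin(azimuth)
    z = r * np.sin(elevation)
    return np.array([x, y, z])

def lidar_to_vehicle(p_lidar, R, t):
    """Step 2: lidar-frame point -> vehicle-frame point using the extrinsics (R, t)."""
    return R @ p_lidar + t

# Example with identity extrinsics (lidar frame coincides with vehicle frame).
p = raw_to_lidar_xyz(10.0, np.deg2rad(30.0), np.deg2rad(-2.0))
print(lidar_to_vehicle(p, np.eye(3), np.zeros(3)))
```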
  • This application provides a sensor calibration method, a sensor calibration device, a calibration device, and a calibration system, which help improve the calibration accuracy and calibration speed of the lidar.
  • In a first aspect, this application provides a sensor calibration method.
  • The method includes: acquiring a first point coordinate set, where the first point coordinate set includes the three-dimensional coordinates in the lidar coordinate system of scan points in the lidar calibration area on a calibration board, and the lidar calibration area includes two non-parallel sides; performing straight line fitting according to the first point coordinate set to obtain the expressions of multiple straight lines, the multiple straight lines including the straight lines on which the two non-parallel sides lie; estimating the three-dimensional coordinates of the lidar calibration point on the calibration board in the lidar coordinate system according to the expressions of the multiple straight lines; and performing external parameter calibration of the lidar according to the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  • In this way, the coordinates of the lidar calibration points in the lidar coordinate system are estimated from the fitted straight lines, and these straight lines are fitted from the coordinates of multiple scan points in the lidar coordinate system. This can improve the accuracy of the coordinates of the lidar calibration points in the lidar coordinate system, which can in turn improve the accuracy of the external parameters of the lidar. A sketch of the estimation step is given below.
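  • As an illustration (a sketch under assumptions, not the patent's prescribed solver), a calibration point can be estimated from the expressions of two fitted straight lines. Each fitted line is represented here by a point p and a direction d; because two fitted lines rarely intersect exactly in 3D, the midpoint of their common perpendicular is used as the estimate.

```python
import numpy as np

def line_intersection_3d(p1, d1, p2, d2):
    """Estimate the 'intersection' of two 3D lines p1 + s*d1 and p2 + u*d2 as the
    midpoint of their common perpendicular (least-squares closest point)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Normal equations for minimizing ||(p1 + s*d1) - (p2 + u*d2)||^2 over (s, u).
    A = np.array([[1.0, -(d1 @ d2)],
                  [d1 @ d2, -1.0]])
    b = np.array([d1 @ (p2 - p1), d2 @ (p2 - p1)])
    s, u = np.linalg.solve(A, b)
    return 0.5 * ((p1 + s * d1) + (p2 + u * d2))
```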
  • Optionally, calibrating the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system includes: acquiring a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point on the calibration board in the image coordinate system of the camera; determining the relative pose between the lidar and the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and calibrating the external parameters of the lidar according to the external parameters of the camera and the relative pose between the lidar and the camera.
  • In this way, the external parameters of the lidar are calibrated according to the internal parameters and external parameters of the camera.
  • Optionally, determining the relative pose between the lidar and the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area includes: determining the three-dimensional coordinates of the lidar calibration point in the camera coordinate system of the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and determining the relative pose between the lidar and the camera according to the three-dimensional coordinates of the lidar calibration point in the camera coordinate system and the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  • Optionally, the method further includes: acquiring a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point in the image coordinate system of the camera; and calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration point in the world coordinate system.
  • the lidar and camera on the same device are calibrated simultaneously, which can improve the calibration efficiency.
  • Optionally, calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration point in the world coordinate system includes: determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal parameters of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration point in the world coordinate system according to the positional relationship between the camera calibration points; and performing external parameter calibration of the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.
  • Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board.
  • Optionally, the method further includes: determining, from the second point coordinate set, the coordinates of the camera calibration point in each of the multiple images according to the positional relationship between the camera calibration areas; and performing internal parameter calibration according to the coordinates of the camera calibration point in each of the multiple images to obtain the internal parameters of the camera.
  • In a second aspect, the present application provides a sensor calibration method.
  • The method includes: acquiring a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point in the camera's image coordinate system; and calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration point in the world coordinate system.
  • With this method, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, it can be determined from the positional relationships between the feature points which feature points on the calibration board appear in the captured image. The three-dimensional coordinates of these feature points in the world coordinate system can then be retrieved from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can be determined. This makes it possible to use the calibration board to calibrate the camera without limiting the distance between the camera and the calibration board and without moving the calibration board; that is, automatic calibration of the camera can be realized.
  • Optionally, calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration point in the world coordinate system includes: determining the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal parameters of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration point in the world coordinate system according to the positional relationship between the camera calibration points; and performing external parameter calibration of the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.
  • Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board.
  • Optionally, the method further includes: determining, from the second point coordinate set, the coordinates of the camera calibration point in each of the multiple images according to the positional relationship between the camera calibration areas; and performing internal parameter calibration according to the coordinates of the camera calibration point in each of the multiple images to obtain the internal parameters of the camera.
  • In a third aspect, the present application provides a sensor calibration device. The device includes: an acquisition module for acquiring a first point coordinate set, where the first point coordinate set includes the three-dimensional coordinates in the lidar coordinate system of the scan points in the lidar calibration area on the calibration board, and the lidar calibration area includes two non-parallel sides; a fitting module for performing straight line fitting according to the first point coordinate set to obtain the expressions of multiple straight lines, the multiple straight lines including the straight lines on which the two non-parallel sides lie; an estimation module for estimating the three-dimensional coordinates of the lidar calibration point on the calibration board in the lidar coordinate system according to the expressions of the multiple straight lines; and a calibration module for calibrating the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  • Optionally, the acquisition module is further configured to acquire a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point on the calibration board in the image coordinate system of the camera.
  • Optionally, the calibration module is specifically configured to: determine the relative pose between the lidar and the camera based on the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and calibrate the external parameters of the lidar according to the external parameters of the camera and the relative pose between the lidar and the camera.
  • Optionally, the calibration module is specifically configured to: determine the three-dimensional coordinates of the lidar calibration point in the camera coordinate system of the camera based on the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and determine the relative pose between the lidar and the camera according to the three-dimensional coordinates of the lidar calibration point in the camera coordinate system and the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  • the acquiring module is further configured to acquire a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point in the image coordinate system of the camera.
  • Optionally, the calibration module is further configured to calibrate the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration point in the world coordinate system.
  • Optionally, the calibration module is specifically configured to: determine the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the internal parameters of the camera and the second point coordinate set; determine the three-dimensional coordinates of the camera calibration point in the world coordinate system according to the positional relationship between the camera calibration points; and calibrate the external parameters of the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and the three-dimensional coordinates of the camera calibration point in the camera coordinate system.
  • Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board.
  • Optionally, the calibration module is further specifically configured to: determine, from the second point coordinate set, the coordinates of the camera calibration point in each of the multiple images according to the positional relationship between the camera calibration areas; and calibrate the internal parameters of the camera according to the coordinates of the camera calibration point in each of the multiple images to obtain the internal parameters of the camera.
  • In a fourth aspect, the present application provides a sensor calibration device. The device includes: an acquisition module configured to acquire a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point in the camera's image coordinate system; and
  • a calibration module configured to calibrate the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration point in the world coordinate system.
  • Optionally, the calibration module is specifically configured to: determine the three-dimensional coordinates of the camera calibration point in the camera coordinate system according to the camera's internal parameters and the second point coordinate set; determine the three-dimensional coordinates of the camera calibration point in the world coordinate system according to the positional relationship between the camera calibration points; and calibrate the external parameters of the camera according to the three-dimensional coordinates of the camera calibration point in the world coordinate system and in the camera coordinate system.
  • Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board.
  • Optionally, the calibration module is further specifically configured to: determine, from the second point coordinate set, the coordinates of the camera calibration point in each of the multiple images according to the positional relationship between the camera calibration areas; and calibrate the internal parameters of the camera according to the coordinates of the camera calibration point in each of the multiple images to obtain the internal parameters of the camera.
  • In a fifth aspect, a sensor calibration device is provided, including: a memory for storing a program; and a processor for executing the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to execute the method in any one of the implementations of the above first aspect.
  • the device may also include a communication interface.
  • In a sixth aspect, a sensor calibration device is provided, including: a memory for storing a program; and a processor for executing the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to execute the method in any one of the implementations of the above second aspect.
  • the device may also include a communication interface.
  • In a seventh aspect, a computer-readable medium is provided, storing instructions for execution by a device, the instructions being used to execute the method in the above first aspect.
  • In an eighth aspect, a computer-readable medium is provided, storing instructions for execution by a device, the instructions being used to execute the method in the above second aspect.
  • In a ninth aspect, a computer program product containing instructions is provided; when the computer program product is run on a computer, the computer is caused to execute the method in the above first aspect.
  • In a tenth aspect, a computer program product containing instructions is provided; when the computer program product is run on a computer, the computer is caused to execute the method in the above second aspect.
  • In an eleventh aspect, a chip is provided, including a processor and a communication interface.
  • the processor reads instructions stored in a memory through the communication interface and executes the method in the first aspect.
  • Optionally, the chip may further include a memory in which instructions are stored; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to execute the method in the first aspect.
  • In a twelfth aspect, a chip is provided, including a processor and a communication interface.
  • the processor reads instructions stored in a memory through the communication interface, and executes the method in the second aspect.
  • Optionally, the chip may further include a memory in which instructions are stored; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to execute the method in the second aspect.
  • In a thirteenth aspect, an electronic device is provided, and the electronic device includes the sensor calibration device in the third aspect.
  • In a fourteenth aspect, an electronic device is provided, and the electronic device includes the sensor calibration device in the fourth aspect.
  • In a fifteenth aspect, the present application provides a calibration device, comprising one or more lidar calibration areas, where each lidar calibration area includes two non-parallel sides, and the intersection of the straight lines on which the two non-parallel sides lie is used to calibrate the external parameters of the lidar.
  • the lidar calibration area includes at least two non-parallel sides, and the intersection of the straight lines where these non-parallel sides are located can be used as lidar calibration points to achieve lidar calibration.
  • Optionally, the first side of a first lidar calibration area and the first side of a second lidar calibration area are located on a first straight line, and the second side of the first lidar calibration area and the second side of the second lidar calibration area are located on a second straight line, where the first side and the second side of the first lidar calibration area are not parallel, and the first side and the second side of the second lidar calibration area are not parallel.
  • In this way, edges of multiple lidar calibration areas are located on the same straight line, so that the straight line can be fitted from the scan points near all of these edges. This improves the accuracy of the fitted straight line and thus the accuracy of the lidar calibration points.
  • Optionally, the first intersection point of the first straight line and the second straight line is the first vertex of the first lidar calibration area, and the first intersection point is also the first vertex of the second lidar calibration area.
  • In this way, multiple lidar calibration areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration board more compact. This makes it possible to lay out more calibration areas on a calibration board of the same size, or to lay out the same calibration areas on a smaller calibration board.
  • Optionally, the first side of a third lidar calibration area and the second side of a fourth lidar calibration area are located on a third straight line; the second side of the third lidar calibration area and the first side of the fourth lidar calibration area are located on a fourth straight line; the first side and the second side of the third lidar calibration area are not parallel; and the first side and the second side of the fourth lidar calibration area are not parallel.
  • Optionally, the first straight line is not parallel to the third straight line, and the second straight line is not parallel to the fourth straight line.
  • Because the first straight line is not parallel to the third straight line and the second straight line is not parallel to the fourth straight line, more intersection points are obtained for the same number of lidar calibration areas, which adds more lidar calibration points and can improve the calibration accuracy of the lidar.
  • Optionally, the second intersection point of the second straight line and the fourth straight line is the second vertex of the first lidar calibration area, and the second intersection point is also the first vertex of the third lidar calibration area; the third intersection point of the third straight line and the fourth straight line is the second vertex of the third lidar calibration area, and the third intersection point is also the first vertex of the fourth lidar calibration area; the fourth intersection point of the first straight line and the third straight line is the second vertex of the fourth lidar calibration area, and the fourth intersection point is also the second vertex of the second lidar calibration area.
  • In this way, multiple lidar calibration areas share the same lidar calibration points as vertices, which makes the layout of the lidar calibration areas on the calibration board more compact. This makes it possible to lay out more calibration areas on a calibration board of the same size, or to lay out the same calibration areas on a smaller calibration board.
  • the shape of the lidar calibration area is a triangle, a quadrilateral, or the like.
  • Optionally, the device further includes a plurality of camera calibration areas, where the positional relationship between any one of the plurality of camera calibration areas and its adjacent camera calibration areas is different from the positional relationship between any other one of the plurality of camera calibration areas and its adjacent camera calibration areas.
  • The device in this implementation makes it possible to calibrate the lidar and the camera simultaneously.
  • The device in this implementation also makes it possible, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, to determine from the positional relationships between the feature points which feature points on the calibration board appear in the captured image, so that the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined.
  • This makes it possible to use the calibration board to calibrate the camera without limiting the distance between the camera and the calibration board and without moving the calibration board; that is, automatic calibration of the camera can be realized.
  • the shape of the calibration area of the camera is a circle, a triangle, a quadrilateral, or the like.
  • In a sixteenth aspect, the present application provides a calibration device, comprising a plurality of camera calibration areas, where the positional relationship between any one of the plurality of camera calibration areas and its adjacent camera calibration areas is different from the positional relationship between any other one of the plurality of camera calibration areas and its adjacent camera calibration areas.
  • This calibration device makes it possible to calibrate the lidar and the camera simultaneously.
  • The device in this implementation makes it possible, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, to determine from the positional relationships between the feature points which feature points on the calibration board appear in the captured image, so that the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined.
  • This makes it possible to use the calibration board to calibrate the camera without limiting the distance between the camera and the calibration board and without moving the calibration board; that is, automatic calibration of the camera can be realized.
  • the shape of the calibration area of the camera is a circle, a triangle, a quadrilateral, or the like.
  • the present application provides a sensor calibration system, which includes the sensor calibration device in the third aspect or the fifth aspect, and the calibration device in the fifteenth aspect.
  • the present application provides a sensor calibration system, which includes the sensor calibration device in the fourth aspect or the sixth aspect, and the calibration device in the sixteenth aspect.
  • the present application provides a sensor calibration system, which includes the sensor calibration device in the fourth aspect and the calibration device in the sixteenth aspect.
  • Fig. 1 is a schematic diagram of an application scenario of a technical solution of an embodiment of the present application.
  • Fig. 2 is a schematic structural diagram of a calibration device according to an embodiment of the present application.
  • Fig. 3 is a schematic flowchart of a sensor calibration method according to an embodiment of the present application.
  • Fig. 4 is a schematic structural diagram of a calibration device according to another embodiment of the present application.
  • Fig. 5 is a schematic structural diagram of a calibration device according to another embodiment of the present application.
  • Fig. 6 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 7 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 8 is a schematic structural diagram of a calibration device according to another embodiment of the present application.
  • Fig. 9 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 10 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 11 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 12 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 13 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 14 is a schematic flowchart of a sensor calibration method according to another embodiment of the present application.
  • Fig. 15 is a schematic structural diagram of a sensor calibration device according to an embodiment of the present application.
  • Fig. 16 is a schematic deployment diagram of a sensor calibration device according to another embodiment of the present application.
  • Fig. 17 is a schematic deployment diagram of a sensor calibration device according to another embodiment of the present application.
  • Fig. 18 is a schematic structural diagram of a computing device according to another embodiment of the present application.
  • the smart device in the embodiments of the present application refers to any device, appliance, or machine with computing and processing capabilities.
  • the smart devices in the embodiments of the present application may be robots, autonomous vehicles, smart assisted driving vehicles, unmanned aerial vehicles, smart assisted aircraft, smart home devices, and so on.
  • This application does not impose any limitation on the smart device, as long as it is a device that can be installed with lidar and/or camera, it can be included in the scope of the smart device of this application.
  • Fig. 1 is a schematic diagram of an application scenario of a technical solution of an embodiment of the present application.
  • This scene can include a car, a front calibration board, and four sets of binocular cameras.
  • The four pairs of binocular cameras form a four-wheel vision positioning system, and each pair of binocular cameras is responsible for detecting the three-dimensional coordinates of the center of one wheel in the world coordinate system.
  • Based on the three-dimensional coordinates of the four wheel centers, the car coordinate system can be established, and the position and attitude (i.e., the pose) of the car coordinate system relative to the world coordinate system can be determined.
  • the front calibration board and the binocular camera are respectively fixed at the corresponding positions.
  • a multi-line lidar is installed in the front of the vehicle.
  • FIG. 2 is a schematic diagram of a calibration plate 200 according to an embodiment of the application.
  • The triangular areas are made of a material that reflects infrared light in the lidar's working frequency band, and the background is made of black light-absorbing material.
  • these four connected triangular areas are referred to as a set of calibration patterns.
  • the calibration board in the embodiment of the present application can have more or fewer groups of calibration patterns, and FIG. 2 only shows three groups as an example.
  • the triangular area on the upper left is called the first area
  • the triangular area on the lower left is called the second area
  • the triangular area on the upper right is called the third area
  • the triangular area on the lower right is called the fourth area.
  • a vertex of the first region and a vertex of the second region are the same point.
  • this point is referred to as point A.
  • An edge of the first area and an edge of the second area are on the same straight line; the edge of the first area is called the first side of the first area, the edge of the second area is called the first side of the second area, and this straight line is called the first straight line.
  • The other side of the first area is on the same straight line as the other side of the second area; this side of the first area is called the second side of the first area, this side of the second area is called the second side of the second area, and this straight line is called the second straight line.
  • the other vertex of the first region and a vertex of the third region are the same point.
  • the same point is referred to as point B.
  • a vertex of the third region and a vertex of the fourth region are the same point.
  • this point is referred to as point C.
  • An edge of the third area is located on the same straight line as an edge of the fourth area; the edge of the third area is called the first side of the third area, the edge of the fourth area is called the second side of the fourth area, and this straight line is called the third straight line.
  • The other side of the third area is on the same straight line as the other side of the fourth area; this side of the third area is called the second side of the third area, this side of the fourth area is called the first side of the fourth area, and this straight line is called the fourth straight line.
  • The other vertex of the second area is the same point as a vertex of the fourth area; this point is referred to as point D.
  • the method shown in FIG. 3 may include S310 to S390. It should be understood that these steps or operations are only examples. The technical solution proposed in this application may include more or fewer steps or operations, or may perform a modification of each operation in FIG. 3.
  • S310 Pre-store the three-dimensional coordinates of the characteristic points on the calibration board in the world coordinate system.
  • Specifically, points A, B, C, and D in the calibration patterns on the calibration board shown in Figure 2 are used as feature points, and the three-dimensional coordinates in the world coordinate system of the multiple sets of points A, B, C, and D on the calibration board are pre-stored.
  • the three-dimensional coordinates of these points in the world coordinate system can be measured by a total station.
  • S320 Detect whether the vehicle has entered the designated area, if yes, execute S330, otherwise execute S340.
  • the four-wheel vision positioning system can be used to capture images of four wheels, and based on the images, it is determined whether the vehicle has entered the designated area. If the location of the vehicle meets the calibration requirements, it is determined that the vehicle has entered the designated area.
  • In S330, the car coordinate system is established according to the three-dimensional coordinates of the four wheel centers in the world coordinate system, and the pose of the car coordinate system relative to the world coordinate system is solved, as sketched below.
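  • A hedged sketch of this step (the patent does not fix a convention; the axis choices below are assumptions): with the four wheel centers measured in the world frame, take the centroid as the origin of the car frame, the direction from the rear-axle midpoint to the front-axle midpoint as the x-axis, and build a right-handed frame from it.

```python
import numpy as np

def car_pose_from_wheels(fl, fr, rl, rr):
    """fl, fr, rl, rr: world coordinates (3,) of the front-left, front-right,
    rear-left and rear-right wheel centers.
    Returns (R, t): pose of the car frame relative to the world frame."""
    x = (fl + fr) / 2.0 - (rl + rr) / 2.0          # forward axis
    x /= np.linalg.norm(x)
    y_raw = (fl + rl) / 2.0 - (fr + rr) / 2.0      # roughly leftward
    z = np.cross(x, y_raw)                         # up axis (right-handed frame)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                             # exact left axis
    R = np.column_stack([x, y, z])                 # car axes expressed in world frame
    t = (fl + fr + rl + rr) / 4.0                  # origin at the centroid
    return R, t
```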
  • S350 Determine the three-dimensional coordinates of the scanning point in the lidar coordinate system.
  • the lidar emits a laser beam, receives the reflected signal, and determines the three-dimensional coordinates of the scanning point in the lidar coordinate system based on the reflected signal.
  • the three-dimensional coordinates of the multiple scanning points under the lidar coordinates constitute a three-dimensional point coordinate set.
  • Specifically, the lidar emits multiple laser scan lines outward, and the spatial coordinates in the lidar coordinate system of the scan points formed where each laser scan line intersects the calibration board, i.e., their three-dimensional coordinates, are measured.
  • S360 Determine the three-dimensional coordinates of the feature point in the laser radar coordinate system according to the three-dimensional coordinates of the scanning point in the laser radar coordinate system.
  • Specifically, the expressions of the first straight line, the second straight line, the third straight line, and the fourth straight line are fitted according to the three-dimensional point coordinate set; the three-dimensional coordinates of point A, point B, point C, and point D in the lidar coordinate system are determined according to the fitted expressions; and the external parameters of the lidar are calibrated according to the three-dimensional coordinates of points A, B, C, and D in the lidar coordinate system and the three-dimensional coordinates of points A, B, C, and D in the world coordinate system.
  • the three-dimensional coordinates of the start scan point and the end scan point of each scan line in each area are determined from the three-dimensional point coordinate set of the scan point in the lidar coordinate system.
  • the three-dimensional coordinates of points marked by four-pointed stars, five-pointed stars, hexagons, and seven-pointed stars are determined from the three-dimensional point coordinate set.
  • The expression of the first straight line is fitted according to the three-dimensional coordinates in the lidar coordinate system of the four points marked by four-pointed stars (specifically, the parameters in the expression of the first straight line are obtained by fitting); the expression of the second straight line is fitted according to the three-dimensional coordinates in the lidar coordinate system of the four points marked by five-pointed stars; the expression of the third straight line is fitted according to the three-dimensional coordinates in the lidar coordinate system of the four points marked by seven-pointed stars; and the expression of the fourth straight line is fitted according to the three-dimensional coordinates in the lidar coordinate system of the four points marked by hexagons.
  • Optionally, each straight line can be fitted by the least squares method, or a linear fitting method for scattered three-dimensional data in space can be used to fit the expression of each straight line; one common such fit is sketched below.
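  • A minimal sketch of one such least-squares fit (an assumption, not the patent's prescribed method): the line passes through the centroid of the scan points near an edge, with its direction given by the principal component of the centered points (total least squares via SVD).

```python
import numpy as np

def fit_line_3d(points):
    """points: (N, 3) array of scan-point coordinates near one edge.
    Returns (p0, d): a point on the fitted line and its unit direction."""
    p0 = points.mean(axis=0)
    # The right-singular vector with the largest singular value is the
    # direction that minimizes the summed squared distances to the line.
    _, _, vt = np.linalg.svd(points - p0)
    return p0, vt[0]
```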
  • The three-dimensional coordinates of point A in the lidar coordinate system are solved according to the fitted expressions of the first straight line and the second straight line; the three-dimensional coordinates of point B in the lidar coordinate system are solved according to the fitted expressions of the second straight line and the fourth straight line; the three-dimensional coordinates of point C in the lidar coordinate system are solved according to the fitted expressions of the third straight line and the fourth straight line; and the three-dimensional coordinates of point D in the lidar coordinate system are solved according to the fitted expressions of the third straight line and the first straight line.
  • the three-dimensional coordinates of multiple sets of points A, B, C, and D in the lidar coordinate system can be determined.
  • S370 Perform external parameter calibration on the lidar according to the three-dimensional coordinates of the feature point in the lidar coordinate system and the three-dimensional coordinates of the feature point in the world coordinate system.
  • Specifically, the pose of the lidar in the world coordinate system is determined according to the three-dimensional coordinates of the feature points in the world coordinate system and in the lidar coordinate system; and according to the pose of the lidar in the world coordinate system (together with the pose of the car coordinate system relative to the world coordinate system), the pose of the lidar in the car coordinate system is determined, that is, the external parameter calibration of the lidar is realized. One standard way to solve the pose from such point correspondences is sketched below.
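  • The patent does not prescribe a solver for this 3D-to-3D step; a common choice, shown here as an assumption, is the SVD-based least-squares rigid alignment (the Kabsch algorithm) applied to the corresponding coordinates of points A, B, C, and D in the lidar frame and in the world frame.

```python
import numpy as np

def rigid_transform(src, dst):
    """Find R, t such that dst ~= R @ src + t. src, dst: (N, 3) arrays of
    corresponding points (e.g., A, B, C, D in the lidar and world frames)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# With src = lidar-frame coordinates and dst = world-frame coordinates, (R, t)
# is the pose of the lidar in the world frame; composing it with the car pose
# from S330 yields the pose of the lidar in the car coordinate system.
```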
  • In this way, the accuracy of the straight line expressions obtained by fitting can be improved, which improves the accuracy of the three-dimensional coordinates of the feature points solved from those expressions, and this in turn can improve the accuracy of the external parameters of the lidar.
  • the relative pose of the lidar and other sensors can also be determined according to the external parameters of the lidar.
  • the relative pose between the lidar and the front camera is solved.
  • the reflective area used to calibrate the external parameters of the lidar may have other shapes, for example, it may be a quadrilateral.
  • the scene shown in FIG. 1 is only an example, and the scene where the sensor calibration method shown in FIG. 3 can be applied may include more or less devices.
  • the car coordinate system can be established in other ways.
  • a front camera may also be installed on the vehicle.
  • If the internal and external parameters of the front camera have been calibrated, the external parameters of the lidar can be calibrated according to the internal and external parameters of the camera; if the internal and external parameters of the front camera have not been calibrated, the camera's internal parameters, the camera's external parameters, and the lidar's external parameters can be calibrated simultaneously.
  • the internal parameters of the camera refer to the parameters related to the characteristics of the camera itself, which are determined by the installation position of the optical lens and the photoelectric sensor in the camera.
  • The internal parameters of the camera may include: focal length, pixel size, optical distortion, white balance parameters, resolution, contrast, vignetting, and so on.
  • Camera external parameters refer to the parameters of the camera in the world coordinate system, which are determined by the position of the camera in the world coordinate system and include, for example, the position and rotation direction of the camera in the world coordinate system.
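  • For reference, a minimal sketch of how the internal parameters (focal length and principal point, collected in a matrix K) and the external parameters (R, t) relate a world point to its pixel coordinates under the pinhole model; the numeric values are illustrative assumptions and distortion is omitted.

```python
import numpy as np

def project(K, R, t, X_world):
    """Project a 3D world point to pixel coordinates with the pinhole model."""
    X_cam = R @ X_world + t          # world frame -> camera frame (external parameters)
    u, v, w = K @ X_cam              # camera frame -> image plane (internal parameters)
    return np.array([u / w, v / w])  # perspective division

fx = fy = 1000.0                     # focal length in pixels (assumed)
cx, cy = 640.0, 360.0                # principal point (assumed)
K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
print(project(K, np.eye(3), np.zeros(3), np.array([0.2, 0.1, 5.0])))
```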
  • the triangular area in the calibration board 500 shown in FIG. 5 has the same meaning as the triangular area in the calibration board shown in FIG. 2 and will not be repeated here.
  • The circular areas in the calibration board 500 can be made of white reflective material for camera imaging; the background is made of black light-absorbing material.
  • the positional relationship between any circular area and its adjacent circular area in the calibration plate 500 is different from the positional relationship between another arbitrary circular area and its adjacent circular area.
  • the method shown in FIG. 6 may include S610 to S694. It should be understood that these steps or operations are only examples. The technical solution proposed in the present application may include more or fewer steps or operations, or may perform a modification of each operation in FIG. 6.
  • S610 Pre-store the positional relationship between the feature points on the calibration board, the internal parameters of the camera and the external parameters.
  • the camera external parameter is the pose of the camera in the car coordinate system.
  • Specifically, point A, point B, point C, point D, and the center points of the circular areas in the calibration patterns on the calibration board shown in Figure 5 are used as feature points, and the positional relationships between points A, B, C, D and the center points of the circular areas on the calibration board are pre-stored.
  • For example, the relative positions between point A, point B, point C, point D, and the center points of the circular areas can be pre-stored.
  • Point A, point B, point C, and point D are referred to as intersection feature points, and the center points of the circular areas are referred to as center feature points.
  • S620 Detect whether the vehicle has entered the designated area, if yes, execute S630, otherwise execute S640. Refer to S320 for this step.
  • S630 Detect the wheel center through the four-wheel visual positioning system to establish the car coordinate system. Refer to S330 for this step.
  • S650 Determine the three-dimensional coordinates of the scanning point in the lidar coordinate system. For this step, refer to S350, which will not be repeated here.
  • S660 Determine the three-dimensional coordinates of the intersection feature point in the laser radar coordinate system according to the three-dimensional coordinates of the scanning point in the laser radar coordinate system.
  • S670 Capture the calibration pattern with the camera to obtain an image containing multiple circular areas, and fit each circular area in the image to obtain the two-dimensional coordinates of the center feature points in the image coordinate system.
  • S680 Determine the three-dimensional coordinates of the intersection feature point in the camera coordinate system according to the two-dimensional coordinates of the center feature point in the image coordinate system and the position relationship between the feature points.
  • Specifically, the two-dimensional coordinates of the intersection feature points in the image coordinate system are determined from the two-dimensional coordinates of the center feature points in the image coordinate system and the positional relationships between the feature points, and these two-dimensional coordinates are then converted into three-dimensional coordinates in the camera coordinate system, for example as sketched below.
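  • One plausible implementation of this conversion (a sketch under assumptions, not the patent's prescribed method, and it requires the camera's internal parameters): match the detected circle centers to their known coordinates on the board via the positional relationships, recover the board's pose relative to the camera with a PnP solver, and then map the known board-frame coordinates of the intersection feature points into the camera frame.

```python
import cv2
import numpy as np

def intersection_points_in_camera(circle_xy, circle_xyz_board,
                                  corner_xyz_board, K, dist):
    """circle_xy: (N, 2) detected circle centers in the image.
    circle_xyz_board: (N, 3) their known coordinates in the board frame.
    corner_xyz_board: (M, 3) board-frame coordinates of A, B, C, D.
    K: 3x3 intrinsic matrix; dist: distortion coefficients.
    Returns (M, 3) coordinates of A, B, C, D in the camera frame."""
    ok, rvec, tvec = cv2.solvePnP(circle_xyz_board.astype(np.float64),
                                  circle_xy.astype(np.float64), K, dist)
    assert ok, "PnP failed"
    R, _ = cv2.Rodrigues(rvec)        # rotation vector -> rotation matrix
    return (R @ corner_xyz_board.T).T + tvec.reshape(1, 3)
```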
  • S690 Perform external parameter calibration on the lidar according to the three-dimensional coordinates of the intersection feature point in the camera coordinate system and the three-dimensional coordinates of the intersection feature point in the lidar coordinate system.
  • Specifically, the relative pose of the lidar and the camera is determined according to the three-dimensional coordinates of the intersection feature points in the camera coordinate system and in the lidar coordinate system; and the pose of the lidar in the car coordinate system is determined according to the external parameters of the camera and the relative pose between the lidar and the camera, that is, the calibration of the external parameters of the lidar is realized.
  • the reflective area used to calibrate the external parameters of the lidar can be other shapes, for example, it can be a quadrilateral; the reflective area used for camera imaging can be other shapes, such as triangles, Quadrilateral or checkerboard, etc.
  • Because the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, it can be determined from the positional relationships between the feature points which feature points on the calibration board appear in the captured image, and the three-dimensional coordinates of the intersection feature points can then be determined from these positional relationships. This makes it possible to realize automatic calibration of the external parameters of the lidar without limiting the distance between the camera and the calibration board when the camera shoots the calibration board.
  • the method shown in FIG. 7 may include S710 to S796. It should be understood that these steps or operations are only examples. The technical solution proposed in this application may include more or fewer steps or operations, or may perform a modification of each operation in FIG. 7.
  • S710 Pre-store the three-dimensional coordinates of the feature points on the calibration board in the world coordinate system and the position relationship between the feature points.
  • Specifically, point A, point B, point C, point D, and the center points of the circular areas in the calibration patterns on the calibration board shown in Figure 5 are used as feature points; the three-dimensional coordinates in the world coordinate system of points A, B, C, D and the center points of the circular areas on the calibration board are pre-stored, as are the positional relationships between the center points of the circular areas.
  • Point A, point B, point C, and point D are referred to as intersection feature points, and the center points of the circular areas are referred to as center feature points.
  • S720 Detect whether the vehicle has entered the designated area, if yes, execute S730, otherwise execute S740. Refer to S320 for this step.
  • S730 Detect the wheel center through the four-wheel visual positioning system, and establish the car coordinate system. Refer to S330 for this step.
  • S750 Determine the three-dimensional coordinates of the scanning point in the lidar coordinate system. Refer to S350 for this step.
  • S760 Determine the three-dimensional coordinates of the feature point of the intersection in the laser radar coordinate system according to the three-dimensional coordinates of the scanning point in the laser radar coordinate system. For this step, refer to S360, which will not be repeated here.
  • S770 Perform external parameter calibration on the lidar according to the three-dimensional coordinates of the intersection feature point in the lidar coordinate system and the three-dimensional coordinates of the intersection feature point in the world coordinate system. Refer to S370 for this step.
  • S780 Obtain the two-dimensional coordinates of the center feature point in the image coordinate system through the image taken by the camera. Refer to S670 for this step.
  • the image captured by the camera may include the entire calibration board or part of the calibration board.
  • S790 Perform internal and external parameter calibration of the camera according to the two-dimensional coordinates of the center feature point in the image coordinate system and the three-dimensional coordinates of the center feature point in the world coordinate system.
  • Specifically, from the positional relationships between the center feature points, it is determined which feature points on the calibration board were photographed by the camera; the three-dimensional coordinates of these feature points in the world coordinate system are then selected from the pre-stored three-dimensional coordinates; and the internal and external parameters of the camera are calibrated based on the coordinates of these feature points in the image and their three-dimensional coordinates in the world coordinate system.
  • the implementation of calibrating the internal and external parameters of the camera can refer to the prior art, which will not be repeated here.
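  • As one concrete instance of such prior art (an assumption; the patent defers to existing techniques here), OpenCV's standard routine can recover the internal parameters and the per-view external parameters from matched 2D-3D correspondences; its reprojection error can also drive the retry loop described in the next step.

```python
import cv2
import numpy as np

def calibrate(obj_points_per_view, img_points_per_view, image_size):
    """obj_points_per_view: list of (N_i, 3) float32 arrays of feature-point
    coordinates on the board plane (z = 0).
    img_points_per_view: list of (N_i, 2) float32 arrays of the matched
    pixel coordinates. image_size: (width, height)."""
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points_per_view, img_points_per_view, image_size, None, None)
    return rms, K, dist, rvecs, tvecs  # rms: overall reprojection error
```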
  • The calibration accuracy and stability of the camera and lidar are very important to the automatic driving algorithm. Therefore, if the calibration errors of the camera and lidar do not meet the requirements, the process falls back to S720 and executes S720 to S792 again, until the calibration errors of the camera and lidar meet the requirements.
  • the relative pose between the camera and the lidar can be determined according to the external parameters of the camera and the external parameters of the lidar.
  • the relative pose between the camera and other sensors and the relative pose between the laser radar and other sensors can also be determined according to the external parameters of the camera and the external parameters of the lidar.
  • the reflective area used to calibrate the external parameters of the lidar can be other shapes, for example, it can be a quadrilateral; the reflective area used for camera imaging can be other shapes, such as triangles, Quadrilateral or checkerboard, etc.
  • the positional relationship between any feature point and the adjacent feature point is different from the positional relationship between another arbitrary feature point and the adjacent feature point, it does not matter whether the camera captures the entire calibration board
  • All the patterns on the calibration board, or some of the patterns on the calibration board can be determined according to the positional relationship between the feature points, which feature points in the image captured by the camera are the feature points on the calibration board, which can be determined from the pre-stored three-dimensional coordinates
  • the three-dimensional coordinates of these feature points in the world coordinate system can be obtained, and then the internal and external parameters of the camera can be determined.
  • A schematic diagram of the calibration pattern of the calibration board is shown in Fig. 8. This scenario can be used to calibrate the front camera on the vehicle.
  • The circular areas in the calibration board shown in Fig. 8 can be made of white reflective material for camera imaging; the background is made of black light-absorbing material. The positional relationship between any one circular area and its adjacent circular areas is different from the positional relationship between any other circular area and its adjacent circular areas.
  • the method shown in FIG. 9 may include S910 to S980. It should be understood that these steps or operations are only examples. The technical solution proposed in this application may include more or fewer steps or operations, or may perform a modification of each operation in FIG. 9.
  • S910 Pre-store the three-dimensional coordinates of the feature points on the calibration board in the world coordinate system and the positional relationships between the feature points.
  • Specifically, with the center points of the circular areas on the calibration board shown in Fig. 8 used as feature points, the three-dimensional coordinates of these center points in the world coordinate system are pre-stored.
  • the center point of the circular area is referred to as the center feature point.
  • S930 Detect the wheel center through the four-wheel visual positioning system, and establish the car coordinate system. Refer to S330 for this step.
  • S960 Perform internal and external parameter calibration of the camera according to the two-dimensional coordinates of the center feature points in the image coordinate system and the three-dimensional coordinates of the center feature points in the world coordinate system. For this step, refer to S790.
  • The calibration accuracy and stability of the camera are very important to the automatic driving algorithm. Therefore, if the calibration error of the camera does not meet the requirements, the process returns to S920 and executes S920 to S970 again, until the calibration error meets the requirements.
  • the relative pose between the camera and other sensors can be determined according to the external parameters of the camera.
• It can be understood that the reflective area used to calibrate the external parameters of the lidar can be of other shapes, for example, a quadrilateral; the reflective area used for camera imaging can also be of other shapes, such as a triangle, a quadrilateral, or a checkerboard.
• Since the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points, it does not matter whether the camera captures all of the patterns on the calibration board or only some of them: the feature points in the image captured by the camera can be matched to specific feature points on the calibration board according to these positional relationships, and the three-dimensional coordinates of the intersection feature points in the lidar coordinate system can then be determined from the positional relationships between these feature points. This makes it possible to calibrate the camera with the calibration board without restricting the distance between the camera and the calibration board; that is, automatic calibration of the camera can be realized.
• FIG. 10 is a schematic flowchart of a sensor calibration method provided by this application. The method may include S1010 to S1040.
• S1010: Acquire a first point coordinate set, where the first point coordinate set includes the three-dimensional coordinates, in the lidar coordinate system, of scan points in the lidar calibration area on the calibration board, and the lidar calibration area includes two non-parallel sides.
• Here, the lidar is a multi-line lidar, and its multiple scan lines may refer to all of the lidar's scan lines or only part of them.
  • the calibration board may include one or more sets of calibration patterns, wherein each set of calibration patterns may include one or more lidar calibration areas.
  • the lidar calibration area is the reflective area used to calibrate the lidar.
• For example, the lidar calibration area is made of infrared reflective material in the working frequency band of the lidar.
  • the lidar area can be of any shape.
• The lidar area includes multiple sides, at least two of which are non-parallel; in other words, the straight lines on which at least two of the sides lie intersect.
  • the lidar area is triangular or quadrilateral.
• When there are multiple sets of calibration patterns on the calibration board, these sets can be the same or different.
• When a calibration pattern includes multiple lidar calibration areas, the shapes of these lidar calibration areas may be the same or different.
  • the calibration board includes multiple sets of calibration patterns, and an example in which each set of calibration patterns includes multiple lidar calibration areas is shown in FIG. 2 or FIG. 5.
  • the multiple scan lines of the lidar scan the lidar area on the calibration board to generate multiple scan points.
  • the set of three-dimensional coordinates of these multiple scanning points in the lidar coordinate system is called the first point coordinate set.
• The origin of the lidar coordinate system is usually the center of the lidar, and the x-axis of the lidar coordinate system usually points in the direction opposite to the lidar's output cable; if the lidar is installed facing directly in front of the car, the y-axis of the lidar coordinate system usually points to the left side of the car, and the z-axis usually points to the sky.
  • S1020 Perform straight line fitting according to the first point coordinate set to obtain expressions of multiple straight lines, the multiple straight lines including the straight lines where the two non-parallel sides are located.
• When performing straight-line fitting according to the first point coordinate set, the three-dimensional coordinates of the start scan point and the end scan point of each scan line, as it scans the lidar calibration area between the two non-parallel sides, can be selected. Since multiple scan lines scan the lidar calibration area, the three-dimensional coordinates of multiple start scan points and multiple end scan points are selected. Straight-line fitting is then performed according to the three-dimensional coordinates of these start and end scan points.
• For example, when a set of calibration patterns on the calibration board includes the first area shown in FIG. 4 and the lidar scans from the first side to the second side of the first area, the scan points marked by five-pointed stars are the start scan points and the scan points marked by four-pointed stars are the end scan points. In this case, the expression of the first straight line is fitted according to the three-dimensional coordinates, in the lidar coordinate system, of the two points marked by four-pointed stars in the first area, and the expression of the second straight line is fitted according to the three-dimensional coordinates of the two points marked by five-pointed stars in the first area.
• As another example, when a set of calibration patterns on the calibration board includes the first area and the second area shown in FIG. 4 and the scanning direction of the lidar is from the first side to the second side of the first area: in the first area, the scan points marked by five-pointed stars are the start scan points and the scan points marked by four-pointed stars are the end scan points; in the second area, the points marked by five-pointed stars are the end scan points and the points marked by four-pointed stars are the start scan points. In this case, the expression of the first straight line is fitted according to the three-dimensional coordinates, in the lidar coordinate system, of the four points marked by four-pointed stars in the first and second areas, and the expression of the second straight line is fitted according to the three-dimensional coordinates of the four points marked by five-pointed stars.
• In this implementation, since more points are used to fit the same straight line, the accuracy of the fitted expressions can be improved; this improves the accuracy of the coordinates, in the lidar coordinate system, of the lidar feature points solved from the expressions, and further improves the accuracy of the lidar calibration results.
• As a further example, when a set of calibration patterns on the calibration board includes the first, second, third, and fourth areas shown in FIG. 4 and the scanning direction of the lidar is from the first side of the first area to the second side: in the first area, the scan points marked by five-pointed stars are the start scan points and the scan points marked by four-pointed stars are the end scan points; in the second area, the points marked by five-pointed stars are the end scan points and the points marked by four-pointed stars are the start scan points; in the third area, the points marked by seven-pointed stars are the start scan points and the points marked by six-pointed stars are the end scan points; in the fourth area, the points marked by six-pointed stars are the start scan points and the points marked by seven-pointed stars are the end scan points. In this case, the expression of the first straight line is fitted according to the three-dimensional coordinates, in the lidar coordinate system, of the four points marked by four-pointed stars in the first and second areas; the expression of the second straight line according to the four points marked by five-pointed stars in the first and second areas; the expression of the third straight line according to the four points marked by seven-pointed stars in the third and fourth areas; and the expression of the fourth straight line according to the four points marked by six-pointed stars in the third and fourth areas. With a certain number of additional scan points, this implementation adds more intersection feature points, which can further improve the accuracy of the lidar calibration results.
  • the implementation manner of performing straight line fitting according to the first point coordinate set is not limited to the above manner.
• For example, when each scan line scans the lidar calibration area between the two non-parallel sides, the average z-axis coordinate of all its scan points can be used as the z coordinate of the start scan point and the end scan point; alternatively, the coordinate spacing of all these scan points along the y-axis can be calculated, and the y coordinate of the start or end scan point can be obtained by adding or subtracting a certain number of coordinate spacings to or from the y coordinate of the middle scan point.
• In this step, after the three-dimensional coordinates of the start and end scan points in the lidar coordinate system are determined, the straight-line fitting according to these coordinates can be implemented in a variety of ways, for example, by least-squares three-dimensional line fitting or by linear fitting of spatial three-dimensional scattered point data.
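• As a concrete illustration, the following is a minimal sketch of such a least-squares three-dimensional line fit, assuming an SVD-based formulation with NumPy; the function name fit_line_3d and the sample coordinates are hypothetical, and the patent leaves the specific fitting method open.

```python
import numpy as np

def fit_line_3d(points: np.ndarray):
    """Fit a 3D line to an (N, 3) array of scan-point coordinates.

    Returns (centroid, direction): a point on the line and a unit direction
    vector, so the fitted line is p(t) = centroid + t * direction.
    """
    centroid = points.mean(axis=0)
    # The principal direction of the centered points is the first right
    # singular vector of the centered coordinate matrix.
    _, _, vt = np.linalg.svd(points - centroid)
    direction = vt[0] / np.linalg.norm(vt[0])
    return centroid, direction

# Example: start scan points of several scan lines (e.g. the points marked by
# four-pointed stars in FIG. 4), expressed in the lidar coordinate system.
start_points = np.array([
    [5.01, 0.42, 0.10],
    [5.00, 0.31, 0.35],
    [4.99, 0.20, 0.60],
    [5.02, 0.09, 0.85],
])
p1, d1 = fit_line_3d(start_points)  # expression of the first straight line
```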
  • S1030 Estimate the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system according to the expressions of the multiple straight lines.
• A lidar calibration point is a point on the calibration board that is used to calibrate the external parameters of the lidar.
  • the lidar calibration point includes the intersection of the straight lines where the two non-parallel sides of the lidar calibration area are located. Therefore, the Lidar calibration point can also be called the intersection feature point.
• For example, when the calibration board includes the first area in FIG. 2, the lidar feature points include point A; when the calibration board includes the first, second, third, and fourth areas in FIG. 2, the lidar feature points include point A, point B, point C, and point D.
• Estimating the three-dimensional coordinates of the lidar feature points in the lidar coordinate system according to the fitted expressions may include: constructing systems of equations from the expressions and solving them; the obtained coordinates can be regarded as the three-dimensional coordinates of the lidar feature points in the lidar coordinate system.
• For example, when the calibration board includes the first area in FIG. 2, or the first and second areas in FIG. 2, solving the system of equations formed by the expressions of the first straight line and the second straight line yields the three-dimensional coordinates of point A in the lidar coordinate system.
• As another example, when the calibration board includes the first, second, third, and fourth areas in FIG. 2, solving the systems of equations formed by the expressions of the first, second, third, and fourth straight lines yields the three-dimensional coordinates of point A, point B, point C, and point D in the lidar coordinate system.
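• A short sketch of this solving step under the same assumptions: two straight lines fitted from noisy scan points rarely intersect exactly in 3D, so the midpoint of their closest-approach segment can be taken as the estimated lidar calibration point (e.g. point A). The function below consumes the (point, direction) line representation returned by fit_line_3d above.

```python
import numpy as np

def line_intersection_3d(p1, d1, p2, d2):
    """Estimate the 'intersection' of lines p1 + t*d1 and p2 + s*d2 as the
    midpoint of the shortest segment connecting them."""
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b          # near zero only for parallel lines
    t = (b * (d2 @ w) - c * (d1 @ w)) / denom
    s = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((p1 + t * d1) + (p2 + s * d2)) / 2.0

# point_a = line_intersection_3d(p1, d1, p2, d2)  # 3D coordinates of point A
```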
  • S1040 Calibrate the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
• In some possible designs, calibrating the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system may include: S1041: Calibrate the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system and the three-dimensional coordinates of the lidar calibration points in the world coordinate system.
  • An example is the method shown in Figure 3.
• The three-dimensional coordinates of the lidar calibration points in the world coordinate system may be measured in advance by a total station. For how to calibrate the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system and in the world coordinate system, reference may be made to the prior art; details are not repeated here.
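• A minimal sketch of S1041, assuming the extrinsic calibration is solved as a rigid 3D-3D alignment (the Kabsch/SVD method) between the calibration-point coordinates in the lidar coordinate system and in the world coordinate system; the patent defers the exact solver to the prior art, so this choice is an assumption.

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Find rotation R and translation t with dst ≈ R @ src + t,
    given matched (N, 3) point sets."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    h = (src - src_c).T @ (dst - dst_c)      # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:                 # guard against a reflection
        vt[-1] *= -1
        r = vt.T @ u.T
    t = dst_c - r @ src_c
    return r, t

# points_lidar: points A, B, C, D estimated in the lidar coordinate system
# points_world: the same points measured in advance by the total station
# r, t = rigid_transform(points_lidar, points_world)  # lidar -> world pose
```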
• In some possible designs, the method may also include S1050 and S1060.
• S1050: Acquire a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration points on the calibration board in the camera's image coordinate system. A camera calibration point is a point in a camera calibration area on the calibration board.
  • the camera and the laser radar are located on the same smart device.
  • the image captured by the camera may include all camera calibration areas on the calibration board, or may include part of the camera calibration areas on the calibration board.
  • the camera calibration area is the area used to calibrate the external parameters of the camera or the internal and external parameters of the camera.
  • the camera calibration area can be made of white reflective material.
  • the camera calibration area can be any shape, for example, it can be a circle, a triangle, or a checkerboard.
  • the calibration board usually includes multiple camera calibration areas, where the positional relationship between any one camera calibration area and its adjacent area is different from the positional relationship between any other camera calibration area and its adjacent area.
  • the camera calibration points are located in the camera calibration area. Therefore, the positional relationship between the camera calibration areas can also be understood as the positional relationship between the camera calibration points.
• For example, the positional relationship between the center point of any one camera calibration area and the center points of its adjacent areas is different from the positional relationship between the center point of any other camera calibration area and the center points of its adjacent areas.
  • S1060 Calibrate the camera according to the second point coordinate set, the position relationship between the camera calibration area, and the three-dimensional coordinates of the camera calibration point in the world coordinate system.
  • calibrating the camera includes calibrating the internal parameters and/or external parameters of the camera.
• In some possible implementations, the coordinates in the second point coordinate set are transferred from the image coordinate system to the camera coordinate system to obtain the three-dimensional coordinates of the camera calibration points in the camera coordinate system; according to the positional relationships between the camera calibration points, it is determined which calibration points on the calibration board the captured camera calibration points correspond to; and the external parameters of the camera are then calibrated according to the three-dimensional coordinates of these calibration points in the world coordinate system and in the camera coordinate system. For the specific implementation, reference may be made to the prior art.
  • the three-dimensional coordinates of the camera calibration point in the world coordinates can be measured by a total station, of course, it can also be measured in other ways, which is not limited in the embodiment of the present application.
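• As an illustration of this extrinsic step, the following is a minimal sketch using OpenCV's PnP solver; solvePnP is an assumed choice (the patent only refers to the prior art), and camera_matrix, dist_coeffs, and the sample correspondences are hypothetical placeholders for pre-calibrated intrinsics and matched points.

```python
import numpy as np
import cv2

# 3D world coordinates of matched camera calibration points (pre-stored /
# measured by the total station) and their 2D detections in the image.
world_points = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0],
                         [0.0, 0.2, 0.0], [0.2, 0.2, 0.0]], dtype=np.float64)
image_points = np.array([[320.5, 241.2], [402.8, 238.9],
                         [318.7, 330.4], [405.1, 327.6]], dtype=np.float64)

camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion here

ok, rvec, tvec = cv2.solvePnP(world_points, image_points,
                              camera_matrix, dist_coeffs)
# rvec/tvec encode the pose of the world frame in the camera frame; combined
# with the car coordinate system's pose this yields the camera extrinsics.
```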
  • the internal parameters of the camera can be pre-calibrated, or can be calibrated according to the calibration board.
• For example, the camera shoots the calibration board from multiple angles to obtain multiple images; according to the positional relationships between the camera calibration points, the coordinates of each camera calibration point can be determined in each of the multiple images; and the internal parameters of the camera are then calibrated according to the coordinates of the camera calibration points in the multiple images. For the specific implementation, reference may be made to the prior art.
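• A minimal sketch of this intrinsic calibration, assuming the per-image 2D coordinates of the camera calibration points have already been matched to the board layout via the positional relationships; using cv2.calibrateCamera (Zhang's method) is an assumed choice rather than something the patent mandates.

```python
import cv2

def calibrate_intrinsics(board_points, image_points_per_view, image_size):
    """board_points: (N, 3) float32 planar board coordinates (z = 0);
    image_points_per_view: list of (N, 2) float32 detections, one per image;
    image_size: (width, height) of the camera images."""
    object_points = [board_points] * len(image_points_per_view)
    rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        object_points, image_points_per_view, image_size, None, None)
    return camera_matrix, dist_coeffs, rms
```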
  • the camera is calibrated according to the positional relationship between the camera calibration areas, which can realize the automatic calibration of the camera.
• In other possible designs, the external parameters of the lidar can be calibrated according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system and the external parameters of the camera.
• In this case, S1040 may include S1042 to S1046.
  • S1042 Obtain a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point on the calibration board in the camera's image coordinate system.
  • This step can refer to S1050.
  • S1044 Calibrate the external parameters of the lidar according to the second point coordinate set, the internal and external parameters of the camera, and the positional relationship between the calibration area of the camera and the calibration area of the lidar.
• In some possible implementations, the three-dimensional coordinates of the lidar calibration points in the camera coordinate system of the camera are determined according to the second point coordinate set, the camera internal parameters, and the positional relationship between the camera calibration area and the lidar calibration area; the external parameters of the lidar are then calibrated according to the three-dimensional coordinates of the lidar calibration points in the camera coordinate system, the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system, and the camera external parameters.
• For example, the coordinates of the center points of the camera calibration areas in the image coordinate system may be determined according to the second point coordinate set; the coordinates of the lidar calibration points in the image coordinate system are determined according to the positional relationship between the camera calibration area and the lidar calibration area and the coordinates of the center points of the camera calibration areas in the image coordinate system; and the three-dimensional coordinates of the lidar calibration points in the camera coordinate system are then determined according to the coordinates of the lidar calibration points in the image coordinate system and the internal parameters of the camera.
• Alternatively, the coordinates of the center points of the camera calibration areas may first be converted into three-dimensional coordinates in the camera coordinate system; the three-dimensional coordinates of the lidar calibration points in the camera coordinate system are then determined according to the three-dimensional coordinates of the center points of the camera calibration areas in the camera coordinate system and the positional relationship between the camera calibration areas and the lidar calibration points.
• According to the three-dimensional coordinates of the lidar calibration points in the camera coordinate system and in the lidar coordinate system, the relative pose between the camera and the lidar is determined; according to this relative pose and the pose of the camera in the vehicle coordinate system (i.e., the camera external parameters), the pose of the lidar in the vehicle coordinate system (i.e., the lidar external parameters) is determined.
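• A minimal sketch of this last composition step, assuming 4x4 homogeneous pose matrices: the camera-lidar relative pose from the 3D-3D alignment is composed with the camera's pose in the vehicle coordinate system to obtain the lidar extrinsics.

```python
import numpy as np

def make_pose(r: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a rotation (3x3) and translation (3,) into a 4x4 pose matrix."""
    pose = np.eye(4)
    pose[:3, :3] = r
    pose[:3, 3] = t
    return pose

# T_vehicle_camera: camera extrinsics (camera frame -> vehicle frame)
# T_camera_lidar:   relative pose of the lidar in the camera frame, from the
#                   3D-3D alignment of the lidar calibration points
# T_vehicle_lidar = T_vehicle_camera @ T_camera_lidar  # lidar extrinsics
```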
  • FIG. 14 is a schematic flowchart of a sensor calibration method according to an embodiment of the application.
  • the sensor calibration method shown in FIG. 14 may include S1410 and S1420.
  • S1410 Acquire a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration point in the image coordinate system of the camera. Refer to S1050 for this step.
  • S1420 Calibrate the camera according to the second point coordinate set, the position relationship between the camera calibration area, and the three-dimensional coordinates of the camera calibration point in the world coordinate system. Refer to S1060 for this step.
• In this method, the feature points in the image captured by the camera can be matched, according to the positional relationships between feature points, to specific feature points on the calibration board, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined.
  • a calibration device provided by the present application includes: one or more lidar calibration areas, and the lidar calibration area includes two non-parallel sides.
  • the calibration device can also be called a calibration plate.
  • the calibration board may include one or more sets of calibration patterns, wherein each set of calibration patterns may include one or more lidar calibration areas.
  • the lidar calibration area is the reflective area used to calibrate the lidar.
• For example, the lidar calibration area is made of infrared reflective material in the working frequency band of the lidar.
  • the lidar area can be of any shape.
• The lidar area includes multiple sides, at least two of which are non-parallel; in other words, the straight lines on which at least two of the sides lie intersect.
  • the lidar area is triangular or quadrilateral.
  • the two sides that are not parallel in the lidar area are called the first side and the second side, respectively.
• When there are multiple sets of calibration patterns on the calibration board, these sets can be the same or different.
• When a calibration pattern includes multiple lidar calibration areas, the shapes of these lidar calibration areas may be the same or different.
  • the calibration board includes multiple sets of calibration patterns, and an example in which each set of calibration patterns includes multiple lidar calibration areas is shown in FIG. 2 or FIG. 5.
  • the lidar calibration area includes at least two non-parallel sides, and the intersection of the straight lines where these non-parallel sides are located can be used as the lidar calibration points to achieve lidar calibration.
• In some possible designs, among the multiple lidar calibration areas, the first side of one lidar calibration area and the first side of another lidar calibration area lie on the same straight line. For ease of description, the former lidar calibration area is called the first lidar calibration area, the latter is called the second lidar calibration area, and this straight line is called the first straight line. The second side of the first lidar calibration area and the second side of the second lidar calibration area also lie on the same straight line, which is called the second straight line.
• In this design, the edges of multiple lidar areas lie on the same straight line, so the straight line can be fitted according to the scan points near the multiple sides; this improves the accuracy of the fitted straight line and thus the accuracy of the lidar calibration points.
• Optionally, the intersection of the first straight line and the second straight line, called the first intersection, is a vertex of the first lidar calibration area; the first intersection is at the same time a vertex of the second lidar calibration area.
• In other words, different lidar areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration board more compact; more calibration areas can therefore be laid out on a calibration board of the same area, or calibration areas of the same total area can be laid out on a smaller calibration board.
• In some possible designs, the first side of the third lidar calibration area and the second side of the fourth lidar calibration area lie on a third straight line, and the second side of the third lidar calibration area and the first side of the fourth lidar calibration area lie on a fourth straight line, where the first straight line is not parallel to the third straight line and the second straight line is not parallel to the fourth straight line.
• Since the first straight line is not parallel to the third straight line and the second straight line is not parallel to the fourth straight line, increasing the number of lidar calibration areas by the same amount adds more intersections, and thus more lidar calibration points, which can further improve the calibration accuracy of the lidar.
• Optionally, the second intersection, of the second straight line and the fourth straight line, is another vertex of the first lidar calibration area and a vertex of the third lidar calibration area; the third intersection, of the third straight line and the fourth straight line, is another vertex of the third lidar calibration area and a vertex of the fourth lidar calibration area; and the fourth intersection, of the first straight line and the third straight line, is another vertex of the fourth lidar calibration area and another vertex of the second lidar calibration area.
• In other words, different lidar areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration board more compact; more calibration areas can therefore be laid out on a calibration board of the same area, or calibration areas of the same total area can be laid out on a smaller calibration board.
• An example of this design is shown in FIG. 2, where the first, second, third, and fourth areas correspond to the first, second, third, and fourth lidar calibration areas, respectively.
• Another calibration device provided by this application includes multiple camera calibration areas, where the positional relationship between any one of the multiple camera calibration areas and its adjacent camera calibration areas is different from the positional relationship between any other one of the multiple camera calibration areas and its adjacent camera calibration areas.
  • the camera calibration area is the area used to calibrate the external parameters of the camera or the internal and external parameters of the camera.
  • the camera calibration area can be made of white reflective material.
  • the camera calibration area can be any shape, for example, it can be a circle, a triangle, or a checkerboard.
  • the calibration board usually includes multiple camera calibration areas, where the positional relationship between any one camera calibration area and its adjacent area is different from the positional relationship between any other camera calibration area and its adjacent area.
  • the points in the calibration area of the camera may be used as the calibration points of the camera, for example, the center point of the calibration area of the camera is used as the calibration point of the camera. Therefore, the positional relationship between the calibration areas of the camera can also be understood as the positional relationship between the calibration points of the camera.
• With this device, the feature points in the image captured by the camera can be matched, according to the positional relationships between feature points, to specific feature points on the calibration board, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined.
  • the calibration device may also include the lidar calibration area in the aforementioned calibration device.
• For details of the lidar calibration area, refer to the foregoing description; they are not repeated here.
  • An example of this design is shown in Figure 5.
• This application also provides a sensor calibration device.
• In one design, the sensor calibration device includes modules for performing the operations in the sensor calibration methods described in FIG. 3, FIG. 6, FIG. 7, FIG. 9, FIG. 10, FIG. 11, FIG. 12, FIG. 13, or FIG. 14.
• In other words, the sensor calibration device includes multiple modules, each of which is used to implement the corresponding process in the method described in FIG. 3, FIG. 6, FIG. 7, FIG. 9, FIG. 10, FIG. 11, FIG. 12, FIG. 13, or FIG. 14.
  • FIG. 15 is a schematic structural diagram of a sensor calibration device 1500 according to an embodiment of the application.
  • the sensor calibration device 1500 may include an acquisition module 1510, a fitting module 1520, an estimation module 1530, and a calibration module 1540.
• In one example, the acquisition module 1510 is used to perform S1010, the fitting module 1520 is used to perform S1020, the estimation module 1530 is used to perform S1030, and the calibration module 1540 is used to perform S1040, so as to implement the sensor calibration method shown in FIG. 10.
• In another example, the acquisition module 1510 is used to perform S1010, the fitting module 1520 is used to perform S1020, the estimation module 1530 is used to perform S1030, and the calibration module 1540 is used to perform S1041, so as to implement the sensor calibration method shown in FIG. 11.
• In another example, the acquisition module 1510 is used to perform S1010 and S1050, the fitting module 1520 is used to perform S1020, the estimation module 1530 is used to perform S1030, and the calibration module 1540 is used to perform S1041 and S1060, so as to implement the sensor calibration method shown in FIG. 12.
• In another example, the acquisition module 1510 is used to perform S1010, the fitting module 1520 is used to perform S1020, the estimation module 1530 is used to perform S1030, and the calibration module 1540 is used to perform S1042 and S1044, so as to implement the sensor calibration method shown in FIG. 13.
• It should be understood that FIG. 15 is only an exemplary division of the structure and functional modules of the sensor calibration device of this application; the specific division is not limited in this application.
  • the sensor calibration device can be deployed in a cloud environment, which is an entity that uses basic resources to provide cloud services to users in a cloud computing mode.
  • the cloud environment includes a cloud data center and a cloud service platform.
  • the cloud data center includes a large number of basic resources (including computing resources, storage resources, and network resources) owned by a cloud service provider.
• The computing resources included in the cloud data center can be a large number of computing devices (for example, servers).
• The sensor calibration device can be a server in the cloud data center that is used to calibrate the sensor; it can also be a virtual machine created in the cloud data center to calibrate the sensor; it can also be a software device deployed on a server or virtual machine in the cloud data center, the software device being used to calibrate the sensor.
• The software device can be deployed distributedly on multiple servers, on multiple virtual machines, or on both virtual machines and servers.
• The sensor calibration device can be abstracted by the cloud service provider on the cloud service platform into a sensor calibration cloud service provided to the user. After the user uploads the point coordinate set collected by the lidar to be calibrated to the cloud environment through the application program interface (API) or through the web interface provided by the cloud service platform, the sensor calibration device receives the point coordinate set and calibrates the lidar to be calibrated; the calibration result is returned by the sensor calibration device to the terminal where the user is located, or is stored in the cloud environment, for example, presented on the web interface of the cloud service platform for the user to view.
• When the sensor calibration device is a software device, different modules of the sensor calibration device can be deployed in different environments or equipment. For example, part of the sensor calibration device is deployed in a terminal computing device (such as a vehicle, smartphone, laptop, tablet, personal desktop computer, or smart camera), and the other part is deployed in a data center (specifically, deployed on a server or virtual machine in the data center).
  • the data center can be a cloud data center or an edge data center.
  • the edge data center is a collection of edge computing devices that are deployed closer to terminal smart devices.
• For example, the vehicle is equipped with the acquisition module of the sensor calibration device; after the vehicle acquires the first point coordinate set, it sends the first point coordinate set to the data center through the network. The data center is equipped with the fitting module, the estimation module, and the calibration module; these modules further process the first point coordinate set to obtain the calibration result, and the data center sends the calibration result to the vehicle.
• This application does not restrict which parts of the sensor calibration device are deployed in the terminal computing device and which parts are deployed in the data center; in actual applications, the deployment can be adapted according to the computing capability of the terminal computing device or the specific application requirements. It is worth noting that, in an embodiment, the sensor calibration device can also be deployed in three parts, of which one part is deployed in the terminal computing device, one part is deployed in the edge data center, and the other part is deployed in the cloud data center.
  • the sensor calibration device can also be separately deployed on a computing device in any environment (for example: separately deployed on a terminal computing device or separately deployed on a computing device in a data center).
  • the computing device 1800 includes a bus 1801, a processor 1802, a communication interface 1803, and a memory 1804.
  • the processor 1802, the memory 1804, and the communication interface 1803 communicate with each other through the bus 1801.
  • the memory 1804 stores executable codes included in the sensor calibration device (codes that realize the functions of various modules), and the processor 1802 reads the executable codes in the memory 1804 to execute the sensor calibration method.
  • the memory 1804 may also include an operating system and other software modules required for running processes.
• The operating system can be LINUX™, UNIX™, WINDOWS™, etc.
  • the present application also provides a computing device 1800 as shown in FIG. 18.
  • the present application also provides a chip, which includes a bus 1801, a processor 1802, a communication interface 1803, and a memory 1804.
  • the processor 1802, the memory 1804, and the communication interface 1803 communicate with each other through the bus 1801.
  • the memory 1804 stores executable codes (codes that realize the functions of various modules) included in the sensor calibration device, and the processor 1802 reads the executable codes in the memory 1804 to execute the sensor calibration method.
  • the memory 1804 may also include an operating system and other software modules required for running processes.
• For example, the chip can be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processing circuit (digital signal processor, DSP), a microcontroller (microcontroller unit, MCU), a programmable logic device (PLD), or another integrated chip.
  • each module in this application may also be referred to as a corresponding unit.
  • the acquisition module may also be referred to as an acquisition unit
  • the estimation module may also be referred to as an estimation unit, and so on.
  • the steps of each method can be completed by hardware integrated logic circuits in the processor or instructions in the form of software.
  • the steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed and completed by a hardware processor, or executed and completed by a combination of hardware and software modules in the processor.
• The software module can be located in a mature storage medium in the field, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware. To avoid repetition, it will not be described in detail here.
  • the processor in the embodiment of the present application may be an integrated circuit chip with signal processing capability.
  • the steps of the foregoing method embodiments can be completed by hardware integrated logic circuits in the processor or instructions in the form of software.
• The above-mentioned processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • the methods, steps, and logical block diagrams disclosed in the embodiments of the present application can be implemented or executed.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the steps of the method disclosed in the embodiments of the present application can be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and software modules in the decoding processor.
• The software module can be located in a mature storage medium in the field, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
  • the memory in the embodiments of the present application may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory.
• The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or a flash memory; it can also be a hard disk drive (HDD) or a solid state disk (SSD).
• The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example but not limitation, many forms of RAM are available, such as static random access memory (static RAM, SRAM), dynamic random access memory (dynamic RAM, DRAM), synchronous dynamic random access memory (synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (double data rate SDRAM, DDR SDRAM), enhanced synchronous dynamic random access memory (enhanced SDRAM, ESDRAM), synchronous link dynamic random access memory (synchronous link DRAM, SLDRAM), and direct rambus random access memory (direct rambus RAM, DR RAM).
  • the present application also provides a computer program product.
  • the computer program product includes: computer program code, which when the computer program code runs on a computer, causes the computer to execute the method in any one of the foregoing method embodiments.
  • the present application also provides a computer-readable medium that stores program code, and when the program code runs on a computer, the computer executes the method in any one of the foregoing method embodiments.
  • the present application also provides a system, which includes any one of the aforementioned sensor calibration device, computing device, and the aforementioned calibration device.
• In an implementation using software, the sensor calibration computer program product includes one or more computer instructions for sensor calibration. When these computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of this application (for example, FIG. 6) are generated in whole or in part.
• The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
• The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from a website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line) or a wireless manner (such as infrared, radio, or microwave). The computer-readable storage medium stores the computer program instructions for sensor calibration.
• The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, an SSD).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A calibration method for a lidar and a camera, relating to the field of artificial intelligence. In the lidar calibration method, the edge lines of a lidar calibration area are fitted from the lidar scan points within the lidar calibration area, the intersections between the fitted straight lines are solved, and the lidar is calibrated according to the coordinates of these intersections in the lidar coordinate system. The method helps improve the calibration accuracy and calibration speed of the lidar.

Description

Sensor calibration method and sensor calibration device

This application claims priority to Chinese Patent Application No. 201910950634.6, filed with the Chinese Patent Office on October 8, 2019 and entitled "Sensor calibration method and sensor calibration device", which is incorporated herein by reference in its entirety.
Technical Field

This application relates to the field of artificial intelligence, and more specifically, to a sensor calibration method and a sensor calibration device.
Background

A multi-line lidar emits multiple laser scan lines at the same time to meet the need of quickly collecting large-scale environmental information. Especially in the field of autonomous or intelligent driving, lidar has gradually been used for perception of the environment around the vehicle. For example, a multi-line lidar installed on a smart car can provide the smart car with richer, more comprehensive, and more accurate information about the vehicle's surroundings.

The raw data obtained by multi-line lidar scanning mainly includes distance information and angle information, where the distance information indicates the distance from a scan point to the three-dimensional lidar, and the angle information indicates the elevation angle of the scan line on which the scan point is located.

For the convenience of subsequent processing, the raw data obtained by multi-line lidar scanning needs to be converted into the coordinate system of the smart device to which the multi-line lidar belongs.

Converting the raw data obtained by multi-line lidar scanning into the coordinate system of the smart device to which the multi-line lidar belongs usually includes the following two steps: converting the raw data obtained by scanning (i.e., the distance information and angle information) into three-dimensional coordinates in the multi-line lidar's own coordinate system; and converting the resulting three-dimensional coordinates in the lidar coordinate system into three-dimensional coordinates in the smart device's coordinate system.

Converting the three-dimensional coordinates in the lidar coordinate system into three-dimensional coordinates in the smart device's coordinate system requires the external parameters of the multi-line lidar, and the position of the multi-line lidar in the vehicle body coordinate system needs to be determined first. Determining the position of the multi-line lidar in the vehicle body coordinate system can be called external parameter calibration of the multi-line lidar, or calibrating the external parameters of the multi-line lidar.

Therefore, how to calibrate the external parameters of a multi-line lidar is a technical problem to be solved urgently.
Summary

This application provides a sensor calibration method, a sensor calibration device, a calibration device, and a calibration system, which help improve the calibration accuracy and calibration speed of a lidar.

In a first aspect, this application provides a sensor calibration method. The method includes: acquiring a first point coordinate set, where the first point coordinate set includes three-dimensional coordinates, in a lidar coordinate system, of scan points in a lidar calibration area on a calibration board, and the lidar calibration area includes two non-parallel sides; performing straight-line fitting according to the first point coordinate set to obtain expressions of multiple straight lines, where the multiple straight lines include the straight lines on which the two non-parallel sides lie; estimating, according to the expressions of the multiple straight lines, the three-dimensional coordinates of the lidar calibration points on the calibration board in the lidar coordinate system; and calibrating the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system.

In this method, the coordinates of the lidar calibration points in the lidar coordinate system are estimated from fitted straight lines, and these straight lines are fitted from the coordinates of multiple scan points in the lidar coordinate system, which improves the accuracy of the coordinates of the lidar calibration points in the lidar coordinate system and thus the accuracy of the lidar external parameters.

In some possible implementations of the first aspect, calibrating the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system includes: acquiring a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration points on the calibration board in the image coordinate system of a camera; determining the relative pose between the lidar and the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and calibrating the external parameters of the lidar according to the external parameters of the camera and the relative pose between the lidar and the camera.

In this implementation, the external parameters of the lidar are calibrated according to the internal and external parameters of the camera.

Optionally, determining the relative pose between the lidar and the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area includes: determining the three-dimensional coordinates of the lidar calibration points in the camera coordinate system of the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and determining the relative pose between the lidar and the camera according to the three-dimensional coordinates of the lidar calibration points in the camera coordinate system and the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system.

In a second possible implementation of the first aspect, the method further includes: acquiring a second point coordinate set, where the second point coordinate set includes the coordinates of camera calibration points in the image coordinate system of a camera; and calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

In this implementation, the lidar and the camera on the same device are calibrated synchronously, which can improve calibration efficiency.

In addition, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, the feature points in the image captured by the camera can be matched, according to the positional relationships between feature points, to specific feature points on the calibration board, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. Therefore, when the calibration board is used to calibrate the camera, there is no need to restrict the distance between the camera and the calibration board or to move the calibration board; that is, automatic calibration of the camera can be realized.

Optionally, calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system includes: determining the three-dimensional coordinates of the camera calibration points in the camera coordinate system according to the internal parameters of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the positional relationships between the camera calibration points; and calibrating the external parameters of the camera according to the three-dimensional coordinates of the camera calibration points in the world coordinate system and in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board. The method further includes: determining, from the second point coordinate set according to the positional relationship between the camera calibration areas, the coordinates of the camera calibration points in each of the multiple images; and calibrating the internal parameters of the camera according to the coordinates of the camera calibration points in each of the multiple images, to obtain the internal parameters of the camera.
In a second aspect, this application provides a sensor calibration method, including: acquiring a second point coordinate set, where the second point coordinate set includes the coordinates of camera calibration points in the image coordinate system of a camera; and calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

This method makes it possible that, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, the feature points in the image captured by the camera can be matched, according to the positional relationships between feature points, to specific feature points on the calibration board, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. Therefore, when the calibration board is used to calibrate the camera, there is no need to restrict the distance between the camera and the calibration board or to move the calibration board; that is, automatic calibration of the camera can be realized.

In some possible implementations, calibrating the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system includes: determining the three-dimensional coordinates of the camera calibration points in the camera coordinate system according to the internal parameters of the camera and the second point coordinate set; determining the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the positional relationships between the camera calibration points; and calibrating the external parameters of the camera according to the three-dimensional coordinates of the camera calibration points in the world coordinate system and in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board. The method further includes: determining, from the second point coordinate set according to the positional relationship between the camera calibration areas, the coordinates of the camera calibration points in each of the multiple images; and calibrating the internal parameters of the camera according to the coordinates of the camera calibration points in each of the multiple images, to obtain the internal parameters of the camera.
In a third aspect, this application provides a sensor calibration device, including: an acquisition module configured to acquire a first point coordinate set, where the first point coordinate set includes the three-dimensional coordinates, in a lidar coordinate system, of scan points in a lidar calibration area on a calibration board, and the lidar calibration area includes two non-parallel sides; a fitting module configured to perform straight-line fitting according to the first point coordinate set to obtain expressions of multiple straight lines, where the multiple straight lines include the straight lines on which the two non-parallel sides lie; an estimation module configured to estimate, according to the expressions of the multiple straight lines, the three-dimensional coordinates of the lidar calibration points on the calibration board in the lidar coordinate system; and a calibration module configured to calibrate the external parameters of the lidar according to the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system.

In a first possible implementation of the third aspect, the acquisition module is further configured to acquire a second point coordinate set, where the second point coordinate set includes the coordinates of the camera calibration points on the calibration board in the image coordinate system of the camera. The calibration module is specifically configured to: determine the relative pose between the lidar and the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and calibrate the external parameters of the lidar according to the external parameters of the camera and the relative pose between the lidar and the camera.

Optionally, the calibration module is specifically configured to: determine the three-dimensional coordinates of the lidar calibration points in the camera coordinate system of the camera according to the second point coordinate set, the internal parameters of the camera, and the positional relationship between the camera calibration area and the lidar calibration area; and determine the relative pose between the lidar and the camera according to the three-dimensional coordinates of the lidar calibration points in the camera coordinate system and the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system.

In a second possible implementation of the third aspect, the acquisition module is further configured to acquire a second point coordinate set, where the second point coordinate set includes the coordinates of camera calibration points in the image coordinate system of the camera. The calibration module is further configured to calibrate the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

Optionally, the calibration module is specifically configured to: determine the three-dimensional coordinates of the camera calibration points in the camera coordinate system according to the internal parameters of the camera and the second point coordinate set; determine the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the positional relationships between the camera calibration points; and calibrate the external parameters of the camera according to the three-dimensional coordinates of the camera calibration points in the world coordinate system and in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board. The calibration module is further specifically configured to: determine, from the second point coordinate set according to the positional relationship between the camera calibration areas, the coordinates of the camera calibration points in each of the multiple images; and calibrate the internal parameters of the camera according to the coordinates of the camera calibration points in each of the multiple images, to obtain the internal parameters of the camera.

In a fourth aspect, this application provides a sensor calibration device, including: an acquisition module configured to acquire a second point coordinate set, where the second point coordinate set includes the coordinates of camera calibration points in the image coordinate system of a camera; and a calibration module configured to calibrate the camera according to the second point coordinate set, the positional relationship between the camera calibration areas, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.

In some possible implementations, the calibration module is specifically configured to: determine the three-dimensional coordinates of the camera calibration points in the camera coordinate system according to the camera internal parameters and the second point coordinate set; determine the three-dimensional coordinates of the camera calibration points in the world coordinate system according to the positional relationships between the camera calibration points; and calibrate the camera external parameters according to the three-dimensional coordinates of the camera calibration points in the world coordinate system and in the camera coordinate system.

Optionally, the second point coordinate set includes point coordinates from multiple images obtained by the camera shooting the calibration board. The calibration module is further specifically configured to: determine, from the second point coordinate set according to the positional relationship between the camera calibration areas, the coordinates of the camera calibration points in each of the multiple images; and calibrate the internal parameters of the camera according to the coordinates of the camera calibration points in each of the multiple images, to obtain the camera internal parameters.
In a fifth aspect, a sensor calibration device is provided, including: a memory configured to store a program; and a processor configured to execute the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to perform the method in any one of the implementations of the first aspect.

Optionally, the device may further include a communication interface.

In a sixth aspect, a sensor calibration device is provided, including: a memory configured to store a program; and a processor configured to execute the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to perform the method in any one of the implementations of the second aspect.

Optionally, the device may further include a communication interface.

In a seventh aspect, a computer-readable medium is provided, storing instructions for execution by a device, where the instructions are used to perform the method in the first aspect.

In an eighth aspect, a computer-readable medium is provided, storing instructions for execution by a device, where the instructions are used to perform the method in the second aspect.

In a ninth aspect, a computer program product containing instructions is provided; when the computer program product runs on a computer, the computer is caused to perform the method in the first aspect.

In a tenth aspect, a computer program product containing instructions is provided; when the computer program product runs on a computer, the computer is caused to perform the method in the second aspect.

In an eleventh aspect, a chip is provided, where the chip includes a processor and a communication interface; the processor reads, through the communication interface, instructions stored in a memory and performs the method in the first aspect.

Optionally, the chip may further include a memory storing instructions; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to perform the method in the first aspect.

In a twelfth aspect, a chip is provided, where the chip includes a processor and a communication interface; the processor reads, through the communication interface, instructions stored in a memory and performs the method in the second aspect.

Optionally, the chip may further include a memory storing instructions; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to perform the method in the second aspect.

In a thirteenth aspect, an electronic device is provided, including the sensor calibration device in the third aspect.

In a fourteenth aspect, an electronic device is provided, including the sensor calibration device in the fourth aspect.
In a fifteenth aspect, this application provides a calibration device, including one or more lidar calibration areas, where the lidar calibration area includes two non-parallel sides, and the intersection of the straight lines on which the two non-parallel sides lie is used to calibrate the external parameters of a lidar.

A lidar calibration area includes at least two non-parallel sides, and the intersections of the straight lines on which these non-parallel sides lie can be used as lidar calibration points to realize the calibration of the lidar.

In some possible implementations, among the multiple lidar calibration areas, the first side of a first lidar calibration area and the first side of a second lidar calibration area lie on a first straight line; the second side of the first lidar calibration area and the second side of the second lidar calibration area lie on a second straight line; the first side and the second side of the first lidar calibration area are not parallel; and the first side and the second side of the second lidar calibration area are not parallel.

In this implementation, the edges of multiple lidar calibration areas lie on the same straight line, so the straight line can be fitted according to the scan points near multiple edges; this improves the accuracy of the fitted straight line and thus the accuracy of the lidar calibration points.

Optionally, a first intersection of the first straight line and the second straight line is a first vertex of the first lidar calibration area, and the first intersection is also a first vertex of the second lidar calibration area.

In other words, different lidar calibration areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration board more compact; more calibration areas can therefore be laid out on a calibration board of the same area, or calibration areas of the same total area can be laid out on a smaller calibration board.

In some possible implementations, among the multiple lidar calibration areas, the first side of a third lidar calibration area and the second side of a fourth lidar calibration area lie on a third straight line; the second side of the third lidar calibration area and the first side of the fourth lidar calibration area lie on a fourth straight line; the first side and the second side of the third lidar calibration area are not parallel; and the first side and the second side of the fourth lidar calibration area are not parallel. The first straight line is not parallel to the third straight line, and the second straight line is not parallel to the fourth straight line.

In this implementation, since the first straight line is not parallel to the third straight line and the second straight line is not parallel to the fourth straight line, adding the same number of lidar calibration areas yields more intersections and thus more lidar calibration points, which can further improve the calibration accuracy of the lidar.

Optionally, a second intersection of the second straight line and the fourth straight line is a second vertex of the first lidar calibration area and a first vertex of the third lidar calibration area; a third intersection of the third straight line and the fourth straight line is a second vertex of the third lidar calibration area and a first vertex of the fourth lidar calibration area; and a fourth intersection of the first straight line and the third straight line is a second vertex of the fourth lidar calibration area and a second vertex of the second lidar calibration area.

In other words, different lidar calibration areas share the same lidar calibration point as a vertex, which makes the layout of the lidar calibration areas on the calibration board more compact; more calibration areas can therefore be laid out on a calibration board of the same area, or calibration areas of the same total area can be laid out on a smaller calibration board.

Optionally, the shape of the lidar calibration area is a triangle, a quadrilateral, or the like.

In some possible implementations, the device further includes multiple camera calibration areas, where the positional relationship between any one of the multiple camera calibration areas and its adjacent camera calibration areas is different from the positional relationship between any other one of the multiple camera calibration areas and its adjacent camera calibration areas.

The device in this implementation makes it possible to calibrate the lidar and the camera synchronously. In addition, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, the feature points in the image captured by the camera can be matched, according to the positional relationships between feature points, to specific feature points on the calibration board, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. Therefore, when the calibration board is used to calibrate the camera, there is no need to restrict the distance between the camera and the calibration board or to move the calibration board; that is, automatic calibration of the camera can be realized.

Optionally, the shape of the camera calibration area is a circle, a triangle, a quadrilateral, or the like.

In a sixteenth aspect, this application provides a calibration device, including multiple camera calibration areas, where the positional relationship between any one of the multiple camera calibration areas and its adjacent camera calibration areas is different from the positional relationship between any other one of the multiple camera calibration areas and its adjacent camera calibration areas.

This calibration device makes it possible to calibrate the lidar and the camera synchronously. In addition, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, the feature points in the image captured by the camera can be matched, according to the positional relationships between feature points, to specific feature points on the calibration board, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. Therefore, when the calibration board is used to calibrate the camera, there is no need to restrict the distance between the camera and the calibration board or to move the calibration board; that is, automatic calibration of the camera can be realized.

Optionally, the shape of the camera calibration area is a circle, a triangle, a quadrilateral, or the like.

In a seventeenth aspect, this application provides a sensor calibration system, including the sensor calibration device in the third aspect or the fifth aspect and the calibration device in the fifteenth aspect.

In an eighteenth aspect, this application provides a sensor calibration system, including the sensor calibration device in the fourth aspect or the sixth aspect and the calibration device in the sixteenth aspect.

In a nineteenth aspect, this application provides a sensor calibration system, including the sensor calibration device in the fourth aspect and the calibration device in the sixteenth aspect.
Brief Description of Drawings

FIG. 1 is a schematic diagram of an application scenario of the technical solutions of the embodiments of this application;

FIG. 2 is a schematic structural diagram of a calibration device according to an embodiment of this application;

FIG. 3 is a schematic flowchart of a sensor calibration method according to an embodiment of this application;

FIG. 4 is a schematic structural diagram of a calibration device according to another embodiment of this application;

FIG. 5 is a schematic structural diagram of a calibration device according to another embodiment of this application;

FIG. 6 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 7 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 8 is a schematic structural diagram of a calibration device according to another embodiment of this application;

FIG. 9 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 10 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 11 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 12 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 13 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 14 is a schematic flowchart of a sensor calibration method according to another embodiment of this application;

FIG. 15 is a schematic structural diagram of a sensor calibration device according to an embodiment of this application;

FIG. 16 is a schematic deployment diagram of a sensor calibration device according to another embodiment of this application;

FIG. 17 is a schematic deployment diagram of a sensor calibration device according to another embodiment of this application;

FIG. 18 is a schematic structural diagram of a computing device according to another embodiment of this application.
Detailed Description

It should be understood that a smart device in the embodiments of this application refers to any device, instrument, or machine with computing and processing capability. The smart device in the embodiments of this application may be a robot, an autonomous vehicle, an intelligent driver-assisted vehicle, an unmanned aerial vehicle, an intelligent assisted aircraft, a smart home device, and the like. This application places no limitation on the smart device; any device on which a lidar and/or a camera can be installed falls within the scope of the smart device of this application.
FIG. 1 is a schematic diagram of an application scenario of the technical solutions of the embodiments of this application. The scenario may include a car, a front calibration board, and four pairs of binocular cameras.

The four pairs of binocular cameras form a four-wheel visual positioning system, and each pair of binocular cameras is responsible for detecting the three-dimensional coordinates of one wheel center in the world coordinate system. According to the three-dimensional coordinates of the four wheel centers in the world coordinate system, a car coordinate system can be established, and the pose, that is, the position and attitude, of the car coordinate system relative to the world coordinate system can further be determined.

The front calibration board and the binocular cameras are fixed at corresponding positions. A multi-line lidar is installed at the front of the vehicle.
FIG. 2 is a schematic diagram of a calibration board 200 according to an embodiment of this application. The triangular areas are made of infrared reflective material within the working frequency band of the lidar, and the background is made of black light-absorbing material. For ease of description, the four adjoining triangular areas are called a set of calibration patterns. The calibration board in the embodiments of this application may carry more or fewer sets of calibration patterns; FIG. 2 shows three sets merely as an example.

In a set of calibration patterns, the upper-left triangular area is called the first area, the lower-left triangular area is called the second area, the upper-right triangular area is called the third area, and the lower-right triangular area is called the fourth area.

One vertex of the first area and one vertex of the second area are the same point; for ease of subsequent description, this point is called point A. One side of the first area and one side of the second area lie on the same straight line; for ease of subsequent description, this side of the first area is called the first side of the first area, this side of the second area is called the first side of the second area, and this straight line is called the first straight line. Another side of the first area and another side of the second area lie on the same straight line; this side of the first area is called the second side of the first area, this side of the second area is called the second side of the second area, and this straight line is called the second straight line.

Another vertex of the first area and one vertex of the third area are the same point; for ease of subsequent description, this point is called point B.

One vertex of the third area and one vertex of the fourth area are the same point; for ease of subsequent description, this point is called point C. One side of the third area and one side of the fourth area lie on the same straight line; this side of the third area is called the first side of the third area, this side of the fourth area is called the second side of the fourth area, and this straight line is called the third straight line. Another side of the third area and another side of the fourth area lie on the same straight line; this side of the third area is called the second side of the third area, this side of the fourth area is called the first side of the fourth area, and this straight line is called the fourth straight line.

Another vertex of the second area and one vertex of the fourth area are the same point; for ease of subsequent description, this point is called point D.
Based on the application scenario shown in FIG. 1 and the calibration board shown in FIG. 2, the following describes a schematic flowchart of a sensor calibration method according to an embodiment of this application with reference to FIG. 3.

The method shown in FIG. 3 may include S310 to S390. It should be understood that these steps or operations are only examples; the technical solution proposed in this application may include more or fewer steps or operations, or variations of the operations in FIG. 3 may be performed.

S310: Pre-store the three-dimensional coordinates, in the world coordinate system, of the feature points on the calibration board.

For example, when points A, B, C, and D in the calibration patterns on the calibration board shown in FIG. 2 are used as feature points, the three-dimensional coordinates, in the world coordinate system, of the multiple sets of points A, B, C, and D on the calibration board are pre-stored. These coordinates can be measured by a total station.

S320: Detect whether the vehicle has entered the designated area; if yes, execute S330; otherwise, execute S340.

Here, images of the four wheels can be captured by the four-wheel visual positioning system, and whether the vehicle has entered the designated area is judged from these images. If the position of the vehicle meets the calibration requirements, it is determined that the vehicle has entered the designated area.

For a more detailed implementation of this step, reference may be made to the prior art; details are not repeated here.

S330: Detect the wheel centers through the four-wheel visual positioning system and establish the car coordinate system.

The four pairs of binocular cameras simultaneously observe the wheels and the side-view camera calibration boards; the external parameters of the binocular cameras are solved from the feature points on the side-view camera calibration boards, and the three-dimensional coordinates of the four wheel centers in the world coordinate system are constructed from the images acquired by the binocular cameras. The car coordinate system is established from the three-dimensional coordinates of the four wheel centers in the world coordinate system, and the pose of the car coordinate system relative to the world coordinate system is solved.

For a more detailed implementation of this step, reference may be made to the prior art; details are not repeated here.
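As an illustration only, the following is a minimal sketch of one way to build a car coordinate system from the four wheel centers, with assumed axis conventions (origin at the centroid of the wheel centers, x pointing from the rear-axle midpoint to the front-axle midpoint, z pointing up); the patent defers the actual construction to the prior art.

```python
import numpy as np

def car_frame_from_wheels(fl, fr, rl, rr):
    """fl, fr, rl, rr: (3,) world coordinates of the front-left, front-right,
    rear-left, and rear-right wheel centers."""
    origin = (fl + fr + rl + rr) / 4.0
    x_axis = (fl + fr) / 2.0 - (rl + rr) / 2.0   # rear -> front
    x_axis /= np.linalg.norm(x_axis)
    y_ref = fl - fr                              # right -> left (approximate)
    z_axis = np.cross(x_axis, y_ref)             # forward x left = up
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)            # re-orthogonalized left axis
    # Rows of r map world vectors into the car frame; together with the
    # origin this gives the pose of the car frame relative to the world.
    r_world_to_car = np.stack([x_axis, y_axis, z_axis])
    return origin, r_world_to_car
```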
S340: Prompt to adjust the vehicle position.

S350: Determine the three-dimensional coordinates of the scan points in the lidar coordinate system.

The lidar emits laser beams, receives the reflected signals, and determines the three-dimensional coordinates of the scan points in the lidar coordinate system according to the reflected signals. The three-dimensional coordinates of multiple scan points in the lidar coordinate system form a three-dimensional point coordinate set.

Specifically, the lidar emits multiple laser scan lines and measures the spatial coordinates, that is, the three-dimensional coordinates, in the lidar coordinate system, of the scan points formed where each laser scan line intersects the calibration board.

Taking four laser scan lines of the lidar scanning the calibration pattern in FIG. 2 as an example, the valid scan points of the lidar on this calibration pattern are shown as the four dashed lines in FIG. 4.
S360: Determine the three-dimensional coordinates of the feature points in the lidar coordinate system according to the three-dimensional coordinates of the scan points in the lidar coordinate system.

Specifically, the expressions of the first straight line, the second straight line, the third straight line, and the fourth straight line are fitted according to the three-dimensional point coordinate set; the three-dimensional coordinates of points A, B, C, and D in the lidar coordinate system are determined according to the fitted expressions of the four straight lines; and the external parameters of the lidar are calibrated according to the three-dimensional coordinates of points A, B, C, and D in the lidar coordinate system and the three-dimensional coordinates of points A, B, C, and D in the world coordinate system.

For example, the three-dimensional coordinates of the start scan point and the end scan point of each scan line in each area are determined from the three-dimensional point coordinate set of the scan points in the lidar coordinate system. Taking FIG. 4 as an example, the three-dimensional coordinates of the points marked by four-pointed stars, five-pointed stars, six-pointed stars, and seven-pointed stars are determined from the three-dimensional point coordinate set.

The expression of the first straight line is fitted according to the three-dimensional coordinates, in the lidar coordinate system, of the four points marked by four-pointed stars; specifically, the parameters in the expression of the first straight line are obtained by the fitting. Likewise, the expression of the second straight line is fitted according to the four points marked by five-pointed stars, the expression of the third straight line according to the four points marked by seven-pointed stars, and the expression of the fourth straight line according to the four points marked by six-pointed stars, with the parameters in each expression obtained by the fitting.

The expressions of the straight lines can be fitted in a variety of ways, for example, by the least-squares method or by linear fitting of spatial three-dimensional scattered point data.

Then, the three-dimensional coordinates of point A in the lidar coordinate system are solved from the fitted expressions of the first and second straight lines; those of point B from the fitted expressions of the second and fourth straight lines; those of point C from the fitted expressions of the third and fourth straight lines; and those of point D from the fitted expressions of the third and first straight lines.

From the three-dimensional point coordinate sets obtained by the multiple scan lines of the lidar scanning multiple sets of calibration patterns, the three-dimensional coordinates, in the lidar coordinate system, of multiple sets of points A, B, C, and D can be determined.
S370: Calibrate the external parameters of the lidar according to the three-dimensional coordinates of the feature points in the lidar coordinate system and the three-dimensional coordinates of the feature points in the world coordinate system.

For example, the pose of the lidar in the world coordinate system is determined according to the three-dimensional coordinates of the multiple sets of points A, B, C, and D in the lidar coordinate system and in the world coordinate system; the pose of the lidar in the car coordinate system is then determined according to the pose of the lidar in the world coordinate system, thereby realizing the external parameter calibration of the lidar. For the specific implementation of this operation, reference may be made to the prior art.

Since the first, second, third, and fourth straight lines are all fitted from multiple points, the accuracy of the fitted line expressions can be improved; this improves the accuracy of the three-dimensional coordinates of the feature points solved from the fitted expressions, and further improves the accuracy of the external parameters of the lidar.

S380: Judge whether the external parameter calibration error of the lidar meets the requirements; if yes, execute S390; otherwise, re-execute S320.

The external parameter calibration accuracy and stability of the lidar are crucial to the automatic driving algorithm. Therefore, if the external parameter calibration error of the lidar does not meet the requirements, return to S320 and execute S320 to S380 again until the error meets the requirements.

S390: Output and save the external parameter calibration result of the lidar.

For example, display the external parameter calibration result of the lidar and store it in a storage unit.

Optionally, the relative poses between the lidar and other sensors can also be determined according to the external parameters of the lidar.

For example, the relative pose between the lidar and the front camera on the vehicle is solved according to the external parameters of the lidar and the external parameters of the front camera.
It can be understood that, in the calibration board shown in FIG. 2, the reflective area used to calibrate the external parameters of the lidar can be of other shapes, for example, a quadrilateral.

It can be understood that the scenario shown in FIG. 1 is only an example; a scenario to which the sensor calibration method shown in FIG. 3 can be applied may include more or fewer devices.

For example, the scenario shown in FIG. 1 may have no binocular cameras; accordingly, the car coordinate system can be established in other ways.

For example, in the scenario shown in FIG. 1, a front camera may also be installed on the vehicle. When a front camera is installed on the vehicle, if the internal and external parameters of the front camera have already been calibrated, the external parameters of the lidar can be calibrated according to the internal and external parameters of the camera; if the internal and external parameters of the front camera have not been calibrated, the internal and external parameters of the camera and the external parameters of the lidar can be calibrated synchronously.

The internal parameters of a camera are parameters related to the camera's own characteristics, determined by the installation positions of the optical lens and the photoelectric sensor inside the camera. The internal parameters of a camera may include focal length, pixel size, optical distortion, white balance parameters, resolution, contrast, vignetting, and/or dark corners.

The external parameters of a camera are its parameters in the world coordinate system, determined by the position of the camera in the world coordinate system; they include, for example, the position and rotation of the camera in the world coordinate system.
In the scenario where the internal and external parameters of the front camera have already been calibrated and the external parameters of the lidar are calibrated according to the internal and external parameters of the camera, a schematic diagram of a calibration pattern of the front calibration board is shown in FIG. 5.

The triangular areas in the calibration board 500 shown in FIG. 5 have the same meaning as the triangular areas in the calibration board shown in FIG. 2 and are not described again here. The circular areas in the calibration board 500 can be made of white reflective material for camera shooting, and the background is made of black light-absorbing material.

In the calibration board 500, the positional relationship between any one circular area and its adjacent circular areas is different from the positional relationship between any other circular area and its adjacent circular areas.
Based on the above application scenario and the calibration board shown in FIG. 5, the following describes a schematic flowchart of a sensor calibration method according to an embodiment of this application with reference to FIG. 6.

The method shown in FIG. 6 may include S610 to S694. It should be understood that these steps or operations are only examples; the technical solution proposed in this application may include more or fewer steps or operations, or variations of the operations in FIG. 6 may be performed.

S610: Pre-store the positional relationships between the feature points on the calibration board and the internal and external parameters of the camera. The external parameters of the camera are the pose of the camera in the car coordinate system.

For example, when points A, B, C, and D in the calibration patterns on the calibration board shown in FIG. 5 and the center points of the circular areas are used as feature points, the positional relationships between points A, B, C, D and the center points of the circular areas on the calibration board are pre-stored. Specifically, the relative positions between points A, B, C, D and the center points of the circular areas can be pre-stored.

For ease of subsequent description, points A, B, C, and D are called intersection feature points, and the center points of the circular areas are called center feature points.

S620: Detect whether the vehicle has entered the designated area; if yes, execute S630; otherwise, execute S640. For this step, refer to S320.

S630: Detect the wheel centers through the four-wheel visual positioning system and establish the car coordinate system. For this step, refer to S330.

S640: Prompt to adjust the vehicle position.

S650: Determine the three-dimensional coordinates of the scan points in the lidar coordinate system. For this step, refer to S350; details are not repeated here.

S660: Determine the three-dimensional coordinates of the intersection feature points in the lidar coordinate system according to the three-dimensional coordinates of the scan points in the lidar coordinate system.

For this step, refer to S360; details are not repeated here.

S670: Obtain the two-dimensional coordinates of the center feature points in the image coordinate system from the image captured by the camera.

For example, the camera shoots the calibration pattern to obtain an image containing multiple circular areas. Each circular area in the image is fitted to obtain the two-dimensional coordinates of the center feature points in the image coordinate system.
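As an illustration of this fitting step, the following is a minimal sketch that detects the white circular areas as bright contours on the black background and fits each one to recover its center feature point in the image coordinate system; the detector choice (OpenCV 4 thresholding plus minEnclosingCircle) is an assumption, not something mandated by this application.

```python
import cv2

def detect_circle_centers(image_path: str):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # White reflective circles on a black light-absorbing background separate
    # cleanly with a global threshold.
    _, binary = cv2.threshold(img, 128, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for contour in contours:
        (cx, cy), radius = cv2.minEnclosingCircle(contour)
        if radius > 5:  # discard small noise blobs
            centers.append((cx, cy))
    return centers
```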
S680: Determine the three-dimensional coordinates of the intersection feature points in the camera coordinate system according to the two-dimensional coordinates of the center feature points in the image coordinate system and the positional relationships between the feature points.

For example, the two-dimensional coordinates of the intersection feature points in the image coordinate system are computed from the two-dimensional coordinates of the center feature points in the image coordinate system and the positional relationships between the feature points, and the two-dimensional coordinates of the intersection feature points in the image coordinate system are then converted into three-dimensional coordinates in the camera coordinate system.

S690: Calibrate the external parameters of the lidar according to the three-dimensional coordinates of the intersection feature points in the camera coordinate system and the three-dimensional coordinates of the intersection feature points in the lidar coordinate system.

For example, the relative pose between the lidar and the camera is determined from the three-dimensional coordinates of the intersection feature points in the camera coordinate system and in the lidar coordinate system; the pose of the lidar in the car coordinate system is then determined from the external parameters of the camera and the relative pose between the lidar and the camera, thereby realizing the external parameter calibration of the lidar.

S692: Judge whether the external parameter calibration error of the lidar meets the requirements; if yes, execute S694; otherwise, re-execute S620.

The external parameter calibration accuracy and stability of the lidar are crucial to the automatic driving algorithm. Therefore, if the external parameter calibration error of the lidar does not meet the requirements, return to S620 and execute S620 to S692 again until the error meets the requirements.

S694: Output and save the external parameter calibration result of the lidar.

For example, display the external parameter calibration result of the lidar and store it in a storage unit.

It can be understood that, in the calibration board shown in FIG. 5, the reflective area used to calibrate the external parameters of the lidar can be of other shapes, for example, a quadrilateral; the reflective area used for camera imaging can be of other shapes, such as a triangle, a quadrilateral, or a checkerboard.

In this embodiment of this application, since the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, the feature points in the image captured by the camera can be matched to specific feature points on the calibration board according to these positional relationships, and the three-dimensional coordinates of the intersection feature points in the lidar coordinate system can then be determined from the positional relationships between these feature points. Therefore, when the camera shoots the calibration board, there is no need to restrict the distance between the camera and the calibration board; that is, automatic calibration of the lidar external parameters can be realized.
In the scenario where the internal and external parameters of the front camera have not yet been calibrated, and the internal and external parameters of the camera are calibrated synchronously with the external parameters of the lidar, a schematic diagram of a calibration pattern of the front calibration board is shown in FIG. 5.

Based on the above scenario and the calibration board shown in FIG. 5, the following describes a schematic flowchart of a sensor calibration method according to an embodiment of this application with reference to FIG. 7.

The method shown in FIG. 7 may include S710 to S796. It should be understood that these steps or operations are only examples; the technical solution proposed in this application may include more or fewer steps or operations, or variations of the operations in FIG. 7 may be performed.

S710: Pre-store the three-dimensional coordinates, in the world coordinate system, of the feature points on the calibration board and the positional relationships between the feature points.

For example, when points A, B, C, and D in the calibration patterns on the calibration board shown in FIG. 5 and the center points of the circular areas are used as feature points, the three-dimensional coordinates, in the world coordinate system, of points A, B, C, D and the center points of the circular areas are pre-stored, as well as the positional relationships between the center points of the circular areas.

For ease of subsequent description, points A, B, C, and D are called intersection feature points, and the center points of the circular areas are called center feature points.

S720: Detect whether the vehicle has entered the designated area; if yes, execute S730; otherwise, execute S740. For this step, refer to S320.

S730: Detect the wheel centers through the four-wheel visual positioning system and establish the car coordinate system. For this step, refer to S330.

S740: Prompt to adjust the vehicle position.

S750: Determine the three-dimensional coordinates of the scan points in the lidar coordinate system. For this step, refer to S350.

S760: Determine the three-dimensional coordinates of the intersection feature points in the lidar coordinate system according to the three-dimensional coordinates of the scan points in the lidar coordinate system. For this step, refer to S360; details are not repeated here.

S770: Calibrate the external parameters of the lidar according to the three-dimensional coordinates of the intersection feature points in the lidar coordinate system and the three-dimensional coordinates of the intersection feature points in the world coordinate system. For this step, refer to S370.

S780: Obtain the two-dimensional coordinates of the center feature points in the image coordinate system from the image captured by the camera. For this step, refer to S670.

The image captured by the camera may include the entire calibration board or only part of it.

S790: Calibrate the internal and external parameters of the camera according to the two-dimensional coordinates of the center feature points in the image coordinate system and the three-dimensional coordinates of the center feature points in the world coordinate system.

Specifically, which feature points on the calibration board the camera has captured is determined according to the positional relationships between the center feature points; the three-dimensional coordinates of these feature points in the world coordinate system are then selected from the pre-stored coordinates; and the internal and external parameters of the camera are calibrated according to the three-dimensional coordinates of these feature points in the camera coordinate system and in the world coordinate system.

For how to calibrate the internal and external parameters of the camera according to the three-dimensional coordinates of these feature points in the camera coordinate system and in the world coordinate system, reference may be made to the prior art; details are not repeated here.
S792: Judge whether the internal parameter calibration error and external parameter calibration error of the camera and the external parameter calibration error of the lidar meet the requirements; if yes, execute S794; otherwise, re-execute S720.

The calibration accuracy and stability of the camera and lidar are crucial to the automatic driving algorithm. Therefore, if the calibration errors of the camera and lidar do not meet the requirements, return to S720 and execute S720 to S792 again until the calibration errors meet the requirements.

S794: Output and save the calibration results of the camera and the lidar.

For example, display the calibration results and store them in a storage unit.

Further, the relative pose between the camera and the lidar can be determined according to the external parameters of the camera and the external parameters of the lidar.

Of course, the relative poses between the camera and other sensors, and between the lidar and other sensors, can also be determined according to the external parameters of the camera and of the lidar.

It can be understood that, in the calibration board shown in FIG. 5, the reflective area used to calibrate the external parameters of the lidar can be of other shapes, for example, a quadrilateral; the reflective area used for camera imaging can be of other shapes, such as a triangle, a quadrilateral, or a checkerboard.

In this embodiment of this application, since the positional relationship between any feature point and its adjacent feature points is different from the positional relationship between any other feature point and its adjacent feature points, regardless of whether the camera captures all of the patterns on the calibration board or only some of them, the feature points in the image captured by the camera can be matched to specific feature points on the calibration board according to these positional relationships, so the three-dimensional coordinates of these feature points in the world coordinate system can be determined from the pre-stored three-dimensional coordinates, and the internal and external parameters of the camera can then be determined. Therefore, when the calibration board is used to calibrate the camera, there is no need to restrict the distance between the camera and the calibration board or to move the calibration board; that is, automatic calibration of the camera can be realized.
In the scenario shown in FIG. 1, optionally, a schematic calibration pattern for the calibration board is shown in FIG. 8. This scenario can be used to calibrate a front camera on a vehicle.
The circular regions in the calibration board shown in FIG. 8 may be made of a white reflective material for the camera to photograph; the background is made of a black light-absorbing material. The positional relationship between any one circular region and its adjacent circular regions is different from that between any other circular region and its adjacent circular regions.
Based on the foregoing scenario and the calibration board shown in FIG. 8, the following describes a schematic flowchart of a sensor calibration method according to an embodiment of this application with reference to FIG. 9.
The method shown in FIG. 9 may include S910 to S980. It should be understood that these steps or operations are merely examples. The technical solutions proposed in this application may include more or fewer steps or operations, or variations of the operations in FIG. 9 may be performed.
S910: Prestore the three-dimensional coordinates of the feature points on the calibration board in the world coordinate system, and the positional relationships between the feature points.
For example, when the center points of the circular regions on the calibration board shown in FIG. 8 are used as feature points, the three-dimensional coordinates of these center points in the world coordinate system are prestored. For ease of subsequent description, the center points of the circular regions are referred to as circle-center feature points.
S920: Detect whether the vehicle has entered the designated area; if yes, perform S930; otherwise, perform S940. For this step, refer to S320.
S930: Detect the wheel centers by using a four-wheel visual positioning system, and establish the vehicle coordinate system. For this step, refer to S330.
S940: Prompt to adjust the vehicle position.
S950: Obtain the two-dimensional coordinates of the circle-center feature points in the image coordinate system from an image captured by the camera. For this step, refer to S670.
S960: Perform intrinsic and extrinsic calibration of the camera based on the two-dimensional coordinates of the circle-center feature points in the image coordinate system and their three-dimensional coordinates in the world coordinate system. For this step, refer to S790.
S970: Determine whether the camera's intrinsic and extrinsic calibration errors meet the requirements; if yes, perform S980; otherwise, re-execute from S920.
The calibration accuracy and stability of the camera are critical to autonomous driving algorithms. Therefore, if the camera's calibration errors do not meet the requirements, the procedure returns to S920, and S920 to S970 are executed again until the camera's calibration errors meet the requirements.
S980: Output and save the camera's calibration results.
For example, display the calibration results and store them in a storage unit.
Further, the relative pose between the camera and other sensors may be determined from the camera's extrinsic parameters.
It can be understood that, in the calibration board shown in FIG. 8, the reflective regions used for camera imaging may have other shapes, for example, triangles, quadrilaterals, or checkerboards.
In this embodiment of this application, because the positional relationship between any feature point and its adjacent feature points differs from that between any other feature point and its adjacent feature points, regardless of whether the camera captures all patterns on the calibration board or only some of them, it can be determined which feature points on the board appear in the captured image; their three-dimensional world coordinates can then be found among the prestored coordinates, and the camera's intrinsic and extrinsic parameters can be determined. As a result, when calibrating the camera with this board, the camera-to-board distance does not need to be restricted, enabling automatic camera calibration.
FIG. 10 is a schematic flowchart of a sensor calibration method provided in this application. The method may include S1010 to S1040.
S1010: Obtain a first point coordinate set, where the first point coordinate set includes three-dimensional coordinates, in the lidar coordinate system, of scan points in a lidar calibration region on a calibration board, and the lidar calibration region includes two non-parallel edges.
The lidar is a multi-line lidar; its multiple scan lines may be all of the lidar's scan lines or only some of them.
The calibration board may include one or more groups of calibration patterns, where each group of calibration patterns may include one or more lidar calibration regions.
A lidar calibration region is a reflective region used for calibrating the lidar. For example, the lidar calibration region is made of an infrared-reflective material matched to the lidar's operating band. The lidar calibration region may have any shape.
The lidar calibration region includes multiple edges, at least two of which are non-parallel; in other words, the straight lines on which at least two of the edges lie intersect. For example, the lidar calibration region is a triangle or a quadrilateral.
When the calibration board carries multiple groups of calibration patterns, these groups may be identical or different. When a calibration pattern includes multiple lidar calibration regions, the shapes of these regions may be identical or different.
An example in which the calibration board includes multiple groups of calibration patterns, each including multiple lidar calibration regions, is shown in FIG. 2 or FIG. 5.
The lidar's multiple scan lines scan the lidar calibration regions on the calibration board, producing multiple scan points. For ease of description, the set of the three-dimensional coordinates of these scan points in the lidar coordinate system is referred to as the first point coordinate set.
The origin of the lidar coordinate system is usually the center of the lidar; the x-axis usually points in the direction opposite to the lidar's output cable; if the lidar is mounted facing straight ahead of the vehicle, the y-axis usually points to the left of the vehicle; and the z-axis usually points toward the sky.
S1020: Perform line fitting based on the first point coordinate set to obtain expressions of multiple straight lines, where the multiple straight lines include the lines on which the two non-parallel edges lie.
In some possible designs, when performing line fitting based on the first point coordinate set, the three-dimensional coordinates of the starting scan point and the ending scan point of each scan line, as it sweeps the lidar calibration region between the two non-parallel edges, may be selected. Because multiple scan lines sweep the lidar calibration region, the three-dimensional coordinates of multiple starting scan points and multiple ending scan points are obtained. Line fitting is then performed on the three-dimensional coordinates of these starting and ending scan points.
For example, when a group of calibration patterns on the calibration board includes the first region in FIG. 4 and the lidar scans from the first edge of the first region to the second edge, the scan points marked with five-pointed stars are the starting scan points, and those marked with four-pointed stars are the ending scan points. In this case, the expression of the first straight line is fitted from the three-dimensional coordinates, in the lidar coordinate system, of the two points marked with four-pointed stars in the first region, and the expression of the second straight line is fitted from the coordinates of the two points marked with five-pointed stars in the first region.
As another example, when a group of calibration patterns includes the first and second regions in FIG. 4 and the lidar's scan direction is from the first edge of the first region to the second edge, the points marked with five-pointed stars in the first region are starting scan points and those marked with four-pointed stars are ending scan points; in the second region, the points marked with five-pointed stars are ending scan points and those marked with four-pointed stars are starting scan points. In this case, the expression of the first straight line is fitted from the three-dimensional coordinates, in the lidar coordinate system, of the four points marked with four-pointed stars in the first and second regions, and the expression of the second straight line is fitted from the four points marked with five-pointed stars in the first and second regions.
In this implementation, because more points are used to fit the same line, the accuracy of the fitted expression can be improved; this in turn improves the accuracy of the lidar feature-point coordinates solved from the expression in the lidar coordinate system, and ultimately the accuracy of the lidar calibration result.
As yet another example, when a group of calibration patterns includes the first, second, third, and fourth regions in FIG. 4 and the lidar's scan direction is from the first edge of the first region to the second edge: in the first region, the points marked with five-pointed stars are starting scan points and those marked with four-pointed stars are ending scan points; in the second region, the five-pointed-star points are ending scan points and the four-pointed-star points are starting scan points; in the third region, the seven-pointed-star points are starting scan points and the six-pointed-star points are ending scan points; in the fourth region, the six-pointed-star points are starting scan points and the seven-pointed-star points are ending scan points.
In this case, the expression of the first straight line is fitted from the three-dimensional coordinates, in the lidar coordinate system, of the four four-pointed-star points in the first and second regions; the expression of the second straight line from the four five-pointed-star points in the first and second regions; the expression of the third straight line from the four seven-pointed-star points in the third and fourth regions; and the expression of the fourth straight line from the four six-pointed-star points in the third and fourth regions.
In this implementation, adding a certain number of scan points yields more intersection feature points, which can improve the accuracy of the lidar calibration result.
It can be understood that line fitting based on the first point coordinate set is not limited to the foregoing manner. For example, the average z-coordinate of all scan points of a scan line within the lidar calibration region between the two non-parallel edges may be used as the z-coordinate of the starting and ending scan points; or, the coordinate spacing of these scan points along the y-axis may be computed, and the y-coordinate of the middle scan point plus or minus a certain number of spacings may be taken as the y-coordinate of the starting or ending scan point.
In this step, after the three-dimensional coordinates of the starting and ending scan points in the lidar coordinate system are determined, the line fitting on these coordinates can be implemented in multiple ways, for example, by least-squares three-dimensional line fitting or by linear fitting of spatial three-dimensional scattered data.
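As one concrete (non-limiting) realization of the least-squares three-dimensional line fitting mentioned above, the centroid of the selected scan points can be taken as a point on the line and the dominant right singular vector of the centered coordinates as its direction:

```python
import numpy as np

def fit_line_3d(points):
    """Least-squares 3D line fit. Returns (p0, d): the centroid as a point
    on the line and a unit direction vector (the singular vector of the
    centered points associated with the largest singular value)."""
    points = np.asarray(points, dtype=float)
    p0 = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - p0)
    return p0, vt[0]  # rows of vt are orthonormal, so vt[0] is unit length
```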
S1030: Estimate, based on the expressions of the multiple straight lines, the three-dimensional coordinates of lidar calibration points in the lidar coordinate system.
A lidar calibration point is a point on the calibration board used for calibrating the lidar's extrinsic parameters. Specifically, the lidar calibration points include the intersections of the straight lines on which the non-parallel edges of the lidar calibration region lie. Therefore, lidar calibration points may also be called intersection feature points.
For example, when the calibration board includes the first region in FIG. 2, the lidar feature points include point A. When the calibration board includes the first, second, third, and fourth regions in FIG. 2, the lidar feature points include points A, B, C, and D.
Estimating the three-dimensional coordinates of the lidar feature points in the lidar coordinate system from the fitted expressions may include: constructing a system of equations from the expressions and solving it; the resulting coordinates can be taken as the three-dimensional coordinates of the lidar feature points in the lidar coordinate system.
For example, when the calibration board includes the first region in FIG. 2, or the first and second regions, solving the system formed by the expressions of the first and second straight lines yields the three-dimensional coordinates of point A in the lidar coordinate system.
As another example, when the calibration board includes the first, second, third, and fourth regions in FIG. 2, solving the system formed by the expressions of the first, second, third, and fourth straight lines yields the three-dimensional coordinates of points A, B, C, and D in the lidar coordinate system.
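Because scan noise means two fitted 3D lines rarely cross exactly, solving the system of equations is in practice a least-squares intersection: the midpoint of the shortest segment between the two lines. A sketch compatible with fit_line_3d above:

```python
import numpy as np

def line_intersection_3d(p1, d1, p2, d2):
    """Approximate intersection of two 3D lines given as point + direction.
    Solves [d1, -d2] [s, t]^T ~ p2 - p1 in the least-squares sense and
    returns the midpoint of the closest points on the two lines."""
    A = np.stack([d1, -d2], axis=1)                      # 3x2 system matrix
    (s, t), *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    return (p1 + s * d1 + p2 + t * d2) / 2.0
```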
S1040: Perform extrinsic calibration of the lidar based on the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system.
In some designs, as shown in FIG. 11, this may include: S1041, performing extrinsic calibration of the lidar based on the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system and their three-dimensional coordinates in the world coordinate system. One example is the method shown in FIG. 3.
The three-dimensional world coordinates of the lidar calibration points may be measured in advance with a total station. For how to calibrate from the lidar-frame and world-frame three-dimensional coordinates of the lidar calibration points, refer to the prior art; details are not repeated here.
In this design, optionally, as shown in FIG. 12, the method may further include S1050 and S1060.
S1050: Obtain a second point coordinate set, where the second point coordinate set includes coordinates of camera calibration points in the camera's image coordinate system.
The camera calibration points include points in camera calibration regions on the calibration board. The camera and the lidar are located on the same smart device. The image captured by the camera may include all camera calibration regions on the calibration board, or only some of them.
A camera calibration region is a region used for calibrating the camera's extrinsic parameters, or both its intrinsic and extrinsic parameters. The camera calibration region may be made of a white reflective material and may have any shape, for example, a circle, a triangle, or a checkerboard.
The calibration board usually includes multiple camera calibration regions, where the positional relationship between any one camera calibration region and its adjacent regions is different from that between any other camera calibration region and its adjacent regions.
The camera calibration points lie in the camera calibration regions; therefore, the positional relationships between camera calibration regions can also be understood as the positional relationships between camera calibration points.
For example, the positional relationship between the center point of any one camera calibration region and the center points of its adjacent regions is different from that between the center point of any other camera calibration region and the center points of its adjacent regions.
S1060: Calibrate the camera based on the second point coordinate set, the positional relationships between the camera calibration regions, and the three-dimensional coordinates of the camera calibration points in the world coordinate system.
Calibrating the camera includes calibrating the camera's intrinsic and/or extrinsic parameters.
For example, using the camera intrinsic parameters, the coordinates in the second point coordinate set are transformed from the image coordinate system to the camera coordinate system to obtain the three-dimensional coordinates of the camera calibration points in the camera coordinate system; which calibration points on the board were captured is determined from the positional relationships between the camera calibration points; and the camera extrinsic parameters are calibrated from the world-frame and camera-frame three-dimensional coordinates of these points. For a specific implementation, refer to the prior art.
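One plausible realization of this extrinsic step (a sketch, not the mandated implementation) is a perspective-n-point solve, which recovers the camera pose directly from the matched 2D image points and 3D world points. All numeric values below are illustrative, and the intrinsic matrix K and zero distortion are assumptions:

```python
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 640.0],      # assumed intrinsic matrix
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                      # assume negligible distortion

# Matched calibration points: world frame (meters, coplanar) and pixels.
world_pts = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0],
                      [0.0, 0.4, 0.0], [0.5, 0.4, 0.0]])
image_pts = np.array([[610.0, 340.0], [702.0, 338.0],
                      [608.0, 415.0], [700.0, 430.0]])

ok, rvec, tvec = cv2.solvePnP(world_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)              # world -> camera rotation
R_cam_in_world = R.T                    # camera pose in the world frame,
t_cam_in_world = -R.T @ tvec            # i.e. the extrinsic result
```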
The three-dimensional world coordinates of the camera calibration points may be measured with a total station; of course, they may also be measured in other ways, which is not limited in the embodiments of this application.
The camera's intrinsic parameters may be calibrated in advance, or may be calibrated using this calibration board.
For example, the camera photographs the calibration board from multiple angles to obtain multiple images; from the positional relationships between the camera calibration points, the coordinates of each camera calibration point can be determined in each of the images; the camera's intrinsic parameters are then calibrated from the coordinates of the camera calibration points in the multiple images. For a specific implementation, refer to the prior art.
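A sketch of that multi-view intrinsic step using OpenCV's standard routine, which is one common prior-art implementation; the per-view correspondences are assumed to come from a matching step like the one described above:

```python
import cv2
import numpy as np

def calibrate_intrinsics(obj_pts_per_view, img_pts_per_view, image_size):
    """Intrinsic calibration from several views of the board.
    obj_pts_per_view: list of (N, 3) arrays, board-frame points (Z = 0).
    img_pts_per_view: list of (N, 2) arrays, matching pixel coordinates.
    image_size: (width, height) of the camera images."""
    obj = [np.asarray(o, dtype=np.float32) for o in obj_pts_per_view]
    img = [np.asarray(i, dtype=np.float32) for i in img_pts_per_view]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj, img, image_size,
                                                     None, None)
    return K, dist, rms    # intrinsics, distortion, reprojection error
```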
In this step, the camera is calibrated according to the positional relationships between the camera calibration regions, which enables automated camera calibration.
In some designs, performing extrinsic calibration of the lidar based on the three-dimensional coordinates of the lidar calibration points in the lidar coordinate system may include: performing extrinsic calibration of the lidar based on those three-dimensional coordinates and the camera's extrinsic parameters.
For example, as shown in FIG. 13, S1040 may include S1042 to S1046.
S1042: Obtain a second point coordinate set, where the second point coordinate set includes coordinates, in the camera's image coordinate system, of camera calibration points on the calibration board.
For this step, refer to S1050.
S1044: Perform extrinsic calibration of the lidar based on the second point coordinate set, the camera's intrinsic and extrinsic parameters, and the positional relationships between the camera calibration regions and the lidar calibration regions.
In some designs, the three-dimensional coordinates of the lidar calibration points in the camera coordinate system may be determined based on the second point coordinate set, the camera's intrinsic parameters, and the positional relationships between the camera calibration regions and the lidar calibration regions; extrinsic calibration of the lidar is then performed based on the lidar calibration points' three-dimensional coordinates in the camera coordinate system, their three-dimensional coordinates in the lidar coordinate system, and the camera's extrinsic parameters.
For example, the coordinates of the camera calibration region center points in the image coordinate system may be determined from the second point coordinate set; the coordinates of the lidar calibration points in the image coordinate system are then determined from the positional relationships between the camera calibration regions and the lidar calibration regions together with those center-point coordinates; and the three-dimensional coordinates of the lidar calibration points in the camera coordinate system are determined from their image coordinates and the camera's intrinsic parameters.
Alternatively, after the coordinates of the camera calibration region center points in the image coordinate system are determined, they are converted into three-dimensional coordinates in the camera coordinate system using the camera intrinsic parameters; the three-dimensional coordinates of the lidar calibration points in the camera coordinate system are then determined from the center points' camera-frame coordinates and the positional relationships between the camera calibration regions and the lidar calibration points.
For example, the relative pose between the camera and the lidar is determined from the lidar calibration points' three-dimensional coordinates in the camera coordinate system and in the lidar coordinate system; the pose of the lidar in the vehicle coordinate system (i.e., the lidar extrinsic parameters) is then determined from this relative pose and the camera's pose in the vehicle coordinate system (i.e., the camera extrinsic parameters).
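The final composition amounts to multiplying rigid transforms in homogeneous coordinates. A small sketch reusing rigid_transform from the earlier example; the numeric poses are purely illustrative:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.ravel(t)
    return T

# Illustrative poses: lidar -> camera (from the 3D-3D alignment) and
# camera -> vehicle (the camera extrinsics).
T_cam_lidar = to_homogeneous(np.eye(3), np.array([0.1, 0.0, -0.2]))
T_veh_cam = to_homogeneous(np.eye(3), np.array([1.5, 0.0, 1.2]))

# Their product maps lidar-frame points straight into the vehicle frame:
# this 4x4 matrix is the lidar extrinsic calibration result.
T_veh_lidar = T_veh_cam @ T_cam_lidar
```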
This application further provides a sensor calibration method. FIG. 14 is a schematic flowchart of a sensor calibration method according to an embodiment of this application. The method shown in FIG. 14 may include S1410 and S1420.
S1410: Obtain a second point coordinate set, where the second point coordinate set includes coordinates of camera calibration points in the camera's image coordinate system. For this step, refer to S1050.
S1420: Calibrate the camera based on the second point coordinate set, the positional relationships between the camera calibration regions, and the three-dimensional coordinates of the camera calibration points in the world coordinate system. For this step, refer to S1060.
In this embodiment of this application, regardless of whether the camera captures all patterns on the calibration board or only some of them, it can be determined from the positional relationships between the feature points which feature points on the board appear in the captured image; their three-dimensional world coordinates can then be found among the prestored coordinates, and the camera's intrinsic and extrinsic parameters can be determined. As a result, when calibrating the camera with this board, neither the camera-to-board distance needs to be restricted nor the board moved, enabling automatic camera calibration.
This application further provides calibration apparatuses. One calibration apparatus provided in this application includes one or more lidar calibration regions, where each lidar calibration region includes two non-parallel edges. This calibration apparatus may also be called a calibration board.
The calibration board may include one or more groups of calibration patterns, where each group of calibration patterns may include one or more lidar calibration regions.
A lidar calibration region is a reflective region used for calibrating the lidar. For example, the lidar calibration region is made of an infrared-reflective material matched to the lidar's operating band. The lidar calibration region may have any shape.
The lidar calibration region includes multiple edges, at least two of which are non-parallel; in other words, the straight lines on which at least two of the edges lie intersect. For example, the lidar calibration region is a triangle or a quadrilateral. For ease of description, the two non-parallel edges of the lidar calibration region are referred to as the first edge and the second edge.
When the calibration board carries multiple groups of calibration patterns, these groups may be identical or different. When a calibration pattern includes multiple lidar calibration regions, the shapes of these regions may be identical or different.
An example in which the calibration board includes multiple groups of calibration patterns, each including multiple lidar calibration regions, is shown in FIG. 2 or FIG. 5.
The lidar calibration region includes at least two non-parallel edges; the intersections of the straight lines on which these non-parallel edges lie can serve as lidar calibration points to realize the calibration of the lidar.
In some designs, among the multiple lidar calibration regions of a group of calibration patterns, the first edge of one lidar calibration region and the first edge of another lidar calibration region lie on the same straight line. For ease of description, the former region is called the first lidar calibration region, the latter the second lidar calibration region, and this line the first straight line; the second edge of the first lidar calibration region and the second edge of the second lidar calibration region lie on the same straight line, called the second straight line.
In this design, edges of multiple lidar calibration regions lie on the same straight line, so the line can be fitted from the scan points near several edges; this improves the accuracy of the fitted line and, in turn, the accuracy of the lidar calibration points.
In this design, optionally, the intersection of the first and second straight lines is a vertex of the first lidar calibration region. For ease of description, it is called the first intersection. The first intersection is also a vertex of the second lidar calibration region.
That is, different lidar calibration regions share the same lidar calibration point as a vertex, which makes the layout of lidar calibration regions on the board more compact: more calibration regions can be laid out on a board of the same area, or the same calibration area can be laid out on a smaller board.
In some designs, among the multiple lidar calibration regions, the first edge of a third lidar calibration region and the second edge of a fourth lidar calibration region lie on a third straight line, and the second edge of the third lidar calibration region and the first edge of the fourth lidar calibration region lie on a fourth straight line; the first straight line is not parallel to the third straight line, and the second straight line is not parallel to the fourth straight line.
In this design, because the first and third straight lines are not parallel and the second and fourth straight lines are not parallel, adding the same number of lidar calibration regions creates more intersections, and hence more lidar calibration points, which can improve the lidar calibration accuracy.
Optionally, the second intersection, of the second and fourth straight lines, is another vertex of the first lidar calibration region and a vertex of the third lidar calibration region; the third intersection, of the third and fourth straight lines, is another vertex of the third lidar calibration region and a vertex of the fourth lidar calibration region; and the fourth intersection, of the first and third straight lines, is another vertex of the fourth lidar calibration region and another vertex of the second lidar calibration region.
That is, different lidar calibration regions share the same lidar calibration points as vertices, which makes the layout of lidar calibration regions on the board more compact: more calibration regions can be laid out on a board of the same area, or the same calibration area can be laid out on a smaller board.
An example of this design is shown in FIG. 2, where the first, second, third, and fourth regions correspond respectively to the first, second, third, and fourth lidar calibration regions.
Another calibration apparatus provided in this application includes multiple camera calibration regions, where the positional relationship between any one of the camera calibration regions and its adjacent camera calibration regions is different from the positional relationship between any other one of the camera calibration regions and its adjacent camera calibration regions.
A camera calibration region is a region used for calibrating the camera's extrinsic parameters, or both its intrinsic and extrinsic parameters. The camera calibration region may be made of a white reflective material and may have any shape, for example, a circle, a triangle, or a checkerboard.
The calibration board usually includes multiple camera calibration regions, where the positional relationship between any one region and its adjacent regions differs from that between any other region and its adjacent regions.
A point in a camera calibration region can serve as a camera calibration point, for example, the center point of the region. Therefore, the positional relationships between camera calibration regions can also be understood as the positional relationships between camera calibration points.
In this embodiment of this application, regardless of whether the camera captures all patterns on the calibration board or only some of them, it can be determined from the positional relationships between the feature points which feature points on the board appear in the captured image; their three-dimensional world coordinates can then be found among the prestored coordinates, and the camera's intrinsic and extrinsic parameters can be determined. As a result, when calibrating the camera with this board, neither the camera-to-board distance needs to be restricted nor the board moved, enabling automatic camera calibration.
An example of this design is shown in FIG. 8.
In some designs, this calibration apparatus may further include the lidar calibration regions of the foregoing calibration apparatus. For details about the lidar calibration regions, refer to the foregoing description; details are not repeated here. An example of this design is shown in FIG. 5.
This application further provides a sensor calibration apparatus. The sensor calibration apparatus includes modules for performing the operations of the sensor calibration methods described in FIG. 3, FIG. 6, FIG. 7, FIG. 9, FIG. 10, FIG. 11, FIG. 12, FIG. 13, or FIG. 14. In other words, the sensor calibration apparatus includes multiple modules, each implementing a corresponding procedure of the methods described in those figures.
FIG. 15 is a schematic structural diagram of a sensor calibration apparatus 1500 according to an embodiment of this application. The sensor calibration apparatus 1500 may include an obtaining module 1510, a fitting module 1520, an estimation module 1530, and a calibration module 1540.
In some designs, the obtaining module 1510 is configured to perform S1010, the fitting module 1520 to perform S1020, the estimation module 1530 to perform S1030, and the calibration module 1540 to perform S1040, so as to implement the sensor calibration method shown in FIG. 10.
In some designs, the obtaining module 1510 is configured to perform S1010, the fitting module 1520 to perform S1020, the estimation module 1530 to perform S1030, and the calibration module 1540 to perform S1041, so as to implement the sensor calibration method shown in FIG. 11.
In some designs, the obtaining module 1510 is configured to perform S1010 and S1050, the fitting module 1520 to perform S1020, the estimation module 1530 to perform S1030, and the calibration module 1540 to perform S1041 and S1060, so as to implement the sensor calibration method shown in FIG. 12.
In some designs, the obtaining module 1510 is configured to perform S1010, the fitting module 1520 to perform S1020, the estimation module 1530 to perform S1030, and the calibration module 1540 to perform S1042 and S1044, so as to implement the sensor calibration method shown in FIG. 13.
It can be understood that the specific processes in which the modules perform the foregoing steps have been described in detail in the foregoing method embodiments and are not repeated here.
It can be understood that FIG. 15 is merely an exemplary division of the structure and functional modules of the sensor calibration apparatus of this application; this application does not limit the specific division in any way.
FIG. 16 is a schematic deployment diagram of a sensor calibration apparatus according to an embodiment of this application. The sensor calibration apparatus may be deployed in a cloud environment, which is an entity that provides cloud services to users by using basic resources in the cloud computing mode. The cloud environment includes a cloud data center and a cloud service platform. The cloud data center includes a large number of basic resources (including compute, storage, and network resources) owned by the cloud service provider; the compute resources may be a large number of computing devices (for example, servers).
The sensor calibration apparatus may be a server in the cloud data center used for calibrating sensors; it may be a virtual machine created in the cloud data center for calibrating sensors; it may also be a software apparatus deployed on a server or virtual machine in the cloud data center, where the software apparatus is used for calibrating sensors and may be deployed in a distributed manner on multiple servers, on multiple virtual machines, or on both virtual machines and servers.
As shown in FIG. 16, the sensor calibration apparatus is abstracted by the cloud service provider on the cloud service platform into a sensor calibration cloud service offered to users. After a user purchases this cloud service on the cloud service platform, the cloud environment uses the sensor calibration apparatus to provide the sensor calibration cloud service to the user. The user may upload the point coordinate sets collected by the lidar to be calibrated to the cloud environment through an application program interface (API) or through a web page provided by the cloud service platform; the sensor calibration apparatus receives the point coordinate sets and calibrates the lidar; and the calibration result is returned by the sensor calibration apparatus to the user's terminal, or stored in the cloud environment, for example, presented on a web page of the cloud service platform for the user to view.
When the sensor calibration apparatus is a software apparatus, its different modules may be deployed in different environments or devices. For example, as shown in FIG. 17, part of the sensor calibration apparatus is deployed on a terminal computing device (for example, a vehicle, smartphone, laptop, tablet, desktop computer, or smart camera), and the other part is deployed in a data center (specifically, on servers or virtual machines in the data center). The data center may be a cloud data center or an edge data center; an edge data center is a collection of edge computing devices deployed close to the terminal smart devices.
The parts of the sensor calibration apparatus deployed in different environments or devices cooperate to implement the functions of the sensor calibration method. For example, the obtaining module of the sensor calibration apparatus is deployed on a vehicle; after the vehicle obtains the first point coordinate set, it sends the set to the data center over a network; the fitting module, estimation module, and calibration module deployed in the data center further process the first point coordinate set to obtain the calibration result; and the data center sends the calibration result back to the vehicle.
It can be understood that this application does not restrictively divide which parts of the sensor calibration apparatus are deployed on the terminal computing device and which in the data center; in practice, the deployment can be adapted to the computing capability of the terminal computing device or to specific application requirements. Notably, in an embodiment, the sensor calibration apparatus may also be deployed in three parts: one on the terminal computing device, one in an edge data center, and one in a cloud data center.
When the sensor calibration apparatus is a software apparatus, it may also be deployed standalone on a single computing device in any environment (for example, standalone on a terminal computing device, or standalone on a single computing device in a data center).
As shown in FIG. 18, a computing device 1800 includes a bus 1801, a processor 1802, a communication interface 1803, and a memory 1804. The processor 1802, the memory 1804, and the communication interface 1803 communicate through the bus 1801.
The memory 1804 stores the executable code included in the sensor calibration apparatus (the code implementing the functions of the modules); the processor 1802 reads this executable code from the memory 1804 to perform the sensor calibration method. The memory 1804 may also include other software modules required by running processes, such as an operating system. The operating system may be LINUX™, UNIX™, WINDOWS™, or the like.
This application further provides a computing device 1800 as shown in FIG. 18.
This application further provides a chip. The chip includes a bus 1801, a processor 1802, a communication interface 1803, and a memory 1804. The processor 1802, the memory 1804, and the communication interface 1803 communicate through the bus 1801. The memory 1804 stores the executable code included in the sensor calibration apparatus (the code implementing the functions of the modules); the processor 1802 reads this executable code from the memory 1804 to perform the sensor calibration method. The memory 1804 may also include other software modules required by running processes, such as an operating system.
The chip may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It should be understood that the modules in this application may also be called corresponding units; for example, the obtaining module may also be called an obtaining unit, the estimation module an estimation unit, and so on.
In an implementation process, the steps of the methods may be completed by hardware integrated logic circuits in the processor or by instructions in software form. The steps of the methods disclosed in the embodiments of this application may be directly embodied as being executed by a hardware processor, or by a combination of hardware and software modules in the processor. The software module may be located in a mature storage medium in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the foregoing methods in combination with its hardware. To avoid repetition, details are not described again here.
It should be noted that the processor in the embodiments of this application may be an integrated circuit chip with signal processing capability. In an implementation process, the steps of the foregoing method embodiments may be completed by hardware integrated logic circuits in the processor or by instructions in software form. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of this application. The general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of this application may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in the decoding processor. The software module may be located in a mature storage medium in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the foregoing methods in combination with its hardware.
It can be understood that the memory in the embodiments of this application may be a volatile memory or a nonvolatile memory, or may include both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically EPROM (EEPROM), a flash memory, a hard disk drive (HDD), or a solid state disk (SSD). The volatile memory may be a random access memory (RAM), used as an external cache.
By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memories of the systems and methods described herein are intended to include, but are not limited to, these and any other suitable types of memories.
This application further provides a computer program product. The computer program product includes computer program code; when the computer program code runs on a computer, the computer is caused to perform the method in any one of the foregoing method embodiments.
This application further provides a computer-readable medium. The computer-readable medium stores program code; when the program code runs on a computer, the computer is caused to perform the method in any one of the foregoing method embodiments.
This application further provides a system, which includes any one of the foregoing sensor calibration apparatuses and computing devices, together with the foregoing calibration apparatus.
The descriptions of the procedures corresponding to the foregoing figures each have their own focus; for a part not described in detail in one procedure, refer to the related descriptions of the other procedures.
In the foregoing embodiments, implementation may be entirely or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may be entirely or partially in the form of a computer program product. The computer program product includes one or more computer instructions; when these computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are entirely or partially generated.
The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless means (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, an SSD).

Claims (31)

  1. A sensor calibration method, comprising:
    obtaining a first point coordinate set, wherein the first point coordinate set comprises three-dimensional coordinates, in a lidar coordinate system, of scan points in a lidar calibration region on a calibration board, and the lidar calibration region comprises two non-parallel edges;
    performing line fitting based on the first point coordinate set to obtain expressions of a plurality of straight lines, wherein the plurality of straight lines comprise the straight lines on which the two non-parallel edges lie;
    estimating, based on the expressions of the plurality of straight lines, three-dimensional coordinates of a lidar calibration point on the calibration board in the lidar coordinate system; and
    performing extrinsic calibration of the lidar based on the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  2. The method according to claim 1, wherein the performing extrinsic calibration of the lidar based on the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system comprises:
    obtaining a second point coordinate set, wherein the second point coordinate set comprises coordinates, in an image coordinate system of a camera, of camera calibration points on the calibration board;
    determining a relative pose between the lidar and the camera based on the second point coordinate set, intrinsic parameters of the camera, and a positional relationship between the camera calibration region and the lidar calibration region; and
    performing extrinsic calibration of the lidar based on extrinsic parameters of the camera and the relative pose between the lidar and the camera.
  3. The method according to claim 2, wherein the determining a relative pose between the lidar and the camera based on the second point coordinate set, the intrinsic parameters of the camera, and the positional relationship between the camera calibration region and the lidar calibration region comprises:
    determining three-dimensional coordinates of the lidar calibration point in a camera coordinate system of the camera based on the second point coordinate set, the intrinsic parameters of the camera, and the positional relationship between the camera calibration region and the lidar calibration region; and
    determining the relative pose between the lidar and the camera based on the three-dimensional coordinates of the lidar calibration point in the camera coordinate system and the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  4. The method according to claim 1, wherein the method further comprises:
    obtaining a second point coordinate set, wherein the second point coordinate set comprises coordinates of camera calibration points in an image coordinate system of a camera; and
    calibrating the camera based on the second point coordinate set, positional relationships between camera calibration regions, and three-dimensional coordinates of the camera calibration points in a world coordinate system.
  5. The method according to claim 4, wherein the calibrating the camera based on the second point coordinate set, the positional relationships between the camera calibration regions, and the three-dimensional coordinates of the camera calibration points in the world coordinate system comprises:
    determining three-dimensional coordinates of the camera calibration points in the camera coordinate system based on intrinsic parameters of the camera and the second point coordinate set;
    determining the three-dimensional coordinates of the camera calibration points in the world coordinate system based on positional relationships between the camera calibration points; and
    performing extrinsic calibration of the camera based on the three-dimensional coordinates of the camera calibration points in the world coordinate system and the three-dimensional coordinates of the camera calibration points in the camera coordinate system.
  6. The method according to claim 5, wherein the second point coordinate set comprises point coordinates of a plurality of images obtained by the camera photographing the calibration board;
    wherein the method further comprises:
    determining, from the second point coordinate set based on the positional relationships between the camera calibration regions, coordinates of the camera calibration points in each of the plurality of images; and
    performing intrinsic calibration of the camera based on the coordinates of the camera calibration points in each of the plurality of images, to obtain the intrinsic parameters of the camera.
  7. A sensor calibration method, comprising:
    obtaining a second point coordinate set, wherein the second point coordinate set comprises coordinates of camera calibration points in an image coordinate system of a camera; and
    calibrating the camera based on the second point coordinate set, positional relationships between camera calibration regions, and three-dimensional coordinates of the camera calibration points in a world coordinate system.
  8. The method according to claim 7, wherein the calibrating the camera based on the second point coordinate set, the positional relationships between the camera calibration regions, and the three-dimensional coordinates of the camera calibration points in the world coordinate system comprises:
    determining three-dimensional coordinates of the camera calibration points in the camera coordinate system based on intrinsic parameters of the camera and the second point coordinate set;
    determining the three-dimensional coordinates of the camera calibration points in the world coordinate system based on positional relationships between the camera calibration points; and
    performing extrinsic calibration of the camera based on the three-dimensional coordinates of the camera calibration points in the world coordinate system and the three-dimensional coordinates of the camera calibration points in the camera coordinate system.
  9. The method according to claim 8, wherein the second point coordinate set comprises point coordinates of a plurality of images obtained by the camera photographing the calibration board;
    wherein the method further comprises:
    determining, from the second point coordinate set based on the positional relationships between the camera calibration regions, coordinates of the camera calibration points in each of the plurality of images; and
    performing intrinsic calibration of the camera based on the coordinates of the camera calibration points in each of the plurality of images, to obtain the intrinsic parameters of the camera.
  10. A sensor calibration apparatus, comprising:
    an obtaining module, configured to obtain a first point coordinate set, wherein the first point coordinate set comprises three-dimensional coordinates, in a lidar coordinate system, of scan points in a lidar calibration region on a calibration board, and the lidar calibration region comprises two non-parallel edges;
    a fitting module, configured to perform line fitting based on the first point coordinate set to obtain expressions of a plurality of straight lines, wherein the plurality of straight lines comprise the straight lines on which the two non-parallel edges lie;
    an estimation module, configured to estimate, based on the expressions of the plurality of straight lines, three-dimensional coordinates of a lidar calibration point on the calibration board in the lidar coordinate system; and
    a calibration module, configured to perform extrinsic calibration of the lidar based on the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  11. The apparatus according to claim 10, wherein the obtaining module is further configured to obtain a second point coordinate set, wherein the second point coordinate set comprises coordinates, in an image coordinate system of a camera, of camera calibration points on the calibration board; and
    the calibration module is specifically configured to:
    determine a relative pose between the lidar and the camera based on the second point coordinate set, intrinsic parameters of the camera, and a positional relationship between the camera calibration region and the lidar calibration region; and
    perform extrinsic calibration of the lidar based on extrinsic parameters of the camera and the relative pose between the lidar and the camera.
  12. The apparatus according to claim 11, wherein the calibration module is specifically configured to:
    determine three-dimensional coordinates of the lidar calibration point in a camera coordinate system of the camera based on the second point coordinate set, the intrinsic parameters of the camera, and the positional relationship between the camera calibration region and the lidar calibration region; and
    determine the relative pose between the lidar and the camera based on the three-dimensional coordinates of the lidar calibration point in the camera coordinate system and the three-dimensional coordinates of the lidar calibration point in the lidar coordinate system.
  13. The apparatus according to claim 10, wherein the obtaining module is further configured to obtain a second point coordinate set, wherein the second point coordinate set comprises coordinates of camera calibration points in an image coordinate system of a camera; and
    the calibration module is further configured to calibrate the camera based on the second point coordinate set, positional relationships between camera calibration regions, and three-dimensional coordinates of the camera calibration points in a world coordinate system.
  14. The apparatus according to claim 13, wherein the calibration module is specifically configured to:
    determine three-dimensional coordinates of the camera calibration points in the camera coordinate system based on the intrinsic parameters of the camera and the second point coordinate set;
    determine the three-dimensional coordinates of the camera calibration points in the world coordinate system based on positional relationships between the camera calibration points; and
    perform extrinsic calibration of the camera based on the three-dimensional coordinates of the camera calibration points in the world coordinate system and the three-dimensional coordinates of the camera calibration points in the camera coordinate system.
  15. The apparatus according to claim 14, wherein the second point coordinate set comprises point coordinates of a plurality of images obtained by the camera photographing the calibration board; and
    the calibration module is further specifically configured to:
    determine, from the second point coordinate set based on the positional relationships between the camera calibration regions, coordinates of the camera calibration points in each of the plurality of images; and
    perform intrinsic calibration of the camera based on the coordinates of the camera calibration points in each of the plurality of images, to obtain the intrinsic parameters of the camera.
  16. A sensor calibration apparatus, comprising:
    an obtaining module, configured to obtain a second point coordinate set, wherein the second point coordinate set comprises coordinates of camera calibration points in an image coordinate system of a camera; and
    a calibration module, configured to calibrate the camera based on the second point coordinate set, positional relationships between camera calibration regions, and three-dimensional coordinates of the camera calibration points in a world coordinate system.
  17. The apparatus according to claim 16, wherein the calibration module is specifically configured to:
    determine three-dimensional coordinates of the camera calibration points in the camera coordinate system based on intrinsic parameters of the camera and the second point coordinate set;
    determine the three-dimensional coordinates of the camera calibration points in the world coordinate system based on positional relationships between the camera calibration points; and
    perform extrinsic calibration of the camera based on the three-dimensional coordinates of the camera calibration points in the world coordinate system and the three-dimensional coordinates of the camera calibration points in the camera coordinate system.
  18. The apparatus according to claim 17, wherein the second point coordinate set comprises point coordinates of a plurality of images obtained by the camera photographing the calibration board; and
    the calibration module is further specifically configured to:
    determine, from the second point coordinate set based on the positional relationships between the camera calibration regions, coordinates of the camera calibration points in each of the plurality of images; and
    perform intrinsic calibration of the camera based on the coordinates of the camera calibration points in each of the plurality of images, to obtain the intrinsic parameters of the camera.
  19. A calibration apparatus, comprising one or more lidar calibration regions, wherein each lidar calibration region comprises two non-parallel edges, and the intersection of the straight lines on which the two non-parallel edges lie is used for calibrating extrinsic parameters of a lidar.
  20. The apparatus according to claim 19, wherein, among the plurality of lidar calibration regions, a first edge of a first lidar calibration region and a first edge of a second lidar calibration region lie on a first straight line, and a second edge of the first lidar calibration region and a second edge of the second lidar calibration region lie on a second straight line; the first edge and the second edge of the first lidar calibration region are non-parallel, and the first edge and the second edge of the second lidar calibration region are non-parallel.
  21. The apparatus according to claim 20, wherein a first intersection of the first straight line and the second straight line is a first vertex of the first lidar calibration region, and the first intersection is a first vertex of the second lidar calibration region.
  22. The apparatus according to claim 20 or 21, wherein, among the plurality of lidar calibration regions, a first edge of a third lidar calibration region and a second edge of a fourth lidar calibration region lie on a third straight line, and a second edge of the third lidar calibration region and a first edge of the fourth lidar calibration region lie on a fourth straight line; the first edge and the second edge of the third lidar calibration region are non-parallel, and the first edge and the second edge of the fourth lidar calibration region are non-parallel;
    wherein the first straight line is not parallel to the third straight line, and the second straight line is not parallel to the fourth straight line.
  23. The apparatus according to claim 22, wherein a second intersection of the second straight line and the fourth straight line is a second vertex of the first lidar calibration region, and the second intersection is a first vertex of the third lidar calibration region; a third intersection of the third straight line and the fourth straight line is a second vertex of the third lidar calibration region, and the third intersection is a first vertex of the fourth lidar calibration region; and a fourth intersection of the first straight line and the third straight line is a second vertex of the fourth lidar calibration region, and the fourth intersection is a second vertex of the second lidar calibration region.
  24. The apparatus according to any one of claims 19 to 21, wherein the shape of the lidar calibration region is a triangle.
  25. The apparatus according to any one of claims 19 to 24, wherein the apparatus further comprises a plurality of camera calibration regions, and the positional relationship between any one of the plurality of camera calibration regions and its adjacent camera calibration regions is different from the positional relationship between any other one of the plurality of camera calibration regions and its adjacent camera calibration regions.
  26. The apparatus according to claim 25, wherein the shape of the camera calibration region is a circle.
  27. A calibration apparatus, comprising a plurality of camera calibration regions, wherein the positional relationship between any one of the plurality of camera calibration regions and its adjacent camera calibration regions is different from the positional relationship between any other one of the plurality of camera calibration regions and its adjacent camera calibration regions.
  28. The apparatus according to claim 27, wherein the shape of the camera calibration region is a circle.
  29. A computer-readable storage medium, wherein the computer-readable storage medium stores instructions to be executed by a computing device, and the instructions are used to implement the sensor calibration method according to any one of claims 1 to 9.
  30. A sensor calibration system, comprising the sensor calibration apparatus according to any one of claims 10 to 15 and/or the calibration apparatus according to any one of claims 19 to 26.
  31. A sensor calibration system, comprising the sensor calibration apparatus according to any one of claims 16 to 18 and/or the calibration apparatus according to claim 27 or 28.
PCT/CN2020/115879 2019-10-08 2020-09-17 Sensor calibration method and sensor calibration apparatus WO2021068723A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910950634.6 2019-10-08
CN201910950634.6A CN112630750A (zh) Sensor calibration method and sensor calibration apparatus




Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104483664A (zh) * 2015-01-05 2015-04-01 中国科学院光电研究院 Method for center calibration of a single-line-array lidar device
CN107976669A (zh) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 Apparatus for determining extrinsic parameters between a camera and a lidar
CN106815873A (zh) * 2017-01-19 2017-06-09 宁波维森智能传感技术有限公司 Method and apparatus for determining camera intrinsic parameters
US20190073550A1 (en) * 2017-09-07 2019-03-07 Symbol Technologies, Llc Imaging-based sensor calibration
CN109946680A (zh) * 2019-02-28 2019-06-28 北京旷视科技有限公司 Extrinsic parameter calibration method and apparatus for a detection system, storage medium, and calibration system
CN110021046A (zh) * 2019-03-05 2019-07-16 中国科学院计算技术研究所 Extrinsic parameter calibration method and system for a combined camera and lidar sensor
CN110148174A (zh) * 2019-05-23 2019-08-20 北京阿丘机器人科技有限公司 Calibration board, and calibration board recognition method and apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230150129A1 (en) * 2021-11-15 2023-05-18 Bear Robotics, Inc. Method, system, and non-transitory computer-readable recording medium for controlling a robot
US11759949B2 (en) * 2021-11-15 2023-09-19 Bear Robotics, Inc. Method, system, and non-transitory computer-readable recording medium for controlling a robot

Also Published As

Publication number Publication date
CN112630750A (zh) 2021-04-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20874402; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20874402; Country of ref document: EP; Kind code of ref document: A1)