CN111880164A - Laser radar calibration device and method

Publication number: CN111880164A
Application number: CN202010711316.7A
Authority: CN (China)
Prior art keywords: light spot, spot image, laser beam, screen, laser
Inventors: 王超, 张泉, 孔令凯
Current assignee: Beijing Voyager Technology Co Ltd
Original assignee: Beijing Didi Infinity Technology and Development Co Ltd
Other languages: Chinese (zh)
Other versions: CN111880164B (granted publication)
Related application: PCT/CN2021/107644 (WO2022017419A1)
Application filed by Beijing Didi Infinity Technology and Development Co Ltd; application granted as CN111880164B
Legal status: Granted; Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating

Abstract

Embodiments of this specification provide a laser radar calibration device and a laser radar calibration method. The device comprises: a laser radar platform for carrying a laser radar to be calibrated; a coarse positioning module for acquiring a first light spot image of a laser beam of the laser radar to be calibrated; a movable fine positioning module for adjusting a second shooting position based on the first light spot image and acquiring a second light spot image of the laser beam, the resolution of the second light spot image being higher than that of the first light spot image; and a control module for determining the target position of the light spot in a preset coordinate system based on the second light spot image, and determining an actual exit angle of the laser beam based at least on the target position.

Description

Laser radar calibration device and method
Technical Field
The present disclosure relates to the field of laser technologies, and in particular, to a laser radar calibration apparatus and method.
Background
A lidar detects characteristics of the environment and the position (e.g., distance and angle), motion state (e.g., velocity, vibration, and attitude), and shape of an object within a target range by emitting laser light toward the target range or object and receiving the light reflected back from the object. Because errors may arise at various stages such as the design and manufacturing of mechanical parts, the lidar may carry certain errors, which in turn may cause deviations in the object data or information obtained when detecting objects.
Accordingly, it is desirable to provide a lidar calibration apparatus and method.
Disclosure of Invention
One aspect of the present description provides a lidar calibration apparatus. The device comprises: the laser radar platform is used for bearing a laser radar to be calibrated; the rough positioning module is used for acquiring a first light spot image of a laser beam of the laser radar to be calibrated; the movable fine positioning module is used for adjusting a second shooting position based on the first light spot image and acquiring a second light spot image of the laser beam, and the resolution of the second light spot image is higher than that of the first light spot image; the control module is used for determining the target position of the light spot in a preset coordinate system based on the second light spot image; and determining an actual exit angle of the laser beam based at least on the target position.
In some embodiments, the lidar stage comprises a controllable 360-degree rotating platform for controlling the lidar to be calibrated to rotate around a rotating shaft in the vertical direction so as to calibrate the emission modules of the lidar to be calibrated in different horizontal fields of view.
In some embodiments, the coarse positioning module comprises a first camera and a receiving screen; the receiving screen is used for receiving the laser beam; the first camera is used for shooting light spots formed by the laser beams on the receiving screen so as to obtain the first light spot image.
In some embodiments, the receiving screen comprises a liftable curtain.
In some embodiments, the fine positioning module comprises: the second camera and the photosensitive screen are arranged on the guide rail; the guide rail is used for providing a motion track for the second camera and the photosensitive screen to move in the plane of the receiving screen or a plane parallel to the receiving screen so as to position the fine positioning module to the second shooting position; the photosensitive screen is used for receiving the laser beam; the second camera is used for shooting light spots formed by the laser beams on the photosensitive screen so as to obtain a second light spot image.
In some embodiments, the guide rails include a first guide rail running along a horizontal edge of the receiving screen and a second guide rail running perpendicular to the first guide rail; the positioning accuracy of the guide rail is higher than 10 microns.
In some embodiments, the apparatus further comprises a distance adjuster for adjusting a distance between the lidar stage and the receiving screen.
In some embodiments, the distance adjuster comprises at least a track and a rangefinder.
In some embodiments, the apparatus further includes a detachable origin defining module, configured to define a spatial central origin of the lidar calibration apparatus, where the spatial central origin is a mapping position on the receiving screen when the actual exit angle of the laser beam is 0 degree.
In some embodiments, the origin defining module comprises a self-leveling laser line marker and an attitude-adjusting pan/tilt head; the laser line marker is used to define the position of the spatial center origin, and the pan/tilt head is used to adjust the position of the laser line marker.
In some embodiments, to determine the target position of the spot in a preset coordinate system based on the second spot image, the control module is further configured to: acquiring second position information of the light spot based on the second light spot image; and determining the target position of the light spot in a preset coordinate system based on the second position information of the light spot and the second shooting position.
Another aspect of the present description provides a lidar calibration method. The method comprises the following steps: acquiring a first light spot image of a laser beam emitted by a laser radar to be calibrated; determining a shooting position of a second light spot image of the laser beam based on the first light spot image and acquiring the second light spot image, wherein the resolution of the second light spot image is higher than that of the first light spot image; determining the target position of the light spot in a preset coordinate system based on the second light spot image; and determining an actual exit angle of the laser beam based at least on the target position.
In some embodiments, the acquiring a first spot image of the laser beam emitted by the lidar to be calibrated includes acquiring, by a coarse positioning module, the first spot image of the laser beam emitted by the lidar to be calibrated; the coarse positioning module comprises a first camera and a receiving screen; the receiving screen is used for receiving the laser beam; the first camera is used for shooting light spots formed by the laser beams on the receiving screen so as to obtain the first light spot image.
In some embodiments, the determining a capture position of a second spot image of the laser beam based on the first spot image and acquiring the second spot image includes: processing the first spot image to determine position data of the spot on the receiving screen; moving a fine positioning module based on the position data to enable the fine positioning module to acquire the second spot image; wherein the fine positioning module comprises: the second camera and the photosensitive screen are arranged on the guide rail; the guide rail is used for providing a motion track for the second camera and the photosensitive screen to move in the plane of the receiving screen or a plane parallel to the receiving screen so as to position the fine positioning module to the shooting position of the second light spot image; the photosensitive screen is used for receiving the laser beam; the second camera is used for shooting light spots formed by the laser beams on the photosensitive screen so as to obtain a second light spot image.
In some embodiments, the determining the target position of the light spot in a preset coordinate system based on the second light spot image comprises: acquiring second position information of the light spot based on the second light spot image; superposing the second position information and the shooting position of the second light spot image to obtain a target position of the light spot in a preset coordinate system; the second position information comprises position data of the light spot in the photosensitive screen, the shooting position is position data of the second camera and the photosensitive screen on a plane where the receiving screen is located, and the preset coordinate system is established based on the receiving screen.
In some embodiments, the method further comprises: defining a space center origin of the laser radar to be calibrated; the spatial center origin is a mapping position on the receiving screen when the actual emergence angle of the laser beam is 0 degree.
In some embodiments, the defining the spatial center origin of the lidar to be calibrated includes: adjusting the position of an origin defining module to enable the intersection point of two mutually perpendicular lines projected by two mutually perpendicular planes projected by the origin defining module on the laser radar to be calibrated to coincide with a laser emergent port of the laser radar to be calibrated; and defining the position information of the intersection point of two mutually perpendicular lines projected on the receiving screen by the two mutually perpendicular planes as the position information of the space center origin.
In some embodiments, said determining an actual exit angle of said laser beam based on at least said target position comprises: determining a first side length based on the target position and the position information of the space center origin in the preset coordinate system; determining a second side length based on the distance from the laser emitting port of the laser radar to be calibrated to the preset coordinate plane; and processing the first side length and the second side length by using an arc tangent function to determine the actual emergent angle of the laser beam.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an exemplary lidar calibration scenario shown in accordance with some embodiments of the present description;
FIG. 2 is a schematic diagram of an exemplary lidar calibration apparatus shown in accordance with some embodiments of the present description;
FIGS. 3A and 3B are schematic diagrams of an exemplary fine positioning module, shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary flow chart of a lidar calibration method shown in accordance with some embodiments of the present description;
FIG. 5 is a schematic diagram of an exemplary lidar calibration method shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; those steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Although various references are made herein to certain modules or units in a system according to embodiments of the present description, any number of different modules or units may be used and run on the client and/or server. The modules are merely illustrative and different aspects of the systems and methods may use different modules.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the preceding or following operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an exemplary lidar calibration scenario shown in accordance with some embodiments of the present description.
As shown in fig. 1, a lidar calibration scenario 100 may include a lidar 110 to be calibrated, a receiving screen 120, a first camera 130, and an origin definition module 140.
The lidar 110 to be calibrated may be used to emit a laser beam. In some embodiments, lidar 110 to be calibrated may be placed on a lidar stage 115. In some embodiments, lidar 110 to be calibrated may emit laser beams of different horizontal and pitch angles, i.e., may emit laser beams of different deflection angles in the horizontal and vertical directions, respectively. In some embodiments, lidar 110 to be calibrated may be placed on a table, a stand, or the like that may elevate the height of the lidar. By placing the laser radar to be calibrated at a high position, the laser radar to be calibrated can emit laser beams with different angle ranges in a vertical plane. In some embodiments, lidar 110 to be calibrated may include, but is not limited to, a pulsed lidar, a continuous wave lidar, and the like.
Receiving screen 120 may be used to receive a laser beam emitted by lidar 110 to be calibrated. The emitted laser beam, when reaching the receiving screen, may form a spot on the receiving screen. In some embodiments, the planar position of the receiving screen may be calibrated when the receiving screen is installed, ensuring that the receiving screen plane is absolutely vertical relative to the ground.
The first camera 130 may be used to capture a first spot image of the laser beam on the receiving screen 120. In some embodiments, the first camera 130 may include, but is not limited to, an optical camera or the like.
The origin definition module 140 may be used to define a spatial center origin in the lidar calibration scene 100. The spatial center origin is the position mapped onto the receiving screen when the actual exit angle of the laser beam is 0 degrees. The line connecting the spatial center origin and the laser exit port of the laser radar to be calibrated is parallel to the ground and perpendicular to the receiving screen. Specifically, the position of the origin defining module 140 may be adjusted so that the intersection point P of two mutually perpendicular lines, projected onto the laser radar 110 to be calibrated by two mutually perpendicular planes emitted by the origin defining module 140 (e.g., the planes in which J and W respectively lie in fig. 1), coincides with the laser exit port S of the laser radar 110 to be calibrated, where one of the two mutually perpendicular planes is perpendicular to the ground and the other is parallel to the ground; the position of the intersection point Q of the two mutually perpendicular lines projected by these two planes onto the receiving screen 120 is then defined as the position of the spatial center origin O. By defining a spatial center origin in the lidar calibration scene, the position of a light spot relative to the origin can be obtained, and the exit angle of the laser beam can then be derived. In some embodiments, the origin defining module may include a self-leveling laser line marker and an attitude-adjusting pan/tilt head. The laser line marker may be used to project two mutually perpendicular planes, one horizontal (i.e., parallel to the ground) and the other vertical (i.e., perpendicular to the ground). The pan/tilt head may be used to adjust the position of the laser line marker so that it is aligned with the exit port of the laser radar to be calibrated, and so that the two mutually perpendicular planes projected by the laser line marker in the horizontal and vertical directions are perpendicular to the vertical edge and the horizontal edge of the receiving screen, respectively (that is, the line connecting the laser line marker and the laser exit port of the laser radar to be calibrated is perpendicular to the receiving screen). In some embodiments, the origin defining module 140 may be provided as a detachable structure in the laser calibration apparatus. In some alternative embodiments, the origin defining module may be any device or apparatus capable of performing origin calibration, which is not limited in this specification.
In a particular embodiment, a spatial center origin in the lidar calibration scene 100 may be defined by the origin definition module 140. The spatial center origin is the position mapped onto the receiving screen when the actual exit angle of the laser beam is 0 degrees, for example, point Q in the figure. After the laser radar 110 to be calibrated emits a laser beam at a specified exit angle, a first spot image of the laser beam on the receiving screen 120 can be captured by the first camera 130. Based on the position information of the light spot in the first spot image and the position information of the spatial center origin, the coordinate position of the light spot relative to the origin can be obtained; based on this coordinate position and the distance between the laser exit port S of the laser radar 110 to be calibrated and the plane, parallel to the receiving screen, in which the spatial center origin lies, the actual exit angle of the laser beam can be calculated. By comparing the specified exit angle and the actual exit angle of the laser beam, the error angle of the laser beam can be obtained, and the laser radar can be calibrated based on this error angle.
Since the lidar can emit laser beams at many different angles (e.g., -60 to +60 degrees in a vertical plane, or -45 to +45 degrees in a horizontal plane), the camera needs a shooting field of view wide enough to cover the beams at the larger angles in order to calibrate beams at all angles. However, a wide field of view and low edge-imaging distortion are usually conflicting requirements for an optical camera: if a wide-angle lens is used to cover a sufficient field of view, geometric distortion may appear at the edges of the captured image, which distorts the spot in the acquired spot image when a laser beam with a large exit angle (e.g., 80 degrees) is measured, and thus affects the calculation of the actual exit angle of the laser beam. In addition, for a multiline lidar the angular difference between the emitted laser beams is small, so the measurement accuracy of the lidar calibration apparatus may be limited by the camera resolution. For example, a 5-megapixel camera can provide only millimeter resolution at a line-of-sight distance of two meters; for a lidar with 32 or more lines, the spots corresponding to beams at different angles may then fall at the same position in a captured spot image, so that the calculated actual exit angles for beams with different specified exit angles may be identical, degrading the calibration accuracy.
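As a rough back-of-the-envelope check of this resolution limit (the sensor layout and the covered screen width below are assumptions chosen only to illustrate the order of magnitude, not parameters of the device):

    # Assume a ~5-megapixel sensor of roughly 2560 x 1944 pixels whose field of view
    # must cover a receiving-screen region about 2.5 m wide from roughly 2 m away.
    sensor_width_px = 2560
    covered_width_m = 2.5

    resolution_mm_per_px = covered_width_m / sensor_width_px * 1000
    print(f"~{resolution_mm_per_px:.2f} mm per pixel")   # about 1 mm per pixel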
The embodiments of this specification provide a lidar calibration apparatus in which a movable fine positioning module is arranged at the receiving screen, so that a spot image with higher resolution can be obtained. The lidar calibration apparatus provided by the embodiments of this specification can acquire the position data of the light spot on the receiving screen from the first spot image of the laser beam captured by the first camera, move the fine positioning module based on that position data to adjust its shooting position, and acquire at close range a second spot image whose resolution is higher than that of the first spot image. More accurate position information of the light spot can then be obtained from the second spot image, which improves the accuracy and precision of lidar calibration.
Fig. 2 is a schematic diagram of an exemplary lidar calibration apparatus shown in accordance with some embodiments of the present description.
As shown in FIG. 2, lidar calibration apparatus 200 may include a lidar stage 210, a coarse positioning module 220, and a fine positioning module 230.
Lidar stage 210 may be used to carry a lidar to be calibrated. In some embodiments, lidar stage 210 may include a controllable 360 degree rotation platform 210-1. The controllable 360-degree rotating platform 210-1 may be used to control the lidar to be calibrated to rotate around a vertical rotating shaft, so as to calibrate the exit modules of different horizontal views of the lidar to be calibrated. For example, if the laser radar to be calibrated is composed of four emission modules covering a horizontal field of view of 90 degrees, when the laser radar is calibrated, the 360-degree controllable rotating platform 210-1 can be controlled to horizontally rotate to positions of 0 degree, 90 degrees, 180 degrees, and 270 degrees, respectively, so as to complete the calibration of the four emission modules of the laser radar. In some embodiments, the horizontal field of view of the exit module of the lidar to be calibrated may be at any angle, for example, a 45 degree horizontal field of view; accordingly, the controllable 360-degree rotating platform may rotate at different horizontal angles based on the horizontal view angle of the exit module of the laser radar to be calibrated, which is not limited in this specification.
The coarse positioning module 220 may be configured to acquire a first spot image of a laser beam of the lidar to be calibrated. A rough exit angle of the laser beam emitted by the lidar to be calibrated can be determined from the first spot image, which can assist the fine positioning module in completing its positioning. In some embodiments, the coarse positioning module 220 may acquire the first spot image of the laser beam at a first shooting position. In some embodiments, the first shooting position may be a fixed shooting position. In some embodiments, coarse positioning module 220 may include a receiving screen 220-1 and a first camera 220-2. Receiving screen 220-1 may be used to receive a laser beam emitted by the lidar to be calibrated. In some embodiments, receiving screen 220-1 may include, but is not limited to, a liftable curtain. The liftable curtain can be lowered before the first spot image is obtained and raised after the first spot image is obtained; raising the receiving screen after the first spot image is obtained prevents it from interfering with acquisition of the second spot image. In some embodiments, when the receiving screen is lowered, the plane of the receiving screen can be kept strictly perpendicular to the ground by calibrating the included angle between the receiving screen plane and the ground; alternatively, the laser beam position can be corrected by calculation based on the measured included angle between the receiving screen plane and the ground, thereby improving the accuracy of laser calibration. The first camera 220-2 may be used to photograph the spot formed by the laser beam on the receiving screen 220-1 to obtain the first spot image of the laser beam. In some embodiments, the first camera 220-2 may include, but is not limited to, a wide-angle camera or a non-wide-angle camera. In some embodiments, first camera 220-2 may be located on a support of lidar stage 210, such as a support mounted below the platform 210-1 of the lidar stage that carries the lidar to be calibrated, and receiving screen 220-1 may be located at a distance from the lidar stage, opposite first camera 220-2, e.g., in the position shown in FIG. 2.
The fine positioning module 230 may be used to acquire a second spot image of the laser beam. In some embodiments, the fine positioning module 230 may determine a second shooting position based on the position data of the spot in the first spot image, and acquire a second spot image of the laser beam at that second shooting position. In some embodiments, the second shooting position is the same as or close to the first position of the spot (i.e., the position of the spot on the receiving screen). The second spot image has a higher resolution than the first spot image. Second position information of the light spot can be acquired from this higher-resolution second spot image, and the actual exit angle of the laser beam can then be obtained by further processing of the second position information. The exit angle of the laser beam may include an azimuth angle and a pitch angle: the azimuth angle is the deflection angle in the horizontal direction parallel to the ground, and the pitch angle is the deflection angle in the vertical direction perpendicular to the ground. In some embodiments, fine positioning module 230 may be located in the plane in which receiving screen 220-1 lies, or in a plane parallel to the receiving screen. For more details on the fine positioning module 230, reference may be made to fig. 3A and fig. 3B and the related description thereof, which are not repeated herein.
In some embodiments, lidar calibration apparatus 200 may also include a control module (not shown). The control module may be used to control and/or schedule various modules (e.g., lidar stage 210, coarse positioning module 220, fine positioning module 230, etc.) in lidar calibration apparatus 200. In some embodiments, the control module may control the coarse positioning module to acquire the first spot image. In some embodiments, the control module may acquire position data of the spot based on the first spot image, and control the fine positioning module to move based on the position data. In some embodiments, the control module may determine the actual exit angle of the laser beam. Specifically, the control module may acquire second position information of the light spot based on the second light spot image, determine a target position of the light spot in the preset coordinate system based on the second position information and the corresponding second shooting position, and determine an actual emergence angle of the laser beam based on the target position and a distance between the laser emergence opening of the laser radar to be calibrated and a plane where the preset coordinate system is located. Further details regarding determining the actual emergence angle may be found in other parts of this specification (for example, fig. 4, fig. 5 and the related description thereof), and are not described herein again. In some embodiments, the control module may include, but is not limited to, a programmable chip, a desktop computer, a laptop computer, a cell phone mobile terminal, an iPad mobile terminal, and the like.
In some embodiments, lidar calibration apparatus 200 may also include a distance adjuster 250. Distance adjuster 250 may be used to adjust the distance between lidar stage 210 and receiving screen 220-1. By adjusting the distance between the lidar stage and the receiving screen, the lidar calibration apparatus can be adapted to calibrating lidars with various laser beam emission ranges. In some embodiments, the distance adjuster may include a track and a rangefinder. The track can be used to move the lidar stage, and the rangefinder can be used to measure the displacement of the lidar stage and/or the distance between the lidar stage and the receiving screen. In some embodiments, lidar calibration apparatus 200 may further include a detachable origin-defining module (e.g., origin definition module 140).
In a specific embodiment, the lidar to be calibrated may be placed on lidar stage 210, and the distance between the lidar stage and receiving screen 220-1 is adjusted by distance adjuster 250 according to the performance parameters of the lidar to be calibrated, so that the receiving screen can receive, within its plane, all laser beams emitted by the lidar to be calibrated. The receiving screen 220-1 is controlled to be in the lowered state; after the lidar to be calibrated emits a laser beam at the specified exit angle, the first camera 220-2 is controlled to shoot a first spot image of the laser beam, and the receiving screen is then controlled to rise. The control module determines a second shooting position of the laser beam based on the position data of the light spot in the first spot image and controls the fine positioning module to move to the second shooting position so as to acquire a second spot image of the laser beam. The actual exit angle of the laser beam is then determined based on the second position of the light spot in the second spot image, the second shooting position, and the distance between the laser exit port of the lidar to be calibrated and the fine positioning module. In some embodiments, the relative error angle of the laser beam may be obtained based on the actual exit angle of the laser beam and the specified exit angle at which the lidar to be calibrated emits the laser beam.
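The embodiment just described can be summarized in the following control-flow sketch. It is only an illustration: the objects lidar, coarse, fine and rails and all of their methods are hypothetical placeholders for the hardware interfaces, not an API defined by this specification, and all distances are assumed to be expressed in centimeters in the receiving-screen coordinate system.

    import math

    def calibrate_beam(lidar, coarse, fine, rails, l, origin_xy, specified_angles):
        """One calibration cycle for a single laser beam (sketch; see note above)."""
        lidar.emit(*specified_angles)              # emit a beam at the specified azimuth/pitch angles
        coarse.lower_screen()                      # lower the receiving screen
        first_image = coarse.capture()             # first (low-resolution) spot image
        coarse.raise_screen()                      # raise the screen so the fine module can move freely

        spot_xy = coarse.locate_spot(first_image)      # spot position on the receiving screen
        rails.move_to(spot_xy)                         # move the fine positioning module near the spot
        second_image = fine.capture()                  # second (high-resolution) spot image
        local_xy = fine.locate_spot(second_image)      # spot position on the photosensitive screen

        # Superpose the second shooting position and the local spot position.
        target_x = rails.position()[0] + local_xy[0]
        target_y = rails.position()[1] + local_xy[1]

        # Actual exit angles from the offsets to the spatial center origin and the distance l.
        alpha = math.degrees(math.atan((target_x - origin_xy[0]) / l))
        beta = math.degrees(math.atan((target_y - origin_xy[1]) / l))
        return specified_angles[0] - alpha, specified_angles[1] - beta   # error angles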
Fig. 3A and 3B are schematic diagrams of an exemplary fine positioning module, shown in accordance with some embodiments of the present description.
In some embodiments, the fine positioning module 230 may include a second camera 235 and a photosensitive screen 237. Second camera 235 and photosensitive screen 237 may be mounted on a rail that may be used to provide a motion track for the second camera and photosensitive screen to move within the plane of receiving screen 220-1 or within a plane parallel to receiving screen 220-1. The photosensitive screen can be used for receiving laser beams emitted by the laser radar to be calibrated, and the second camera can be used for shooting light spots formed on the photosensitive screen by the laser beams so as to obtain a second light spot image.
In some embodiments, the guide rails may include a first guide rail running along a horizontal edge of the receiving screen and a second guide rail running perpendicular to the first guide rail. For example, as shown in FIG. 3A (or in FIG. 2), the guide rails may include a first guide rail 233-1 running parallel to the horizontal edge of the receiving screen and a second guide rail 233-2 running perpendicular to the first guide rail 233-1. In some embodiments, the guide rails may be used to position the fine positioning module. The position of the fine positioning module corresponds to the second photographing position. For example, when the guide rail moves the fine positioning module to the second photographing position with respect to the receiving screen, the coordinate position of the second photographing position in the coordinate system established with the receiving screen may be obtained based on the displacement distance of the guide rail. In some embodiments, the higher the positioning accuracy of the guide rail, the more accurate the position of the fine positioning module (i.e., the second photographing position) is obtained. In some embodiments, the positioning accuracy of the guide rails is higher than 10 microns. For example, the displacement distance of the guide rail may be 10 microns, or 8 microns, or 5 microns, etc. for each movement of the guide rail.
In some embodiments, the second camera and the photosensitive screen may be mounted on the guide rails. For example, as shown in FIG. 3A, the second camera 235 and the photosensitive screen 237 may move up and down along the second rail 233-2, which may itself be fitted onto the first rail and move horizontally along it. In some embodiments, the second camera and the photosensitive screen may be in threaded fit with the second guide rail, and the second guide rail in threaded fit with the first guide rail; a first motor drives the second guide rail to rotate so that the second camera and the photosensitive screen move up and down along the second guide rail, and a second motor drives the first guide rail to rotate so that they move left and right along the first guide rail. The control module may control the first motor and the second motor, based on the first position information of the light spot, to move the fine positioning module to the vicinity of the first position of the light spot. As another example, the second camera and the photosensitive screen may be mounted on a mobile robot that moves on the guide rails; the control module may determine the second shooting position based on the first position information of the light spot and send the coordinate information of the second shooting position to the mobile robot, and the mobile robot carries the second camera and the photosensitive screen to the second shooting position based on this coordinate information. In some embodiments, the distance between the second camera and the lidar stage is greater than the distance between the photosensitive screen and the lidar stage. For example, as shown in FIG. 3B, the right face of photosensitive screen 237 may be parallel to the plane of receiving screen 220-1 so as to receive the laser beam emitted by the lidar to be calibrated, and second camera 235 may be positioned at a distance from the left face of photosensitive screen 237 to capture a second spot image of the laser beam on the photosensitive screen. In some embodiments, the photosensitive screen may be any screen that, after receiving the laser beam, presents the light spot simultaneously on both its left and right faces at the corresponding position. In some embodiments, the performance parameters of the second camera may be the same as or different from those of the first camera.
Compared with the coarse positioning module, the distance between the second camera and the photosensitive screen in the fine positioning module is short, and the fine positioning module can move to the position corresponding to the laser beam, so a clearer spot image of the laser beam can be acquired. A more accurate spot position can be obtained from this spot image, which in turn improves the calibration precision of the lidar calibration apparatus.
Fig. 4 is an exemplary flow diagram of a lidar calibration method shown in accordance with some embodiments of the present description.
Step 410, a first spot image of a laser beam emitted by a laser radar to be calibrated is acquired.
The first spot image may reflect the position of a spot formed on the receiving screen by the emitted laser beam. In some embodiments, a first spot image of a laser beam emitted by a lidar to be calibrated may be acquired by a coarse positioning module. The coarse positioning module may include a first camera and a receiving screen. The receiving screen can be used for receiving the laser beam, and the first camera can be used for shooting a light spot formed on the receiving screen by the laser beam to obtain a first light spot image of the laser beam. In some embodiments, the receiving screen may be a drop screen. In some embodiments, the system may control the coarse positioning module to acquire the first spot image based on the measurement instruction. For example, after the laser radar to be calibrated emits the laser beam with the specified emergence angle, the control module may control the coarse positioning module to shoot the first spot image of the laser beam based on the received measurement instruction. In some alternative embodiments, the first spot image of the laser beam may be obtained by other feasible manners, which is not limited by the present specification.
Step 420, determining a shooting position of a second light spot image of the laser beam based on the first light spot image, and acquiring the second light spot image.
Position data of the emitted laser beam in space, for example, position data of the laser beam relative to the receiving screen, may be acquired based on the first spot image; from this position data, the second shooting position of the laser beam (i.e., the shooting position of the second spot image) may be determined in order to acquire a second spot image with a resolution higher than that of the first spot image. In some embodiments, the second spot image of the laser beam may be acquired by the fine positioning module. Specifically, the position data of the light spot on the receiving screen may be obtained by processing the first spot image, and the fine positioning module may be moved based on this position data so that it can acquire the higher-resolution second spot image. For example, suppose a coordinate system is established with the intersection point at the upper left corner of the receiving screen as the origin, the horizontal edge of the receiving screen as the x-axis and the vertical edge as the y-axis, and the coordinate position of the light spot obtained from the first spot image is (50, 80). A certain point of the fine positioning module (such as the intersection point at the upper left corner of the photosensitive screen) may then be taken as a reference point and the fine positioning module moved relative to the intersection point at the upper left corner of the receiving screen so that the photosensitive screen can receive the laser beam, for example by moving 45 cm in the horizontal direction and 75 cm in the vertical direction (assuming that the initial position of the fine positioning module is the intersection point at the upper left corner of the receiving screen). In some embodiments, the photosensitive screen and the receiving screen may be adjusted to lie in the same plane; after the shooting position of the second spot image is determined, the receiving screen in the coarse positioning module may be lifted so that the fine positioning module can move and receive the laser beam in that plane.
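The displacement in the example above can be reproduced with a small helper (a sketch only; the 5 cm margins are an assumed offset chosen so that the spot lands inside the photosensitive screen, not a value prescribed by this specification):

    def fine_module_displacement(spot_xy, margin_xy=(5.0, 5.0)):
        """Displacement (in cm) of the fine module's reference point (e.g. the upper-left
        corner of the photosensitive screen) from the receiving screen's upper-left corner,
        chosen so that the spot falls inside the photosensitive screen."""
        return (spot_xy[0] - margin_xy[0], spot_xy[1] - margin_xy[1])

    # Spot at (50, 80) on the receiving screen -> move 45 cm horizontally and 75 cm
    # vertically, assuming the fine module starts at the receiving screen's upper-left corner.
    print(fine_module_displacement((50.0, 80.0)))   # (45.0, 75.0)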
In some embodiments, the fine positioning module may include a second camera and a photosensitive screen. The second camera and the photosensitive screen can be mounted on a guide rail, and the guide rail can be used for providing a motion track for the second camera and the photosensitive screen to move in the plane of the receiving screen or a plane parallel to the receiving screen so as to position the fine positioning module to the shooting position of the second light spot image. The photosensitive screen can be used for receiving the laser beam; the second camera may be used to photograph a spot of the laser beam formed on the photosensitive screen to obtain a second spot image of the laser beam. In some embodiments, the shooting position of the second spot image may be determined based on the movement displacement of the guide rail.
Step 430, acquiring second position information of the light spot based on the second light spot image.
In some embodiments, the second position information of the spot may be the coordinate position of the spot in the second spot image. In some embodiments, the second position information of the spot may be the coordinate position of the spot in the plane of the receiving screen. In some embodiments, the second position information of the light spot may be the coordinate position of the light spot on the photosensitive screen. The coordinate position of the light spot can be obtained by any feasible means, for example with OpenCV, MATLAB, and the like, which is not limited in this specification. More details about the position information of the light spot can be found in other parts of this specification (for example, fig. 5 and its related description), and will not be repeated here.
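A minimal OpenCV sketch of one feasible way to extract the spot's pixel coordinates from a spot image is given below. The threshold value is an assumption, and converting the pixel centroid into receiving-screen or photosensitive-screen coordinates still requires the camera-to-screen mapping discussed elsewhere in this specification.

    import cv2

    def spot_pixel_centroid(image_path, threshold=200):
        """Return the (x, y) pixel centroid of the laser spot, or None if no spot is found."""
        img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        # Keep only the bright spot; the screen background is assumed to be much darker.
        _, mask = cv2.threshold(img, threshold, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] == 0:
            return None
        return m["m10"] / m["m00"], m["m01"] / m["m00"]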
Step 440, determining the target position of the light spot in the preset coordinate system based on the second position information and the shooting position of the second light spot image.
In some embodiments, the preset coordinate system may be determined based on the second position information and/or the shooting position of the second spot image. For example, the coordinate system corresponding to the second position information may be taken as the preset coordinate system. In some embodiments, the target position of the light spot in the preset coordinate system may be determined by coordinate transformation. For example, suppose the preset coordinate system is the coordinate system established based on the receiving screen, while the coordinate system corresponding to the second position information is a coordinate system established based on the photosensitive screen (with the upper left corner of the photosensitive screen as the origin, the horizontal edge of the photosensitive screen as the x-axis, and the vertical edge as the y-axis); that is, the second position information contains the position data of the light spot on the photosensitive screen, and the coordinate system corresponding to the second shooting position is the coordinate system established based on the receiving screen, i.e., the second shooting position is the position data of the second camera and the photosensitive screen in the plane of the receiving screen. The second position and the second shooting position may then be superposed by coordinate addition (assuming that the second shooting position corresponds to the intersection point at the upper left corner of the photosensitive screen), so as to obtain the target position of the light spot in the preset coordinate system. Further details regarding the target position of the light spot can be found in other parts of this specification (e.g., fig. 5 and the related description thereof), and are not repeated herein.
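Under the assumption just stated (the second shooting position coincides with the upper-left corner of the photosensitive screen), the superposition is a simple coordinate addition, written here only as a restatement of the description above:

    (X_{\mathrm{spot}},\, Y_{\mathrm{spot}}) = (X_A + x_B,\; Y_A + y_B)

where (X_A, Y_A) is the second shooting position in the receiving-screen coordinate system and (x_B, y_B) is the spot position in the photosensitive-screen coordinate system.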
Step 450, determining the actual exit angle of the laser beam based on the target position.
In some embodiments, the actual exit angle of the laser beam may be determined based on the positional relationship between the target position of the spot and the spatial center origin of the lidar to be calibrated. The spatial center origin is the position mapped onto the receiving screen when the actual exit angle of the laser beam is 0 degrees. In some embodiments, the spatial center origin of the lidar to be calibrated may be defined by an origin definition module. Further details regarding the definition of the spatial center origin by the origin defining module can be found in fig. 1 and its related description, which are not repeated herein.
In some embodiments, the actual exit angle of the laser beam may be determined based on the positions of the target position of the light spot and the spatial center origin O in the preset coordinate system, together with the distance between the lidar to be calibrated and the preset coordinate plane. Specifically: a first side length is determined based on the positions of the target position of the light spot and the spatial center origin in the preset coordinate system; a second side length is determined based on the distance from the laser exit port of the lidar to be calibrated to the preset coordinate plane; and the first side length and the second side length are processed with an arc tangent function to determine the actual exit angle of the laser beam. In some embodiments, the first side length may include the displacement of the light spot relative to the spatial center origin in the horizontal direction and in the vertical direction.
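In the notation used with fig. 5 below, where x_1 and y_1 denote the horizontal and vertical components of the first side length and l denotes the second side length, this relationship can be written as (a restatement of the description above, not an additional result):

    \alpha = \arctan\left(\frac{x_1}{l}\right), \qquad \beta = \arctan\left(\frac{y_1}{l}\right)

where α is the actual azimuth angle and β is the actual pitch angle of the laser beam.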
It should be noted that the above description of method 400 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present description. Various modifications and alterations to method 400 will be apparent to those skilled in the art in light of this description. However, such modifications and variations are intended to be within the scope of the present description. For example, when the second position information of the spot in step 430 is directly converted into a spatial coordinate system (as a preset coordinate system) based on the relationship between the second spot image coordinate system and the second camera internal coordinate system, and the actual emergence angle is calculated in the spatial coordinate system, step 440 may be omitted.
FIG. 5 is a schematic diagram of an exemplary lidar calibration method shown in accordance with some embodiments of the present description.
For ease of understanding, the lidar calibration process will be described below in a specific embodiment in conjunction with fig. 5.
In one embodiment, a coordinate system (X, Y) may be established with the intersection point D at the upper left corner of the receiving screen as the origin and taken as the preset coordinate system. The horizontal edge of the receiving screen is the X-axis, with horizontally rightward as the positive X direction; the vertical edge of the receiving screen is the Y-axis, with vertically downward as the positive Y direction; the coordinate unit is centimeters. Since the first spot image is an image of the laser beam on the receiving screen, a first spot image covering the entire receiving screen can be acquired when the coarse positioning module shoots, and a first position of the spot in the preset coordinate system (corresponding to the position data of the spot on the receiving screen) can be obtained from the first spot image. For example only, the coordinates of the light spot in the first spot image may be determined, and the position of the light spot in a spatial coordinate system may then be calculated based on the relationship between the image coordinate system and the first camera's internal coordinate system and the relationship between the first camera's internal coordinate system and the spatial coordinate system. Here, the XOY plane of the spatial (three-dimensional) coordinate system may be the aforementioned preset (two-dimensional) coordinate system determined based on the receiving screen, and the Z-axis of the spatial coordinate system is perpendicular to the preset coordinate system. Further, a second shooting position of the fine positioning module may be determined based on the first position, and the fine positioning module is moved to that shooting position along the guide rails to acquire a second spot image of the laser beam. Because the fine positioning module has a certain physical size, in actual operation a point in the plane of the photosensitive screen may be taken as a reference point, this reference point is moved relative to a fixed point (for example, the intersection point D at the upper left corner of the receiving screen), and the position of the reference point after the movement is defined as the second shooting position. For example, with the intersection point at the upper left corner of the photosensitive screen as the reference point, the reference point is moved relative to the intersection point D at the upper left corner of the receiving screen to a second shooting position such that the photosensitive screen covers the first position of the light spot; assuming the second shooting position is the position of point A in fig. 5(a), the coordinates of A are (50, 70).
When the second position of the light spot in the second spot image is obtained, a coordinate system can be established with the intersection point at the upper left corner of the photosensitive screen as the origin (corresponding to point A in fig. 5(a)), and the second position of the light spot in this coordinate system is obtained; assume the coordinates of the second position B are (20, 15). The target position of point B in the preset coordinate system (X, Y) can then be obtained as (70, 85) by superposition.
The coordinates of point B and of the spatial center origin O may be transformed into the same coordinate system to calculate the distance between the two. For example, the values corresponding to x1 and/or y1 in fig. 5(a) can be determined as the first side length, where the value of x1 is the length (or distance) between point B and point O along the horizontal direction parallel to the ground, and the value of y1 is the length (or distance) between point B and point O along the vertical direction perpendicular to the ground.
If the distance from the laser exit port S of the lidar to be calibrated to the preset coordinate plane is l, then, as shown in fig. 5(b) and (c), the actual exit angle α of the laser beam in the horizontal direction, i.e., the actual azimuth angle of the laser beam, can be obtained by an arc tangent calculation from the first side length corresponding to x1 and the second side length l; likewise, the actual exit angle β of the laser beam in the vertical direction, i.e., the actual pitch angle of the laser beam, can be obtained by an arc tangent calculation from the first side length corresponding to y1 and the second side length l. In some embodiments, the differences from α and β may be calculated based on the specified exit angle of the laser beam in the horizontal direction and the specified exit angle in the vertical direction, respectively, to obtain the error angles of the laser beam in the horizontal and vertical directions.
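A minimal numerical sketch of this step follows; the values of x1, y1 and l are illustrative assumptions (the embodiment does not give numeric values for them), as are the specified exit angles.

    import math

    # Assumed example values in centimeters: horizontal/vertical offsets of the spot's
    # target position from the spatial center origin O, and the distance l from the
    # laser exit port S to the preset coordinate plane.
    x1, y1 = 20.0, 15.0
    l = 200.0

    alpha = math.degrees(math.atan(x1 / l))   # actual azimuth angle (horizontal)
    beta = math.degrees(math.atan(y1 / l))    # actual pitch angle (vertical)

    # Error angles relative to assumed specified exit angles.
    spec_alpha, spec_beta = 6.0, 5.0
    print(f"alpha={alpha:.3f} deg, beta={beta:.3f} deg")
    print(f"errors: ({spec_alpha - alpha:.3f}, {spec_beta - beta:.3f}) deg")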
In some embodiments, the laser beam emitted by the lidar to be calibrated may be adjusted based on an error angle of the laser beam. In some embodiments, the laser beam emitted by the laser radar to be calibrated may be adjusted by calibrating the laser beam at different specified exit angles, and statistically obtaining an error angle table or curve (reflecting the mapping relationship between the different specified exit angles and the actual exit angle) of the laser radar to be calibrated.
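One way such an error table could be applied is sketched below; the table entries are hypothetical measurements used only for illustration, and a real table would be produced by the calibration runs described above.

    # Hypothetical calibration table: specified exit angle -> measured actual exit angle (degrees).
    calibration_table = {-10.0: -10.4, 0.0: 0.1, 10.0: 9.7, 20.0: 19.6}

    def error_angle(specified):
        """Error (specified minus actual) at a specified exit angle, linearly interpolated."""
        xs = sorted(calibration_table)
        if specified <= xs[0] or specified >= xs[-1]:
            nearest = min(xs, key=lambda x: abs(x - specified))
            return specified - calibration_table[nearest]
        for lo, hi in zip(xs, xs[1:]):
            if lo <= specified <= hi:
                t = (specified - lo) / (hi - lo)
                actual = calibration_table[lo] + t * (calibration_table[hi] - calibration_table[lo])
                return specified - actual

    print(error_angle(5.0))   # interpolated error at a specified exit angle of 5 degrees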
It should be noted that the above description of method 500 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present application. Various modifications and alterations to method 500 will be apparent to those skilled in the art in light of the present application. For example, the predetermined coordinate system in fig. 5 may be a coordinate system established with a point corresponding to the spatial center origin of the lidar calibration apparatus on the plane where the receiving screen is located as the origin. However, such modifications and variations are intended to be within the scope of the present application.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) by utilizing the movable fine positioning module, a spot image of the laser beam with higher resolution can be obtained; (2) the spot position is obtained based on the spot image with higher resolution of the laser beam, and the accuracy of the laser radar calibration device can be improved. It is to be noted that different embodiments may produce different advantages, and in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present description may be illustrated and described in terms of several patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of this description may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software, which may generally be referred to herein as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present description may take the form of a computer program product, including computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of this specification may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python, conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or offered as a service, such as software as a service (SaaS).
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in each claim. Rather, the claimed embodiments may be characterized by less than all of the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components and attributes; it should be understood that such numbers used in the description of the embodiments are in some instances qualified by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that a variation of ±20% in the stated number is allowed. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and adopt a general method of preserving digits. Notwithstanding that the numerical ranges and parameters used to set forth the breadth of their ranges in some embodiments of the specification are approximations, in specific embodiments such numerical values are set forth as precisely as practicable.
For each patent, patent application, patent application publication, and other material cited in this specification, such as articles, books, specifications, publications, and documents, the entire contents thereof are hereby incorporated by reference into this specification. Application history documents that are inconsistent with or conflict with the contents of this specification are excluded, as are documents (currently or later appended to this specification) that limit the broadest scope of the claims of this specification. It should be noted that where the descriptions, definitions, and/or use of terms in the materials accompanying this specification are inconsistent with or contrary to those set forth in this specification, the descriptions, definitions, and/or use of terms in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (18)

1. A lidar calibration apparatus, the apparatus comprising:
the laser radar platform is used for bearing a laser radar to be calibrated;
the rough positioning module is used for acquiring a first light spot image of a laser beam of the laser radar to be calibrated;
the movable fine positioning module is used for adjusting a second shooting position based on the first light spot image and acquiring a second light spot image of the laser beam, and the resolution of the second light spot image is higher than that of the first light spot image;
the control module is used for determining the target position of the light spot in a preset coordinate system based on the second light spot image; and determining an actual exit angle of the laser beam based at least on the target position.
2. The apparatus of claim 1, wherein the laser radar platform comprises a controllable 360-degree rotating platform for controlling the laser radar to be calibrated to rotate around a vertical rotating shaft, so as to calibrate the exit modules of the laser radar to be calibrated for different horizontal fields of view.
3. The apparatus of claim 1, the coarse positioning module comprising a first camera and a receiving screen;
the receiving screen is used for receiving the laser beam;
the first camera is used for shooting light spots formed by the laser beams on the receiving screen so as to obtain the first light spot image.
4. The apparatus of claim 3, the receiving screen comprising a liftable curtain.
5. The apparatus of claim 3, the fine positioning module comprising: the second camera and the photosensitive screen are arranged on the guide rail;
the guide rail is used for providing a motion track for the second camera and the photosensitive screen to move in the plane of the receiving screen or a plane parallel to the receiving screen so as to position the fine positioning module to the second shooting position;
the photosensitive screen is used for receiving the laser beam;
the second camera is used for shooting light spots formed by the laser beams on the photosensitive screen so as to obtain a second light spot image.
6. The apparatus of claim 5, the guide rails comprising a first guide rail running along a horizontal edge of the receiving screen and a second guide rail running perpendicular to the first guide rail; the positioning accuracy of the guide rail is higher than 10 microns.
7. The apparatus of claim 1, further comprising a distance adjuster to adjust a distance between the laser radar platform and the coarse positioning module.
8. The apparatus of claim 7, the distance adjuster comprising at least a track and a rangefinder.
9. The apparatus of claim 1, further comprising a detachable origin defining module for defining a spatial center origin of the lidar calibration apparatus, the spatial center origin being a mapping position on a receiving screen at which an actual exit angle of the laser beam is 0 degrees.
10. The apparatus of claim 9, the origin defining module comprising a self-leveling laser line marking instrument and a pose adjustment stage;
the self-leveling laser line marking instrument is used for defining the position of the spatial center origin, and the pose adjustment stage is used for adjusting the position of the self-leveling laser line marking instrument.
11. The apparatus of claim 1, wherein to determine the target position of the spot in a preset coordinate system based on the second spot image, the control module is further configured to:
acquiring second position information of the light spot based on the second light spot image;
and determining the target position of the light spot in a preset coordinate system based on the second position information of the light spot and the second shooting position.
12. A lidar calibration method, the method comprising:
acquiring a first light spot image of a laser beam emitted by a laser radar to be calibrated;
determining a shooting position of a second light spot image of the laser beam based on the first light spot image and acquiring the second light spot image, wherein the resolution of the second light spot image is higher than that of the first light spot image;
determining the target position of the light spot in a preset coordinate system based on the second light spot image;
and determining an actual exit angle of the laser beam based at least on the target position.
13. The method of claim 12, the acquiring a first spot image of the laser beam emitted by the lidar to be calibrated comprising acquiring, by a coarse positioning module, a first spot image of the laser beam emitted by the lidar to be calibrated;
the coarse positioning module comprises a first camera and a receiving screen; the receiving screen is used for receiving the laser beam; the first camera is used for shooting light spots formed by the laser beams on the receiving screen so as to obtain the first light spot image.
14. The method of claim 13, the determining a capture location of a second spot image of the laser beam based on the first spot image and acquiring a second spot image, comprising:
processing the first spot image to determine position data of the spot on the receiving screen;
moving a fine positioning module based on the position data to enable the fine positioning module to acquire the second spot image;
wherein the fine positioning module comprises: the second camera and the photosensitive screen are arranged on the guide rail;
the guide rail is used for providing a motion track for the second camera and the photosensitive screen to move in the plane of the receiving screen or a plane parallel to the receiving screen so as to position the fine positioning module to the shooting position of the second light spot image;
the photosensitive screen is used for receiving the laser beam;
the second camera is used for shooting light spots formed by the laser beams on the photosensitive screen so as to obtain a second light spot image.
15. The method of claim 14, wherein determining the target position of the spot in the preset coordinate system based on the second spot image comprises:
acquiring second position information of the light spot based on the second light spot image;
superposing the second position information and the shooting position of the second light spot image to obtain a target position of the light spot in a preset coordinate system;
the second position information comprises position data of the light spot in the photosensitive screen, the shooting position is position data of the second camera and the photosensitive screen on a plane where the receiving screen is located, and the preset coordinate system is established based on the receiving screen.
16. The method of claim 13, further comprising:
defining a spatial center origin of the laser radar to be calibrated, the spatial center origin being a mapping position on the receiving screen when the actual exit angle of the laser beam is 0 degrees.
17. The method of claim 16, the defining a spatial center origin of the lidar to be calibrated comprising:
adjusting the position of an origin defining module so that the intersection point of two mutually perpendicular lines, projected onto the laser radar to be calibrated by two mutually perpendicular planes emitted by the origin defining module, coincides with the laser exit port of the laser radar to be calibrated;
and defining the position information of the intersection point of two mutually perpendicular lines projected on the receiving screen by the two mutually perpendicular planes as the position information of the space center origin.
18. The method of claim 17, the determining an actual exit angle of the laser beam based at least on the target location, comprising:
determining a first side length based on the target position and the position information of the space center origin in the preset coordinate system;
determining a second side length based on the distance from the laser exit port of the laser radar to be calibrated to the preset coordinate plane;
and processing the first side length and the second side length by using an arc tangent function to determine the actual exit angle of the laser beam.
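As an editorial, non-authoritative sketch of the spot-position computation recited in claims 14 and 15 (the intensity-weighted centroid, the pixel-to-millimeter conversion, and all parameter names are assumptions introduced for illustration, not part of the claims), the second position information of the spot might be superposed with the shooting position as follows:

    import numpy as np

    def spot_target_position(spot_image, pixel_size_mm, shooting_position_mm):
        # spot_image: grayscale second light spot image captured on the photosensitive screen
        # pixel_size_mm: physical size of one image pixel on the photosensitive screen
        # shooting_position_mm: (x, y) of the fine positioning module in the plane of
        #                       the receiving screen, i.e., in the preset coordinate system
        img = spot_image.astype(float)
        total = img.sum()
        ys, xs = np.indices(img.shape)
        # Second position information: intensity-weighted centroid of the spot (mm)
        x_local = (xs * img).sum() / total * pixel_size_mm
        y_local = (ys * img).sum() / total * pixel_size_mm
        # Superpose with the shooting position to obtain the target position
        return (shooting_position_mm[0] + x_local,
                shooting_position_mm[1] + y_local)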
CN202010711316.7A 2020-07-22 2020-07-22 Laser radar calibration device and method Active CN111880164B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010711316.7A CN111880164B (en) 2020-07-22 2020-07-22 Laser radar calibration device and method
PCT/CN2021/107644 WO2022017419A1 (en) 2020-07-22 2021-07-21 Laser radar calibration device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010711316.7A CN111880164B (en) 2020-07-22 2020-07-22 Laser radar calibration device and method

Publications (2)

Publication Number Publication Date
CN111880164A true CN111880164A (en) 2020-11-03
CN111880164B CN111880164B (en) 2023-02-28

Family

ID=73155310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010711316.7A Active CN111880164B (en) 2020-07-22 2020-07-22 Laser radar calibration device and method

Country Status (2)

Country Link
CN (1) CN111880164B (en)
WO (1) WO2022017419A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112526486A (en) * 2020-11-23 2021-03-19 哈尔滨工业大学 Three-dimensional laser radar space coordinate calibration method based on shafting error model
CN112689114A (en) * 2021-03-11 2021-04-20 太平金融科技服务(上海)有限公司 Method, apparatus, device and medium for determining target position of vehicle
CN112946607A (en) * 2021-01-26 2021-06-11 光为科技(广州)有限公司 Calibration method, system and machine readable medium for optical detection and ranging device
CN113176579A (en) * 2021-03-01 2021-07-27 奥比中光科技集团股份有限公司 Light spot position self-adaptive searching method, time flight ranging system and ranging method
CN113296082A (en) * 2021-05-28 2021-08-24 南京牧镭激光科技有限公司 Calibration method and auxiliary device for monitoring clearance distance of fan by using laser clearance radar
CN113433520A (en) * 2021-08-26 2021-09-24 盎锐(常州)信息科技有限公司 Zero detection method and system and laser radar
CN113567967A (en) * 2021-08-27 2021-10-29 北京航迹科技有限公司 Laser radar calibration device and method
WO2022017419A1 (en) * 2020-07-22 2022-01-27 北京航迹科技有限公司 Laser radar calibration device and method
CN113985420A (en) * 2021-12-28 2022-01-28 四川吉埃智能科技有限公司 Method for compensating scanning light path error of laser radar inclined by 45 degrees
WO2022227844A1 (en) * 2021-04-30 2022-11-03 北京航迹科技有限公司 Laser radar correction apparatus and method
CN116879872A (en) * 2023-09-05 2023-10-13 家园数字科技(吉林省)有限公司 Laser radar calibration equipment

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115615338B (en) * 2022-09-09 2024-02-20 成都飞机工业(集团)有限责任公司 Aircraft complete machine level measurement system and measurement method
CN117111044A (en) * 2023-10-25 2023-11-24 武汉市品持科技有限公司 Laser radar pitch angle and spot size automatic correction equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353950A (en) * 2011-10-18 2012-02-15 中国工程物理研究院应用电子学研究所 Laser radar optical system with optical axis calibrating function and optical axis calibrating method
US20180059221A1 (en) * 2016-08-31 2018-03-01 Qualcomm Incorporated Hybrid scanning lidar systems
CN110553605A (en) * 2019-09-18 2019-12-10 苏州华兴源创科技股份有限公司 System and method for measuring deflection angle error of laser radar
CN111208496A (en) * 2020-03-10 2020-05-29 广东博智林机器人有限公司 Calibration device and calibration method for laser radar
CN111308450A (en) * 2020-03-13 2020-06-19 广东博智林机器人有限公司 Laser radar calibration device and application method thereof

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201317974D0 (en) * 2013-09-19 2013-11-27 Materialise Nv System and method for calibrating a laser scanning system
CN111044990B (en) * 2018-10-11 2022-10-18 北京北科天绘科技有限公司 Airborne laser radar beam pointing calibration method and system and laser spot detector
CN110749876B (en) * 2019-08-30 2021-11-19 上海禾赛科技有限公司 Calibration method and calibration structure for laser radar
KR200491988Y1 (en) * 2019-09-30 2020-07-13 (주)이즈미디어 Calibration device for tof camera
CN111880164B (en) * 2020-07-22 2023-02-28 北京嘀嘀无限科技发展有限公司 Laser radar calibration device and method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102353950A (en) * 2011-10-18 2012-02-15 中国工程物理研究院应用电子学研究所 Laser radar optical system with optical axis calibrating function and optical axis calibrating method
US20180059221A1 (en) * 2016-08-31 2018-03-01 Qualcomm Incorporated Hybrid scanning lidar systems
CN110553605A (en) * 2019-09-18 2019-12-10 苏州华兴源创科技股份有限公司 System and method for measuring deflection angle error of laser radar
CN111208496A (en) * 2020-03-10 2020-05-29 广东博智林机器人有限公司 Calibration device and calibration method for laser radar
CN111308450A (en) * 2020-03-13 2020-06-19 广东博智林机器人有限公司 Laser radar calibration device and application method thereof

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022017419A1 (en) * 2020-07-22 2022-01-27 北京航迹科技有限公司 Laser radar calibration device and method
CN112526486B (en) * 2020-11-23 2022-06-14 哈尔滨工业大学 Three-dimensional laser radar space coordinate calibration method based on shafting error model
CN112526486A (en) * 2020-11-23 2021-03-19 哈尔滨工业大学 Three-dimensional laser radar space coordinate calibration method based on shafting error model
CN112946607B (en) * 2021-01-26 2023-10-27 光为科技(广州)有限公司 Calibration method, system and machine readable medium for light detection and ranging device
CN112946607A (en) * 2021-01-26 2021-06-11 光为科技(广州)有限公司 Calibration method, system and machine readable medium for optical detection and ranging device
CN113176579A (en) * 2021-03-01 2021-07-27 奥比中光科技集团股份有限公司 Light spot position self-adaptive searching method, time flight ranging system and ranging method
CN112689114B (en) * 2021-03-11 2021-06-22 太平金融科技服务(上海)有限公司 Method, apparatus, device and medium for determining target position of vehicle
CN112689114A (en) * 2021-03-11 2021-04-20 太平金融科技服务(上海)有限公司 Method, apparatus, device and medium for determining target position of vehicle
WO2022227844A1 (en) * 2021-04-30 2022-11-03 北京航迹科技有限公司 Laser radar correction apparatus and method
CN113296082A (en) * 2021-05-28 2021-08-24 南京牧镭激光科技有限公司 Calibration method and auxiliary device for monitoring clearance distance of fan by using laser clearance radar
CN113433520A (en) * 2021-08-26 2021-09-24 盎锐(常州)信息科技有限公司 Zero detection method and system and laser radar
CN113433520B (en) * 2021-08-26 2021-12-17 盎锐(常州)信息科技有限公司 Zero detection method and system and laser radar
CN113567967A (en) * 2021-08-27 2021-10-29 北京航迹科技有限公司 Laser radar calibration device and method
CN113985420A (en) * 2021-12-28 2022-01-28 四川吉埃智能科技有限公司 Method for compensating scanning light path error of laser radar inclined by 45 degrees
CN113985420B (en) * 2021-12-28 2022-05-03 四川吉埃智能科技有限公司 Method for compensating scanning light path error of laser radar inclined by 45 degrees
CN116879872A (en) * 2023-09-05 2023-10-13 家园数字科技(吉林省)有限公司 Laser radar calibration equipment
CN116879872B (en) * 2023-09-05 2023-11-07 家园数字科技(吉林省)有限公司 Laser radar calibration equipment

Also Published As

Publication number Publication date
CN111880164B (en) 2023-02-28
WO2022017419A1 (en) 2022-01-27

Similar Documents

Publication Publication Date Title
CN111880164B (en) Laser radar calibration device and method
CN113204004A (en) Laser radar calibration device and method
KR101214081B1 (en) Image expression mapping system using space image and numeric information
CN109813509B (en) Method for realizing measurement of vertical dynamic disturbance degree of high-speed rail bridge based on unmanned aerial vehicle
CN111025032B (en) Aerial beam measuring system and method based on lift-off platform
CN112697112A (en) Method and device for measuring horizontal plane inclination angle of camera
CN103493470A (en) Method for determining adjustment deviations of an image data capture chip of an optical camera and corresponding adjustment verification devices
CN113189568B (en) Laser radar calibration device and method
CN113496523A (en) System and method for three-dimensional calibration of visual system
CN107534715B (en) Camera production method and advanced driving assistance system
CN109813510B (en) High-speed rail bridge vertical dynamic disturbance degree measuring method based on unmanned aerial vehicle
CN105627954A (en) Included angle measurement method and device and angle adjustment method and device
CN109798874B (en) Method for measuring vertical dynamic disturbance of high-speed railway bridge
CN108931236B (en) Industrial robot tail end repeated positioning precision measuring device and method
US20230045402A1 (en) Laser Leveling Device and Leveling Method
KR101283932B1 (en) Method for measuring direction error of gimbal platform and apparatus thereof
CN115685155A (en) Laser radar calibration equipment and method
CN114964316A (en) Position and attitude calibration method and device, and method and system for measuring target to be measured
CN114222115A (en) Optical anti-shake calibration method, device, equipment and medium
CN112013810A (en) Distance measurement method, device and equipment
CN109813231B (en) Method for measuring vertical dynamic disturbance of high-speed railway bridge
CN116012455A (en) Relative position relation determining method, structured light imaging method and related system
CN114002706A (en) Measuring method and device of photoelectric sight-stabilizing measuring system and computer equipment
US8488200B2 (en) System and method for reproducing images onto surfaces
NL2032374B1 (en) Calibration method for survey instrument and system for calibrating a survey instrument

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230506

Address after: 100193 no.218, 2nd floor, building 34, courtyard 8, Dongbeiwang West Road, Haidian District, Beijing

Patentee after: Beijing Track Technology Co.,Ltd.

Address before: 100193 No. 34 Building, No. 8 Courtyard, West Road, Dongbei Wanglu, Haidian District, Beijing

Patentee before: BEIJING DIDI INFINITY TECHNOLOGY AND DEVELOPMENT Co.,Ltd.

TR01 Transfer of patent right