WO2023103198A1 - Method, device and storage medium for calculating relative extrinsic parameters of a ranging system - Google Patents

Method, device and storage medium for calculating relative extrinsic parameters of a ranging system

Info

Publication number
WO2023103198A1
Authority
WO
WIPO (PCT)
Prior art keywords
scanning
moment
time
relative
target object
Prior art date
Application number
PCT/CN2022/080516
Other languages
English (en)
French (fr)
Inventor
刘浏
徐玉华
闫敏
余宇山
杨晓立
赵鑫
Original Assignee
深圳奥锐达科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳奥锐达科技有限公司
Publication of WO2023103198A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating

Definitions

  • the present application relates to the field of multi-sensor fusion, and more specifically, relates to a method, device and storage medium for calculating relative extrinsic parameters of a ranging system.
  • the measurement data of each sensor can be collected and managed in a unified manner through cooperation and coordination between multiple sensors, and the final parameter measurement results can be obtained, achieving more accurate measurement.
  • sensors installed on a car may include temperature sensors, pressure sensors, speed sensors, acceleration sensors, lidar, cameras, etc.
  • the distance between the car and the front car, the rear car, the roads on both sides and each target object can be measured by laser radar;
  • an image acquisition device such as a camera collects images of the scene around the car;
  • the processor completes the three-dimensional reconstruction of the scene around the car according to the collected images and the distance of the measured object.
  • when the distance of the target object is measured by lidar on a car, some lidars are configured, depending on the sampling method, to measure the target field of view by row-by-row, column-by-column or point-by-point scanning.
  • if the lidar itself moves during such scanning, for example because it moves together with the car while driving, its pose differs at each measurement moment, the result of one frame of measurement becomes inaccurate, and driving safety is affected.
  • in view of this, the present application provides a method, device and storage medium for calculating the relative extrinsic parameters of a ranging system.
  • the method can measure the pose parameters of the lidar in real time and calculate the relative extrinsic parameters of the ranging system; based on the relative extrinsic parameters, the measured data can be corrected, improving the measurement accuracy of the ranging system.
  • in a first aspect, a method for calculating the relative extrinsic parameters of a ranging system is provided, the method including: controlling a lidar to scan a target object in a preset scanning mode, so as to obtain 3D point cloud data of the target object at each scanning moment; at each scanning moment, synchronously controlling an inertial navigation device to measure real-time pose parameters of the lidar, the real-time pose parameters including displacement acceleration and/or rotational angular velocity; determining a reference moment, calculating the time difference between each scanning moment and the reference moment, and calculating a relative pose offset according to the time difference and the real-time pose parameters at each scanning moment; and calculating the relative extrinsic parameters of the ranging system at each scanning moment according to the relative pose offset.
  • in the above technical solution, when the lidar measures the distance data of the target object, the inertial navigation device is used to measure the real-time pose parameters of the lidar, the relative pose offset of the lidar while measuring the distance data is further determined, and the relative extrinsic parameters of the ranging system are obtained from the relative pose offset, which eliminates the measurement error caused by the motion of the lidar itself and improves the measurement accuracy of the lidar.
  • in some possible implementations, the reference moment is a first reference moment, and the first reference moment is any one of the multiple scanning moments.
  • the method further includes: according to the relative extrinsic parameters, correcting the 3D point cloud data collected at the other scanning moments among the multiple scanning moments into the 3D point cloud data collected at the first reference moment.
  • the method further includes synchronously controlling the camera to collect image data of the target object, and acquiring an exposure moment when the camera collects the image data of the target object.
  • in some possible implementations, the reference moment is a second reference moment, and the second reference moment is the exposure moment.
  • the method further includes: performing fusion processing on the 3D point cloud data collected at each scanning moment and the image data according to the relative extrinsic parameters.
  • in the above technical solution, when the ranging system also includes a camera, the camera is synchronously controlled to collect the image data of the target object, the exposure moment of the camera is taken as the reference moment, the relative pose offset is calculated according to that reference moment, and the relative extrinsic parameters of the ranging system are further obtained.
  • the 3D point cloud data of the target object collected by the lidar and the image data of the target object collected by the camera are fused, realizing 3D reconstruction of the target object and improving the measurement precision and accuracy of the ranging system.
  • in summary, when the lidar measures the distance data of the target object, the inertial navigation device is used to measure the real-time pose parameters of the lidar, the relative pose offset of the lidar while measuring the distance data is further determined, and the relative extrinsic parameters of the ranging system are obtained from the relative pose offset.
  • in addition, when the ranging system also includes a camera, the camera is synchronously controlled to collect the image data of the target object, the exposure moment of the camera is used as the second reference moment, the relative pose offset is calculated according to the second reference moment, and the relative extrinsic parameters when the ranging system includes the camera are further obtained; the 3D point cloud data of the target object collected by the lidar and the image data of the target object collected by the camera are fused, realizing 3D reconstruction of the target object and improving the measurement precision and accuracy of the ranging system.
  • in a second aspect, a device for calculating the relative extrinsic parameters of a ranging system is provided, the device including: a first processing module configured to control the lidar to scan the target object in a preset scanning mode, so as to obtain the 3D point cloud data of the target object at each scanning moment; a measurement module configured to synchronously control the inertial navigation device, at each scanning moment, to measure the real-time pose parameters of the lidar, the real-time pose parameters including displacement acceleration and/or rotational angular velocity; a second processing module configured to determine a reference moment, calculate the time difference between each scanning moment and the reference moment, and calculate the relative pose offset according to the time difference and the real-time pose parameters at each scanning moment; and a third processing module configured to calculate the relative extrinsic parameters of the ranging system at each scanning moment according to the relative pose offset.
  • the device further includes: a fourth processing module configured to correct, according to the relative extrinsic parameters, the 3D point cloud data collected at the other scanning moments among the multiple scanning moments into the 3D point cloud data collected at the first reference moment.
  • the device further includes: an acquisition module configured to synchronously control the camera to acquire the image data of the target object and to obtain the exposure moment at which the camera acquires the image data of the target object.
  • the device further includes: a fifth processing module configured to fuse, according to the relative extrinsic parameters, the 3D point cloud data collected at each scanning moment with the image data.
  • in a third aspect, a device is provided, the device including: a memory for storing a computer program, the computer program including program instructions; and a processor for invoking and executing the program instructions from the memory, so that the device executes the method in the first aspect or any possible implementation of the first aspect.
  • in a fourth aspect, a ranging system is provided, including: an inertial navigation device, a lidar, a camera and a fusion control processing circuit, the fusion control processing circuit being able to control the lidar, the inertial navigation device and the camera to execute the method in the first aspect or any possible implementation of the first aspect.
  • in a fifth aspect, a computer-readable storage medium is provided, the computer-readable storage medium storing a computer program which, when executed, causes a computer to execute the method in the first aspect or any possible implementation of the first aspect.
  • in a sixth aspect, a computer program product is provided; when the computer program product is run on a computer, the computer is caused to execute the method in the first aspect or any possible implementation of the first aspect.
  • Fig. 1 is a schematic diagram of an unmanned driving scene provided by an embodiment of the present application;
  • Fig. 2 is a schematic diagram of the working process of a lidar whose sampling is configured for scanning measurement, provided by an embodiment of the present application;
  • Fig. 3 is a schematic diagram of the working process of camera sampling provided by an embodiment of the present application;
  • Fig. 4 is a schematic diagram of the system structure of a ranging system provided by an embodiment of the present application;
  • Fig. 5 is a schematic flowchart of a method for calculating the relative extrinsic parameters of the ranging system provided by an embodiment of the present application;
  • Fig. 6 is a schematic flowchart of another example of a method for calculating the relative extrinsic parameters of the ranging system provided by an embodiment of the present application;
  • Fig. 7 is a schematic diagram of a device for calculating the relative extrinsic parameters of the ranging system provided by an embodiment of the present application;
  • Fig. 8 is a schematic structural diagram of a device provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, features defined as "first" and "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • Fig. 1 is a schematic diagram of an unmanned driving scenario provided by an embodiment of the present application.
  • while the car 10 is driving on the road, the distance D4 between the car 10 and the car 30 in front and the distance D1 to the car 20 behind must both be kept within the safe driving range;
  • at the same time, the distance D2 to the left lane line and the distance D3 to the right lane line must be taken into account to meet traffic regulations. Therefore, it is often necessary to install a ranging system that measures the surrounding targets for navigation and obstacle avoidance.
  • in a possible implementation, the ranging system includes a lidar 101, and the distance of the target object can be measured through the lidar 101; alternatively, the ranging system may also include a lidar 101 and a camera 102, and the data collected by the lidar 101 and the camera 102 are fused to complete the 3D reconstruction of the surrounding driving environment.
  • in the embodiments of the present application, when the ranging system includes a lidar 101 and a camera 102, the ranging system is used to match the distance measured by the lidar 101 with the images of the surrounding driving environment collected by the camera 102, so as to realize 3D reconstruction of the environment around the car 10.
  • one or more laser radars 101 may be installed on the car 10 , and the embodiment of the present application does not limit the type and specific quantity of the laser radars 101 .
  • one or more cameras 102 may be installed on the car 10, and the cameras 102 may be of different types, and the embodiment of the present application does not limit the type and specific quantity of the cameras 102.
  • Fig. 2 is a schematic diagram of a working process of a lidar sampling configuration for scanning measurement provided by an embodiment of the present application.
  • the sampling mode of the lidar is point-by-point, block-by-block or row/column scanning measurement, so each measurement result obtained by the lidar within one frame of measurement data is the distance data of the target object corresponding to one scanning moment.
  • the following takes point-by-point scanning as an example for a detailed introduction.
  • as shown in Fig. 2, the measurement time in which the lidar completes one frame of data can be divided into multiple scanning moments according to the scanning mode; assume there are 6 scanning moments t1, t2, t3, t4, t5, t6, and that one spot beam is projected at each scanning moment, corresponding to one measurement result.
  • the lidar can obtain the distance data of one target point at each scanning moment, and after completing 6 measurements in sequence, the collection of one frame of data is completed and 6 distance data are obtained.
  • according to the principle of scanning measurement, the data sampling moment of each point within a frame of measurement data is not equal to the moment at which the frame of data acquisition is completed; when the lidar itself moves, its pose changes at each scanning moment, which introduces measurement errors into the frame and affects the accuracy of the measurement result.
  • FIG. 3 is a schematic diagram of a working process of camera sampling provided by an embodiment of the present application.
  • in some embodiments, the ranging system may also include both a lidar and a camera, and the 3D reconstruction of the environment around the car 10 is completed by fusing the distance of the target object collected by the lidar with the image of the target object collected by the camera, so as to identify surrounding targets and complete navigation and obstacle avoidance functions.
  • the processor is used to control the synchronization of the lidar and the camera, that is, to ensure that each frame of data sampled by the lidar is synchronized with each frame of image captured by the camera for subsequent data fusion.
  • since the image captured by the camera is formed at the instant of exposure while the lidar needs multiple scanning measurements to complete the acquisition of one frame of data, and referring to the working processes shown in Fig. 2 and Fig. 3, suppose the processor controls the camera and the lidar to be turned on at the same time to collect one frame of data; because the exposure time of the camera is relatively short, the moment at which the camera collects the image may correspond to the first scanning moment t1 of the lidar.
  • in practical applications, for the camera, the moment at which the target image is acquired is the exposure moment of the camera, denoted tc.
  • the lidar, however, needs to perform multiple scanning measurements in sequence to complete the acquisition of one frame of data, and the moment at which the data acquisition is completed can be considered the last scanning measurement moment.
  • the exposure moment of the camera may coincide with a certain scanning moment of the lidar, or it may not coincide with each scanning moment.
  • in view of this, the present application provides a method, device and storage medium for calculating the relative extrinsic parameters of a ranging system; the method can calculate the relative extrinsic parameters of the ranging system, eliminate errors of the ranging system during the measurement process, and improve the measurement accuracy of the ranging system.
  • Fig. 4 is a schematic diagram of a system structure of a ranging system provided by an embodiment of the present application.
  • the ranging system 100 may include a laser radar 101 , a camera 102 , an inertial navigation device 103 , and a fusion control processing circuit 104 .
  • the laser radar 101 includes a transmitter, a collector and a control processing circuit.
  • the emitter includes a light source, emission optics, and the like; in some embodiments it also includes one or more units such as a mechanical scanning element or an optical diffraction element.
  • the light source can be a single light source or a light source array composed of multiple light sources; the light source array can be configured to emit light in groups with only one light source turned on at a time, or it can be divided into multiple sub-light-source arrays, each sub-light-source array including a row/column of light sources, or take any other form.
  • when the control processing circuit controls the emitter to emit spot beams, only one sub-light-source array, or only one light source in each sub-light-source array, may be turned on at a time, completing the scanning measurement of the target scene in turn.
  • in a typical example, the light source is configured as a VCSEL (Vertical-Cavity Surface-Emitting Laser) array light source, the light source array is configured to include multiple columns of light sources, each column of light sources is turned on in turn according to a preset order to complete the scanning of the target field of view, and at each scanning moment at least one column of light sources projects a spot beam.
  • in another typical example, the light source can be configured as a column light source; the beam emitted by the column light source is projected into the target field of view after passing through a mechanical scanning element, and the deflection angle of the mechanical scanning element is adjusted in sequence to complete the scanning measurement of the target field of view, wherein the mechanical scanning element can be a galvanometer, a MEMS scanning mirror, a rotating mirror, etc.
  • the collector includes a pixel unit composed of at least one pixel and receiving optics; the receiving optics are used to image the spot beam reflected by the target onto the pixel unit and also to filter out background light and stray light, where a pixel can be one of APD, SiPM, SPAD, CCD, CMOS or other photodetectors.
  • the pixel unit is an image sensor specially used for light time-of-flight measurement, and the pixel unit can also be integrated into a photosensitive chip specially used for light time-of-flight measurement.
  • the pixel unit comprises a plurality of SPADs that can respond to an incident single photon and output a photon signal indicating the corresponding arrival time of the received photon at each SPAD.
  • the collector also includes a TDC timing circuit connected to the pixel unit for receiving the photon signal output by the pixel unit and recording the time signal from emission to reception of the photon.
  • the control processing circuit simultaneously controls the transmitter and the collector, and receives the time signal output by the TDC timing circuit for processing to calculate the distance of the target object to be measured.
  • the control processing circuit can also be an independent dedicated circuit, such as an independent circuit of the depth camera itself with computing capability; it can also include a general-purpose processing circuit, for example when the depth camera is integrated into a smart terminal such as a mobile phone, TV or computer, in which case the processor in the terminal can perform the functions of the control and processing circuit.
  • specifically, the principle by which the lidar measures the distance of the target object is direct time of flight (DTOF): the time of flight t of the pulse is obtained by calculating the difference between the emission moment and the reception moment of the pulse, and the distance D of the object is then calculated according to formula (1): D = ct/2, where c is the speed of light in metres per second and t is the duration between the emission moment and the reception moment of the pulse in seconds.
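  • as a minimal arithmetic illustration of formula (1), the sketch below converts a measured round-trip time into a distance; the helper name and the example timestamps are illustrative and not taken from the patent.

```python
# Minimal sketch of the DTOF relation D = c * t / 2 (formula (1)).
# The function name and the example values are illustrative only.

C = 299_792_458.0  # speed of light in m/s

def dtof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Return the target distance in metres from pulse emission/reception times."""
    time_of_flight = t_receive_s - t_emit_s
    return C * time_of_flight / 2.0

# Example: a pulse received 66.7 ns after emission corresponds to roughly 10 m.
print(dtof_distance(0.0, 66.7e-9))  # ~10.0
```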
  • the camera 102 may include an image signal processor (Image Signal Processor, ISP), a complementary metal oxide semiconductor (CMOS) chip, a charge-coupled device (CCD), etc., and is used to collect image information of the field of view to be measured synchronously with the lidar, so as to further obtain image information of the target object.
  • the inertial navigation device 103 is disposed on the lidar 101 and can measure the change in pose of the lidar 101 in real time, for example, measure the displacement acceleration and/or rotational angular velocity of the lidar 101 .
  • in a possible scenario, when the ranging system includes a lidar and an inertial navigation device, the embodiment of the present application provides a method for calculating the relative extrinsic parameters of the ranging system: the real-time pose of the lidar at different moments is determined by the inertial navigation device, the relative pose offset of each moment relative to the reference moment is calculated, the relative extrinsic parameters are calculated from the relative pose offset, and the measurement deviation caused by the different scanning moments within one frame of data is then corrected, eliminating the measurement error caused by the lidar's own motion and improving the accuracy of the lidar in the distance measurement process.
  • Fig. 5 is a schematic flowchart of a method for calculating relative external parameters of a ranging system provided by an embodiment of the present application.
  • the method 500 includes the following steps:
  • 501: control the lidar to scan the target object in a preset scanning mode, so as to obtain the 3D point cloud data of the target object at each scanning moment.
  • for example, while the car is driving, the lidar installed on the car scans the target object in the preset scanning mode to obtain the 3D point cloud data of the target object at each scanning moment; specifically, at each scanning moment the lidar projects a spot beam toward the target object, collects the spot beam reflected back by the target object, calculates the time of flight of the spot beam from emission to reception to obtain the distance data of the target object, and converts the distance data into 3D point cloud data of the target object; after multiple scanning moments, the acquisition of one frame of 3D point cloud data is completed.
  • in a possible implementation, the method proposed in this application is described with reference to the working process shown in Fig. 2: assuming the preset scanning mode is point-by-point scanning and the lidar projects only one spot beam at a time to measure the distance data of the target object, the collection of one frame of data includes 6 scanning moments t1, t2, t3, t4, t5, t6; the spot beams are emitted in sequence to complete the collection of one frame of data, and 6 distance data are obtained, denoted in sequence D1, D2, D3, D4, D5, D6.
  • the 3D point cloud data of the target object is calculated from the distance data of the target object measured at each scanning moment and the intrinsic parameters of the lidar.
  • the intrinsic parameters include the imaging focal length of the lidar system, the lens distortion parameters of the lidar system, the coordinates of the light source, the emission angle of the beam emitted by the light source, and the like.
  • in one embodiment, the 3D point cloud data of the target object can be directly calculated from the emission angle of the beam emitted by the light source and the distance data measured when the beam is projected onto the target object.
  • in another embodiment, the position coordinates [u, v] at which each reflected light spot is incident on the pixel array are determined; the coordinate position at which each spot beam is projected onto the pixel unit can be pre-calibrated and obtained directly from the preset scanning order, and the 3D point cloud data of the target object is then obtained by combining the intrinsic parameters of the lidar system, i.e. D[uL, vL]^T = K[XL, YL, ZL], where K is the intrinsic matrix of the lidar, [XL, YL, ZL] is the 3D point cloud data of the target object, and (uL, vL) is the coordinate position at which the spot beam is incident on the pixel unit.
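  • the relation above can be inverted to recover a 3D point from a measured distance and a pre-calibrated spot position. The sketch below is only an illustration under assumptions: it treats K as a pinhole-style intrinsic matrix, and the matrix values, function name and example numbers are invented for the example.

```python
import numpy as np

def point_from_distance(K: np.ndarray, u: float, v: float, D: float) -> np.ndarray:
    """Solve D * [u, v, 1]^T = K [X, Y, Z]^T for the 3D point [X_L, Y_L, Z_L].

    Under this model the measured distance D ends up as the depth along the
    optical axis (Z_L = D), since K^-1 [u, v, 1]^T has a unit z-component.
    """
    return D * (np.linalg.inv(K) @ np.array([u, v, 1.0]))

# Illustrative intrinsic matrix and a pre-calibrated spot position (u_L, v_L).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
print(point_from_distance(K, 320.0, 240.0, 12.5))  # -> [0. 0. 12.5]
```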
  • 502: at each scanning moment, synchronously control the inertial navigation device to measure the real-time pose parameters of the lidar, where the real-time pose parameters include displacement acceleration and/or rotational angular velocity.
  • specifically, when the ranging system is in motion, the lidar is affected by its own motion and its pose parameters change at each moment; therefore, the inertial navigation device is used to measure the real-time pose parameters of the lidar at each scanning moment, including the displacement acceleration a = {ax, ay, az} and/or the rotational angular velocity ω = {ωx, ωy, ωz}.
  • in a possible implementation, when the lidar completes one frame of data acquisition over 6 scanning moments, the inertial navigation device synchronously measures the real-time pose parameters corresponding to the lidar at each scanning moment.
  • 503: determine the reference moment, calculate the time difference between each scanning moment and the reference moment, and calculate the relative pose offset of the lidar according to the time difference and the real-time pose parameters at each scanning moment.
  • optionally, the reference moment is any one of the multiple scanning moments; it may be any one of the six scanning moments t1, t2, t3, t4, t5, t6, and preferably the first scanning moment t1 is taken as the reference moment for illustration.
  • after the lidar completes the measurement of one frame of data, the scanning imaging moment t1 of the first point is used as the reference moment of the lidar, and the time difference between each of the other scanning moments and the reference moment is calculated; for convenience, the time differences between the scanning moments and the reference moment are denoted Δt1, Δt2, Δt3, Δt4, Δt5, Δt6.
  • the solution in this application corrects the pose deviation of the lidar caused by its own motion; the reference moment is the initial moment, at which the lidar can be considered not to have produced any pose offset, while the pose changes at the subsequent scanning moments. Therefore, the pose offset of each scanning moment relative to the reference moment, i.e. the relative pose offset of the lidar, needs to be calculated.
  • in step 503, the time difference between each scanning moment and the reference moment is calculated, and the relative pose offset is then calculated from the real-time pose parameters of the lidar at each scanning moment measured by the inertial navigation device; the relative pose offset includes the position change ΔT(Δt1, Δt2, Δt3) and the attitude change Δθ(Δθx, Δθy, Δθz).
  • according to the real-time pose parameters, the time difference corresponding to each scanning moment is integrated to calculate the relative pose offset of the lidar at each scanning moment relative to the reference moment. Since the first scanning moment is the reference moment, the second scanning moment t2 is taken as an example here; the calculation of the relative pose offset of the lidar at the second scanning moment falls into the following three scenarios:
  • scenario (1): the pose parameter at the second scanning imaging moment is the displacement acceleration a2 = {ax2, ay2, az2}; based on the time difference Δt2, the displacement acceleration is integrated to determine the relative pose offset of the lidar at the second scanning imaging moment, which here specifically refers to the position change, denoted ΔT2.
  • scenario (2): the pose parameter at the second scanning imaging moment is the rotational angular velocity ω2 = {ωx2, ωy2, ωz2}; based on the time difference Δt2, the angular velocity is integrated to determine the relative pose offset of the lidar at the second scanning moment, which here specifically refers to the attitude change, denoted Δθ2.
  • scenario (3): the pose parameters at the second scanning imaging moment include both the displacement acceleration a2 and the rotational angular velocity ω2; combining the calculations of scenarios (1) and (2) yields both the position change ΔT2 and the attitude change Δθ2. Similarly, the position changes ΔT2 to ΔT6 and/or the attitude changes Δθ2 to Δθ6 of the other scanning moments relative to the reference moment can be calculated.
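  • a rough sketch of scenarios (1) to (3) follows. The text only states that the acceleration and angular velocity are integrated over the time difference; the sketch additionally assumes the measured values are constant over Δt and that the velocity at the reference moment is zero, so the position change reduces to 0.5·a·Δt² and the attitude change to ω·Δt. All names and numbers are illustrative.

```python
import numpy as np

def relative_pose_offset(accel: np.ndarray, omega: np.ndarray, dt: float):
    """Return (delta_T, delta_theta) for one scanning moment.

    accel : displacement acceleration {a_x, a_y, a_z} in m/s^2
    omega : rotational angular velocity {w_x, w_y, w_z} in rad/s
    dt    : time difference to the reference moment in seconds

    Assumes zero velocity at the reference moment and constant accel/omega
    over dt, so position integrates to 0.5*a*dt^2 and attitude to omega*dt.
    """
    delta_T = 0.5 * accel * dt**2        # position change ΔT
    delta_theta = omega * dt             # attitude change Δθ (Euler-angle increments)
    return delta_T, delta_theta

# Second scanning moment t2 as in the example: offsets over Δt2 relative to t1.
a2 = np.array([0.2, 0.0, 0.0])           # illustrative IMU readings
w2 = np.array([0.0, 0.0, 0.05])
dT2, dTheta2 = relative_pose_offset(a2, w2, dt=2e-3)
```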
  • 504: calculate the relative extrinsic parameters of the ranging system at each scanning moment according to the relative pose offset.
  • in general, the extrinsic parameters of the ranging system are expressed as a rotation-translation matrix [R, T]; the rotation matrix has three degrees of freedom and can be expressed by the Euler angles θ(θx, θy, θz), from which the rotation matrices about the different axes (x, y, z) and the rotation relationship between two reference frames can be written, with si = sin θi and ci = cos θi; the standard single-axis forms are sketched below.
  • the relative attitude change can then be calculated from the rotational angular velocity measured by the inertial navigation device as θ(θx ± Δθx, θy ± Δθy, θz ± Δθz), and the relative rotation matrix R' can be calculated from the relative attitude change.
  • for the translation matrix T = [t1, t2, t3], which has three degrees of freedom, the relative translation matrix can likewise be calculated from the position change measured by the inertial navigation device: T' = [t1 ± Δt1, t2 ± Δt2, t3 ± Δt3].
  • finally, the relative extrinsic parameters [R', T'] of the system are obtained. It can be understood that, in the embodiments of the present application, what is calculated is the change in relative extrinsic parameters caused by the motion of the lidar itself, and the reference moment can be understood as the moment at which no motion has yet occurred; therefore, the pose parameters θ(θx, θy, θz) and T(t1, t2, t3) at the reference moment are zero.
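  • the single-axis rotation matrices referred to above are the standard ones about the x, y and z axes; the sketch below builds a relative rotation R' from the attitude change Δθ and a relative translation T' from the position change ΔT, assuming a Z·Y·X composition order (the text does not fix the order) and using the fact that θ and T are zero at the reference moment. Function names are illustrative.

```python
import numpy as np

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def relative_extrinsics(delta_theta: np.ndarray, delta_T: np.ndarray):
    """Relative extrinsics [R', T'] for one scanning moment.

    Since theta and T are zero at the reference moment, the relative rotation
    is built directly from the attitude change (Δθx, Δθy, Δθz) and the
    relative translation equals the position change ΔT.
    """
    R_rel = rot_z(delta_theta[2]) @ rot_y(delta_theta[1]) @ rot_x(delta_theta[0])
    T_rel = np.asarray(delta_T, dtype=float)
    return R_rel, T_rel
```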
  • the measurement deviation caused by the different scanning moments within one frame of data is then corrected. Preferably, assuming the first scanning moment is taken as the reference moment, the 3D point cloud data measured at each of the other scanning moments is transformed, according to the relative extrinsic parameters between that scanning moment and the reference moment, into the pose of the reference scanning moment, so as to reduce the measurement deviation within one frame of data caused by the motion of the lidar itself, i.e. [X', Y', Z']^T = [R', T'][X, Y, Z]^T.
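  • a sketch of this correction step: every point measured at a non-reference scanning moment is rotated and translated into the pose of the reference moment. The sign and direction conventions of R' and T' depend on how the offsets are defined, so this is only an illustrative form; the per-moment R', T' can come from a sketch like the one after step 504, and all names are illustrative.

```python
import numpy as np

def correct_points(points: np.ndarray, R_rel: np.ndarray, T_rel: np.ndarray) -> np.ndarray:
    """Transform Nx3 points measured at one scanning moment into the reference pose:
    [X', Y', Z'] = R' [X, Y, Z] + T' applied row-wise."""
    return points @ R_rel.T + T_rel

def correct_frame(frame: dict) -> np.ndarray:
    """frame: {scan_moment_index: (Nx3 points, R_rel, T_rel)} built from the steps above.
    Returns one motion-compensated frame of 3D point cloud data."""
    corrected = [correct_points(pts, R_rel, T_rel) for pts, R_rel, T_rel in frame.values()]
    return np.vstack(corrected)
```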
  • in the above technical solution, the lidar measures the distance to the target object by scanning measurement, the real-time pose parameters are measured by the inertial navigation device, the relative extrinsic parameters between each scanning moment of the lidar measurement and the reference moment are finally obtained, and the 3D point cloud data measured at the different scanning moments are corrected according to the relative extrinsic parameters, reducing the error caused by the lidar's own motion and improving the measurement accuracy of the lidar.
  • the ranging system 100 includes a laser radar 101, a camera 102, an inertial navigation device 103, and a fusion control processing circuit 104.
  • the embodiment of the present application provides a method for calculating the relative extrinsic parameters of the ranging system, which can correct the fused extrinsic parameters of the lidar 101 and the camera 102 through the pose parameters of the lidar determined by the inertial navigation device 103, thereby improving the accuracy of the ranging system.
  • Fig. 6 is a schematic flowchart of another example of the method for calculating the relative extrinsic parameters of the ranging system provided by the embodiment of the present application.
  • the method 600 includes:
  • 601: control the lidar to scan the target object in a preset scanning mode, so as to obtain the 3D point cloud data of the target object at each scanning moment.
  • 602: at each scanning moment, synchronously control the inertial navigation device to measure the real-time pose parameters of the lidar, where the real-time pose parameters include displacement acceleration and/or rotational angular velocity.
  • for example, while the car is driving, the lidar installed on the car projects a spot beam toward the target object according to the preset scanning mode, collects the spot beam reflected back by the target object, calculates the time of flight of the spot beam from emission to reception to obtain the distance data of the target object, and converts the distance data into the 3D point cloud data of the target object.
  • the preset scanning mode is to project at least one spot light beam toward the target object at each scanning moment, and complete the acquisition of one frame of data after multiple scanning moments.
  • moreover, when the ranging system is in motion, the lidar is affected by its own motion and its pose parameters change at each moment; therefore, the inertial navigation device is used to measure the real-time pose parameters of the lidar at each scanning moment, including the displacement acceleration a = {ax, ay, az} and/or the rotational angular velocity ω = {ωx, ωy, ωz}; for the specific implementation, refer to the detailed descriptions of steps 501 and 502, which are not repeated here.
  • 603: synchronously control the camera to collect the image data of the target object, and obtain the exposure moment at which the camera collects the image data of the target object.
  • it should be understood that the ranging system may also include both a lidar and a camera, and the target object is accurately identified by fusing the distance of the target object collected by the lidar with the image of the target object collected by the camera.
  • generally, the processor needs to control the synchronous operation of the lidar and the camera, i.e. to ensure that each frame of distance data sampled by the lidar is synchronized with each frame of image data captured by the camera for subsequent data fusion; synchronization here means ensuring that the frame of the target object captured by the camera and the frame of the target object output when the lidar scanning is completed are the same frame.
  • the moment when the camera collects the image data of the target object is the exposure moment t c of the camera.
  • lidar also needs to perform multiple scan measurements in sequence to complete the acquisition of a frame of data, and the moment when the data acquisition is completed can be considered as the last scan measurement moment.
  • the exposure moment of the camera may coincide with a certain scanning moment of the lidar, or it may not coincide with each scanning moment.
  • as the system moves, the relative position of the camera and the lidar changes; if the initially calibrated extrinsic parameters are still used for fusion, an offset error will occur, affecting the accuracy of the 3D reconstruction result.
  • 604: determine the reference moment, calculate the time difference between each scanning moment and the reference moment, and calculate the relative pose offset according to the time difference and the real-time pose parameters at each scanning moment.
  • in the process in which the camera and the lidar realize 3D reconstruction of the target object, the exposure moment of the camera is taken as the reference moment, and the time difference between each scanning moment and the reference moment is calculated while the lidar measures the target object in the preset scanning mode.
  • as before, the point-by-point scanning mode with six scanning moments is used for illustration; the time differences between the six scanning moments and the exposure moment of the camera are calculated and denoted Δt1', Δt2', Δt3', Δt4', Δt5', Δt6'.
  • the time difference between each scanning moment and the reference moment is calculated, and then the relative pose offset is calculated according to the real-time pose parameters of the laser radar at each scan moment measured by the inertial navigation device, and the relative pose offset Including position change ⁇ T ( ⁇ t 1 , ⁇ t 2 , ⁇ t 3 ) and attitude change ⁇ ( ⁇ x , ⁇ y , ⁇ z ).
  • the time difference corresponding to each scanning moment is integrated to calculate the relative pose offset of the lidar relative to the camera at each scanning moment. For the specific calculation process, refer to step 503, which will not be repeated here.
  • 605: calculate the relative extrinsic parameters of the ranging system at each scanning moment according to the relative pose offset.
  • when the measurement system includes a camera and a lidar, the extrinsic parameters of the ranging system refer to the joint extrinsic parameters of the lidar and the camera.
  • the joint extrinsic parameters [R, T] between the camera and the lidar need to be calibrated in advance.
  • the method can be any calibration method in the field, which is not specifically limited here.
  • when fusion processing is performed, the 3D point cloud data measured by the lidar is projected into the image data collected by the camera: [Xc, Yc, Zc]^T = [R, T][XL, YL, ZL, 1]^T, after which the point in the camera frame is projected to pixel coordinates using the camera intrinsic parameters, where (uc, vc) are the pixel coordinates of the lidar 3D point cloud data projected into the camera image data and Kc is the intrinsic matrix of the camera.
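  • a sketch of this fusion projection: a lidar point is first transformed into the camera frame with the joint extrinsic parameters [R, T] and then projected to pixel coordinates with the camera intrinsic matrix Kc. The second step assumes the standard pinhole projection, since the exact projection formula appears only as an image in the original; all names are illustrative.

```python
import numpy as np

def project_to_camera(P_L: np.ndarray, R: np.ndarray, T: np.ndarray, K_c: np.ndarray):
    """Project a lidar point [X_L, Y_L, Z_L] into camera pixel coordinates (u_c, v_c).

    [X_c, Y_c, Z_c]^T = [R, T] [X_L, Y_L, Z_L, 1]^T, then pinhole projection with K_c.
    """
    P_c = R @ P_L + T                           # point expressed in the camera frame
    uvw = K_c @ P_c                             # homogeneous pixel coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]     # (u_c, v_c)
```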
  • as the ranging system moves, the pose of the lidar changes at each scanning moment; if the initially calibrated joint extrinsic parameters were still used for projection when data acquisition is completed, measurement errors would inevitably be introduced. Therefore, the change in the extrinsic parameters of the lidar relative to the camera at each scanning moment must be solved, and the extrinsic parameters corrected during fusion projection.
  • through the description of step 504, the relationship between the rotation-translation matrix representing the extrinsic parameters and the position and attitude changes can be determined; from the calibrated joint extrinsic parameters [R, T], θ(θx, θy, θz) and T(t1, t2, t3) at the reference moment can be determined, and combining these with the relative pose offsets ΔT(Δt1, Δt2, Δt3) and Δθ(Δθx, Δθy, Δθz), the relative extrinsic parameters of each scanning moment of the lidar relative to the exposure moment of the camera can be calculated; the 3D point cloud data is then projected into the image data according to the relative extrinsic parameters of each scanning moment.
  • specifically, following the earlier example, the final calculation yields the position changes at the six scanning moments as ΔT1', ΔT2', ΔT3', ΔT4', ΔT5', ΔT6', and/or the attitude changes at the six scanning moments as Δθ1', Δθ2', Δθ3', Δθ4', Δθ5', Δθ6'; from these, the corrected relative extrinsic parameters of the ranging system at each scanning moment are obtained as [R1', T1'], [R2', T2'], [R3', T3'], [R4', T4'], [R5', T5'], [R6', T6'].
  • according to the corrected relative extrinsic parameters and the projection formula, the coordinate data of the 3D point cloud data measured at the six scanning moments projected into the pixel coordinate system of the camera can be further obtained.
  • with the above technical solution, when the ranging system includes a lidar and a camera, the image data of the target object is collected by synchronously controlling the camera, the exposure moment of the camera is used as the reference moment, the relative pose offset is calculated according to the reference moment, and the relative extrinsic parameters of the ranging system are further obtained; the 3D point cloud data of the target object collected by the lidar and the image data of the target object collected by the camera are fused, realizing 3D reconstruction of the target object and improving the measurement precision and accuracy of the ranging system.
  • in summary, when the lidar measures the distance data of the target object, the inertial navigation device is used to measure the real-time pose parameters of the lidar, and the relative pose offset of the lidar while measuring the distance data of the target object is further determined; the relative extrinsic parameters of the ranging system are obtained from the relative pose offset, and the data measured at each scanning moment can be corrected according to the relative extrinsic parameters, thereby improving the measurement accuracy of the system.
  • in addition, when the ranging system also includes a camera, the image data of the target object is collected by synchronously controlling the camera, the exposure moment of the camera is used as the reference moment, the relative pose offset is calculated according to the reference moment, and the relative extrinsic parameters when the ranging system includes the camera are further obtained; the 3D point cloud data of the target object collected by the lidar and the image data of the target object collected by the camera are fused, realizing 3D reconstruction of the target object and improving the measurement precision and accuracy of the ranging system.
  • FIG. 7 is a schematic diagram of an apparatus for calculating relative external parameters of a ranging system provided by an embodiment of the present application.
  • the apparatus 700 includes a first processing module 701, a measurement module 702, a second processing module 703, and a third processing module 704, wherein:
  • the first processing module 701 is used to control the laser radar to scan the target object according to the preset scanning mode, so as to obtain the three-dimensional point cloud data of the target object at each scanning moment;
  • the measurement module 702 is used to synchronously control the inertial navigation device to measure the real-time pose parameters of the laser radar at each scanning moment, and the real-time pose parameters include displacement acceleration and/or rotational angular velocity;
  • the second processing module 703 is used to determine the reference moment and calculate the time difference between each scanning moment and the reference moment, and calculate the relative pose offset according to the time difference and the real-time pose parameters at each scanning moment;
  • the third processing module 704 is used to calculate the relative extrinsics of the ranging system at each scanning moment according to the relative pose offset;
  • optionally, the device further includes: a fourth processing module configured to correct, according to the relative extrinsic parameters, the 3D point cloud data collected at the other scanning moments among the multiple scanning moments into the 3D point cloud data collected at the first reference moment;
  • optionally, the device further includes: an acquisition module configured to synchronously control the camera to acquire the image data of the target object and to obtain the exposure moment at which the camera acquires the image data of the target object;
  • optionally, the device further includes: a fifth processing module configured to fuse, according to the relative extrinsic parameters, the 3D point cloud data collected at each scanning moment with the image data.
  • Fig. 8 is a schematic structural diagram of a device provided by an embodiment of the present application.
  • the apparatus 800 includes a memory 801 and a processor 802 .
  • the memory 801 is used to store a computer program 8011
  • the processor 802 is used to execute the computer program 8011 to realize the process of calculating the relative extrinsic parameters of the ranging system, for example steps 501 to 504 in Fig. 5 or steps 601 to 605 in Fig. 6.
  • in this embodiment, the device can be divided into functional modules according to the above method examples; for example, each function can correspond to one functional module, or two or more functions can be integrated into one processing module, and the integrated module can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is only a logical function division; there may be other division methods in actual implementation.
  • the device may include: a first processing module, a measurement module, a second processing module, a third processing module, and the like. It should be noted that all the relevant content of each step involved in the above method embodiment can be referred to the functional description of the corresponding functional module, which will not be repeated here.
  • the device provided in this embodiment is used to implement the above-mentioned method for calculating the relative extrinsic parameters of the ranging system, so the same effect as the above-mentioned implementation method can be achieved.
  • the device may include a processing module and a storage module.
  • the processing module can be used to control and manage the actions of the electronic device.
  • the storage module may be used to store the program code, data, and the like of the device.
  • the processing module can be a processor or a controller, which can implement or execute the various exemplary logic blocks, modules and circuits described in conjunction with the disclosure of the present application.
  • the processor can also be a combination of computing functions, for example, a combination of one or more microprocessors, a combination of digital signal processing (digital signal processing, DSP) and a microprocessor, etc.
  • the storage module can be a memory.
  • the processing module is a processor.
  • the storage module is a memory
  • the electronic device involved in this embodiment may be a device having the structure shown in FIG. 3 .
  • this embodiment also provides a computer-readable storage medium in which computer instructions are stored; when the computer instructions are run on the device, the device executes the above related method steps, so as to realize the method for calculating the relative extrinsic parameters of the ranging system in the above embodiment.
  • This embodiment also provides a computer program product.
  • when the computer program product is run on a computer, it causes the computer to execute the above related steps, so as to realize the method for calculating the relative extrinsic parameters of the ranging system in the above embodiment.
  • in addition, an embodiment of the present application also provides a device, which may specifically be a chip, a component or a module; the device may include a processor and a memory connected to each other, where the memory is used to store computer-executable instructions, and when the device is running, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the method for calculating the relative extrinsic parameters of the ranging system in the above embodiment.
  • the device, computer storage medium, computer program product or chip provided in this embodiment is used to execute the corresponding method provided above; therefore, for the beneficial effects it can achieve, reference can be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division into modules or units is only a logical function division, and there may be other division methods in actual implementation; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A method for calculating relative extrinsic parameters of a ranging system, the method including: controlling a lidar (101) to scan a target object in a preset scanning mode, so as to obtain 3D point cloud data of the target object at each scanning moment (t1, t2, t3, t4, t5, t6) (501); at each scanning moment (t1, t2, t3, t4, t5, t6), synchronously controlling an inertial navigation device (103) to measure real-time pose parameters of the lidar (101), the real-time pose parameters including displacement acceleration and/or rotational angular velocity (502); determining a reference moment and calculating the time difference between each scanning moment (t1, t2, t3, t4, t5, t6) and the reference moment, and calculating a relative pose offset according to the time difference and the real-time pose parameters at each scanning moment (t1, t2, t3, t4, t5, t6) (503); and calculating the relative extrinsic parameters of the ranging system at each scanning moment (t1, t2, t3, t4, t5, t6) according to the relative pose offset (504). The method for calculating the relative extrinsic parameters of the ranging system can calculate the relative extrinsic parameters of the ranging system from the measurement results of the inertial navigation device (103), improving the measurement accuracy of the ranging system.

Description

一种计算测距系统相对外参的方法、装置和存储介质
本申请要求于2021年12月8日提交中国专利局,申请号为202111495579.X,发明名称为“一种计算测距系统相对外参的方法、装置和存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及多传感器融合领域,并且更具体地,涉及一种计算测距系统相对外参的方法、装置和存储介质。
背景技术
目前的传感器种类多种多样,对于一些参数的测量,可以通过多个传感器之间相互配合、相互协调,将每个传感器的测量数据统一收集管理,得到最终的参数测量结果,实现更加精确的测量。
示例性的,在无人驾驶领域,要实现汽车的安全驾驶,可能通过在汽车上安装的多种传感器实时采集汽车运行过程中的相关参数。例如,汽车上安装的传感器可以包括温度传感器、压力传感器、速度传感器、加速度传感器、激光雷达、相机等。一种特定的场景:在无人驾驶汽车的行驶过程中,需要实时测量汽车与前车、后车、两侧道路及各个障碍物之间的距离,保证该距离在安全距离范围之内,降低安全隐患。
一种可能的实现方式中,在测量目标物体的距离时,可以通过激光雷达测量汽车与前车、后车、两侧道路及各个目标物之间的距离;另外,还可以通过相机等图像采集装置对汽车周围场景的图像进行采集;处理器根据采集的图像以及测量的目标物的距离完成对汽车周围场景的三维重建。
当汽车上通过激光雷达测量目标物体的距离时,一些激光雷达根据采样方式的不同,设置为采用逐行、逐列或逐点扫描的方式进行目标视场的距离测量,在按照上述扫描方式进行测量的过程中,如果激光雷达自身存在运动,例如汽车行驶过程中激光雷达随汽车一起运动而导致每个测量时刻的位姿不相同,完成一帧测量时的结果不准确,从而影响了行驶安全。
当汽车上同时通过激光雷达和相机等传感器同步实现对周围环境的三维重建时,由于相机成像为曝光时刻瞬时成像,而激光雷达设置为采用逐行、逐列或逐点扫描的方式进行目标视场的距离测量时,激光雷达采集每一帧数据和相机采集每一帧图像的时刻不同步,最终会影响三维重建的结果,进一步的若基于目标周围环境的三维重建数据进行导航、避障时易发生危险。
在上述过程中,如何解决激光雷达自身运动引起的运动变化以及每帧数据采集不同步的问题以提升测距精度,成为了当前亟需解决的问题。
发明内容
本申请提供了一种计算测距系统相对外参的方法、装置和存储介质,该方法能够实时测量出激光雷达的位姿参数,并计算出测距系统的相对外参,基于相对外参可以对测量的 数据进行修正,从而提高测距系统的测量精度。
第一方面,提供了一种计算测距系统相对外参的方法,其特征在于,该方法包括:控制激光雷达按照预设扫描方式扫描目标物体,以在每个扫描时刻获取该目标物体的三维点云数据;在该每个扫描时刻下,同步控制惯导装置测量该激光雷达的实时位姿参数,该实时位姿参数包括位移加速度和/或旋转角速度;确定基准时刻并计算该每个扫描时刻与该基准时刻的时间差,根据该时间差和该每个扫描时刻下的该实时位姿参数计算相对位姿偏移量;根据该相对位姿偏移量计算出该每个扫描时刻下该测距系统的相对外参。
上述技术方案,在激光雷达测量目标物体的距离数据时,采用惯导装置测量激光雷达的实时位姿参数,进一步确定激光雷达在测量目标物体的距离数据时的相对位姿偏移量,利用相对位姿偏移量计算得到测距系统的相对外参,消除了测距系统中激光雷达自身运动引起的测量误差,提高了激光雷达的测量精度。
结合第一方面,在某些可能的实现方式中,基准时刻为第一基准时刻,该第一基准时刻为多个扫描时刻中的任意一个扫描时刻。
结合第一方面和上述实现方式,在某些可能的实现方式中,该方法还包括,根据相对外参将多个扫描时刻中的其他扫描时刻采集的三维点云数据修正为第一基准时刻采集的三维点云数据。
结合第一方面和上述实现方式,在某些可能的实现方式中,该方法还包括,同步控制相机采集目标物体的图像数据,并获取该相机采集该目标物体的图像数据时的曝光时刻。
结合第一方面和上述实现方式,在某些可能的实现方式中,基准时刻为第二基准时刻,该第二基准时刻为曝光时刻。
结合第一方面和上述实现方式,在某些可能的实现方式中,该方法还包括,根据相对外参将每个扫描时刻采集的三维点云数据与图像数据做融合处理。
上述技术方案,在测距系统还包括相机时,通过同步控制相机采集目标物体的图像数据,以相机的曝光时刻为基准时刻,根据基准时刻计算相对位姿偏移量,进一步得到测距系统的相对外参,将激光雷达采集的目标物体的三维点云数据和相机采集的目标物体的图像数据进行融合处理,实现了对目标物体的三维重建,提高了测距系统的测量精度和准确性。
综上所述,在激光雷达测量目标物体的距离数据时,采用惯导装置测量激光雷达的实时位姿参数,进一步确定激光雷达在测量目标物体的距离数据时的相对位姿偏移量,利用相对位姿偏移量计算得到测距系统的相对外参。此外,当测距系统还包括相机时,通过同步控制相机采集目标物体的图像数据,以相机的曝光时刻为第二基准时刻,根据第二基准时刻计算相对位姿偏移量,进一步得到在测距系统包含相机时的相对外参,将激光雷达采集的目标物体的三维点云数据和相机采集的目标物体的图像数据进行融合处理,实现了对目标物体的三维重建,提高了测距系统的测量精度和准确性。
第二方面,提供了一种计算测距系统相对外参的装置,该装置包括:第一处理模块,用于控制激光雷达按照预设扫描方式扫描目标物体,以在每个扫描时刻获取该目标物体的三维点云数据;测量模块,用于在该每个扫描时刻下,同步控制惯导装置测量该激光雷达的实时位姿参数,该实时位姿参数包括位移加速度和/或旋转角速度;第二处理模块,用于确定基准时刻并计算该每个扫描时刻与该基准时刻的时间差,根据该时间差和该每个扫描时刻下的该实时位姿参数计算相对位姿偏移量;第三处理模块,用于根据该相对位姿偏 移量计算出该每个扫描时刻下该测距系统的相对外参。
结合第二方面,在某些可能的实现方式中,该装置还包括:第四处理模块,用于根据相对外参将多个扫描时刻中的其他扫描时刻采集的三维点云数据修正为第一基准时刻采集的三维点云数据。
结合第二方面和上述实现方式,在某些可能的实现方式中,该装置还包括:采集模块,用于同步控制相机采集目标物体的图像数据,并获取该相机采集该目标物体的图像数据时的曝光时刻。
结合第二方面和上述实现方式,在某些可能的实现方式中,该装置还包括:第五处理模块,用于根据相对外参将每个扫描时刻采集的三维点云数据与图像数据做融合处理。
第三方面,提供了一种装置,该装置包括:存储器,用于存储计算机程序,该计算机程序包括程序指令;处理器,用于从该存储器中调用并运行该程序指令,使得该装置执行第一方面或第一方面任意一种可能的实现方式中的方法。
第四方面,提供了一种测距系统,该测距系统包括:惯导装置、激光雷达,相机和融合控制处理电路,该融合控制处理电路能够控制该激光雷达、该惯导装置、该相机执行上述第一方面或第一方面任意一种可能的实现方式中的方法。
第五方面,提供了一种计算机可读存储介质,该计算机可读存储介质存储有计算机程序,当该计算机程序被执行时,使得计算机执行第一方面或第一方面任意一种可能的实现方式中的方法。
第六方面,提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得该计算机执行第一方面或第一方面任意一种可能的实现方式中的方法。
附图说明
图1是本申请实施例提供的一种无人驾驶场景示意图;
图2是本申请实施例提供的一种激光雷达采样配置为扫描测量的工作过程的示意图;
图3是本申请实施例提供的一种相机采样的工作过程的示意图;
图4是本申请实施例提供的一例测距系统的系统结构示意图;
图5是本申请实施例提供的一例计算测距系统相对外参的方法的示意性流程图;
图6是本申请实施例提供的另一例计算测距系统相对外参的方法的示意性流程图;
图7是本申请实施例提供的一种计算测距系统相对外参的装置示意图;
图8是本申请实施例提供的一种装置的结构示意图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B:文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者多个该特征,在本申请实施例的描述中,除非另有说明,“多个”的含义 是两个或两个以上。
图1是本申请实施例提供的一种无人驾驶场景示意图。
示例性的,如图1所示,汽车10在道路上行驶的过程中,要兼顾和前车30之间的距离D 4,后车20之间的距离D 1都在安全驾驶的距离范围之内,同时还要兼顾与左车道线的距离D 2和右车道线的距离D 3以满足交通规则要求,因此,常常需要安装测距系统完成对周围目标的测量以进行导航和避障等功能。
一种可能的实现方式中,测距系统包括激光雷达101,可以通过激光雷达101测量目标物体的距离;或者,测距系统还可以包括激光雷达101和相机102,基于激光雷达101和相机102采集的数据进行融合处理完成对周围行驶环境的三维重建。在本申请实施例中,测距系统可以包括激光雷达101和相机102时,该测距系统用于将激光雷达101测量的距离和相机102采集周围行驶环境的图像进行匹配,实现对汽车10的周围的环境的三维重建。
可选地,汽车10上可以安装一个或多个激光雷达101,本申请实施例对激光雷达101的类型和具体数量不作限定。
可选地,汽车10上还可以安装一个或多个相机102,该相机102可以具有不同的类型,本申请实施例对相机102的类型和具体数量不作限定。
图2是本申请实施例提供的一种激光雷达采样配置为扫描测量的工作过程的示意图。激光雷达的采样方式为逐点、逐块、逐行/列扫描测量,因此激光雷达在获取一帧测量数据中的每个测量结果为每一个扫描时刻对应的目标物体的距离数据。下面以逐点扫描方式为例进行详细的介绍,如图2所示,激光雷达完成一帧数据的测量时间内可根据扫描的方式依次分成多个扫描时刻,假设具有6个扫描时刻t 1、t 2、t 3、t 4、t 5、t 6,每一个扫描时刻投射一个斑点光束并对应一个测量结果,激光雷达可以在每一个扫描时刻获取一个目标点的距离数据,依次完成6次测量时即完成一帧数据的采集获取6个距离数据。根据扫描测量的原理可知,在一帧测量数据中每个点的数据采样时刻并不等于一帧数据采集完成时刻,当激光雷达自身存在运动时,每个扫描时刻激光雷达的位姿会发生变化,则导致一帧数据存在测量误差,影响测量结果的准确性。
图3是本申请实施例提供的一种相机采样的工作过程的示意图。
在一些实施例中,测距系统还可以同时包括激光雷达和相机,通过将激光雷达采集的目标物体的距离和相机采集的目标物体的图像进行融合处理,完成对汽车10周围环境的三维重建以进行周围目标的识别完成导航和避障等功能。
在激光雷达和相机的测距系统中,处理器用于控制激光雷达和相机的同步工作,即保证激光雷达采样的每一帧数据与相机拍摄的每一帧图像同步以进行后续的数据融合。但是,由于相机的采集图像为曝光瞬时的成像,而激光雷达需要多次扫描测量完成一帧数据的采集,结合图2和图3所示的工作过程,假设处理器控制相机和激光雷达同时开启用于进行一帧数据的采集,但是相机的曝光时间比较短,则可能相机采集图像的时刻对应激光雷达的第一次扫描时刻t 1。在实际应用中,对于相机来说,采集目标图像的完成时刻即是相机的曝光时刻,记为“t c”。而激光雷达还需要依次进行多次扫描测量完成一帧数据的采集,数据采集完成的时刻可以认为是最后一次扫描测量时刻。相机的曝光时刻可能与激光雷达的某一个扫描时刻中重合,也可能与每一个扫描时刻都不重合。而在将同步帧的图像数据与距离数据进行融合处理时,由于系统的运动将导致相机与激光雷达的相对位置会产生变 化,则若还采用初始标定的外参进行融合时就会产生偏移误差,即会影响三维重建的结果准确度。
因此,不管是对激光雷达自身还是包括激光雷达与相机的测距系统来说,需要准确标定出每个扫描时刻激光雷达的位姿参数以修正测距系统的外参变化,才能有效修正因为位姿变化引起的测量误差。
鉴于此,本申请将提供一种计算测距系统相对外参的方法、装置和存储介质,该方法能够计算得到测距系统的相对外参,消除了测距系统在测量过程中的误差,提高了测距系统的测量精度。
图4是本申请实施例提供的一例测距系统的系统结构示意图。示例性的,该测距系统100可以包括激光雷达101、相机102、惯导装置103、融合控制处理电路104。其中:激光雷达101包括发射器、采集器和控制处理电路。发射器包括光源、发射光学元件等。在一些实施例中,还包括机械扫描/光学衍射元件等一种或多种单元。对于光源,光源可以是单个光源或者是由多个光源组成的光源阵列,其中,光源阵列可以被配置分组发光,一次仅开启一个光源;或者也可以分成多个子光源阵列,每个子光源阵列包括一行/列光源,也可以是其他任意的形式,控制处理电路控制发射器发射斑点光束时可以一次仅开启一个子光源阵列或者仅开启每个子光源阵列中的一个光源,依次完成对目标场景的扫描测量。一种典型的实例,光源配置为VCSEL(Vertical-Cavity Surface-Emitting Laser,垂直腔面发射激光器)阵列光源,光源阵列被配置为包括多个列光源,按照预设的顺序依次开启每个列光源完成对目标视场的扫描,每个扫描时刻至少有一个列光源投射出斑点光束。又一种典型的实例,光源可以被配置为是列光源,列光源发射出的光束经过机械扫描元件后投射到目标视场中,通过控制依次调节机械扫描元件的偏转角度完成对目标视场的扫描测量,其中,机械扫描元件可以是振镜、MEMS扫描镜、旋转镜等。
采集器,包括由至少一个像素组成的像素单元和接收光学元件,接收光学元件用于将目标反射的斑点光束成像到像素单元上,还用于滤除背景光和杂散光,其中,像素可以是APD、SiPM、SPAD、CCD、CMOS等光电探测器中的一种。在一些实施例中,像素单元是一种专门用于光飞行时间测量的图像传感器,像素单元也可以集成到一种专门用于光飞行时间测量的感光芯片中。在一个典型实施例中,像素单元包括由多个SPAD组成,SPAD可以对入射的单个光子进行响应并输出指示所接收光子在每个SPAD处相应到达时间的光子信号。一般地,采集器还包括有与像素单元连接的TDC计时电路,用于接收像素单元输出的光子信号并记录光子从发射到接收的时间信号。
控制处理电路同时控制发射器和采集器,并接收TDC计时电路输出的时间信号进行处理计算出待测目标物的距离。在一些其他实施例中,控制处理电路也可以是独立的专用电路,比如深度相机自身具有计算能力的独立电路;也可以包含通用处理电路,比如当该深度相机被集成到如手机、电视、电脑等智能终端中去,终端中的处理器可以执行控制和处理电路的功能。
具体地,激光雷达测量目标物体的距离的原理是通过直接测量飞行时间(direct time of flight,DTOF),通过计算脉冲发射的时刻与接收时刻间的差值来计算脉冲的飞行时间t,进一步根据如下公式(1)计算物体的距离D。
D=ct/2                          公式(1)
在公式(1)中,c为光速,单位:米/每秒;t为脉冲发射的时刻与接收时刻间的时 长,单位:秒。
相机102可以包括图像信号处理器(Image Signal Processor,ISP)、互补金属氧化物半导体(Complementary Metal Oxide Semiconductor,CMOS)芯片、电荷耦合元件(Complementary Metal-Oxide Semiconductor,CCD)等,用于和激光雷达同步采集待测视场的图像信息,进一步获取目标物体的图像信息。
惯导装置103设置于激光雷达101上,能够实时测量激光雷达101的位姿变化量,例如测量激光雷达101的位移加速度和/或旋转角速度等。
下面,基于图4所示的测距系统100,详细介绍本申请实施例的一种计算测距系统相对外参的方法。
一种可能的场景中,当测距系统中包括激光雷达和惯导装置,本申请实施例提供一种计算测距系统相对外参的方法,通过惯导装置确定激光雷达的不同时刻的实时位姿,并计算出不同时刻相对于基准时刻的相对位姿偏移量,根据相对位姿偏移量计算出相对外参,进一步修正一帧数据中因不同扫描时刻而引起的测量偏差,消除激光雷达由于自身的运动带来的测量误差,提高激光雷达测量距离过程中的准确性。
图5是本申请实施例提供的一例计算测距系统相对外参的方法的示意性流程图。
示例性的,如图5所示,该方法500包括以下步骤:
501,控制激光雷达按照预设扫描方式扫描目标物体,以在每个扫描时刻获取目标物体的三维点云数据。
示例性的,汽车在行驶过程中,汽车上安装的激光雷达按照预设扫描方式扫描目标物体,以在每个扫描时刻获取目标物体的三维点云数据,具体的,在每个扫描时刻朝向目标物体投射斑点光束,并采集被目标物体反射回的斑点光束,计算出斑点光束从发射到接收的飞行时间以计算目标物体的距离数据,并将距离数据转化为目标物体的三维点云数据,经过多个扫描时刻完成一帧三维点云数据的采集。
一种可能的实现方式,本申请的所提的方法参考图2所示的工作过程示意图进行示例性描述,假设预设扫描方式为逐点扫描,激光雷达每次仅投射一个斑点光束用于测量目标物体的距离数据,一帧数据的采集包含6个扫描时刻t 1、t 2、t 3、t 4、t 5、t 6,依次发射斑点光束完成一帧数据采集后,获取到6个距离数据依次表示为D 1、D 2、D 3、D 4、D 5、D 6。根据每个扫描时刻测量出的目标物体的距离数据以及激光雷达的内参参数计算出目标物体的三维点云数据。所述内参参数包括激光雷达系统的成像焦距、激光雷达系统的镜头畸变参数、光源的坐标、光源发射光束的发射角等。在一个实施例中,可根据光源发射光束的发射角以及该光束投影到目标物体上测量出的距离数据直接计算出目标物体的三维点云数据。在另一个实施例中,确定每个反射光斑入射到像素阵列上的位置坐标[u,v],在激光雷达中每个斑点光束投射到像素单元上的坐标位置均可以预先标定,根据预设扫描顺序可以直接得到,再结合激光雷达系统的内参获取目标物体的三维点云数据,即:D[u L,v L] T=K[X L,Y L,Z L];K为激光雷达的内参,[X L,Y L,Z L,]为目标物体的三维点云数据,(u L,v L)表示斑点光束入射到像素单元上的坐标位置。
502,在每个扫描时刻下,同步控制惯导装置测量激光雷达的实时位姿参数,实时位姿参数包括位移加速度和/或旋转角速度。
具体地,当测距系统存在运动时,激光雷达受到自身运动的影响,导致每个时刻下自身的位姿参数均会发生变化,因此,利用惯导装置测量在每个扫描时刻下激光雷达的实时 位姿参数,激光雷达的实时位姿参数包括位移加速度a={a x,a y.a z}和/或旋转角速度ω={ω xyz}。
一种可能的实现方式,激光雷达完成一帧数据采集的扫描时刻为6个扫描时刻时,惯导装置会同步测量出每个扫描时刻下激光雷达对应的实时位姿参数。具体的,位移加速度分别可以表示为:a 1={a x1,a y1.a z1}、a 2={a x2,a y2.a z2}、a 3={a x3,a y3.a z3}、a 4={a x4,a y4.a z4}、a 5={a x5,a y5.a z5}、a 6={a x6,a y6.a z6};旋转角速度分别表示为:ω 1={ω x1y1z1}、ω 2={ω x2y2z2}、ω 3={ω x3y3z3}、ω 4={ω x4y4z4}、ω 5={ω x5y5z5}、ω 6={ω x6y6z6}。
503,确定基准时刻并计算每个扫描时刻与基准时刻的时间差,根据时间差和每个扫描时刻下的实时位姿参数计算激光雷达的相对位姿偏移量。
可选地,基准时刻为多个扫描时刻中的任意一个扫描时刻。基准时刻可以是6个扫描时刻t 1、t 2、t 3、t 4、t 5、t 6中的任意一个时刻,优选的,以基准时刻为第一个扫描时刻t 1为例进行说明。激光雷达在一帧数据测量完成之后,以第一个点的扫描成像时刻t 1为激光雷达的基准时刻,分别计算其他每个扫描时刻与基准时刻的时间差,为了方便,将每个扫描时刻与基准时刻的时间差分别记为Δt 1、Δt 2、Δt 3、Δt 4、Δt 5、Δt 6
本申请中方案是为了修正激光雷达由于自身运动所引起的位姿偏差量,则基准时刻即为初始时刻,此时可认为激光雷达并未产生位姿偏移量,在随后的扫描时刻内位姿发生变化,因此,需要计算出每个扫描时刻相对于基准时刻的位姿偏移量,即激光雷达的相对位姿偏移量。
具体地,在步骤503中,计算得到了每个扫描时刻与基准时刻的时间差,再根据惯导装置测量的激光雷达在每一个扫描时刻的实时位姿参数计算出相对位姿偏移量,相对位姿偏移量包括位置变化量ΔT(Δt 1,Δt 2,Δt 3)和姿态变化量Δθ(Δθ x,Δθ y,Δθ z)。根据实时位姿参数,对每个扫描时刻对应的时间差进行积分,计算激光雷达在每个扫描时刻下相对基准时刻的相对位姿偏移量。由于第一个扫描时刻为基准时刻,此处以第二个扫描时刻t 2为例,计算激光雷达在第二个扫描时刻的相对位姿偏移量,总共分为以下三种场景:
场景(1):第二个扫描成像时刻的位姿参数为位移加速度a 2={a x2,a y2.a z2},基于时间差Δt 2,对位移加速度进行积分,确定激光雷达在第二个扫描成像时刻的相对位姿偏移量,相对位姿偏移量具体指第二个扫描成像时刻的位置变化量,表示为ΔT 2
场景(2):第二个扫描成像时刻的位姿参数为旋转角速度ω 2={ω x2y2z2},基于时间差Δt 2,对旋转角速度进行积分,确定激光雷达在第二个扫描时刻的相对位姿偏移量,相对位姿偏移量具体指第二个扫描成像时刻的姿态变化量,表示为Δθ 2
场景(3):第二个扫描成像时刻的位姿参数包括位移加速度a 2={a x2,a y2.a z2}和旋转角速度ω 2={ω x2y2z2},结合前述场景(1)和场景(2)的计算过程,得到第二个扫描时刻的位置变化量ΔT 2、和姿态变化量Δθ 2。为了简便,此处不再赘述。
同理,根据上述场景(1)-场景(3),可以计算出其他几个扫描时刻相对于基准时刻的位置变化量ΔT 2、ΔT 3、ΔT 4、ΔT 5、ΔT 6和/或姿态变化量Δθ 2、Δθ 3、Δθ 4、Δθ 5、Δθ 6,此处不再赘述。
504,根据相对位姿偏移量计算出每个扫描时刻下测距系统的相对外参。
一般来说,测距系统的外参通常表示为旋转平移矩阵[R,T],旋转矩阵有3个自由度,可以通过欧拉角θ(θ xyz)来表示,沿着不同方向(x、y、z)的旋转矩阵可以表示为:
Figure PCTCN2022080516-appb-000001
两个参考系的旋转关系,可以表示为:
Figure PCTCN2022080516-appb-000002
其中,s i=sinθ i,c i=cosθ i
则根据惯导装置测量出的旋转角速度可以计算出相对姿态变化量:θ(θ x±Δθ xy±Δθ yz±Δθ z),进一步根据相对姿态变化量可以计算出相对旋转矩阵R'。
对于平移矩阵T=[t 1,t 2,t 3],有3个自由度,同样的可以根据惯导装置测量出的位置变化量计算相对平移矩阵:
T'=[t 1±Δt 1,t 2±Δt 2,t 3±Δt 3]
最终获得系统的相对外参[R’,T’]。可以理解的是,在本申请实施例中,是计算激光雷达自身运动引起的相对外参变化,基准时刻可以理解为是尚未发生运动的时刻,因此,基准时刻的位姿参数θ(θ x,θ y,θ z)以及T(t 1,t 2,t 3)则为0值。
进一步的修正一帧数据中因不同扫描时刻而引起的测量偏差。优选地,假设以第一扫描时刻为基准时刻,则根据其他每个扫描时刻与基准时刻的相对外参,将每个扫描时刻测量的三维点云数据进行变换修正到与基准扫描时刻处于同一位姿下,以减少因激光雷达自身的运动产生的一帧数据中的测量偏差,即[X',Y',Z'] T=[R',T'][X,Y,Z] T
在上述技术方案中,激光雷达通过扫描测量的方式测量与目标物体的距离,并根据惯导装置进行实时的位姿参数的测量,最终得到激光雷达测量中每个扫描时刻相对于基准时刻的相对外参,根据相对外参对不同扫描时刻下测量的三维点云数据进行修正以减少激光雷达自身运动所造成的误差,提高了激光雷达的测量精度。
另一种可能的应用场景中,测距系统100包括激光雷达101、相机102、惯导装置103以及融合控制处理电路104,本申请实施例提供一种计算测距系统相对外参的方法,能够通过惯导装置103确定的所述激光雷达的位姿参数,对激光雷达101与相机102的融合外参进行修正,进而提高测距系统的准确性。
FIG. 6 is a schematic flowchart of another example of a method for calculating relative extrinsic parameters of a ranging system provided by an embodiment of the present application.
Exemplarily, as shown in FIG. 6, the method 600 includes:
601: Control the lidar to scan a target object according to a preset scanning mode, so as to acquire three-dimensional point cloud data of the target object at each scanning moment.
602: At each scanning moment, synchronously control the inertial navigation device to measure the real-time pose parameters of the lidar, the real-time pose parameters including displacement acceleration and/or rotational angular velocity.
Exemplarily, while the car is driving, the lidar mounted on the car projects spot beams toward the target object according to the preset scanning mode, collects the spot beams reflected back by the target object, calculates the time of flight of each spot beam from emission to reception to obtain the distance data of the target object, and converts the distance data into three-dimensional point cloud data of the target object. In the preset scanning mode, at least one spot beam is projected toward the target object at each scanning moment, and one frame of data is acquired over multiple scanning moments. Moreover, when the ranging system is in motion, the lidar is affected by its own movement and its pose parameters change at every moment; therefore, the inertial navigation device is used to measure the real-time pose parameters of the lidar at each scanning moment, the real-time pose parameters including the displacement acceleration a = {a_x, a_y, a_z} and/or the rotational angular velocity ω = {ω_x, ω_y, ω_z}. For the specific implementation, reference may be made to the detailed descriptions of steps 501 and 502, which are not repeated here.
603: Synchronously control the camera to collect image data of the target object, and obtain the exposure moment at which the camera collects the image data of the target object.
It should be understood that the ranging system may also include both a lidar and a camera, and accurate recognition of the target object is achieved by fusing the distance of the target object collected by the lidar with the image of the target object collected by the camera. Usually, the processor needs to control the lidar and the camera to work synchronously, that is, to ensure that each frame of distance data sampled by the lidar is synchronized with each frame of image data captured by the camera for subsequent data fusion; synchronization here means ensuring that the frame of images of the target object collected by the camera and the frame of images output when the lidar finishes scanning belong to the same frame. The moment at which the camera collects the image data of the target object is the exposure moment t_c of the camera, whereas the lidar still needs to perform multiple scanning measurements in sequence to complete the acquisition of one frame of data, and the moment at which the acquisition is completed can be regarded as the moment of the last scanning measurement. The exposure moment of the camera may coincide with one of the scanning moments of the lidar, or may not coincide with any of them. As the system moves, the relative position between the camera and the lidar changes; if the initially calibrated extrinsic parameters are still used for fusion, an offset error will be introduced, which affects the accuracy of the three-dimensional reconstruction result.
604: Determine a reference moment, calculate the time difference between each scanning/imaging moment and the reference moment, and calculate the relative pose offset from the time difference and the real-time pose parameters at each scanning moment.
In the process in which the camera and the lidar jointly perform three-dimensional reconstruction of the target object, the exposure moment of the camera is taken as the reference moment, and the time difference between each scanning moment and the reference moment is calculated while the lidar measures the target object according to the preset scanning mode. As described above, point-by-point scanning with six scanning moments is again used for illustration: the time differences between the six scanning moments and the exposure moment of the camera are calculated and denoted Δt_1', Δt_2', Δt_3', Δt_4', Δt_5', Δt_6'.
Specifically, after the time difference between each scanning moment and the reference moment has been calculated, the relative pose offset is calculated from the real-time pose parameters of the lidar measured by the inertial navigation device at each scanning moment; the relative pose offset includes the position change ΔT(Δt_1, Δt_2, Δt_3) and the attitude change Δθ(Δθ_x, Δθ_y, Δθ_z). The real-time pose parameters are integrated over the time difference corresponding to each scanning moment to calculate the relative pose offset of the lidar with respect to the camera at each scanning moment. For the specific calculation process, reference may be made to step 503, which is not repeated here; a short sketch with the camera exposure moment as the reference is given below.
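A minimal sketch of this step, reusing the relative_pose_offset helper sketched earlier; the timestamps, sample values and variable names are illustrative. The only difference from step 503 is that the time differences are taken with respect to the camera exposure moment t_c rather than the first scanning moment.

```python
# Scanning moments and camera exposure moment in seconds (illustrative values)
scan_times = [0.000, 0.002, 0.004, 0.006, 0.008, 0.010]
t_c = 0.005                                # camera exposure moment
accel_samples = [[0.2, 0.0, -0.1]] * 6     # per-moment accelerations (illustrative)
omega_samples = [[0.01, 0.0, 0.05]] * 6    # per-moment angular velocities (illustrative)

offsets = []
for t_i, a_i, w_i in zip(scan_times, accel_samples, omega_samples):
    dt_i = t_i - t_c                       # time difference to the exposure moment
    # Sign handling for moments before t_c is omitted in this sketch.
    dT_i, dTheta_i = relative_pose_offset(a_i, w_i, dt_i)
    offsets.append((dT_i, dTheta_i))
```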
605: Calculate the relative extrinsic parameters of the ranging system at each scanning moment from the relative pose offset.
When the measurement system includes a camera and a lidar, the extrinsic parameters of the ranging system refer to the joint extrinsic parameters of the lidar and the camera. The joint extrinsic parameters [R, T] between the camera and the lidar usually need to be calibrated in advance; any calibration method in the art may be selected, and no specific limitation is imposed here. During the fusion processing, the three-dimensional point cloud data measured by the lidar is projected into the image data collected by the camera, where (u_c, v_c) is the pixel coordinate of the lidar's three-dimensional point cloud data projected into the camera's image data, and K_c is the intrinsic matrix of the camera.
[X_c, Y_c, Z_c]^T = [R, T]·[X_L, Y_L, Z_L, 1]^T
Z_c·[u_c, v_c, 1]^T = K_c·[X_c, Y_c, Z_c]^T
As the ranging system moves, the pose of the lidar changes at every scanning moment; if the initial joint extrinsic parameters are still used for the projection once the data acquisition is completed, a measurement error will inevitably be introduced. Therefore, the change in the extrinsic parameters of the lidar relative to the camera at each scanning moment needs to be solved, and the extrinsic parameters are then corrected during the fusion projection. From the description of step 504, the relationship between the rotation-translation matrix representing the extrinsic parameters and the position change and attitude change can be determined; the θ(θ_x, θ_y, θ_z) and T(t_1, t_2, t_3) of the reference moment can be determined from the calibrated joint extrinsic parameters [R, T], and combining them with the relative pose offsets ΔT(Δt_1, Δt_2, Δt_3) and Δθ(Δθ_x, Δθ_y, Δθ_z) gives the relative extrinsic parameters of the lidar relative to the camera exposure moment at each scanning moment; the three-dimensional point cloud data is then projected into the image data according to the relative extrinsic parameters of each scanning moment. A minimal projection sketch follows.
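A minimal sketch of the projection with per-scan corrected extrinsics, assuming the corrected extrinsics [R_i', T_i'] for the scanning moment have already been computed as above; the helper name and numeric values are illustrative, and lens distortion is ignored.

```python
import numpy as np

def project_to_image(points_lidar, R_corr, T_corr, K_c):
    """Project lidar points into the camera image with corrected extrinsics.

    points_lidar   : (N, 3) array of [X_L, Y_L, Z_L]
    R_corr, T_corr : corrected rotation (3x3) and translation (3,) for this moment
    K_c            : camera intrinsic matrix (3x3); lens distortion is ignored here
    """
    pts_cam = points_lidar @ R_corr.T + T_corr   # [X_c, Y_c, Z_c] in the camera frame
    uvw = pts_cam @ K_c.T                        # K_c @ [X_c, Y_c, Z_c]^T, row-wise
    return uvw[:, :2] / uvw[:, 2:3]              # pixel coordinates (u_c, v_c)

# Example usage with illustrative values
K_c = np.array([[600.0, 0.0, 320.0],
                [0.0, 600.0, 240.0],
                [0.0,   0.0,   1.0]])
pts = np.array([[1.0, 0.5, 10.0]])
uv = project_to_image(pts, np.eye(3), np.array([0.05, 0.0, 0.0]), K_c)
```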
Specifically, according to the foregoing example, the final calculation yields the position changes of the six scanning moments, ΔT_1', ΔT_2', ΔT_3', ΔT_4', ΔT_5', ΔT_6', and/or the attitude changes of the six scanning moments, Δθ_1', Δθ_2', Δθ_3', Δθ_4', Δθ_5', Δθ_6'.
Specifically, from the above position changes ΔT_1', ΔT_2', ΔT_3', ΔT_4', ΔT_5', ΔT_6' and/or attitude changes Δθ_1', Δθ_2', Δθ_3', Δθ_4', Δθ_5', Δθ_6', the corrected relative extrinsic parameters of the ranging system at each scanning moment are obtained by calculation, namely [R_1', T_1'], [R_2', T_2'], [R_3', T_3'], [R_4', T_4'], [R_5', T_5'], [R_6', T_6'].
From the corrected relative extrinsic parameters and the projection formula, the coordinate data of the three-dimensional point cloud data measured at the six scanning moments projected into the pixel coordinate system of the camera can further be obtained.
With the above technical solution, when the ranging system includes a lidar and a camera, the camera is synchronously controlled to collect image data of the target object, the exposure moment of the camera is taken as the reference moment, the relative pose offsets are calculated with respect to the reference moment, and the relative extrinsic parameters of the ranging system are further obtained; the three-dimensional point cloud data of the target object collected by the lidar and the image data of the target object collected by the camera are fused, achieving three-dimensional reconstruction of the target object and improving the measurement precision and accuracy of the ranging system.
In summary, when the lidar measures the distance data of the target object, the inertial navigation device is used to measure the real-time pose parameters of the lidar, the relative pose offset of the lidar during the distance measurement is further determined, and the relative extrinsic parameters of the ranging system are calculated from the relative pose offset; the data measured at each scanning moment can then be corrected according to the relative extrinsic parameters, thereby improving the measurement accuracy of the system. In addition, when the ranging system also includes a camera, the camera is synchronously controlled to collect image data of the target object, the exposure moment of the camera is taken as the reference moment, the relative pose offsets are calculated with respect to the reference moment, and the relative extrinsic parameters of the ranging system including the camera are further obtained; the three-dimensional point cloud data of the target object collected by the lidar and the image data of the target object collected by the camera are fused, achieving three-dimensional reconstruction of the target object and improving the measurement precision and accuracy of the ranging system.
FIG. 7 is a schematic diagram of an apparatus for calculating relative extrinsic parameters of a ranging system provided by an embodiment of the present application.
Exemplarily, as shown in FIG. 7, the apparatus 700 includes a first processing module 701, a measurement module 702, a second processing module 703 and a third processing module 704, wherein:
the first processing module 701 is configured to control the lidar to scan a target object according to a preset scanning mode, so as to acquire three-dimensional point cloud data of the target object at each scanning moment;
the measurement module 702 is configured to, at each scanning moment, synchronously control the inertial navigation device to measure the real-time pose parameters of the lidar, the real-time pose parameters including displacement acceleration and/or rotational angular velocity;
the second processing module 703 is configured to determine a reference moment, calculate the time difference between each scanning moment and the reference moment, and calculate a relative pose offset from the time difference and the real-time pose parameters at each scanning moment; and
the third processing module 704 is configured to calculate the relative extrinsic parameters of the ranging system at each scanning moment from the relative pose offset.
Optionally, the apparatus further includes a fourth processing module configured to correct, according to the relative extrinsic parameters, the three-dimensional point cloud data acquired at the other scanning moments of the multiple scanning moments into the three-dimensional point cloud data acquired at the first reference moment.
Optionally, the apparatus further includes a collection module configured to synchronously control a camera to collect image data of the target object, and obtain the exposure moment at which the camera collects the image data of the target object.
Optionally, the apparatus further includes a fifth processing module configured to fuse, according to the relative extrinsic parameters, the three-dimensional point cloud data acquired at each scanning moment with the image data.
FIG. 8 is a schematic structural diagram of an apparatus provided by an embodiment of the present application.
Exemplarily, as shown in FIG. 8, the apparatus 800 includes a memory 801 and a processor 802.
In one possible implementation, the memory 801 is configured to store a computer program 8011, and the processor 802 is configured to execute the computer program 8011 to implement the process of calculating the relative extrinsic parameters of the ranging system, for example steps 501 to 504 in FIG. 5 and steps 601 to 605 in FIG. 6.
In this embodiment, the apparatus may be divided into functional modules according to the above method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module, and the integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic and is merely a logical functional division; other division manners are possible in actual implementation.
In the case where each functional module is divided according to each function, the apparatus may include a first processing module, a measurement module, a second processing module, a third processing module, and the like. It should be noted that all relevant content of the steps involved in the above method embodiments may be cited in the functional descriptions of the corresponding functional modules, and will not be repeated here.
The apparatus provided in this embodiment is configured to execute the above method for calculating relative extrinsic parameters of a ranging system, and can therefore achieve the same effects as the above implementation method.
In the case where integrated units are used, the apparatus may include a processing module and a storage module. The processing module may be configured to control and manage the actions of the electronic device. The storage module may be configured to store program code, data and the like to support the apparatus in executing them.
The processing module may be a processor or a controller, which can implement or execute the various exemplary logical blocks, modules and circuits described in connection with the disclosure of the present application. The processor may also be a combination that implements computing functions, for example a combination including one or more microprocessors, or a combination of a digital signal processor (digital signal processing, DSP) and a microprocessor; the storage module may be a memory.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device involved in this embodiment may be a device having the structure shown in FIG. 3.
This embodiment further provides a computer-readable storage medium in which computer instructions are stored; when the computer instructions are run on an apparatus, the apparatus is caused to execute the above related method steps to implement the method for calculating relative extrinsic parameters of a ranging system in the above embodiments.
This embodiment further provides a computer program product which, when run on a computer, causes the computer to execute the above related steps so as to implement the method for calculating relative extrinsic parameters of a ranging system in the above embodiments.
In addition, an embodiment of the present application further provides an apparatus, which may specifically be a chip, a component or a module; the apparatus may include a processor and a memory that are connected, wherein the memory is configured to store computer-executable instructions, and when the apparatus is running, the processor may execute the computer-executable instructions stored in the memory so that the chip executes the method for calculating relative extrinsic parameters of a ranging system in the above embodiments.
The apparatus, computer storage medium, computer program product or chip provided in this embodiment are all used to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
Through the description of the above implementations, those skilled in the art can understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely schematic; for example, the division of modules or units is merely a logical functional division, and other division manners are possible in actual implementation; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
The above content is merely the specific implementations of the present application, but the protection scope of the present application is not limited thereto; any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed in the present application, which shall all be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

  1. A method for calculating relative extrinsic parameters of a ranging system, characterized in that the method comprises:
    controlling a lidar to scan a target object according to a preset scanning mode, so as to acquire three-dimensional point cloud data of the target object at each scanning moment;
    at each scanning moment, synchronously controlling an inertial navigation device to measure real-time pose parameters of the lidar, the real-time pose parameters comprising displacement acceleration and/or rotational angular velocity;
    determining a reference moment and calculating a time difference between each scanning moment and the reference moment, and calculating a relative pose offset from the time difference and the real-time pose parameters at each scanning moment; and
    calculating relative extrinsic parameters of the ranging system at each scanning moment from the relative pose offset.
  2. The method according to claim 1, characterized in that the reference moment is a first reference moment, and the first reference moment is any one of multiple scanning moments.
  3. The method according to claim 2, characterized in that the method further comprises: correcting, according to the relative extrinsic parameters, the three-dimensional point cloud data acquired at the other scanning moments of the multiple scanning moments into the three-dimensional point cloud data acquired at the first reference moment.
  4. The method according to claim 1, characterized in that the method further comprises:
    synchronously controlling a camera to collect image data of the target object, and obtaining an exposure moment at which the camera collects the image data of the target object.
  5. The method according to claim 4, characterized in that the reference moment is a second reference moment, and the second reference moment is the exposure moment.
  6. The method according to claim 5, characterized in that the method further comprises: fusing, according to the relative extrinsic parameters, the three-dimensional point cloud data acquired at each scanning moment with the image data.
  7. An apparatus for calculating relative extrinsic parameters of a ranging system, characterized in that the apparatus comprises:
    a first processing module, configured to control a lidar to scan a target object according to a preset scanning mode, so as to acquire three-dimensional point cloud data of the target object at each scanning moment;
    a measurement module, configured to, at each scanning moment, synchronously control an inertial navigation device to measure real-time pose parameters of the lidar, the real-time pose parameters comprising displacement acceleration and/or rotational angular velocity;
    a second processing module, configured to determine a reference moment and calculate a time difference between each scanning moment and the reference moment, and calculate a relative pose offset from the time difference and the real-time pose parameters at each scanning moment; and
    a third processing module, configured to calculate relative extrinsic parameters of the ranging system at each scanning moment from the relative pose offset.
  8. The apparatus according to claim 7, characterized in that the apparatus further comprises:
    a collection module, configured to synchronously control a camera to collect image data of the target object, and obtain an exposure moment at which the camera collects the image data of the target object.
  9. An apparatus, characterized in that the apparatus comprises:
    a memory, configured to store instructions; and
    a processor, configured to call and run the instructions from the memory, so that the apparatus executes the method according to any one of claims 1 to 6.
  10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed, implements the method according to any one of claims 1 to 6.
PCT/CN2022/080516 2021-12-08 2022-03-13 Method, apparatus and storage medium for calculating relative extrinsic parameters of a ranging system WO2023103198A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111495579.X 2021-12-08
CN202111495579.XA CN114296057A (zh) 2021-12-08 Method, apparatus and storage medium for calculating relative extrinsic parameters of a ranging system

Publications (1)

Publication Number Publication Date
WO2023103198A1 true WO2023103198A1 (zh) 2023-06-15

Family

ID=80966425

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/080516 WO2023103198A1 (zh) 2021-12-08 2022-03-13 一种计算测距系统相对外参的方法、装置和存储介质

Country Status (2)

Country Link
CN (1) CN114296057A (zh)
WO (1) WO2023103198A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117784088A (zh) * 2024-01-30 2024-03-29 荣耀终端有限公司 Laser scanning apparatus, system, control method and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115457841A (zh) * 2022-07-26 2022-12-09 南京清湛人工智能研究院有限公司 Experimental teaching aid
CN116781837B (zh) * 2023-08-25 2023-11-14 中南大学 Automated laser three-dimensional scanning system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109613543A (zh) * 2018-12-06 2019-04-12 深圳前海达闼云端智能科技有限公司 Method and apparatus for correcting laser point cloud data, storage medium and electronic device
CN110517209A (zh) * 2018-05-21 2019-11-29 北京京东尚科信息技术有限公司 Data processing method, apparatus and system, and computer-readable storage medium
WO2020104423A1 (en) * 2018-11-20 2020-05-28 Volkswagen Aktiengesellschaft Method and apparatus for data fusion of lidar data and image data
CN112230240A (zh) * 2020-09-30 2021-01-15 深兰人工智能(深圳)有限公司 Spatio-temporal synchronization system and apparatus for lidar and camera data, and readable medium
CN113391300A (zh) * 2021-05-21 2021-09-14 中国矿业大学 IMU-based real-time motion compensation method for lidar three-dimensional point clouds
CN113724303A (zh) * 2021-09-07 2021-11-30 广州文远知行科技有限公司 Point cloud and image matching method and apparatus, electronic device and storage medium


Also Published As

Publication number Publication date
CN114296057A (zh) 2022-04-08

Similar Documents

Publication Publication Date Title
WO2023103198A1 (zh) Method, apparatus and storage medium for calculating relative extrinsic parameters of a ranging system
US10764487B2 (en) Distance image acquisition apparatus and application thereof
CN110596721B (zh) 双重共享tdc电路的飞行时间距离测量系统及测量方法
CN110873883B (zh) 融合激光雷达和imu的定位方法、介质、终端和装置
CN111435162B (zh) 激光雷达与相机同步方法、装置、设备和存储介质
US9046599B2 (en) Object detection apparatus and method
WO2021213432A1 (zh) 数据融合
CN113538591A (zh) 一种距离测量装置与相机融合系统的标定方法及装置
US11977167B2 (en) Efficient algorithm for projecting world points to a rolling shutter image
US11619481B2 (en) Coordinate measuring device
WO2023015880A1 (zh) 训练样本集的获取方法、模型训练方法及相关装置
KR20120105761A (ko) 외부 환경 가시화 장치 및 그 방법
WO2022183658A1 (zh) 光斑位置自适应搜索方法、时间飞行测距系统及测距方法
US10949981B2 (en) Position measuring method, position measuring apparatus, and position measuring system
CN115427832A (zh) 自动驾驶车辆的激光雷达和图像校准
WO2021208582A1 (zh) 标定装置、标定系统、电子设备及标定方法
CN110986816B (zh) 一种深度测量系统及其测量方法
US11914028B2 (en) Object detection device for vehicle
CN109618085B (zh) 电子设备和移动平台
WO2020066068A1 (ja) ステレオカメラシステム、及び測距方法
CN111044039A (zh) 基于imu的单目目标区域自适应高精度测距装置和方法
US11860317B1 (en) Optical adjustment for image fusion LiDAR systems
CN116047481A (zh) 矫正点云数据畸变方法、装置、设备及存储介质
WO2021068723A1 (zh) 传感器标定方法和传感器标定装置
CN111982071B (zh) 一种基于tof相机的3d扫描方法及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22902642

Country of ref document: EP

Kind code of ref document: A1