CN111771140A - Detection device external parameter calibration method, data processing device and detection system - Google Patents


Info

Publication number
CN111771140A
CN111771140A (application CN201980005318.XA)
Authority
CN
China
Prior art keywords
detection device
calculating
visible area
movement
detection
Prior art date
Legal status
Pending
Application number
CN201980005318.XA
Other languages
Chinese (zh)
Inventor
陈涵
邢万里
吴特思
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111771140A

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 — Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 — Systems determining position data of a target
    • G01S 17/42 — Simultaneous measurement of distance and other co-ordinates
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 — Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 — Details of systems according to group G01S 17/00
    • G01S 7/497 — Means for monitoring or calibrating

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A method for calibrating external parameters of a detection device, a data processing device and a detection system are provided. The method is suitable for calibrating the external parameter between a first detection device (60) and a second detection device (70) whose visible areas do not overlap, where the visible area of the first detection device (60) after N movements overlaps with the visible area of the second detection device (70) in the initial state, N ≥ 1. The method comprises: calculating a first offset parameter between the coordinate systems before and after each movement of the first detection device (60) (S101); calculating a second offset parameter between the coordinate system of the first detection device (60) after the N movements and the coordinate system of the second detection device (70) in the initial state (S102); and calculating the external parameter between the first detection device (60) and the second detection device (70) based on the first offset parameter and the second offset parameter (S103). The efficiency of calibrating the external parameters of the detection devices can thereby be improved.

Description

Detection device external parameter calibration method, data processing device and detection system Technical Field
The present invention relates to data processing technologies, and in particular, to a method for calibrating external parameters of a detection device, a data processing device, and a detection system.
Background
Detection devices such as laser radars can transmit detection signals in different directions and acquire depth information, reflectivity information and the like of an object from the echoes returned from those directions. When multiple laser radars are mounted on the same equipment, their external parameters must be calibrated so that the data collected by the different laser radars can be fused into the same coordinate system. In the related art, laser radars whose visible areas do not overlap are calibrated indirectly by additionally arranging a camera, which makes the calibration process complex and inefficient.
Disclosure of Invention
The embodiments of the present invention provide a method for calibrating external parameters of a detection device, a data processing device and a detection system, which perform external parameter calibration on detection devices whose visible areas do not overlap and improve the efficiency of external parameter calibration.
In a first aspect, an embodiment of the present invention provides a method for calibrating an external parameter of a detection device, applied to calibrating an external parameter between a first detection device and a second detection device, where the visible areas of the first detection device and the second detection device do not overlap, and the visible area of the first detection device after moving N times overlaps with the visible area of the second detection device in an initial state, N being greater than or equal to 1. The method includes:
calculating a first offset parameter between the coordinate systems before and after each movement of the first detection device;
calculating a second offset parameter between the coordinate system of the first detection device after N times of movement and the coordinate system of the second detection device in the initial state;
calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter.
In a second aspect, an embodiment of the present invention provides a probe data processing apparatus, including at least a memory and a processor; the memory is connected with the processor through a communication bus and is used for storing computer instructions executable by the processor; the processor is configured to read the computer instructions from the memory to implement:
calculating a first offset parameter between the coordinate systems before and after each movement of the first detection device;
calculating a second offset parameter between the coordinate system of the first detection device after N times of movement and the coordinate system of the second detection device in the initial state;
calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter; the visible areas of the first detection device and the second detection device are not overlapped, the visible area of the first detection device after N times of movement is overlapped with the visible area of the second detection device in the initial state, and N is larger than or equal to 1.
In a third aspect, an embodiment of the present invention provides a detection system, including: a plurality of detection devices and the data processing device of the second aspect, the plurality of detection devices including a first detection device and a second detection device; the plurality of detection devices are mounted on the same carrier, the carrier comprising: and the movable platform drives the detection device to move through the movable platform.
As can be seen from the above technical solutions, in these embodiments, when calibrating the external parameter between a first detection device and a second detection device whose visible areas do not overlap, the first detection device is moved N times so that its visible area overlaps with the visible area of the second detection device in the initial state. A first offset parameter between the coordinate systems before and after each movement of the first detection device is calculated, a second offset parameter between the coordinate system of the first detection device after the N movements and the coordinate system of the second detection device in the initial state is calculated, and the external parameter between the two devices is then calculated from the first and second offset parameters. The method can thus calibrate the external parameters of detection devices whose visible areas do not overlap, and improves the efficiency of external parameter calibration.
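The chaining of offset parameters described above can be sketched with 4×4 homogeneous transforms. The sketch below assumes a particular convention (each first offset parameter maps the first device's post-movement frame into its pre-movement frame, and the second offset parameter maps the second device's initial frame into the first device's frame after the N-th movement); the function names are illustrative and not taken from the patent.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def external_parameter(first_offsets, second_offset):
    """Compose the external parameter between the two detection devices.

    first_offsets: list of 4x4 transforms, one per movement; the k-th maps the
        first device's frame after move k into its frame before move k
        (assumed convention).
    second_offset: 4x4 transform mapping the second device's initial frame into
        the first device's frame after the N-th move.
    Returns the transform mapping the second device's initial frame into the
    first device's initial frame.
    """
    E = np.eye(4)
    for M in first_offsets:   # apply M_1 ... M_N in order
        E = E @ M
    return E @ second_offset
```

Under this convention, a point expressed in the second device's frame is first carried into the first device's post-movement frame and then walked back through each movement to the first device's initial frame.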
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a block diagram of a detection apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a detection apparatus using a coaxial optical path according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a method for calibrating parameters of a detection apparatus according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating a process of calculating a first offset parameter according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a process for calculating a second offset parameter according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of relative positions of a first detecting device and a second detecting device according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of relative positions of a first detecting device and a second detecting device according to an embodiment of the present invention;
FIG. 8 is a block diagram of a data processing apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a detection system according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Detection devices such as laser radars can transmit detection signals in different directions and acquire depth information, reflectivity information and the like of an object from the echoes returned from those directions. When multiple laser radars are mounted on the same equipment, their external parameters must be calibrated so that the data collected by the different laser radars can be fused into the same coordinate system. In the related art, laser radars whose visible areas do not overlap are calibrated indirectly by additionally arranging a camera, which makes the calibration process complex and inefficient.
Therefore, the embodiment of the invention provides a method for calibrating external parameters of detection devices, which is suitable for calibrating the external parameters between the detection devices (such as laser radars) with non-overlapping visual areas.
In one embodiment, the detection device is used for sensing external environment information, such as distance information, orientation information, reflection intensity information, speed information, and the like of an environmental target. In one implementation, the detection device may detect the distance from the detection device to the detected object by measuring the time of flight (TOF) of light propagating between the detection device and the detected object. Alternatively, the detection device may measure this distance by other techniques, such as ranging based on phase-shift measurement or ranging based on frequency-shift measurement, which is not limited herein.
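As a minimal illustration of the TOF principle mentioned above (not the patent's implementation), the one-way distance follows from the round-trip propagation time of the pulse:

```python
# Speed of light in vacuum, m/s.
C = 299_792_458.0

def tof_distance(t_emit_s, t_receive_s):
    """Round-trip time-of-flight ranging: the pulse travels to the target and
    back, so the one-way distance is half of (elapsed time x speed of light)."""
    return C * (t_receive_s - t_emit_s) / 2.0
```

For example, a pulse returning 1 microsecond after emission corresponds to a target roughly 150 m away.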
For convenience of understanding, fig. 1 is a block diagram of a detection device according to an embodiment of the present invention, and a work flow of ranging will be described in detail below with reference to the detection device 100 shown in fig. 1.
Referring to fig. 1, the detection apparatus 100 may include a transmission circuit 110, a reception circuit 120, a sampling circuit 130, and an operation circuit 140.
The transmit circuitry 110 may transmit a sequence of light pulses (e.g., a sequence of laser pulses). The receiving circuit 120 may receive an optical pulse sequence (which may also be referred to as an echo signal) reflected by the detected object, perform photoelectric conversion on the optical pulse sequence to obtain an electrical signal, process the electrical signal, and output the electrical signal to the sampling circuit 130. The sampling circuit 130 may sample the electrical signal to obtain a sampling result. The arithmetic circuit 140 may determine the distance between the detection apparatus 100 and the detected object based on the sampling result of the sampling circuit 130.
Optionally, the detection apparatus 100 may further include a control circuit 150, and the control circuit 150 may implement control of other circuits, for example, may control an operating time of each circuit and/or perform parameter setting on each circuit, and the like.
It should be understood that, although the detection device shown in fig. 1 includes one transmitting circuit, one receiving circuit, one sampling circuit and one arithmetic circuit for emitting a light beam for detection, the embodiments of the present application are not limited thereto. The number of any of the transmitting circuit, the receiving circuit, the sampling circuit and the arithmetic circuit may be at least two, in which case at least two light beams are emitted in the same direction or in different directions, and the at least two beams may be emitted simultaneously or at different times. In one example, the light-emitting chips in the at least two transmitting circuits are packaged in the same module. For example, each transmitting circuit comprises a laser emitting chip, and the laser emitting chips of the at least two transmitting circuits may be packaged together and accommodated in the same packaging space.
In some embodiments, in addition to the circuit shown in fig. 1, the detecting device 100 may further include a scanning module 160 for changing the propagation direction of at least one laser pulse sequence emitted from the emitting circuit.
Here, a module including the transmitting circuit 110, the receiving circuit 120, the sampling circuit 130 and the operation circuit 140, or a module further including the control circuit 150, may be referred to as a ranging module, and the ranging module may be independent of other modules such as the scanning module 160.
The detection device may adopt a coaxial optical path, that is, the light beam emitted from the detection device and the reflected light beam share at least part of the optical path inside the detection device. For example, at least one laser pulse sequence emitted by the transmitting circuit has its propagation direction changed by the scanning module before being emitted, and the laser pulse sequence reflected by the detected object passes back through the scanning module before reaching the receiving circuit. Alternatively, the detection device may adopt an off-axis optical path, that is, the emitted light beam and the reflected light beam travel along different optical paths inside the detection device. FIG. 2 shows a schematic diagram of an embodiment of the detection apparatus of the present invention using coaxial optical paths.
The detection apparatus 200 comprises a ranging module 210, the ranging module 210 comprising an emitter 203 (which may comprise the above-described transmitting circuitry), a collimating element 204, a detector 205 (which may comprise the above-described receiving circuitry, sampling circuitry and arithmetic circuitry), and an optical path changing element 206. The ranging module 210 is configured to emit a light beam, receive the return light, and convert the return light into an electrical signal. The emitter 203 may be configured to emit a sequence of light pulses; in one embodiment, the emitter 203 may emit a sequence of laser pulses. Optionally, the laser beam emitted by the emitter 203 is a narrow-bandwidth beam with a wavelength outside the visible range. The collimating element 204 is disposed on the emission light path of the emitter and is configured to collimate the light beam emitted from the emitter 203 into parallel light directed at the scanning module. The collimating element also converges at least a portion of the return light reflected by the detected object. The collimating element 204 may be a collimating lens or another element capable of collimating a light beam.
In the embodiment shown in fig. 2, the transmit and receive optical paths within the detection apparatus are combined by the optical path changing element 206 before the collimating element 204, so that the transmit and receive optical paths can share the same collimating element, making the optical path more compact. In other implementations, the emitter 203 and the detector 205 may use respective collimating elements, and the optical path changing element 206 may be disposed in the optical path after the collimating elements.
In the embodiment shown in fig. 2, since the aperture of the light beam emitted from the emitter 203 is small while the aperture of the return light received by the detection device is large, the optical path changing element can use a small-area mirror to combine the emission and reception optical paths. In other implementations, the optical path changing element may be a mirror with a through hole, where the through hole transmits the outgoing light from the emitter 203 and the mirror reflects the return light to the detector 205. This reduces the blocking of the return light that the mount of a small mirror would otherwise cause.
In the embodiment shown in fig. 2, the optical path altering element is offset from the optical axis of the collimating element 204. In other implementations, the optical path altering element may also be located on the optical axis of the collimating element 204.
The probing apparatus 200 also includes a scanning module 202. The scanning module 202 is disposed on the emitting light path of the distance measuring module 210, and the scanning module 202 is configured to change the transmission direction of the collimated light beam 219 emitted by the collimating element 204, project the collimated light beam to the external environment, and project the return light beam to the collimating element 204. The return light is converged by the collimating element 204 onto the detector 205.
In one embodiment, the scanning module 202 may include at least one optical element for altering the propagation path of the light beam, where the optical element may alter the propagation path by reflecting, refracting or diffracting the beam. For example, the scanning module 202 includes a lens, mirror, prism, galvanometer, grating, liquid crystal, optical phased array, or any combination thereof. In one example, at least a portion of the optical elements is moved, for example by a driving module, so that the moving optical element reflects, refracts or diffracts the light beam to different directions at different times. In some embodiments, multiple optical elements of the scanning module 202 may rotate or oscillate about a common axis 209, each rotating or oscillating optical element continually changing the propagation direction of the incident beam. In one embodiment, the multiple optical elements of the scanning module 202 may rotate at different rotational speeds or oscillate at different speeds. In another embodiment, at least some of the optical elements of the scanning module 202 may rotate at substantially the same rotational speed. In some embodiments, the multiple optical elements of the scanning module may also rotate about different axes, and may rotate in the same direction or in different directions, without limitation.
In one embodiment, the scanning module 202 includes a first optical element 214 and a driver 216 coupled to the first optical element 214; the driver 216 drives the first optical element 214 to rotate about the rotation axis 209, so that the first optical element 214 redirects the collimated light beam 219 and projects it into different directions. In one embodiment, the angle between the rotation axis 209 and the direction of the collimated beam 219 after it is altered by the first optical element changes as the first optical element 214 rotates. In one embodiment, the first optical element 214 includes a pair of opposing non-parallel surfaces through which the collimated light beam 219 passes. In one embodiment, the first optical element 214 includes a prism whose thickness varies along at least one radial direction. In one embodiment, the first optical element 214 comprises a wedge prism that refracts the collimated beam 219.
In one embodiment, the scanning module 202 further comprises a second optical element 215 that rotates about the rotation axis 209 at a rotation speed different from that of the first optical element 214. The second optical element 215 changes the direction of the light beam projected by the first optical element 214. In one embodiment, the second optical element 215 is coupled to another driver 217, which drives it to rotate. The first optical element 214 and the second optical element 215 may be driven by the same or different drivers, so that they rotate at different speeds and/or in different directions, projecting the collimated light beam 219 into different directions in the surrounding space and scanning a larger spatial range. In one embodiment, a controller 218 controls the drivers 216 and 217 to drive the first optical element 214 and the second optical element 215, respectively. The rotation speeds of the first optical element 214 and the second optical element 215 can be determined according to the region and pattern expected to be scanned in the actual application. The drivers 216 and 217 may include motors or other drivers.
In one embodiment, second optical element 215 includes a pair of opposing non-parallel surfaces through which the light beam passes. In one embodiment, second optical element 215 includes a prism having a thickness that varies along at least one radial direction. In one embodiment, second optical element 215 comprises a wedge angle prism.
In one embodiment, the scan module 202 further comprises a third optical element (not shown) and a driver for driving the third optical element to move. Optionally, the third optical element comprises a pair of opposed non-parallel surfaces through which the light beam passes. In one embodiment, the third optical element comprises a prism having a thickness that varies along at least one radial direction. In one embodiment, the third optical element comprises a wedge angle prism. At least two of the first, second and third optical elements rotate at different rotational speeds and/or rotational directions.
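A pair of rotating wedge prisms of the kind described above behaves, to first order, like a Risley-prism scanner. The following is a rough paraxial sketch of the resulting beam deflection; the small-angle model and the function name are assumptions for illustration, not taken from the patent.

```python
import math

def risley_deflection(delta1, delta2, theta1, theta2):
    """First-order (paraxial) model of a two-wedge Risley scanner: each wedge
    deflects the beam by a fixed small angle (delta1, delta2, in radians) in
    the direction set by its current rotation angle (theta1, theta2); the net
    deflection is approximately the vector sum of the two contributions."""
    dx = delta1 * math.cos(theta1) + delta2 * math.cos(theta2)
    dy = delta1 * math.sin(theta1) + delta2 * math.sin(theta2)
    return dx, dy
```

Spinning the two prisms at different speeds, as the embodiment describes, sweeps this deflection vector through a dense rosette-like pattern covering a cone of directions.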
Rotation of the optical elements in the scanning module 202 may project light in different directions, such as directions 211 and 213, thus scanning the space around the detection device 200. When the light 211 projected by the scanning module 202 hits the detection object 201, a part of the light is reflected by the detection object 201 to the detection device 200 in a direction opposite to the projected light 211. The return light 212 reflected by the object 201 passes through the scanning module 202 and then enters the collimating element 204.
The detector 205 is placed on the same side of the collimating element 204 as the emitter 203, and the detector 205 is used to convert at least part of the return light passing through the collimating element 204 into an electrical signal.
In one embodiment, each optical element is coated with an antireflection coating. Optionally, the thickness of the antireflection film is equal to or close to the wavelength of the light beam emitted by the emitter 203, which can increase the intensity of the transmitted light beam.
In one embodiment, a filter layer is coated on a surface of a component in the light beam propagation path of the detection device, or a filter is disposed on the light beam propagation path for transmitting at least a wavelength band in which the light beam emitted from the emitter is located and reflecting other wavelength bands, so as to reduce noise of the ambient light to the receiver.
In some embodiments, the emitter 203 may include a laser diode that emits nanosecond-scale laser pulses. Further, the laser pulse reception time may be determined, for example, by detecting the rising-edge time and/or falling-edge time of the electrical signal pulse. In this manner, the detection apparatus 200 can calculate the TOF from the pulse reception time and the pulse emission time, thereby determining the distance between the detected object 201 and the detection apparatus 200.
The distance and orientation detected by the detection device 200 may be used for remote sensing, obstacle avoidance, mapping, modeling, navigation, and the like. In one embodiment, the detection device of the embodiment of the present invention may be applied to a movable platform, and the detection device may be mounted on a platform body of the movable platform. The movable platform with the detection device can measure the external environment, for example, the distance between the movable platform and an obstacle is measured for the purpose of avoiding the obstacle, and the external environment is mapped in two dimensions or three dimensions. In certain embodiments, the movable platform comprises at least one of an unmanned aerial vehicle, an automobile, a remote control car, a robot, a camera. When the detection device is applied to the unmanned aerial vehicle, the platform body is a fuselage of the unmanned aerial vehicle. When the detection device is applied to an automobile, the platform body is the automobile body of the automobile. The vehicle may be an autonomous vehicle or a semi-autonomous vehicle, without limitation. When the detection device is applied to the remote control car, the platform body is the car body of the remote control car. When the detection device is applied to a robot, the platform body is the robot. When the detection device is applied to a camera, the platform body is the camera itself.
Fig. 3 is a schematic flow chart of a method for calibrating external parameters of a detection device according to an embodiment of the present invention. The method is suitable for calibrating the external parameter between two detection devices, referred to as the first detection device and the second detection device, whose visible areas do not overlap, where the visible area of the first detection device after N movements overlaps with the visible area of the second detection device in the initial state, N being greater than or equal to 1. As shown in fig. 3, the method includes the following steps S101 to S103:
s101, calculating a first offset parameter between coordinate systems before and after each movement of the first detection device.
The first offset parameter is the parameter of the conversion between the coordinate system of the first detection device before a movement and its coordinate system after that movement, and represents the relative positional relationship of the first detection device before and after the movement.
Taking the first detection device and the second detection device as laser radars as an example, a laser radar is a perception sensor that can obtain three-dimensional information of a scene. It actively emits a laser pulse signal towards the detected object and receives the laser pulse signal reflected back; the depth information of the detected object is calculated from the time difference between emission and reception and the propagation speed of the laser pulse signal. The angle information of the detected object relative to the laser radar is obtained from the emission direction of the laser radar. Combining the depth information and the angle information yields a large number of detection points; the data set of these detection points is called a point cloud, and the spatial three-dimensional information of the detected object relative to the laser radar can be reconstructed from the point cloud.
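The combination of depth and angle information described above amounts to a spherical-to-Cartesian conversion. A minimal sketch, assuming an x-forward/z-up axis convention that the patent does not specify:

```python
import math

def point_from_measurement(depth, azimuth, elevation):
    """Convert one lidar measurement (range plus emission direction) into a
    3-D point in the sensor frame. Angles are in radians; the axis convention
    (x forward, z up) is an assumption, not taken from the patent."""
    x = depth * math.cos(elevation) * math.cos(azimuth)
    y = depth * math.cos(elevation) * math.sin(azimuth)
    z = depth * math.sin(elevation)
    return x, y, z
```

Applying this to every echo collected during a scan produces the point cloud that the calibration steps below operate on.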
In an embodiment of the present invention, in the step S101, a method for calculating a first offset parameter between coordinate systems before and after each movement of the first detecting device includes:
respectively acquiring point clouds of visible areas of the first detection device before and after each movement, and calculating the first offset parameter according to the acquired point clouds of the visible areas before and after each movement; wherein the first detection device has an overlapping area in the visible area before and after each movement.
In this embodiment, the first detection device is set to satisfy that there is an overlapping region between the visible region before the movement and the visible region after the movement when the first detection device moves each time, and then the first offset parameter may be calculated according to a relationship satisfied by the point clouds of the visible regions before and after the movement of the first detection device.
Fig. 4 is a schematic flowchart of calculating a first offset parameter according to an embodiment of the present invention. Referring to fig. 4, in this embodiment, the calculating a first offset parameter according to the acquired point cloud of the visible area before and after each movement includes the following steps S201 to S202:
S201, establishing a first coordinate relation function which is required to be met by point clouds in the overlapped area of the visible area before the first detection device moves and the visible area after the first detection device moves each time.
In this embodiment, the point cloud contains three-dimensional coordinate data. Because the visible area before each movement of the first detection device overlaps the visible area after that movement, the point cloud of the visible area before the movement and the point cloud of the visible area after the movement contain at least one pair of points corresponding to the same point in the actual three-dimensional space. Let this pair be p_i and q_i; that is, the point p_i in the pre-movement point cloud and the point q_i in the post-movement point cloud correspond to the same point in the actual three-dimensional space. On this basis a first coordinate relation function is established, as shown in the following formula (1):

p_i = R·q_i + t    (1)
the coefficients of the first coordinate relation function are a parameter R and a parameter t, respectively, where the parameter R is a first rotation matrix and the parameter t is a first translation matrix.
S202, establishing a first objective function according to the coefficient of the first coordinate relation function, minimizing the first objective function through the acquired point clouds of the visible areas before and after each movement, and calculating to obtain a first rotation matrix and a first translation matrix.
The first objective function established according to the coefficients R and t of the first coordinate relation function is expressed by the following formula (2):

(R, t) = argmin_{R, t} Σ_{i=1}^{n} ‖p_i − (R·q_i + t)‖²    (2)
Minimizing the first objective function yields the parameter R and the parameter t, i.e., the first rotation matrix and the first translation matrix, where n is the number of corresponding point pairs shared by the point clouds of the visible areas before and after the movement. Thus, the first offset parameter between the coordinate systems before and after each movement of the first detection device obtained in this embodiment includes the first rotation matrix and the first translation matrix.
Alternatively, the method for minimizing the first objective function may be implemented by a nonlinear optimization method or a Singular Value Decomposition (SVD) method.
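The SVD route mentioned above has a well-known closed form (the Kabsch/Umeyama rigid alignment). A minimal Python sketch, assuming corresponding points are stacked as (n, 3) NumPy arrays and that the helper name rigid_align is ours rather than the patent's, is:

```python
import numpy as np

def rigid_align(p, q):
    """Closed-form SVD solution of min_{R,t} sum_i ||p_i - (R q_i + t)||^2.

    p, q: (n, 3) arrays of corresponding points; R and t map q onto p.
    Returns the rotation matrix R (3x3) and translation vector t (3,).
    """
    p_c, q_c = p.mean(axis=0), q.mean(axis=0)   # centroids of each cloud
    H = (q - q_c).T @ (p - p_c)                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = p_c - R @ q_c
    return R, t
```

The same routine applies unchanged to the second objective function below, with X_i and Y_i in place of p_i and q_i. In practice the correspondences themselves are unknown and are typically refined iteratively (as in ICP), which this sketch does not cover.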
In an alternative embodiment, after the first rotation matrix and the first translation matrix are obtained by calculation, a first transfer matrix is computed from them. The resulting first transfer matrix α1 is given by the following formula (3):

α1 = [ R  t ]
     [ 0  1 ]    (3)

that is, a 4×4 homogeneous transformation matrix whose upper-left 3×3 block is the first rotation matrix R, whose upper-right column is the first translation matrix t, and whose bottom row is [0 0 0 1].
Furthermore, the first offset parameter obtained by the method provided in this embodiment may further include the first transfer matrix.
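The assembly of a transfer matrix from a rotation matrix and a translation vector, as in formula (3), can be sketched as follows (the helper name transfer_matrix is an assumption for this sketch):

```python
import numpy as np

def transfer_matrix(R, t):
    """Assemble the 4x4 homogeneous transfer matrix [R t; 0 1]."""
    T = np.eye(4)
    T[:3, :3] = R                    # upper-left 3x3 rotation block
    T[:3, 3] = np.asarray(t).ravel() # upper-right translation column
    return T
```

The same construction yields the second transfer matrix β0 from R_0 and t_0.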
S102, calculating a second offset parameter between the coordinate system of the first detection device after N times of movement and the coordinate system of the second detection device in the initial state.
In this embodiment, the manner of calculating the second offset parameter includes:
and acquiring the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and calculating the second offset parameter according to the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state.
Fig. 5 is a schematic flowchart of calculating a second offset parameter according to an embodiment of the present invention. Referring to fig. 5, in the embodiment of the present invention, the calculating the second offset parameter according to the point cloud of the visible area after the first detecting device moves N times and the point cloud of the visible area of the second detecting device in the initial state includes the following steps S301 to S302:
S301, establishing a second coordinate relation function which is required to be met by the point cloud of the overlapping area of the visible area of the first detection device after N times of movement and the visible area of the second detection device in the initial state.
In this embodiment, the visible area of the first detection device after the N movements and the visible area of the second detection device in the initial state have an overlapping region, so the point cloud of the visible area of the first detection device after the N movements and the point cloud of the visible area of the second detection device in the initial state contain at least one pair of points corresponding to the same point in the actual three-dimensional space. Let this pair be X_i and Y_i; that is, the point X_i in the point cloud of the first detection device's visible area after the N movements and the point Y_i in the point cloud of the second detection device's visible area in the initial state correspond to the same point in the actual three-dimensional space. On this basis a second coordinate relation function is established, as shown in the following formula (4):

X_i = R_0·Y_i + t_0    (4)

where the coefficients of the second coordinate relation function are a parameter R_0 and a parameter t_0; the parameter R_0 is the second rotation matrix and the parameter t_0 is the second translation matrix.
S302, a second objective function is established according to the coefficient of the second coordinate relation function, the second objective function is minimized through the acquired point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and a second rotation matrix and a second translation matrix are obtained through calculation.
The second objective function established according to the coefficients R_0 and t_0 of the second coordinate relation function is expressed by the following formula (5):

(R_0, t_0) = argmin_{R_0, t_0} Σ_{i=1}^{n} ‖X_i − (R_0·Y_i + t_0)‖²    (5)
Minimizing the second objective function yields the parameter R_0 and the parameter t_0, i.e., the second rotation matrix and the second translation matrix, where n is the number of corresponding point pairs shared by the point cloud of the visible area of the first detection device after the N movements and the point cloud of the visible area of the second detection device in the initial state. The second offset parameter obtained in this embodiment therefore includes the second rotation matrix and the second translation matrix.
Optionally, the method for minimizing the second objective function may also be implemented by using a nonlinear optimization method or a Singular Value Decomposition (SVD) method.
In an alternative embodiment, after the second rotation matrix and the second translation matrix are obtained by calculation, a second transfer matrix β0 is computed from them. The resulting second transfer matrix β0 is given by the following formula (6):

β0 = [ R_0  t_0 ]
     [  0    1  ]    (6)
S103, calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter.
In this embodiment, the calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter includes:
and multiplying together the first offset parameters calculated for each movement of the first detection device, and then multiplying the product by the second offset parameter, to obtain the external parameter between the first detection device and the second detection device.
The external parameters between the first detection device and the second detection device represent the relative position relationship of the first detection device and the second detection device in space, and are used for subsequent processing operations such as point cloud fusion.
Taking the first offset parameter as a first transfer matrix and the second offset parameter as a second transfer matrix as an example: when the visible area of the first detection device after N movements overlaps the visible area of the second detection device in the initial state, a first transfer matrix is calculated for each movement of the first detection device. Suppose the visible area of the first detection device after the third movement overlaps the visible area of the second detection device in the initial state. Let A0 be the visible area of the first detection device in the initial state, A1 its visible area after the first movement, A2 its visible area after the second movement, A3 its visible area after the third movement, and B0 the visible area of the second detection device in the initial state. The first transfer matrix α1 is calculated from the point clouds of A0 and A1, the first transfer matrix α2 from the point clouds of A1 and A2, and the first transfer matrix α3 from the point clouds of A2 and A3; the second transfer matrix β0 is calculated from the point cloud of A3 and the point cloud of B0 of the second detection device in the initial state. The transfer matrix T between the first detection device and the second detection device is then obtained from the calculated first and second transfer matrices: T = α1·α2·α3·β0.
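The chain multiplication described above can be sketched as a left-to-right product of 4×4 homogeneous matrices (the helper name compose_extrinsics is an assumption for this sketch):

```python
import numpy as np
from functools import reduce

def compose_extrinsics(alphas, beta0):
    """Chain the per-movement transfer matrices with the cross-device one:
    T = alpha_1 @ alpha_2 @ ... @ alpha_N @ beta_0 (all 4x4 homogeneous)."""
    return reduce(np.matmul, list(alphas) + [beta0])
```

For the three-movement example, compose_extrinsics([a1, a2, a3], b0) returns a1 @ a2 @ a3 @ b0.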
Optionally, in this embodiment of the present application, the first detecting device and the second detecting device start to rotate around the same central point simultaneously when moving; the rotation may be implemented by a movable platform (e.g., an automobile) driving the first detection device and the second detection device to turn or rotate on site, and the specific rotation manner is not limited herein.
Optionally, in this embodiment of the present application, the first detecting device and the second detecting device translate along the same straight line when moving; the translation can be realized by a movable platform (such as an automobile) driving the first detection device and the second detection device to translate along the same straight line.
It should be noted that, the present invention is applicable to any moving manner that satisfies that the visible region of the first detection device after N times of movement has an overlapping region with the visible region of the second detection device in the initial state, and the moving manner is not particularly limited in the present invention.
Optionally, when the external parameter between the first detection device and the second detection device is calculated using the overlap between the visible area of the first detection device after N rotations and the visible area of the second detection device in the initial state, and the installation positions of the two devices are not on the same straight line, the first detection device rotates in the direction approaching the second detection device; this reduces the number of rotations required. For example, in the scenario shown in fig. 6, the first detection device 60 and the second detection device 70 are installed on the same straight line, so whichever direction the first detection device 60 rotates in, it sweeps the same total angle before its visible area overlaps the visible area of the second detection device 70 in the initial state. By contrast, in the scenario shown in fig. 7, when the first detection device 60 rotates in the direction approaching the second detection device 70 (counterclockwise), the total rotation angle is smaller than when it rotates in the direction away from the second detection device (clockwise).
Alternatively, before rotating, the number of rotations (each through the same angle) may be determined in advance from the included angle between the first detection device 60 and the second detection device 70, the angle of view of the first detection device 60, and the angle of view of the second detection device 70, subject to the condition that the visible areas before and after each rotation of the first detection device 60 overlap; the first detection device 60 and the second detection device 70 are then rotated according to this number of rotations.
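One possible way to pre-compute that number of rotations is sketched below. The overlap model used here (each step must stay within the first device's field of view so successive views overlap, and the total rotation must cover the angular gap between the two view centres) is an illustrative assumption, not a formula given in the patent:

```python
import math

def rotations_needed(angle_between_deg, fov1_deg, fov2_deg):
    """Estimate the number of equal-angle rotations needed before the first
    device's view overlaps the second's, while each consecutive pair of
    views still overlaps.

    Hypothetical model: the angular gap between the views is the included
    angle minus the half-widths of both fields of view; each rotation step
    is kept below the first device's field of view with a 10% margin.
    """
    gap = max(0.0, angle_between_deg - (fov1_deg + fov2_deg) / 2.0)
    if gap == 0.0:
        return 0  # the views already overlap; no rotation is needed
    max_step = fov1_deg * 0.9  # preserve some overlap between steps
    return math.ceil(gap / max_step)
```

For instance, under this model two devices 180° apart, each with a 70° field of view, would need two rotations, while devices 60° apart would need none.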
Fig. 8 is a schematic diagram of a data processing apparatus according to an embodiment of the present invention. Referring to fig. 8, the data processing apparatus 1000 includes at least a memory 1002 and a processor 1001; the memory 1002 is connected to the processor 1001 through a communication bus 1003 and is configured to store computer instructions executable by the processor 1001; the processor 1001 is configured to read computer instructions from the memory 1002 to implement a method for calibrating an external parameter of a detection apparatus, adapted to calibrate an external parameter between a first detection apparatus and a second detection apparatus, where the visible areas of the first detection apparatus and the second detection apparatus do not overlap, the visible area of the first detection apparatus after N times of movement overlaps the visible area of the second detection apparatus in an initial state, and N is greater than or equal to 1; the method includes:
calculating a first offset parameter between the coordinate systems before and after each movement of the first detection device;
calculating a second offset parameter between the coordinate system of the first detection device after N times of movement and the coordinate system of the second detection device in the initial state;
calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
respectively acquiring point clouds of visible areas of the first detection device before and after each movement, and calculating the first offset parameter according to the acquired point clouds of the visible areas before and after each movement; wherein the first detection device has an overlapping area in the visible area before and after each movement.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
establishing a first coordinate relation function to be met by point clouds in a visible area before the first detection device moves and an overlapped area of the visible area after the first detection device moves each time;
and establishing a first objective function according to the coefficient of the first coordinate relation function, minimizing the first objective function through the acquired point clouds of the visible areas before and after each movement, and calculating to obtain a first rotation matrix and a first translation matrix.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
and calculating to obtain a first transfer matrix according to the first rotation matrix and the first translation matrix.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
and acquiring the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and calculating the second offset parameter according to the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
establishing a second coordinate relation function which is required to be met by the point cloud of the overlapping area of the visible area of the first detection device after the first detection device moves for N times and the visible area of the second detection device in the initial state;
and establishing a second objective function according to the coefficient of the second coordinate relation function, minimizing the second objective function through the acquired point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and calculating to obtain a second rotation matrix and a second translation matrix.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
and calculating to obtain a second transfer matrix according to the second rotation matrix and the second translation matrix.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
and multiplying together the first offset parameters calculated for each movement of the first detection device, and then multiplying the product by the second offset parameter, to obtain the external parameter between the first detection device and the second detection device.
Optionally, the first detecting device and the second detecting device start rotating around the same central point at the same time.
Optionally, the first detecting device is rotated to approach the second detecting device.
Optionally, the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
and calculating the external parameters between the first detection device and other detection devices according to the external parameters between the first detection device and the second detection device and the external parameters between the second detection device and the other detection devices.
Optionally, the first detection device and the second detection device each include at least one of: laser radar, millimeter wave radar, ultrasonic radar.
Optionally, the data processing device is an upper computer.
Optionally, the data processing device is disposed in the detection device.
An embodiment of the present invention further provides a detection system, shown in fig. 9, which includes a plurality of detection devices and the data processing device 1000, the plurality of detection devices including a first detection device 60 and a second detection device 70 (only two detection devices are shown in the figure). The plurality of detection devices are mounted on a same carrier (not shown in the figures); the carrier includes a movable platform, and the movable platform drives the detection devices to move.
Optionally, the movable platform includes: any one of a vehicle, an aircraft, and a turntable. The rotary table can be arranged on a vehicle or an aircraft, and the detection device is driven to rotate by the rotation of the rotary table to calibrate the external parameters.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements is not limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The method and apparatus provided by the embodiments of the present invention are described in detail above, and the principle and the embodiments of the present invention are explained in detail herein by using specific examples, and the description of the embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (25)

  1. A method for calibrating external parameters of a detection device, characterized by being suitable for calibrating the external parameters between a first detection device and a second detection device, wherein the visible areas of the first detection device and the second detection device do not overlap, the visible area of the first detection device after N times of movement overlaps the visible area of the second detection device in an initial state, and N is greater than or equal to 1, the method comprising the following steps:
    calculating a first offset parameter between the coordinate systems before and after each movement of the first detection device;
    calculating a second offset parameter between the coordinate system of the first detection device after N times of movement and the coordinate system of the second detection device in the initial state;
    calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter.
  2. The method of claim 1, wherein calculating a first offset parameter between the coordinate systems before and after each movement of the first probe device comprises:
    respectively acquiring point clouds of visible areas of the first detection device before and after each movement, and calculating the first offset parameter according to the acquired point clouds of the visible areas before and after each movement; wherein the first detection device has an overlapping area in the visible area before and after each movement.
  3. The method of claim 2, wherein said calculating the first offset parameter from the acquired point cloud of the viewable area before and after the each movement comprises:
    establishing a first coordinate relation function to be met by point clouds in a visible area before the first detection device moves and an overlapped area of the visible area after the first detection device moves each time;
    and establishing a first objective function according to the coefficient of the first coordinate relation function, minimizing the first objective function through the acquired point clouds of the visible areas before and after each movement, and calculating to obtain a first rotation matrix and a first translation matrix.
  4. The method of claim 3, wherein after the calculating obtains the first rotation matrix and the first translation matrix, further comprising:
    and calculating to obtain a first transfer matrix according to the first rotation matrix and the first translation matrix.
  5. The method of claim 1, wherein the calculating a second offset parameter between the coordinate system of the first detecting device after N movements and the coordinate system of the second detecting device in the initial state comprises:
    and acquiring the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and calculating the second offset parameter according to the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state.
  6. The method of claim 5, wherein the calculating the second offset parameter according to the point cloud of the visible area after the first detecting device moves for N times and the point cloud of the visible area of the second detecting device in the initial state comprises:
    establishing a second coordinate relation function which is required to be met by the point cloud of the overlapping area of the visible area of the first detection device after the first detection device moves for N times and the visible area of the second detection device in the initial state;
    and establishing a second objective function according to the coefficient of the second coordinate relation function, minimizing the second objective function through the acquired point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and calculating to obtain a second rotation matrix and a second translation matrix.
  7. The method of claim 5, wherein after the calculating obtains the second rotation matrix and the second translation matrix, further comprising:
    and calculating to obtain a second transfer matrix according to the second rotation matrix and the second translation matrix.
  8. The method of claim 1, wherein said calculating an external parameter between the first probe device and the second probe device as a function of the first offset parameter and the second offset parameter comprises:
    and multiplying together the first offset parameters calculated for each movement of the first detection device, and then multiplying the product by the second offset parameter, to obtain the external parameter between the first detection device and the second detection device.
  9. The method of claim 1, wherein the first and second detection devices simultaneously initiate rotation about the same center point.
  10. The method of claim 1, wherein the first sensing device is rotated in a direction to approach the second sensing device.
  11. The method of claim 1, further comprising:
    and calculating the external parameters between the first detection device and other detection devices according to the external parameters between the first detection device and the second detection device and the external parameters between the second detection device and other detection devices.
  12. A data processing apparatus comprising at least a memory and a processor; the memory is connected with the processor through a communication bus and is used for storing computer instructions executable by the processor; the processor is used for reading the computer instructions from the memory to realize a detection device external parameter calibration method, and is suitable for calibrating the external parameters between a first detection device and a second detection device, the visible areas of the first detection device and the second detection device are not overlapped, the visible area of the first detection device after N times of movement is overlapped with the visible area of the second detection device in an initial state, and N is greater than or equal to 1, and the method comprises the following steps:
    calculating a first offset parameter between the coordinate systems before and after each movement of the first detection device;
    calculating a second offset parameter between the coordinate system of the first detection device after N times of movement and the coordinate system of the second detection device in the initial state;
    calculating an external parameter between the first detection device and the second detection device according to the first offset parameter and the second offset parameter.
  13. The apparatus of claim 12, wherein the processor is further configured to read computer instructions from the memory to implement:
    respectively acquiring point clouds of visible areas of the first detection device before and after each movement, and calculating the first offset parameter according to the acquired point clouds of the visible areas before and after each movement; wherein the first detection device has an overlapping area in the visible area before and after each movement.
  14. The apparatus of claim 13, wherein the processor is further configured to read computer instructions from the memory to implement:
    establishing a first coordinate relation function to be met by point clouds in a visible area before the first detection device moves and an overlapped area of the visible area after the first detection device moves each time;
    and establishing a first objective function according to the coefficient of the first coordinate relation function, minimizing the first objective function, and calculating to obtain a first rotation matrix and a first translation matrix.
  15. The apparatus of claim 14, wherein the processor is further configured to read computer instructions from the memory to implement:
    and calculating to obtain a first transfer matrix according to the first rotation matrix and the first translation matrix.
  16. The apparatus of claim 12, wherein the processor is further configured to read computer instructions from the memory to implement:
    and acquiring the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state, and calculating the second offset parameter according to the point cloud of the visible area of the first detection device after N times of movement and the point cloud of the visible area of the second detection device in the initial state.
  17. The apparatus of claim 16, wherein the processor is further configured to read computer instructions from the memory to implement:
    establishing a second coordinate relation function to be satisfied by point clouds in the overlapping area between the visible area of the first detection device after N movements and the visible area of the second detection device in the initial state;
    establishing a second objective function according to the coefficients of the second coordinate relation function, minimizing the second objective function, and calculating a second rotation matrix and a second translation matrix.
  18. The apparatus of claim 16, wherein the processor is further configured to read computer instructions from the memory to implement:
    calculating a second transfer matrix according to the second rotation matrix and the second translation matrix.
  19. The apparatus of claim 12, wherein the processor is further configured to read computer instructions from the memory to implement:
    multiplying together the first offset parameters calculated for each movement of the first detection device, and then multiplying the product by the second offset parameter, to obtain the external parameter between the first detection device and the second detection device.
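The multiplication in claim 19 can be illustrated with 4x4 homogeneous transfer matrices: compose the per-movement first offsets, then multiply by the second offset. The helper names and the left-to-right multiplication order are illustrative assumptions, not specified by the claims:

```python
import numpy as np
from functools import reduce

def compose(transforms):
    """Multiply a sequence of 4x4 homogeneous transfer matrices in order."""
    return reduce(lambda a, b: a @ b, transforms, np.eye(4))

def rot_z(theta, shift_x=0.0):
    """Helper: rotation about z by theta plus an optional shift along x."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[0, 3] = shift_x
    return T

# First offsets from N = 3 movements, then the second offset.
first_offsets = [rot_z(np.pi / 6), rot_z(np.pi / 6), rot_z(np.pi / 6)]
second_offset = rot_z(0.0, shift_x=0.1)
external = compose(first_offsets) @ second_offset
# The three 30-degree steps compose to a single 90-degree rotation.
assert np.allclose(external[:3, :3], rot_z(np.pi / 2)[:3, :3])
```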
  20. The apparatus of claim 12, wherein the first detection device and the second detection device simultaneously start rotating about the same central point.
  21. The apparatus of claim 12, wherein the first detection device is rotated in a direction approaching the second detection device.
  22. The apparatus of claim 12, wherein the processor is further configured to read computer instructions from the memory to implement:
    calculating the external parameters between the first detection device and other detectors according to the external parameters between the first detection device and the second detection device and the external parameters between the second detection device and the other detectors.
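Claim 22's propagation of external parameters amounts to chaining calibrated transforms, e.g. T(first→other) = T(first→second) · T(second→other), so the first device need not be calibrated against every detector directly. A toy numeric check — the values and multiplication convention are assumptions:

```python
import numpy as np

# Extrinsics as 4x4 homogeneous transfer matrices (illustrative values).
# T_12: first -> second detection device; T_23: second -> another detector.
T_12 = np.eye(4)
T_12[:3, 3] = [0.3, 0.0, 0.0]   # pure translation along x
T_23 = np.eye(4)
T_23[:3, 3] = [0.0, 0.2, 0.0]   # pure translation along y

# Chain the calibrated extrinsics to reach the other detector directly.
T_13 = T_12 @ T_23
assert np.allclose(T_13[:3, 3], [0.3, 0.2, 0.0])
```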
  23. The apparatus of claim 12, wherein the first detection device and the second detection device each comprise at least one of: a laser radar, a millimeter-wave radar, and an ultrasonic radar.
  24. A detection system, comprising: a plurality of detection devices and the data processing device according to any one of claims 12 to 23, the plurality of detection devices comprising a first detection device and a second detection device; wherein the plurality of detection devices are mounted on the same carrier, the carrier comprises a movable platform, and the movable platform drives the detection devices to move.
  25. The system of claim 24, wherein the movable platform comprises any one of a vehicle, an aircraft, and a turntable.
CN201980005318.XA 2019-01-30 2019-01-30 Detection device external parameter calibration method, data processing device and detection system Pending CN111771140A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/073990 WO2020154980A1 (en) 2019-01-30 2019-01-30 Method for calibrating external parameters of detection device, data processing device and detection system

Publications (1)

Publication Number Publication Date
CN111771140A (en) 2020-10-13

Family

ID=71840661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980005318.XA Pending CN111771140A (en) 2019-01-30 2019-01-30 Detection device external parameter calibration method, data processing device and detection system

Country Status (2)

Country Link
CN (1) CN111771140A (en)
WO (1) WO2020154980A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7643135B1 (en) * 2008-12-05 2010-01-05 Leica Geosystems Ag Telescope based calibration of a three dimensional optical scanner
CN101922974A (en) * 2010-08-31 2010-12-22 中国科学院西安光学精密机械研究所 Automatic calibration device and method for laser parameter performance test
CN107229043A (en) * 2017-05-22 2017-10-03 中国农业科学院农业资源与农业区划研究所 A kind of range sensor external parameters calibration method and system
CN107796370A (en) * 2016-08-30 2018-03-13 北京四维图新科技股份有限公司 For obtaining the method, apparatus and mobile mapping system of conversion parameter
CN108020825A (en) * 2016-11-03 2018-05-11 岭纬公司 Laser radar, Laser video camera head, the fusion calibration system of video camera and method
CN108700665A (en) * 2017-06-01 2018-10-23 深圳市大疆创新科技有限公司 A kind of detection method, device and detecting devices based on laser radar

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109118542B (en) * 2017-06-22 2021-11-23 阿波罗智能技术(北京)有限公司 Calibration method, device, equipment and storage medium between laser radar and camera
CN109215083B (en) * 2017-07-06 2021-08-31 华为技术有限公司 Method and device for calibrating external parameters of vehicle-mounted sensor
CN108226906B (en) * 2017-11-29 2019-11-26 深圳市易成自动驾驶技术有限公司 A kind of scaling method, device and computer readable storage medium
CN109001711B (en) * 2018-06-05 2020-06-26 北京智行者科技有限公司 Multi-line laser radar calibration method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462350A (en) * 2020-12-10 2021-03-09 苏州一径科技有限公司 Radar calibration method and device, electronic equipment and storage medium
CN112462350B (en) * 2020-12-10 2023-04-04 苏州一径科技有限公司 Radar calibration method and device, electronic equipment and storage medium
WO2022257138A1 (en) * 2021-06-11 2022-12-15 深圳市大疆创新科技有限公司 Calibration method and apparatus, and laser radar, detection system and storage medium
CN114646932A (en) * 2022-05-23 2022-06-21 深圳元戎启行科技有限公司 Radar external parameter calibration method and device based on external radar and computer equipment

Also Published As

Publication number Publication date
WO2020154980A1 (en) 2020-08-06

Similar Documents

Publication Publication Date Title
US12013464B2 (en) Environment sensing system and movable platform
CN111902730B (en) Calibration plate, depth parameter calibration method, detection device and calibration system
CN210038146U (en) Distance measurement module, distance measurement device and movable platform
CN111771140A (en) Detection device external parameter calibration method, data processing device and detection system
CN111771136A (en) Abnormity detection method, alarm method, distance measuring device and movable platform
CN111587381A (en) Method for adjusting motion speed of scanning element, distance measuring device and mobile platform
CN210199305U (en) Scanning module, range unit and movable platform
CN209979845U (en) Distance measuring device and mobile platform
CN111699442B (en) Time measurement correction method and device
US20210333401A1 (en) Distance measuring device, point cloud data application method, sensing system, and movable platform
CN111902732A (en) Initial state calibration method and device for detection device
WO2020237663A1 (en) Multi-channel lidar point cloud interpolation method and ranging apparatus
CN111712734A (en) Laser ranging device and mobile platform
US20210333399A1 (en) Detection method, detection device, and lidar
US20220082665A1 (en) Ranging apparatus and method for controlling scanning field of view thereof
US20210341588A1 (en) Ranging device and mobile platform
WO2022256976A1 (en) Method and system for constructing dense point cloud truth value data and electronic device
CN111670568A (en) Data synchronization method, distributed radar system and movable platform
WO2020147121A1 (en) Rainfall measurement method, detection device, readable storage medium
CN112654893A (en) Motor rotating speed control method and device of scanning module and distance measuring device
CN111670375A (en) Distance measuring device and mobile platform
US20210333369A1 (en) Ranging system and mobile platform
WO2022226984A1 (en) Method for controlling scanning field of view, ranging apparatus and movable platform
CN111587383A (en) Reflectivity correction method applied to distance measuring device and distance measuring device
CN111630412A (en) Detection system and movable platform with same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201013