WO2020154980A1 - Method for calibrating external parameters of a detection device, data processing device and detection system


Info

Publication number
WO2020154980A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection device
visible area
point cloud
offset parameter
movement
Application number
PCT/CN2019/073990
Other languages
English (en)
Chinese (zh)
Inventor
陈涵
邢万里
吴特思
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201980005318.XA (published as CN111771140A)
Priority to PCT/CN2019/073990 (published as WO2020154980A1)
Publication of WO2020154980A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/42 - Simultaneous measurement of distance and other co-ordinates
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to group G01S17/00
    • G01S7/497 - Means for monitoring or calibrating

Definitions

  • The invention relates to data processing technology, and in particular to a method for calibrating external parameters of a detection device, a data processing device, and a detection system.
  • Detection devices such as lidar can emit detection signals in different directions, and obtain depth information and reflectivity information of objects based on echoes in different directions.
  • The external parameter calibration method currently adopted involves indirect calibration through an additionally installed camera; this makes the calibration process complicated and inefficient.
  • Embodiments of the present invention provide a method, a data processing device, and a detection system for calibrating the external parameters of detection devices whose visible areas do not overlap, so as to improve the efficiency of external parameter calibration.
  • In a first aspect, an embodiment of the present invention provides a method for calibrating the external parameters between a first detection device and a second detection device whose visible areas do not overlap each other. After N moves, where N is greater than or equal to 1, the visible area of the first detection device overlaps the visible area of the second detection device in the initial state.
  • In a second aspect, an embodiment of the present invention provides a detection data processing device including at least a memory and a processor. The memory is connected to the processor through a communication bus and stores computer instructions executable by the processor; the processor reads the computer instructions from the memory to implement the method.
  • In a third aspect, an embodiment of the present invention provides a detection system including a plurality of detection devices and the data processing device of the second aspect. The plurality of detection devices include a first detection device and a second detection device, installed on the same carrier; the carrier includes a movable platform through which the detection devices are driven to move.
  • When calibrating the external parameters between a first detection device and a second detection device whose visible areas do not overlap, the first detection device is moved N times until its visible area overlaps the visible area of the second detection device in the initial state. A first offset parameter is calculated between the coordinate systems before and after each movement of the first detection device, and a second offset parameter is calculated between the coordinate system of the first detection device after the N moves and the coordinate system of the second detection device in the initial state. The external parameters between the first detection device and the second detection device are then calculated from the first offset parameter and the second offset parameter.
  • Figure 1 is a block diagram of a detection device provided by an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of a detection device using a coaxial optical path provided by an embodiment of the present invention
  • FIG. 3 is a schematic flowchart of a method for calibrating parameters of a detection device according to an embodiment of the present invention
  • FIG. 4 is a schematic flowchart of calculating a first offset parameter according to an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of calculating a second offset parameter according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of the relative positions of the first detection device and the second detection device provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of the relative positions of the first detection device and the second detection device according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a data processing device provided by an embodiment of the present invention.
  • Fig. 9 is a schematic structural diagram of a detection system provided by an embodiment of the present invention.
  • the embodiment of the present invention provides a method for calibrating the external parameters of the detection device, which is suitable for calibrating the external parameters between the detection devices (such as lidar) with non-overlapping visible areas.
  • the aforementioned detection device is used to sense external environmental information, for example, distance information, orientation information, reflection intensity information, speed information, etc. of environmental targets.
  • The detection device can detect the distance between the detection device and the detected object by measuring the time of light propagation between them, that is, the time of flight (TOF).
  • The detection device can also use other techniques to detect the distance to the detected object, such as a ranging method based on phase-shift measurement or a ranging method based on frequency-shift measurement, and no restriction is made here.
  • FIG. 1 is a block diagram of a detection device provided by an embodiment of the present invention. The working process of ranging will be described below with reference to the detection device 100 shown in FIG. 1 as an example.
  • the detection device 100 may include a transmitting circuit 110, a receiving circuit 120, a sampling circuit 130, and an arithmetic circuit 140.
  • the transmitting circuit 110 may emit a light pulse sequence (for example, a laser pulse sequence).
  • The receiving circuit 120 can receive the light pulse sequence reflected by the detected object (also called the echo signal) and perform photoelectric conversion on it to obtain an electrical signal, which, after processing, can be output to the sampling circuit 130.
  • the sampling circuit 130 may sample the electrical signal to obtain the sampling result.
  • the arithmetic circuit 140 may determine the distance between the detection device 100 and the detected object based on the sampling result of the sampling circuit 130.
  • the detection device 100 may further include a control circuit 150, which can control other circuits, for example, can control the working time of each circuit and/or set parameters for each circuit.
  • The detection device shown in FIG. 1 includes one transmitting circuit, one receiving circuit, one sampling circuit, and one arithmetic circuit for emitting one detection beam. However, the number of any one of the transmitting circuit, the receiving circuit, the sampling circuit, and the arithmetic circuit may also be at least two, for emitting at least two light beams in the same direction or in different directions; the at least two light beams may be emitted simultaneously or at different times.
  • the light-emitting chips in the at least two transmitting circuits are packaged in the same module.
  • each emitting circuit includes a laser emitting chip, and the laser emitting chips in the at least two emitting circuits can be packaged together and housed in the same packaging space.
  • the detection device 100 may further include a scanning module 160 for changing the propagation direction of at least one laser pulse sequence emitted by the transmitting circuit.
  • The module including the transmitting circuit 110, the receiving circuit 120, the sampling circuit 130, and the arithmetic circuit 140, or additionally including the control circuit 150, may be referred to as the ranging module.
  • The ranging module can be independent of other modules, for example, the scanning module 160.
  • the detection device can adopt a coaxial optical path, that is, the light beam emitted by the detection device and the reflected light beam share at least part of the optical path in the detection device.
  • the detection device may also adopt an off-axis optical path, that is, the light beam emitted by the detection device and the reflected light beam are respectively transmitted along different optical paths in the detection device.
  • Fig. 2 shows a schematic diagram of an embodiment in which the detection device of the present invention adopts a coaxial optical path.
  • the detection device 200 includes a ranging module 210.
  • The ranging module 210 includes a transmitter 203 (which may include the above-mentioned transmitting circuit), a collimating element 204, a detector 205 (which may include the above-mentioned receiving circuit, sampling circuit, and arithmetic circuit), and an optical path changing element 206.
  • the ranging module 210 is used to emit a light beam, receive the return light, and convert the return light into an electrical signal.
  • the transmitter 203 can be used to emit a light pulse sequence.
  • the transmitter 203 may emit a sequence of laser pulses.
  • the laser beam emitted by the transmitter 203 is a narrow-bandwidth beam with a wavelength outside the visible light range.
  • The collimating element 204 is arranged on the exit light path of the transmitter and collimates the light beam emitted from the transmitter 203 into parallel light output to the scanning module.
  • The collimating element is also used to condense at least a part of the return light reflected by the detected object.
  • the collimating element 204 may be a collimating lens or other elements capable of collimating light beams.
  • the light path changing element 206 is used to combine the transmitting light path and the receiving light path in the detection device before the collimating element 204, so that the transmitting light path and the receiving light path can share the same collimating element, making the light path more compact.
  • the transmitter 203 and the detector 205 may use respective collimating elements, and the optical path changing element 206 may be arranged on the optical path behind the collimating element.
  • The light path changing element can use a small-area mirror to combine the transmitting light path and the receiving light path.
  • The optical path changing element may also use a reflector with a through hole: the through hole transmits the emitted light of the transmitter 203, and the reflector reflects the return light to the detector 205. In this way, the blocking of the return light by the bracket of the small mirror, which occurs when a small mirror is used, can be reduced.
  • the optical path changing element deviates from the optical axis of the collimating element 204.
  • the optical path changing element may also be located on the optical axis of the collimating element 204.
  • the detection device 200 further includes a scanning module 202.
  • the scanning module 202 is placed on the exit light path of the distance measuring module 210.
  • the scanning module 202 is used to change the transmission direction of the collimated light beam 219 emitted by the collimating element 204 and project it to the external environment, and project the return light to the collimating element 204 .
  • the returned light is collected on the detector 205 via the collimating element 204.
  • the scanning module 202 may include at least one optical element for changing the propagation path of the light beam, wherein the optical element may change the propagation path of the light beam by reflecting, refracting, or diffracting the light beam.
  • the scanning module 202 includes a lens, a mirror, a prism, a galvanometer, a grating, a liquid crystal, an optical phased array (Optical Phased Array), or any combination of the foregoing optical elements.
  • at least part of the optical elements are moving.
  • a driving module is used to drive the at least part of the optical elements to move.
  • the moving optical elements can reflect, refract, or diffract the light beam to different directions at different times.
  • the multiple optical elements of the scanning module 202 may rotate or vibrate around a common axis 209, and each rotating or vibrating optical element is used to continuously change the propagation direction of the incident light beam.
  • the multiple optical elements of the scanning module 202 may rotate at different speeds or vibrate at different speeds.
  • at least part of the optical elements of the scanning module 202 may rotate at substantially the same rotation speed.
  • the multiple optical elements of the scanning module may also be rotated around different axes.
  • the multiple optical elements of the scanning module may also rotate in the same direction or in different directions; or vibrate in the same direction, or vibrate in different directions, which is not limited herein.
  • the scanning module 202 includes a first optical element 214 and a driver 216 connected to the first optical element 214.
  • The driver 216 is used to drive the first optical element 214 to rotate around the rotation axis 209, changing the direction of the collimated beam 219.
  • The first optical element 214 projects the collimated beam 219 to different directions.
  • The angle between the direction of the collimated light beam 219 changed by the first optical element and the rotation axis 209 changes with the rotation of the first optical element 214.
  • the first optical element 214 includes a pair of opposed non-parallel surfaces through which the collimated light beam 219 passes.
  • the first optical element 214 includes a prism whose thickness varies in at least one radial direction.
  • the first optical element 214 includes a wedge-angle prism, and the collimated beam 219 is refracted.
  • the scanning module 202 further includes a second optical element 215, the second optical element 215 rotates around the rotation axis 209, and the rotation speed of the second optical element 215 is different from the rotation speed of the first optical element 214.
  • the second optical element 215 is used to change the direction of the light beam projected by the first optical element 214.
  • the second optical element 215 is connected to another driver 217, and the driver 217 drives the second optical element 215 to rotate.
  • The first optical element 214 and the second optical element 215 can be driven by the same driver or by different drivers, so that the rotation speeds and/or rotation directions of the first optical element 214 and the second optical element 215 differ, projecting the collimated light beam 219 to different directions in the outside space.
  • the controller 218 controls the drivers 216 and 217 to drive the first optical element 214 and the second optical element 215, respectively.
  • the rotational speeds of the first optical element 214 and the second optical element 215 can be determined according to the expected scanning area and pattern in actual applications.
  • the drivers 216 and 217 may include motors or other drivers.
  • the second optical element 215 includes a pair of opposite non-parallel surfaces through which the light beam passes. In one embodiment, the second optical element 215 includes a prism whose thickness varies in at least one radial direction. In one embodiment, the second optical element 215 includes a wedge prism.
  • the scanning module 202 further includes a third optical element (not shown) and a driver for driving the third optical element to move.
  • the third optical element includes a pair of opposite non-parallel surfaces, and the light beam passes through the pair of surfaces.
  • the third optical element includes a prism whose thickness varies in at least one radial direction.
  • the third optical element includes a wedge prism. At least two of the first, second, and third optical elements rotate at different rotation speeds and/or rotation directions.
  • each optical element in the scanning module 202 can project light to different directions, such as directions 211 and 213, so that the space around the detection device 200 is scanned.
  • The return light 212 reflected by the detected object 201 is incident on the collimating element 204 after passing through the scanning module 202.
  • the detector 205 and the transmitter 203 are placed on the same side of the collimating element 204, and the detector 205 is used to convert at least part of the return light passing through the collimating element 204 into an electrical signal.
  • each optical element is coated with an anti-reflection coating.
  • the thickness of the antireflection film is equal to or close to the wavelength of the light beam emitted by the emitter 203, which can increase the intensity of the transmitted light beam.
  • In some embodiments, a filter layer is coated on the surface of an element located on the beam propagation path in the detection device, or a filter is provided on the beam propagation path, to transmit at least the wavelength band of the beam emitted by the transmitter and reflect other bands, thereby reducing the noise caused by ambient light at the receiver.
  • the transmitter 203 may include a laser diode through which nanosecond laser pulses are emitted.
  • The laser pulse receiving time can be determined, for example, by detecting the rising-edge time and/or the falling-edge time of the electrical signal pulse. In this way, the detection device 200 can calculate the TOF from the pulse receiving time information and the pulse sending time information, and thereby determine the distance between the detected object 201 and the detection device 200.
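  The round-trip relation described above can be sketched in a few lines; the timestamps below are hypothetical example values, not values from the embodiment:

```python
# Time-of-flight ranging: the pulse travels to the object and back,
# so the one-way distance is half the round-trip optical path.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(t_send: float, t_receive: float) -> float:
    """Distance (m) from pulse send/receive timestamps (s)."""
    return C * (t_receive - t_send) / 2.0

# A pulse received 200 ns after emission corresponds to roughly 30 m.
print(tof_distance(0.0, 200e-9))
```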
  • the distance and orientation detected by the detection device 200 can be used for remote sensing, obstacle avoidance, surveying and mapping, modeling, navigation, and the like.
  • the detection device of the embodiment of the present invention can be applied to a movable platform, and the detection device can be installed on the platform body of the movable platform.
  • a movable platform with a detection device can measure the external environment, for example, measuring the distance between the movable platform and an obstacle for obstacle avoidance and other purposes, and for two-dimensional or three-dimensional mapping of the external environment.
  • the movable platform includes at least one of an unmanned aerial vehicle, a car, a remote control car, a robot, and a camera.
  • When the detection device is applied to an unmanned aerial vehicle, the platform body is the fuselage of the unmanned aerial vehicle.
  • When the detection device is applied to a car, the platform body is the body of the car. The car can be a self-driving car or a semi-self-driving car, and there is no restriction here.
  • When the detection device is applied to a remote control car, the platform body is the body of the remote control car.
  • When the detection device is applied to a robot, the platform body is the robot.
  • When the detection device is applied to a camera, the platform body is the camera itself.
  • Fig. 3 is a schematic flowchart of a method for calibrating external parameters of a detection device according to an embodiment of the present invention.
  • The method provided by the embodiment of the present invention is suitable for calibrating the external parameters between two detection devices. Take two detection devices as an example: a first detection device and a second detection device whose visible areas do not overlap each other.
  • After N moves, where N is greater than or equal to 1, the visible area of the first detection device overlaps the visible area of the second detection device in the initial state. As shown in FIG. 3, the method includes the following steps S101-S103:
  • the above-mentioned first offset parameter is a mutual conversion relationship parameter between the coordinate system before and after the movement of the first detection device, and is used to characterize the relative position relationship before and after the movement of the first detection device.
  • Lidar is a perceptual sensor that can obtain three-dimensional information of the scene.
  • The basic principle is to actively emit laser pulse signals toward the detected object and receive the reflected laser pulse signals; the depth of the detected object is calculated from the round-trip time, and the angle of the detected object relative to the lidar is obtained from the emission direction.
  • Combining the depth information and the angle information yields a large number of detection points; the data set of these detection points is called a point cloud, from which the three-dimensional information of the detected object relative to the lidar can be reconstructed.
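  The reconstruction of a detection point from depth and angle information can be sketched as follows; the angle convention (azimuth in the horizontal plane, elevation measured from it) is an assumption for illustration, not taken from the embodiment:

```python
import math

def detection_point(depth: float, azimuth: float, elevation: float):
    """One lidar detection point: depth plus emission direction
    (azimuth and elevation in radians) -> (x, y, z) coordinates
    in the lidar's own coordinate system."""
    x = depth * math.cos(elevation) * math.cos(azimuth)
    y = depth * math.cos(elevation) * math.sin(azimuth)
    z = depth * math.sin(elevation)
    return (x, y, z)

# A point cloud is simply the set of such points over many returns.
cloud = [detection_point(d, az, el)
         for d, az, el in [(10.0, 0.0, 0.0), (10.0, math.pi / 2, 0.0)]]
```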
  • The method of calculating the first offset parameter between the coordinate systems before and after each movement of the first detection device includes: each movement of the first detection device is set so that the visible area before the movement overlaps the visible area after the movement, and the first offset parameter is calculated from the relationship satisfied by the point clouds of the visible area before and after the movement.
  • FIG. 4 is a schematic flowchart of calculating a first offset parameter according to an embodiment of the present invention.
  • calculating the first offset parameter according to the acquired point cloud of the visible area before and after each movement includes the following steps S201-S202:
  • S201: Establish a first coordinate relationship function to be satisfied by the point clouds of the overlapping area of the visible areas before and after each movement of the first detection device.
  • The point cloud includes three-dimensional coordinate data. Since the visible area before a movement of the first detection device overlaps the visible area after the movement, the point cloud of the visible area before the movement and the point cloud of the visible area after the movement contain at least one set of point pairs corresponding to the same points in the actual three-dimensional space. Denote such a pair p_i and q_i: the point p_i of the visible area before the movement and the point q_i of the visible area after the movement correspond to the same point in the actual three-dimensional space. Based on this, the first coordinate relationship function is established as formula (1):

    q_i = R * p_i + t    (1)
  • the coefficients of the first coordinate relationship function are a parameter R and a parameter t respectively.
  • the parameter R is the first rotation matrix
  • t is the first translation matrix.
  • The first objective function of formula (2) is minimized to obtain the parameter R and the parameter t, that is, the first rotation matrix and the first translation matrix:

    (R, t) = argmin Σ_{i=1..n} || R * p_i + t - q_i ||²    (2)

    where n is the number of corresponding point pairs taken from the point cloud of the visible area obtained before the movement of the first detection device and the point cloud of the visible area after the movement. The first offset parameter between the coordinate systems before and after each movement of the first detection device obtained in this embodiment includes the first rotation matrix and the first translation matrix.
  • the above-mentioned method of minimizing the first objective function may be implemented using a nonlinear optimization method or a singular value decomposition (SVD) method.
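  One way to sketch the SVD-based minimization is the standard Kabsch scheme, assuming the corresponding point pairs p_i and q_i have already been matched; NumPy is used for illustration:

```python
import numpy as np

def first_offset(P: np.ndarray, Q: np.ndarray):
    """Given matched points P (before the move) and Q (after the move),
    each of shape (n, 3), return the rotation R and translation t
    minimising sum_i ||R p_i + t - q_i||^2, solved in closed form
    via singular value decomposition."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so that det(R) = +1.
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```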
  • The first transfer matrix is obtained from the first rotation matrix and the first translation matrix; the obtained first transfer matrix ζ1 is given by formula (3):

    ζ1 = [ R  t ]
         [ 0  1 ]    (3)
  • The first offset parameter obtained by the method provided in this embodiment may further include the first transfer matrix.
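  Under this homogeneous-coordinate form, assembling a transfer matrix from a rotation matrix and a translation vector can be sketched as:

```python
import numpy as np

def transfer_matrix(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Assemble the 4x4 transfer matrix [[R, t], [0, 1]] from a
    3x3 rotation matrix R and a translation vector t."""
    zeta = np.eye(4)
    zeta[:3, :3] = R
    zeta[:3, 3] = t
    return zeta

# Applying the matrix to a point in homogeneous coordinates performs R p + t.
zeta = transfer_matrix(np.eye(3), np.array([1.0, 2.0, 3.0]))
```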
  • S102 Calculate a second offset parameter between the coordinate system of the first detection device after N moves and the coordinate system of the second detection device in the initial state.
  • The foregoing method of calculating the second offset parameter includes: the second offset parameter is calculated from the point cloud of the visible area of the first detection device after the N moves and the point cloud of the visible area of the second detection device in the initial state.
  • FIG. 5 is a schematic flowchart of calculating a second offset parameter according to an embodiment of the present invention.
  • Calculating the second offset parameter from the point cloud of the visible area of the first detection device after the N moves and the point cloud of the visible area of the second detection device in the initial state includes the following steps S301-S302:
  • S301 Establish a second coordinate relationship function to be satisfied by the point cloud of the overlapping area of the visible area of the first detection device after N moves and the visible area of the second detection device in the initial state.
  • Since the visible area of the first detection device after the N moves and the visible area of the second detection device in the initial state have an overlapping area, the point cloud of the first detection device after the N moves and the point cloud of the second detection device in the initial state contain at least one set of point pairs corresponding to the same points in the actual three-dimensional space. Denote such a pair X_i and Y_i: the point X_i of the visible area of the first detection device after the N moves and the point Y_i of the visible area of the second detection device in the initial state correspond to the same point in the actual three-dimensional space. Based on this, the second coordinate relationship function is established as formula (4):

    Y_i = R_0 * X_i + t_0    (4)
  • The coefficients of the above-mentioned second coordinate relationship function are the parameter R_0 and the parameter t_0; the parameter R_0 is the second rotation matrix, and the parameter t_0 is the second translation matrix.
  • The second objective function of formula (5) is minimized to obtain these parameters:

    (R_0, t_0) = argmin Σ_i || R_0 * X_i + t_0 - Y_i ||²    (5)

  • The second offset parameter obtained in this embodiment includes the second rotation matrix and the second translation matrix.
  • The minimization of the second objective function may also be implemented using a nonlinear optimization method or a singular value decomposition (SVD) method.
  • The second transfer matrix ζ0 is obtained from the second rotation matrix and the second translation matrix; the obtained second transfer matrix ζ0 is given by formula (6):

    ζ0 = [ R_0  t_0 ]
         [ 0    1  ]    (6)
  • the foregoing calculation of the external parameters between the first detection device and the second detection device according to the first offset parameter and the second offset parameter includes:
  • the external parameters between the first detection device and the second detection device characterize the relative positional relationship between the first detection device and the second detection device in space, and are used in subsequent processing operations such as point cloud fusion.
  • Through the N movements, the visible area of the first detection device is made to overlap the visible area of the second detection device in the initial state, and a first transfer matrix is calculated for each movement of the first detection device. As an example, suppose the visible area of the first detection device after three movements overlaps the visible area of the second detection device in the initial state.
  • the visible area of the first detection device in the initial state is A0
  • the visible area of the first detection device after the first movement is the visible area A1
  • The visible area of the first detection device after the second movement is the visible area A2.
  • the visible area of the first detection device after the third movement is the visible area A3
  • the visible area of the second detection device in the initial state is the visible area B0.
  • the first transfer matrix Φ1 is calculated from the point cloud of the visible area A0 and the point cloud of the visible area A1
  • the first transfer matrix Φ2 is calculated from the point cloud of the visible area A1 and the point cloud of the visible area A2
  • the first transfer matrix Φ3 is calculated from the point cloud of the visible area A2 and the point cloud of the visible area A3
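Assuming each transfer matrix maps coordinates in the frame after a movement to the frame before it, the overall external parameter is simply the product of the per-movement matrices and the cross-device matrix. The ordering convention below is one consistent choice for a sketch, not necessarily the embodiment's exact convention.

```python
import numpy as np
from functools import reduce

def compose_external(phis):
    """Multiply a sequence of 4x4 transfer matrices, e.g.
    [Phi1, Phi2, Phi3, Phi0], into one overall external-parameter
    matrix relating the two detection devices."""
    return reduce(np.matmul, phis)
```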
  • the first detection device and the second detection device mentioned above in the embodiment of the present application start to rotate at the same time around the same center point when they move; this rotation can be realized by a movable platform (such as a car) driving the first detection device and the second detection device to turn, or by rotating in place; the specific rotation mode is not limited here.
  • the above-mentioned first detection device and second detection device in the embodiment of the present application translate along the same straight line when moving; the translation may be realized by a movable platform (such as a car) driving the first detection device and the second detection device to translate along the same straight line.
  • any movement mode in which the visible area of the first detection device after N movements overlaps the visible area of the second detection device in the initial state is applicable to the present invention; the present invention places no specific restrictions on the movement mode.
  • the first detection device rotates in a direction approaching the second detection device; in this way, the number of rotations is reduced.
  • the first detection device 60 and the second detection device 70 are installed on the same straight line; in this case, no matter which direction the first detection device 60 rotates, it must rotate through the same total angle before its visible area overlaps the visible area of the second detection device 70 in the initial state.
  • the angle between the first detection device 60 and the second detection device 70, the field of view angle of the first detection device 60, and the field of view angle of the second detection device 70 can be determined in advance before the rotation; on the condition that the visible areas of the first detection device 60 before and after each rotation overlap, the number of rotations required when each rotation covers the same angle is determined, and the first detection device 60 and the second detection device 70 then rotate according to that number of rotations.
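A hypothetical sketch of that pre-computation follows; the minimum-overlap margin `overlap_deg` and the equal-angle stepping are assumptions made for illustration, not values given in the embodiment.

```python
import math

def required_rotations(total_angle_deg, fov_deg, overlap_deg=5.0):
    """Smallest number N of equal rotation steps covering
    total_angle_deg while each step keeps at least overlap_deg of
    the field of view shared with the previous pose, so the visible
    areas before and after every rotation still overlap."""
    max_step = fov_deg - overlap_deg
    if max_step <= 0:
        raise ValueError("field of view too narrow for the requested overlap")
    return math.ceil(total_angle_deg / max_step)
```

For example, covering 180° with a 70° field of view and a 10° overlap margin allows steps of at most 60°, hence three rotations.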
  • Fig. 8 is a schematic diagram of a data processing device provided by an embodiment of the present invention.
  • the data processing device 1000 includes at least a memory 1002 and a processor 1001; the memory 1002 is connected to the processor 1001 through a communication bus 1003, and is used to store computer instructions executable by the processor 1001;
  • the processor 1001 is configured to read computer instructions from the memory 1002 to implement the method for calibrating the external parameters of a detection device, which is suitable for calibrating the external parameters between a first detection device and a second detection device;
  • the visible area of the first detection device and the visible area of the second detection device do not overlap each other, and the visible area of the first detection device after N movements overlaps the visible area of the second detection device in the initial state, where N is greater than or equal to 1; the method includes:
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • a first objective function is established according to the coefficients of the first coordinate relationship function, the first objective function is minimized using the obtained point clouds of the visible area before and after each movement, and the first rotation matrix and the first translation matrix are calculated.
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • the first transfer matrix is calculated according to the first rotation matrix and the first translation matrix.
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • the second offset parameter is calculated from the point cloud of the visible area of the first detection device after N movements and the point cloud of the visible area of the second detection device in the initial state.
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • a second objective function is established according to the coefficients of the second coordinate relationship function, the second objective function is minimized using the obtained point cloud of the visible area of the first detection device after N movements and the point cloud of the visible area of the second detection device in the initial state, and the second rotation matrix and the second translation matrix are calculated.
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • the second transfer matrix is calculated according to the second rotation matrix and the second translation matrix.
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • the first detection device and the second detection device simultaneously start to rotate around the same center point.
  • the first detection device rotates in a direction approaching the second detection device.
  • the processor 1001 is further configured to read computer instructions from the memory 1002 to implement:
  • according to the external parameters between the first detection device and the second detection device, the external parameters between the first detection device and other detection devices are calculated.
  • both the first detection device and the second detection device include at least one of the following: lidar, millimeter wave radar, and ultrasonic radar.
  • the aforementioned data processing device is a host computer.
  • the aforementioned data processing device is provided in the detection device.
  • the embodiment of the present invention also provides a detection system; as shown in FIG. 9, it includes a plurality of detection devices and the above-mentioned data processing device 1000, and the plurality of detection devices includes a first detection device 60 and a second detection device 70 (only two detection devices are shown in the figure); the multiple detection devices are mounted on the same carrier (not shown in the figure), and the carrier includes a movable platform, through which the detection devices are driven to move.
  • the above-mentioned movable platform includes any one of a vehicle, an aircraft, and a turntable.
  • the above-mentioned turntable may be installed on a vehicle or an aircraft, and the rotation of the turntable drives the rotation of the detection device for external parameter calibration.
  • for the relevant parts, reference may be made to the description of the method embodiment.
  • the device embodiments described above are merely illustrative.
  • the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units.
  • Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement it without creative work.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

Provided are a method for calibrating external parameters of a detection device, a data processing device, and a detection system. The method is used to calibrate external parameters between a first detection device (60) and a second detection device (70). The visible areas of the first detection device (60) and the second detection device (70) do not overlap each other, and the visible area of the first detection device (60) after N movements overlaps the visible area of the second detection device (70) in the initial state, N being greater than or equal to 1. The method comprises: calculating a first offset parameter between the coordinate systems of the first detection device (60) before and after each movement (S101); calculating a second offset parameter between the coordinate system of the first detection device (60) after N movements and the coordinate system of the second detection device (70) in the initial state (S102); and calculating external parameters between the first detection device (60) and the second detection device (70) according to the first offset parameter and the second offset parameter (S103). In this way, the efficiency of calibrating the external parameters of detection devices can be improved.
PCT/CN2019/073990 2019-01-30 2019-01-30 Method for calibrating external parameters of a detection device, data processing device and detection system WO2020154980A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980005318.XA CN111771140A (zh) 2019-01-30 2019-01-30 Method for calibrating external parameters of a detection device, data processing device and detection system
PCT/CN2019/073990 WO2020154980A1 (fr) 2019-01-30 2019-01-30 Method for calibrating external parameters of a detection device, data processing device and detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/073990 WO2020154980A1 (fr) 2019-01-30 2019-01-30 Method for calibrating external parameters of a detection device, data processing device and detection system

Publications (1)

Publication Number Publication Date
WO2020154980A1 true WO2020154980A1 (fr) 2020-08-06

Family

ID=71840661

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/073990 WO2020154980A1 (fr) 2019-01-30 2019-01-30 Method for calibrating external parameters of a detection device, data processing device and detection system

Country Status (2)

Country Link
CN (1) CN111771140A (fr)
WO (1) WO2020154980A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462350B (zh) * 2020-12-10 2023-04-04 苏州一径科技有限公司 雷达标定方法及装置、电子设备及存储介质
WO2022257138A1 (fr) * 2021-06-11 2022-12-15 深圳市大疆创新科技有限公司 Procédé et appareil d'étalonnage, et radar laser, système de détection et support de stockage
CN114646932B (zh) * 2022-05-23 2022-10-21 深圳元戎启行科技有限公司 基于外置雷达的雷达外参标定方法、装置和计算机设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107229043A (zh) * 2017-05-22 2017-10-03 中国农业科学院农业资源与农业区划研究所 一种距离传感器外参数标定方法和系统
CN108020825A (zh) * 2016-11-03 2018-05-11 岭纬公司 激光雷达、激光摄像头、视频摄像头的融合标定系统及方法
CN108226906A (zh) * 2017-11-29 2018-06-29 深圳市易成自动驾驶技术有限公司 一种标定方法、装置及计算机可读存储介质
CN109001711A (zh) * 2018-06-05 2018-12-14 北京智行者科技有限公司 多线激光雷达标定方法
US20180372852A1 (en) * 2017-06-22 2018-12-27 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for calibration between laser radar and camera, device and storage medium
CN109215083A (zh) * 2017-07-06 2019-01-15 华为技术有限公司 车载传感器的外部参数标定的方法和设备

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7643135B1 (en) * 2008-12-05 2010-01-05 Leica Geosystems Ag Telescope based calibration of a three dimensional optical scanner
CN101922974B (zh) * 2010-08-31 2012-02-01 中国科学院西安光学精密机械研究所 一种激光参数性能测试自动标定装置及其方法
CN107796370B (zh) * 2016-08-30 2020-09-08 北京四维图新科技股份有限公司 用于获取转换参数的方法、装置及移动测图系统
WO2018218629A1 (fr) * 2017-06-01 2018-12-06 深圳市大疆创新科技有限公司 Procédé et dispositif de détection reposant sur un lidar et équipement de sondage

Also Published As

Publication number Publication date
CN111771140A (zh) 2020-10-13

Similar Documents

Publication Publication Date Title
WO2020082363A1 (fr) Système de détection d'environnement et plateforme mobile
CN111902730B (zh) 一种标定板、深度参数标定方法、探测装置及标定系统
CN210038146U (zh) 测距模组、测距装置及可移动平台
WO2020154980A1 (fr) Procédé d'étalonnage de paramètres externes d'un dispositif de détection, dispositif de traitement de données et système de détection
CN210142187U (zh) 一种距离探测装置
CN209356678U (zh) 测距装置
US20220120899A1 (en) Ranging device and mobile platform
CN210199305U (zh) 一种扫描模组、测距装置及可移动平台
WO2020177076A1 (fr) Procédé et appareil d'étalonnage de l'état initial d'un appareil de détection
WO2020237663A1 (fr) Procédé d'interpolation de nuage de points lidar multi-canal et appareil de télémétrie
WO2020133384A1 (fr) Dispositif de télémétrie laser et plateforme mobile
US20220082665A1 (en) Ranging apparatus and method for controlling scanning field of view thereof
US20210341588A1 (en) Ranging device and mobile platform
US20210333374A1 (en) Ranging apparatus and mobile platform
US20210333399A1 (en) Detection method, detection device, and lidar
WO2020142909A1 (fr) Procédé de synchronisation de données, système de radar distribué, et plateforme mobile
WO2020147121A1 (fr) Procédé de mesure de précipitations, dispositif de détection et support de stockage lisible
WO2021026766A1 (fr) Procédé et dispositif de commande de la vitesse de rotation d'un moteur pour module de balayage, et dispositif de mesure de distance
US20210333369A1 (en) Ranging system and mobile platform
WO2020107379A1 (fr) Procédé de correction de réflectivité pour utilisation dans un appareil de télémétrie, et appareil de télémétrie
WO2020133038A1 (fr) Système de détection et plateforme mobile comprenant un système de détection
WO2022226984A1 (fr) Procédé de commande de champ de vision de balayage, appareil de télémétrie et plateforme mobile
WO2020142879A1 (fr) Procédé de traitement de données, dispositif de détection, dispositif de traitement de données et plateforme mobile
WO2020142893A1 (fr) Procédé de détection d'accès à un radar, circuit et plateforme mobile
WO2020150961A1 (fr) Dispositif de détection et plateforme mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913089

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913089

Country of ref document: EP

Kind code of ref document: A1