WO2023133939A1 - Image fusion laser radar detection system and method - Google Patents

Image fusion laser radar detection system and method

Info

Publication number
WO2023133939A1
WO2023133939A1 PCT/CN2022/073845 CN2022073845W WO2023133939A1 WO 2023133939 A1 WO2023133939 A1 WO 2023133939A1 CN 2022073845 W CN2022073845 W CN 2022073845W WO 2023133939 A1 WO2023133939 A1 WO 2023133939A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
signal
detection module
image
laser
Prior art date
Application number
PCT/CN2022/073845
Other languages
English (en)
French (fr)
Inventor
时菲菲
王世玮
郑睿童
沈罗丰
Original Assignee
探维科技(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 探维科技(北京)有限公司
Publication of WO2023133939A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • the present disclosure relates to the technical field of lidar, and in particular to an image fusion laser radar detection system and method.
  • in the current field of autonomous driving, the lateral resolution of the laser detection components in a lidar system is low, while an image sensor offers a high lateral resolution for acquiring two-dimensional images but is not capable of direct three-dimensional imaging.
  • in the prior art, the lidar point cloud data and the image data are usually acquired separately and then fused; however, fusion methods based on image-processing algorithms place high demands on the density of the lidar point cloud, and the algorithms are complicated.
  • time synchronization means that a unified external clock source provides the same reference time to every sensor, and each sensor then stamps the different types of data it collects with timestamps derived from its calibrated local time, so that the timestamps of all sensors are synchronized.
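  • purely as an illustration of this conventional timestamp-based approach (it is not part of the disclosed system), the following sketch shows two independently sampling sensors stamping their data against one shared reference clock; the sensor names, payloads and the use of time.monotonic are illustrative assumptions.

```python
import time

class StampedSensor:
    """Wraps a sensor so every sample carries a timestamp from a shared clock."""

    def __init__(self, name, read_fn, clock=time.monotonic):
        self.name = name          # e.g. "lidar" or "camera" (illustrative only)
        self.read_fn = read_fn    # callable returning one raw sample
        self.clock = clock        # unified reference clock shared by all sensors

    def sample(self):
        # Stamp the data at acquisition time using the common clock.
        return {"sensor": self.name, "t": self.clock(), "data": self.read_fn()}

# Each sensor samples on its own period, so the timestamps rarely coincide
# exactly, which is the synchronization difficulty described above.
lidar = StampedSensor("lidar", read_fn=lambda: "point-cloud column")
camera = StampedSensor("camera", read_fn=lambda: "image column")
print(lidar.sample(), camera.sample())
```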
  • at present, most sensor systems on autonomous vehicles support time synchronization methods that carry Global Positioning System (GPS) timestamps, but problems remain with time synchronization.
  • the technical problem to be solved by the present disclosure is that, in existing lidar systems, the differing acquisition periods of the various sensors make it difficult to guarantee that different types of sensors capture the same information at the same moment.
  • embodiments of the present disclosure provide an image fusion laser radar detection system and method.
  • in a first aspect, the present disclosure provides an image fusion laser radar detection system; the system includes a laser emission module, a spectroscopic module, a photoelectric detection module and an image detection module, the photoelectric detection module and the image detection module being arranged on opposite sides of the spectroscopic module;
  • the laser emitting module is used to emit laser signals to the area to be detected
  • the beam splitting module is used to split the echo signal reflected by the object in the area to be detected into a first signal and a second signal;
  • the photoelectric detection module is used to receive the first signal, so as to determine the point cloud data
  • the image detection module is used to receive the second signal so as to determine the image data
  • the spectroscopic module is used to send the first signal to the photoelectric detection module and to send the second signal, from the same position on the spectroscopic module, to the image detection module;
  • the laser emission module and the image detection module are triggered to work at the same time.
  • the photoelectric detection module and the image detection module are arranged mirror-symmetrically along the plane where the spectroscopic module receives echo signals.
  • the difference between the distance from the photoelectric detection module to the transmissive-reflective surface of the spectroscopic module and the distance from the image detection module to the transmissive-reflective surface of the spectroscopic module is equal to or less than 5 mm.
  • the spectroscopic module is used to transmit the first signal and reflect the second signal; or, the spectroscopic module is used to transmit the second signal and reflect the first signal.
  • the photodetection module is arranged on a side corresponding to the first signal, and the image detection module is arranged on a side corresponding to the second signal.
  • the echo signal includes a laser signal and a visible light signal reflected by an object in the region to be detected;
  • the first signal is the laser signal; the second signal is the visible light signal.
  • the laser emitting module is used to emit a line laser signal to the area to be detected; the photoelectric detection module and the image detection module are both linear array detectors.
  • a focusing lens group is also included, and the focusing lens group is arranged in front of the spectroscopic module for irradiating the focused echo signal onto the spectroscopic module.
  • the spectroscopic module includes a dichroic mirror; the included angle R between the dichroic mirror and the optical axis of the echo signal satisfies: 40° ≤ R ≤ 50°.
  • in a second aspect, an embodiment of the present disclosure also provides an image fusion laser radar detection method, the method including: simultaneously triggering the laser emission module and the image detection module; receiving a first signal with the photoelectric detection module; receiving a second signal with the image detection module; and performing fusion of point cloud data and image data based on the first signal and the second signal.
  • in a third aspect, an embodiment of the present disclosure further provides an electronic device, including a processor and a memory for storing instructions executable by the processor, wherein the processor invokes the program or instructions stored in the memory to execute some or all of the steps of the various implementations of the image fusion laser radar detection method provided by the second aspect of the present disclosure.
  • in a fourth aspect, an embodiment of the present disclosure also provides a computer storage medium, wherein the computer storage medium can store a program which, when executed, can implement some or all of the steps of the various implementations of the image fusion laser radar detection method provided by the second aspect of the present disclosure.
  • the embodiments of the present disclosure provide an image fusion laser radar detection system and method, which couple the photoelectric detection module and the image detection module into the same radar detection system; this eliminates the cumbersome position conversion relationship, allows the time synchronization accuracy to reach the microsecond level, and enables target recognition and fusion of images and point clouds to be completed without additional computing power.
  • FIG. 1 is a schematic structural diagram of an image fusion laser radar detection system provided by an embodiment of the present disclosure
  • Fig. 2 is a schematic structural diagram of the image fusion laser radar detection system shown in Fig. 1 under another viewing angle;
  • FIG. 3 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure
  • Fig. 4 is a schematic structural diagram of the image fusion laser radar detection system shown in Fig. 3 under another viewing angle;
  • FIG. 5 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure.
  • Fig. 6 is the spectral transmittance curve of a 45-degree visible-light dichroic mirror;
  • FIG. 7 illustrates a detection method provided by an embodiment of the present disclosure;
  • FIG. 8 is a schematic diagram of a comparison between a time synchronization method in the prior art and a synchronization method provided by an embodiment of the present disclosure
  • FIG. 9 is a schematic diagram of the effect of the image fusion laser provided by the embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
  • FIG. 1 is a schematic structural diagram of an image fusion laser radar detection system provided by an embodiment of the present disclosure
  • FIG. 2 is a structural schematic diagram of the image fusion laser radar detection system shown in FIG. 1 under another viewing angle.
  • as shown in FIG. 1 and FIG. 2, the system includes a laser emission module 1, a spectroscopic module 2, a photoelectric detection module 3 and an image detection module 4; the photoelectric detection module 3 and the image detection module 4 are arranged on opposite sides of the spectroscopic module 2.
  • the laser emission module 1 is used to emit a laser signal to the area to be detected; the signal reflected by an object in the area to be detected is the echo signal, and the spectroscopic module 2 is used to split the echo signal reflected by the object in the area to be detected into a first signal and a second signal.
  • the spectroscopic module 2 sends the first signal to the photoelectric detection module 3, and the photoelectric detection module 3 is used to receive the first signal so as to determine point cloud data.
  • the spectroscopic module 2 sends the second signal to the image detection module 4, and the image detection module 4 is used to receive the second signal so as to determine image data. Moreover, the spectroscopic module 2 sends the first signal to the photoelectric detection module 3 and sends the second signal, from the same position on the spectroscopic module 2, to the image detection module 4. The laser emission module 1 and the image detection module 4 are triggered to work simultaneously.
  • the embodiment of the present disclosure provides an image fusion laser radar detection system which couples the photoelectric detection module and the image detection module into the same system; this eliminates the cumbersome position conversion relationship, allows the time synchronization accuracy to reach the microsecond level, and enables target recognition and fusion of images and point clouds to be completed without additional computing power.
  • the laser emission module and the image detection module are triggered to work at the same time, so that time alignment at the acquisition level of the point cloud data and the image data can be realized; and since the photoelectric detection module and the image detection module are coupled into the same radar detection system, it is easy to trigger the laser emission module and the image detection module simultaneously and to guarantee that the laser and the image detect the same object at the same moment. In this way, structured point cloud data and image data with timestamps or a time-series order can be obtained, thereby realizing time synchronization of the point cloud data and the image data.
  • the photoelectric detection module and the image detection module are arranged mirror-symmetrically along the plane where the spectroscopic module receives the echo signal.
  • the structure for receiving the echo signal by the spectroscopic module is a planar structure, and the photoelectric detection module and the image detection module are arranged on both sides of the plane where the spectroscopic module receives the echo signal, and are mirror-symmetrically arranged along the plane.
  • the photodetection module 3 and the image detection module 4 are mirror-symmetrically arranged along the plane where the splitter module 2 receives echo signals.
  • the difference between the distance from the photoelectric detection module 3 to the transmissive-reflective surface of the spectroscopic module 2 and the distance from the image detection module 4 to that surface is equal to or less than 5 mm; that is, the distance from any position on the transmissive-reflective surface of the spectroscopic module 2 to the photoelectric detection module 3 differs from the distance from the same position to the image detection module 4 by no more than 5 mm.
  • the image fusion laser radar detection system provided by the embodiments of the present disclosure can realize the spatial synchronization of the image detection module and the photoelectric detection module.
  • the spatial synchronization refers to converting measured values based on the coordinate systems of different detection modules into the same coordinate system; for example, to fuse radar point cloud data and image data it is necessary to establish accurate coordinate transformation relationships among the three-dimensional world coordinate system, the radar coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system.
  • in the system provided by the embodiments of the present disclosure, through the design of the optical system, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal; the optical path design guarantees the spatial mirror-symmetry relationship between the photoelectric detection module and the image detection module, so the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be merged into the same coordinate system simply by changing the sign of one axis, which greatly reduces the complexity of fusing radar point cloud data with image data.
  • in addition, through machining precision, the positional accuracy of the photoelectric detection module and the image detection module can be guaranteed to within ±0.1 mm. Therefore, the image fusion laser radar detection system of the present disclosure already merges the coordinate system of the photoelectric detection module and the coordinate system of the image detection module into the same coordinate system in hardware, so spatial synchronization is achieved without complex coordinate conversion, as illustrated in the sketch below.
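  • a minimal sketch of what this hardware-level spatial synchronization means for the data: because the two detectors are mirror images of each other about the beam-splitting plane, mapping coordinates from one detector frame into the other reduces to flipping the sign of one axis; the array shapes and the choice of which axis is mirrored are illustrative assumptions, not specified by the disclosure.

```python
import numpy as np

def image_to_lidar_frame(pixel_coords, mirror_axis=0):
    """Map detector coordinates across the mirror plane by negating one axis.

    pixel_coords: (N, 2) array of coordinates in the image-detector frame.
    mirror_axis:  which axis is reversed by the mirror symmetry (assumed here).
    """
    mapped = np.array(pixel_coords, dtype=float)
    mapped[:, mirror_axis] = -mapped[:, mirror_axis]  # the only change needed
    return mapped

# With conventional, separately mounted sensors this step would instead require
# a full extrinsic calibration (rotation + translation between the two frames).
pts = np.array([[3.0, 7.0], [-2.0, 5.0]])
print(image_to_lidar_frame(pts))
```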
  • the light splitting module 2 is configured to transmit the second signal and reflect the first signal.
  • the second signal transmitted by the spectroscopic module 2 is received by the image detection module 4
  • the first signal reflected by the spectroscopic module 2 is received by the photodetection module 3 .
  • the photodetection module is arranged on the side corresponding to the first signal
  • the image detection module is arranged on the side corresponding to the second signal. In this way, it is convenient for each detection module to receive corresponding signals, which makes the structure design of the whole radar detection system simpler.
  • the echo signal includes a laser signal and a visible light signal reflected by objects in the area to be detected.
  • the first signal is a laser signal
  • the second signal is a visible light signal.
  • the image detection module receives the visible light signal and determines the visible light signal as image data.
  • the photoelectric detection module receives the laser signal and determines the laser signal as point cloud data.
  • FIG. 3 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure;
  • FIG. 4 is a schematic structural diagram of the image fusion laser radar detection system shown in FIG. 3 from another viewing angle.
  • as shown in FIG. 3 and FIG. 4, the system includes a laser emission module 1, a spectroscopic module 2, a photoelectric detection module 3 and an image detection module 4; the photoelectric detection module 3 and the image detection module 4 are respectively arranged on opposite sides of the spectroscopic module 2.
  • the laser emission module 1 is used to emit a laser signal to the area to be detected; the signal reflected by an object in the area to be detected is the echo signal, and the spectroscopic module 2 is used to split the echo signal reflected by the object in the area to be detected into a first signal and a second signal.
  • the light splitting module 2 is used to transmit the first signal and reflect the second signal.
  • the spectroscopic module 2 transmits the first signal to the photodetection module 3, and the photodetection module 3 is used to receive the first signal so as to determine point cloud data.
  • the spectroscopic module 2 transmits the second signal to the image detection module 4, and the image detection module 4 is used to receive the second signal so as to determine the image data.
  • the spectroscopic module 2 is used to send the first signal to the photodetection module 3 , and send the second signal to the image detection module 4 at the same position of the spectroscopic module 2 .
  • the laser emission module 1 and the image detection module 4 are triggered to work simultaneously.
  • the image fusion laser radar detection system provided by the embodiment of the present disclosure can avoid the cumbersome position conversion relationship, the time synchronization accuracy can reach the microsecond level, and pixel-level spatial synchronization and time synchronization fusion are strictly realized; at the same time, target recognition and fusion of the image and the point cloud can be completed without adding additional computing power.
  • the laser emission module 1 and the image detection module 4 are triggered to work at the same time, so that the time alignment of the point cloud data and the image data acquisition level can be realized.
  • meanwhile, since the photoelectric detection module and the image detection module are coupled into the same radar detection system, the laser emission module 1 and the image detection module 4 can be triggered simultaneously, and it is guaranteed that the same object is detected by the laser and the image at the same moment. In this way, structured point cloud data and image data with timestamps or a time-series order can be obtained, thereby realizing time synchronization of the point cloud data and the image data.
  • the photoelectric detection module and the image detection module are arranged mirror-symmetrically along the plane where the spectroscopic module receives the echo signal.
  • the structure for receiving the echo signal by the splitting module is a planar structure, and the photodetection module and the image detection module are arranged on both sides of the plane where the splitting module receives the echo signal, and are mirror-symmetrically arranged along the plane.
  • the photodetection module 3 and the image detection module 4 are mirror-symmetrically arranged along the plane where the splitter module 2 receives echo signals.
  • the difference between the distance from the photoelectric detection module 3 to the transmissive-reflective surface of the spectroscopic module 2 and the distance from the image detection module 4 to that surface is equal to or less than 5 mm; that is, the distance from any position on the transmissive-reflective surface of the spectroscopic module 2 to the photoelectric detection module 3 differs from the distance from the same position to the image detection module 4 by no more than 5 mm.
  • the image fusion laser radar detection system provided by the embodiments of the present disclosure can realize the spatial synchronization of the image detection module and the photoelectric detection module.
  • the spatial synchronization refers to converting measured values based on the coordinate systems of different detection modules into the same coordinate system; for example, to fuse radar point cloud data and image data it is necessary to establish accurate coordinate transformation relationships among the three-dimensional world coordinate system, the radar coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system.
  • in the system provided by the embodiments of the present disclosure, through the design of the optical system, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal; the optical path design guarantees the spatial mirror-symmetry relationship between the photoelectric detection module and the image detection module, so the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be merged into the same coordinate system simply by changing the sign of one axis, which greatly reduces the complexity of fusing radar point cloud data with image data.
  • in addition, through machining precision, the positional accuracy of the photoelectric detection module and the image detection module can be guaranteed to within ±0.1 mm. Therefore, the image fusion laser radar detection system of the present disclosure already merges the coordinate system of the photoelectric detection module and the coordinate system of the image detection module into the same coordinate system in hardware, so spatial synchronization is achieved without complicated coordinate conversion.
  • the echo signal includes a laser signal and a visible light signal reflected by objects in the area to be detected.
  • the first signal is a laser signal
  • the second signal is a visible light signal.
  • the image detection module receives the visible light signal and determines the visible light signal as image data.
  • the photoelectric detection module receives the laser signal and determines the laser signal as point cloud data.
  • the laser emitting module is used to emit a line laser signal to the area to be detected; the photoelectric detection module and the image detection module are both linear array detectors.
  • the laser signal emitted by the laser emitting module is beam-shaped to form a line laser signal.
  • the laser echo signals reflected by objects in the area to be detected are corresponding line laser signals.
  • the spectroscopic module is used to split the echo signal, which includes the line laser signal reflected by the object in the area to be detected, into a first signal and a second signal, and the first signal and the second signal are received by the photoelectric detection module and the image detection module respectively.
  • the point cloud data is obtained by the photoelectric detection module receiving the first signal column by column, and the image data is obtained by the image detection module receiving the second signal column by column; each column of point cloud data corresponds one-to-one with the corresponding column of image data in the scanning sequence. In this way, synchronous scanning of the laser and the visible light is realized in hardware, and the acquired point cloud data and image data can be fused with a simple, easy-to-operate and highly accurate process, as illustrated in the sketch below.
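  • the sketch below illustrates this column-wise pairing, assuming both detectors deliver one column per scan step and that the data simply arrive as Python lists indexed by scan order; these data-layout details are assumptions for illustration only.

```python
def fuse_columns(point_cloud_columns, image_columns):
    """Pair lidar and image columns that were captured by the same scan step.

    Both inputs are lists indexed by scan order; because the line laser and the
    line-array image detector are triggered together, column i of one list was
    acquired at the same instant as column i of the other.
    """
    assert len(point_cloud_columns) == len(image_columns)
    fused = []
    for scan_idx, (pc_col, img_col) in enumerate(zip(point_cloud_columns, image_columns)):
        fused.append({"scan": scan_idx, "points": pc_col, "pixels": img_col})
    return fused

# Example with toy per-column data.
fused = fuse_columns([[1.2, 1.3], [2.4, 2.5]], [[10, 11], [12, 13]])
print(fused[0])
```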
  • the image fusion laser radar detection system also includes a focusing lens group 5; as shown in FIG. 1 to FIG. 4, the focusing lens group 5 is arranged in front of the spectroscopic module 2 and is used to direct the focused echo signal onto the spectroscopic module 2.
  • the laser emitting module 1 emits a laser signal to the area to be detected
  • the object in the area to be detected reflects the laser signal as an echo signal
  • the echo signal is irradiated onto the spectroscopic module 2 after being focused by the focusing lens group 5 .
  • the focusing lens group 5 includes a plurality of focusing lenses, and the number of focusing lenses is set according to the optical design of the actual image fusion laser radar detection system, which is not limited in the present disclosure.
  • FIG. 5 is another structural schematic diagram of the image fusion laser radar detection system provided by the embodiment of the present disclosure.
  • the splitting module 2 splits the echo signal into a first signal and a second signal, wherein the first signal is sent to the photodetection module 3 , and the second signal is sent to the image detection module 4 .
  • the photoelectric detection module 3 and the image detection module 4 are arranged mirror-symmetrically along the plane where the spectroscopic module 2 receives echo signals. In this way, the spatial synchronization of the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be realized without complex coordinate transformation, and the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be easily fused in the same coordinate system.
  • the beam splitting module includes a dichroic mirror.
  • the included angle R between the dichroic mirror and the optical axis of the echo signal satisfies: 40° ⁇ R ⁇ 50°.
  • in some embodiments, the included angle R between the dichroic mirror and the optical axis of the echo signal is 45°; when this angle is 45°, it is more favourable for laying out the positions of the devices in the entire image fusion laser radar detection system.
  • the dichroic mirror can be a 45° visible light cold mirror.
  • Figure 6 is the spectral transmittance curve of a 45-degree visible-light dichroic mirror. It can be seen from Figure 6 that the 45-degree visible-light dichroic mirror reflects visible light in the 400 nm-600 nm band and transmits laser light in the 800 nm-1300 nm band. It therefore meets the needs of image fusion with laser detection, namely reflecting visible light while transmitting the laser, as sketched below.
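  • purely as an illustration of the band split read off Figure 6 (roughly, reflect 400 nm-600 nm and transmit 800 nm-1300 nm), the helper below classifies an incoming wavelength; the exact band edges of a real cold mirror would of course come from its datasheet.

```python
def dichroic_route(wavelength_nm):
    """Route a wavelength the way the 45-degree visible-light cold mirror does."""
    if 400 <= wavelength_nm <= 600:
        return "reflected -> image detection module (visible light)"
    if 800 <= wavelength_nm <= 1300:
        return "transmitted -> photoelectric detection module (laser)"
    return "outside the specified bands"

print(dichroic_route(550))   # visible light goes to the image detector
print(dichroic_route(905))   # a typical lidar wavelength passes to the photodetector
```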
  • the laser emission module and the photoelectric detection module are arranged in parallel along the optical axis of the echo signal.
  • the image fusion laser radar detection system includes, for example, a scanning module, the scanning module includes a rotating mirror and a motor, and the motor can control the rotating mirror to scan the laser signal.
  • the scanning module is arranged on the side of the laser emitting module close to the object in the area to be detected.
  • the laser emission module includes a laser and a collimating mirror.
  • the laser is used to emit laser signals.
  • the collimating mirror collimates the laser signal into a line beam and directs it onto the rotating mirror; the beam is reflected by the rotating mirror and, combined with the rotation of the mirror, scans the area to be detected.
  • the photodetection module can be, for example, an avalanche photodiode detector, a silicon photomultiplier tube detector, a single photon avalanche diode detector, a photodiode detector, and the like.
  • the image detection module may be, for example, a charge-coupled device (Charge-coupled Device, CCD) or a complementary metal-oxide-semiconductor (Complementary Metal-Oxide-Semiconductor, CMOS).
  • the embodiment of the present disclosure also provides an image fusion laser radar detection method, which is executed by the image fusion laser radar detection system provided by the embodiment of the present disclosure; as shown in FIG. 7, the method includes the following steps.
  • Step 701 Simultaneously trigger the laser emission module and the image detection module.
  • in the image fusion laser radar detection system, the laser emission module and the image detection module are triggered simultaneously. In this way, time alignment at the acquisition level of the point cloud data and the image data can be realized; and since the photoelectric detection module and the image detection module are coupled into the same radar detection system, the laser emission module 1 and the image detection module 4 can be triggered at the same time, and it is guaranteed that the laser and the image detect the same object at the same moment. Thus, structured point cloud data and image data with timestamps or a time-series order can be obtained, thereby realizing time synchronization of the point cloud data and the image data.
  • Step 702 Receive a first signal based on the photoelectric detection module, and receive a second signal based on the image detection module.
  • the laser emitting module sends a laser signal to the area to be detected.
  • the signal reflected by the object in the area to be detected is an echo signal.
  • the echo signal is irradiated on the beam splitting module, and the beam splitting module splits the echo signal into the first signal and the second signal.
  • the spectroscopic module sends the first signal to the photoelectric detection module, and the photoelectric detection module is used to receive the first signal.
  • the spectroscopic module sends the second signal to the image detection module, and the image detection module is used to receive the second signal.
  • Step 703 Perform fusion of point cloud data and image data based on the first signal and the second signal.
  • the image fusion laser radar detection system performs fusion of point cloud data and image data based on the first signal and the second signal.
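  • a high-level sketch of steps 701-703 is given below, assuming hypothetical driver objects for the laser emission, photoelectric detection and image detection modules; none of these interfaces are defined by the disclosure, and the stub class exists only to make the example runnable.

```python
class _StubModule:
    """Minimal stand-in for a hardware driver; for illustration only."""
    def __init__(self, payload=None):
        self.payload = payload
    def trigger(self):
        pass                      # a real driver would start emission/exposure here
    def read_column(self):
        return self.payload       # a real driver would return one detector column

def detect_and_fuse(laser_tx, photo_rx, image_rx, n_columns):
    """One acquisition cycle of the method in FIG. 7 (steps 701-703)."""
    frame = []
    for col in range(n_columns):
        laser_tx.trigger()                      # step 701: trigger laser emission ...
        image_rx.trigger()                      # ... and image detection simultaneously
        first_signal = photo_rx.read_column()   # step 702: laser echo -> point cloud column
        second_signal = image_rx.read_column()  # step 702: visible light -> image column
        frame.append({"scan": col,              # step 703: columns pair up by construction,
                      "points": first_signal,   # so fusion needs no extra registration
                      "pixels": second_signal})
    return frame

frame = detect_and_fuse(_StubModule(), _StubModule([1.5, 1.6]), _StubModule([128, 130]), n_columns=2)
print(frame[0])
```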
  • the image fusion laser radar detection method provided by the embodiment of the present disclosure is based on a radar detection system in which the photoelectric detection module and the image detection module are coupled into the same system; it eliminates the cumbersome position conversion relationship, allows the time synchronization accuracy to reach the microsecond level, strictly realizes pixel-level spatial synchronization and time synchronization fusion, and completes target recognition and fusion of images and point clouds without additional computing power.
  • Fig. 8 is a schematic comparison between the time synchronization method in the prior art and the synchronization method provided by the embodiment of the present disclosure. It can be seen from Fig. 8 that the prior-art time synchronization method is as follows: the hardware triggers the radar transmitting module and the image detection module separately; the laser analog front end and the image analog front end then obtain point cloud data and image data respectively, which pass through the laser signal processing module and the image signal processing module respectively to obtain point cloud data and image data with timestamps. These data are then transmitted to a host computer through a router or switch and processed by perception and fusion algorithms to fuse the point cloud and the image.
  • in contrast, the synchronization method provided by the embodiments of the present disclosure is as follows: an external clock source triggers the radar emission module and the image detection module simultaneously, achieving time alignment at the data acquisition level; at the same time, the design and machining precision of the image fusion laser radar detection system ensure that the measured values of the point cloud data and the image data are in the same coordinate system and that the laser and the image measure the same object at the same moment. In this way, the point cloud data and image data obtained by the laser analog front end and the image analog front end pass through the corresponding signal processing modules to yield structured point cloud and image data with timestamps or a time-series order, thereby realizing time synchronization of the point cloud and the image.
  • the time synchronization method provided by the embodiments of the present disclosure has a simple process, is easy to operate, and has high precision.
  • FIG. 9 is a schematic diagram of the effect of the image fusion laser provided by the embodiment of the present disclosure.
  • FIG. 9 shows the result of the image fusion laser radar detection system fusing the point cloud data and the image data based on the first signal and the second signal. As shown in FIG. 9, the point cloud data is in practice a coloured mosaic-style image (not shown in colour in the figure), in which different colours represent different distances.
  • the mosaic-style pattern in the fusion result is the point cloud data fused into the image data. It can be seen from the fusion result that the point cloud data and the image data correspond one-to-one, i.e. the spatial synchronization effect is good; comparing the fused image with the image data, neither blurring nor misalignment appears, showing that object edges coincide exactly with the boundaries detected in the point cloud, which indicates that the fusion of point cloud data and image data performed by this image fusion laser radar detection method is excellent and the time synchronization effect is good. A toy rendering of this kind of overlay is sketched below.
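  • the "coloured mosaic" overlay described above can be imitated in a few lines: map each depth value to a colour and blend it over the corresponding image pixels; the linear red-to-blue colormap, the 50 m range limit and the blending factor below are arbitrary choices for illustration.

```python
import numpy as np

def overlay_depth(image_rgb, depth, alpha=0.6, max_range=50.0):
    """Blend a per-pixel depth map into an RGB image as a red-to-blue mosaic.

    image_rgb: (H, W, 3) float array in [0, 1].
    depth:     (H, W) array of ranges in metres; NaN where there is no return.
    """
    norm = np.clip(depth / max_range, 0.0, 1.0)
    color = np.stack([1.0 - norm, np.zeros_like(norm), norm], axis=-1)  # near=red, far=blue
    valid = ~np.isnan(depth)
    out = image_rgb.copy()
    out[valid] = (1 - alpha) * image_rgb[valid] + alpha * color[valid]
    return out

img = np.ones((2, 2, 3)) * 0.5
dep = np.array([[5.0, np.nan], [20.0, 45.0]])
print(overlay_depth(img, dep))
```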
  • FIG. 10 is a schematic structural diagram of the electronic device provided by an embodiment of the present disclosure.
  • the electronic device includes a processor and a memory; the processor, by calling the program or instructions stored in the memory, executes the steps of the image fusion laser radar detection method of the above embodiments and therefore has the beneficial effects of the above embodiments, which will not be repeated here.
  • an electronic device may be set to include at least one processor 101 , at least one memory 102 and at least one communication interface 103 .
  • Various components in the electronic device are coupled together through the bus system 104 .
  • the communication interface 103 is used for information transmission with external devices.
  • the bus system 104 is used to realize connection and communication between these components.
  • the bus system 104 also includes a power bus, a control bus and a status signal bus.
  • the various buses are labeled as bus system 104 in FIG. 10 .
  • the memory 102 in this embodiment may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memories.
  • in some implementations, the memory 102 stores the following elements: executable units or data structures, or subsets or extended sets thereof, an operating system and application programs.
  • the processor 101 executes the steps of the embodiments of the image fusion laser radar detection method provided by the embodiment of the present disclosure by calling the program or instruction stored in the memory 102 .
  • the image fusion laser radar detection method provided by the embodiment of the present disclosure may be applied to the processor 101 or implemented by the processor 101 .
  • the processor 101 may be an integrated circuit chip, which has a signal processing capability. In the implementation process, each step of the above method can be completed by an integrated logic circuit of hardware in the processor 101 or instructions in the form of software.
  • the above-mentioned processor 101 may be a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the steps of the image fusion laser radar detection method provided by the embodiments of the present disclosure may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software units in the decoding processor.
  • the software unit may be located in a storage medium mature in the field, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory or register.
  • the storage medium is located in the memory 102; the processor 101 reads the information in the memory 102 and completes the steps of the method in combination with its hardware.
  • the electronic device may also include one physical component, or multiple physical components, according to the instructions generated by the processor 101 when executing the image fusion laser radar detection method provided by the embodiment of the present application. Different physical components can be set inside the electronic device, or outside the electronic device, such as a cloud server. Each physical component cooperates with the processor 101 and the memory 102 to implement the functions of the electronic device in this embodiment.
  • the embodiments of the present disclosure may also be computer program products, which include computer program instructions that, when run by a processor, cause the processor to execute the image fusion laser radar detection method provided by the embodiments of the present disclosure.
  • the computer program product can be written in any combination of one or more programming languages to carry the program code for performing the operations of the embodiments of the present disclosure; the programming languages include object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server to execute.
  • the embodiments of the present disclosure may also be a computer-readable storage medium on which computer program instructions are stored; when the computer program instructions are run by a processor, they cause the processor to execute the image fusion laser radar detection method provided by the embodiments of the present disclosure.
  • the computer readable storage medium may employ any combination of one or more readable media.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • a readable storage medium may include, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
  • more specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more conductors, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • the image fusion laser radar detection system and method provided in the present disclosure couples the photoelectric detection module and the image detection module into the same system, which can avoid the cumbersome position conversion relationship.
  • the laser emission module and the image detection module are triggered to work at the same time, which realizes time alignment at the acquisition level of the point cloud data and the image data; and because the photoelectric detection module and the image detection module are coupled into the same radar detection system, it is easy to trigger the laser emission module and the image detection module simultaneously and to guarantee that the laser and the image detect the same object at the same moment.
  • It solves the problem that in the existing laser radar system it is difficult to ensure that different types of sensors collect the same information at the same time due to the different collection periods of various sensors.
  • the time synchronization accuracy can reach the microsecond level, so that the target recognition and fusion of images and point clouds can be completed without adding additional computing power. It has strong industrial applicability.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An image fusion laser radar detection system, including a laser emission module (1), a spectroscopic module (2), a photoelectric detection module (3) and an image detection module (4), the photoelectric detection module (3) and the image detection module (4) being arranged on opposite sides of the spectroscopic module (2). The laser emission module (1) is used to emit a laser signal to an area to be detected; the spectroscopic module (2) is used to split an echo signal reflected by an object in the area to be detected into a first signal and a second signal. The image fusion laser radar detection system and method couple the photoelectric detection module (3) and the image detection module (4) into the same radar detection system, which eliminates the cumbersome position conversion relationship; the time synchronization accuracy can reach the microsecond level, and target recognition and fusion of images and point clouds can therefore be completed without additional computing power.

Description

Image fusion laser radar detection system and method
The present disclosure claims priority to the Chinese patent application with application number 202210040879.7, entitled "Image fusion laser radar detection system and method", filed with the Chinese Patent Office on January 14, 2022, the entire contents of which are incorporated into the present disclosure by reference.
Technical Field
The present disclosure relates to the technical field of lidar, and in particular to an image fusion laser radar detection system and method.
Background
In the current field of autonomous driving, the lateral resolution of the laser detection components in a lidar system is low, while an image sensor, although it has a high lateral resolution for acquiring two-dimensional images, is not capable of direct three-dimensional imaging. In the prior art, the lidar point cloud data and the image data are usually acquired separately and then fused; however, fusion methods based on image-processing algorithms place high demands on the density of the lidar point cloud data, and the algorithms are complicated.
A drawback of existing fusion of lidar point cloud data and image data is that the camera and the lidar are two separate devices whose spatial positions do not coincide, so fusing the point cloud data with the image data requires a cumbersome and complicated conversion of their positional relationship. Moreover, it is difficult to synchronize the image data and the point cloud data in time. Time synchronization means that a unified external clock source provides the same reference time to every sensor, and each sensor then stamps the different types of data it collects with timestamps derived from its calibrated local time, so that the timestamps of all sensors are synchronized. At present, most sensor systems on autonomous vehicles support time synchronization methods that carry Global Positioning System (GPS) timestamps. However, problems remain with time synchronization; for example, because the various sensors have different acquisition periods, it is difficult to guarantee that different types of sensors capture the same information at the same moment.
Summary
(1) Technical problem to be solved
The technical problem to be solved by the present disclosure is that, in existing lidar systems, the differing acquisition periods of the various sensors make it difficult to guarantee that different types of sensors capture the same information at the same moment.
(2) Technical solution
In order to solve the above technical problem, embodiments of the present disclosure provide an image fusion laser radar detection system and method.
In a first aspect, the present disclosure provides an image fusion laser radar detection system, the system including a laser emission module, a spectroscopic module, a photoelectric detection module and an image detection module, the photoelectric detection module and the image detection module being arranged on opposite sides of the spectroscopic module;
the laser emission module is used to emit a laser signal to an area to be detected;
the spectroscopic module is used to split an echo signal reflected by an object in the area to be detected into a first signal and a second signal;
the photoelectric detection module is used to receive the first signal so as to determine point cloud data;
the image detection module is used to receive the second signal so as to determine image data;
the spectroscopic module is used to send the first signal to the photoelectric detection module and to send the second signal, from the same position on the spectroscopic module, to the image detection module;
wherein the laser emission module and the image detection module are triggered to work at the same time.
Optionally, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal.
Optionally, the difference between the distance from the photoelectric detection module to the transmissive-reflective surface of the spectroscopic module and the distance from the image detection module to the transmissive-reflective surface of the spectroscopic module is equal to or less than 5 mm.
Optionally, the spectroscopic module is used to transmit the first signal and reflect the second signal; or, the spectroscopic module is used to transmit the second signal and reflect the first signal.
Optionally, the photoelectric detection module is arranged on the side corresponding to the first signal, and the image detection module is arranged on the side corresponding to the second signal.
Optionally, the echo signal includes a laser signal and a visible light signal reflected by the object in the area to be detected;
the first signal is the laser signal; the second signal is the visible light signal.
Optionally, the laser emission module is used to emit a line laser signal to the area to be detected; the photoelectric detection module and the image detection module are both line-array detectors.
Optionally, the system further includes a focusing lens group, the focusing lens group being arranged in front of the spectroscopic module and used to direct the focused echo signal onto the spectroscopic module.
Optionally, the spectroscopic module includes a dichroic mirror; the included angle R between the dichroic mirror and the optical axis of the echo signal satisfies:
40° ≤ R ≤ 50°.
In a second aspect, an embodiment of the present disclosure further provides an image fusion laser radar detection method, the method including:
simultaneously triggering the laser emission module and the image detection module;
receiving a first signal with the photoelectric detection module;
receiving a second signal with the image detection module;
performing fusion of point cloud data and image data based on the first signal and the second signal.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor, by invoking the program or instructions stored in the memory, executes some or all of the steps of the various implementations of the image fusion laser radar detection method provided by the second aspect of the present disclosure.
In a fourth aspect, an embodiment of the present disclosure further provides a computer storage medium, wherein the computer storage medium can store a program which, when executed, can implement some or all of the steps of the various implementations of the image fusion laser radar detection method provided by the second aspect of the present disclosure.
(3) Beneficial effects
Compared with the prior art, the above technical solutions provided by the embodiments of the present disclosure have the following advantages:
The image fusion laser radar detection system and method provided by the embodiments of the present disclosure couple the photoelectric detection module and the image detection module into the same radar detection system, which eliminates the cumbersome position conversion relationship; the time synchronization accuracy can reach the microsecond level, and target recognition and fusion of images and point clouds can therefore be completed without additional computing power.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
In order to explain the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below; obviously, a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of an image fusion laser radar detection system provided by an embodiment of the present disclosure;
FIG. 2 is a schematic structural diagram of the image fusion laser radar detection system shown in FIG. 1 from another viewing angle;
FIG. 3 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of the image fusion laser radar detection system shown in FIG. 3 from another viewing angle;
FIG. 5 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure;
FIG. 6 is the spectral transmittance curve of a 45-degree visible-light dichroic mirror;
FIG. 7 illustrates a detection method provided by an embodiment of the present disclosure;
FIG. 8 is a schematic comparison between a time synchronization method in the prior art and the synchronization method provided by an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of the effect of image fusion with laser provided by an embodiment of the present disclosure;
FIG. 10 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure.
Detailed Description
In order to make the objectives, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely below. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. Based on the embodiments of the present disclosure, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present disclosure.
FIG. 1 is a schematic structural diagram of an image fusion laser radar detection system provided by an embodiment of the present disclosure, and FIG. 2 is a schematic structural diagram of the image fusion laser radar detection system shown in FIG. 1 from another viewing angle. As shown in FIG. 1 and FIG. 2, the system includes a laser emission module 1, a spectroscopic module 2, a photoelectric detection module 3 and an image detection module 4, the photoelectric detection module 3 and the image detection module 4 being arranged on opposite sides of the spectroscopic module 2. The direction from the spectroscopic module 2 toward the photoelectric detection module 3 is defined as the x direction, the direction along which the optical axis of the echo signal extends is defined as the z direction, and the direction perpendicular to both the x and z directions is defined as the y direction. The laser emission module 1 is used to emit a laser signal to the area to be detected; the signal reflected by an object in the area to be detected is the echo signal, and the spectroscopic module 2 is used to split the echo signal reflected by the object in the area to be detected into a first signal and a second signal. The spectroscopic module 2 sends the first signal to the photoelectric detection module 3, and the photoelectric detection module 3 is used to receive the first signal so as to determine point cloud data. The spectroscopic module 2 sends the second signal to the image detection module 4, and the image detection module 4 is used to receive the second signal so as to determine image data. Furthermore, the spectroscopic module 2 sends the first signal to the photoelectric detection module 3 and sends the second signal, from the same position on the spectroscopic module 2, to the image detection module 4. The laser emission module 1 and the image detection module 4 are triggered to work at the same time.
The image fusion laser radar detection system provided by the embodiment of the present disclosure couples the photoelectric detection module and the image detection module into the same system, which eliminates the cumbersome position conversion relationship; the time synchronization accuracy can reach the microsecond level, and target recognition and fusion of images and point clouds can therefore be completed without additional computing power. The laser emission module and the image detection module are triggered to work at the same time, which realizes time alignment at the acquisition level of the point cloud data and the image data; and because the photoelectric detection module and the image detection module are coupled into the same radar detection system, it is easy to trigger the laser emission module and the image detection module simultaneously and to guarantee that the laser and the image detect the same object at the same moment. In this way, structured point cloud data and image data with timestamps or a time-series order can be obtained, thereby realizing time synchronization of the point cloud data and the image data.
Optionally, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal. The structure with which the spectroscopic module receives the echo signal is planar; the photoelectric detection module and the image detection module are arranged on the two sides of the plane in which the spectroscopic module receives the echo signal and are mirror-symmetric about that plane. As shown in FIG. 1, in the xz plane, the photoelectric detection module 3 and the image detection module 4 are arranged mirror-symmetrically about the plane in which the spectroscopic module 2 receives the echo signal.
Optionally, the difference between the distance from the photoelectric detection module 3 to the transmissive-reflective surface of the spectroscopic module 2 and the distance from the image detection module 4 to the transmissive-reflective surface of the spectroscopic module 2 is equal to or less than 5 mm; that is, the distance from any position on the transmissive-reflective surface of the spectroscopic module 2 to the photoelectric detection module 3 differs from the distance from the same position to the image detection module 4 by no more than 5 mm.
The image fusion laser radar detection system provided by the embodiments of the present disclosure can realize spatial synchronization of the image detection module and the photoelectric detection module. Spatial synchronization refers to converting measured values based on the coordinate systems of different detection modules into the same coordinate system; for example, to fuse radar point cloud data and image data it is necessary to establish accurate coordinate transformation relationships among the three-dimensional world coordinate system, the radar coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system. In the system provided by the embodiments of the present disclosure, through the design of the optical system, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal; the optical path design guarantees the spatial mirror-symmetry relationship between the photoelectric detection module and the image detection module, so the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be merged into the same coordinate system simply by changing the sign of one axis, which greatly reduces the complexity of fusing radar point cloud data with image data. Moreover, through machining precision, the positional accuracy of the photoelectric detection module and the image detection module can be guaranteed to within ±0.1 mm. The image fusion laser radar detection system of the present disclosure therefore already merges the coordinate system of the photoelectric detection module and the coordinate system of the image detection module into the same coordinate system in hardware, so spatial synchronization is achieved without complex coordinate conversion.
Optionally, as shown in FIG. 1, the spectroscopic module 2 is used to transmit the second signal and reflect the first signal. The second signal transmitted by the spectroscopic module 2 is received by the image detection module 4, and the first signal reflected by the spectroscopic module 2 is received by the photoelectric detection module 3. The photoelectric detection module is arranged on the side corresponding to the first signal, and the image detection module is arranged on the side corresponding to the second signal. This makes it convenient for each detection module to receive its corresponding signal and makes the structural design of the whole radar detection system simpler.
Optionally, the echo signal includes a laser signal and a visible light signal reflected by the object in the area to be detected. The first signal is the laser signal, and the second signal is the visible light signal. The image detection module receives the visible light signal and determines the visible light signal as image data. The photoelectric detection module receives the laser signal and determines the laser signal as point cloud data.
FIG. 3 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure, and FIG. 4 is a schematic structural diagram of the image fusion laser radar detection system shown in FIG. 3 from another viewing angle. Optionally, as shown in FIG. 3 and FIG. 4, the system includes a laser emission module 1, a spectroscopic module 2, a photoelectric detection module 3 and an image detection module 4, the photoelectric detection module 3 and the image detection module 4 being arranged on opposite sides of the spectroscopic module 2. The direction from the spectroscopic module 2 toward the photoelectric detection module 3 is defined as the x direction, the direction along which the optical axis of the echo signal extends is defined as the z direction, and the direction perpendicular to both the x and z directions is defined as the y direction. The laser emission module 1 is used to emit a laser signal to the area to be detected; the signal reflected by an object in the area to be detected is the echo signal, and the spectroscopic module 2 is used to split the echo signal reflected by the object in the area to be detected into a first signal and a second signal. The spectroscopic module 2 is used to transmit the first signal and reflect the second signal.
Specifically, the spectroscopic module 2 sends the first signal to the photoelectric detection module 3, and the photoelectric detection module 3 is used to receive the first signal so as to determine point cloud data. The spectroscopic module 2 sends the second signal to the image detection module 4, and the image detection module 4 is used to receive the second signal so as to determine image data. Furthermore, the spectroscopic module 2 sends the first signal to the photoelectric detection module 3 and sends the second signal, from the same position on the spectroscopic module 2, to the image detection module 4. The laser emission module 1 and the image detection module 4 are triggered to work at the same time.
The image fusion laser radar detection system provided by the embodiment of the present disclosure eliminates the cumbersome position conversion relationship, the time synchronization accuracy can reach the microsecond level, pixel-level spatial synchronization and time synchronization fusion are strictly realized, and target recognition and fusion of images and point clouds can be completed without additional computing power.
The laser emission module 1 and the image detection module 4 are triggered to work at the same time, which realizes time alignment at the acquisition level of the point cloud data and the image data; meanwhile, because the photoelectric detection module and the image detection module are coupled into the same radar detection system, the laser emission module 1 and the image detection module 4 can be triggered simultaneously, and it is guaranteed that the laser and the image detect the same object at the same moment. In this way, structured point cloud data and image data with timestamps or a time-series order can be obtained, thereby realizing time synchronization of the point cloud data and the image data.
Optionally, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal. The structure with which the spectroscopic module receives the echo signal is planar; the photoelectric detection module and the image detection module are arranged on the two sides of the plane in which the spectroscopic module receives the echo signal and are mirror-symmetric about that plane. As shown in FIG. 3, in the xz plane, the photoelectric detection module 3 and the image detection module 4 are arranged mirror-symmetrically about the plane in which the spectroscopic module 2 receives the echo signal.
Optionally, the difference between the distance from the photoelectric detection module 3 to the transmissive-reflective surface of the spectroscopic module 2 and the distance from the image detection module 4 to the transmissive-reflective surface of the spectroscopic module 2 is equal to or less than 5 mm; that is, the distance from any position on the transmissive-reflective surface of the spectroscopic module 2 to the photoelectric detection module 3 differs from the distance from the same position to the image detection module 4 by no more than 5 mm.
The image fusion laser radar detection system provided by the embodiments of the present disclosure can realize spatial synchronization of the image detection module and the photoelectric detection module. Spatial synchronization refers to converting measured values based on the coordinate systems of different detection modules into the same coordinate system; for example, to fuse radar point cloud data and image data it is necessary to establish accurate coordinate transformation relationships among the three-dimensional world coordinate system, the radar coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system. In the system provided by the embodiments of the present disclosure, through the design of the optical system, the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal; the optical path design guarantees the spatial mirror-symmetry relationship between the photoelectric detection module and the image detection module, so the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be merged into the same coordinate system simply by changing the sign of one axis, which greatly reduces the complexity of fusing radar point cloud data with image data. Moreover, through machining precision, the positional accuracy of the photoelectric detection module and the image detection module can be guaranteed to within ±0.1 mm. The image fusion laser radar detection system of the present disclosure therefore already merges the coordinate system of the photoelectric detection module and the coordinate system of the image detection module into the same coordinate system in hardware, so spatial synchronization is achieved without complex coordinate conversion.
Optionally, the echo signal includes a laser signal and a visible light signal reflected by the object in the area to be detected. The first signal is the laser signal, and the second signal is the visible light signal. The image detection module receives the visible light signal and determines the visible light signal as image data. The photoelectric detection module receives the laser signal and determines the laser signal as point cloud data.
Optionally, the laser emission module is used to emit a line laser signal to the area to be detected; the photoelectric detection module and the image detection module are both line-array detectors.
Optionally, the laser signal emitted by the laser emission module is beam-shaped to form a line laser signal. The laser echo signal reflected by the object in the area to be detected is the corresponding line laser signal. The spectroscopic module is used to split the echo signal, which includes the line laser signal reflected by the object in the area to be detected, into a first signal and a second signal, and the first signal and the second signal are received by the photoelectric detection module and the image detection module respectively. The point cloud data is obtained by the photoelectric detection module receiving the first signal column by column, and the image data is obtained by the image detection module receiving the second signal column by column; each column of point cloud data corresponds one-to-one with the corresponding column of image data in the scanning sequence. Synchronous scanning of the laser and the visible light is thereby realized in hardware, so that the acquired point cloud data and image data can be fused with a simple, easy-to-operate and highly accurate process.
Optionally, the image fusion laser radar detection system further includes a focusing lens group 5; as shown in FIG. 1 to FIG. 4, the focusing lens group 5 is arranged in front of the spectroscopic module 2 and is used to direct the focused echo signal onto the spectroscopic module 2. After the laser emission module 1 emits a laser signal to the area to be detected, the object in the area to be detected reflects the laser signal as an echo signal, and the echo signal is focused by the focusing lens group 5 and then directed onto the spectroscopic module 2. The focusing lens group 5 includes a plurality of focusing lenses, and the number of focusing lenses is set according to the optical design of the actual image fusion laser radar detection system, which is not limited in the present disclosure.
FIG. 5 is another schematic structural diagram of the image fusion laser radar detection system provided by an embodiment of the present disclosure. Optionally, as shown in FIG. 5, the echo signal is focused by the focusing lens group 5 and then directed onto the spectroscopic module 2, and the spectroscopic module 2 splits the echo signal into a first signal and a second signal, wherein the first signal is sent to the photoelectric detection module 3 and the second signal is sent to the image detection module 4. The photoelectric detection module 3 and the image detection module 4 are arranged mirror-symmetrically about the plane in which the spectroscopic module 2 receives the echo signal. In this way, spatial synchronization of the coordinate system of the photoelectric detection module and the coordinate system of the image detection module can be achieved without complex coordinate conversion, and the two coordinate systems can easily be merged into the same coordinate system.
Optionally, the spectroscopic module includes a dichroic mirror; as shown in FIG. 1, the included angle R between the dichroic mirror and the optical axis of the echo signal satisfies: 40° ≤ R ≤ 50°.
Optionally, the included angle R between the dichroic mirror and the optical axis of the echo signal is 45°; when this angle is 45°, it is more favourable for laying out the positions of the devices in the entire image fusion laser radar detection system. For example, the dichroic mirror may be a 45° visible-light cold mirror. As shown in FIG. 6, which is the spectral transmittance curve of a 45-degree visible-light dichroic mirror, the 45-degree visible-light dichroic mirror reflects visible light in the 400 nm-600 nm band and transmits laser light in the 800 nm-1300 nm band. It therefore meets the needs of image fusion with laser detection, namely reflecting visible light while transmitting the laser.
Optionally, the laser emission module and the photoelectric detection module are arranged in parallel along the optical axis of the echo signal.
Optionally, the image fusion laser radar detection system includes, for example, a scanning module; the scanning module includes a rotating mirror and a motor, and the motor can control the rotating mirror to rotate and scan the laser signal. The scanning module is arranged on the side of the laser emission module close to the object in the area to be detected.
Optionally, the laser emission module includes a laser and a collimating mirror; the laser is used to emit a laser signal, and the collimating mirror collimates the laser signal into a line beam and directs it onto the rotating mirror, where it is reflected by the rotating mirror and, combined with the rotation of the mirror, scans the area to be detected.
Optionally, the photoelectric detection module may be, for example, an avalanche photodiode detector, a silicon photomultiplier detector, a single-photon avalanche diode detector, a photodiode detector, or the like.
Optionally, the image detection module may be, for example, a charge-coupled device (Charge-coupled Device, CCD) or a complementary metal-oxide-semiconductor (Complementary Metal-Oxide-Semiconductor, CMOS) sensor.
An embodiment of the present disclosure further provides an image fusion laser radar detection method, which is executed by the image fusion laser radar detection system provided by the embodiments of the present disclosure. As shown in FIG. 7, which illustrates a detection method provided by an embodiment of the present disclosure, the method includes:
Step 701: simultaneously trigger the laser emission module and the image detection module.
In the image fusion laser radar detection system, the laser emission module and the image detection module are triggered simultaneously. This realizes time alignment at the acquisition level of the point cloud data and the image data; and because the photoelectric detection module and the image detection module are coupled into the same radar detection system, the laser emission module 1 and the image detection module 4 can be triggered at the same time, and it is guaranteed that the laser and the image detect the same object at the same moment. In this way, structured point cloud data and image data with timestamps or a time-series order can be obtained, thereby realizing time synchronization of the point cloud data and the image data.
Step 702: receive a first signal with the photoelectric detection module, and receive a second signal with the image detection module.
The laser emission module emits a laser signal to the area to be detected; the signal reflected by an object in the area to be detected is the echo signal, which is directed onto the spectroscopic module, and the spectroscopic module splits the echo signal into a first signal and a second signal. The spectroscopic module sends the first signal to the photoelectric detection module, which is used to receive the first signal. The spectroscopic module sends the second signal to the image detection module, which is used to receive the second signal.
Step 703: perform fusion of point cloud data and image data based on the first signal and the second signal.
The image fusion laser radar detection system performs fusion of the point cloud data and the image data based on the first signal and the second signal.
The image fusion laser radar detection method provided by the embodiment of the present disclosure is based on a radar detection system in which the photoelectric detection module and the image detection module are coupled into the same system; it eliminates the cumbersome position conversion relationship, the time synchronization accuracy can reach the microsecond level, pixel-level spatial synchronization and time synchronization fusion are strictly realized, and target recognition and fusion of images and point clouds can be completed without additional computing power.
FIG. 8 is a schematic comparison between a time synchronization method in the prior art and the synchronization method provided by an embodiment of the present disclosure. It can be seen from FIG. 8 that the prior-art time synchronization method is as follows: the hardware triggers the radar transmitting module and the image detection module separately; the laser analog front end and the image analog front end then obtain point cloud data and image data respectively, which pass through the laser signal processing module and the image signal processing module respectively to obtain point cloud data and image data with timestamps. These data are then transmitted to a host computer through a router or switch and processed by perception and fusion algorithms to fuse the point cloud and the image. In contrast, the synchronization method provided by the embodiments of the present disclosure is as follows: an external clock source triggers the radar emission module and the image detection module simultaneously, achieving time alignment at the data acquisition level; meanwhile, the design and machining precision of the image fusion laser radar detection system ensure that the measured values of the point cloud data and the image data are in the same coordinate system and that the laser and the image measure the same object at the same moment. In this way, the point cloud data and image data obtained by the laser analog front end and the image analog front end pass through the corresponding signal processing modules to yield structured point cloud and image data with timestamps or a time-series order, thereby realizing time synchronization of the point cloud and the image. The time synchronization method provided by the embodiments of the present disclosure has a simple process, is easy to operate, and has high accuracy.
FIG. 9 is a schematic diagram of the effect of image fusion with laser provided by an embodiment of the present disclosure. It shows the result of the image fusion laser radar detection system fusing the point cloud data and the image data based on the first signal and the second signal. As shown in FIG. 9, the point cloud data is in practice a coloured mosaic-style image (not shown in colour in the figure), in which different colours represent different distances. The mosaic-style pattern in the fusion result is the point cloud data fused into the image data. It can be seen from the fusion result that the point cloud data and the image data correspond one-to-one, i.e. the spatial synchronization effect is good. Comparing the fused image with the image data, neither blurring nor misalignment appears, showing that the object edges coincide exactly with the boundaries detected in the point cloud without any offset, which indicates that the fusion of point cloud data and image data performed by this image fusion laser radar detection method is excellent and the time synchronization effect is good.
An embodiment of the present disclosure further provides an electronic device. FIG. 10 is a schematic structural diagram of an electronic device provided by an embodiment of the present disclosure. As shown in FIG. 10, the electronic device includes a processor and a memory; the processor, by invoking the program or instructions stored in the memory, executes the steps of the image fusion laser radar detection method of the above embodiments and therefore has the beneficial effects of the above embodiments, which will not be repeated here.
As shown in FIG. 10, the electronic device may be configured to include at least one processor 101, at least one memory 102 and at least one communication interface 103. The components of the electronic device are coupled together through a bus system 104. The communication interface 103 is used for information transmission with external devices. It can be understood that the bus system 104 is used to realize connection and communication between these components. In addition to a data bus, the bus system 104 also includes a power bus, a control bus and a status signal bus. For clarity of description, the various buses are all labelled as the bus system 104 in FIG. 10.
It can be understood that the memory 102 in this embodiment may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory. In some implementations, the memory 102 stores the following elements: executable units or data structures, or subsets or extended sets thereof, an operating system and application programs. In the embodiments of the present disclosure, the processor 101, by invoking the program or instructions stored in the memory 102, executes the steps of the embodiments of the image fusion laser radar detection method provided by the embodiments of the present disclosure.
The image fusion laser radar detection method provided by the embodiments of the present disclosure may be applied to the processor 101 or implemented by the processor 101. The processor 101 may be an integrated circuit chip with signal processing capability. In the implementation process, each step of the above method may be completed by an integrated logic circuit of hardware in the processor 101 or by instructions in the form of software. The above-mentioned processor 101 may be a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the image fusion laser radar detection method provided by the embodiments of the present disclosure may be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and software units in a decoding processor. The software unit may be located in a storage medium mature in the field, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory or register. The storage medium is located in the memory 102; the processor 101 reads the information in the memory 102 and completes the steps of the method in combination with its hardware. The electronic device may further include one or more physical components to act on the instructions generated by the processor 101 when executing the image fusion laser radar detection method provided by the embodiments of the present application. Different physical components may be arranged inside the electronic device or outside the electronic device, for example on a cloud server. Each physical component cooperates with the processor 101 and the memory 102 to realize the functions of the electronic device in this embodiment.
In addition to the above method and device, the embodiments of the present disclosure may also be a computer program product, which includes computer program instructions that, when run by a processor, cause the processor to execute the image fusion laser radar detection method provided by the embodiments of the present disclosure.
The computer program product may be written in any combination of one or more programming languages to carry the program code for performing the operations of the embodiments of the present disclosure, the programming languages including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server.
Furthermore, the embodiments of the present disclosure may also be a computer-readable storage medium on which computer program instructions are stored; when the computer program instructions are run by a processor, they cause the processor to execute the image fusion laser radar detection method provided by the embodiments of the present disclosure. The computer-readable storage medium may use any combination of one or more readable media. A readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more conductors, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
It should be noted that, in this document, relational terms such as "first" and "second" are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes that element.
The above are only specific embodiments of the present disclosure, enabling those skilled in the art to understand or implement the present disclosure. Various modifications to these embodiments will be obvious to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present disclosure. Therefore, the present disclosure is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Industrial Applicability
The image fusion laser radar detection system and method provided by the present disclosure couple the photoelectric detection module and the image detection module into the same system, which eliminates the cumbersome position conversion relationship. The laser emission module and the image detection module are triggered to work at the same time, which realizes time alignment at the acquisition level of the point cloud data and the image data; and because the photoelectric detection module and the image detection module are coupled into the same radar detection system, it is easy to trigger the laser emission module and the image detection module simultaneously and to guarantee that the laser and the image detect the same object at the same moment. This solves the problem that, in existing lidar systems, the differing acquisition periods of the various sensors make it difficult to guarantee that different types of sensors capture the same information at the same moment. Meanwhile, the time synchronization accuracy can reach the microsecond level, and target recognition and fusion of images and point clouds can be completed without additional computing power. The system and method therefore have strong industrial applicability.

Claims (10)

  1. An image fusion laser radar detection system, characterized by comprising a laser emission module, a spectroscopic module, a photoelectric detection module and an image detection module, the photoelectric detection module and the image detection module being arranged on opposite sides of the spectroscopic module;
    the laser emission module is used to emit a laser signal to an area to be detected;
    the spectroscopic module is used to split an echo signal reflected by an object in the area to be detected into a first signal and a second signal;
    the photoelectric detection module is used to receive the first signal so as to determine point cloud data;
    the image detection module is used to receive the second signal so as to determine image data;
    the spectroscopic module is used to send the first signal to the photoelectric detection module and to send the second signal, from the same position on the spectroscopic module, to the image detection module;
    wherein the laser emission module and the image detection module are triggered to work at the same time.
  2. The system according to claim 1, characterized in that the photoelectric detection module and the image detection module are arranged mirror-symmetrically about the plane in which the spectroscopic module receives the echo signal.
  3. The system according to claim 2, characterized in that the difference between the distance from the photoelectric detection module to the transmissive-reflective surface of the spectroscopic module and the distance from the image detection module to the transmissive-reflective surface of the spectroscopic module is equal to or less than 5 mm.
  4. The system according to claim 1, characterized in that the spectroscopic module is used to transmit the first signal and reflect the second signal; or, the spectroscopic module is used to transmit the second signal and reflect the first signal.
  5. The system according to claim 4, characterized in that the photoelectric detection module is arranged on the side corresponding to the first signal, and the image detection module is arranged on the side corresponding to the second signal.
  6. The system according to claim 5, characterized in that the echo signal includes a laser signal and a visible light signal reflected by the object in the area to be detected;
    the first signal is the laser signal; the second signal is the visible light signal.
  7. The system according to claim 1, characterized in that the laser emission module is used to emit a line laser signal to the area to be detected; the photoelectric detection module and the image detection module are both line-array detectors.
  8. The system according to claim 1, characterized by further comprising a focusing lens group, the focusing lens group being arranged in front of the spectroscopic module and used to direct the focused echo signal onto the spectroscopic module.
  9. The system according to claim 1, characterized in that the spectroscopic module includes a dichroic mirror; the included angle R between the dichroic mirror and the optical axis of the echo signal satisfies:
    40° ≤ R ≤ 50°.
  10. A detection method for the system according to any one of claims 1 to 9, characterized by comprising:
    simultaneously triggering the laser emission module and the image detection module;
    receiving a first signal with the photoelectric detection module;
    receiving a second signal with the image detection module;
    performing fusion of point cloud data and image data based on the first signal and the second signal.
PCT/CN2022/073845 2022-01-14 2022-01-25 Image fusion laser radar detection system and method WO2023133939A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210040879.7A CN114063111A (zh) 2022-01-14 2022-01-14 Image fusion laser radar detection system and method
CN202210040879.7 2022-01-14

Publications (1)

Publication Number Publication Date
WO2023133939A1 true WO2023133939A1 (zh) 2023-07-20

Family

ID=80230843

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/073845 WO2023133939A1 (zh) 2022-01-14 2022-01-25 Image fusion laser radar detection system and method

Country Status (2)

Country Link
CN (1) CN114063111A (zh)
WO (1) WO2023133939A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115902818A (zh) * 2023-02-21 2023-04-04 探维科技(北京)有限公司 Image fusion laser signal detection system, radar system and detection method thereof

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103471715A (zh) * 2013-09-02 2013-12-25 北京航空航天大学 Common-optical-path combined light-field spectral imaging method and device
CN104483676A (zh) * 2014-12-04 2015-04-01 北京理工大学 3D/2D non-scanning lidar composite imaging device
CN107219533A (zh) * 2017-08-04 2017-09-29 清华大学 Detection system with fusion of lidar point cloud and image
CN110764070A (zh) * 2019-10-29 2020-02-07 北科天绘(合肥)激光技术有限公司 Real-time data fusion processing method and device based on three-dimensional data and image data
CN111289995A (zh) * 2018-11-21 2020-06-16 北京万集科技股份有限公司 Three-dimensional lidar device and system
CN112912766A (zh) * 2021-02-02 2021-06-04 华为技术有限公司 Detection device, control method, fusion detection system and terminal
CN113447947A (zh) * 2020-03-26 2021-09-28 杭州海康威视数字技术股份有限公司 Device and method for generating scene data
US20210341616A1 (en) * 2018-10-12 2021-11-04 Sony Semiconductor Solutions Corporation Sensor fusion system, synchronization control apparatus, and synchronization control method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103034109A (zh) * 2012-12-13 2013-04-10 浙江科技学院 Dual-CCD mirror-image overlap adjustment and single-exposure coaxial digital holographic recording device
CN107045208A (zh) * 2017-05-02 2017-08-15 浙江红谱科技有限公司 Optical image fusion system and method for infrared and night-vision devices
CN107024763B (zh) * 2017-05-16 2023-12-05 广东欧谱曼迪科技有限公司 Dual-channel structured-light digital phase-contrast microscopic imaging system and implementation method thereof
CN109429001B (zh) * 2017-08-25 2021-06-29 杭州海康威视数字技术股份有限公司 Image acquisition method and device, electronic apparatus and computer-readable storage medium

Also Published As

Publication number Publication date
CN114063111A (zh) 2022-02-18

Similar Documents

Publication Publication Date Title
US11860280B2 (en) Integrated illumination and detection for LIDAR based 3-D imaging
CN110596721B (zh) 双重共享tdc电路的飞行时间距离测量系统及测量方法
US20230145537A1 (en) Optical system for collecting distance information within a field
WO2022262332A1 (zh) 一种距离测量装置与相机融合系统的标定方法及装置
US11328446B2 (en) Combining light-field data with active depth data for depth map generation
US7800739B2 (en) Distance measuring method and distance measuring element for detecting the spatial dimension of a target
CN110596722A (zh) 直方图可调的飞行时间距离测量系统及测量方法
CN110596724B (zh) 动态直方图绘制飞行时间距离测量方法及测量系统
EP4276495A1 (en) Detection device, control method, fusion detection system, and terminal
CN102947726A (zh) 扫描3d成像仪
CN110596723A (zh) 动态直方图绘制飞行时间距离测量方法及测量系统
US11269065B2 (en) Muilti-detector with interleaved photodetector arrays and analog readout circuits for lidar receiver
US20210041539A1 (en) Method and apparatus for determining malfunction, and sensor system
JP2002039716A (ja) 距離画像入力装置
CN116507984A (zh) 点云滤波技术
WO2023103198A1 (zh) 一种计算测距系统相对外参的方法、装置和存储介质
WO2023133939A1 (zh) 图像融合激光的雷达探测系统及方法
EP3602110A1 (en) Time of flight sensor
JP2022115975A (ja) 電磁波検出装置および情報取得システム
WO2020221188A1 (zh) 基于同步ToF离散点云的3D成像装置及电子设备
CN110456371B (zh) 一种激光雷达系统及相关测量方法
CN105091797B (zh) 一种单ccd的强度关联自准直仪
US20220413149A1 (en) Operating method and control unit for a lidar system, lidar system, and device
CN112596068A (zh) 一种采集器、距离测量系统及电子设备
JP2006322856A (ja) 距離計測装置、距離計測方法および距離計測プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22919585

Country of ref document: EP

Kind code of ref document: A1