CN115993608A - Three-dimensional imaging system, three-dimensional imaging method, vehicle-mounted system and vehicle


Info

Publication number: CN115993608A
Application number: CN202111223187.8A
Authority: CN (China)
Prior art keywords: data, infrared laser, infrared, dimensional imaging, sensor
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventors: 刘鹏, 边宁, 徐欣奕, 蔡东, 王绍峰, 郭枫
Current Assignee: Dongfeng Motor Corp; Uisee Technologies Beijing Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Dongfeng Motor Corp; Uisee Technologies Beijing Co Ltd
Application filed by Dongfeng Motor Corp and Uisee Technologies Beijing Co Ltd
Priority to CN202111223187.8A
Publication of CN115993608A

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present disclosure relates to a three-dimensional imaging system, a three-dimensional imaging method, a vehicle-mounted system, and a vehicle. The three-dimensional imaging system includes an infrared laser emission system, an infrared sensor, a lidar sensor, and a data fusion system. The infrared laser emission system projects infrared laser light onto a target area; the infrared sensor collects the infrared laser light reflected by objects in the target area to determine image data; the lidar sensor collects the infrared laser light reflected by objects in the target area to determine point cloud data of the target object; and the data fusion system matches the image data with the point cloud data. In the three-dimensional imaging system provided by the embodiments of the present disclosure, the infrared sensor and the lidar sensor share the infrared laser emission system, so one light source is saved, the overall structure of the three-dimensional imaging system is simplified, and its hardware cost is reduced.

Description

Three-dimensional imaging system, three-dimensional imaging method, vehicle-mounted system and vehicle
Technical Field
The disclosure relates to the technical field of imaging detection, in particular to a three-dimensional imaging system, a three-dimensional imaging method, a vehicle-mounted system and a vehicle.
Background
Lidar (Light Detection and Ranging, LiDAR) is an active detection system that uses a laser as its light source; it can accurately and rapidly acquire information about a target area, collect point cloud data, and generate an accurate digital three-dimensional model.
In the related art, to obtain richer information about a target area, a lidar may be combined with various other sensors to implement multi-dimensional detection of the same target area. However, when the lidar is combined with other detection systems, the system structure becomes complicated and the overall cost is high.
Disclosure of Invention
In order to solve or at least partially solve the above technical problems, the present disclosure provides a three-dimensional imaging system, a three-dimensional imaging method, a vehicle-mounted system, and a vehicle, which are simple in structure and low in cost.
The present disclosure provides a three-dimensional imaging system comprising an infrared laser emission system, an infrared sensor, a lidar sensor, and a data fusion system;
the infrared laser emission system is used for projecting infrared laser rays to a target area;
the infrared sensor is used for collecting infrared laser rays reflected by objects in a target area so as to determine image data;
the laser radar sensor is used for collecting infrared laser rays reflected by an object in a target area so as to determine point cloud data of the target object;
the data fusion system is used for matching the image data and the point cloud data.
In some embodiments, the infrared laser emission system comprises a vertical cavity surface emitting laser;
the vertical cavity surface emitting laser is used for projecting the infrared laser light in a pulse form according to a preset time interval.
In some embodiments, the infrared sensor comprises a gated image sensor;
the gated image sensor is configured to determine an on time and an off time of the pixelated gate array based on the pulse emission time to collect infrared laser light reflected by objects in the target area.
In some embodiments, the lidar sensor comprises a silicon photomultiplier;
the silicon photomultiplier is configured to be triggered based on the pulse emission time and to collect infrared laser light reflected by an object in a target area in synchronization with the infrared sensor.
In some embodiments, the imaging system further comprises a support member, the infrared laser emitting system, the infrared sensor, and the lidar sensor each being secured by the support member;
the infrared sensor and the laser radar sensor are respectively arranged on two opposite sides of the infrared laser emission system.
In some embodiments, the infrared laser light has a wavelength of 808nm, 905nm, or 940nm.
In some embodiments, the data fusion system is specifically configured to:
converting pixels of the image data to a reference frame to obtain first spatial data;
converting the point cloud data to a reference system to obtain second space data;
and matching the second space data with the first space data.
In some embodiments, the data fusion system is specifically further configured to:
assigning the second space data to the pixels corresponding to the successfully matched first space data;
for the first space data which are not successfully matched, calculating auxiliary space data through a model algorithm based on the second space data which are successfully matched; the auxiliary space data corresponds to first space data which are not successfully matched;
and matching the first space data which are not successfully matched with the auxiliary space data until all the matching is completed.
In some embodiments, the point cloud data comprises spatial location data; the second spatial data includes corresponding spatial position data.
In some embodiments, the point cloud data further comprises surface characteristic data, the surface characteristic data comprising laser reflection intensity data; the second spatial data includes corresponding laser reflection intensity data.
The present disclosure also provides a three-dimensional imaging method comprising:
the infrared laser emission system projects infrared laser rays to a target area;
the infrared sensor collects infrared laser light reflected by an object in a target area to determine image data;
the laser radar sensor collects infrared laser rays reflected by an object in a target area to determine point cloud data of the target object;
the data fusion system matches the image data with the point cloud data.
In some embodiments, the data fusion system matches the image data and the point cloud data, comprising:
converting pixels of the image data to a reference frame to obtain first spatial data;
converting the point cloud data to a reference system to obtain second space data;
and matching the second space data with the first space data.
In some embodiments, the data fusion system matches the image data and the point cloud data, further comprising:
assigning the second space data to the pixels corresponding to the successfully matched first space data;
for the first space data which are not successfully matched, calculating auxiliary space data through a model algorithm based on the second space data which are successfully matched; the auxiliary space data corresponds to first space data which are not successfully matched;
and matching the first space data which are not successfully matched with the auxiliary space data until all the matching is completed.
In some embodiments, the point cloud data comprises spatial location data; the second spatial data includes corresponding spatial position data.
In some embodiments, the point cloud data further comprises surface characteristic data, the surface characteristic data comprising laser reflection intensity data; the second spatial data includes corresponding laser reflection intensity data.
In some embodiments, when the infrared laser emission system projects the infrared laser light in pulses at preset time intervals, the method further includes:
the data fusion system determines image brightness based on the image data;
the infrared laser emission system dynamically adjusts a duration of a laser pulse based on the image brightness.
The present disclosure also provides an in-vehicle system including any one of the three-dimensional imaging systems described above.
The present disclosure also provides a vehicle comprising any of the above-described onboard systems.
Compared with the prior art, the technical scheme provided by the embodiment of the disclosure has the following advantages:
the three-dimensional imaging system, three-dimensional imaging method, vehicle-mounted system, and vehicle provided by the embodiments of the present disclosure include an infrared laser emission system, an infrared sensor, a lidar sensor, and a data fusion system. The infrared laser emission system projects infrared laser light onto the target area; the infrared sensor collects the infrared laser light reflected by objects in the target area to determine image data; the lidar sensor collects the infrared laser light reflected by objects in the target area to determine point cloud data of the target object; and the data fusion system matches the image data with the point cloud data. Because the infrared sensor and the lidar sensor share the same infrared laser emission system, the number of light sources in the three-dimensional imaging system is reduced, its structure is simplified, and its cost is lowered.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the solutions in the prior art, the drawings that are required for the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a three-dimensional imaging system according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of the working principle of a three-dimensional imaging system according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of another three-dimensional imaging system according to an embodiment of the present disclosure;
fig. 4 is a schematic flow chart of a three-dimensional imaging method according to an embodiment of the disclosure;
FIG. 5 is a schematic diagram of a refinement flow of S24 in the three-dimensional imaging method shown in FIG. 4;
fig. 6 is a schematic diagram of another refinement flow of S24 in the three-dimensional imaging method shown in fig. 4.
Reference numerals: 11. an infrared laser emission system; 12. an infrared sensor; 13. a lidar sensor; 14. a data fusion system; 15. a target object in the target area; 16. a support member.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, a further description of aspects of the present disclosure will be provided below. It should be noted that, without conflict, the embodiments of the present disclosure and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced otherwise than as described herein; it will be apparent that the embodiments in the specification are only some, but not all, embodiments of the disclosure.
The three-dimensional imaging system provided by the embodiments of the present disclosure can be applied to any scene with three-dimensional imaging requirements, including robots, unmanned aerial vehicles, and the like. When applied to an unmanned driving scene, it can assist in realizing functions such as positioning, road edge/drivable area detection, lane marking detection, obstacle detection, dynamic object tracking, and obstacle classification and identification, which are not limited herein.
The three-dimensional imaging system, the three-dimensional imaging method, the vehicle-mounted system and the vehicle provided by the embodiment of the disclosure are exemplarily described below with reference to fig. 1 to 6.
In some embodiments, fig. 1 is a schematic structural diagram of a three-dimensional imaging system according to an embodiment of the disclosure. Referring to fig. 1, the three-dimensional imaging system includes an infrared laser emission system 11, an infrared sensor 12, a lidar sensor 13, and a data fusion system 14; the infrared laser emission system 11 is used for projecting infrared laser rays to a target area; the infrared sensor 12 is used to collect infrared laser light reflected by an object in the target area to determine image data; the lidar sensor 13 is used to collect infrared laser light reflected by an object in the target area to determine point cloud data of the target object; the data fusion system 14 is used to match image data with point cloud data.
The target area is the detected area. Taking unmanned driving as an example, the target area may be an area in front of, behind, or around the vehicle, which is not limited herein. The target object is a detectable object in the detected area. Again taking unmanned driving as an example, the target object may be a road sign, an indicator light, a pedestrian, a vehicle, a road edge, or any other dynamic or static object that may be present in the driving environment, which is not limited herein.
The infrared laser emission system 11 projects infrared laser light onto the target area, and the infrared laser light projected into the target area is reflected by objects in the target area. Correspondingly, the infrared sensor 12 collects the infrared laser light reflected by objects in the target area to determine image data, and may send the image data to the data fusion system 14; the lidar sensor 13 collects the infrared laser light reflected by objects in the target area to determine point cloud data of the target object, and may send the point cloud data to the data fusion system 14. The data fusion system 14 fuses the image data with the point cloud data to determine information such as the spatial positions and surface characteristics of target objects in the target area, thereby implementing multi-dimensional detection of the target objects.
Illustratively, fig. 2 is a schematic diagram of the working principle of a three-dimensional imaging system according to an embodiment of the disclosure. Referring to fig. 2, the three-dimensional imaging system may operate as follows: the infrared laser emission system 11 projects infrared laser light onto the target area; after reaching the target area, the light is reflected by objects in the target area and collected by the infrared sensor 12 and the lidar sensor 13. Correspondingly, the infrared sensor 12 determines image data including the target object 15 from the collected infrared laser light, and the lidar sensor 13 determines point cloud data including the target object 15 from the collected infrared laser light. The data fusion system 14 matches the image data with the point cloud data and outputs a three-dimensional image that includes the image of the target object together with data of several dimensions, such as spatial position and surface characteristics.
It will be appreciated that the target object 15 is shown in fig. 2 as a vehicle by way of example only; in other embodiments, the target object 15 may be another object, which is not limited herein.
The infrared sensor 12 and the lidar sensor 13 are both communicatively connected to the data fusion system 14; the connection may be a wired electrical connection or a wireless connection, which is not limited herein. A specific matching process of the image data and the point cloud data by the data fusion system 14 is illustrated hereinafter.
In the three-dimensional imaging system provided by the embodiments of the present disclosure, the infrared sensor 12 and the lidar sensor 13 share the same infrared laser emission system 11; that is, both use the infrared laser emission system as their light source. Compared with the related art, in which the infrared sensor 12 and the lidar sensor 13 are each provided with a separate light source, this reduces the number of light sources in the three-dimensional imaging system, simplifies its structure, and lowers its overall cost.
In addition, because three-dimensional imaging is performed with infrared laser light, the imaging effect is not affected by the intensity of visible light; the system can image well under low light, glare, or bad weather conditions, enabling real-time imaging in all-weather environments.
In some embodiments, infrared laser emission system 11 includes a vertical cavity surface emitting laser (Vertical Cavity Surface Emitting Laser, VCSEL); the vertical cavity surface emitting laser is used for projecting infrared laser light in the form of pulses at preset time intervals.
The vertical cavity surface emitting laser is based on gallium arsenide semiconductor material. Unlike other light sources such as the light emitting diode (Light Emitting Diode, LED) and the laser diode (Laser Diode, LD), it has the advantages of small volume, low power, suitability for mass production, ease of integration, and low price, so applying it in the three-dimensional imaging system further reduces the system's overall cost.
For example, the operating current of the VCSEL may be 3.5A, the operating voltage may be 30V, the power consumption may be 35W, and the wavelength of the corresponding infrared laser light may be 850nm. In other embodiments, the VCSEL may also operate with other parameters, which are not described in detail herein.
For example, the preset time interval may be 5ms, 8ms, 10ms, or another interval, and the duration of a single pulse may be 10ms, 15ms, 20ms, or another value; both may be set based on the requirements of the three-dimensional imaging system and are not limited herein.
The vertical cavity surface emitting laser may be connected to a timing device, and pulse control is implemented based on the timing device, so that the laser projects infrared laser light in pulse form at the preset time interval. The infrared sensor 12 and the lidar sensor 13 then collect the infrared laser light reflected by objects in the target area only at the preset times, from which a three-dimensional image of the target object is obtained. Pulse control based on the infrared laser emission system 11 thus helps the infrared sensor 12 and the lidar sensor 13 acquire synchronously in time, enabling synchronous fusion of the image data and the point cloud data.
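For illustration only, the following minimal Python sketch shows how such pulse-synchronized triggering could look; the emitter and sensor objects and their fire_pulse/capture methods are hypothetical stand-ins, not interfaces from this disclosure:

```python
import time

PULSE_INTERVAL_S = 0.008   # preset time interval, e.g. 8 ms (see above)
PULSE_DURATION_S = 0.010   # single-pulse duration, e.g. 10 ms (see above)

def acquisition_loop(emitter, infrared_sensor, lidar_sensor, n_frames):
    """Fire the shared VCSEL in pulses and trigger both sensors on each pulse,
    so image data and point cloud data are captured from the same illumination."""
    for _ in range(n_frames):
        t_emit = time.monotonic()
        emitter.fire_pulse(duration_s=PULSE_DURATION_S)  # hypothetical API
        # Both sensors are triggered from the same pulse emission time,
        # which is what allows hard synchronization of the two data streams.
        image = infrared_sensor.capture(trigger_time=t_emit)   # hypothetical API
        points = lidar_sensor.capture(trigger_time=t_emit)     # hypothetical API
        yield t_emit, image, points
        time.sleep(PULSE_INTERVAL_S)
```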
It will be appreciated that the present embodiment shows the infrared laser emitting system 11 as a vertical cavity surface emitting laser by way of example only. In other embodiments, the infrared laser emission system 11 may also employ other infrared laser emitters known to those skilled in the art that can meet the needs of a three-dimensional imaging system, and is not limited herein.
In some embodiments, the infrared sensor 12 comprises a gated image sensor (Gated Imaging Sensor, GIS); the gated image sensor is configured to determine an on time and an off time of the pixelated gate array based on the pulse emission time to collect infrared laser light reflected by the object in the target area.
The gated image sensor includes a pixelated gate array and a sensor array whose switching is controlled by the gate array. The gate array may include control switches arranged in rows and columns, and the sensor array may include sensors arranged in rows and columns, with each control switch controlling one corresponding sensor. By controlling the on time and off time of the pixelated gate array based on the pulse emission time, each sensor in the sensor array is switched according to the pulse emission time, thereby collecting the infrared laser light reflected by objects in the target area.
The gated image sensor and the infrared laser emission system 11 together form an active gated imaging system (Active Gated Imaging System, AGIS), in which the gated image sensor serves as the imaging unit and the infrared laser emission system 11 (e.g., the VCSEL above) serves as the near-infrared pulsed light source. By gating the gated image sensor in linkage with the infrared laser emission system 11, the on time and off time of the sensor's pixelated gate array can be kept within the pulse emission time of the infrared laser emission system 11, so that the imaging unit collects the reflected infrared laser light within the pulse emission time, realizing synchronous control of the gated image sensor and the infrared laser emission system 11.
The working principle of the active gated imaging system is as follows: the infrared laser emission system 11 projects infrared laser light in pulse form toward the target area to illuminate it; after a single pulse of infrared laser light reaches the target area, it is reflected by objects in the target area (including the target object 15), and the gated image sensor collects the reflected infrared laser light, forming image data.
Specifically, the target area is illuminated with pulsed infrared laser light, and each pulse activates a short integration (exposure) of the gated image sensor (referred to simply as the "camera" below). The camera's short integration is delayed relative to the adjustable laser pulse, and the camera's integration time is also adjustable. The delay time and the integration time are determined by the distance and depth of the target area (the target scene) detected by the three-dimensional imaging system: the longer the distance and the deeper the depth, the longer the times. In this way, only photons reflected from objects in the target area are collected and imaged. A shorter integration time combined with a shorter laser pulse duration yields images of higher resolution and quality. In the active gated imaging system, the pulses of the infrared sensor and the infrared laser emission system can be gated synchronously, producing a clear image. Furthermore, by slicing the scene depth, the captured images can also be used to generate distance maps.
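To make the stated relation between distance, depth, and timing concrete, here is a small sketch based on the round-trip time of light (our reading of the gating scheme, not a formula given in the disclosure):

```python
C = 299_792_458.0  # speed of light, m/s

def gate_timing(range_near_m: float, depth_m: float):
    """Compute gate-on delay and integration (gate-open) time for a range slice.

    range_near_m: distance from the sensor to the near edge of the slice.
    depth_m:      depth of the slice (how far into the scene the gate stays open).
    Both the delay and the integration time grow with distance and depth,
    matching the qualitative statement in the text.
    """
    delay_s = 2.0 * range_near_m / C   # round trip to the near edge of the slice
    integration_s = 2.0 * depth_m / C  # round trip across the slice depth
    return delay_s, integration_s

# Example: a slice from 50 m to 80 m
delay, integration = gate_timing(50.0, 30.0)
print(f"gate delay {delay*1e9:.0f} ns, integration {integration*1e9:.0f} ns")
```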
Illustratively, the gated image sensor may be a gated complementary metal-oxide-semiconductor (Complementary Metal-Oxide-Semiconductor, CMOS) image sensor, which enables tight control of the on time and off time of the pixelated gate array, thereby ensuring higher image quality. Other types of gated image sensors known to those skilled in the art may also be used, which is not limited herein.
Further, through an internal algorithm, the gated image sensor can judge from the imaging effect of the current frame whether the image brightness and the size of the area illuminated by the pulsed laser meet the three-dimensional imaging requirements. It can then dynamically adjust the gating time and control the duration of the pulses emitted by the infrared laser emission system, adjusting the size of the illuminated area and the image brightness so that the next frame images better.
In other embodiments, infrared sensor 12 may employ other types of infrared sensors other than gated image sensors that would be known to those skilled in the art to be capable of meeting the needs of three-dimensional imaging systems, and is not limited in this regard.
In some embodiments, the lidar sensor 13 comprises a silicon photomultiplier (Silicon Photomultiplier, siPM); the silicon photomultiplier is used to collect infrared laser light reflected by an object in the target area in synchronization with the infrared sensor, triggered based on the pulse emission time.
With this arrangement, the silicon photomultiplier and the infrared sensor 12 are triggered by the same infrared laser emission system 11, specifically at the pulse emission time, and synchronously collect the infrared laser light reflected by objects in the target area. This achieves hard synchronization between infrared imaging and lidar ranging point clouds, making synchronous fusion accurate.
The silicon photomultiplier may specifically be a multi-pixel photon counter (Multi-Pixel Photon Counter, MPPC), whose basic unit consists of an avalanche photodiode (Avalanche Photodiode, APD) operating in Geiger mode in series with a quenching resistor. Each such unit is referred to as a pixel, and the MPPC is an array of many such pixels.
When one pixel in the MPPC receives an incident photon, it outputs a pulse of a certain amplitude; when multiple pixels receive incident photons, each outputs a pulse of corresponding amplitude, and the pulses from all pixels are superimposed at the MPPC's common output. Multi-photon detection can thus be realized over a high dynamic range with large gain. The gain of an MPPC can reach 10⁵ to 10⁶, compared with a single APD, which helps obtain longer-range information in a shorter time, while its detection bandwidth remains comparable to that of an APD.
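The pulse superposition at the common output can be illustrated with a deliberately simplified model (ours; it ignores photon detection efficiency, crosstalk, and pixel saturation in a real MPPC):

```python
def mppc_output_amplitude(fired_pixels: int, single_pixel_amplitude: float = 1.0) -> float:
    """Toy model of an MPPC common output: pulses from all fired pixels
    superimpose, so the output amplitude is proportional to the number of
    pixels that detected a photon. This is what gives the device its
    multi-photon (photon-number) resolution."""
    return fired_pixels * single_pixel_amplitude

# Two targets with different surface reflectivity return different photon
# counts, hence distinguishable output amplitudes:
print(mppc_output_amplitude(12), mppc_output_amplitude(47))
```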
Further, the unique photon-resolving capability of the MPPC can be used to identify objects with different surface reflectivities, so that surface characteristics can be resolved while distance is measured; for example, objects with different reflectivities can be distinguished.
Furthermore, the MPPC package is easier to assemble into arrays, and compared with an APD it has a larger photosensitive area, making it more suitable for solid-state lidar.
From the above performance analysis, the MPPC is well suited to pulse ranging and can serve as the lidar sensor in autonomous driving (e.g., unmanned driving) scenes.
In other embodiments, the lidar sensor 13 may also employ other types of pulse excitation-based lidar sensors known to those skilled in the art, and may be configured based on the requirements of a three-dimensional imaging system, without limitation.
In some embodiments, with continued reference to fig. 2, the three-dimensional imaging system may further include a lens assembly. The lens assembly may be disposed in front of at least one of the infrared laser emission system 11, the infrared sensor 12, and the lidar sensor 13, in the light transmission path, to condition the infrared laser light emitted by the infrared laser emission system 11 (the "emitted light" in fig. 2) and/or the infrared laser light reflected by objects in the target area (e.g., the "reflected light" collected by the infrared sensor 12 or the lidar sensor 13 in fig. 2). This increases the intensity of the infrared laser light, improving the signal-to-noise ratio and ensuring higher detection accuracy.
Illustratively, the lens assembly may include a lens or a lens group, which may be set based on the requirements of the three-dimensional imaging system, without limitation.
In some embodiments, with continued reference to fig. 2, the three-dimensional imaging system may further include a micro-control unit (Micro Controller Unit, MCU), shown as MCU in fig. 2, to which the infrared laser emission system 11, the infrared sensor 12, and the lidar sensor 13 are connected. The MCU can control the infrared laser emission system 11 to project infrared laser light in pulses at the preset time interval; it can also keep the clocks of the infrared sensor 12 and the lidar sensor 13 synchronized so that they collect the laser-pulse reflections from objects in the target area at the same time, which facilitates hard fusion of the corresponding image data and point cloud data.
In some embodiments, with continued reference to FIG. 2, the three-dimensional imaging system includes a time measurement circuit coupled to the lidar sensor 13, which can determine a measurement distance, namely the distance between the target object and the lens of the lidar sensor 13, using the time-of-flight method based on the time difference between the emitted light and the reflected light.
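The time-of-flight relation itself is the standard round-trip one; a minimal sketch (ours, not code from the disclosure) of the distance computed by such a time measurement circuit:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Distance from the round-trip time of a laser pulse:
    the light covers the sensor-to-target path twice."""
    return C * (t_receive_s - t_emit_s) / 2.0

# A pulse received 400 ns after emission corresponds to about 60 m:
print(tof_distance(0.0, 400e-9))  # ~59.96 m
```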
In other embodiments, the relative distance between any objects may be determined based on other preset or detected position information, and may be set based on actual requirements, which is not limited herein.
In some embodiments, fig. 3 is a schematic structural diagram of another three-dimensional imaging system provided in an embodiment of the disclosure, showing the relative spatial positions of the infrared laser emission system, the infrared sensor, and the lidar sensor. Referring to fig. 3, the imaging system further includes a support member 16, and the infrared laser emission system 11, the infrared sensor 12, and the lidar sensor 13 are each fixed by the support member 16; the infrared sensor 12 and the lidar sensor 13 are disposed on two opposite sides of the infrared laser emission system 11.
Wherein the support member 16 is used for supporting and fixing the infrared laser emitting system 11, the infrared sensor 12 and the lidar sensor 13.
Illustratively, the support member 16 may be a hollow housing with openings; the infrared laser emission system 11, the infrared sensor 12, and the lidar sensor 13 are all fixed inside the hollow housing and are each exposed through a corresponding opening so that the infrared laser light can exit and enter.
In other embodiments, the support member 16 may also use other structural components with support fixing functions known to those skilled in the art, which are not limited herein.
Illustratively, as shown in FIG. 3, the support member 16 is generally a rectangular parallelepiped structure; in other embodiments, the support member 16 may be provided in other three-dimensional shapes, and may be fixed by a support suitable for the spatial arrangement of the infrared laser emission system 11, the infrared sensor 12, and the lidar sensor 13, which is not limited herein.
With the infrared laser emission system 11 placed centrally and the infrared sensor 12 and lidar sensor 13 on its two opposite sides, the path length from an object in the target area to the infrared sensor 12 is substantially equal to the path length from that object to the lidar sensor 13. After adding the path from the infrared laser emission system 11 to the object, the corresponding emission and reception paths of the infrared laser light are also substantially equal, which helps the infrared sensor 12 and the lidar sensor 13 collect the reflected infrared laser light synchronously and thus improves detection accuracy.
Illustratively, as shown in fig. 3, the infrared laser emission system 11 is disposed at a central position of the support member 16, and the infrared sensor 12 and the lidar sensor 13 are horizontally disposed at left and right sides of the infrared laser emission system 11, respectively.
In other embodiments, the infrared sensor 12 and the lidar sensor 13 may be disposed on the upper side and the lower side of the infrared laser emission system 11, respectively, or the infrared sensor 12 and the lidar sensor 13 may be disposed on opposite sides of other orientations of the infrared laser emission system 11, which is not limited herein.
In some embodiments, the infrared laser light has a wavelength of 808nm, 905nm, or 940nm.
By the arrangement, the infrared laser light can be identified by the infrared sensor 12 and the laser radar sensor 13, and the infrared sensor 12 and the laser radar sensor 13 can share the same light source.
Meanwhile, the infrared laser light cannot be perceived by human eyes but can be recognized by the infrared sensor 12 and the lidar sensor 13. Because of this, when the infrared laser emission system 11 projects pulsed infrared laser light onto the target area, the infrared laser light reflected by objects in the target area does not affect human eyes, and glare is avoided.
Further, when the infrared laser light is near-infrared (Near Infrared, NIR) light, it can pass through media such as fog, rain, and snow and continue to propagate thanks to its relatively strong penetrating power. A three-dimensional imaging system using near-infrared light therefore images well in severe weather such as fog, rain, and snow, achieving real-time perception in all-weather environments.
The wavelength of the infrared laser light emitted by the infrared laser emission system 11 is matched to the wavelengths that the infrared sensor 12 and the lidar sensor 13 can detect, so that both sensors operate with high sensitivity and accuracy. Setting the infrared laser light to a short-wavelength infrared laser at 808nm, 905nm, or 940nm gives the infrared sensor 12 and the lidar sensor 13 high sensitivity and accuracy on the one hand; on the other hand, it keeps the power of the infrared laser emission system 11 low and allows silicon-based devices to be used instead of costly gallium-arsenide-based devices, balancing cost against detection performance and giving the three-dimensional imaging system high cost-effectiveness.
It should be noted that the above shows only single-point wavelengths by way of example; in other embodiments, the wavelength of the infrared laser light may also span a band, for example 0.75 μm to 1.1 μm. Other wavelengths or bands in which the infrared sensor and the lidar sensor remain highly sensitive and accurate may also be used, set according to the requirements of the three-dimensional imaging system, which is not limited herein.
In some embodiments, the data fusion system 14 is configured to match image data and point cloud data, and specifically includes: converting pixels of the image data to a reference frame to obtain first spatial data; converting the point cloud data into a reference system to obtain second space data; the second spatial data is matched with the first spatial data.
The conversion of the image data to the reference frame can be based on the calibration parameters of the infrared sensor 12, and the conversion of the point cloud data to the reference frame can be based on the calibration parameters of the lidar sensor 13. Through these conversions, both the image data and the point cloud data are expressed as spatial data in the same reference frame, enabling data matching.
Specifically, data fusion system 14 may determine first spatial data (e.g., a first spatial location) corresponding to a target pixel in the image data based on calibration parameters of infrared sensor 12; and, the data fusion system 14 may determine second spatial data (e.g., a second spatial position) corresponding to the target point in the point cloud data based on the calibration parameters of the lidar sensor 13. On this basis, the data fusion system 14 matches the second spatial data with the first spatial data, i.e. the matching of the image data and the point cloud data is achieved.
Illustratively, the reference frame may be a spatial coordinate system; correspondingly, the first spatial data and the second spatial data are the spatial coordinates obtained by converting the image pixels and the point cloud data into that coordinate system. The spatial coordinates may be expressed as rectangular coordinates, polar coordinates, or any other spatial coordinate form, which is not limited herein.
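For illustration, a minimal sketch of the two conversions under an assumed calibration (the matrices K, T_cam_to_ref, and T_lidar_to_ref are hypothetical calibration parameters; a pinhole model stands in for the infrared sensor):

```python
import numpy as np

def lidar_points_to_reference(points_lidar: np.ndarray, T_lidar_to_ref: np.ndarray) -> np.ndarray:
    """Second spatial data: rigid transform of Nx3 lidar points into the
    reference frame, with T_lidar_to_ref a 4x4 homogeneous transform."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    return (T_lidar_to_ref @ pts_h.T).T[:, :3]

def pixels_to_reference_rays(pixels: np.ndarray, K: np.ndarray, T_cam_to_ref: np.ndarray) -> np.ndarray:
    """First spatial data: back-project Nx2 pixels into unit rays in the
    reference frame (a 2D pixel alone fixes a direction, not a depth)."""
    pix_h = np.hstack([pixels, np.ones((len(pixels), 1))])
    rays_cam = (np.linalg.inv(K) @ pix_h.T).T          # rays in camera frame
    rays_ref = (T_cam_to_ref[:3, :3] @ rays_cam.T).T   # rotate into reference frame
    return rays_ref / np.linalg.norm(rays_ref, axis=1, keepdims=True)
```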
Further, the data fusion system 14 may take the target pixel whose first spatial position matches a second spatial position as the pixel to be assigned, and take the spatial position data of the target point corresponding to that second spatial position as the spatial position of the pixel to be assigned; the spatial position of each successfully matched pixel in the image data is thus determined in the spatial coordinate system in which the point cloud resides.
In other embodiments, the reference frame may also be used as a reference to determine the spatial location of each pixel in the image data.
In some embodiments, the data fusion system 14 is configured to match image data and point cloud data, and specifically further includes: assigning the second space data to the pixels corresponding to the successfully matched first space data; for the first space data which are not successfully matched, calculating auxiliary space data through a model algorithm based on the second space data which are successfully matched; the auxiliary spatial data corresponds to first spatial data which is not successfully matched; and matching the first space data which are not successfully matched with the auxiliary space data until all the matching is completed.
When the number of pixels in the image data is greater than the number of points in the point cloud data, the data fusion system 14 may first assign the second spatial data to the pixels corresponding to the successfully matched first spatial data, thereby fixing, in the reference frame, the spatial positions of those pixels. For the first spatial data not yet matched, interpolation can determine matching auxiliary spatial data: a model algorithm computes the auxiliary spatial data, the auxiliary spatial data is matched against the remaining first spatial data, and after a match succeeds the auxiliary spatial data is assigned to the pixel corresponding to that first spatial data. The process repeats until all matching and assignment are complete, so that the spatial position of every pixel in the image data is determined relative to the reference frame.
In combination with the above, when the first spatial data and the second spatial data are both spatial coordinates, they may be called the first spatial coordinates and the second spatial coordinates. The data fusion system 14 may first assign the second spatial coordinates to the pixels of the successfully matched first spatial coordinates; then, for the unmatched first spatial coordinates, it computes additional second spatial coordinates through a model algorithm based on the matched second spatial coordinates and matches them with the unmatched first spatial coordinates until all matching is complete.
In combination with the above, when the number of target pixels in the image data is greater than the number of target points in the point cloud data, the data fusion system 14 may insert a preset number of auxiliary target points between every two adjacent target points based on the successfully matched second spatial positions; the number of auxiliary target points plus the number of target points equals the number of target pixels, i.e., the number of auxiliary target points equals the number of pixels whose first spatial data was not successfully matched.
The data fusion system determines a third space position corresponding to the auxiliary target point based on calibration parameters of the laser radar sensor; the data fusion system determines a target pixel corresponding to a first space position matched with the third space position as an auxiliary pixel to be assigned; and determining the spatial position data of the auxiliary target point corresponding to the third spatial position as the spatial position of the auxiliary pixel to be assigned until all the pixels corresponding to the first spatial data are successfully assigned. Thereby, the spatial position of each pixel in the image data is determined with reference to the spatial coordinate system in which the point cloud is located.
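A minimal sketch of inserting auxiliary target points (ours; simple linear interpolation stands in for the unspecified model algorithm):

```python
import numpy as np

def insert_auxiliary_points(matched_points: np.ndarray, n_aux_between: int) -> np.ndarray:
    """Insert n_aux_between auxiliary points between every two adjacent matched
    lidar points by linear interpolation, so the densified point set can be
    matched against the (denser) pixel grid. matched_points is Nx3 and assumed
    ordered along the scan."""
    aux = []
    for p, q in zip(matched_points[:-1], matched_points[1:]):
        for k in range(1, n_aux_between + 1):
            t = k / (n_aux_between + 1)
            aux.append((1.0 - t) * p + t * q)  # point on the segment p->q
    return np.asarray(aux)

# Example: one auxiliary point between each adjacent pair
pts = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 12.0], [2.0, 0.0, 11.0]])
print(insert_auxiliary_points(pts, 1))  # midpoints at z = 11.0 and 11.5
```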
In some embodiments, the point cloud data includes spatial location data; the second spatial data includes corresponding spatial position data.
The point cloud data includes, but is not limited to, multidimensional data characterizing the point cloud, such as spatial location data, surface feature data (see below), or other characteristic data.
When the point cloud data includes spatial location data, it may be three-dimensional data, which may be represented as a set of a plurality of (X, Y, Z) coordinates. The spatial position data in the point cloud data can be converted into corresponding spatial position data in the reference system by performing coordinate conversion between the coordinate system where the point cloud is located and the reference system, and can be represented as a set of a plurality of (X2, Y2, Z2) coordinates, for example.
Correspondingly, the image data may be two-dimensional data, which may be represented as a set of a plurality of (X0, Y0) coordinates; the planar coordinates in the image data may be converted into corresponding spatial position data under the reference frame by coordinate conversion between the planar coordinate system corresponding to the image data and the reference frame, and may be represented as a set of a plurality of (X1, Y1, Z1) coordinates, for example.
Further, matching of the image data and the point cloud data is achieved by matching (X2, Y2, Z2) with (X1, Y1, Z1) under the reference frame.
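A minimal sketch of this matching step (ours; the disclosure does not fix a matching criterion, so nearest neighbor with a distance tolerance is assumed):

```python
import numpy as np

def match_spatial_data(first: np.ndarray, second: np.ndarray, tol: float = 0.05):
    """Match each first-spatial-data entry (X1, Y1, Z1), from image pixels,
    to the nearest second-spatial-data entry (X2, Y2, Z2), from the point
    cloud, accepting the match only within a distance tolerance (meters)."""
    matches = {}
    for i, p in enumerate(first):
        d = np.linalg.norm(second - p, axis=1)   # distances to all points
        j = int(np.argmin(d))
        if d[j] <= tol:
            matches[i] = j   # pixel i takes the spatial data of point j
    return matches
```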
In some embodiments, the point cloud data further includes surface property data; the surface characteristic data includes laser reflection intensity data; the second spatial data includes corresponding laser reflection intensity data.
The surface properties of different objects in the target area, or of different areas of the same object, may differ; for example, their reflectivities differ, so the intensity of the infrared laser light they reflect differs. Accordingly, the intensity of the reflected infrared laser light collected by the lidar sensor 13 also differs, and the point cloud data includes surface characteristic data corresponding to each piece of spatial position data; where the surface characteristic data differs, the surface characteristics at the corresponding positions differ.
Thus, the matching fusion of the point cloud data and the image data may further include: the surface property data is assigned to pixels that are successfully matched based on spatial location.
In other embodiments, when the point cloud data further includes data of other dimensions, the data of other dimensions may also be assigned to the pixels that are successfully matched correspondingly, which is not limited herein.
The three-dimensional imaging system provided by the embodiments of the present disclosure has at least the following beneficial effects:
First, the infrared sensor and the lidar sensor both use the infrared laser emission system as their light source. Compared with the related art, in which the two sensors are each provided with a separate light source, this reduces the number of light sources in the three-dimensional imaging system, simplifies its structure, and lowers its overall cost.
Second, near-infrared light with strong penetrating power is used. Compared with visible-light imaging in the related art, this gives good imaging in low-light and glare scenes as well as in severe weather such as fog, rain, and snow, enabling real-time perception in all-weather environments.
Third, the infrared sensor and the lidar sensor are triggered by the same infrared laser emission system and can synchronously collect the infrared laser light reflected by objects in the target area, achieving accurate synchronous fusion of the image data and the point cloud data.
Based on the same inventive concept, the embodiments of the present disclosure further provide a three-dimensional imaging method, which may be executed by any of the three-dimensional imaging systems provided in the foregoing embodiments and has the corresponding beneficial effects; points in common may be understood with reference to the foregoing and are not repeated below.
The three-dimensional imaging method provided by the embodiment of the present disclosure is exemplarily described below with reference to fig. 4 to 6.
In some embodiments, as shown in fig. 4, a flow chart of a three-dimensional imaging method according to an embodiment of the disclosure is provided. Referring to fig. 4, the three-dimensional imaging method may include the steps of:
s21, the infrared laser emission system projects infrared laser rays to the target area.
S22, the infrared sensor collects infrared laser light reflected by the object in the target area to determine image data.
S23, the laser radar sensor collects infrared laser rays reflected by the object in the target area to determine point cloud data of the target object.
S24, the data fusion system matches the image data with the point cloud data.
In the three-dimensional imaging method provided by the embodiments of the present disclosure, the infrared laser emission system projects infrared laser light onto a target area; the light is reflected by objects in the target area, and the reflected infrared laser light is collected by the infrared sensor and the lidar sensor. Correspondingly, the infrared sensor determines image data of the target area from the collected light, the lidar sensor determines point cloud data of the target object from the collected light, and the data fusion system matches the image data with the point cloud data. In this method, the infrared sensor and the lidar sensor both use the infrared laser emission system as their light source, saving one light source, simplifying the system structure, and reducing the overall cost; compared with controlling two light sources, the control of one light source is also saved, simplifying the method itself. In addition, because the infrared sensor and the lidar sensor are triggered synchronously by the same light source, they collect the reflected infrared laser light synchronously, which facilitates synchronous matching and fusion of the image data and point cloud data; the synchronization is simple and convenient to implement, highly consistent, and yields accurate fusion.
The data fusion process is exemplarily described below.
In some embodiments, as shown in fig. 5, a schematic diagram of a refinement of S24 in the three-dimensional imaging method shown in fig. 4 is shown. Referring to fig. 4 and 5, S24 may specifically include the following steps performed by the data fusion system:
s241, converting pixels of the image data into a reference system to obtain first space data.
Illustratively, the reference frame is a spatial coordinate system; the first spatial data includes a first spatial location. Thereby, coordinate conversion from the planar coordinate system of the image data to the reference system is achieved.
S242, converting the point cloud data into a reference system to obtain second space data.
The reference system is illustratively a spatial coordinate system, and the second spatial data includes a second spatial location. Thereby, coordinate conversion from the spatial coordinate system of the point cloud data to the reference system is achieved.
S243, the second space data is matched with the first space data.
The first space data and the second space data are matched, so that the matching of the point cloud data and the image data is realized.
It should be noted that, when the number of pixels of the image data is equal to the number of point clouds in the point cloud data, the number of the first space data and the number of the second space data are also correspondingly equal, and the two may be all matched with each other. When the number of pixels of the image data is greater than the number of point clouds of the point cloud data, auxiliary spatial data corresponding to the first spatial data that is not successfully matched may be determined based on the second spatial data that is successfully matched, so that the first spatial data can be all matched.
An exemplary description is provided below in connection with fig. 6.
In some embodiments, fig. 6 is another detailed flowchart of S24 in the three-dimensional imaging method shown in fig. 4. Referring to fig. 6 on the basis of fig. 5, in S24, after S243, the following steps may be further included:
s244, the second space data is assigned to the pixel corresponding to the successfully matched first space data.
After the first space data and the second space data are successfully matched, the second space data are assigned to the pixels corresponding to the successfully matched first space data.
S245, for the first space data which are not successfully matched, calculating auxiliary space data through a model algorithm based on the second space data which are successfully matched; the auxiliary spatial data corresponds to first spatial data for which the matching is unsuccessful.
When the number of pixels in the image data is greater than the number of points in the point cloud data, the amount of first spatial data correspondingly exceeds the amount of second spatial data, and the two cannot be matched one to one. In that case, for the first spatial data not successfully matched, auxiliary spatial data can be computed through a model algorithm based on the successfully matched second spatial data; for example, interpolation can insert a preset number of auxiliary spatial data between two adjacent pieces of second spatial data. The total amount of auxiliary spatial data equals the amount of first spatial data not yet matched, so that the matching of all first spatial data can then be completed.
S246, matching the first space data which are not successfully matched with the auxiliary space data until all the matching is completed.
And matching the auxiliary space data with the first space data which is not successfully matched until all the first space data are completely matched.
In any of the above embodiments, the point cloud data includes spatial position data, corresponding to three-dimensional spatial positions in the point cloud's own coordinate system; the second spatial data includes corresponding spatial position data, i.e., the same three-dimensional positions after coordinate conversion into the reference frame.
In some embodiments, the point cloud data further includes surface characteristic data, the surface characteristic data including laser reflection intensity data; the second spatial data includes corresponding laser reflection intensity data.
For example, the corresponding laser reflection intensity data may be identical to the surface characteristic data, or may be obtained by interpolating the surface characteristic data, which is not limited herein.
On the basis of the above embodiment, when the infrared laser emission system of the three-dimensional imaging system projects infrared laser light in a pulse form at preset time intervals, the three-dimensional imaging method may further include the steps of:
step one, the data fusion system determines the brightness of the image based on the image data.
And step two, dynamically adjusting the duration of the laser pulse based on the brightness of the image by the infrared laser emission system.
Specifically, the data fusion system determines the image brightness based on the image data and judges from the imaging effect of the current frame whether the brightness meets the three-dimensional imaging requirement. When it does not, for example when the image is too bright or too dark, the infrared laser emission system adaptively adjusts the duration of the laser pulse based on the current frame's brightness so that the next frame images better.
For example, when the image brightness is too bright, the duration of the laser pulse may be shortened; when the image brightness is too dark, the duration of the laser pulse can be prolonged; or may be adjusted in other ways known to those skilled in the art, and are not limited herein.
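A minimal sketch of this brightness feedback (ours; the thresholds, step size, and limits are illustrative assumptions):

```python
def adjust_pulse_duration(pulse_s: float, brightness: float,
                          target_lo: float = 0.35, target_hi: float = 0.65,
                          step: float = 0.1, min_s: float = 1e-3, max_s: float = 20e-3) -> float:
    """Lengthen the laser pulse when the frame is too dark and shorten it when
    too bright, keeping the duration inside hardware limits. brightness is the
    mean normalized image brightness in [0, 1]."""
    if brightness < target_lo:        # too dark: illuminate longer
        pulse_s *= (1.0 + step)
    elif brightness > target_hi:      # too bright: illuminate shorter
        pulse_s *= (1.0 - step)
    return min(max(pulse_s, min_s), max_s)
```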
An embodiment of the disclosure provides a vehicle-mounted system comprising any of the three-dimensional imaging systems of the above embodiments, and it can realize the corresponding beneficial effects.
In other embodiments, the vehicle-mounted system may further include other structural or functional components known to those skilled in the art, which are not described in detail herein.
An embodiment of the disclosure provides a vehicle comprising any of the vehicle-mounted systems of the above embodiments, and it can realize the corresponding beneficial effects.
In other embodiments, the vehicle may also include other systems known to those skilled in the art, such as a drive system, a cockpit system, and the like, which are not described in detail herein.
It should be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing describes merely specific embodiments of the disclosure, so as to enable one skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the disclosure is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A three-dimensional imaging system, characterized by comprising an infrared laser emission system, an infrared sensor, a laser radar sensor, and a data fusion system;
the infrared laser emission system is used for projecting infrared laser rays to a target area;
the infrared sensor is used for collecting infrared laser rays reflected by an object in the target area so as to determine image data;
the laser radar sensor is used for collecting infrared laser rays reflected by the object in the target area so as to determine point cloud data of the target object;
the data fusion system is used for matching the image data and the point cloud data.
2. The three-dimensional imaging system of claim 1, wherein the infrared laser emission system comprises a vertical cavity surface emitting laser;
the vertical cavity surface emitting laser is used for projecting the infrared laser light in a pulse form at preset time intervals.
3. The three-dimensional imaging system of claim 2, wherein the infrared sensor comprises a gated image sensor;
the gated image sensor is configured to determine an on time and an off time of the pixelated gate array based on the pulse emission time to collect infrared laser light reflected by objects in the target area.
4. The three-dimensional imaging system of claim 2, wherein the laser radar sensor comprises a silicon photomultiplier;
the silicon photomultiplier is configured to be triggered based on the pulse emission time and to collect, in synchronization with the infrared sensor, the infrared laser light reflected by the object in the target area.
5. A three-dimensional imaging method, characterized by comprising:
the infrared laser emission system projects infrared laser rays to a target area;
the infrared sensor collects infrared laser light reflected by an object in the target area to determine image data;
the laser radar sensor collects infrared laser light reflected by the object in the target area to determine point cloud data of the target object;
the data fusion system matches the image data with the point cloud data.
6. The three-dimensional imaging method of claim 5, wherein the data fusion system matches the image data and the point cloud data, comprising:
converting pixels of the image data to a reference frame to obtain first spatial data;
converting the point cloud data to the reference frame to obtain second spatial data;
and matching the second spatial data with the first spatial data.
7. The three-dimensional imaging method of claim 6, wherein the data fusion system matches the image data and the point cloud data, further comprising:
assigning the second spatial data to the pixels corresponding to the successfully matched first spatial data;
for the first spatial data that are not successfully matched, calculating auxiliary spatial data through a model algorithm based on the second spatial data that are successfully matched; the auxiliary spatial data correspond to the first spatial data that are not successfully matched;
and matching the first spatial data that are not successfully matched with the auxiliary spatial data until all matching is completed.
8. The three-dimensional imaging method of claim 5, wherein the infrared laser emission system projects the infrared laser light in pulses at predetermined time intervals, the method further comprising:
the data fusion system determines image brightness based on the image data;
the infrared laser emission system dynamically adjusts a duration of a laser pulse based on the image brightness.
9. An in-vehicle system comprising the three-dimensional imaging system of any of claims 1-4.
10. A vehicle comprising the in-vehicle system of claim 9.
CN202111223187.8A 2021-10-20 2021-10-20 Three-dimensional imaging system, three-dimensional imaging method, vehicle-mounted system and vehicle Pending CN115993608A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111223187.8A CN115993608A (en) 2021-10-20 2021-10-20 Three-dimensional imaging system, three-dimensional imaging method, vehicle-mounted system and vehicle

Publications (1)

Publication Number Publication Date
CN115993608A (en) 2023-04-21

Family

ID=85993030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111223187.8A Pending CN115993608A (en) 2021-10-20 2021-10-20 Three-dimensional imaging system, three-dimensional imaging method, vehicle-mounted system and vehicle

Country Status (1)

Country Link
CN (1) CN115993608A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination