CN220381291U - Laser radar system of common light path - Google Patents

Laser radar system of common light path

Info

Publication number
CN220381291U
Authority
CN
China
Prior art keywords
laser
light
light beam
unit
visible light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202321938309.6U
Other languages
Chinese (zh)
Inventor
张硕
毕勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Technical Institute of Physics and Chemistry of CAS
Original Assignee
Technical Institute of Physics and Chemistry of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Technical Institute of Physics and Chemistry of CAS filed Critical Technical Institute of Physics and Chemistry of CAS
Priority to CN202321938309.6U
Application granted
Publication of CN220381291U
Active legal status
Anticipated expiration

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An embodiment of the utility model discloses a laser radar system with a common light path. The system comprises a laser emitting module, which emits laser in response to a driving and modulating signal and, after shaping, irradiates the laser onto a target to be detected; a laser receiving module, which receives the target echo generated by the target to be detected to obtain a dual-band light beam; a beam splitting and direction adjusting module, which receives the dual-band light beam, splits it into a first light beam and a second light beam, and transmits them along a first light path and a second light path respectively; a visible light sensor module, which collects the first light beam along the first light path and processes it to obtain visible light data; a depth sensor module, which collects the second light beam along the second light path and processes it to obtain depth data; and a fusion module, which processes the visible light data and the depth data in parallel to obtain a pseudo-color three-dimensional image. The common-light-path, common-view-angle optical structure ensures that the imaging fields of view of the depth image and the visible light image are consistent and that the two images are synchronized in space and time.

Description

Laser radar system of common light path
Technical Field
The utility model relates to the technical field of laser radars, and more particularly to a laser radar system with a common optical path.
Background
The core device of an existing laser radar is a depth sensor, which mainly provides distance information. In practical applications, however, RGB visible light image information is usually also required, so a separate system is provided to acquire the visible light image: the visible light grayscale data and the depth data are each acquired by their own sensor to generate a visible light image and a depth image. Such a scheme is bulky, the fields of view of the two systems are inconsistent, the data acquisition speed suffers, and the imaging frame rate is greatly reduced.
Disclosure of Invention
The present utility model is directed to a laser radar system with a common optical path, which solves at least one of the problems of the prior art.
In order to achieve the above purpose, the utility model adopts the following technical scheme:
the first aspect of the utility model provides a laser radar system with a common optical path, the system comprising
The laser emission module is used for responding to the driving and modulating signals to emit laser and irradiating the laser to the target to be detected after shaping the laser;
the laser receiving module is used for receiving a target echo generated by the target to be detected to obtain a dual-band light beam;
the beam splitting and directing adjusting module is used for receiving the dual-band light beam, carrying out beam splitting treatment to obtain a first light beam and a second light beam, and transmitting the first light beam and the second light beam to a first light path and a second light path respectively;
the visible light sensor module is used for collecting the first light beam along the first light path and processing the first light beam to obtain visible light data;
the depth sensor module is used for collecting the second light beam along the second light path and processing the second light beam to obtain depth data;
and the fusion module is used for processing the visible light data and the depth data in parallel to obtain a pseudo-color three-dimensional image.
Optionally, the laser emitting module comprises a laser, a laser shaping unit, a laser driving and modulating unit and a temperature control unit; wherein
The laser driving and modulating unit is used for sending driving and modulating signals to the laser;
the laser is used for receiving the driving and modulating signals and emitting laser;
the temperature control unit is used for sending a temperature control signal to the laser and controlling the temperature;
the laser shaping unit is used for receiving the laser and shaping the laser into a beam required by laser illumination.
Optionally, the laser receiving module includes an imaging lens matching the visible light band sensor and the infrared light band sensor, and a receiving field of view of the imaging lens is larger than a field of view of the laser illumination.
Optionally, the beam splitting and directing adjusting module includes a beam splitting unit disposed on the optical path of the dual-band beam, and a beam directing adjusting unit disposed perpendicular to the beam splitting unit; wherein
The light splitting unit is used for carrying out light splitting treatment on the dual-band light beam to obtain a first light beam and a second light beam, and transmitting the first light beam to the visible light sensor module along the first light path;
the light beam direction adjusting unit is used for receiving the second light beam and transmitting the second light beam to the depth sensor module in a reflection mode along the second light path.
Optionally, the visible light sensor module includes a first filter unit, a visible light sensor, a visible light data acquisition unit, and a visible light data processing unit disposed along the first optical path; wherein
The first filtering unit is used for receiving the first light beam and selecting a light beam of a first wave band to pass through;
the visible light sensor is used for obtaining a visible light color image according to the light beam of the first wave band;
the visible light data acquisition unit is used for processing the visible light color image to obtain a first electric signal;
the visible light data processing unit is used for receiving the first electric signal and performing data processing to obtain the visible light data.
Optionally, the depth sensor module includes a second filter unit, a depth sensor, a depth data acquisition unit, and a depth data processing unit disposed along the second optical path; wherein
The second filtering unit is used for receiving the second light beam and selecting a light beam of a second wave band to pass through;
the depth sensor is used for obtaining a depth image according to the light beam of the second wave band;
the depth data acquisition unit is used for processing the depth image to obtain a second electric signal;
the depth data processing unit is used for receiving the second electric signal and performing data processing to obtain the depth data.
Optionally, the fusion module comprises a dual-mode image fusion unit and a fusion image processing unit which are sequentially arranged; wherein
The dual-mode image fusion unit is used for receiving and carrying out parallel processing on the visible light data and the depth data to obtain fusion data;
and the fusion image processing unit is used for processing the fusion data to generate a pseudo-color three-dimensional image.
Optionally, the system further comprises
The data output module is used for externally transmitting various data through an external interface;
and the control module is used for storing and analyzing the various data and controlling the modules.
Optionally, the pixel size of the depth sensor is larger than the pixel size of the visible light sensor.
Optionally, the pixel size of the depth sensor is an integer multiple of that of the visible light sensor.
The beneficial effects of the utility model are as follows:
The utility model provides a laser radar system with a common light path. The system has a simple structure; its common-light-path, common-view-angle optical structure keeps the imaging fields of view of the depth image and the visible light image consistent, solving the problem of inconsistent dual-sensor imaging fields of view. Meanwhile, the two-dimensional and three-dimensional images are synchronized in space and time, reducing the difficulty of subsequent complex processing such as registration and fusion.
Drawings
The following describes the embodiments of the present utility model in further detail with reference to the drawings.
Fig. 1 shows a schematic structural diagram of a laser radar system with a common optical path according to an embodiment of the present utility model.
Detailed Description
In order to more clearly illustrate the present utility model, the present utility model will be further described with reference to examples and drawings. Like parts in the drawings are denoted by the same reference numerals. It is to be understood by persons skilled in the art that the following detailed description is illustrative and not restrictive, and that this utility model is not limited to the details given herein.
The laser radar is a photoelectric remote sensing detection device that uses laser as the radiation source and actively measures the distance to a target, i.e. depth data, using physical parameters of the reflected echo such as intensity, phase and polarization. It is widely applied in high-tech fields such as autonomous driving, unmanned aerial vehicles, robots, terrain mapping, security monitoring, smart cities, the Internet of Things and modern agriculture.
The core device of an existing laser radar is a depth sensor, which mainly provides distance information. In practical applications, however, RGB visible light image information is usually also required, so a separate system is provided to acquire the visible light image: the visible light grayscale data and the depth data are each acquired by their own sensor to generate a visible light image and a depth image. Such a scheme is bulky, the fields of view of the two systems are inconsistent, the data acquisition speed suffers, and the imaging frame rate is greatly reduced.
In view of the foregoing, an embodiment of the present utility model provides a laser radar system with a common optical path, the system includes a laser emitting module, configured to emit laser in response to a driving and modulating signal, and shape the laser and irradiate the laser to a target to be measured; the laser receiving module is used for receiving a target echo generated by the target to be detected to obtain a dual-band light beam; the beam splitting and directing adjusting module is used for receiving the dual-band light beam, carrying out beam splitting treatment to obtain a first light beam and a second light beam, and transmitting the first light beam and the second light beam to a first light path and a second light path respectively; the visible light sensor module is used for collecting the first light beam along the first light path and processing the first light beam to obtain visible light data; the depth sensor module is used for collecting the second light beam along the second light path and processing the second light beam to obtain depth data; and the fusion module is used for processing the visible light data and the depth data in parallel to obtain a pseudo-color three-dimensional image.
Specifically, as shown in fig. 1, the present embodiment discloses a common-path dual-light fusion solid-state laser radar system, which sequentially includes a laser transmitting module 100, a laser receiving module 200 (a receiving lens module), a beam splitting and direction adjusting module 300, an RGB visible light sensor module 400, a depth sensor module 500, a fusion module 600, a data output module 700 and a system control module 800.
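The end-to-end data flow through these modules can be sketched as follows. This is a minimal illustrative skeleton only — all function and parameter names are hypothetical; the patent does not specify any software implementation:

```python
# Hypothetical data flow of the common-path system: one received echo beam
# is split into an RGB path and a depth path, processed in parallel,
# then fused. The callables stand in for modules 300-600.
def lidar_frame(echo_beam, split, rgb_pipeline, depth_pipeline, fuse):
    first_beam, second_beam = split(echo_beam)   # beam splitting module 300
    visible = rgb_pipeline(first_beam)           # RGB sensor module 400
    depth = depth_pipeline(second_beam)          # depth sensor module 500
    return fuse(visible, depth)                  # fusion module 600

# Toy stand-ins that only demonstrate the call order
frame = lidar_frame(
    "echo",
    split=lambda b: (b + ":vis", b + ":ir"),
    rgb_pipeline=lambda b: {"rgb": b},
    depth_pipeline=lambda b: {"depth": b},
    fuse=lambda v, d: {**v, **d},
)
print(frame)  # {'rgb': 'echo:vis', 'depth': 'echo:ir'}
```

The key property mirrored here is that both sensor paths consume portions of the same received beam, rather than two separately aimed optical systems.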
This embodiment adopts a dual-sensor mode of "depth sensor + color visible light sensor". The common-light-path, common-view-angle optical structure keeps the imaging field of view of the depth image consistent with that of the visible light image, solving the problem of inconsistent dual-sensor imaging fields of view and improving the acquisition frame rate and the quality of the visible light image. Meanwhile, the two-dimensional and three-dimensional images are synchronized in space and time, reducing the difficulty of subsequent complex processing such as registration and fusion.
In one possible implementation, the laser emitting module includes a laser, a laser shaping unit, a laser driving and modulating unit, and a temperature control unit; the laser driving and modulating unit is used for sending driving and modulating signals to the laser; the laser is used for receiving the driving and modulating signals and emitting laser; the temperature control unit is used for sending a temperature control signal to the laser and controlling the temperature; the laser shaping unit is used for receiving the laser and shaping the laser into a beam required by laser illumination.
Specifically, as shown in fig. 1, the laser emitting module 100 includes a laser 101, a laser shaping unit 102, a laser driving and modulating unit 103, and a temperature control unit 104. The laser emitting module 100 is used for emitting infrared laser, modulating waveforms and parameters, shaping and expanding beams, controlling the position, direction and form of the laser beam in space, irradiating the laser beam to a target and generating a required laser echo signal.
Further, the laser 101 may be a distributed feedback semiconductor laser, an external cavity semiconductor laser, a vertical cavity surface emitting laser, a dye laser, a quantum cascade laser, a solid laser, a gas laser, etc., and the laser 101 has characteristics of narrow bandwidth, nanosecond trigger signal, high power, stable output, etc. The wavelength of the emission of a particular laser is selected to correspond to the spectral response sensitivity of the depth sensor, typically in the near infrared band, such as 850nm, 905nm, 940nm, 1550nm, etc. Alternatively, the lasers may be single or distributed in an array.
The laser shaping unit 102 shapes the beam output from the laser. The unshaped beam cannot meet the illumination requirements of the lidar system because the asymmetry of the illumination field of view does not match the receiving surface pattern of the sensor, which is disadvantageous for fully utilizing the light energy, and the uneven distribution of the light intensity within the illumination field of view results in a reduction in the dynamic range available to the system. The shaping elements in the laser shaping unit 102 include diffuse reflectors, diffractive optical elements (Diffractive Optical Elements, DOE), binary diffraction gratings, liquid crystal spatial light modulators (Liquid Crystal Spatial Light Modulator, LC-SLM), fly eye lenses, microlens arrays, optical fibers, integrator rods, quadrangular pyramid mirrors, and the like. The laser shaping unit 102 shapes the light beam output by the laser into the light beam required by the illumination of the laser radar system, so that the field of view matching is met and the light intensity is uniform.
The laser driving and modulating unit 103 is configured to drive and control the laser, providing a stable working current and selecting continuous or pulsed operation. The modulation unit can be a signal generator, a single-chip microcomputer, an FPGA, etc. It can generate precise triangular, sawtooth, rectangular (including square) and sine wave signals over a wide frequency range (0.75 MHz up to 100 MHz) to match the working frequency of the sensor, with a duty cycle adjustable from 10% to 90%; duty cycle and frequency are adjusted independently without affecting each other.
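The kind of rectangular drive waveform described above — frequency and duty cycle adjustable independently — can be sketched numerically as follows (an illustrative simulation, not the unit's actual firmware; the 10%–90% duty range is taken from the text):

```python
import numpy as np

def square_wave(freq_hz, duty, fs_hz, n_samples):
    """Rectangular drive waveform with independently adjustable
    frequency and duty cycle (limited to 10%-90% as in the text)."""
    if not 0.10 <= duty <= 0.90:
        raise ValueError("duty cycle outside the 10%-90% range")
    t = np.arange(n_samples) / fs_hz
    phase = (t * freq_hz) % 1.0          # position within each period, 0..1
    return (phase < duty).astype(float)  # high for the first `duty` fraction

# 1 MHz drive sampled at 100 MS/s with a 30% duty cycle:
# 10 periods of 100 samples each, high for the first ~30 samples of each
w = square_wave(1e6, 0.30, 100e6, 1000)
```

Changing `freq_hz` leaves `duty` untouched and vice versa, matching the "independent adjustment" property claimed for the modulation unit.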
The temperature control unit 104 is configured to perform temperature control on the laser, so that the laser works in an optimal temperature range, typically about 25 ℃.
In one possible implementation, the laser receiving module includes an imaging lens that matches the visible band sensor and the infrared band sensor, and the imaging lens has a receive field of view that is greater than the field of view of the laser illumination.
Specifically, the laser receiving module 200 is an imaging lens group or lens. This embodiment adopts an optical path design in which one imaging lens serves both sensors: the lens works in two bands, a visible band of 400-700 nm and an infrared band of 800-1700 nm (e.g. 850 nm, 905 nm, 940 nm, 1550 nm). The receiving field of view of the imaging lens is larger than the laser illumination field of view, meeting the requirement of dual-band fusion imaging; the F-number is controlled to about 2, the modulation transfer function (MTF) values within 0.7 field are all greater than 0.6 at a resolution of 50 lp/mm, the absolute value of the full-field distortion is less than 3%, and the relative edge illumination is greater than 30%.
The embodiment adopts an optical structure of a visible light and near infrared dual-band common light path and a common view angle, ensures that the imaging view fields of a visible light image and a depth image are consistent, has a compact structure, and realizes the integration of high efficiency and high reliability of a laser and a sensor receiving and transmitting system.
In one possible implementation manner, the beam splitting and directing adjustment module includes a beam splitting unit disposed on an optical path of the dual-band beam, and a beam directing adjustment unit disposed perpendicular to the beam splitting unit; the light splitting unit is used for carrying out light splitting treatment on the dual-band light beam to obtain a first light beam and a second light beam, and transmitting the first light beam to the visible light sensor module along the first light path; the light beam direction adjusting unit is used for receiving the second light beam and transmitting the second light beam to the depth sensor module in a reflection mode along the second light path.
Specifically, the beam splitting and directing adjustment module 300 includes a beam splitting unit 301 and a beam directing adjustment unit 302. The beam splitting and directing adjustment module 300 is configured to perform acquisition and imaging on a dual-band beam, where one path is transmitted and the other path is reflected.
Further, the light splitting unit 301 may be a beam splitting plate or a beam splitting prism coated with a film of a chosen reflectance-to-transmittance ratio, which transmits part of the imaging light to the visible light sensor and reflects part of it in the perpendicular direction to the depth sensor for data collection, so that one lens is matched to two sensors. A beam splitting prism ensures that the reflected and transmitted light travel equal optical path lengths, guaranteeing sensor coplanarity, measurement accuracy and the imaging quality of the optical system. Optionally, the beam splitting prism selected in this embodiment is formed by cementing two 45° right-angle prisms, with a 50% reflection / 50% transmission neutral film coated in the central area of the cemented surface to keep the optical paths of the reflected and transmitted light equal.
The beam direction adjusting unit 302 may be a reflecting mirror with high reflectivity coated with a corresponding wave band, and is used for adjusting the beam collected by the laser receiving module to enter the corresponding sensor.
The embodiment realizes the beam splitting imaging treatment of the dual-band light beams, realizes the transmission imaging of the first band light beams and the reflection imaging of the other band light beams, and solves the problem of inconsistent view fields caused by the parallel optical systems.
In one possible implementation, the visible light sensor module includes a first filter unit, a visible light sensor, a visible light data acquisition unit, and a visible light data processing unit disposed along the first optical path; the first filtering unit is used for receiving the first light beam and selecting a light beam of a first wave band to pass through; the visible light sensor is used for obtaining a visible light color image according to the light beam of the first wave band; the visible light data acquisition unit is used for processing the visible light color image to obtain a first electric signal; the visible light data processing unit is used for receiving the first electric signal and performing data processing to obtain the visible light data.
Specifically, the RGB visible light sensor module (visible light sensor module) 400 includes a light filtering unit (first light filtering unit) 401, an RGB visible light sensor (visible light sensor) 402, a visible light data acquisition unit 403, and a visible light data processing unit 404, for acquiring a visible light image.
Further, the filtering unit 401 is coated with an antireflection film with a 400-700nm wave band, and is used for selecting light of the wave band to pass through and cutting off light outside a pass band, so that interference is avoided, and the acquisition effect is improved.
The RGB visible light sensor 402, which may be a CCD/CMOS sensor, captures the visible light color image and typically has a smaller pixel size and higher resolution. The depth sensor has a larger pixel size and lower resolution than the visible light CMOS/CCD sensor; in order to match sensors of different pixel sizes and resolutions behind one lens on the same optical path, the pixel sizes of the two sensors are kept at an integer multiple of each other.
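The integer-multiple pixel-pitch constraint makes registration between the two sensors trivial: each depth pixel corresponds to a k×k block of RGB pixels. A minimal sketch of that block replication (function name is illustrative; it assumes both sensors share the optical axis and field of view, as in this common-path design):

```python
import numpy as np

def upsample_depth_to_rgb(depth, k):
    """Replicate each depth pixel into a k x k block so the depth map
    aligns pixel-for-pixel with the higher-resolution RGB image, valid
    when the depth pixel pitch is an integer multiple k of the RGB pitch."""
    return np.repeat(np.repeat(depth, k, axis=0), k, axis=1)

depth = np.arange(6, dtype=float).reshape(2, 3)  # toy 2x3 depth map
aligned = upsample_depth_to_rgb(depth, 2)        # -> 4x6, matches RGB grid
print(aligned.shape)  # (4, 6)
```

If the pitches were not integer multiples, this step would instead need fractional resampling with interpolation, which blurs depth edges — one motivation for the constraint.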
The visible light data acquisition unit 403 includes an analog-to-digital conversion chip and an amplifier, and generates an electrical signal from the acquired optical signal through the analog-to-digital conversion chip and transmits the electrical signal to the visible light data processing unit 404.
In one possible implementation, the depth sensor module includes a second filter unit, a depth sensor, a depth data acquisition unit, and a depth data processing unit disposed along the second optical path; the second filtering unit is used for receiving the second light beam and selecting a light beam of a second wave band to pass through; the depth sensor is used for obtaining a depth image according to the light beam of the second wave band; the depth data acquisition unit is used for processing the depth image to obtain a second electric signal; the depth data processing unit is used for receiving the second electric signal and performing data processing to obtain the depth data.
In one possible implementation, the pixel size of the depth sensor is greater than the pixel size of the visible light sensor.
Specifically, the depth sensor module 500 includes a filtering unit (second filtering unit) 501, a depth sensor 502, a depth data acquisition unit 503, and a depth data processing unit 504, for acquiring a depth image.
Further, the filtering unit 501 is coated with an antireflection film with a wavelength band of 800-1700nm, light in the selected wavelength band passes through and light outside the passband is cut off, interference is avoided, and the collection effect is improved. Alternatively, the filtering unit 501 may be a filter having a high peak transmittance and a high cut-off depth.
The depth sensor 502 is the key element of the laser radar system; its characteristic parameters directly determine the imaging field-of-view range and resolution. The depth sensor is a matrix sensor integrating a large number of independently working pixels. Each pixel has a complex structure comprising a modulation control unit, a timing circuit, an analog-to-digital conversion unit and a data processing unit. The sine-wave or square-wave periodically modulated light emitted by the laser is reflected by the measured object; the sensor receives the echo and measures its time difference or phase difference, calculating the phase from the proportions of charge accumulated at different phases, converting it into the required time-of-flight difference, and from that computing the depth. The depth sensor may be a CCD/CMOS sensor based on the structured light principle, the time-of-flight principle, the binocular vision principle, etc., with resolutions such as 320×240 or 640×480; this embodiment employs a depth sensor of 640×480 resolution.
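The phase-from-charge-proportions computation described above can be illustrated with the standard four-bucket indirect ToF scheme. This is one common textbook demodulation method, shown here as an assumption — the patent does not fix a specific scheme:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_depth(q0, q90, q180, q270, f_mod_hz):
    """Four-bucket indirect ToF demodulation: recover the echo phase from
    the charge accumulated in four windows offset by 0/90/180/270 degrees,
    then convert the phase delay into distance."""
    phase = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod_hz)

# Simulated buckets for an echo delayed by 45 degrees at 20 MHz modulation;
# the unambiguous range is c/(2f) ~ 7.49 m, so 45/360 of it ~ 0.937 m.
phi = math.pi / 4
q = [math.cos(phi - a) for a in (0, math.pi / 2, math.pi, 3 * math.pi / 2)]
depth_m = itof_depth(*q, 20e6)
print(round(depth_m, 3))  # 0.937
```

The division by 4π (rather than 2π) accounts for the round trip: the echo travels to the target and back.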
The depth data acquisition unit 503 includes an analog-to-digital conversion chip and an amplifier, and generates an electrical signal from the acquired optical signal through the analog-to-digital conversion chip and transmits the electrical signal to the depth data processing unit.
The depth data processing unit 504 performs noise reduction processing and depth restoration on the acquired image data, and finally presents a restored depth image.
In one possible implementation manner, the fusion module comprises a dual-mode image fusion unit and a fusion image processing unit which are sequentially arranged; the dual-mode image fusion unit is used for receiving and carrying out parallel processing on the visible light data and the depth data to obtain fusion data; and the fusion image processing unit is used for processing the fusion data to generate a pseudo-color three-dimensional image.
Specifically, the fusion module 600 includes a dual-mode image fusion unit 601 and a fusion image processing unit 602. It processes in parallel the visible light data and the depth data transmitted by the RGB visible light sensor module 400 and the depth sensor module 500, acquires multiple points in the imaging area simultaneously to realize fast depth imaging and visible light imaging, fuses the images by combining the underlying depth image data with the detail data of the visible light image, and reconstructs the fused data to generate a pseudo-color three-dimensional image. Resolution interpolation and resampling of the depth image and the visible light image are carried out, and pseudo-color image data fusing the visible light image and the depth image are generated through image fusion reconstruction, pseudo-color image generation and other algorithms.
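A minimal sketch of the pseudo-color fusion step: the depth map (already resampled to the RGB grid) is mapped to a simple near-red/far-blue colormap and alpha-blended with the visible frame. The colormap and blending weight are illustrative assumptions — the patent does not disclose the actual fusion algorithm:

```python
import numpy as np

def pseudo_color_fusion(rgb, depth, alpha=0.5):
    """Blend an RGB frame (H x W x 3, values in 0..1) with a pseudo-colored
    depth map of the same H x W. Near -> red, far -> blue via a simple
    linear colormap; NOT the patent's (unspecified) fusion algorithm."""
    d = (depth - depth.min()) / (np.ptp(depth) + 1e-9)  # normalize to 0..1
    color = np.stack([1.0 - d, np.zeros_like(d), d], axis=-1)  # (R, G, B)
    return (1 - alpha) * rgb + alpha * color

rgb = np.full((4, 4, 3), 0.5)                  # flat gray test frame
depth = np.tile(np.linspace(0, 1, 4), (4, 1))  # depth ramp, left to right
fused = pseudo_color_fusion(rgb, depth)
print(fused.shape)  # (4, 4, 3)
```

Because the two paths share one lens and view angle, this per-pixel blend needs no geometric registration step — the point the surrounding text emphasizes.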
This embodiment realizes integrated receiving through a common-aperture mode and rapidly acquires three-dimensional (depth) data and two-dimensional (color) data by multipoint-synchronous acquisition of the target in the imaging area, improving the acquisition speed of the raw data and reducing the complexity of image registration and fusion processing.
In one possible implementation manner, the system further comprises a data output module, which is used for externally transmitting various data through an external interface; and the control module is used for storing and analyzing the various data and controlling the modules.
Specifically, the data output unit 700 is configured to externally transmit the processed and resolved various data through an external interface according to a specified protocol. The data output unit 700 integrates the I2C interface, RJ45, 10/100M, USB virtual network and TCP/IP protocol.
In a specific example, the system control module (control module) 800 stores, analyzes, processes and transmits the collected test data and performs integrated control of the entire system. On the one hand, the system control module stores and analyzes the collected detection data; on the other hand, it can externally transmit the detection data, instrument operation parameters, real-time state parameters and other information through an external interface according to a specified protocol.
Further, the system control content comprises detection flow control, operation parameter setting, operation parameter acquisition, operation log generation, operation protection, alarm and the like of the device.
Further, the pixel size of the depth sensor is an integer multiple of that of the visible light sensor.
This embodiment provides an optical path in which one imaging lens of a solid-state laser radar is matched to two sensors (an RGB visible light sensor and a depth sensor): the split light is distributed to the two sensors with different functions, which share one set of transmitting and receiving optics, so that visible light and depth data are collected synchronously and in parallel and then registered and fused. The system has a simple structure; its common-light-path, common-view-angle optical structure keeps the imaging fields of view of the depth image and the visible light image consistent, solving the problem of inconsistent dual-sensor imaging fields of view, while also ensuring that the two-dimensional and three-dimensional images are synchronized in space and time, reducing the difficulty of subsequent complex processing such as registration and fusion.
In the description of the present utility model, it should be noted that the azimuth or positional relationship indicated by the terms "upper", "lower", etc. are based on the azimuth or positional relationship shown in the drawings, and are merely for convenience of describing the present utility model and simplifying the description, and are not indicative or implying that the apparatus or element in question must have a specific azimuth, be constructed and operated in a specific azimuth, and thus should not be construed as limiting the present utility model. Unless specifically stated or limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present utility model can be understood by those of ordinary skill in the art according to the specific circumstances.
It is further noted that, in the description of the present utility model, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a … …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises it.
It should be understood that the foregoing examples are provided merely to illustrate the present utility model clearly and are not intended to limit its embodiments; various other changes and modifications may be made by those skilled in the art without departing from the spirit and scope of the present utility model as defined by the appended claims.

Claims (10)

1. A laser radar system with a common optical path, the system comprising:
the laser emission module, used for emitting laser in response to driving and modulating signals and irradiating the shaped laser onto a target to be detected;
the laser receiving module is used for receiving a target echo generated by the target to be detected to obtain a dual-band light beam;
the beam splitting and direction adjusting module, used for receiving the dual-band light beam, performing beam splitting to obtain a first light beam and a second light beam, and transmitting the first light beam and the second light beam to a first light path and a second light path respectively;
the visible light sensor module is used for collecting the first light beam along the first light path and processing the first light beam to obtain visible light data;
the depth sensor module is used for collecting the second light beam along the second light path and processing the second light beam to obtain depth data;
and the fusion module is used for processing the visible light data and the depth data in parallel to obtain a pseudo-color three-dimensional image.
2. The laser radar system of claim 1, wherein,
the laser emitting module comprises a laser, a laser shaping unit, a laser driving and modulating unit and a temperature control unit; wherein the method comprises the steps of
The laser driving and modulating unit is used for sending driving and modulating signals to the laser;
the laser is used for receiving the driving and modulating signals and emitting laser;
the temperature control unit is used for sending a temperature control signal to the laser and controlling the temperature;
the laser shaping unit is used for receiving the laser and shaping the laser into a beam required by laser illumination.
3. The laser radar system of claim 2, wherein,
the laser receiving module comprises an imaging lens matched with the visible light wave band sensor and the infrared light wave band sensor, and the receiving view field of the imaging lens is larger than the view field of laser illumination.
4. The laser radar system of claim 3, wherein,
the beam splitting and directing adjusting module comprises a beam splitting unit arranged on the optical path of the dual-band light beam and a light beam directing adjusting unit which is perpendicular to the beam splitting unit; wherein the method comprises the steps of
The light splitting unit is used for carrying out light splitting treatment on the dual-band light beam to obtain a first light beam and a second light beam, and transmitting the first light beam to the visible light sensor module along the first light path;
the light beam direction adjusting unit is used for receiving the second light beam and transmitting the second light beam to the depth sensor module in a reflection mode along the second light path.
5. The laser radar system of claim 4, wherein,
the visible light sensor module comprises a first light filtering unit, a visible light sensor, a visible light data acquisition unit and a visible light data processing unit which are arranged along the first light path; wherein the method comprises the steps of
The first filtering unit is used for receiving the first light beam and selecting a light beam of a first wave band to pass through;
the visible light sensor is used for obtaining a visible light color image according to the light beam of the first wave band;
the visible light data acquisition unit is used for processing the visible light color image to obtain a first electric signal;
the visible light data processing unit is used for receiving the first electric signal and performing data processing to obtain the visible light data.
6. The laser radar system of claim 5, wherein,
the depth sensor module comprises a second optical filtering unit, a depth sensor, a depth data acquisition unit and a depth data processing unit which are arranged along the second optical path; wherein the method comprises the steps of
The second filtering unit is used for receiving the second light beam and selecting a light beam of a second wave band to pass through;
the depth sensor is used for obtaining a depth image according to the light beam of the second wave band;
the depth data acquisition unit is used for processing the depth image to obtain a second electric signal;
the depth data processing unit is used for receiving the second electric signal and performing data processing to obtain the depth data.
7. The laser radar system of claim 6, wherein,
the fusion module comprises a dual-mode image fusion unit and a fusion image processing unit which are sequentially arranged; wherein the method comprises the steps of
The dual-mode image fusion unit is used for receiving and carrying out parallel processing on the visible light data and the depth data to obtain fusion data;
and the fusion image processing unit is used for processing the fusion data to generate a pseudo-color three-dimensional image.
8. The laser radar system of claim 7, further comprising
a data output module, used for transmitting various data externally through an external interface;
and a control module, used for storing and analyzing the various data and controlling each of the foregoing modules.
9. The laser radar system of claim 8, wherein,
the pixel size of the depth sensor is greater than the pixel size of the visible light sensor.
10. The laser radar system of claim 9, wherein,
the pixel size of the depth sensor is an integer multiple of the visible light sensor.
CN202321938309.6U 2023-07-21 2023-07-21 Laser radar system of common light path Active CN220381291U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202321938309.6U CN220381291U (en) 2023-07-21 2023-07-21 Laser radar system of common light path

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202321938309.6U CN220381291U (en) 2023-07-21 2023-07-21 Laser radar system of common light path

Publications (1)

Publication Number Publication Date
CN220381291U true CN220381291U (en) 2024-01-23

Family

ID=89566143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202321938309.6U Active CN220381291U (en) 2023-07-21 2023-07-21 Laser radar system of common light path

Country Status (1)

Country Link
CN (1) CN220381291U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117768634A (en) * 2024-02-22 2024-03-26 长春市榣顺科技有限公司 vehicle-mounted stereoscopic vision camera based on binocular camera and laser radar and imaging method


Similar Documents

Publication Publication Date Title
CN111025317B (en) Adjustable depth measuring device and measuring method
CN110579775A (en) Ultra-long-range single-photon three-dimensional laser radar scanning imaging system
CN220381291U (en) Laser radar system of common light path
CN109889809A (en) Depth camera mould group, depth camera, depth picture capturing method and depth camera mould group forming method
CN209375823U (en) 3D camera
CN111487639A (en) Laser ranging device and method
CN111830530A (en) Distance measuring method, system and computer readable storage medium
CN104483676A (en) 3D/2D (Three Dimensional/Two Dimensional) scannerless laser radar compound imaging device
CN104020474A (en) Laser three-dimensional imaging optical transmit-receive system
CN111766596A (en) Distance measuring method, system and computer readable storage medium
CN111708039A (en) Depth measuring device and method and electronic equipment
CN111458717A (en) TOF depth measuring device and method and electronic equipment
CN110780312B (en) Adjustable distance measuring system and method
CN111025321A (en) Variable-focus depth measuring device and measuring method
CN211236245U (en) Laser rangefinder and three-dimensional laser scanner
CN111796295A (en) Collector, manufacturing method of collector and distance measuring system
CN112213737A (en) Long-distance photon counting three-dimensional laser radar imaging system and method thereof
CN212135135U (en) 3D imaging device
CN106791781B (en) A kind of continuous wave phase measurement formula single pixel 3-D imaging system and method
CN111025319A (en) Depth measuring device and measuring method
CN213091889U (en) Distance measuring system
CN111796296A (en) Distance measuring method, system and computer readable storage medium
CN112946688B (en) Novel photon counting laser radar 3D imaging method and device
CN216133412U (en) Distance measuring device and camera fusion system
CN211206789U (en) Color laser radar imaging device

Legal Events

Date Code Title Description
GR01 Patent grant