CN111239760A - Multi-view-field target environment information acquisition device and method based on fusion sensor - Google Patents


Info

Publication number
CN111239760A
CN111239760A (application CN202010049656.8A)
Authority
CN
China
Prior art keywords
laser, field, target, MEMS micro, view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010049656.8A
Other languages
Chinese (zh)
Inventor
侯鹏程
周子建
柳邦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hubei Sanjiang Aerospace Hongfeng Control Co Ltd
Original Assignee
Hubei Sanjiang Aerospace Hongfeng Control Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hubei Sanjiang Aerospace Hongfeng Control Co Ltd filed Critical Hubei Sanjiang Aerospace Hongfeng Control Co Ltd
Priority to CN202010049656.8A priority Critical patent/CN111239760A/en
Publication of CN111239760A publication Critical patent/CN111239760A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/89 — Lidar systems specially adapted for specific applications, for mapping or imaging
    • G01S7/4802 — Using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/4817 — Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G01S7/4861 — Pulse systems: receiver circuits for detection, sampling, integration or read-out
    • G01S7/4913 — Non-pulse systems: receiver circuits for detection, sampling, integration or read-out

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a multi-field-of-view target environment information acquisition device and method based on a fusion sensor. In the acquisition device, a data processing and output system is connected, through a system auxiliary control and communication system, to a laser emission time measurement system and to an image control acquisition processing system; the laser emission time measurement system is connected to a laser diode, and the image control acquisition processing system is connected to a camera. The mirror surface of the MEMS micro-galvanometer lies within the angular range of both the laser diode's output light path and the camera's receiving light path; the MEMS micro-galvanometer is connected to the data processing and output system, and the receiving optical system is connected to the data processing and output system through the photoelectric detection system. The data processing and output system computes mutually matched environment point cloud data and two-dimensional image information of the target field of view. This technical scheme enriches the information gathered about the target environment, improves the reliability and efficiency of target environment information acquisition, and increases the practicality of the system.

Description

Multi-view-field target environment information acquisition device and method based on fusion sensor
Technical Field
The invention relates to the field of optics, and in particular to a multi-field-of-view target environment information acquisition device based on a fusion sensor and a corresponding multi-field-of-view target environment information acquisition method.
Background
With the continued development of computer and robotics technology, artificial intelligence and autonomous driving are gradually becoming reality. Intelligent machines, represented by robots and unmanned vehicles, are highly intelligent systems integrating environment perception, dynamic decision planning, behavior control and other functions; accurately acquiring environment information of the target field of view is the problem that must be solved for obstacle avoidance and map-building navigation.
In an unknown environment, a single sensor can hardly acquire target information accurately, so multi-field scanning with multiple sensors, followed by comparison and analysis of their information, is an effective way to obtain reliable environment information. Lidar, a newer technology, is now widely applied in environmental perception; it offers high detection precision and good real-time performance, but it produces large volumes of data that are complex to process, and its ability to capture a target's color information is poor. Cameras are low in cost and can acquire a large amount of surrounding-environment information as two-dimensional images, but their ranging precision is low and their real-time performance and stability are poor. Safety is the first problem to be solved for the popularization of robots and unmanned vehicles, and since no single type of sensor performs best under all conditions, a multi-sensor fusion scheme is necessary.
In acquiring target environment information, acquisition at different distances and fields of view is of great importance to unmanned driving, where real-time requirements are extremely high; accurately extracting targets in different distance ranges, combined with an acquisition mode that switches between different fields of view, can greatly improve safety. At the same time, the extremely high precision required of the optical elements places high demands on factory adjustment, and inefficient adjustment and assembly are the main bottleneck for releasing production capacity and reducing cost.
Disclosure of Invention
In view of at least one of the above problems, the invention provides a multi-field-of-view target environment information acquisition device and method based on a fusion sensor. A lidar and a camera acquire two-dimensional information of the target environment through MEMS micro-galvanometer scanning, the lidar acquires three-dimensional information of the target environment, and data fusion comparison increases the richness of the target environment information so as to improve the reliability of target environment perception. The scheme of combining the camera with MEMS micro-galvanometer scanning effectively enlarges the range over which two-dimensional information of the target field of view can be acquired, and the dual-wavelength, different-power laser emission units allow three-dimensional information to be acquired over different distances and different scan fields, greatly enriching the information acquired from the target environment and ensuring the reliability and efficiency of its acquisition.
In order to achieve the above object, the present invention provides a multi-field-of-view target environment information acquisition device based on a fusion sensor, comprising: a data processing and output system, a system auxiliary control and communication system, a laser emission time measurement system, a laser diode, an image control acquisition processing system, a camera, a MEMS micro-galvanometer, a receiving optical system and a photoelectric detection system. The data processing and output system is connected, through the system auxiliary control and communication system, to the laser emission time measurement system and to the image control acquisition processing system; the laser emission time measurement system is connected to the laser diode, and the image control acquisition processing system is connected to the camera. The mirror surface of the MEMS micro-galvanometer lies within the angular range of the laser diode's output light path and of the camera's receiving light path, and the MEMS micro-galvanometer is connected to the data processing and output system. The signal receiving range of the receiving optical system covers the echo of the laser emitted by the laser diode; the receiving optical system is connected to the photoelectric detection system, which in turn is connected to the data processing and output system. From the two-dimensional image information of the target field of view acquired by the camera, the laser echo signal received by the receiving optical system, and the laser deflection angle of the MEMS micro-galvanometer, the data processing and output system computes mutually matched environment point cloud data and two-dimensional image information of the target field of view.
In the above technical solution, preferably, at least two groups of laser diodes with different wavelengths are adopted, and the laser emission time measurement system drives the laser diodes of different wavelengths according to preset conditions so as to emit laser of different wavelengths.
In the above technical solution, preferably, two groups of laser diodes with wavelengths of 1550 nm and 905 nm are adopted; the 1550 nm laser diode serves as the high-power emitter and, incident at a small angle, forms a far-field narrow-field scan area after reflection scanning by the MEMS micro-galvanometer, while the 905 nm laser diode serves as the low-power emitter and, incident at a large angle, forms a near-field wide-field scan area after reflection scanning by the MEMS micro-galvanometer.
In the above technical solution, preferably, the multi-field-of-view target environment information acquisition device based on the fusion sensor further comprises a triangular prism, and the laser emitted by the laser diode reaches the MEMS micro-galvanometer through the triangular prism.
In the above technical solution, preferably, the receiving optical system is coated with antireflection films for the 1550 nm and 905 nm wavelengths.
The invention also provides a multi-field-of-view target environment information acquisition method based on a fusion sensor, applied to the acquisition device of the above technical solution and comprising the following steps: the laser emission time measurement system sends a light-emitting signal to the laser diode to control it to emit laser, and triggers a synchronization signal to the MEMS micro-galvanometer; the laser enters the target environment through the MEMS micro-galvanometer for scanning and is reflected by a target back to the receiving optical system, which amplifies the echo signal through the photoelectric detection system and feeds it back to the data processing and output system; the data processing and output system collects the laser deflection angle fed back by the MEMS micro-galvanometer, calculates the target distance from the echo signal, and converts the current target distance into a spatial coordinate point; the image acquisition processing system controls the camera to acquire a two-dimensional image of the target field of view obtained through the MEMS micro-galvanometer's scanning; the deflection angle of the MEMS micro-galvanometer is changed so that the emitted-laser scan area and the two-dimensional-image scan area traverse the whole target field of view; and the radar point cloud data converted from the spatial coordinate points of the target environment are fusion-matched with the two-dimensional image information and output.
In the above technical solution, preferably, the laser emission time measurement system drives laser diodes with wavelengths of 1550 nm and 905 nm respectively; the 1550 nm laser diode serves as the high-power emitter and, incident at a small angle, forms a far-field narrow-field scan area after reflection scanning by the MEMS micro-galvanometer, while the 905 nm laser diode serves as the low-power emitter and, incident at a large angle, forms a near-field wide-field scan area after reflection scanning by the MEMS micro-galvanometer.
In the above technical solution, preferably, the MEMS micro-galvanometer comprises a micro-galvanometer mirror and a micro-galvanometer drive circuit; the drive circuit loads sine or sawtooth signals of preset amplitude and frequency onto the X axis and Y axis of the mirror respectively, and sends the loading signals applied to the mirror to the data processing and output system so as to feed back the deflection angle of the MEMS micro-galvanometer.
In the above technical solution, preferably, the target distance of the target field of view is calculated from the synchronization signal, the echo signal and the laser deflection angle, and converted into the radar point cloud data in spatial three-dimensional coordinates; the scan field of the camera is switched by the MEMS micro-galvanometer through the beam-expanding collimation system, and the two-dimensional image information acquired by the camera is stitched to obtain the two-dimensional image information of the whole target field of view.
In the above technical solution, preferably, a data fusion comparison algorithm is used to compare and analyze the radar point cloud data and the two-dimensional image information, and the radar point cloud data and two-dimensional image information of the target field of view are output in matched form.
Compared with the prior art, the invention has the following beneficial effects. The lidar and the camera acquire two-dimensional information of the target environment through micro-electro-mechanical system (MEMS) micro-galvanometer scanning, and the lidar acquires three-dimensional information of the target environment; data fusion comparison increases the richness of the target environment information and improves the reliability of target environment perception. The scheme of combining the camera with MEMS micro-galvanometer scanning effectively enlarges the two-dimensional information acquisition range of the target environment field of view, and the dual-wavelength, different-power laser emission units allow three-dimensional information to be acquired over different distances and different scan fields, greatly enriching the information acquired from the target environment and ensuring the reliability and efficiency of its acquisition. In addition, the scheme in which the emitted beams enter an added prism in parallel allows the system's optical path to be assembled rapidly without significantly increasing the system's volume, improving the system's practicality.
Drawings
FIG. 1 is a schematic structural diagram of a multi-view-field target environment information acquisition device based on a fusion sensor according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a sensor structure of a multi-view-field target environment information acquisition device based on a fusion sensor according to an embodiment of the present invention;
fig. 3 is a schematic view of a sensor field range of a multi-field target environment information acquisition device based on a fusion sensor according to an embodiment of the present invention.
In the drawings, the correspondence between each component and the reference numeral is:
1. data processing and output system; 2. system auxiliary control and communication system; 3. laser emission time measurement system; 4. laser diode; 41. laser diode I; 42. laser diode II; 5. image control acquisition processing system; 6. camera; 7. MEMS micro-galvanometer; 8. receiving optical system; 9. photoelectric detection system; 10. prism.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
The invention is described in further detail below with reference to the attached drawing figures:
As shown in fig. 1 and 2, the present invention provides a multi-field-of-view target environment information acquisition device based on a fusion sensor, comprising: a data processing and output system 1, a system auxiliary control and communication system 2, a laser emission time measurement system 3, a laser diode 4, an image control acquisition processing system 5, a camera 6, a MEMS micro-galvanometer 7, a receiving optical system 8 and a photoelectric detection system 9. The data processing and output system 1 is connected, through the system auxiliary control and communication system 2, to the laser emission time measurement system 3 and to the image control acquisition processing system 5; the laser emission time measurement system 3 is connected to the laser diode 4, and the image control acquisition processing system 5 is connected to the camera 6. The mirror surface of the MEMS micro-galvanometer 7 lies within the angular range of the output light path of the laser diode 4 and of the receiving light path of the camera 6, and the MEMS micro-galvanometer 7 is connected to the data processing and output system 1. The signal receiving range of the receiving optical system 8 covers the echo of the laser emitted by the laser diode 4; the receiving optical system 8 is connected to the photoelectric detection system 9, which is connected to the data processing and output system 1. From the two-dimensional image information of the target field of view acquired by the camera 6, the laser echo signal received by the receiving optical system 8, and the laser deflection angle fed back by the MEMS micro-galvanometer 7, the data processing and output system 1 computes mutually matched environment point cloud data and two-dimensional image information of the target field of view.
In this embodiment, the lidar and the camera 6 acquire information of the target environment through the scanning of the MEMS micro-galvanometer 7: the camera acquires two-dimensional information while the lidar acquires three-dimensional information, and data fusion comparison increases the richness of the target environment information and improves the reliability of target environment perception. The scheme of combining the camera 6 with MEMS micro-galvanometer 7 scanning effectively enlarges the two-dimensional information acquisition range of the target environment field of view, greatly enriching the information acquired from the target environment and ensuring the reliability and efficiency of its acquisition.
In the above embodiment, preferably, at least two groups of laser diodes 4 with different wavelengths are adopted, and the laser emission time measurement system 3 drives the laser diodes 4 of different wavelengths according to preset conditions to emit laser of different wavelengths; using several laser emission units of different wavelengths allows three-dimensional information to be acquired over different distances and different scan fields.
As shown in fig. 3, in the above embodiment, preferably, two groups of laser diodes 4 with wavelengths of 1550 nm and 905 nm are used; the 1550 nm laser diode 4 serves as the high-power emitter and, incident at a small angle, forms a far-field narrow-field scan area after reflection scanning by the MEMS micro-galvanometer 7, while the 905 nm laser diode 4 serves as the low-power emitter and, incident at a large angle, forms a near-field wide-field scan area after reflection scanning by the MEMS micro-galvanometer 7.
In this embodiment, two groups of laser diodes 4 with different wavelengths are combined, and the two laser emission units of different wavelengths and powers can acquire three-dimensional information over different distances and different scan fields. The wavelength of laser diode I 41 is 1550 nm, far beyond the visible range and therefore much safer for human eyes, so its power can be raised substantially; it serves as the high-power emitter and, because of its small incidence angle, forms a far-field narrow-field scan area after reflection scanning by the MEMS micro-galvanometer 7. The wavelength of laser diode II 42 is 905 nm, relatively close to the visible range and thus more hazardous to human eyes, so its power is strictly limited; it is also cheaper, and serves as the low-power emitter, its larger incidence angle forming a near-field wide-field scan area after reflection scanning by the MEMS micro-galvanometer 7.
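The far-narrow/near-wide pairing described above can be sketched in code. A minimal illustration, assuming a hypothetical switch-over distance (the patent specifies the 1550 nm far-field / 905 nm near-field roles but no numeric threshold):

```python
# Sketch of emitter selection for the dual-wavelength scheme.
# The 50 m threshold is a hypothetical tuning value, not from the patent.
from dataclasses import dataclass

@dataclass(frozen=True)
class Emitter:
    wavelength_nm: int
    high_power: bool
    incidence: str  # "small" -> far-field narrow scan; "large" -> near-field wide scan

FAR_FIELD = Emitter(1550, True, "small")   # eye-safer band allows high power
NEAR_FIELD = Emitter(905, False, "large")  # power-limited, lower cost

def select_emitter(target_range_m: float, threshold_m: float = 50.0) -> Emitter:
    """Pick the diode whose scan field is intended for the requested range."""
    return FAR_FIELD if target_range_m > threshold_m else NEAR_FIELD
```

In practice the preset conditions driving the diode choice could also include the requested scan field rather than range alone.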
In the above embodiment, preferably, the multi-field-of-view target environment information acquisition device based on the fusion sensor further comprises a triangular prism 10, and the laser emitted by the laser diodes 4 reaches the MEMS micro-galvanometer 7 through the triangular prism 10. Specifically, the groups of laser diodes 4 are arranged in parallel so that their outgoing beams enter the triangular prism 10 in parallel, and the optical path of the system can be adjusted rapidly by adjusting their relative positions and angles.
In the above embodiment, preferably, the receiving optical system 8 is coated with antireflection films (not shown) for the 1550 nm and 905 nm wavelengths to ensure accurate reception of the echo signal. In other embodiments, when laser diodes 4 of different wavelengths are selected, antireflection films for the corresponding wavelengths are used.
The invention also provides a multi-field-of-view target environment information acquisition method based on a fusion sensor, applied to the acquisition device of the above embodiment and comprising the following steps. The laser emission time measurement system 3 sends a light-emitting signal to the laser diode 4 to control it to emit laser, and triggers a synchronization signal to the MEMS micro-galvanometer 7. After receiving the synchronous trigger signal, the control system sets the scan parameters of the MEMS micro-galvanometer 7; the laser then enters the target environment through the MEMS micro-galvanometer 7 for scanning and is reflected by a target back to the receiving optical system 8, which amplifies the echo signal through the photoelectric detection system 9 and feeds it back to the data processing and output system 1. The data processing and output system 1 acquires the laser deflection angle fed back by the MEMS micro-galvanometer 7, calculates the target distance from the echo signal, and converts the current target distance into a spatial coordinate point. The image acquisition processing system controls the camera 6 to acquire two-dimensional images of the target field of view obtained through the MEMS micro-galvanometer 7's scanning, yielding images of different field ranges acquired at different moments. The deflection angle of the MEMS micro-galvanometer 7 is changed so that the emitted-laser scan area and the two-dimensional-image scan area traverse the whole target field of view. Finally, the radar point cloud data converted from the spatial coordinate points of the target environment are fusion-matched with the two-dimensional image information and output.
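The distance calculation and coordinate conversion in the steps above can be sketched as follows; parameterizing the mirror's deflection feedback as azimuth and elevation angles is an assumption made here for illustration:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_point(t_flight_s: float, az_rad: float, el_rad: float):
    """Convert a round-trip laser time of flight plus the mirror's feedback
    angles into a Cartesian point with the acquisition device at the origin."""
    r = C * t_flight_s / 2.0  # round trip -> one-way target distance
    x = r * math.cos(el_rad) * math.cos(az_rad)
    y = r * math.cos(el_rad) * math.sin(az_rad)
    z = r * math.sin(el_rad)
    return (x, y, z)
```

Accumulating one such point per laser shot as the mirror sweeps the field of view yields the radar point cloud that is later fused with the camera images.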
As shown in fig. 3, in the above embodiment, preferably, the laser emission time measurement system 3 drives laser diodes 4 with wavelengths of 1550 nm and 905 nm respectively; the 1550 nm laser diode 4 serves as the high-power emitter and, incident at a small angle, forms a far-field narrow-field scan area after reflection scanning by the MEMS micro-galvanometer 7, while the 905 nm laser diode 4 serves as the low-power emitter and, incident at a large angle, forms a near-field wide-field scan area after reflection scanning by the MEMS micro-galvanometer 7. The laser emission time measurement system comprises a laser diode drive circuit and a laser emission timing circuit; the drive circuit can drive different laser diodes 4, of different wavelengths and powers, according to actual requirements so as to meet different field-scanning needs.
In the above embodiment, preferably, the MEMS micro-galvanometer 7 comprises a micro-galvanometer mirror and a micro-galvanometer drive circuit; the drive circuit loads sine or sawtooth signals of preset amplitude and frequency onto the X axis and Y axis of the mirror respectively, and sends the loading signals applied to the mirror to the data processing and output system 1 so as to feed back the deflection angle of the MEMS micro-galvanometer 7.
Preferably, sine signals are loaded on the X axis and Y axis of the MEMS micro-galvanometer 7, with corresponding scan amplitudes and scan frequencies, so that the laser from the laser diode 4 traces a Lissajous spatial scan trajectory in the target environment after reflection by the MEMS micro-galvanometer 7.
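As an illustration of the sine-drive scan, the mirror's deflection at any instant under independent sine drives on the two axes can be computed as below; the specific amplitudes and frequencies are free parameters set by the drive circuit, not values given in the patent:

```python
import math

def lissajous_deflection(t_s, fx_hz, fy_hz, amp_x_rad, amp_y_rad):
    """Mirror deflection (theta_x, theta_y) at time t under independent sine
    drives on the X and Y axes; incommensurate fx/fy make the reflected beam
    sweep a Lissajous pattern that covers the scan field over time."""
    theta_x = amp_x_rad * math.sin(2.0 * math.pi * fx_hz * t_s)
    theta_y = amp_y_rad * math.sin(2.0 * math.pi * fy_hz * t_s)
    return theta_x, theta_y
```

Sampling this function at the laser's pulse instants gives the deflection angles that the drive circuit feeds back to the data processing and output system 1.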
In the above embodiment, preferably, the target distance of the target field of view is calculated from the synchronization signal, the echo signal and the laser deflection angle, and converted into radar point cloud data in spatial three-dimensional coordinates; the scan field of the camera 6 is switched by the MEMS micro-galvanometer 7 through the beam-expanding collimation system, and the two-dimensional image information acquired by the camera 6 is stitched to obtain the two-dimensional image information of the whole target field of view.
Specifically, the receiving optical system 8 is a wide-field optical system coated with an antireflection film for the specific laser wavelength to ensure accurate reception of the echo signal; the photoelectric detection system 9 detects the echo signal with high efficiency and, after amplification, feeds it back to the data processing and output system 1 and the laser emission time measurement system 3. The system auxiliary control and communication system 2 controls the laser diode 4 to emit light continuously while the emission angle through the scanning system is changed continuously, until the whole target field of view has been traversed and the scan acquisition is complete. The data processing and output system 1 processes the laser flight time fed back by the laser timing circuit and the deflection angle currently fed back by the MEMS micro-galvanometer 7, converts them into the distance of a target object in the target environment, and then into a spatial three-dimensional coordinate with the acquisition device as the origin.
Specifically, the image acquisition processing system can set parameters such as the acquisition frequency and image quality of the camera 6 to control it to acquire two-dimensional image information of the target field of view; field-of-view switching is performed by the MEMS micro-galvanometer 7 through the beam-expanding collimation system to obtain images of different field ranges at different moments, and the image acquisition processing system stitches the required image information to obtain two-dimensional image information of a large target field of view.
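A minimal sketch of that stitching step, assuming each sub-field image arrives with a known row/column offset inside the full-field canvas (real stitching would also register the sub-fields and blend their overlaps):

```python
def stitch(tiles, canvas_h, canvas_w, fill=0):
    """tiles: iterable of (row_off, col_off, image), image a list of pixel rows.
    Later tiles simply overwrite earlier ones where they overlap; this is a
    placeholder policy, not the patent's (unspecified) splicing algorithm."""
    canvas = [[fill] * canvas_w for _ in range(canvas_h)]
    for row_off, col_off, img in tiles:
        for r, row in enumerate(img):
            for c, value in enumerate(row):
                canvas[row_off + r][col_off + c] = value
    return canvas
```

The offsets would come from the known mirror deflection at the moment each sub-field image was captured.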
The system auxiliary control and communication system 2 mainly implements functions such as working-mode switching, working-state monitoring, power supply control, and signal transmission control. According to the requirements of the actual working mode, the laser diode 4 and the camera 6 are selectively switched to acquire information from different fields of view of the target environment.
In the above embodiment, preferably, the data processing and output system 1 compares and analyzes the radar point cloud data and the two-dimensional image information using a data fusion comparison algorithm, then matches and outputs them for the target field of view. This increases the richness of the target environment information and improves the reliability of target environment perception.
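One common way to match point cloud data with image pixels is to project each 3-D point into the camera frame. The pinhole model below is a sketch with illustrative intrinsics, assuming the camera and laser share an origin; the patent does not specify the calibration or the fusion algorithm itself:

```python
def project_point(pt, fx, fy, cx, cy):
    """Pinhole projection of a 3-D point (device frame) into pixel
    coordinates; fx/fy are focal lengths in pixels, (cx, cy) is the
    principal point."""
    x, y, z = pt
    if z <= 0:
        return None  # a point behind the camera has no valid pixel
    return (fx * x / z + cx, fy * y / z + cy)
```

A point at (1, 0, 2) with fx = fy = 100 and principal point (320, 240) lands at pixel (370, 240); its range can then be attached to that pixel to enrich the image.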
The above is only a preferred embodiment of the present invention and is not intended to limit it; various modifications and changes will occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within its protection scope.

Claims (10)

1. A multi-field-of-view target environment information acquisition device based on a fusion sensor, characterized by comprising: a data processing and output system, a system auxiliary control and communication system, a laser emission time measurement system, a laser diode, an image control acquisition and processing system, a camera, a MEMS micro-galvanometer, a receiving optical system, and a photoelectric detection system;
the data processing and output system is connected, through the system auxiliary control and communication system, with the laser emission time measurement system and the image control acquisition and processing system respectively; the laser emission time measurement system is connected with the laser diode, and the image control acquisition and processing system is connected with the camera;
the mirror surface of the MEMS micro-galvanometer lies within the angular range of both the output optical path of the laser diode and the receiving optical path of the camera, and the MEMS micro-galvanometer is connected with the data processing and output system;
the signal receiving range of the receiving optical system covers the echo signal of the laser emitted by the laser diode; the receiving optical system is connected with the photoelectric detection system, and the photoelectric detection system is connected with the data processing and output system;
and the data processing and output system calculates matched environment point cloud data and two-dimensional image information of the target field of view from the two-dimensional image information of the target field of view acquired by the camera, the laser echo signal received by the receiving optical system, and the deflection angle applied by the MEMS micro-galvanometer to the laser.
2. The fusion-sensor-based multi-field-of-view target environment information acquisition device according to claim 1, wherein at least two groups of laser diodes with different wavelengths are adopted, and the laser emission time measurement system drives the laser diodes of different wavelengths according to preset conditions so as to emit laser at different wavelengths.
3. The fusion-sensor-based multi-field-of-view target environment information acquisition device according to claim 2, wherein two groups of laser diodes with wavelengths of 1550 nm and 905 nm are adopted;
the 1550 nm laser diode serves as a high-power emitter and, after small-angle incidence and reflective scanning by the MEMS micro-galvanometer, forms a far-field narrow-field scanning area;
the 905 nm laser diode serves as a low-power emitter and, after large-angle incidence and reflective scanning by the MEMS micro-galvanometer, forms a near-field wide-field scanning area.
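The wavelength/power pairing of claim 3 amounts to a simple mode switch. In the sketch below, the scan half-angles are illustrative placeholders, since the claim gives no numeric angles:

```python
def select_emitter(mode):
    """Pick the diode and scan geometry for the requested regime:
    'far'  -> 1550 nm high power, small mirror angle (narrow field);
    else   -> 905 nm low power, large mirror angle (wide field).
    Half-angle values are placeholders, not taken from the patent."""
    if mode == "far":
        return {"wavelength_nm": 1550, "power": "high", "half_angle_deg": 5.0}
    return {"wavelength_nm": 905, "power": "low", "half_angle_deg": 25.0}
```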
4. The fusion-sensor-based multi-field-of-view target environment information acquisition device according to any one of claims 1 to 3, further comprising a triangular prism through which the laser emitted by the laser diode reaches the MEMS micro-galvanometer.
5. The fusion-sensor-based multi-field-of-view target environment information acquisition device according to claim 3, wherein the receiving optical system is coated with antireflection films for the 1550 nm and 905 nm wavelengths.
6. A multi-field-of-view target environment information acquisition method based on a fusion sensor, applied to the fusion-sensor-based multi-field-of-view target environment information acquisition device according to claim 1, characterized by comprising the following steps:
sending a light-emitting signal to the laser diode through the laser emission time measurement system, so as to control the laser diode to emit laser and to trigger a synchronization signal to the MEMS micro-galvanometer;
the laser enters the target environment through the MEMS micro-galvanometer for scanning and is reflected by the target to the receiving optical system, which amplifies the echo signal through the photoelectric detection system and feeds it back to the data processing and output system;
the data processing and output system collects the laser deflection angle fed back by the MEMS micro-galvanometer, calculates the target distance in combination with the echo signal, and converts the current target distance into a spatial coordinate point;
controlling the camera through the image acquisition and processing system to acquire a two-dimensional image of the target field of view scanned by the MEMS micro-galvanometer;
changing the deflection angle of the MEMS micro-galvanometer so that the emitted laser scanning area and the two-dimensional image scanning area traverse the entire target field of view;
and fusing and matching the radar point cloud data, obtained by converting the spatial coordinate points of the target environment, with the two-dimensional image information, and outputting the result.
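The scan loop in the steps above can be sketched as follows; `measure_tof` and `capture_tile` are hypothetical stand-ins for the hardware calls, not functions named in the patent:

```python
C = 299_792_458.0  # speed of light in m/s

def scan_field(angles, measure_tof, capture_tile):
    """Drive the mirror through its angle schedule; at each pose take
    a time-of-flight sample and an image tile for later fusion."""
    points, tiles = [], []
    for theta_x, theta_y in angles:
        tof = measure_tof(theta_x, theta_y)   # echo round-trip time
        r = C * tof / 2.0                     # range to the target
        points.append((r, theta_x, theta_y))  # range plus mirror pose
        tiles.append(capture_tile(theta_x, theta_y))
    return points, tiles
```

With a stub that always reports a 5 m target, two mirror poses yield two range samples of 5 m each, ready to be converted to Cartesian points and fused with the tiles.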
7. The fusion-sensor-based multi-field-of-view target environment information acquisition method according to claim 6, wherein laser diodes with wavelengths of 1550 nm and 905 nm are driven respectively during laser emission;
the 1550 nm laser diode serves as a high-power emitter and, after small-angle incidence and reflective scanning by the MEMS micro-galvanometer, forms a far-field narrow-field scanning area;
the 905 nm laser diode serves as a low-power emitter and, after large-angle incidence and reflective scanning by the MEMS micro-galvanometer, forms a near-field wide-field scanning area.
8. The fusion-sensor-based multi-field-of-view target environment information acquisition method according to claim 6, wherein the MEMS micro-galvanometer comprises a micro-galvanometer lens and a micro-galvanometer driving circuit, and the driving circuit loads sine or sawtooth signals of preset amplitude and frequency onto the X axis and the Y axis of the micro-galvanometer lens respectively;
and the micro-galvanometer driving circuit sends the loading signal applied to the micro-galvanometer lens to the data processing and output system so as to feed back the deflection angle of the MEMS micro-galvanometer.
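One sample of the claim-8 drive waveforms could be generated as below: a sine on the fast X axis and a sawtooth ramp on the slow Y axis, giving a raster-like pattern. The amplitudes and frequencies are caller-chosen placeholders, as the claim only says they are preset:

```python
import math

def mirror_drive(t, fx_hz, amp_x, fy_hz, amp_y):
    """Return the (x, y) mirror drive values at time t: a sine of
    frequency fx_hz on X and a sawtooth of frequency fy_hz on Y."""
    x = amp_x * math.sin(2.0 * math.pi * fx_hz * t)
    phase = (t * fy_hz) % 1.0        # 0..1 ramp within each slow period
    y = amp_y * (2.0 * phase - 1.0)  # map the ramp to -amp_y..+amp_y
    return x, y
```

At t = 0 both waveforms start at their initial values: the sine at zero deflection, the sawtooth at the bottom of its ramp.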
9. The fusion-sensor-based multi-field-of-view target environment information acquisition method according to claim 6, wherein the target distance within the target field of view is calculated from the synchronization signal, the echo signal, and the laser deflection angle, and is converted into radar point cloud data in spatial three-dimensional coordinates;
and the scanning field of view of the camera is switched through the beam-expanding collimation system by the MEMS micro-galvanometer, and the two-dimensional images acquired by the camera are stitched to obtain two-dimensional image information of the entire target field of view.
10. The fusion-sensor-based multi-field-of-view target environment information acquisition method according to claim 6, wherein the radar point cloud data and the two-dimensional image information are compared and analyzed using a data fusion comparison algorithm, and the radar point cloud data and the two-dimensional image information of the target field of view are matched and output.
CN202010049656.8A 2020-01-16 2020-01-16 Multi-view-field target environment information acquisition device and method based on fusion sensor Pending CN111239760A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010049656.8A CN111239760A (en) 2020-01-16 2020-01-16 Multi-view-field target environment information acquisition device and method based on fusion sensor


Publications (1)

Publication Number Publication Date
CN111239760A true CN111239760A (en) 2020-06-05

Family

ID=70868714

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010049656.8A Pending CN111239760A (en) 2020-01-16 2020-01-16 Multi-view-field target environment information acquisition device and method based on fusion sensor

Country Status (1)

Country Link
CN (1) CN111239760A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014207965A1 (en) * 2014-04-28 2015-10-29 Robert Bosch Gmbh Device for object recognition
CN207249108U (en) * 2017-07-07 2018-04-17 岭纬公司 The integrated scanning device of multi-wavelength laser radar
CN107219533A (en) * 2017-08-04 2017-09-29 清华大学 Laser radar point cloud and image co-registration formula detection system
CN209803333U (en) * 2018-11-21 2019-12-17 北京万集科技股份有限公司 Three-dimensional laser radar device and system
CN110488247A (en) * 2019-08-20 2019-11-22 中国科学院苏州纳米技术与纳米仿生研究所 A kind of two dimension MEMS scanning galvanometer laser radar system
CN110673160A (en) * 2019-10-29 2020-01-10 北科天绘(合肥)激光技术有限公司 Data fusion processing method, laser camera and corresponding intelligent vehicle or unmanned aerial vehicle

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184564A (en) * 2020-08-21 2021-01-05 湖北三江航天红峰控制有限公司 Three-dimensional hole compensation method based on half-edge sorting method
CN112130160A (en) * 2020-09-25 2020-12-25 重庆盛泰光电有限公司 Ultra-wideband ToF sensor
CN112130160B (en) * 2020-09-25 2023-08-25 盛泰光电科技股份有限公司 Ultra-wideband TOF sensor
WO2022188279A1 (en) * 2021-03-11 2022-09-15 深圳市速腾聚创科技有限公司 Detection method and apparatus, and laser radar


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200605