WO2021258246A1 - Radar system, movable device and radar detection method - Google Patents

Radar system, movable device and radar detection method

Info

Publication number
WO2021258246A1
Authority
WO
WIPO (PCT)
Prior art keywords
wavelength
optical signal
detection
radar system
point cloud
Prior art date
Application number
PCT/CN2020/097435
Other languages
English (en)
French (fr)
Inventor
程藻 (Cheng Zao)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN202080102277.9A (published as CN115702364A)
Priority to EP20942163.5A (published as EP4166988A4)
Priority to PCT/CN2020/097435 (published as WO2021258246A1)
Publication of WO2021258246A1

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4817: Constructional features relating to scanning
    • G01S7/483: Details of pulse systems
    • G01S7/484: Transmitters
    • G01S7/486: Receivers
    • G01S7/4861: Circuits for detection, sampling, integration or read-out
    • G01S7/4863: Detector arrays, e.g. charge-transfer gates
    • G01S7/497: Means for monitoring or calibrating

Definitions

  • This application relates to the field of radar, and in particular to a radar system, a movable device and a radar detection method.
  • As autonomous driving systems develop, the requirements on their sensors keep increasing.
  • In particular, ever higher requirements are placed on the resolution, point cloud count, and detection range of the lidar.
  • The existing technology generally uses a single-wavelength lidar to detect targets: the lidar transmits a single-wavelength laser signal, receives its echo signal, and analyzes both to obtain the target's point cloud data, from which the distance between the target and the lidar and the target type are determined.
  • However, the single-wavelength laser signal is limited by detection range, time of flight (TOF), and environmental adaptability, so the unit receiving field of view of a single-wavelength lidar can contain only one echo signal.
  • How to improve lidar detection performance, and thereby reduce the safety risk of the automated driving process, has become a technical problem that urgently needs to be solved in this field.
  • The embodiments of the present application provide a radar system, a movable device, and a radar detection method, which are used to improve lidar detection performance and thereby reduce the resulting safety risk in the automated driving process.
  • In a first aspect, an embodiment of the present application provides a radar system in which a laser module generates a multi-wavelength optical signal, and a two-dimensional scanner performs two-dimensional scanning with that signal and receives the echo signals within its receiving field of view.
  • An echo signal is the reflected signal formed when a scanned object is irradiated by the multi-wavelength optical signal. A wavelength division module splits the echo signal into multiple single-wavelength optical signals, which a detection module converts into an electrical signal for each wavelength; the first point cloud data is then obtained from the electrical signals corresponding to the wavelengths.
  • In this way, a unit receiving field of view of the two-dimensional scanner contains multiple echo signals, in contrast to the single-wavelength lidar of the prior art.
  • This application can therefore break through the TOF limit that a single-wavelength optical signal places on the number of point cloud points, as well as its limitation on environmental adaptability, effectively increasing the number of points in the unit receiving field of view.
  • The larger point count also increases the effective information carried in the point cloud data, which helps improve the detection performance of the lidar. The radar system provided by the present application can thus effectively improve lidar detection performance, which helps reduce the resulting safety risk in the automated driving process.
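The point-count argument can be sketched numerically. The function name and the field-of-view figures below are illustrative assumptions, not values from the application:

```python
# Illustrative sketch: with N wavelengths, each unit receiving field of view
# can hold one echo per wavelength, so an upper bound on the point count per
# frame grows linearly with the number of wavelengths.

def points_per_frame(fov_units: int, num_wavelengths: int) -> int:
    """Upper bound: one echo per wavelength in each unit receiving field of view."""
    return fov_units * num_wavelengths

# Hypothetical scanner with a 120 x 30 grid of unit receiving fields of view:
single = points_per_frame(120 * 30, 1)   # single-wavelength lidar
multi = points_per_frame(120 * 30, 4)    # four-wavelength system
assert multi == 4 * single
```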
  • In one embodiment, the multi-wavelength optical signal includes a first-wavelength optical signal and a second-wavelength optical signal of different wavelengths. When the transmission parameters of the two signals differ, the parameters of their minimum receiving fields of view also differ. The parameters of the minimum receiving field of view include one or more of its position, size, or number; the transmission parameters include one or more of the divergence angle, emitting position, emitting time, emitting angle, position of the receiving field of view, size of the receiving field of view, and flight time.
  • the laser module includes: one or more lasers.
  • When the laser module includes one laser, that laser is a tunable laser; in this case, the multi-wavelength optical signal comprises multiple single-wavelength optical signals.
  • When the laser module includes multiple lasers, they include a tunable laser and/or single-wavelength lasers, where any two single-wavelength lasers generate optical signals of different wavelengths.
  • the multi-wavelength optical signal includes: one optical signal including multiple wavelengths, or multiple single-wavelength optical signals.
  • The wavelength division module is specifically configured to split the echo signal into multiple single-wavelength optical signals, any two of which have different wavelengths.
  • The wavelength division module is further used to perform splitting or condensing processing on the multi-wavelength optical signal generated by the laser module and provide the result to the two-dimensional scanner for two-dimensional scanning.
  • the wavelength division module includes one or more of a beam splitter, an optical fiber, a lens, a prism, a mirror, or a diffractive device.
  • The detection module includes one or more detectors. When the detection module includes one detector, that detector is a multi-wavelength detector used to receive and process single-wavelength optical signals of multiple wavelengths. When the detection module includes multiple detectors, they include a multi-wavelength detector and/or single-wavelength detectors, where any single-wavelength detector receives and processes the single-wavelength optical signal of one wavelength.
  • the radar system is an off-axis optical system or a coaxial optical system.
  • the processor is specifically configured to directly generate the first point cloud data according to the electrical signals corresponding to each of the multiple wavelengths.
  • The processor is specifically configured to: generate second point cloud data according to the electrical signals corresponding to the multiple wavelengths; and compensate the second point cloud data to obtain the first point cloud data.
  • The processor is further configured to: extract the noise parameter corresponding to each wavelength from the electrical signals corresponding to the multiple wavelengths; and determine the environmental quality parameter according to the noise parameters corresponding to the multiple wavelengths.
  • The noise parameter includes a backscattered noise parameter. In this case, the processor is specifically configured to: process the backscattered noise parameters corresponding to the multiple wavelengths with a backscatter function to obtain a scattering coefficient; process the electrical signals and the backscattered noise parameters corresponding to the multiple wavelengths with an atmospheric absorption function to obtain an absorption coefficient; and determine the environmental quality parameter from the scattering coefficient and the absorption coefficient.
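The application does not spell out the backscatter and atmospheric absorption functions; the sketch below is one minimal reading in Python, using a single-scattering noise model and Beer-Lambert two-way attenuation. The function forms, constants, and thresholds are all illustrative assumptions:

```python
import math

def scattering_coefficient(backscatter_noise: float, range_m: float,
                           system_const: float = 1.0) -> float:
    # Single-scattering model: backscattered noise power ~ C * beta / R^2,
    # so beta can be recovered from the noise level at a known range.
    return backscatter_noise * range_m ** 2 / system_const

def absorption_coefficient(p_tx: float, p_rx: float,
                           backscatter_noise: float, range_m: float) -> float:
    # Beer-Lambert two-way attenuation: P_rx ~ P_tx * exp(-2 * alpha * R),
    # after removing the backscattered-noise floor from the received power.
    effective = max(p_rx - backscatter_noise, 1e-12)
    return math.log(p_tx / effective) / (2 * range_m)

def environment_quality(beta: float, alpha: float) -> str:
    # Placeholder thresholds mapping (beta, alpha) to a coarse weather label.
    if beta > 1e-3 and alpha > 1e-3:
        return "dense fog"
    if beta > 1e-4:
        return "light fog / haze"
    return "clear"
```

Comparing these per-wavelength coefficients is what would let a processor infer coarse environmental quality labels of the kind listed in the claims.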
  • the environmental quality parameter includes one or more of fog type, severe weather level, particle concentration, humidity, or particle size distribution.
  • The processor is specifically configured to: obtain, from preset compensation formulas, a target compensation formula that matches the environmental quality parameter; and compensate the second point cloud data with the target compensation formula to obtain the first point cloud data.
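A minimal sketch of that lookup-and-compensate step, in Python. The table keys and the gain and bias values are placeholders invented for illustration; the application leaves the concrete compensation formulas unspecified:

```python
# Preset compensation formulas, keyed by environmental quality label.
# Placeholder values: (intensity gain, range bias in metres).
COMPENSATION = {
    "clear":     (1.0, 0.0),
    "light fog": (1.4, -0.05),
    "dense fog": (2.2, -0.20),
}

def compensate(second_point_cloud, environment: str):
    """Apply the target compensation formula matching `environment` to the
    raw (second) point cloud, yielding the compensated (first) point cloud."""
    gain, bias = COMPENSATION.get(environment, (1.0, 0.0))
    return [
        {"range_m": p["range_m"] + bias, "intensity": p["intensity"] * gain}
        for p in second_point_cloud
    ]

raw = [{"range_m": 50.0, "intensity": 0.30}]
first = compensate(raw, "dense fog")
assert first[0]["intensity"] == 0.30 * 2.2
```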
  • the processor is further configured to: obtain a single-wavelength detection result corresponding to each wavelength according to the first point cloud data; and according to the single-wavelength detection result of multiple wavelengths To determine the radar detection result.
  • The processor is specifically configured to: determine the first distance corresponding to each wavelength from the transmit and receive times of the optical signal of that wavelength in the first point cloud data, as the single-wavelength detection result; and determine a second distance according to the first distances corresponding to the multiple wavelengths, where the second distance characterizes the distance between the radar system and the detection target.
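The application does not fix how the second distance is derived from the per-wavelength first distances; the Python sketch below uses a median as one illustrative fusion rule that rejects a single outlier wavelength:

```python
from statistics import median

C = 299_792_458.0  # speed of light in vacuum, m/s

def first_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """One-way distance for one wavelength from its round-trip time of flight."""
    return C * (t_receive_s - t_transmit_s) / 2

def second_distance(first_distances):
    """Fuse per-wavelength first distances into a single target distance.
    Median is an illustrative choice; averaging or weighting would also fit."""
    return median(first_distances)

# Round trip to a target 100 m away takes 2 * 100 / C seconds:
assert abs(first_distance(0.0, 2 * 100.0 / C) - 100.0) < 1e-9
```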
  • The processor is specifically configured to: determine the target reflectivity corresponding to each wavelength from the echo intensity of the optical signal of that wavelength in the first point cloud data, as the single-wavelength detection result; and determine the type of the detected target according to the target reflectivities corresponding to the multiple wavelengths.
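One way to read the reflectivity-based classification (our illustration; the signature table and the nearest-signature rule are assumptions, not taken from the application): different materials reflect differently across wavelengths, so the vector of per-wavelength reflectivities can be matched against known spectral signatures.

```python
# Placeholder spectral signatures: reflectivity at two wavelengths
# (e.g. 905 nm and 1550 nm); the values are invented for illustration.
SIGNATURES = {
    "vegetation": (0.55, 0.30),
    "asphalt":    (0.10, 0.12),
    "metal":      (0.80, 0.75),
}

def classify_target(reflectivities):
    """Return the signature whose reflectivity vector is nearest (squared
    Euclidean distance) to the measured per-wavelength reflectivities."""
    def sq_dist(sig):
        return sum((a - b) ** 2 for a, b in zip(reflectivities, sig))
    return min(SIGNATURES, key=lambda name: sq_dist(SIGNATURES[name]))

assert classify_target((0.50, 0.28)) == "vegetation"
```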
  • The processor is further configured to: determine detection parameters of the radar system, the detection parameters including the transmission parameters of the multi-wavelength optical signal and the configuration parameters of the radar system; and detect the target according to the detection parameters.
  • the processor is further configured to: receive first information, where the first information is used to indicate the transmission parameter and/or the configuration parameter of the radar system.
  • An embodiment of the present application provides a movable device, including the radar system according to any one of the embodiments of the first aspect and a controller, wherein the controller is coupled to the radar system and can use the first point cloud data to control the movable device.
  • The controller can also be used to obtain the single-wavelength detection result corresponding to each wavelength based on the first point cloud data, and to determine the radar detection result according to the single-wavelength detection results of multiple wavelengths.
  • In this solution, the radar system can emit a multi-wavelength optical signal and obtain the first point cloud data based on the echo signals, with a unit receiving field of view of the two-dimensional scanner containing multiple echo signals.
  • This application can therefore break through the TOF limit that a single-wavelength optical signal places on the number of point cloud points, as well as its limitation on environmental adaptability, effectively increasing the number of points in the unit receiving field of view.
  • The larger point count also increases the effective information carried in the point cloud data, which helps improve lidar detection performance.
  • The radar system provided by the present application can thus effectively improve lidar detection performance, which helps reduce the resulting safety risk in the automated driving process.
  • The controller is specifically configured to send a first message to the radar system, where the first message is used to indicate the transmission parameters and/or the configuration parameters of the radar system.
  • The movable device further includes a sensor and/or a communication module; the controller is further configured to obtain environmental parameters through the sensor and/or the communication module, and to determine, according to the environmental parameters, the emission parameters and/or the configuration parameters of the laser signal of each wavelength.
  • The sensor includes one or more of a millimeter-wave radar, an image acquisition device, a global positioning system receiver, an inertial measurement unit, and a human-computer interaction interface.
  • The controller is further configured to: receive second information from the radar system, where the second information carries the first point cloud data; obtain the single-wavelength detection result corresponding to each wavelength according to the first point cloud data; and determine the radar detection result according to the single-wavelength detection results of multiple wavelengths.
  • The controller is specifically configured to: determine the first distance corresponding to each wavelength from the transmit and receive times of the optical signal of that wavelength in the first point cloud data, as the single-wavelength detection result; and determine the second distance according to the first distances corresponding to multiple wavelengths, where the second distance characterizes the distance between the radar system and the detection target.
  • The controller is specifically configured to: determine the target reflectivity corresponding to each wavelength from the echo intensity of the optical signal of that wavelength in the first point cloud data, as the single-wavelength detection result; and determine the type of the detected target according to the target reflectivities corresponding to multiple wavelengths.
  • the movable device includes: a vehicle, an unmanned aerial vehicle, or a ground robot.
  • an embodiment of the present application provides a radar detection method, which can be applied to the radar system shown in any one of the embodiments of the first aspect, the second aspect, or the third aspect.
  • The radar system can generate a multi-wavelength optical signal, perform two-dimensional scanning with it, and receive the echo signals in the receiving field of view of the two-dimensional scanner.
  • The echo signals are split by wavelength and photoelectrically converted to obtain the electrical signal corresponding to each wavelength, and the first point cloud data is then obtained from the electrical signals corresponding to the wavelengths.
  • In this way, a unit receiving field of view of the two-dimensional scanner contains multiple echo signals, whereas with a single-wavelength lidar the unit receiving field of view can contain only one echo signal.
  • This application can break through the TOF limit that a single-wavelength optical signal places on the number of point cloud points, as well as its limitation on environmental adaptability, effectively increasing the number of points in the unit receiving field of view and thereby obtaining more effective information, which helps improve lidar detection performance.
  • the radar system provided by the present application can effectively improve the detection performance of the lidar, which is beneficial to reduce the safety risks of the automatic driving process caused thereby.
  • The acquiring the first point cloud data according to the electrical signal corresponding to each wavelength includes: generating second point cloud data according to the electrical signals corresponding to the multiple wavelengths; and compensating the second point cloud data with the environmental quality parameter to obtain the first point cloud data.
  • the method further includes: extracting noise parameters corresponding to each wavelength from the electrical signals corresponding to each of the multiple wavelengths; and determining the environment according to the noise parameters corresponding to the multiple wavelengths Quality parameters.
  • The noise parameter includes a backscattered noise parameter. In this case, the method includes: using a backscatter function to process the backscattered noise parameters corresponding to the multiple wavelengths to obtain a scattering coefficient; using an atmospheric absorption function to process the electrical signals and the backscattered noise parameters corresponding to the multiple wavelengths to obtain an absorption coefficient; and determining the environmental quality parameter from the scattering coefficient and the absorption coefficient.
  • The compensating the second point cloud data with the environmental quality parameter to obtain the first point cloud data includes: obtaining, from preset compensation formulas, a target compensation formula that matches the environmental quality parameter; and compensating the second point cloud data with the target compensation formula to obtain the first point cloud data.
  • the method further includes: obtaining a single-wavelength detection result corresponding to each wavelength according to the first point cloud data; and determining a radar based on the single-wavelength detection result of multiple wavelengths Detection results.
  • The obtaining a single-wavelength detection result corresponding to each wavelength according to the first point cloud data includes: determining the first distance corresponding to each wavelength, as the single-wavelength detection result, from the transmit and receive times of the optical signal of that wavelength in the first point cloud data. The determining the radar detection result according to the single-wavelength detection results of multiple wavelengths includes: determining the second distance according to the first distances corresponding to the multiple wavelengths, where the second distance characterizes the distance between the radar system and the detection target.
  • The obtaining a single-wavelength detection result corresponding to each wavelength according to the first point cloud data includes: determining the target reflectivity corresponding to each wavelength, as the single-wavelength detection result, from the echo intensity of the optical signal of that wavelength in the first point cloud data. The determining the radar detection result according to the single-wavelength detection results of multiple wavelengths includes: determining the type of the detection target according to the target reflectivities corresponding to the multiple wavelengths.
  • The method further includes: determining detection parameters of the radar system, the detection parameters including the transmission parameters of the multi-wavelength optical signal and the configuration parameters of the radar system; and detecting the target according to the detection parameters.
  • the method further includes: receiving first information, where the first information is used to indicate the transmission parameter and/or the configuration parameter of the radar system.
  • an embodiment of the present application provides a radar detection method, which can be applied to a controller in a movable device as shown in any embodiment of the third aspect.
  • The controller can receive second information from the radar system, where the second information carries the first point cloud data, so that the controller can control the movable device to move based on the first point cloud data.
  • an embodiment of the present application provides a radar detection method, which can be applied to a controller in a movable device shown in any one of the embodiments of the third aspect.
  • The controller can receive second information from the radar system, where the second information carries the first point cloud data, so that the controller can obtain the single-wavelength detection result corresponding to each wavelength based on the first point cloud data and then determine the radar detection result from the single-wavelength detection results of multiple wavelengths. Through this solution, the movable device can obtain multiple single-wavelength detection results based on the point cloud data provided by the radar system and comprehensively consider them to obtain the final radar detection result.
  • The radar detection result is comprehensively obtained from the detection results of optical signals of multiple wavelengths. Light of different wavelengths adapts differently to the environment and to targets, so more comprehensive information can be obtained. This also overcomes the problems that single-wavelength light is restricted by the environment and targets to narrow application scenarios and that its detection accuracy is unstable, which helps improve the accuracy of the radar detection result and reduce the resulting safety risk in the automated driving process.
  • The acquiring a single-wavelength detection result corresponding to each wavelength according to the first point cloud data includes: using the first point cloud data to determine the first distance corresponding to each wavelength as the single-wavelength detection result. The determining the radar detection result according to the single-wavelength detection results of multiple wavelengths includes: determining the second distance according to the first distances corresponding to the multiple wavelengths, where the second distance characterizes the distance between the radar system and the detection target.
  • The acquiring a single-wavelength detection result corresponding to each wavelength according to the first point cloud data includes: using the first point cloud data to determine the target reflectivity corresponding to each wavelength as the single-wavelength detection result. The determining the radar detection result according to the single-wavelength detection results of multiple wavelengths includes: determining the type of the detection target according to the target reflectivities corresponding to multiple wavelengths.
  • The method further includes: sending a first message to the radar system, where the first message is used to indicate the transmission parameters and/or the configuration parameters of the radar system.
  • The movable device further includes a sensor and/or a communication module; the method further includes: obtaining environmental parameters through the sensor and/or the communication module; and determining the emission parameters and/or the configuration parameters of the laser signal of each wavelength according to the environmental parameters.
  • The embodiments of the present application provide a computer-readable storage medium in which a computer program is stored.
  • When the computer program runs on a computer, the computer executes the method described in any one of the embodiments of the fourth, fifth, or sixth aspect.
  • This application provides a computer program which, when executed by a computer, performs the method described in any one of the embodiments of the fourth, fifth, or sixth aspect.
  • The computer program in the eighth aspect may be stored wholly or partly on a storage medium packaged with the processor, or wholly or partly on a memory not packaged with the processor.
  • In summary, the embodiments of the present application provide a radar system, a movable device, and a radar detection method that perform two-dimensional scanning by emitting a multi-wavelength optical signal, so that a unit receiving field of view of the two-dimensional scanner can contain multiple echo signals.
  • This breaks through the limitation that detection distance and TOF place on the number of echo signals in the receiving field of view. The first point cloud data obtained from the multi-wavelength optical signal can carry the effective information of the optical signal of each wavelength, which helps break through the environmental-adaptability limitation of single-wavelength optical signals, effectively increases the effective information in the point cloud data, improves the detection accuracy and performance of the lidar, and helps reduce the resulting safety risk in the automated driving process.
  • FIG. 1 is a schematic diagram of a radar detection scenario provided by an embodiment of the application
  • FIG. 2 is a schematic diagram of another radar detection scenario provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of another radar detection scenario provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram of the limitation of the number of point clouds of a single-wavelength optical signal by the time of flight in the prior art
  • FIG. 5 is a schematic diagram of optical signals of different wavelength bands provided by an embodiment of the application.
  • FIG. 6 is a schematic diagram of the architecture of a radar system provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of a wavelength division module of a radar system according to an embodiment of the application.
  • FIG. 8 is a schematic diagram of a wavelength division module of another radar system according to an embodiment of the application.
  • FIG. 9 is a schematic diagram of a wavelength division module of another radar system according to an embodiment of the application.
  • FIG. 10 is a schematic diagram of a wavelength division module of another radar system according to an embodiment of the application.
  • FIG. 11 is a schematic diagram of a receiving field of view of a two-dimensional scanner in an embodiment of the application.
  • FIG. 12 is a schematic diagram of a point cloud obtained by two-dimensional scanning performed by a radar system in an embodiment of the application.
  • FIG. 13 is a schematic diagram of the relationship between the transmission field of view and the reception field of view of a multi-wavelength optical signal in an embodiment of the application;
  • FIG. 14 is a schematic diagram of a processor obtaining radar detection results (radar detection method) in an embodiment of the application;
  • FIG. 15 is a schematic diagram of the target recognition effect when the single-wavelength optical signal and the multi-wavelength optical signal are used to respectively recognize the same target.
  • Movable devices are devices with the ability to move.
  • the movable device may include a body and a controller, and the controller is used to control the movable device.
  • the controller can be installed (or referred to as: integrated or mounted) on the body, or the controller can also be set separately from the body of the movable device.
  • the movable equipment may include, but is not limited to: vehicles, unmanned aerial vehicles, ground robots, and so on.
  • the controller may be the main controller of the vehicle (or one or more processing units in the main controller), and the main controller is used to control the vehicle.
  • the main controller can be used to control vehicle driving (including automatic driving), to control playback on the on-board multimedia player, and to control on-board equipment (for example, cameras, lights, positioning systems, etc.).
  • the controller may also be a terminal or other remote controller that is communicatively connected with the main controller of the vehicle.
  • the communication connection method will be explained in detail later.
  • the terminals involved in the embodiments of the present application may include, but are not limited to: mobile phones, tablet computers, notebook computers, palmtop computers, mobile internet devices (mobile internet devices, MIDs), etc., which are not exhaustive.
  • the controller may be a remote controller of the drone (or one or more processing units in the remote controller), and the remote controller may remotely control the drone to perform flight tasks ;
  • the controller may be a processor mounted in the airframe of the drone.
  • the controller may be a remote control server of the ground robot (or one or more processing units in the remote control server), and the remote control server may be used to control one or more ground robots to perform ground tasks; alternatively, the controller may be a processor installed in the body of the ground robot.
  • Movable equipment can use the radar system to detect the nearby environment or targets.
  • the radar system can be mounted (also called: integrated, installed, or set) on the body of the movable device; in this way, radar detection can be performed while the movable device is moving.
  • the controller in the mobile device can be communicatively connected with the radar system.
  • the communication connection mode may include: wired connection and/or wireless connection.
  • wireless connection methods may include, but are not limited to: wireless fidelity (Wi-Fi) connection, Bluetooth connection, near field communication (NFC) connection, in-vehicle network connection, etc., which are not exhaustive.
  • FIGS. 1 to 3 are given in the embodiments of the present application, which exemplify the radar detection scenarios applied in the embodiments of the present application.
  • Fig. 1 shows a radar detection scenario.
  • the movable device 100 is a vehicle, and the vehicle can run on the road.
  • the vehicle specifically includes: a body (also called: vehicle body or fuselage) 110, a radar system 120 and a controller 130.
  • the radar system 120 and the controller 130 are both mounted on the body 110, and the radar system 120 is used to detect targets near the vehicle 100, for example, a vehicle traveling in front of the vehicle 100.
  • the controller 130 communicates with the radar system 120 and is used to control the driving of the vehicle 100.
  • the controller 130 is communicatively connected to the radar system 120 and can receive radar detection results from the radar system 120 (the specific content will be detailed later); the controller 130 can then control the vehicle 100 with reference to the radar detection results.
  • Fig. 2 shows another radar detection scenario.
  • the movable device 100 is a vehicle, and a radar system 120 is mounted on the body 110 of the vehicle for detecting targets near the vehicle.
  • the controller 130 of the vehicle is a mobile phone, that is, the mobile phone is used to control the vehicle.
  • the radar system 120 is also in communication connection with a mobile phone (controller 130), and the mobile phone (controller 130) can receive radar detection results from the radar system 120 and control the vehicle (the mobile device 100) accordingly.
  • FIG. 3 shows another radar detection scenario.
  • the mobile device 100 is a drone, and the drone can travel in the air and perform flying tasks.
  • a radar system 120 is mounted on the fuselage 110 of the unmanned aerial vehicle for target detection in the vicinity of the unmanned aerial vehicle.
  • the controller 130 of the drone (the mobile device 100) is a remote control for controlling the drone to perform flight tasks.
  • the remote controller (controller 130) is in communication connection with the radar system 120, and the remote controller can receive radar detection results from the radar system 120 and control the drone (the mobile device 100) accordingly.
  • the controller may be controlled by the user, or the controller 130 may also control the movable device by itself.
  • the controller 130 may be controlled by the driver and execute corresponding control functions according to the obtained driver's operating information, and the radar system 120 may perform target detection tasks in this scenario. That is, this application can be applied to the scenario where the driver drives the vehicle.
  • the controller 130 can also control the vehicle to travel by itself.
  • the controller 130 controls the vehicle to drive automatically (or called unmanned driving), and the radar system 120 can also perform target detection tasks. That is, the present application can also be applied to unmanned driving scenarios of vehicles.
  • the embodiments of the present application do not depend on whether the movable device is in motion.
  • the present application can be applied to target detection in a moving scene of a movable device, and can also be applied to target detection in a stationary scene of the movable device.
  • For example, when the vehicle is stationary, the radar system can still be used for target detection; likewise, when the UAV hovers stationary in mid-air, the radar system can still be used to detect nearby targets.
  • the movable device may also be equipped with other sensors or other radar systems, which are not particularly limited in the embodiment of the present application.
  • the mobile device may also be equipped with, but not limited to, one or more of a speed sensor, a millimeter wave radar, and a global positioning system (Global Positioning System, GPS) receiver.
  • the radar system 120 may specifically be a lidar system, which may also be called: lidar, optical radar, or light detection and ranging (LiDAR), that is, a radar system that emits laser beams to detect targets.
  • Its principle is to emit a laser beam and receive the feedback signal of the laser beam (also called: signal echo or echo signal); the echo signal is then analyzed to achieve target detection.
  • Lidar is currently used in surveying and mapping, archaeology, geography, geomorphology, seismology, forestry, remote sensing, and atmospheric physics.
  • Lidar is one of the most important sensors for mobile devices to realize autonomous driving.
  • the radar detection result is directly related to the driving safety of the mobile device.
  • the detection performance of lidar has become the focus of research in this field.
  • the lidar carried in a movable device (for example, a vehicle-mounted lidar) is generally a single-wavelength lidar.
  • the so-called single-wavelength lidar refers to a lidar system that uses a single-wavelength laser signal to achieve target detection.
  • the single-wavelength laser signal may be one laser signal or a laser beam composed of multiple single-wavelength laser signals, which means that the wavelength of each laser signal in the laser beam is the same, and the wavelength of the laser beam is single.
  • Single-point scanning is also known as one-dimensional or 1D scanning; two-dimensional scanning is also known as 2D scanning.
  • When a single-wavelength lidar is used for two-dimensional scanning, scanning is generally achieved by emitting a laser beam from a single point. In this case, the radar system can obtain three-dimensional point cloud data, so the radar system can also be called a three-dimensional radar system.
  • the single-wavelength lidar has good ranging performance and a compact size, which can meet the detection requirements of current vehicle-mounted radars.
  • However, the single-wavelength lidar has limited laser emission and reception channels and low parallelism, which results in a limited number of point clouds in the unit receiving field of view. Therefore, at present, the detection performance of a single-wavelength lidar can only be improved, to a limited extent, by trading off resolution against frame rate.
  • the detection performance of the lidar can be characterized by at least one of the number of point clouds per unit time and space, and the resolution. The higher the number of point clouds per unit time and space, the better the detection performance of lidar; the higher the resolution per unit time and space, the better the detection performance of lidar.
  • the restrictive factors of detection performance may include, but are not limited to, one or more of: laser capability, cooperation between the laser and the scanner, the time of flight (TOF) required for ranging, the laser repetition frequency, the number of point clouds generated by a single measurement, the number of pulses required for a single measurement, scanning speed, scanning range, and frame rate.
  • the number of point clouds of the lidar in unit time and space can satisfy the following relationship:
  • Number of point clouds ≤ (laser repetition frequency ÷ number of pulses required for a single measurement) × number of lasers × number of point clouds generated by a single measurement
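As a rough illustration, the upper bound above can be computed directly (all numeric values below are hypothetical examples, not figures from this application):

```python
# Upper bound on the number of point clouds per unit time; all inputs are
# hypothetical example figures, not values from this application.
def max_point_clouds(rep_rate_hz, pulses_per_measurement, num_lasers,
                     points_per_measurement):
    """(laser repetition frequency / pulses per measurement) x lasers x points."""
    measurements_per_second = rep_rate_hz / pulses_per_measurement
    return measurements_per_second * num_lasers * points_per_measurement

# e.g. a 100 kHz laser, 1 pulse and 1 point per measurement, 1 laser
print(max_point_clouds(100_000, 1, 1, 1))  # 100000.0
```

The sketch makes the trade-off visible: requiring more pulses per measurement divides the bound, while adding lasers or detector pixels multiplies it.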
  • the number of point clouds generated by a single measurement is generally determined by the number of pixels of the detector; however, increasing the number of detector pixels affects the power allocated to each pixel and hence the detection range. That is, other conditions being equal, the more detector pixels the lidar has, the shorter its detection range.
  • TOF time is the maximum time required for an optical pulse signal (or called: laser pulse signal, laser signal, or optical signal) to go back and forth (flight) between the radar and the target.
  • each Rx has a different position in its respective unit TOF; that is, the time lengths L1 (between Tx1 and Rx1), L2 (between Tx2 and Rx2), and L3 (between Tx3 and Rx3) differ from one another.
  • the lidar system only emits the optical pulse signal once and receives the optical pulse signal only once within a unit TOF time.
  • the lidar can transmit at most one single-wavelength optical pulse signal and receive at most one echo signal of that optical pulse signal within a unit TOF time. It can be understood that within a unit TOF there may be no optical pulse signal transmitted and/or no echo signal received; this is not described further here.
  • Once the maximum detection distance is set, the unit TOF time is determined accordingly. Therefore, the highest number of point clouds that the lidar can generate in a unit time (which may include one or more unit TOF times) is the product of the number of unit TOF times contained in the unit time and the number of point clouds generated by a single measurement.
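A minimal sketch of this TOF-based bound, assuming only that the unit TOF is the round-trip flight time to the farthest target (the 300 m range below is a hypothetical example):

```python
C = 299_792_458.0  # speed of light in m/s

def unit_tof_seconds(max_range_m):
    """Round-trip (there-and-back) flight time for the farthest target."""
    return 2.0 * max_range_m / C

def max_points_per_second(max_range_m, points_per_measurement):
    """Number of unit-TOF slots in one second times the point clouds
    generated by a single measurement (single-wavelength case)."""
    return (1.0 / unit_tof_seconds(max_range_m)) * points_per_measurement

tof = unit_tof_seconds(300.0)  # hypothetical 300 m maximum range
print(round(tof * 1e6, 2))     # 2.0 (microseconds)
```

At a 300 m range the unit TOF is about 2 µs, so a single-wavelength system is capped at roughly half a million single-pixel measurements per second, which is exactly the limit the multi-wavelength design aims to break.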
  • light pulse signals of different wavelengths have different adaptability and applicability to the environment.
  • optical pulse signals of different wavelengths not only affect the environmental adaptability, anti-noise ability, and anti-interference ability of the lidar, but also affect the selection of devices in the lidar, chip materials and processes, universal applicability, manufacturing cost, convenience, and so on.
  • FIG. 5 shows schematic diagrams of optical signals in different wavelength bands.
  • the wavelength bands of the laser signal that the lidar can emit may include: ultraviolet light (not shown in Figure 5), visible light, and infrared (IR) light, such as near-infrared (NIR) light or short-wave infrared (SWIR) light.
  • the infrared band can be further subdivided, including: the near-infrared band (marked as 51 in Fig. 5), the short-wave infrared band (marked as 52 in Fig. 5), the mid-infrared band, and the far-infrared band.
  • the ultraviolet band is about 10-400nm
  • the visible band is about 390-750nm
  • the near-infrared band is about 700-2500nm
  • the mid-infrared band is about 2.5-25 ⁇ m
  • the far-infrared band is about 25-500μm.
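For illustration, the approximate band edges listed above can be put into a small lookup (band names and edges follow the approximate figures above; the helper itself is hypothetical):

```python
# Approximate band edges in nanometres, per the figures above; the bands
# overlap slightly at their edges, so a wavelength may match more than one.
BANDS = [
    ("ultraviolet", 10, 400),
    ("visible", 390, 750),
    ("near-infrared", 700, 2_500),
    ("mid-infrared", 2_500, 25_000),
    ("far-infrared", 25_000, 500_000),
]

def bands_of(wavelength_nm):
    """Return every band whose approximate range contains the wavelength."""
    return [name for name, lo, hi in BANDS if lo <= wavelength_nm <= hi]

print(bands_of(905))   # ['near-infrared']
print(bands_of(1550))  # ['near-infrared']
```

Note that the common lidar wavelengths mentioned in this application (905 nm, 940 nm, 1064 nm, 1310 nm, 1550 nm) all fall in the near-infrared range under this rough classification.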
  • a laser signal of one wavelength can be emitted to the outside, and the wavelength of the laser signal is within the wavelength range of the optical signal shown in FIG. 5.
  • the wavelengths of the optical signals shown in FIG. 5 are different, which also results in different environmental adaptability of laser signals of different wavelengths. Specifically, any two laser signals with different wavelengths are different in one or more of atmospheric permeability, particle scattering and absorption, ambient light noise, and target reflectivity. In other words, there is no single-wavelength laser signal that can meet excellent performance in all aspects, and no single-wavelength laser signal can adapt to various environments.
  • Because laser signals of different wavelengths adapt to the environment differently, when a single-wavelength laser signal is used for target detection, the detection performance of the lidar is inevitably limited by that signal's environmental adaptability. In other words, the environmental adaptability of a single-wavelength lidar is poor, resulting in unstable detection performance.
  • the lidar 1 realizes target detection by emitting a light pulse signal of wavelength 1 to the outside.
  • the optical pulse signal of wavelength 1 adapts well to rain and snow, and can obtain higher-accuracy radar detection results in a rain or snow environment; however, the optical pulse signal of wavelength 1 adapts poorly to a sandstorm environment, where the accuracy of the radar detection results is low.
  • the vehicle is equipped with the lidar 1 and, based on it (in combination with other sensors, etc., not described further), realizes automatic driving. When the vehicle drives in a rain or snow environment, the lidar can obtain good radar detection results, which helps the vehicle avoid obstacles in time or adopt other driving strategies, and can reduce, to a certain extent, the safety risk of the vehicle's automatic driving process.
  • However, when the vehicle drives in a sandstorm environment, the accuracy of the radar detection results obtained by the lidar is low, and obstacles are very likely not to be accurately identified, which may cause the vehicle to collide with an obstacle, rear-end another vehicle, or suffer other safety accidents, endangering life and property.
  • In summary, the single-wavelength lidar is limited by detection distance, TOF time, and the environmental adaptability of the laser signal.
  • A single-wavelength lidar can obtain only one echo signal of the optical pulse signal per unit TOF time, and this limit cannot be broken; the TOF time thus limits the number of point clouds and the resolution.
  • Moreover, the link transmission characteristics, target reflection characteristics, and ambient-light noise characteristics of a single-wavelength optical pulse signal are fixed. This also results in a small number of point clouds for a single-wavelength lidar, which in turn means the point cloud carries less effective information; this greatly degrades the detection performance of the lidar.
  • the embodiments of the present application provide a radar system and a radar detection method thereof.
  • the radar system can be applied to any of the foregoing radar detection scenarios. Exemplarily, in any of the scenarios shown in FIG. 1 to FIG. 3, the radar system provided in the embodiment of the present application may be used to implement the corresponding detection function.
  • FIG. 6 shows a schematic diagram of the system architecture of a radar system.
  • the radar system 600 includes: a laser module 610, a two-dimensional scanner 620, a wavelength division module 630, a detection module 640, and a processor 650; among them, the laser module 610 is used to generate a multi-wavelength optical signal.
  • the two-dimensional scanner 620 is used to perform two-dimensional scanning with the multi-wavelength optical signal and to receive echo signals in the receiving field of view of the two-dimensional scanner 620; an echo signal is the reflection signal formed after a scanned object is irradiated by the multi-wavelength optical signal, and the unit receiving field of view of the two-dimensional scanner includes multiple echo signals;
  • the wavelength division module 630 is used to perform light-splitting processing on the echo signals to obtain multiple single-wavelength optical signals;
  • the detection module 640 is used to convert the multiple single-wavelength optical signals into electrical signals corresponding to the multiple wavelengths;
  • the processor 650 is used to obtain first point cloud data based on the electrical signals corresponding to the multiple wavelengths.
  • the laser module 610 can emit multi-wavelength optical signals, so that when the two-dimensional scanner 620 performs two-dimensional scanning, the unit FOV of the two-dimensional scanner includes multiple echo signals.
  • Specifically, when the multi-wavelength optical signal includes optical signals of N wavelengths (also called: laser signals, laser pulse signals, or optical pulse signals), shown in Figure 6 as λ1, λ2...λN, where N is an integer greater than 1, the unit FOV of the two-dimensional scanner includes N echo signals.
  • N echo signals respectively correspond to optical signals of N wavelengths in a one-to-one correspondence.
  • the wavelengths of the N echo signals are also ⁇ 1, ⁇ 2... ⁇ N, which correspond one-to-one with the optical signals of the N wavelengths output by the laser module.
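The one-to-one correspondence described above can be sketched as follows (the wavelength values and the `echo@...` labels are purely illustrative, not part of the application):

```python
# With N emitted wavelengths, each unit receiving FOV contains N echo
# signals, one per wavelength; names and values are purely illustrative.
wavelengths_nm = [905, 940, 1064, 1310, 1550]  # N = 5

def echoes_in_unit_fov(emitted_wavelengths):
    """One echo per emitted wavelength, keyed by that wavelength."""
    return {wl: f"echo@{wl}nm" for wl in emitted_wavelengths}

echoes = echoes_in_unit_fov(wavelengths_nm)
print(len(echoes))  # 5, versus 1 echo for a single-wavelength lidar
```

In other words, with N wavelengths the point-cloud count per unit receiving FOV is multiplied by N relative to the single-wavelength case, without shortening the unit TOF time.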
  • this application can break through the TOF-time limit on the number of point clouds, and can also break through the environmental-adaptability limitation of a single-wavelength optical signal, effectively increasing the number of point clouds in the unit receiving field of view.
  • the increase in the number of point clouds also increases the effective information carried in the point cloud data, which is conducive to improving the detection performance of the lidar.
  • the radar system provided by the present application can effectively improve the detection performance of the lidar, which is beneficial to reduce the safety risks of the automatic driving process caused thereby. This technical effect will be explained in detail later.
  • the radar system is now further explained.
  • the radar system provided by the embodiments of the present application is actually a laser radar system, and the laser module in it is first described in detail.
  • the laser module is used to generate and emit multi-wavelength optical signals, and the multi-wavelength optical signals can be composed of multiple single-wavelength optical signals. It should be noted that the multiple single-wavelength optical signals included in the multi-wavelength optical signal can be output at the same time or in a time-sharing manner, which will be described in detail later. Based on this, the laser module can also be called: a multi-wavelength laser source (Multi-wavelength Laser Source).
  • the laser module may include one or more lasers.
  • the laser may be a single-wavelength laser or a tunable laser.
  • a single-wavelength laser is used to generate a single-wavelength optical pulse signal; the wavelengths of the optical pulse signals generated by any two single-wavelength lasers are different.
  • the laser module may include five single-wavelength lasers, which generate and emit single-wavelength optical pulse signals of 905 nm, 940 nm, 1064 nm, 1310 nm, and 1550 nm, respectively.
  • Tunable lasers can be used to generate optical pulse signals of multiple wavelengths.
  • a tunable laser can be tuned to generate and emit single-wavelength optical pulse signals of 905 nm, 940 nm, 1064 nm, 1310 nm, and 1550 nm, respectively.
  • the laser may be a tunable laser.
  • the multi-wavelength optical signal includes a plurality of single-wavelength optical signals.
  • the generation time of each single-wavelength optical signal generated by the tunable laser is different, that is, the tunable laser can sequentially generate multiple single-wavelength optical signals, and each single-wavelength optical signal can be emitted in sequence.
  • the laser module may also include multiple lasers.
  • the laser module may include: a tunable laser and/or a single-wavelength laser.
  • the multi-wavelength optical signal includes: one optical signal including multiple wavelengths, or multiple single-wavelength optical signals.
  • For example, the laser module may include N single-wavelength lasers, so that the laser module can generate and output single-wavelength optical signals of N wavelengths; alternatively, the laser module may include N-1 single-wavelength lasers and one tunable laser, in which case the laser module can generate and output at least N-1 single-wavelength optical signals.
  • the multiple lasers can generate and emit multiple single-wavelength optical signals simultaneously, or generate and output them in a time-sharing manner.
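The two emission schedules just described (simultaneous versus time-shared output) can be sketched as follows (the function names and wavelength values are hypothetical illustrations):

```python
# Two hypothetical emission schedules for a multi-laser (or tunable) module:
# each inner list is one emission slot.
def simultaneous_schedule(wavelengths):
    """All single-wavelength signals emitted in the same slot."""
    return [list(wavelengths)]

def time_shared_schedule(wavelengths):
    """One single-wavelength signal per slot, emitted in sequence
    (e.g. a tunable laser stepping through its wavelengths)."""
    return [[wl] for wl in wavelengths]

print(simultaneous_schedule([905, 1064, 1550]))  # [[905, 1064, 1550]]
print(time_shared_schedule([905, 1064, 1550]))   # [[905], [1064], [1550]]
```

Either schedule yields multi-wavelength echoes; they differ only in whether the wavelengths share one emission slot or occupy successive slots.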
  • the embodiments of the present application have no special restrictions on the wavelength values of the multi-wavelength optical signals that can be generated and emitted by the laser module.
  • the foregoing wavelengths are only illustrative; in actual scenarios, the laser module can use any lasers that meet the requirements of their respective wavelength bands.
  • the wavelength of the multi-wavelength optical signal may be determined based on one or more aspects of atmospheric permeability, weather adaptability, water absorption capacity, target reflectivity, sunlight or human eye safety .
  • Table 1 shows the effects of the aforementioned multiple influencing factors on optical signals of different wavelengths.
  • the impact of the environment on the radar detection process is mainly reflected in two aspects: different weather affects the attenuation of the laser signal in the atmosphere, and different weather also affects the increase or decrease of the target reflectivity.
  • In different weather, the attenuation of the laser signal in the atmosphere differs; for example, rain, snow, sand, dust, haze and other weather also cause the target reflectivity to increase or decrease to different degrees.
  • Atmospheric attenuation is determined by the extinction coefficient, and the extinction coefficient is related to the absorption coefficient and the scattering coefficient.
  • the absorption of the light signal by the atmosphere is mainly due to the energy of the light signal being absorbed by molecules such as water and carbon dioxide (CO2) and converted into thermal energy and/or chemical energy, etc., which is mainly manifested as the attenuation of light intensity.
  • the scattering of the laser signal by the atmosphere is the process by which a part of the light deviates from the original propagation direction when the optical signal passes through an inhomogeneous medium. It is mainly caused by suspended particles or macromolecules.
  • When multi-wavelength optical signals are used for target detection and optical signals of different wavelengths are used to detect the same target (also called: detection target), differences in target reflectivity also cause the echo signals to differ.
  • the processor can perform differential processing based on the differences between the echo signals of the optical signals of different wavelengths. In this way, the probability of recognizing the target's material, type, and so on can be enhanced, thereby providing richer and more accurate data support for perception fusion and decision-making in automatic driving.
  • Human eye safety can be characterized by an eye-safety threshold, and the eye-safety thresholds of optical signals of different wavelengths differ. The higher the eye-safety threshold of an optical signal, the higher the allowed output optical power, which is more conducive to improving the detection performance of the radar system, helps reduce the number of transmission channels, and places lower requirements on the receiving end and the scanning end of the two-dimensional scanner.
  • Generally, the longer the wavelength of the optical signal, the safer it is for the human eye.
  • When the wavelength is longer, the optical signal does not form a clear image on the retina of the human eye; in addition, the water-absorption characteristic of these wavebands is strong, so the water in the human eye absorbs part of the energy, which greatly reduces the energy reaching the retina and protects eye safety.
  • the human eye safety threshold may be 1400 nm or 1550 nm; that is, optical signals with a wavelength exceeding 1400 nm or 1550 nm can be considered to lie in a wavelength band that is safe for the human eye. For example, when the wavelength of the optical signal is greater than 1400 nm, the laser is allowed to emit higher power per unit time and unit space. Such high-power lasers can significantly improve the system's ranging performance (for example, distance measurement beyond 300 m), or spread the energy of a single beam over multiple detector pixels to improve the system resolution.
  • In this way, the dynamic detection range of the radar system can also be improved; used together with lasers covering the full dynamic range, the system can even achieve target detection over the full dynamic range.
  • For example, an existing lidar has a dynamic range of only 60-100 dB: when its power is low, the farthest measurement distance suffers; when its power is high, it saturates, causing a decrease in ranging accuracy. The radar system provided by the embodiment of the present application can avoid this problem and even meet the target detection requirement of the full dynamic range.
  • the laser needs to meet the preset requirements for generating optical pulse signals of multiple wavelengths.
  • the embodiment of the present application has no particular limitation on the structure type of the laser.
  • the laser in the laser module may include, but is not limited to: one or more of solid-state lasers, fiber lasers, semiconductor lasers, gas lasers, or dye lasers.
  • For the types of lasers applicable to optical pulse signals of different wavelengths, refer to Table 2 below; this is not further described here.
  • the detection module may include one or more detectors, which are used to perform photoelectric conversion processing on single-wavelength optical signals of different wavelengths.
  • the detector may be a single-wavelength detector or a multi-wavelength detector.
  • a single-wavelength detector is used to receive and process the single-wavelength optical signal of one wavelength.
  • the single-wavelength detector is used for photoelectric conversion processing of a single-wavelength optical pulse signal.
  • In the detection module 640, there are at least two single-wavelength detectors that are applicable to optical pulse signals of different wavelengths.
  • the detection module 640 may include five single-wavelength detectors, and each single-wavelength detector is used to perform photoelectric conversion processing on single-wavelength optical pulse signals of 905 nm, 940 nm, 1064 nm, 1310 nm, and 1550 nm, respectively.
  • the multi-wavelength detector is applicable to optical pulse signals of multiple wavelengths; that is, the multi-wavelength detector is used to receive and process single-wavelength optical signals of multiple wavelengths.
  • the multi-wavelength detector can realize photoelectric conversion processing of light pulse signals of multiple wavelengths.
  • multi-wavelength detectors can be used for photoelectric conversion processing of single-wavelength optical pulse signals of 905 nm, 940 nm, 1064 nm, 1310 nm, and 1550 nm.
  • the detector may be a multi-wavelength detector.
  • the multiple detectors may include, but are not limited to: multi-wavelength detectors and/or single-wavelength detectors.
  • Take a detection module including N detectors as an example.
  • For example, the detection module may include N single-wavelength detectors, so that the N echo signals with wavelengths λ1, λ2...λN are processed by the N detectors respectively; alternatively, the detection module may include N-1 single-wavelength detectors and one multi-wavelength detector, in which case the detection module can perform photoelectric conversion processing on at least N-1 single-wavelength optical signals.
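A toy sketch of routing echo wavelengths to detectors in such a mixed detection module (all names, sets, and wavelength ranges below are hypothetical):

```python
# Hypothetical routing: dedicated single-wavelength detectors first, with a
# multi-wavelength detector covering the remaining wavelength range.
def route_to_detector(wavelength_nm, single_wl_detectors, multi_wl_ranges):
    if wavelength_nm in single_wl_detectors:
        return f"single@{wavelength_nm}nm"
    for lo, hi in multi_wl_ranges:
        if lo <= wavelength_nm <= hi:
            return "multi"
    raise ValueError(f"no detector covers {wavelength_nm} nm")

# N-1 single-wavelength detectors plus one multi-wavelength detector
singles = {905, 940, 1064, 1310}
multi_ranges = [(1500, 1600)]
print(route_to_detector(1550, singles, multi_ranges))  # multi
```

The point of the sketch is only that every echo wavelength must land on exactly one detector capable of converting it, whichever mix of single- and multi-wavelength detectors the module uses.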
  • the detectors in the detection module may include, but are not limited to: indium gallium arsenide (InGaAs) detectors and silicon (Si) detectors; more specifically, the detectors may include, but are not limited to: avalanche photodiode (APD) detectors, positive-intrinsic-negative (PIN) detectors, single-photon avalanche diode (SPADs) detectors, silicon photomultiplier (SiPM) detectors, etc.
  • Table 2 specifically shows the types of lasers and detectors applicable to optical signals of different wavelengths.
  • GaAs means gallium arsenide
  • InP means indium phosphide
  • Er-Glass means erbium-doped glass
  • Er-doped Fiber means erbium-doped fiber
  • Nd:YAG means neodymium-doped yttrium aluminum garnet crystal;
  • Nd:YLF means neodymium-doped yttrium lithium fluoride, and SiGe stands for silicon germanium.
  • the composition or doping of these materials determines the wavelength of the laser.
  • the type of material determines the type of laser.
  • the lasers usable in the radar system provided in this application may include, but are not limited to: semiconductor lasers, solid-state lasers, etc. It can be understood that Table 2 is exemplary, and actual scenarios may include, but are not limited to, the situations shown in Table 2.
  • solid-state lasers may also include, but are not limited to: diode-pumped (semiconductor-pumped) solid-state lasers (DPSSL), lamp-pumped solid-state lasers, fiber lasers, etc., which are not exhaustive.
  • an optical signal of 850-940nm is suitable for semiconductor lasers
  • an optical signal of 1064nm is suitable for solid-state lasers or fiber lasers
  • an optical signal of 1035nm is suitable for semiconductor lasers or solid-state lasers
  • an optical signal of 1550nm is suitable for fiber lasers.
  • the optical signal of 850 ⁇ 940nm is suitable for APD detector or SPADs detector
  • the optical signal of 1064nm is suitable for APD detector
  • the optical signal of 1035nm is suitable for SPADs detector
  • the optical signal of 1550 nm is suitable for APD detectors or SPADs detectors.
  • the APD detectors and SPADs detectors suitable for optical signals of 850 to 940 nm can be silicon material detectors, and the other detectors can be indium gallium arsenide material detectors.
  • the radar system provided in the embodiment of the present application can deploy the aforementioned lasers of at least two wavelengths, and correspondingly deploy detectors of these two wavelengths.
  • the radar system can use different lasers to generate optical signals of corresponding wavelengths to realize radar detection. Therefore, the characteristics of each laser and detector, such as laser beam quality, detector sensitivity, and noise, can be exploited in radar detection.
  • the cost and volume of the system can also be optimized. How to select lasers and detectors to optimize the radar system is not discussed here; the optimization effect can be achieved through reasonable selection and layout.
  • the wavelength division module is used to perform optical splitting processing on the echo signal to obtain multiple single-wavelength optical signals.
  • the wavelength division module 630 may be disposed between the two-dimensional scanner 620 and the detection module 640.
  • the wavelength division module 630 may be used to perform optical splitting processing on the echo signal at the same time to obtain multiple single-wavelength optical signals, and any two single-wavelength optical signals have different wavelengths.
  • this application uses multiple optical pulse signals of different wavelengths to detect the target. Even if multiple optical pulse signals of different wavelengths have the same emission time and the same detection distance, each wavelength's echo signal reaches its own detector, so the optical pulse signal and echo signal of each wavelength can be accurately matched. Then, even if the wavelength division module processes all the echo signals at the same time, there will be no misjudgment. This also removes the prior-art limitation on the number of point clouds per unit TOF time, which will be described in detail later.
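The pulse/echo matching described above can be sketched as a dictionary lookup keyed on wavelength. This is a minimal sketch under the assumption (stated in the text) that the wavelength division module routes each wavelength to its own detector; the record fields are invented for illustration:

```python
def pair_pulses_with_echoes(pulses, echoes):
    """Pair each emitted pulse with its echo purely by wavelength.

    Because each wavelength is split onto its own detector, the pairing
    stays unambiguous even when emission times and receive times coincide.
    Returns {wavelength_nm: time_of_flight_seconds}.
    """
    echo_by_wl = {e["wavelength_nm"]: e["t_recv"] for e in echoes}
    tof = {}
    for p in pulses:
        wl = p["wavelength_nm"]
        if wl in echo_by_wl:
            tof[wl] = echo_by_wl[wl] - p["t_emit"]
    return tof
```

Even with identical emission and receive instants for both wavelengths, each pulse is matched to the correct echo, which is the property the text relies on.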
  • the wavelength division module can also be arranged between the laser module and the two-dimensional scanner.
  • the wavelength division module can also be used to: perform splitting processing or condensing processing on the multi-wavelength optical signal, and provide the result to the two-dimensional scanner for two-dimensional scanning.
  • the laser module can output N optical pulse signals of different wavelengths through multiple lasers, and these optical pulse signals can be condensed by the wavelength division module to form one beam of optical signals, which is emitted through the two-dimensional scanner.
  • the laser module can simultaneously output N optical pulse signals of different wavelengths through multiple lasers. These optical pulse signals are mixed into a beam of light and emitted. In this case, this beam of light can be split by the wavelength division module. After processing, multiple single-wavelength optical signals are formed and emitted through a two-dimensional scanner. This will be described later in conjunction with FIG. 7.
  • the wavelength division module may include, but is not limited to: one or more of a beam splitter, an optical fiber, a lens, a prism, a mirror, or a diffractive device.
  • the wavelength division module may include only one beam splitter.
  • the wavelength division module 630 may specifically be a beam splitter.
  • the wavelength division module may also be composed of one or more optical elements other than the beam splitter.
  • the solid arrow represents the multi-wavelength optical signal (that is, the emitted light)
  • the dashed arrow represents the echo signal (that is, the received light)
  • the thick black line represents the reflector
  • the ellipse represents the lens module.
  • the lens module can be composed of at least one of a lens, a prism, or a mirror; there are no particular restrictions on its configuration.
  • FIG. 7 shows a schematic diagram of a wavelength division module of a radar system provided by an embodiment of the present application.
  • the emitted light from the laser module 610 is reflected by the reflector 71 and the reflector 72 and reaches the two-dimensional scanner 630.
  • the two-dimensional scanner 630 can rotate within a preset range, and the emitted light is reflected off the scanner 630 and exits the system.
  • received light of different wavelengths is also received in the receiving field of view of the two-dimensional scanner 630, reflected to the lens module 73, then reflected by the lens module 73 to the mirror 74, after which the received light can enter the detection module 640.
  • a spectroscope may be further connected after the reflector 74, and after further spectroscopic processing by the spectroscope, each single-wavelength optical signal can enter the detector of the corresponding wavelength.
  • the wavelength division module includes: a reflector 71, a reflector 72, a lens module 73, and a reflector 74, wherein the reflector 71 and the reflector 72 are used for splitting multi-wavelength optical signals Processing or focusing processing; and the lens module 73 and the reflector 74 (or a beam splitter may also be included) are used to perform beam splitting processing on the echo signal.
  • each optical element forms a coaxial optical path, in other words, the radar system is a coaxial optical system.
  • FIG. 8 shows a schematic diagram of a wavelength division module of another radar system provided by an embodiment of the present application.
  • the emitted light emitted by the laser module 610 is directly emitted through the two-dimensional scanner 630.
  • in a unit receiving field of view of the two-dimensional scanner 630, there can be multiple echo signals of different wavelengths, and these received light rays can be reflected by the mirror 81 or the mirror 82 to the mirror 83, and then reflected onward by the mirror 83.
  • the wavelength division module includes: a mirror 81 to a mirror 83, where the mirror 81 to a mirror 83 are used to perform spectral processing on the echo signal.
  • each optical element forms a coaxial optical path, in other words, the radar system is a coaxial optical system.
  • FIG. 9 shows a schematic diagram of a wavelength division module of another radar system provided by an embodiment of the present application.
  • the emitted light emitted by the laser module 610 is emitted through the reflecting mirror 91, the reflecting mirror 92, and the two-dimensional scanner 630.
  • in a unit receiving field of view of the two-dimensional scanner 630, there can be multiple echo signals of different wavelengths. These received light rays are reflected by the lens module 93 and reach the detector, or, after passing through the lens module 93, they are processed by a spectroscope and then enter the detector.
  • the wavelength division module includes: a mirror 91, a mirror 92, and a lens module 93, wherein the mirror 91 and the mirror 92 are used for splitting or condensing multi-wavelength optical signals. Processing; and the lens module 93 (or may also include a beam splitter), which is used to perform beam splitting processing on the echo signal.
  • each optical element forms a coaxial optical path, in other words, the radar system is a coaxial optical system.
  • FIG. 10 shows a schematic diagram of a wavelength division module of another radar system provided by an embodiment of the present application.
  • the emitted light emitted by the laser module 610 is directly emitted through the two-dimensional scanner 630.
  • in a unit receiving field of view of the two-dimensional scanner 630, there can be multiple echo signals of different wavelengths. These received light rays are reflected by the lens module 101 and reach the detector, or, after passing through the lens module 101, they are processed by a spectroscope and then enter the detector.
  • the wavelength division module includes: a lens module 101, which is specifically used to perform spectral processing on the echo signal.
  • each optical element forms an off-axis optical path, in other words, the system is an off-axis optical system.
  • FIGS. 7 to 10 are exemplary.
  • the embodiments of the present application have no special restrictions on the wavelength division elements used in the radar system, and there are no special restrictions on whether the optical path formed by each optical element is a coaxial optical path or an off-axis optical path.
  • the radar system is an off-axis optical system or a coaxial optical system.
  • the aforementioned laser module, two-dimensional scanner, and detection module that can output multi-wavelength optical signals can be used to implement this solution.
  • the two-dimensional scanner in the radar system is now specifically explained.
  • the radar system uses a two-dimensional scanner to scan, and the two-dimensional scanner has the ability to scan in a two-dimensional plane.
  • the two-dimensional scanner also has the capability of one-dimensional scanning (single point scanning), and the radar system provided in the embodiment of the present application can also be used to realize one-dimensional scanning. It will not be expanded here. The following describes this solution for a scene where a two-dimensional scanner performs two-dimensional scanning.
  • the two-dimensional scanner can be rotated within a predetermined range. During the movement of the two-dimensional scanner, it can emit light pulse signals and receive their echo signals.
  • the two-dimensional scanner can be rotated sequentially from left to right, and can emit light pulse signals for every unit angle of rotation, and receive the echo signals of the light pulse signals in the current receiving field of view. In this way, the two-dimensional scanner continuously rotates and emits light signals and receives echo signals, thereby realizing two-dimensional scanning.
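The rotate-emit-receive loop described above can be sketched as follows. The `measure` callback is a hypothetical stand-in for the emit/receive hardware, not an API from the patent:

```python
def two_dimensional_scan(azimuths_deg, elevations_deg, measure):
    """Sketch of the scan loop: at every unit angle, emit a pulse and
    record the echo. `measure(az, el)` stands in for the emit/receive
    hardware and returns a range in metres, or None when no echo returns.
    """
    point_cloud = []
    for el in elevations_deg:
        for az in azimuths_deg:  # rotate sequentially, e.g. left to right
            rng = measure(az, el)
            if rng is not None:
                point_cloud.append((az, el, rng))
    return point_cloud
```

Each accepted echo contributes one point (angle pair plus range), which is how continuous rotation plus per-angle emission builds up the two-dimensional scan.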
  • FIG. 11 specifically shows a schematic diagram of a receiving field of view (i.e., Receiver FOV) of a two-dimensional scanner.
  • the multi-wavelength optical signal can be emitted at the position shown as position 1 (denoted Tx laser), while position 2 (denoted Possible Echo 1) and position 3 (denoted Possible Echo 2) are possible receiving positions of the echo signal of the multi-wavelength optical signal.
  • after exiting the system, the divergence angle of the original laser signal will increase. Then, after the laser signal is reflected by the target, it will appear as reflected light with multiple angles and uncertain directions (an ideal target is generally a Lambertian reflector).
  • the receiving FOV needs to account for the beam divergence angle, the spot size at the target, and the possible incident angles of the target echo within the TOF time; in addition, the receiving FOV in both scanning directions needs to be greater than the divergence angle of the emitted beam.
  • the multi-wavelength optical signal includes optical signals of multiple wavelengths, the wavelengths of any two optical signals are different, and the emission parameters of the optical signals of different wavelengths can be different.
  • the emission parameters involved in the embodiments of the present application may include, but are not limited to, one or more of divergence angle, emission position, emission time, emission angle, receiving field of view position, receiving field of view size, and flight time.
  • the divergence angles of the optical signals of each wavelength in the multi-wavelength optical signal may be the same, or may be completely different, or may not be completely the same (there are at least two wavelengths of emission signals that have the same divergence angle).
  • the emission time of the optical signal of each wavelength in the multi-wavelength optical signal may be the same, not completely the same, or completely different.
  • the N optical signals can be sequentially generated and emitted by the laser module. In this way, the emission moments of the N optical signals are completely different.
  • the N optical signals contained in the multi-wavelength optical signal can be generated by N single-wavelength lasers in the laser module and emitted at the same time. In this way, the emission moments of the N optical signals can be exactly the same.
  • some of the optical signals contained in a multi-wavelength optical signal may be generated and emitted by multiple single-wavelength lasers at the same time, and some optical signals may be generated and emitted sequentially by a tunable laser. That is, the emission timings of the N optical signals are not completely the same.
  • the exit angles of the optical signals of each wavelength in the multi-wavelength optical signal are different.
  • the two-dimensional scanner can move continuously. Then, when the emission moments of the N optical signals are different, the emission angle of each optical signal when emitted through the two-dimensional scanner is also different, and the position of the receiving field of view is also different.
  • the receiving field of view of the N optical signals in the multi-wavelength optical signal may be different.
  • the flight time of optical signals of different wavelengths may be different, the same, or not exactly the same.
  • the multi-wavelength optical signal is emitted through the two-dimensional scanner of the radar system and receives the echo signal returned by the target.
  • the unit receiving FOV of the two-dimensional scanner may include multiple echoes.
  • the multi-wavelength optical signal includes optical signals of multiple wavelengths, and the parameters of the minimum receiving field of view of the optical signals of each wavelength may be the same, not completely the same, or completely different; among them, the parameters of the minimum receiving field of view include one or more of the position, size, or number of the field of view.
  • the minimum value of the unit receiving FOV of the two-dimensional scanner (that is, the minimum receiving field of view) is the product of the scanning speed of the two-dimensional scanner and the flight time.
  • the flight time of optical signals of different wavelengths may be different, and therefore, the size of the minimum receiving field of view of different optical signals may be different.
  • the emission time and the emission duration of the optical signals of different wavelengths may be different, and the positions and numbers of the minimum receiving field of view of the optical signals of different wavelengths may also be different.
  • a minimum receiving field of view of the optical signal can be regarded as a "pixel". It should be noted that the pixel is not equivalent to the unit receiving FOV of the two-dimensional scanner: the pixel is the minimum receiving FOV of the optical signal in the target detection process, whereas the unit receiving FOV belongs to the two-dimensional scanner, and one unit receiving FOV of the two-dimensional scanner may include one pixel or multiple pixels.
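The relation stated above (minimum receiving FOV = scanning speed x flight time) can be evaluated numerically. The scan speed and range values below are invented for illustration:

```python
C = 299_792_458.0  # speed of light, m/s

def min_receiving_fov_mrad(scan_speed_rad_s, max_range_m):
    """Minimum receiving FOV ("pixel") per the text: the product of the
    scanner's angular speed and the round-trip time of flight."""
    tof_s = 2.0 * max_range_m / C          # out-and-back flight time
    return scan_speed_rad_s * tof_s * 1e3  # radians -> milliradians
```

For instance, at an angular speed of 1000 rad/s and a 200 m maximum detection range, the round-trip TOF is about 1.33 us, so the minimum receiving FOV comes out to roughly 1.33 mrad. A longer flight time (farther target) therefore means a larger pixel, which is the coupling the following figures discuss.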
  • the multi-wavelength optical signal may include a first wavelength optical signal and a second wavelength optical signal with different wavelengths; then, when the emission parameters of the first wavelength optical signal and the second wavelength optical signal are different, the parameters of the minimum receiving field of view of the first wavelength optical signal and the second wavelength optical signal are different.
  • Fig. 12 shows a schematic diagram of a point cloud obtained by a two-dimensional scanning by a radar system.
  • 12A, 12B, and 12C in FIG. 12 respectively show point clouds of optical signals with wavelengths ⁇ 1, ⁇ 2, and ⁇ 3.
  • 12A is the schematic diagram of the point cloud obtained by the radar system through the optical signal 1 of wavelength ⁇ 1 for target detection
  • 12B is the schematic diagram of the point cloud obtained by the radar system through the optical signal 2 of the wavelength ⁇ 2 for target detection
  • 12C is the radar system through the wavelength Schematic diagram of the point cloud obtained by target detection for the optical signal 3 of ⁇ 3.
  • any rectangular area can be regarded as a pixel, or as the minimum scanning field of view of a two-dimensional scanner, and the azimuth angle corresponding to each pixel is the radar resolution of the radar system.
  • the size of the unit rectangular area (that is, the pixel) is related to the time of flight (TOF) of the optical signal.
  • TOF time of flight
  • the TOF times of optical signal 1 and optical signal 2 are the same, so the pixel size corresponding to optical signal 1 is the same as the pixel size corresponding to optical signal 2; while the TOF times of optical signal 1 and optical signal 3 are different, so the pixel size corresponding to optical signal 1 is different from the pixel size corresponding to optical signal 3.
  • the dots in FIG. 12 indicate possible receiving positions of the echo signal, that is, each dot indicates the receiving position of an echo signal within a unit pixel.
  • the receiving position of the echo signal within the pixel is related to the scanning trajectory and the distance of the target. For any single-wavelength optical signal, only one echo signal is allowed in a pixel. For example, in 12A, 12B, and 12C, any one pixel includes only one dot (echo signal). As mentioned above, this is why the point cloud number and resolution of the radar system are limited by the TOF time.
  • multi-wavelength light is used for radar detection.
  • 12D, 12E, and 12F in FIG. 12 show schematic diagrams of point clouds in a multi-wavelength scanning scenario.
  • 12D is a schematic diagram of the point cloud obtained by the radar system through the optical signal 1 and the optical signal 2 in the same receiving FOV for target detection; the pixels of the two are exactly the same.
  • any one pixel includes two dots, that is, includes two echo signals.
  • 12E is a schematic diagram of the point cloud obtained by the radar system through the optical signal 1 and the optical signal 2 in the same size but interlaced receiving FOV for target detection. As shown in 12E, although the pixels of the light signal 1 and the pixels of the light signal 2 are not completely overlapped, compared to 12A and 12B, the number of point clouds can be significantly increased in the unit receiving FOV.
  • And 12F is a schematic diagram of the point cloud obtained by the radar system through the optical signal 1, the optical signal 2 and the optical signal 3 in three different receiving FOVs for target detection.
  • the three optical signals can be transmitted and received at the same time or time-sharing.
  • the optical signals of each wavelength are transmitted through their respective optical paths and devices and, by wavelength division multiplexing, are received by the corresponding detectors without interfering with each other, which also greatly improves the point cloud number and resolution of the radar system.
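The point-count gain illustrated by 12D-12F can be sketched as a simple union of per-wavelength point clouds. The interleaved grid below mimics the staggered pixels of 12E and is invented for illustration:

```python
def merge_point_clouds(clouds):
    """Union of per-wavelength point clouds. With N wavelengths whose
    pixels interleave (as in 12E), the point count in the same receiving
    FOV grows up to N-fold relative to a single wavelength."""
    merged = set()
    for cloud in clouds:
        merged.update(cloud)
    return sorted(merged)

# Two single-wavelength clouds whose points interleave by half a pixel.
cloud_wl1 = [(x, 0.0) for x in (0.0, 1.0, 2.0)]
cloud_wl2 = [(x + 0.5, 0.0) for x in (0.0, 1.0, 2.0)]
merged = merge_point_clouds([cloud_wl1, cloud_wl2])
```

Here the merged cloud holds twice as many points over the same extent as either single-wavelength cloud, the doubling the text attributes to multi-wavelength detection.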
  • FIG. 13 shows a schematic diagram of the relationship between the transmission field of view and the reception field of view of a multi-wavelength optical signal.
  • FIG. 13 only takes the optical signal 1 with the wavelength ⁇ 1 and the optical signal 2 with the wavelength ⁇ 2 as examples for specific description.
  • FIG. 13 only exemplarily shows several possible situations, and the actual scene may include but is not limited to the situation shown in FIG. 13.
  • the dots in FIG. 13 are used to illustrate the emission positions of the optical signal 1 and the optical signal 2. The emission positions of the optical signal 1 and the optical signal 2 may be the same or different; FIG. 13 exemplarily indicates them with dots of two different sizes.
  • FIG. 13 specifically shows a schematic diagram of 10 receiving fields of view.
  • each receiving field of view may include the emission field of view of one or two optical signals.
  • in the receiving field of view 1 to the receiving field of view 3, the emission fields of view of the optical signal 1 and the optical signal 2 are different but their positions overlap; in the receiving field of view 4 to the receiving field of view 6, the emission field of view of the optical signal 1 is different from that of the optical signal 2, their positions do not intersect, and both emission fields of view are located in the corresponding receiving field of view; the receiving fields of view 7 and 9 include the emission field of view of the optical signal 1, the receiving fields of view 8 and 10 include the emission field of view of the optical signal 2, and the emission fields of view of the optical signal 1 and the optical signal 2 are completely separate.
  • the 10 receiving fields of view shown in FIG. 13 may be schematic diagrams of the receiving field of view as the radar system moves sequentially during the two-dimensional scanning process; in this case, the receiving field of view of the two-dimensional scanner is variable during two-dimensional scanning.
  • the radar system can also perform two-dimensional scanning according to a fixed pattern. For example, it can perform two-dimensional scanning according to receiving field of view 1 to receiving field of view 3, or scanning according to receiving field of view 4 to receiving field of view 6, or through The receiving field of view 7 to the receiving field of view 10 alternately emit light signals of different wavelengths to perform two-dimensional scanning.
  • the divergence angle of the laser of each wavelength, the FOV of the detector, the TOF time, etc. may be the same or different.
  • the emitted laser light of each laser can independently correspond to a receiving FOV of its own wavelength, or multiple wavelength lasers can correspond to the same receiving FOV.
  • the emission time and emission azimuth angle of each wavelength laser can be aligned in the time domain and the space domain, or the respective start time and start position can be adopted respectively.
  • the termination time and the maximum receiving azimuth position of each wavelength detector corresponding to the maximum measurement distance can be aligned in the time domain and the space domain, or can be set separately.
  • the processor is configured to obtain the first point cloud data according to the electrical signal corresponding to each wavelength.
  • the embodiments of the present application may provide at least two implementation manners.
  • the processor may directly generate the first point cloud data according to the electrical signals corresponding to each of the multiple wavelengths.
  • after the processor receives the electrical signals from the detection module, it performs calculations based on the respective emission parameters of these electrical signals to determine the point cloud position corresponding to each electrical signal, so that the first point cloud data can be obtained by summarizing the point cloud positions corresponding to all the electrical signals.
  • this embodiment is simple and easy to implement.
  • since the first point cloud data is obtained by using optical signals of multiple wavelengths for target detection, the number of point clouds in the first point cloud data can be multiplied, which not only breaks limitations on the point cloud number such as the TOF time, but also helps solve the problem of the adaptability of single-wavelength optical signals to the environment (this situation will be explained in detail later).
  • the processor can also calculate environmental quality parameters and use them to compensate the generated point cloud data (in this case, the generated point cloud data can be recorded as the second point cloud data) in order to obtain the (compensated) first point cloud data.
  • the processor may also be specifically configured to: generate second point cloud data according to the electrical signals corresponding to each of the multiple wavelengths, and use environmental quality parameters to compensate the second point cloud data to obtain the first point cloud data.
  • the environmental quality parameter is used to indicate the influence of the current environment on the radar detection result.
  • the environmental quality parameters involved in the embodiments of the present application may include, but are not limited to: one or more of fog type, severe weather level, particle concentration, humidity, or particle size distribution.
  • the processor may also be used to obtain environmental quality parameters.
  • the processor can directly obtain the data stored in the preset location, so that the environmental quality parameter can be obtained.
  • the environmental quality parameter may be calculated by other electronic equipment, such as the controller of the movable platform carried by the radar system, and pre-stored in the preset position.
  • the processor may also be specifically used to calculate and obtain environmental quality parameters.
  • the processor may calculate the environmental quality parameter based on the electrical signals corresponding to each of the multiple wavelengths.
  • the processor may extract the noise parameters corresponding to each wavelength from the electrical signals corresponding to each of the multiple wavelengths, and then determine the environmental quality parameters according to the noise parameters corresponding to the multiple wavelengths.
  • the noise parameters may include, but are not limited to: backscattered noise parameters.
  • the noise parameters may also include: atmospheric noise parameters, noise parameters of other radars, fog noise parameters, ambient light noise (for example, sunlight noise, etc.) parameters, etc.
  • the influence of the atmosphere on the signal is divided into two parts: scattering and absorption. Both scattering and absorption attenuate the signal and broaden the pulse. Signal attenuation weakens the target signal, which affects the ranging performance and the estimation of the target signal strength (reflectivity); pulse broadening causes measurement error in the echo arrival time, thereby affecting the ranging accuracy. Scattering disperses the laser light into reflected light in different directions; the backscattered portion enters the lidar system and adds backscattered noise to the echo signal, which degrades the signal-to-noise ratio of target signal detection.
  • the scattering coefficient and absorption coefficient can be obtained by inversion calculation of the echo signal, the noise signal can be filtered or suppressed, and the waveform and amplitude of the target signal can be corrected.
  • the scattering coefficient and absorption coefficient can be used to determine environmental quality parameters.
  • when the processor obtains the environmental quality parameters, it may use the backscatter function to process the backscattered noise parameters corresponding to multiple wavelengths to obtain the scattering coefficient; and use the atmospheric absorption function to process the electrical signals and backscattered noise parameters corresponding to the multiple wavelengths to obtain the absorption coefficient; the environmental quality parameter is then determined according to the scattering coefficient and the absorption coefficient.
  • the processor can receive the electrical signal 1 corresponding to optical signal 1 and the electrical signal 2 corresponding to optical signal 2.
  • the processor can perform matched filtering processing on the electrical signal 1 to extract the backscattered noise parameter 1 corresponding to the electrical signal 1, and then use the backscatter function to process the backscattered noise parameter 1 to obtain the scattering coefficient 1; and, using the atmospheric absorption function to process the backscattered noise parameter 1 and the electrical signal 1, to obtain the absorption coefficient 1.
  • the processor can also perform matched filtering processing on the electrical signal 2 to extract the backscattered noise parameter 2 corresponding to the electrical signal 2, and then use the backscatter function to process the backscattered noise parameter 2 to obtain the scattering coefficient 2; And, using the atmospheric absorption function to process the backscattered noise parameter 2 and the electrical signal 2 to obtain the absorption coefficient 2.
  • the processor can determine the environmental quality parameters based on the scattering coefficient 1, the absorption coefficient 1, the scattering coefficient 2, and the absorption coefficient 2.
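The patent does not specify the backscatter and atmospheric absorption functions, so the sketch below substitutes deliberately simple placeholder models (an inverse-square backscatter model and a round-trip Beer-Lambert absorption inversion) just to show the shape of the per-wavelength coefficient extraction and aggregation. All function names and the aggregation rule are assumptions:

```python
import math

def scattering_coefficient(backscatter_noise, range_m, k=1.0):
    """Toy stand-in for the backscatter function: assumes noise power
    ~ k * beta / range^2, so beta = noise * range^2 / k."""
    return backscatter_noise * range_m ** 2 / k

def absorption_coefficient(p_emitted, p_received, range_m):
    """Toy Beer-Lambert inversion over the round trip:
    p_received = p_emitted * exp(-2 * alpha * range)."""
    return math.log(p_emitted / p_received) / (2.0 * range_m)

def environmental_quality(coefficients):
    """Illustrative aggregation of the per-wavelength scattering and
    absorption coefficients into one environment-severity scalar."""
    return sum(coefficients) / len(coefficients)
```

With these placeholders, scattering coefficient 1, absorption coefficient 1, scattering coefficient 2, and absorption coefficient 2 would each be computed from their wavelength's electrical signal and then fed jointly into `environmental_quality` (or, as the text describes next, into a trained environmental parameter determination model).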
  • the environmental parameter determination model is used to determine the environmental quality parameter
  • the input of the model may be the scattering coefficient and the absorption coefficient corresponding to the multi-wavelength optical signal
  • the output of the model is the environmental quality parameter
  • the first model is used to determine the type of fog
  • the second model is used to determine the severity of the weather
  • the processor can input the aforementioned scattering coefficient 1, absorption coefficient 1, scattering coefficient 2, and absorption coefficient 2 into the first model to obtain the fog cluster type output by the first model; and input the same coefficients into the second model to obtain the severe weather level output by the second model.
  • the embodiments of this application place no particular limitation on the model type of the environmental parameter determination model.
  • the environmental parameter determination model can be a formula model, a neural network model, or other mathematical models.
  • the environmental quality parameters can be calculated by the processor in the radar system, or can also be calculated by other processors.
  • the radar system can send the second point cloud data and/or the electrical signals corresponding to multiple wavelengths to the main controller, and the main controller can obtain the environmental quality parameters accordingly; after that, the main controller can send the environmental quality parameters to the radar system, or the main controller may store the environmental quality parameters in a preset location to which the radar system has data acquisition authority.
  • the environmental quality parameters may be obtained by real-time calculation, or may be obtained by interval calculation.
  • after the processor receives multiple electrical signals of different wavelengths, it can directly generate the second point cloud data, calculate the environmental quality parameters in real time based on the received electrical signals, and then use the environmental quality parameters to compensate the second point cloud data to obtain the first point cloud data.
  • the radar system can periodically calculate the environmental quality parameters. In this way, after the processor generates the second point cloud data, it can directly obtain the environmental quality parameters calculated in the current cycle (or most recently) and use them to compensate the second point cloud data to obtain the first point cloud data.
  • the processor can obtain, from among the preset compensation formulas, the target compensation formula matching the environmental quality parameters, and then use the target compensation formula to compensate the second point cloud data to obtain the first point cloud data.
  • the preset compensation formula may be an empirical formula, or a formula obtained through a preset calibration experiment.
  • the embodiments of the present application have no particular limitation on the source of the compensation formula.
  • the correspondence between each environmental quality parameter and each preset compensation formula can be configured in advance; when determining the target compensation formula, the processor then only needs to look up the compensation formula corresponding to the current environmental quality parameter.
  • similarly, the correspondence between fog cluster type and preset compensation formula can be configured in advance; the processor then only needs to determine the fog cluster type as described in the foregoing steps and obtain the compensation formula corresponding to that fog cluster type as the target compensation formula.
  • a target compensation formula may be applied to the scanning parameters of the two-dimensional scanner, the detection responsivity and amplification factor of the detection module, the emission parameters of the first optical signal, and the receiving parameters of the second optical signal to obtain a compensation value; the compensation value is then used to compensate the second point cloud data to obtain the first point cloud data.
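The lookup-and-compensate flow described above can be sketched roughly as follows. All names, the use of fog cluster type as the environmental quality parameter, and the linear gain/offset form of the compensation formula are illustrative assumptions, not taken from the application:

```python
# Hypothetical sketch: select a target compensation formula by fog
# cluster type and apply it to second point cloud data. The linear
# form and all coefficients are invented for illustration.

def make_linear_compensation(gain, offset):
    """Return a compensation formula: corrected range = raw * gain + offset."""
    def compensate(point):
        rng, intensity = point
        return (rng * gain + offset, intensity * gain)
    return compensate

# Preset correspondence between fog cluster type and compensation formula.
PRESET_COMPENSATION = {
    "clear":     make_linear_compensation(1.00, 0.0),
    "light_fog": make_linear_compensation(1.02, 0.1),
    "dense_fog": make_linear_compensation(1.08, 0.3),
}

def compensate_point_cloud(second_point_cloud, fog_cluster_type):
    """Look up the target compensation formula and apply it point by point."""
    formula = PRESET_COMPENSATION[fog_cluster_type]
    return [formula(p) for p in second_point_cloud]

second_cloud = [(10.0, 0.8), (25.0, 0.5)]   # (range in m, echo intensity)
first_cloud = compensate_point_cloud(second_cloud, "dense_fog")
```

The table-driven design mirrors the preset correspondence the text describes: adding a new environmental condition only requires registering another entry, not changing the compensation code path.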
  • the radar system provided by the embodiment of the present application can also analyze the first point cloud data to determine the radar detection result, and further, can also output the radar detection result.
  • the radar system can be used to output: first point cloud data and/or radar detection results.
  • the processor may also obtain a single-wavelength detection result corresponding to each wavelength according to the first point cloud data, and then determine the radar detection result according to the single-wavelength detection results of multiple wavelengths.
  • this application combines the single-wavelength detection results of multiple wavelengths to obtain the radar detection result, which can effectively overcome the limitation that the reception of a single-wavelength optical signal is constrained by the environment, and helps improve the accuracy of radar detection results.
  • the radar detection results involved in the embodiments of the present application may include, but are not limited to: the distance between the target (or called the detection target) and the radar (that is, the radar system) and/or the target type.
  • the processor may determine the first distance corresponding to each wavelength as the single-wavelength detection result, using the transmission and reception times of the optical signal corresponding to each wavelength in the first point cloud data; then, from the first distances corresponding to the multiple wavelengths, it determines a second distance, which characterizes the distance between the radar system and the detection target.
  • the processor can apply the speed-distance formula to the transmission/reception times of the optical signal and the speed of light to obtain the first distance corresponding to each wavelength; each first distance characterizes the distance between the radar system and the detection target. When the second distance is determined from multiple first distances, there may be multiple implementations.
  • the processor may obtain the average value of each of the plurality of first distances, and determine the average value as the second distance.
  • the processor may filter out the maximum and minimum of the multiple first distances, obtain the average of the remaining first distances, and determine that average as the second distance.
  • the processor may obtain the minimum of the multiple first distances and determine it as the second distance. These examples are not exhaustive.
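The per-wavelength first distances and the fusion alternatives above can be sketched as follows; the function names and the three strategy labels are hypothetical, but the speed-distance conversion and the averaging/trimming/minimum strategies follow the text:

```python
# Hypothetical sketch: derive per-wavelength first distances from time
# of flight, then fuse them into a single second distance.

C = 299_792_458.0  # speed of light, m/s

def first_distance(emit_time_s, echo_time_s):
    """Round-trip TOF converted to one-way range: d = c * dt / 2."""
    return C * (echo_time_s - emit_time_s) / 2.0

def second_distance(first_distances, strategy="mean"):
    ds = sorted(first_distances)
    if strategy == "mean":               # plain average of all first distances
        return sum(ds) / len(ds)
    if strategy == "trimmed":            # drop max and min, average the rest
        ds = ds[1:-1] if len(ds) > 2 else ds
        return sum(ds) / len(ds)
    if strategy == "min":                # most conservative estimate
        return ds[0]
    raise ValueError(strategy)

d1 = first_distance(0.0, 200e-9)   # wavelength 1
d2 = first_distance(0.0, 204e-9)   # wavelength 2
fused = second_distance([d1, d2], strategy="mean")
```

In practice the strategy could be chosen per environmental condition; the minimum is the safest choice for collision avoidance, while trimming suppresses a single-wavelength outlier.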
  • the processor can also use the echo intensity of the optical signal corresponding to each wavelength in the first point cloud data to determine the target reflectivity corresponding to each wavelength as a single-wavelength detection result, and then, according to the target reflectivity corresponding to multiple wavelengths, Determine the type of detection target.
  • the processor may obtain an average value of the reflectance of a plurality of targets, and determine the type corresponding to the average value as the type of the detection target.
  • the processor may filter out the maximum and minimum of the multiple target reflectivities, obtain the average of the remaining target reflectivities, and determine the type corresponding to that average as the type of the detection target.
  • the processor may filter out the type corresponding to the maximum value or the minimum value of the reflectance of the multiple targets, and determine it as the type of the detection target.
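The reflectivity-fusion alternatives above can be sketched in the same style; the reflectivity-to-type table and all thresholds are invented for illustration and are not from the application:

```python
# Hypothetical sketch: combine per-wavelength target reflectivities and
# map the fused value to a target type.

TYPE_TABLE = [          # (lower bound of fused reflectivity, type label)
    (0.7, "traffic sign"),
    (0.3, "vehicle"),
    (0.0, "asphalt"),
]

def classify(reflectivity):
    for lower, label in TYPE_TABLE:     # thresholds in descending order
        if reflectivity >= lower:
            return label
    return "unknown"

def target_type(reflectivities, strategy="mean"):
    rs = sorted(reflectivities)
    if strategy == "trimmed" and len(rs) > 2:
        rs = rs[1:-1]                   # drop max and min first
    if strategy == "max":
        return classify(rs[-1])
    if strategy == "min":
        return classify(rs[0])
    return classify(sum(rs) / len(rs))  # covers "mean" and "trimmed"

kind = target_type([0.35, 0.42, 0.95])  # per-wavelength reflectivities
```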
  • FIG. 14 shows a schematic diagram of a processor acquiring a radar detection result (a radar detection method).
  • the processor performs matched filtering after receiving electrical signal 1 (wavelength 1) and electrical signal 2 (wavelength 2), dividing electrical signal 1 into noise signal 1 and waveform signal 1, and electrical signal 2 into noise signal 2 and waveform signal 2.
  • wavelength 1 and wavelength 2 are not equal.
  • the processor can determine noise parameter 1 based on noise signal 1, and determine noise parameter 2 based on noise signal 2, and then combine noise parameter 1 and noise parameter 2, as well as electrical signal 1, electrical signal 2, to determine the environment Quality parameters.
  • the processor may obtain the echo time 1 and the echo intensity 1 of the waveform signal 1; and, obtain the echo time 2 and the echo intensity 2 of the waveform signal 2.
  • the processor may calculate the first distance 1 corresponding to the electrical signal 1 based on the environmental quality parameter and the echo time 1.
  • the processor can use the environmental quality parameter to calculate ranging error 1, and use ranging error 1 to compensate distance 1' to obtain first distance 1, where distance 1' is calculated directly from the emission time of the optical signal corresponding to electrical signal 1 and echo time 1.
  • the processor may also calculate the first distance 2 corresponding to the wavelength 2 based on the environmental quality parameter and the echo time 2.
  • the processor can use the environmental quality parameter to calculate ranging error 2, and use ranging error 2 to compensate distance 2' to obtain first distance 2, where distance 2' is calculated directly from the emission time of the optical signal corresponding to electrical signal 2 and echo time 2.
  • the processor can determine the second distance between the radar system and the detection target according to the first distance 1 and the first distance 2.
  • the processor can also calculate the target reflectivity 1 corresponding to the electrical signal 1 based on the environmental quality parameter and the echo intensity 1.
  • the processor can calculate compensation value 1 of echo intensity 1 based on the environmental quality parameters, use compensation value 1 to compensate echo intensity 1, and then use the compensated echo intensity 1' to calculate target reflectivity 1.
  • the processor may also calculate the target reflectivity 2 corresponding to the electrical signal 2 based on the environmental quality parameter and the echo intensity 2.
  • the processor can calculate compensation value 2 of echo intensity 2 based on the environmental quality parameter, use compensation value 2 to compensate echo intensity 2, and then use the compensated echo intensity 2' to calculate target reflectivity 2.
  • the processor can determine the type of detection target based on target reflectivity 1 and target reflectivity 2.
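The front half of the Figure 14 flow, splitting each electrical signal into a noise part and a waveform part and deriving an environmental quality parameter from the two noise levels, can be sketched very roughly as follows. A simple amplitude threshold stands in for matched filtering, and the noise-to-environment mapping is a placeholder; both are illustrative assumptions, not the application's actual algorithms:

```python
# Hypothetical sketch of the two-wavelength noise/waveform separation
# and environmental-quality estimation of Figure 14.

def split_signal(samples, threshold):
    """Split samples into (noise_part, waveform_part) by amplitude."""
    noise = [s for s in samples if abs(s) < threshold]
    waveform = [s for s in samples if abs(s) >= threshold]
    return noise, waveform

def noise_parameter(noise_samples):
    """Mean absolute noise amplitude as a crude per-wavelength noise parameter."""
    return sum(abs(s) for s in noise_samples) / max(len(noise_samples), 1)

def environmental_quality(noise_param_1, noise_param_2):
    """Average the two per-wavelength noise parameters (placeholder for
    combining backscatter and absorption coefficients)."""
    return (noise_param_1 + noise_param_2) / 2.0

sig1 = [0.02, 0.03, 0.9, 0.85, 0.01]   # electrical signal 1 (wavelength 1)
sig2 = [0.05, 0.04, 0.7, 0.75, 0.06]   # electrical signal 2 (wavelength 2)

noise1, wave1 = split_signal(sig1, threshold=0.5)
noise2, wave2 = split_signal(sig2, threshold=0.5)
env_q = environmental_quality(noise_parameter(noise1), noise_parameter(noise2))
```

The waveform parts would then feed the echo-time and echo-intensity steps described above, while `env_q` drives the ranging-error and intensity compensation.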
  • the reflectivity information is mainly used for target recognition and classification.
  • target reflectivity is related to the incident wavelength, target material, angle, and so on; in a single wavelength band, different materials may have the same reflectivity, so using the differentiated reflectivity information of the same target at different wavelengths can effectively enhance the target recognition effect.
  • FIG. 15 shows a schematic diagram of a target recognition effect when a single-wavelength optical signal and a multi-wavelength optical signal are used to respectively recognize the same target.
  • 15A is a schematic diagram of the effect of target detection using a single-wavelength optical signal.
  • 15B is a schematic diagram of the effect of target detection using a multi-wavelength optical signal.
  • 15B contains more point clouds, so more effective information can be identified.
  • the processor of the radar system may also be used to determine the detection parameters of the radar system; and to detect targets according to the detection parameters.
  • the detection parameters involved here may include, but are not limited to: transmission parameters of multi-wavelength optical signals of multiple wavelengths and configuration parameters of the radar system.
  • the transmission parameters can refer to the previous description, which will not be repeated here.
  • the configuration parameters are used to implement the configuration of each module in the radar system.
  • the configuration parameters may include, but are not limited to: scanning frequency, detection frequency, and so on.
  • the detection parameters of the radar system may be determined by the controller. That is, the controller may determine the detection parameters of the radar system, and send first information to the processor of the radar system, and the first information is used to indicate the transmission parameters and/or configuration parameters of the radar system.
  • when the processor determines the detection parameters, it can receive the first information from the controller and determine the detection parameters based on the received first information.
  • the processor may directly determine the transmission parameter and/or configuration parameter carried in the first information as the detection parameter.
  • the processor can make custom adjustments to the transmission parameters and/or configuration parameters carried in the first information according to a preset algorithm, and determine the adjusted transmission parameters and/or configuration parameters as the detection parameters.
  • the detection parameters of the radar system can also be determined by the processor itself.
  • the detection parameters of the radar system have been configured in advance, and the processor may extract the pre-configured data by itself as the detection parameters.
  • the radar system may also obtain environmental parameters of the current environment, and determine the transmission parameters and/or configuration parameters according to the environmental parameters.
  • the environmental parameters may include, but are not limited to: one or more of the weather type of the current environment, the current world coordinates, or the image data of the current environment.
  • the environmental parameters can be derived from other sensors or communication modules that are communicatively connected with the radar system.
  • the sensors may include, but are not limited to: one or more of a millimeter-wave radar, an image acquisition device, a Global Positioning System (GPS) receiver, an inertial measurement unit, and a human-computer interaction interface.
  • the vehicle is equipped with a radar system and a camera (an image acquisition device), and the radar system is in communication with the camera, and the camera can send the collected image data to the radar system.
  • after the radar system receives the image data, the processor can determine the weather type of the current environment through image recognition, and then, according to the preset correspondence between weather types and detection parameters (transmission parameters and/or configuration parameters), detect the target using the detection parameters corresponding to the current weather type.
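The correspondence-table step above can be sketched as a simple lookup; the weather labels, parameter fields, and all values are invented for illustration, and the image-recognition step that produces the weather type is elided:

```python
# Hypothetical sketch: map a recognized weather type to preset
# detection parameters (transmission and/or configuration parameters).

PRESET_DETECTION_PARAMS = {
    # weather type -> illustrative emission power scale and scan frequency
    "sunny": {"power_scale": 1.0, "scan_hz": 10},
    "rain":  {"power_scale": 1.3, "scan_hz": 15},
    "fog":   {"power_scale": 1.5, "scan_hz": 20},
}

def detection_params_for(weather_type):
    """Return preset detection parameters for the recognized weather type,
    falling back to the sunny profile for unknown types."""
    return PRESET_DETECTION_PARAMS.get(weather_type,
                                       PRESET_DETECTION_PARAMS["sunny"])

params = detection_params_for("fog")
```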
  • the communication module can be connected to other electronic devices or connected to the network. In this way, the current weather type can be obtained through the communication module.
  • the communication module may be the communication module of the radar system, or the communication module of the movable device carrying the radar system; in the latter case, the communication module is connected to the radar system directly or indirectly (for example, through the controller).
  • the radar system provided by the embodiments of the present application can perform target detection simultaneously or in a time-sharing manner through multi-wavelength optical signals of different wavelengths, and generate a 3D point cloud image within the scanning FOV through one or more 2D scanners.
  • this technical solution can make up for the performance shortcomings of current automotive lidar products, especially lidar systems based on a 2D scanning architecture, and improve overall measurement capability, embodied in the following aspects:
  • resolution: using multiple wavelengths can break the TOF-time and scanning-performance limits within the limited receiving FOV and increase the number of pixel units, thereby improving lidar resolution; the difference in beam quality between wavelengths can be exploited by using wavelengths with smaller divergence angles for dense detection in regions where resolution needs to be improved; and the difference in eye safety between wavelengths can be used to increase laser power and distribute the emission energy of a single beam over multiple detector pixel units, further improving system resolution.
  • number of point clouds: the number of point clouds is related to the number of transmit/receive channels, TOF time, laser repetition rate, and so on; multi-wavelength operation is an effective way to break the TOF limit, and since lasers of different wavelengths differ in repetition rate, pulse width, power, and other properties, the light output efficiency per unit time and space can be further improved.
  • eye safety and dynamic range: lasers of different wavelengths have different eye-safety thresholds.
  • Long-wavelength lasers with high eye safety thresholds can be used to output high-power lasers for long-distance detection.
  • short-wavelength lasers with low eye-safety thresholds can output low power for short- and medium-range detection, compensating for the saturation and blind-zone limitations of the high-power wavelength in short-range measurement and improving the overall dynamic range of the system and the ranging accuracy over the full measurement range.
  • the reflectivity information is mainly used for target recognition and classification.
  • target reflectivity is related to the incident wavelength, target material, angle, and so on; in a single band, different materials may have the same reflectivity, so using the differentiated reflectivity information of the same target at different wavelengths can effectively enhance the target recognition effect.
  • the embodiment of the present application also provides a movable device and a radar detection method thereof.
  • the movable device may include a body, and a radar system and a controller mounted on the body.
  • the controller is used to control the movement of the movable device, and the radar system may be the radar system shown in any of the embodiments provided in this application, and this will not be repeated.
  • the radar system is coupled with the controller, so that the controller can control the movable device based on data from the radar system (the first point cloud data and/or the radar detection results).
  • the controller can be used to control the operation of the radar system to detect targets, and can also be used to receive data output by the radar system.
  • the radar system can output the first point cloud data and/or the radar detection result; and the controller can be used to control the movement of the movable device based on the first point cloud data.
  • the radar system can directly output the first point cloud data.
  • the controller of the movable device can receive the first point cloud data from the radar system and obtain the radar detection result based on it. That is, the controller may receive second information from the radar system, the second information carrying the first point cloud data; then, according to the first point cloud data, it obtains the single-wavelength detection result corresponding to each wavelength, and determines the radar detection result from the single-wavelength detection results of the multiple wavelengths.
  • the controller may determine the first distance corresponding to each wavelength as the single-wavelength detection result by using the transmission and reception time of the optical signal corresponding to each wavelength in the first point cloud data, and then, according to multiple wavelengths Corresponding to the first distance, a second distance is determined, and the second distance is used to characterize the distance between the radar system and the detection target.
  • the controller may use the echo intensity of the optical signal corresponding to each wavelength in the first point cloud data to determine the target reflectivity corresponding to each wavelength as the single-wavelength detection result, and then according to multiple wavelengths Corresponding to the target reflectivity, the type of the detected target is determined.
  • the specific implementation manner for the controller to obtain the radar detection result based on the first point cloud data is the same as the manner in which the processor in the aforementioned radar system obtains the radar detection result based on the first point cloud data.
  • the foregoing may be referred to, and will not be repeated here.
  • when the controller controls the radar system to perform target detection tasks, it can also determine the detection parameters of the radar system, that is, the transmission parameters and/or configuration parameters of the radar system (for the determination method, refer to the foregoing description); the controller may then send a first message to the radar system, the first message indicating the transmission parameters and/or configuration parameters of the radar system.
  • when determining the detection parameters, the controller can obtain environmental parameters through the sensors and/or communication modules carried in the movable device, and determine, from the environmental parameters, the emission parameters of the laser signals of each wavelength and/or the configuration parameters.
  • the sensors carried in the mobile device may include, but are not limited to: one or more of millimeter wave radar, image acquisition device, global positioning system receiver, inertial measurement unit, and human-computer interaction interface. It will not be repeated here.
  • the movable device can undertake part of the computing function of the radar system and obtain a more accurate radar detection result based on the first point cloud data obtained from the multi-wavelength optical signals.
  • the operations or steps implemented by the processor can also be implemented by components (such as chips or circuits) usable in the processor, and the operations or steps implemented by the controller can also be implemented by components (such as chips or circuits) usable in the controller.
  • the embodiment of the present application further provides an electronic device.
  • the electronic device can be used to implement the method on the processor side or the corresponding part of the controller side described in the foregoing method embodiment. For details, refer to the description in the foregoing embodiment.
  • the electronic device may include one or more processing units; a processing unit may also be referred to as a processor (here meaning the processing module or processing unit within the processor of the radar system mentioned above) and can implement certain control functions.
  • the processing unit may be a general-purpose processing unit, a dedicated processing unit, or the like.
  • the processing unit may also store instructions, and the instructions may be executed by the processing unit, so that the electronic device executes the methods described in the above method embodiments corresponding to the processor side or the controller side.
  • the electronic device may include a circuit, and the circuit may implement the sending or receiving or communication function in the foregoing method embodiment.
  • the electronic device may include one or more memories, on which instructions or intermediate data are stored, and the instructions may be executed on the processing unit, so that the electronic device executes the description in the above embodiments Methods.
  • other related data may also be stored in the memory.
  • instructions and/or data may also be stored in the processing unit.
  • the processing unit and the memory can be provided separately or integrated together.
  • the electronic device may further include a transceiver.
  • the transceiver may be called a transceiver unit, a transceiver, a transceiver circuit, or a transceiver, etc., and is used to implement the transceiver function of the electronic device.
  • when the electronic device implements the processor side, the processing unit is used to determine and output the first point cloud data and/or radar detection results upon receiving the electrical signals corresponding to multiple wavelengths.
  • the transceiver in the electronic device can be used to receive the first information from the controller.
  • the transceiver can also be used to send the second information to the controller.
  • the transceiver can further complete other corresponding communication functions.
  • the processing unit is used to complete the corresponding determination or control operation.
  • the corresponding instruction can also be stored in the memory.
  • when the electronic device implements the controller side, the processing unit can be used to receive the first point cloud data and obtain the radar detection result accordingly; the transceiver can be used to receive the second information from the radar system, and can also be used to send the first information to the radar system.
  • the transceiver can further complete other corresponding communication functions.
  • the processing unit is used to complete the corresponding determination or control operation.
  • the corresponding instruction can also be stored in the memory.
  • the processing unit and transceiver described in this application can be implemented in an integrated circuit (IC), an analog IC, a radio-frequency integrated circuit (RFIC), a mixed-signal IC, an application-specific integrated circuit (ASIC), a printed circuit board (PCB), electronic equipment, and so on.
  • the processing unit and the transceiver can also be manufactured using various IC process technologies, such as complementary metal-oxide-semiconductor (CMOS), N-type metal-oxide-semiconductor (NMOS), P-type metal-oxide-semiconductor (positive channel metal-oxide-semiconductor, PMOS), bipolar junction transistor (BJT), bipolar CMOS (BiCMOS), silicon germanium (SiGe), gallium arsenide (GaAs), and so on.
  • the electronic device may be a stand-alone device or may be part of a larger device.
  • the device may be: (1) an independent integrated circuit (IC), chip, or chip system or subsystem; (2) a set of one or more ICs, which may optionally also include storage components for storing data and/or instructions; (3) an ASIC, such as a modem (MSM); (4) a module that can be embedded in other devices; (5) a receiver, terminal, cellular phone, wireless device, handheld device, mobile unit, electronic device, and the like; (6) others.
  • the embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when run on a computer, causes the computer to execute the radar detection method implemented by the processor or the controller in the above embodiments.
  • the embodiments of the present application also provide a computer program product, which includes a computer program, which when running on a computer, causes the computer to execute the radar detection method implemented by the processor or the controller in the above-mentioned embodiments.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive).


Abstract

A radar system (600), a movable device (100), and a radar detection method. A laser module (610) in the radar system (600) generates a multi-wavelength optical signal; a two-dimensional scanner (620) performs two-dimensional scanning with the multi-wavelength optical signal and receives echo signals within the receiving field of view of the two-dimensional scanner (620); after a wavelength-division module (630) splits the echo signals, a detection module (640) converts the multiple single-wavelength optical signals into multiple electrical signals, from which a processor (650) obtains first point cloud data. Since one unit receiving field of view of the two-dimensional scanner (620) contains multiple echo signals, the limit that time of flight imposes on the number of point clouds of a single-wavelength optical signal is broken, as is the limit of a single-wavelength optical signal's environmental adaptability; the number of point clouds obtained by radar detection is increased, which also increases the effective information carried in the point cloud data, helping to improve the detection performance of the lidar and to reduce the resulting safety risks in autonomous driving.

Description

Radar system, movable device, and radar detection method. Technical Field
This application relates to the field of radar, and in particular to a radar system, a movable device, and a radar detection method.
Background
With the continuous development of autonomous driving technology, the requirements on the sensors in an autonomous driving system are increasingly high. Lidar is one of the important sensors of an autonomous driving system, and the requirements on its resolution, number of point clouds, detection range, and so on are also increasingly high.
At present, the prior art generally uses a single-wavelength lidar to detect targets: the lidar emits a single-wavelength laser signal, receives its echo signal, and analyzes and processes the single-wavelength laser signal and its echo signal to obtain point cloud data of the target, from which the distance between the target and the lidar and the target type are determined.
However, limited by detection range, time of flight (TOF), environmental adaptability, and so on, a single-wavelength laser signal can yield only one echo signal within the unit receiving field of view of a single-wavelength lidar. This results in a small number of point clouds, and thus little effective information carried in the point cloud, which greatly affects the detection performance of the lidar. How to improve lidar detection performance, so as to reduce the safety risks of autonomous driving, has become a technical problem urgently to be solved in this field.
Summary
The embodiments of this application provide a radar system, a movable device, and a radar detection method, to improve lidar detection performance and thereby reduce the resulting safety risks in autonomous driving.
In a first aspect, an embodiment of this application provides a radar system. In the radar system, a laser module is used to generate a multi-wavelength optical signal; a two-dimensional scanner is used to perform two-dimensional scanning with the multi-wavelength optical signal and to receive echo signals within the receiving field of view of the two-dimensional scanner, an echo signal being the reflection formed after a scanned object is illuminated by the multi-wavelength optical signal; a wavelength-division module is used to split the echo signals into multiple single-wavelength optical signals, which a detection module converts into the electrical signal corresponding to each wavelength; and first point cloud data is then obtained from the electrical signals corresponding to the wavelengths. Moreover, in the radar system provided by this application, one unit receiving field of view of the two-dimensional scanner contains multiple echo signals. Compared with the prior art, in which the unit receiving field of view of a single-wavelength lidar can contain only one echo signal, this application breaks the limit that TOF time imposes on the number of point clouds of a single-wavelength optical signal, as well as the limit of a single-wavelength optical signal's environmental adaptability, effectively increasing the number of point clouds within the unit receiving field of view. The increased number of point clouds also increases the effective information carried in the point cloud data, helping to improve lidar detection performance. Therefore, the radar system provided by this application can effectively improve lidar detection performance and help reduce the resulting safety risks in autonomous driving.
In a possible embodiment, the multi-wavelength optical signal includes a first-wavelength optical signal and a second-wavelength optical signal of different wavelengths. When the emission parameters of the first-wavelength optical signal and the second-wavelength optical signal differ, the parameters of their minimum receiving fields of view differ, where the parameters of the minimum receiving field of view include one or more of its position, size, or number, and the emission parameters include one or more of divergence angle, exit position, exit time, exit angle, position of the receiving field of view, size of the receiving field of view, and time of flight.
In another possible embodiment, the laser module includes one or more lasers. When the laser module includes one laser, the laser is a tunable laser, and the multi-wavelength optical signal includes multiple optical signals each of a single wavelength. When the laser module includes multiple lasers, the lasers include tunable lasers and/or single-wavelength lasers, where any two single-wavelength lasers generate optical signals of different wavelengths; in this case, the multi-wavelength optical signal includes one optical signal containing multiple wavelengths, or multiple optical signals each of a single wavelength.
In another possible embodiment, the wavelength-division module is specifically used to split the echo signal into multiple single-wavelength optical signals, any two of which have different wavelengths.
In another possible embodiment, the wavelength-division module is further used to split or converge the multi-wavelength optical signal generated by the laser module and provide it to the two-dimensional scanner for two-dimensional scanning.
In another possible embodiment, the wavelength-division module includes one or more of a beam splitter, an optical fiber, a lens, a prism, a mirror, or a diffractive device.
In another possible embodiment, the detection module includes one or more detectors. When the detection module includes one detector, the detector is a multi-wavelength detector used to receive and process single-wavelength optical signals of multiple wavelengths. When the detection module includes multiple detectors, the detectors include multi-wavelength detectors and/or single-wavelength detectors, where any single-wavelength detector is used to receive and process a single-wavelength optical signal of one wavelength.
In another possible embodiment, the radar system is an off-axis optical system or a coaxial optical system.
In another possible embodiment, the processor is specifically used to generate the first point cloud data directly from the electrical signals corresponding to the multiple wavelengths.
In another possible embodiment, the processor is specifically used to generate second point cloud data from the electrical signals corresponding to the multiple wavelengths, and to compensate the second point cloud data using environmental quality parameters to obtain the first point cloud data.
In another possible embodiment, the processor is further used to extract, from the electrical signals corresponding to the multiple wavelengths, the noise parameter corresponding to each wavelength, and to determine the environmental quality parameters from the noise parameters corresponding to the multiple wavelengths.
In another possible embodiment, the noise parameters include backscatter noise parameters; in this case, the processor is specifically used to: process the backscatter noise parameters corresponding to the multiple wavelengths with a backscatter function to obtain a scattering coefficient; process the electrical signals and the backscatter noise parameters corresponding to the multiple wavelengths with an atmospheric absorption function to obtain an absorption coefficient; and determine the environmental quality parameters from the scattering coefficient and the absorption coefficient.
In another possible embodiment, the environmental quality parameters include one or more of fog cluster type, weather severity level, particle concentration, humidity, or particle size distribution.
In another possible embodiment, the processor is specifically used to obtain, among preset compensation formulas, a target compensation formula matching the environmental quality parameters, and to compensate the second point cloud data using the target compensation formula to obtain the first point cloud data.
In another possible embodiment, the processor is further used to obtain, from the first point cloud data, the single-wavelength detection result corresponding to each wavelength, and to determine the radar detection result from the single-wavelength detection results of the multiple wavelengths.
In another possible embodiment, the processor is specifically used to: determine, using the transmission and reception times of the optical signal corresponding to each wavelength in the first point cloud data, a first distance corresponding to each wavelength as the single-wavelength detection result; and determine, from the first distances corresponding to the multiple wavelengths, a second distance characterizing the distance between the radar system and the detection target.
In another possible embodiment, the processor is specifically used to: determine, using the echo intensity of the optical signal corresponding to each wavelength in the first point cloud data, a target reflectivity corresponding to each wavelength as the single-wavelength detection result; and determine the type of the detection target from the target reflectivities corresponding to the multiple wavelengths.
In another possible embodiment, the processor is further used to determine detection parameters of the radar system, the detection parameters including the emission parameters of the multi-wavelength optical signals of the multiple wavelengths and the configuration parameters of the radar system, and to detect the target according to the detection parameters.
In another possible embodiment, the processor is further used to receive first information indicating the emission parameters and/or the configuration parameters of the radar system.
In a second aspect, an embodiment of this application provides a movable device, including the radar system of any embodiment of the first aspect and a controller, where the controller is coupled with the radar system and can be used to control the movable device based on the first point cloud data.
In a third aspect, an embodiment of this application provides a movable device, including the radar system of any embodiment of the first aspect and a controller, where the controller is coupled with the radar system and can be used to control the movable device based on the first point cloud data. In addition, when the radar system directly outputs the first point cloud data, the controller can also be used to obtain, based on the first point cloud data, the single-wavelength detection result corresponding to each wavelength, and to determine the radar detection result from the single-wavelength detection results of the multiple wavelengths.
In the movable device of the second or third aspect, the radar system can emit multi-wavelength optical signals and obtain the first point cloud data from their echo signals, and one unit receiving field of view of the two-dimensional scanner contains multiple echo signals. Compared with the prior art, in which the unit receiving field of view of a single-wavelength lidar can contain only one echo signal, this application breaks the limit that TOF time imposes on the number of point clouds of a single-wavelength optical signal, as well as the limit of a single-wavelength optical signal's environmental adaptability, effectively increasing the number of point clouds within the unit receiving field of view; the increased number of point clouds also increases the effective information carried in the point cloud data, helping to improve lidar detection performance. In summary, the radar system provided by this application can effectively improve lidar detection performance and help reduce the resulting safety risks in autonomous driving.
In a possible embodiment of the second or third aspect, the controller is specifically used to send a first message to the radar system, the first message indicating the emission parameters and/or the configuration parameters of the radar system.
In another possible embodiment of the second or third aspect, the movable device further includes a sensor and/or a communication module, and the controller is further used to obtain environmental parameters through the sensor and/or the communication module, and to determine, from the environmental parameters, the emission parameters of the laser signals of each wavelength and/or the configuration parameters.
In another possible embodiment of the second or third aspect, the sensor includes one or more of a millimeter-wave radar, an image acquisition device, a Global Positioning System receiver, an inertial measurement unit, or a human-computer interaction interface.
In another possible embodiment of the second or third aspect, the controller is further used to: receive second information from the radar system, the second information carrying the first point cloud data; obtain, from the first point cloud data, the single-wavelength detection result corresponding to each wavelength; and determine the radar detection result from the single-wavelength detection results of the multiple wavelengths.
In another possible embodiment of the second or third aspect, the controller is specifically used to: determine, using the transmission and reception times of the optical signal corresponding to each wavelength in the first point cloud data, a first distance corresponding to each wavelength as the single-wavelength detection result; and determine, from the first distances corresponding to the multiple wavelengths, a second distance characterizing the distance between the radar system and the detection target.
In another possible embodiment of the second or third aspect, the controller is specifically used to: determine, using the echo intensity of the optical signal corresponding to each wavelength in the first point cloud data, a target reflectivity corresponding to each wavelength as the single-wavelength detection result; and determine the type of the detection target from the target reflectivities corresponding to the multiple wavelengths.
In another possible embodiment of the second or third aspect, the movable device includes a vehicle, an unmanned aerial vehicle, or a ground robot.
In a fourth aspect, an embodiment of this application provides a radar detection method applicable to the radar system of any embodiment of the first, second, or third aspect. In the method, the radar system can generate a multi-wavelength optical signal, perform two-dimensional scanning with it, and receive echo signals within the receiving field of view of the two-dimensional scanner; by splitting the echo signals and performing photoelectric conversion, the electrical signal corresponding to each wavelength is obtained, and the first point cloud data is then obtained from those electrical signals. Because one unit receiving field of view of the two-dimensional scanner contains multiple echo signals, compared with the prior art in which the unit receiving field of view of a single-wavelength lidar can contain only one echo signal, this application breaks the limit that TOF time imposes on the number of point clouds of a single-wavelength optical signal, as well as the limit of a single-wavelength optical signal's environmental adaptability, effectively increasing the number of point clouds within the unit receiving field of view and thereby obtaining more effective information, which helps improve lidar detection performance. In summary, the radar system provided by this application can effectively improve lidar detection performance and help reduce the resulting safety risks in autonomous driving.
In a possible embodiment, obtaining the first point cloud data from the electrical signals corresponding to the wavelengths includes: generating second point cloud data from the electrical signals corresponding to the multiple wavelengths, and compensating the second point cloud data using environmental quality parameters to obtain the first point cloud data.
In another possible embodiment, the method further includes: extracting, from the electrical signals corresponding to the multiple wavelengths, the noise parameter corresponding to each wavelength, and determining the environmental quality parameters from the noise parameters corresponding to the multiple wavelengths.
In another possible embodiment, the noise parameters include backscatter noise parameters, and extracting the noise parameter corresponding to each wavelength from the electrical signals corresponding to the multiple wavelengths includes: processing the backscatter noise parameters corresponding to the multiple wavelengths with a backscatter function to obtain a scattering coefficient; processing the electrical signals and the backscatter noise parameters corresponding to the multiple wavelengths with an atmospheric absorption function to obtain an absorption coefficient; and determining the environmental quality parameters from the scattering coefficient and the absorption coefficient.
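The scattering and absorption coefficients referred to in the backscatter step above are the quantities that appear in the standard single-scattering lidar equation; one common textbook form is given here for orientation (it is standard in the lidar literature, not quoted from this application):

```latex
% Standard single-scattering lidar equation (textbook form), giving the
% received power P at range R for wavelength \lambda:
P(R,\lambda) = P_0(\lambda)\,\frac{c\,\tau}{2}\,\eta\,\frac{A}{R^{2}}\,
\beta(R,\lambda)\,\exp\!\left(-2\int_{0}^{R}\alpha(r,\lambda)\,\mathrm{d}r\right)
```

Here $\beta$ is the backscatter coefficient and $\alpha$ the extinction (absorption plus scattering) coefficient; comparing the noise floors and returns at two or more wavelengths constrains $\beta$ and $\alpha$, which can then be mapped to environmental quality parameters such as particle concentration or fog cluster type.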
在另一种可能的实施例中,所述利用环境质量参数,对所述第二点云数据进行补偿,得到所述第一点云数据,包括:在预设的补偿公式中,获取与所述环境质量参数相匹配的目标补偿公式;利用所述目标补偿公式对所述第二点云数据进行补偿,得到所述第一点云数据。
在另一种可能的实施例中,所述方法还包括:根据所述第一点云数据,获取每种波长对应的单波长探测结果;根据多种波长的所述单波长探测结果,确定雷达探测结果。
在另一种可能的实施例中,所述根据所述第一点云数据,获取每种波长对应的单波长探测结果,包括:利用所述第一点云数据中各波长对应的光信号的收发时刻,确定各波长对应的第一距离,以作为所述单波长探测结果;所述根据多种波长的所述单波长探测结果,确定雷达探测结果,包括:根据多种波长对应的所述第一距离,确定第二距离,所述第二距离用于表征所述雷达系统与探测目标之间的距离。
在另一种可能的实施例中,所述根据所述第一点云数据,获取每种波长对应的单波长探测结果,包括:利用所述第一点云数据中各波长对应的光信号的回波强度,确定各波长对应的目标反射率,以作为所述单波长探测结果;所述根据多种波长的所述单波长探测结果,确定雷达探测结果,包括:根据多种波长对应的所述目标反射率,确定探测目标的类型。
在另一种可能的实施例中,所述方法还包括:确定所述雷达系统的探测参数;所述探测参数包括:所述多波长光信号的发射参数与所述雷达系统的配置参数;根据所述探测参数探测目标。
在另一种可能的实施例中,所述方法还包括:接收第一信息,所述第一信息用于指示所述雷达系统的所述发射参数和/或所述配置参数。
第五方面,本申请实施例提供一种雷达探测方法,可应用于如第三方面任意一种实施例所示的可移动设备中的控制器,在该雷达探测方法中,控制器可以接收来自于雷达系统的第二信息,第二信息中携带第一点云数据,从而,控制器可以基于第一点云数据,控制可移动设备移动。
第六方面,本申请实施例提供一种雷达探测方法,可应用于如第三方面任意一种实施例所示的可移动设备中的控制器,在该雷达探测方法中,控制器可以接收来自于雷达系统的第二信息,第二信息中携带第一点云数据,从而,控制器可以基于第一点云数据,获取各波长各自对应的单波长探测结果,进而,基于多种波长的单波长探测结果,确定雷达探测结果。由此,通过本方案,可移动设备可以基于雷达系统提供的点云数据,得到多个单波长探测结果,进而综合考虑各单波长探测结果得到最终的雷达探测结果,换言之,雷达探测结果是基于多波长光信号的探测结果综合获取到的,不同波长的光对环境、目标等的适应性不同,如此,能够得到更加全面的信息,也突破了单波长光受限于环境、目标导致应用场景单一,探测结果的精度不稳定的问题,有利于提高雷达探测结果的精确度,也有利于降低由此导致的自动驾驶过程的安全风险。
在第五方面或第六方面的一种可能的实施例中，所述根据所述第一点云数据，获取每种波长对应的单波长探测结果，包括：利用所述第一点云数据中各波长对应的光信号的收发时刻，确定各波长对应的第一距离，以作为所述单波长探测结果；所述根据多种波长的所述单波长探测结果，确定雷达探测结果，包括：根据多种波长对应的所述第一距离，确定第二距离，所述第二距离用于表征所述雷达系统与探测目标之间的距离。
在第五方面或第六方面的另一种可能的实施例中,所述根据所述第一点云数据,获取每种波长对应的单波长探测结果,包括:利用所述第一点云数据中各波长对应的光信号的回波强度,确定各波长对应的目标反射率,以作为所述单波长探测结果;所述根据多种波长的所述单波长探测结果,确定雷达探测结果,包括:根据多种波长对应的所述目标反射率,确定探测目标的类型。
在第五方面或第六方面的另一种可能的实施例中,所述方法还包括:向所述雷达系统发送第一消息,所述第一消息用于指示所述雷达系统的所述发射参数和/或所述配置参数。
在第五方面或第六方面的另一种可能的实施例中,所述可移动设备还包括:传感器和/或通信模组;所述方法还包括:通过所述传感器和/或所述通信模组,获取环境参数;根据所述环境参数,确定各波长的所述激光信号的所述发射参数和/或所述配置参数。
第七方面,本申请实施例提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行如第四方面、第五方面或第六方面中任意一种实施例所述的方法。
第八方面,本申请提供一种计算机程序,当所述计算机程序被计算机执行时,用于执行如第四方面、第五方面或第六方面中任意一种实施例所述的方法。
在一种可能的设计中,第八方面中的计算机程序,可以全部或者部分存储在一起且与处理器封装在一起的存储介质上,也可以部分或者全部存储在不与处理器封装在一起的存储器上。
综上,本申请实施例提供一种雷达系统、可移动设备与雷达探测方法,通过出射多波长光信号进行二维扫描,使得在二维扫描器的一个单位接收视场内可以有多个回波信号,突破了探测距离、TOF时间对单位接收视场内回波信号数目的限制,且基于多波长光信号得到的第一点云数据,可以携带各波长光信号各自携带的有效信息,也有利于突破单波长光信号对环境适应性的限制,能够有效提高点云数据中的有效信息的数目,提高了激光雷达的探测精度与探测性能,有利于降低由此导致的自动驾驶过程的安全风险。
附图说明
图1为本申请实施例提供的一种雷达探测场景的示意图;
图2为本申请实施例提供的另一种雷达探测场景的示意图;
图3为本申请实施例提供的另一种雷达探测场景的示意图;
图4为现有技术中飞行时间对单波长光信号的点云数的限制情况示意图;
图5为本申请实施例提供的不同波段的光信号示意图;
图6为本申请实施例提供的一种雷达系统的架构示意图;
图7为本申请实施例提供的一种雷达系统的波分模组的示意图;
图8为本申请实施例提供的另一种雷达系统的波分模组的示意图;
图9为本申请实施例提供的另一种雷达系统的波分模组的示意图;
图10为本申请实施例提供的另一种雷达系统的波分模组的示意图;
图11为本申请实施例中的一种二维扫描器的接收视场的示意图;
图12为本申请实施例中雷达系统进行二维扫描得到的点云示意图;
图13为本申请实施例中多波长光信号的发射视场与接收视场之间的关系示意图;
图14为本申请实施例中一种处理器获取雷达探测结果(雷达探测方法)的示意图;
图15为利用单波长光信号与多波长光信号分别识别同一个目标时的目标识别效果示意图。
具体实施方式
以下，结合附图对本申请实施例的实施方式进行详细描述。其中，在本申请实施例的描述中，除非另有说明，“/”表示或的意思，例如，A/B可以表示A或B；本文中的“和/或”仅仅是一种描述关联对象的关联关系，表示可以存在三种关系，例如，A和/或B，可以表示：单独存在A，同时存在A和B，单独存在B这三种情况。另外，在本申请实施例的描述中，“多个”是指两个或多于两个。
首先,对本申请实施例的应用场景进行说明。
本申请实施例所提供的技术方案应用于可移动设备利用雷达系统进行目标探测的场景。
可移动设备是指具备移动功能的设备。可移动设备可以包括:机身和控制器,控制器用于控制可移动设备。其中,控制器可以安装(或称为:集成、搭载)在机身上,或者,控制器还可以与可移动设备的机身分离设置。
本申请实施例中,可移动设备可以包括但不限于:车辆、无人机、地面机器人等。
示例性的,当可移动设备为车辆时,控制器可以为车辆的主控制器(或者,主控制器中的一个或多个处理单元),主控制器用于控制车辆,例如,主控制器可用于实现对车辆的行驶(包括自动驾驶)控制、对车载多媒体播放器的播放控制、对车载设备(例如,摄像头、车灯、定位系统等)的控制等。或者,控制器还可以为与车辆的主控制器通信连接的终端或其他远程控制器。后续具体说明通信连接方式。其中,本申请实施例所涉及到的终端可以包括但不限于:手机、平板电脑、笔记本电脑、掌上电脑、移动互联网设备(mobile internet device,MID)等,不作穷举。
示例性的,当可移动设备为无人机时,控制器可以为无人机的遥控器(或者,遥控器中的一个或多个处理单元),遥控器可以远程控制无人机执行飞行任务;或者,控制器可以为搭载于无人机机身中的处理器。
当可移动设备为地面机器人时，控制器可以为地面机器人的远程控制服务器（或者，远程控制服务器中的一个或多个处理单元），远程控制服务器可用于控制一个或多个地面机器人执行地面任务；或者，控制器也可以为安装于地面机器人机身中的处理器。
可移动设备可以利用雷达系统对附近环境或目标进行探测。具体实现时，雷达系统可以搭载（或称为：集成安装、安装、设置）于可移动设备的机身上。如此，可以在可移动设备移动过程中进行雷达探测。
可移动设备中的控制器与雷达系统可以通信连接。本申请实施例中，通信连接方式可以包括：有线连接和/或无线连接。其中，无线连接方式可以包括但不限于：无线保真（Wireless-Fidelity，WIFI）连接、蓝牙连接、近场通信（NFC）连接、车联网连接等，不作穷举。
本申请实施例给出图1~图3,对本申请实施例所应用的雷达探测场景作示例性说明。
示例性的,图1示出了一种雷达探测场景。如图1所示,可移动设备100为车辆,车辆可以在道路上行驶,车辆中具体包括:机身(或称为:车身、机体)110、雷达系统120与控制器130。其中,雷达系统120与控制器130都搭载于机身110上,雷达系统120用于对车辆100附近,例如车辆100行驶前方,进行目标探测。而控制器130与雷达系统120通信连接,用于控制车辆100行驶。
在图1所示的一种实施例中,控制器130与雷达系统120通信连接,可以接收来自于雷达系统120的雷达探测结果(具体内容后续详述),而控制器130则可参考雷达探测结果来控制车辆100。
示例性的,图2示出了另一种雷达探测场景。在该场景中,可移动设备100为车辆,在车辆的机身110上搭载有雷达系统120,用于对车辆附近进行目标探测。在该场景中,车辆(可移动设备100)的控制器130为手机,也即,手机用于控制车辆。其中,雷达系统120还与手机(控制器130)通信连接,手机(控制器130)可接收来自于雷达系统120的雷达探测结果,并据此控制车辆(可移动设备100)。
示例性的，图3示出了另一种雷达探测场景。在该场景中，可移动设备100为无人机，无人机可在空中飞行并执行飞行任务。在无人机的机身110上搭载有雷达系统120，用于对无人机附近进行目标探测。其中，无人机（可移动设备100）的控制器130为遥控器，用于控制无人机执行飞行任务。并且，遥控器（控制器130）与雷达系统120通信连接，遥控器可接收来自于雷达系统120的雷达探测结果，并据此控制无人机（可移动设备100）。
需要说明的是,在前述任意一种雷达探测场景中,控制器可以受控于用户,或者,也可以由控制器130自行控制可移动设备。
示例性的,在图1所示场景中,控制器130可以受控于司机,根据获取到的司机的操作信息,来执行相应的控制功能,而雷达系统120则可以在该场景下执行目标探测任务。也即,本申请可应用于司机驾驶车辆的场景。
或者,控制器130还可以自行控制车辆行驶。这种场景下,控制器130控制车辆自动驾驶(或称为:无人驾驶),雷达系统120也可以执行目标探测任务。也即,本申请亦可应用于车辆的无人驾驶场景。
此外，本申请实施例与可移动设备是否处于运动状态无关。换言之，本申请可以应用于可移动设备运动场景下的目标探测，亦可应用于可移动设备静止场景下的目标探测。例如，当车辆临时停车但未熄火的情况下，车辆处于静止状态，亦可利用雷达系统进行目标探测。又例如，当无人机在半空中静止不动时，亦可利用雷达系统探测附近的目标。
以及,除前述雷达系统120之外,可移动设备还可以搭载有其他传感器或其他雷达系统,本申请实施例对此无特别限制。示例性的,可移动设备还可以搭载但不限于:速度传感器、毫米波雷达、全球定位系统(Global Positioning System,GPS)接收器中的一种或多种。
本申请实施例中，雷达系统120可以具体为激光雷达系统，激光雷达系统又可称为：激光雷达、光学雷达系统、Lidar（light detection and ranging）等，是以发射激光束来探测目标的雷达系统。其原理在于，发射激光束，并接收激光束的反馈信号（或称为：信号回波、回波信号），通过对回波信号进行分析，以实现目标探测。激光雷达目前多应用于测绘学、考古学、地理学、地貌、地震、林业、遥感以及大气物理等领域。
激光雷达是可移动设备实现自动驾驶的最重要的传感器之一。换言之,雷达探测结果与可移动设备的驾驶安全直接相关。由此,激光雷达的探测性能就成为本领域重点研究的对象。
目前，可移动设备中搭载的激光雷达（例如，车载激光雷达等）一般为单波长激光雷达。所谓单波长激光雷达，是指采用单波长激光信号实现目标探测的激光雷达系统。其中，单波长激光信号可以是一个激光信号，也可以是由多个激光信号构成的激光束，此时激光束中各激光信号的波长相同，激光束的波长单一。
由于可移动设备对雷达探测结果的及时性要求较高,因此,一般通过激光雷达系统进行单点扫描(也可称为一维扫描或1D扫描)或二维扫描(也可称为2D扫描)。当以单波长激光雷达进行二维扫描时,一般是通过单点出射激光光束来实现,这种情况下,雷达系统可以得到三维点云数据,因此,雷达系统也可以称为三维雷达系统。
单波长激光雷达的测距性能较好、体积较小，能够满足目前车载雷达的探测需求。但是，单波长激光雷达的激光发射和接收通道有限，并行度较低，导致单位接收视场内的点云数目受限，因此，目前只能在分辨率和帧率两个参数之间进行权衡调整，以尽可能提高单波长激光信号的探测性能。
激光雷达的探测性能可以利用单位时间和空间内的点云数、分辨率中的至少一种进行表征。单位时间和空间范围内点云数越高,激光雷达的探测性能越好;单位时间和空间范围内分辨率越高,激光雷达的探测性能越好。
单位时间和空间内的点云数和分辨率受到多方面的制约。示例性的,探测性能的制约因素可以包括但不限于:激光器能力、激光器与扫描器的配合、测距所需的飞行时间(也即TOF时间)、激光器重频(也即,重复频率)、单次测量生成的点云数、单次测量所需的脉冲数、扫描速度、扫描范围、帧率中的一种或多种。
示例性的,激光雷达在单位时间和空间内的点云数可以满足如下关系:
点云数≤(激光器重复频率/单次测量所需脉冲数)×激光器数×单次测量生成的点云数
点云数≤激光器数×单次测量生成的点云数/TOF时间
激光雷达在单位时间和空间内的分辨率可以满足如下关系:
分辨率≥TOF时间×扫描速度
分辨率≥扫描范围×帧率×单次测量所需脉冲数/(激光器重频×激光器数量×单次测量生成的点云数)
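上述关系式可以用一段简单的代码演示其代入计算过程。以下为一个最小示例，其中重频、脉冲数、扫描速度、TOF时间等数值均为假设值，仅用于说明正文不等式的含义，并非任何真实激光雷达的指标：

```python
# 按正文给出的关系式，估算单位时间内点云数的上限与分辨率的下限。
# 各输入数值均为假设值，仅作演示。

def max_point_cloud(rep_rate_hz, pulses_per_meas, n_lasers, points_per_meas):
    """点云数 <= (激光器重频 / 单次测量所需脉冲数) × 激光器数 × 单次测量生成的点云数"""
    return rep_rate_hz / pulses_per_meas * n_lasers * points_per_meas

def min_resolution(scan_speed_deg_s, tof_s):
    """分辨率 >= TOF时间 × 扫描速度（单位：度）"""
    return tof_s * scan_speed_deg_s

# 假设：重频100kHz、单次测量1个脉冲、1个激光器、单次测量生成1个点
print(max_point_cloud(100_000, 1, 1, 1))   # 每秒点云数上限
# 假设：扫描速度3600°/s、单位TOF时间2µs
print(min_resolution(3600.0, 2e-6))        # 角分辨率下限（度）
```

可以看出，在其他参数不变时，增加激光器数（即增加可并行的通道数）可以线性提升点云数上限，这也正是后文多波长方案的出发点之一。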
具体而言,受限于芯片热性能限制,激光器重频难以大幅提升。因此,难以通过调整激光器重频来增加单位时间和空间内的点云数。
单次测量生成的点云数一般由探测器的像素数目决定。但增加探测器的数量,则会影响每个像素分摊的功率,影响探测距离。也即,在其他条件完全相同的情况下,激光雷达的探测器数目越多,探测距离越短。
探测距离又决定了TOF时间。TOF时间即为光脉冲信号(或称为:激光脉冲信号、激光信号、光信号)在雷达与目标之间一次往返(飞行)所需要的最多时长。在其他条件完全相同的情况下,探测距离越远,TOF时间越长,单位时间内和空间内的点云数越少。
现结合图4说明TOF时间对单波长光信号的点云数的限制情况。在图4中,两个虚线之间的区间即为一个单位TOF时间,Tx表示光脉冲信号的发射时刻,而Rx则表示光脉冲信号的回波时刻(也即:光脉冲信号经目标反射后的回波信号的接收时刻)。图4示例性的示出了三个单位飞行时间。在每个单位TOF时间内,只有一个单波长的光脉冲信号出射,也仅有一个回波信号接收。基于目标与雷达系统之间的距离不同,各个Rx在其各自的单位TOF中的位置不同。如图4所示,L2<L1<L3,其中,L1为Tx1与Rx1之间的时长,L2为Tx2与Rx2之间的时长,L3为Tx3与Rx3之间的时长。
如图4所示,在单位TOF时间内,目标与雷达系统之间的距离不同,光脉冲信号的回波时刻不同。这样,就有可能出现一个单位TOF时间内存在两个回波信号的情况,无疑,这可能会导致回波信号与光脉冲信号对应失误,进而导致探测结果的精度极低。因此,为了避免误判,激光雷达系统在一个单位TOF时间内,仅出射一次光脉冲信号,且仅接收一次光脉冲信号。
如图4所示,激光雷达在一个单位TOF时间内,最多只能发射一个单波长光脉冲信号,并最多接收一个光脉冲信号的回波信号。可以理解,在一个单位TOF内,可能无光脉冲信号发射,和/或,无回波信号接收。对此不过多展开。
那么,在探测距离确定的前提下,单位TOF时间亦可以据此确定,从而,激光雷达在单位时间(可以包括一个或多个单位TOF时间)内能够产生的最高的点云数,也就是:该单位时间内所包含的单位TOF时间的数目与单次测量生成的点云数之积。
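该段的换算关系可以用一个简短示例演示：先由最大探测距离得到单位TOF时间，再据此估算单位时间内可容纳的测量次数。示例中150m、1ms等数值均为假设：

```python
# 单位TOF时间 = 2 × 最大探测距离 / 光速；
# 单位时间内最高点云数 = 该时间内包含的单位TOF数目 × 单次测量生成的点云数。

C = 299_792_458.0  # 光速（m/s）

def unit_tof(max_range_m):
    return 2.0 * max_range_m / C

def max_points(duration_s, max_range_m, points_per_meas):
    return int(duration_s / unit_tof(max_range_m)) * points_per_meas

print(unit_tof(150.0))             # 150m探测距离对应约1µs的单位TOF时间
print(max_points(1e-3, 150.0, 1))  # 1ms内最多可容纳的测量次数
```

示例也体现了正文的结论：探测距离越远，单位TOF时间越长，单位时间内的点云数越少。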
除此之外,不同波长的光脉冲信号,对环境的适应性、适用性均不同。例如,不同波长的光脉冲信号,不仅影响激光雷达的环境适应性、抗噪能力、抗干扰能力,还能够影响激光雷达中的器件选型、芯片材料及工艺、普遍应用性、制造成本和便利程度等方面。
示例性的，图5示出了不同波段的光信号示意图。如图5所示，激光雷达所能发射出的激光信号的波段可以包括：紫外光（图5未示出）、可见光（Visible Light）、红外（Infrared，IR）光、近红外（Near Infrared，NIR）光或短波红外（Short-Wave Infrared，SWIR）光。如图5所示，红外光波段还可以进一步划分，包括：近红外波段（图5中标识为51）、短波红外波段（也可称近红外波段，图5中标识为52）、中间红外（Medium-Wave Infrared，MWIR）波段（也可称中红外波段，图5中标识为53）、长波红外（Long-Wave Infrared，LWIR）波段（也可称远红外波段，图5中标识为54）。
如图5所示，紫外光波段约为10~400nm，可见光波段约为390~750nm，近红外波段约为700~2500nm，中红外波段约为2.5~25μm，远红外波段约为25~500μm。在单波长激光雷达中，可以对外发射一种波长的激光信号，该激光信号的波长在图5所示的光信号的波段范围内。
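按本段给出的数值范围，可以写一个粗略的查表函数来判断某一波长所属的波段。需要说明：相邻波段的边界存在重叠（如390~400nm、700~750nm），示例按列出顺序取首个匹配，且将25μm以上按长波红外（也即远红外）计，这些处理方式均为演示性假设：

```python
def band(wavelength_nm):
    """按正文给出的波段范围粗略划分，返回波段名称。"""
    if 10 <= wavelength_nm < 390:
        return "紫外"
    if 390 <= wavelength_nm < 750:
        return "可见光"
    if 750 <= wavelength_nm < 2500:
        return "近红外"
    if 2500 <= wavelength_nm < 25_000:
        return "中红外"
    if 25_000 <= wavelength_nm <= 500_000:
        return "远红外"
    return "超出范围"

# 正文提到的几种典型车载激光雷达波长均落在近红外波段
print([band(w) for w in (905, 940, 1064, 1310, 1550)])
```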
图5所示的光信号的波长不同,这也导致不同波长的激光信号的环境适应性不同。具体而言,任意两种不同波长的激光信号,在大气穿透性、粒子散射和吸收、环境光噪声、目标反射率中的一种或多种方面不同。换言之,没有一种单一波长的激光信号可以满足各方面的优异性能,也没有一种单一波长的激光信号可以适应各种环境。
基于不同波长的激光信号的环境适应情况不同,那么,采用单一波长的激光信号进行目标探测时,激光雷达的探测性能也必然受到激光信号对环境适应性的相应影响。换言之,单波长激光雷达的环境适应性较差,导致激光雷达的探测性能不稳定。
举例说明。激光雷达1通过对外发射波长1的光脉冲信号实现目标探测。其中，波长1的光脉冲信号对雨雪环境的适应性较好，能够在雨雪环境中得到准确率较高的雷达探测结果；但是，波长1的光脉冲信号对沙尘暴环境的适应性较差，雷达探测结果的准确率较低。
在此前提下,车辆搭载有该激光雷达1,并据此(结合其他传感器等,不展开说明)实现自动驾驶。那么,当车辆行驶在雨雪环境时,激光雷达能够得到良好的雷达探测结果,这有利于车辆及时避障或采取其他行驶策略,能够在一定程度上降低车辆自动驾驶过程的安全风险。但是,当车辆行驶在沙尘暴环境中时,激光雷达获得的雷达探测结果的准确率较低,很有可能无法准确识别障碍物,导致车辆与障碍物相撞、追尾等安全事故,影响生命财产安全。
综上,对于单波长激光雷达而言,受限于探测距离、TOF时间、激光信号的环境适应性等,单波长激光雷达在单位TOF时间内,只有一个光脉冲信号的回波信号,无法突破TOF时间对点云数目与分辨率的限制。并且,单一波长的光脉冲信号的链路传输特性、目标反射特性、环境光噪声特性是唯一的,这也导致单波长激光雷达的点云数较少,进而导致点云中携带的有效信息也较少,这也极大影响了激光雷达的探测性能,激光雷达的探测性能较差。
本申请实施例提供一种雷达系统及其雷达探测方法，该雷达系统可以应用于前述任意一种雷达探测场景。示例性的，图1~图3所示的任意场景中，都可以采用本申请实施例所提供的雷达系统实现相应的探测功能。
图6示出了一种雷达系统的系统架构示意图。如图6所示，该雷达系统600包括：激光模组610、二维扫描器620、波分模组630、探测模组640与处理器650；其中，激光模组610，用于生成多波长光信号；二维扫描器620用于利用所述多波长光信号进行二维扫描，并接收位于所述二维扫描器620的接收视场内的回波信号，所述回波信号为被扫描物体在所述多波长光信号照射后形成的反射信号，其中，在二维扫描器的单位接收视场内包括多个回波信号；波分模组630，用于对回波信号作分光处理，得到多个单波长光信号；探测模组640，用于将多个单波长光信号分别转换为多个波长各自对应的电信号；处理器650，用于根据各波长对应的电信号获取第一点云数据。
在图6所示的雷达系统600中,激光模组610可以发出多波长光信号,从而,通过二维扫描器620实现二维扫描时,在二维扫描器的单位FOV内,包括多个回波信号。具体而言,当多波长光信号包括N种波长的光信号(或称为:激光信号、激光脉冲信号、光脉冲信号等)时,如图6所示的:λ1、λ2……λN,其中,N为大于1的整数;那么,在二维扫描器的单位FOV内包括N个回波信号。其中,可以理解,N个回波信号分别与N个波长的光信号一一对应。如图6所示,N个回波信号的波长也分别为λ1、λ2……λN,与激光模组输出的N个波长的光信号一一对应。
如此,相较于现有技术中单波长激光雷达的单位接收视场内仅能包含一个回波信号的情况,本申请能够突破TOF时间对点云数目的限制,并且,也能够突破单波长光信号对环境适应性的限制,有效增加了单位接收视场内的点云数目,点云数目的增加也增加了点云数据中携带的有效信息,有利于提高激光雷达的探测性能。综上,本申请所提供的雷达系统能够有效提高激光雷达的探测性能,有利于降低由此导致的自动驾驶过程的安全风险。后续具体说明这种技术效果。
现对雷达系统作进一步说明。
本申请实施例所提供的雷达系统实际为一种激光雷达系统,首先对其中的激光模组作具体说明。
激光模组用于生成并出射多波长光信号,多波长光信号可以由多个单波长光信号构成。需要说明的是,多波长光信号中包含的多个单波长光信号,可以同时输出,也可以分时输出,后续具体说明。基于此,激光模组还可以称为:多波长激光源(Multi-wavelength Laser Source)。
激光模组可以包括一个或多个激光器。其中,激光器可以包括单波长激光器与可调谐激光器。
单波长激光器用于生成单一波长的光脉冲信号;任意两个单波长激光器生成的光脉冲信号的波长不同。例如,激光模组可以包括5个单波长激光器,各单波长激光器分别用于生成并射出:905nm、940nm、1064nm、1310nm、1550nm的单波长光脉冲信号。
可调谐激光器则可用于生成多种波长的光脉冲信号。例如,可调谐激光器可以通过调谐来生成并分别射出905nm、940nm、1064nm、1310nm、1550nm的单波长光脉冲信号。
当激光模组仅包括一个激光器时,该激光器可以为可调谐激光器。这种情况下,所述多波长光信号包括多个单一波长的光信号。可调谐激光器生成的各单一波长的光信号的生成时刻不同,也即,可调谐激光器可以依次生成多个单一波长的光信号,而各单一波长的光信号可以依次出射。
或者,激光模组也可以包括多个激光器,此时,该激光模组可以包括:可调谐激光器和/或单波长激光器。这种情况下,所述多波长光信号包括:一个包括多种波长的光信号,或多个单一波长的光信号。
以激光模组包括N个激光器为例,例如,激光模组中可以包括N个单波长激光器, 如此,激光模组可以生成并输出N个波长的单波长光信号;又例如,激光模组中可以包括N-1个单波长激光器与1个可调谐激光器,如此,激光模组可以生成并输出至少N-1种单波长光信号。
当激光模组包括多个激光器时，多个激光器可以同时生成并出射多个单波长光信号，或者，分时生成并出射多个单波长光信号。
本申请实施例对于激光模组所能生成并出射的多波长光信号的波长数值无特别限制,前述波长仅示例性的进行说明,实际场景中,激光模组可以采用满足各自波段要求的各种激光器。
示例性的一种实施例中,可以基于大气穿透性、天气适应性、水吸收能力、目标反射率、太阳光或人眼安全中的一个或多个方面,来确定多波长光信号的波长。此时,可以参考表1,表1示出了前述多种影响因素对不同波长的光信号的影响情况。
表1
（表1原文为图片，此处未能提取表格内容）
环境对雷达探测过程的影响主要体现在两个方面:不同天气对激光信号在大气中的衰减情况存在影响,以及,不同天气也会影响目标反射率的增加或减弱。例如,在雨、雪、沙尘、雾霾等天气中,激光信号在大气中的衰减情况不同;又例如,雨、雪、沙尘、雾霾等天气也会导致目标反射率有不同程度的增加或减弱。
其中,大气衰减情况由消光系数决定,消光系数与吸收和散射相关,换言之,消光系数与吸收系数、散射系数相关。其中,大气对光信号的吸收主要是由于光信号的能量被水、二氧化碳(CO2)等分子吸收转换为热能和/或化学能等,其主要表现为光强的衰减。大气对激光信号的散射,则是光信号通过不均匀介质时造成一部分光偏离原传播方向的过程,主要由悬浮颗粒或大分子导致,不仅表现为光的衰减还有光束质量的劣化。并且,环境中也经常会出现雨、雪、沙尘等覆盖在探测目标表面的情况,此时,覆盖于探测目标表面的雨、雪、沙尘等,同样会导致目标反射特性的变化,造成反射率的降低、反射光强分布变化、反射光强方向变化等。
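消光系数对回波的衰减作用可以用比尔-朗伯定律的双程形式粗略示意：回波一来一回两次穿过大气，透过率约为exp(-2σR)。以下示例中的吸收系数、散射系数取值均为假设，仅用于说明“消光系数由吸收与散射共同决定”及其随距离的指数衰减关系：

```python
import math

def extinction(absorption_per_m, scattering_per_m):
    """消光系数 = 吸收系数 + 散射系数"""
    return absorption_per_m + scattering_per_m

def two_way_transmittance(extinction_per_m, range_m):
    """往返透过率 ≈ exp(-2·σ·R)（比尔-朗伯定律的双程形式）"""
    return math.exp(-2.0 * extinction_per_m * range_m)

sigma = extinction(1e-4, 2e-4)               # 假设的吸收、散射系数（1/m）
print(two_way_transmittance(sigma, 100.0))   # 100m目标处的往返透过率
```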
本申请实施例中，通过多波长光信号作目标探测，而不同波长的光信号用于探测同一个目标（或称为：探测目标）时，由于目标反射率的不同，也会造成回波信号的差异。因此，处理器可以基于不同波长的光信号的回波信号的差异，作差异性处理，如此，可以增强目标材质、类型等的识别概率，从而为自动驾驶从感知、融合到判断提供更加丰富精准的数据支撑。
除天气的影响之外,还需要考虑太阳光或其他可能的环境光对回波信号接收过程的干扰,尽量选择背景干扰光较弱的波段,避免环境光对高灵敏度探测器造成的虚警或动态范围的减小。例如,表1中1.15μm、1.34-1.45μm附近有两个相对最低太阳光窗口,940nm和1550nm的太阳光也较弱。
人眼安全可以通过人眼安全阈值进行表征,不同波长光信号的人眼安全阈值不同。其中,光信号的人眼安全阈值越高,允许的出射光功率越高,越有利于提升雷达系统的探测性能,且有利于减小发射通道数,并对二维扫描器的接收端和扫描端的要求较低。
如表1所示,光信号的波长越长,人眼越安全。当波长较长时,光信号不会在人眼的视网膜上成清晰像,同时,该波段水吸收特性较强,人眼球中的水可以吸收部分能量,使得视网膜上的能量大幅降低,保护人眼安全。
示例性的,人眼安全阈值可以为1400nm或1550nm。也即,波长超过1400nm或1550nm的光信号可以被认为是人眼安全波段。例如,当光信号的波长大于1400nm时,允许激光器单位时间单位空间出射更高功率,这些高功率激光可以显著提升系统测距性能,如300m以上的测距;或者将单个光束的能量分摊到多个探测器像素上,从而提升系统分辨率。
此外，由于不同激光器生成并输出的光信号的性能、人眼安全要求不同，还可以提升雷达系统的光信号（波长）的动态探测范围，甚至可以搭配使用满足全（波长）动态范围的激光器，实现全动态范围的目标探测。相比之下，现有的激光雷达动态范围仅为60~100dB，当其功率较低时，影响最远量测距离，当其功率较高时，又会产生饱和造成测距精度的下降；而本申请实施例所提供的雷达系统则可以避免该问题，甚至满足全动态范围的目标探测要求。
激光器需要满足对预设的多种波长的光脉冲信号的生成要求；除此之外，本申请实施例对于激光器的结构类型无特别限制。示例性的，激光模组中的激光器可以包括但不限于：固体激光器、光纤激光器、半导体激光器、气体激光器或染料激光器中的一种或多种。不同波长的光脉冲信号所适用的激光器的类型可参考后续表2，此处不再展开说明。
其次,对雷达系统中的探测模组进行说明。
基于波分模组处理后进入探测模组的各单波长光信号的波长不同,由此,在探测模组中,也可以包括一个或多个探测器(Detector),用以对不同波长的各单波长光信号作光电转换处理。
其中,探测器可以包括单波长探测器与多波长探测器。
任意一个所述单波长探测器用于接收并处理一种波长的所述单波长光信号。换言之,单波长探测器用于对单一波长的光脉冲信号作光电转换处理。在探测模组640中,存在至少两个单波长探测器所适用的光脉冲信号的波长不同。例如,探测模组640可以包括5个单波长探测器,各单波长探测器分别用于对905nm、940nm、1064nm、1310nm、1550nm的单波长光脉冲信号作光电转换处理。
而多波长探测器则可适用于多种波长的光脉冲信号，也即多波长探测器用于接收并处理多种波长的所述单波长光信号。具体而言，多波长探测器可实现对多种波长的光脉冲信号的光电转换处理。例如，多波长探测器可用于对905nm、940nm、1064nm、1310nm、1550nm的单波长光脉冲信号作光电转换处理。
那么,当探测模组仅包括一个探测器时,该探测器可以为多波长探测器。
或者,当探测模组包括多个探测器时,多个探测器可以包括但不限于:多波长探测器和/或单波长探测器。以探测模组包括N个探测器为例,例如,探测模组中可以包括N个单波长探测器,如图6所示,一个单波长探测器可用于对一种波长的单波长光信号进行处理,波长分别为λ1、λ2……λN的N个回波信号,也分别由N个探测器进行处理;又例如,探测模组中可以包括N-1个单波长探测器与1个多波长探测器,如此,探测模组可用于对至少N-1种单波长光信号作光电转换处理。
探测器需要满足对预设的多种波长的光脉冲信号的处理要求；除此之外，本申请对其类型或结构无特别限制。示例性的，探测模组中的探测器可以包括但不限于：铟镓砷（也即：铟砷化镓，InGaAs）探测器、硅（Si）探测器；具体而言，探测器可以具体包括但不限于：雪崩光电二极管（avalanche photodiode，APD）探测器、P型半导体-杂质-N型半导体（Positive-Intrinsic-Negative，PIN）探测器、单光子雪崩二极管探测器（Single Photon Avalanche Diode，SPADs）、硅光电倍增管（Silicon photomultiplier，SiPM）探测器等。
示例性的，本申请还提供了表2。表2具体示出了不同波长的光信号所适用的激光器与探测器的类型。
表2
（表2原文为图片，此处未能提取表格内容）
如表2所示，不同波长的光脉冲信号所适用的激光器、探测器的类型不同。其中，GaAs表示砷化镓，InP表示磷化铟，Er-Glass表示掺铒玻璃，Er-doped Fiber表示掺铒光纤，Nd:YAG表示钇铝石榴石晶体，Nd:YLF表示钕掺杂氟化钇锂，SiGe表示硅化锗，这些材料组分或掺杂决定了激光的波段。此外，材料的类型决定了激光器的类型。如表2所示，本申请所提供的雷达系统可以采用但不限于：半导体激光器、固体激光器等。可以理解，表2为示例性的，实际场景中，可以包括但不限于表2所示的情况。例如，固体激光器还可以包括但不限于：半导体泵浦固体激光器（DPSSL）、灯泵浦固体激光器、光纤激光器等，不作穷举。
示例性的,850~940nm的光信号适用于半导体激光器,1064nm的光信号则适用于固体激光器或光纤激光器,1035nm的光信号适用于半导体激光器或固体激光器,1550nm的光信号则适用于光纤激光器。
示例性的,850~940nm的光信号适用于APD探测器或SPADs探测器,1064nm的光信号则适用于APD探测器,1035nm的光信号适用于SPADs探测器,1550nm的光信号则适用于APD探测器或SPADs探测器。其中,适用于850~940nm的光信号的APD探测器与SPADs探测器可以为硅材料探测器,其他探测器则可以为铟镓砷材料探测器。
在表2的一种具体实施例中,本申请实施例所提供的雷达系统,可以部署前述至少两种波长的激光器,以及,相应地部署这两种波长的探测器。如此,雷达系统可以使用不同的激光器生成相应波长的光信号以实现雷达探测,从而,可以利用各激光器、探测器各自的特性,例如激光器光束质量、探测器灵敏度、噪声等特征,进一步实现对雷达系统的成本和体积优化。此处对如何选取激光器、探测器来实现雷达系统的优化不作展开讨论,但通过合理的选取布置,可以达到优化效果。
其次,对雷达系统中的波分模组作具体说明。
如前所述,波分模组,用于对回波信号作分光处理,得到多个单波长光信号。如图6所示,波分模组630可以设置于二维扫描器620与探测模组640之间。
示例性的一种实施例中,波分模组630可用于同时对回波信号作分光处理,并得到多个单波长光信号,任意两个单波长光信号的波长不同。
相较于前述现有技术中多个单一波长的光脉冲信号与相应的多个回波信号可能存在的测距误差或测距精度不足的问题,本申请利用多种不同波长的光脉冲信号来探测目标,如此,即便多个不同波长的光脉冲信号的出射时刻相同,探测距离相同,但各波长的回波信号的接收时刻不同,如此,各波长的光脉冲信号与回波信号可以实现精确对应。那么,即便波分模组同时处理各回波信号,也不会出现误判的情况。这也能够解决现有技术中单位TOF时间对点云数目的限制,后续详述。
在本申请的一种可能的实施例中,波分模组还可以设置于激光模组与二维扫描器之间,此时,波分模组还可以用于:对激光模组输出的多波长光信号作分光处理或聚光处理,并提供给所述二维扫描器进行二维扫描。
示例性的,激光模组可以通过多个激光器输出N个不同波长的光脉冲信号,这些光脉冲信号可以经波分模组作聚光处理,形成一束光信号,并通过二维扫描器出射。
示例性的,激光模组可以通过多个激光器同时输出N个不同波长的光脉冲信号,这些光脉冲信号混杂成一束光出射,这种情况下,这一束光可以经波分模组作分光处理后形成多个单波长光信号,并通过二维扫描器出射。后续结合附图7进行说明。
本申请实施例中,波分模组可以包括但不限于:分光器、光纤、透镜、棱镜、反射镜或衍射器件中的一种或多种。
示例性的,波分模组可以仅包括一个分光器,例如,图6所示的雷达系统中,波 分模组630可以具体为一个分光器(Beam Splitter)。
示例性的,波分模组还可以是由除分光器之外的其他的一种或多种光学元件构成。此时,可以参考图7~图10。其中,实线箭头表示多波长光信号(也即发射光线),虚线箭头表示回波信号(也即接收光线),黑色粗线条表示反射镜,椭圆表示镜头模组,其中,镜头模组可以由透镜、棱镜、反射镜中的至少一种构成,对此无特别限定。
图7示出了本申请实施例提供的一种雷达系统的波分模组的示意图。如图7所示，激光模组610出射的发射光线经反射镜71、反射镜72的反射，到达二维扫描器620，二维扫描器620可以在预设范围内转动，经二维扫描器620的反射，发射光线出射。相应地，不同波长的接收光线也在二维扫描器620的接收视场内接收，并反射至镜头模组73，并经镜头模组73反射至反射镜74，进而，接收光线可进入探测模组640。
在图7的一种可能的实施例中,反射镜74之后还可以进一步连接一个分光器,由分光器作进一步分光处理后,各单波长光信号可以进入相应波长的探测器。
在图7所示实施例中,波分模组包括:反射镜71、反射镜72、镜头模组73与反射镜74,其中,反射镜71与反射镜72用于对多波长光信号作分光处理或聚光处理;而镜头模组73与反射镜74(或者还可以包括分光器),则用于对回波信号作分光处理。
另,在图7所示的实施例中,各光学元件形成一种同轴光路,换言之,该雷达系统为一种同轴光学系统。
图8示出了本申请实施例提供的另一种雷达系统的波分模组的示意图。如图8所示，激光模组610出射的发射光线直接经二维扫描器620出射。在二维扫描器620的一个单位接收视场内，可以有多个不同波长的回波信号，这些接收光线可以经过反射镜81或反射镜82反射到反射镜83，并经过反射镜83反射出去。
在图8所示实施例中,波分模组包括:反射镜81~反射镜83,其中,反射镜81~反射镜83,用于对回波信号作分光处理。
另,在图8所示的实施例中,各光学元件形成一种同轴光路,换言之,该雷达系统为一种同轴光学系统。
图9示出了本申请实施例提供的另一种雷达系统的波分模组的示意图。如图9所示，激光模组610出射的发射光线经反射镜91、反射镜92、二维扫描器620出射。在二维扫描器620的一个单位接收视场内，可以有多个不同波长的回波信号，这些接收光线经过镜头模组93的反射到达探测器，或经过镜头模组93后经分光器处理后再进入探测器。
在图9所示实施例中,波分模组包括:反射镜91、反射镜92与镜头模组93,其中,反射镜91与反射镜92用于对多波长光信号作分光处理或聚光处理;而镜头模组93(或者还可以包括分光器),则用于对回波信号作分光处理。
另,在图9所示的实施例中,各光学元件形成一种同轴光路,换言之,该雷达系统为一种同轴光学系统。
图10示出了本申请实施例提供的另一种雷达系统的波分模组的示意图。如图10所示，激光模组610出射的发射光线直接经二维扫描器620出射。在二维扫描器620的一个单位接收视场内，可以有多个不同波长的回波信号，这些接收光线经过镜头模组101的反射到达探测器，或经过镜头模组101后经分光器处理后再进入探测器。
以上，图7~图10为示例性的，本申请实施例对于雷达系统所采取的波分元件并无特别限制，对各光学元件所构成的光路为同轴光路或离轴光路亦无特别限制，也即，该雷达系统可以为离轴光学系统或同轴光学系统。实际场景中，可配合前述可输出多波长光信号的激光模组、二维扫描器与探测模组来实现本方案。
现对雷达系统中的二维扫描器作具体说明。
本申请实施例中,雷达系统采用二维扫描器进行扫描,二维扫描器具备在二维平面内进行扫描的能力。
可以理解,二维扫描器也具备一维扫描(单点扫描)的能力,本申请实施例所提供的雷达系统也能够用于实现一维扫描。此处不作展开。以下针对二维扫描器进行二维扫描的场景,说明本方案。
如前所述,例如图7~图10所示,二维扫描器可以在预设的一定范围内转动。在二维扫描器移动过程中,都可以出射光脉冲信号并接收其回波信号。示例性的,二维扫描器可以从左到右依次转动,每转动单位角度就可以对外出射光脉冲信号,并且在当前接收视场内接收光脉冲信号的回波信号。如此,二维扫描器不断转动并出射光信号、接收回波信号,从而,实现二维扫描。
任意一个所述多波长光信号的出射位置,落在该多波长光信号的接收视场内。此时可以参考图11,图11具体示出了一种二维扫描器的接收视场的示意图。在二维扫描器(Scanner)转动过程中,可以形成如图11所示的扇形区域的接收视场(也即:Receiver FOV,接收FOV)。在该接收视场内,多波长光信号可以在位置1(可记为:Tx laser)所示位置出射,而位置2(可记为:Possible Echo 1)与位置3(可记为:Possible Echo 2)则为该多波长光信号的可能的回波信号的接收位置。
在二维扫描过程中，由于激光信号入射到空气中后，原有激光信号的发散角将增大，那么，激光信号经目标反射后就会呈现多个角度不确定方向的反射光线（理想目标一般为朗伯反射体）。但在二维扫描过程中，二维扫描器出射激光后，还会继续运动，包括等待接收激光回波的过程，因此，接收FOV需要考虑光束发散角、目标处光斑、TOF时间内目标回波可能的入射角度等，以及，在两个扫描方向上的接收FOV均需要大于发射光束发散角。
需要说明的是,本申请实施例通过多波长光信号进行二维扫描的过程中,多波长光信号包含多种波长的光信号,任意两个光信号的波长不同,不同波长的光信号的发射参数可以不同。本申请实施例所涉及到的发射参数可以包括但不限于:发散角、出射位置、出射时刻、出射角度、接收视场的位置、接收视场的尺寸、飞行时间中的一种或多种。
以下具体说明。
示例性的,多波长光信号中各波长的光信号的发散角可以相同,也可以完全不同, 或者,也可以不完全相同(存在至少两种波长的发射信号的发散角相同)。
示例性的,多波长光信号中各波长的光信号的出射时刻可以相同、不完全相同或完全不同。例如,波长分别为λ1、λ2……λN的N个光信号中,N个光信号可以由激光模组依次生成并出射,如此,N个光信号的出射时刻完全不同。又例如,多波长光信号包含的N个光信号,可以由激光模组中的N个单波长激光器生成,并同时出射,如此,N个光信号的出射时刻可以完全相同。又例如,多波长光信号包含的N个光信号中,部分光信号可以是由多个单波长激光器同时生成并出射的,还有部分光信号可以是由可调谐激光器生成并依次出射的,也即,N个光信号的出射时刻不完全相同。
示例性的,多波长光信号中各波长的光信号的出射角度不同。如前所述,二维扫描器可以持续移动,那么,当N个光信号的出射时刻不同时,各光信号通过二维扫描器出射时的出射角度也不同、接收视场的位置也不同。此外,多波长光信号中N个光信号的接收视场可以不同。
示例性的,不同波长的光信号的飞行时间可以不同、相同或不完全相同。
本申请实施例中,多波长光信号通过雷达系统的二维扫描器出射,并接收经目标返回的回波信号,这种情况下,在二维扫描器的单位接收FOV内可以包括多个回波信号。这种情况下,多波长光信号包括多种波长的光信号,各波长的光信号的最小接收视场的参数相同、不完全相同或完全不同;其中,最小接收视场的参数包括:最小接收视场的位置、尺寸或个数中的一种或多种。
二维扫描器的单位接收FOV的最小取值(也即最小接收视场)为二维扫描器的扫描速度与飞行时间的乘积。不同波长的光信号的飞行时间可以不同,因此,不同光信号的最小接收视场的尺寸可以不同。以及,不同波长的光信号的出射时刻、出射时长可以不同,不同波长光信号的最小接收视场的位置及个数也可以不同。
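“最小接收视场=扫描速度×飞行时间”这一关系可以用下面的小例子演示：TOF时间不同的两种波长，其像素（最小接收视场）的角尺寸也不同。数值均为假设：

```python
def min_receive_fov(scan_speed_deg_s, tof_s):
    """最小接收视场（角尺寸，单位：度）= 扫描速度 × 飞行时间"""
    return scan_speed_deg_s * tof_s

# 假设扫描速度为3600°/s，两种波长分别采用1µs与2µs的TOF时间
fov_short = min_receive_fov(3600.0, 1e-6)
fov_long = min_receive_fov(3600.0, 2e-6)
print(fov_short, fov_long)  # TOF更长的波长，对应的像素角尺寸更大
```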
后续,为便于说明,可将光信号的一个最小接收视场视作一个“像素”。需要说明的是,像素并不等同于二维扫描器的单位接收FOV:像素是光信号在目标探测过程中的最小接收FOV,而单位接收FOV则是针对二维扫描器而言的,在二维扫描器的一个单位接收FOV中,可以包括一个像素或多个像素。
示例性的,多波长光信号可以包括第一波长光信号与第二波长光信号,且所述第一波长光信号与所述第二波长光信号的波长不同;那么,当所述第一波长光信号与所述第二波长光信号的发射参数不同时,所述第一波长光信号与所述第二波长光信号的最小接收视场的参数不同。
图12示出了雷达系统进行二维扫描得到的点云示意图。图12中12A、12B与12C分别示出了波长为λ1、λ2、λ3的光信号的点云示意图。其中,12A为雷达系统通过波长为λ1的光信号1进行目标探测得到的点云示意图;12B为雷达系统通过波长为λ2的光信号2进行目标探测得到的点云示意图;12C为雷达系统通过波长为λ3的光信号3进行目标探测得到的点云示意图。
在12A、12B与12C中，任意一个矩形区域可以视作一个像素，也可视作二维扫描器的最小扫描视野，而每个像素对应的方位角即为雷达系统的雷达分辨率。单位矩形区域（也即像素）的尺寸与光信号的飞行时间（TOF）相关。如图12所示，光信号1与光信号2的TOF时间相同，光信号1对应的像素尺寸与光信号2对应的像素尺寸相同；而光信号1与光信号3的TOF时间不同，光信号1对应的像素尺寸与光信号3对应的像素尺寸不同。
图12中的圆点表示回波信号的可能的接收位置，也即，圆点表示回波信号在单位像素中的接收位置。具体而言，回波信号在像素中的接收位置与扫描轨迹、目标的距离相关联。对于任意一个单波长的光信号而言，一个像素中只允许一个回波信号。例如，在12A、12B与12C中，任意一个像素中仅包含一个圆点（回波信号）。这种情况也即前文所述，雷达系统的点云数与分辨率受限于TOF时间的情况。
本申请实施例通过多波长光作雷达探测。此时,图12中的12D、12E与12F示出了多波长扫描场景下的点云示意图。
其中，12D为雷达系统通过光信号1与光信号2在完全相同的接收FOV内进行目标探测得到的点云示意图。如12D所示，当光信号1与光信号2在同一个接收FOV内被接收到，二者的像素完全相同。此时，任意一个像素中包括2个圆点，也即包括两个回波信号。
12E为雷达系统通过光信号1与光信号2在尺寸相同但交错的接收FOV内进行目标探测得到的点云示意图。如12E所示，虽然光信号1的像素与光信号2的像素不完全重合，但相较于12A与12B，在单位接收FOV内也可以显著增加点云数目。
而12F为雷达系统通过光信号1、光信号2与光信号3在3个不同的接收FOV内进行目标探测得到的点云示意图。如12F所示，当以三个不同波长的光信号进行雷达探测时，三个光信号可以同时或分时发射和接收，各波长的光信号通过各自的激光器发射并波分复用进入相对应的探测器接收，互不干扰，这也大大提升了雷达系统的点云数与分辨率。
示例性的，可以参考图13，图13示出了多波长光信号的发射视场与接收视场之间的关系示意图。为便于理解，图13仅以波长为λ1的光信号1与波长为λ2的光信号2为例作具体说明。以及，可以理解的是，图13仅示例性的示出了几种可能的情况，实际场景则可以包括但不限于图13所示的情况。图13中的圆点用于对光信号1与光信号2的发射位置进行示意，光信号1与光信号2的发射位置可以同样大小，也可以不同，图13示例性的以不同大小的两个圆点对此进行示意。
图13具体示出了10个接收视场的示意图,如图13所示,在每个接收视场中,都可以包括一个或两个光信号的发射视场。如图13中的接收视场1~接收视场3所示,光信号1与光信号2的发射视场大小不同,但位置重叠;在接收视场4~接收视场6中,光信号1的发射视场与光信号2的发射视场大小不同、位置无相交,且二者的发射视场都位于所属的接收视场中;在接收视场7、9中包含光信号1的发射视场,接收视场8、10中则包含光信号2的发射视场,光信号1与光信号2的发射视场完全无关。
需要说明的是,图13所示的10个接收视场,可以是雷达系统在进行二维扫描过程中依次运动时的接收视场的示意图,此时,雷达系统在进行二维扫描过程中,二维扫描器的接收视场是可变动的。或者,雷达系统也可以按照固定模式进行二维扫描,例如,可以按照接收视场1~接收视场3进行二维扫描,也可以按照接收视场4~接收视场6进行扫描,也可以通过接收视场7~接收视场10的方式交替发射不同波长的光信号来进行二维扫描。
如图13所示，各波长激光器的发散角、探测器的FOV、TOF时间等可以相同也可以不同。各激光器出射激光可以独立对应自己波长的接收FOV，也可以多个波长的激光对应同一接收FOV。各波长激光器出射时刻和出射方位角可以在时域和空间域上对齐，也可以分别采用各自的起始时刻和起始位置。各波长探测器对应最大测量距离的终止时刻和最大接收方位角位置可以在时域和空间域上对齐，也可以分别设置。
在前述任意一种实施例的雷达系统中,处理器用于根据各波长对应的所述电信号获取第一点云数据。具体而言,本申请实施例可以提供至少两种实现方式。
在一种可能的实施例中，处理器可以根据多个波长各自对应的电信号，直接生成第一点云数据。在该实施例中，处理器在接收到来自于探测模组的电信号后，根据这些电信号各自对应的发射参数进行计算，以确定各电信号对应的点云位置即可，从而，汇总各电信号对应的点云位置，即可得到第一点云数据。
该实施例的实现方式简捷易行。并且,在该实施例中,第一点云数据是基于多种波长的光信号作目标探测得到的,如前所述,相较于现有的单波长光信号作二维扫描的方式,第一点云数据的点云数目可呈倍数增长,这不仅突破了TOF时间等对点云数目的限制,而且,有利于解决单波长光信号对环境的适应性问题(后续具体说明这种情况)。
除直接生成第一点云数据之外，处理器还可以计算环境质量参数，并利用其对生成的点云数据（此时可将生成的点云数据记为第二点云数据）进行补偿，以此来得到（补偿后的）第一点云数据。换言之，处理器还可以具体用于：根据所述多个波长各自对应的电信号，生成第二点云数据，以及，利用环境质量参数，对所述第二点云数据进行补偿，得到所述第一点云数据。
其中,环境质量参数用于指示当前环境对雷达探测结果的影响情况。本申请实施例所涉及到的环境质量参数可以包括但不限于:雾团类型、天气恶劣等级、粒子浓度、湿度或粒子尺寸分布中的一种或多种。
本申请实施例中,处理器还可用于获取环境质量参数。
一种可能的实施例中,具体而言,当环境质量参数被存储在预设位置时,则处理器可以直接获取该预设位置存储的数据,如此,即可获取到环境质量参数。在该实施例中,环境质量参数可以由其他电子设备,例如雷达系统所搭载的可移动平台的控制器计算得到,并预存在该预设位置的。
另一种可能的实施例中,处理器还可具体用于计算并获取得到环境质量参数。换言之,处理器可以基于多个波长各自对应的电信号,来计算环境质量参数。
在该实施例中,处理器可以在多个波长各自对应的电信号中,提取各波长对应的噪声参数,然后,根据多种波长对应的噪声参数,确定环境质量参数。其中,噪声参数可以包括但不限于:后向散射噪声参数。除此之外,噪声参数还可以包括:大气噪声参数、其他雷达的噪声参数、雾团噪声参数、环境光噪声(例如,太阳光噪声等)参数等。
大气对信号的影响分两部分，散射和吸收。散射和吸收都会对信号产生衰减和脉冲展宽作用。信号衰减会导致目标信号的减弱，影响测距性能和目标信号强度（反射率）估计；脉冲展宽会导致回波信号时刻测量误差，从而影响测距精度。散射会将激光分散为不同方向的反射光，其中后向散射光会进入激光雷达系统，在回波信号上产生后向散射噪声，影响目标信号探测的信噪比。
由此,可以通过回波信号反演计算得到散射系数和吸收系数,对噪声信号进行滤波或抑制,对目标信号进行波形和幅度修正。其中,散射系数和吸收系数可用于确定环境质量参数。
示例性的一种实施例中,处理器获取环境质量参数时,可以利用后向散射函数处理多个波长对应的后向散射噪声参数,得到散射系数;以及,利用大气吸收函数处理多个波长对应的所述电信号与后向散射噪声参数,得到吸收系数;从而,根据散射系数与吸收系数,确定环境质量参数。
例如,当雷达系统利用多波长光信号(包括光信号1与光信号2,且二者波长不相等)进行目标探测时,经探测器处理后,处理器可以接收到对应于光信号1的电信号1,以及,对应于光信号2的电信号2。
在该场景中,处理器可以对电信号1作匹配滤波处理,以提取出电信号1对应的后向散射噪声参数1,然后,利用后向散射函数处理后向散射噪声参数1,得到散射系数1;以及,利用大气吸收函数处理后向散射噪声参数1与电信号1,得到吸收系数1。以及,处理器还可以对电信号2作匹配滤波处理,以提取出电信号2对应的后向散射噪声参数2,然后,利用后向散射函数处理后向散射噪声参数2,得到散射系数2;以及,利用大气吸收函数处理后向散射噪声参数2与电信号2,得到吸收系数2。之后,处理器就可以基于散射系数1、吸收系数1、散射系数2与吸收系数2,确定环境质量参数。
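由于正文并未给出后向散射函数、大气吸收函数与环境参数确定模型的具体形式，下面仅以假设的线性形式示意“噪声参数→散射/吸收系数→环境质量参数”这一数据流，所有函数形式与数值均为演示性假设，并非真实反演算法：

```python
# 以下函数形式与数值均为假设，仅演示数据流。

def scattering_coeff(backscatter_noise, k=2.0):
    # 假设：散射系数与后向散射噪声强度成正比
    return k * backscatter_noise

def absorption_coeff(signal_power, backscatter_noise, k=0.5):
    # 假设：吸收系数由后向散射噪声与回波功率之比推出
    return k * backscatter_noise / signal_power

def environment_quality(coeffs, threshold=1.0):
    # 假设的判别模型：各系数之和超过阈值即认为环境恶劣
    return "恶劣" if sum(coeffs) > threshold else "良好"

# 两种波长各自的（回波功率, 后向散射噪声），数值为假设
s1 = scattering_coeff(0.3)
a1 = absorption_coeff(2.0, 0.3)
s2 = scattering_coeff(0.4)
a2 = absorption_coeff(1.5, 0.4)
print(environment_quality([s1, a1, s2, a2]))
```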
可以理解,不同的环境质量参数的确定方式不同,具体确定环境质量参数时,只需要将计算得到的散射系数与吸收系数带入相应的环境参数确定模型,即可得到环境参数确定模型所输出的环境质量参数。换言之,环境参数确定模型用于确定环境质量参数,模型的输入可以为多波长光信号对应的散射系数与吸收系数,模型的输出为环境质量参数。例如,第一模型用于确定雾团类型,第二模型用于确定天气恶劣程度;那么,处理器可以将前述散射系数1、吸收系数1、散射系数2与吸收系数2输入第一模型,即可获取到第一模型输出的雾团类型;将前述散射系数1、吸收系数1、散射系数2与吸收系数2输入第二模型,即可获取到第二模型输出的天气恶劣程度。
需要说明的是，本申请实施例对环境参数确定模型的模型类型无特别限制。例如，环境参数确定模型可以为公式型模型、神经网络模型或其他数学模型等。
如前所述,环境质量参数可以由雷达系统中的处理器计算得到,或者,也可以由其他处理器计算得到。例如,雷达系统可以将第二点云数据和/或多个波长对应的电信号向主控制器发送,而主控制器可以据此来获取环境质量参数;之后,主控制器可以向雷达系统发送该环境质量参数,或者,主控制器可以将环境质量参数存储在预设位置,且雷达系统具备该预设位置的数据获取权限。
本申请实施例中，环境质量参数可以是实时计算得到的，也可以是间隔式计算得到的。例如，在实际的雷达探测场景中，处理器接收到多个不同波长的电信号后，可直接生成第二点云数据，以及，实时地基于接收到的电信号计算出环境质量参数，然后，再利用环境质量参数对第二点云数据进行补偿，得到第一点云数据。或者，又例如，雷达系统可以周期性计算环境质量参数，如此，当处理器生成第二点云数据后，可以直接获取当前周期（或最近一次）计算得到的环境质量参数，并据此来实现对第二点云数据的补偿，得到第一点云数据。
在具体利用环境质量参数对第二点云数据进行补偿时,可以按照如下方式处理:处理器可以在预设的补偿公式中,获取与环境质量参数相匹配的目标补偿公式,然后,利用目标补偿公式对第二点云数据进行补偿,得到第一点云数据。
需要说明的是,预设的补偿公式可以是经验公式,或者,通过预设的标定实验得到的公式。本申请实施例对于补偿公式的来源无特别限制。
一种可能的实施例中,可以预设各环境质量参数与各预设的补偿公式之间的对应关系,那么,在确定目标补偿公式时,只需要获取环境质量参对应的一个补偿公式即可。例如,可以提前预设雾团类型与预设的补偿公式之间的对应关系,如此,只需要按照前述步骤确定雾团类型,并获取该雾团类型对应的补偿公式,即可得到目标补偿公式。
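“按环境质量参数选取目标补偿公式”可以实现为一个简单的查表过程。以下示例中的雾团类型与补偿公式均为假设，仅演示查表与逐点补偿的流程：

```python
# 假设的“环境质量参数（雾团类型）→ 补偿公式”对应表
COMPENSATIONS = {
    "平流雾": lambda d: d * 1.02,  # 假设：距离整体放大2%
    "辐射雾": lambda d: d * 1.05,
}

def compensate(distances, fog_type):
    """按雾团类型选取目标补偿公式，对第二点云数据中的各距离逐点补偿。"""
    f = COMPENSATIONS.get(fog_type, lambda d: d)  # 无匹配时不补偿
    return [f(d) for d in distances]

print(compensate([100.0, 200.0], "平流雾"))
```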
在利用目标补偿公式对第二点云数据进行补偿时,基于预设的补偿公式不同,补偿方式也有所区别,这里对补偿公式无特别限制。示例性的一种可能的实施例中,可以利用目标补偿公式处理所述二维扫描器的扫描参数、所述探测模组的探测响应度、放大系数、所述第一光信号的发射参数、所述第二光信号的接收参数,得到补偿值,然后,利用补偿值对第二点云数据进行补偿,得到第一点云数据。
本申请实施例所提供的雷达系统,除直接输出第一点云数据之外,还可以对第一点云数据进行分析,以确定雷达探测结果,进而,还可以输出雷达探测结果。换言之,该雷达系统可用于输出:第一点云数据和/或雷达探测结果。
示例性的一种实施例中,处理器,还可以根据第一点云数据,获取每种波长对应的单波长探测结果,然后,根据多种波长的单波长探测结果,确定雷达探测结果。如此,区别于现有技术中直接将单波长探测结果作为雷达探测结果的情况,本申请是综合了多种波长的单波长探测结果,得到的雷达探测结果,这能够有效解决单一波长光信号受限于环境的问题,有利于提高雷达探测结果的精度。
本申请实施例所涉及到的雷达探测结果可以包括但不限于:目标(或称为探测目标)与雷达(也即雷达系统)之间的距离和/或目标类型。
示例性的,处理器可以利用第一点云数据中各波长对应的光信号的收发时刻,确定各波长对应的第一距离,以作为单波长探测结果;然后,根据多种波长对应的第一距离,确定第二距离,第二距离用于表征雷达系统与探测目标之间的距离。
处理器可以利用速度距离公式对光信号的收发时刻与光速进行计算,可以得到各波长光各自对应的第一距离,可以理解,第一距离用于表征一种波长的光探测得到的雷达系统与探测目标之间的距离。而本申请中,根据多个第一距离确定第二距离时,可以有多种实现方式。
示例性的，一种可能的实施例中，处理器可以获取多个第一距离的平均值，并将该平均值确定为第二距离。另一种可能的实施例中，处理器可以过滤掉多个第一距离中的最大值与最小值之后，再获取剩余各第一距离的平均值，并将该平均值确定为第二距离。另一种可能的实施例中，处理器可以获取多个第一距离中的最小值，并将该最小值确定为第二距离。不作穷举。
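上述几种由多个“第一距离”确定“第二距离”的方式，可以用如下小例子对照实现（距离数值为假设）：

```python
def mean_fusion(dists):
    """取全部第一距离的平均值作为第二距离"""
    return sum(dists) / len(dists)

def trimmed_fusion(dists):
    """剔除最大值与最小值后取平均（假设剔除后仍非空）"""
    kept = sorted(dists)[1:-1]
    return sum(kept) / len(kept)

def min_fusion(dists):
    """取最小的第一距离作为第二距离（偏保守的一种取法）"""
    return min(dists)

d = [100.2, 100.0, 100.4, 99.8]  # 假设的4个波长各自的第一距离（米）
print(mean_fusion(d), trimmed_fusion(d), min_fusion(d))
```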
处理器还可以利用第一点云数据中各波长对应的光信号的回波强度,确定各波长对应的目标反射率,以作为单波长探测结果,然后,根据多种波长对应的目标反射率,确定探测目标的类型。
示例性的，一种可能的实施例中，处理器可以获取多个目标反射率的平均值，并将该平均值对应的类型确定为探测目标的类型。另一种可能的实施例中，处理器可以过滤掉多个目标反射率中的最大值与最小值之后，再获取剩余各目标反射率的平均值，并将该平均值对应的类型确定为探测目标的类型。另一种可能的实施例中，处理器还可以选取多个目标反射率中的最大值或最小值，并将其对应的类型确定为探测目标的类型。
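多波长反射率用于目标类型识别，可以理解为把各波长的目标反射率拼成一个特征向量，再与特征库比对。以下特征库中的材质与反射率数值均为假设，仅用于说明：在波长1下反射率相同、无法区分的两种材质，引入波长2的反射率后即可区分：

```python
# 假设的材质反射率特征库：[波长1反射率, 波长2反射率]
SIGNATURES = {
    "金属": [0.9, 0.3],
    "织物": [0.9, 0.7],  # 波长1下与金属相同，仅靠单波长无法区分
}

def classify(reflectances):
    """按与特征库的欧氏距离平方最小原则，判定探测目标的类型。"""
    def dist2(sig):
        return sum((a - b) ** 2 for a, b in zip(reflectances, sig))
    return min(SIGNATURES, key=lambda name: dist2(SIGNATURES[name]))

print(classify([0.88, 0.32]))  # 更接近"金属"的多波长反射率特征
```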
示例性的,图14示出了一种处理器获取雷达探测结果(雷达探测方法)的示意图。如图14所示,处理器在接收到电信号1(波长1)与电信号2(波长2)后,分别进行匹配滤波处理,将电信号1分为噪声信号1与波形信号1,将电信号2分为噪声信号2与波形信号2。另,波长1与波长2不相等。
之后,一方面,处理器可以基于噪声信号1确定噪声参数1,并基于噪声信号2确定噪声参数2,然后,结合噪声参数1与噪声参数2,以及,电信号1、电信号2,确定环境质量参数。
另一方面,处理器可以获取波形信号1的回波时刻1与回波强度1;以及,获取波形信号2的回波时刻2与回波强度2。
在此基础上，处理器可以基于环境质量参数与回波时刻1计算出电信号1对应的第一距离1。其中，在该步骤中，处理器可利用环境质量参数计算测距误差1，并利用该测距误差1对距离1’进行补偿得到第一距离1，其中，距离1’是直接根据电信号1对应的光信号的发射时刻与回波时刻1计算得到的。以及，处理器还可以基于环境质量参数与回波时刻2计算出波长2对应的第一距离2。其中，处理器可利用环境质量参数计算测距误差2，并利用该测距误差2对距离2’进行补偿得到第一距离2，其中，距离2’是直接根据电信号2对应的光信号的发射时刻与回波时刻2计算得到的。之后，处理器就可以根据第一距离1与第一距离2，确定雷达系统与探测目标之间的第二距离。
除此之外，处理器还可以基于环境质量参数与回波强度1计算出电信号1对应的目标反射率1。具体实现时，处理器可以基于环境质量参数计算回波强度1的补偿值1，并利用该补偿值1对回波强度1进行补偿，进而，利用补偿后的回波强度1’来计算得到目标反射率1。以及，处理器还可以基于环境质量参数与回波强度2计算出电信号2对应的目标反射率2。具体实现时，处理器可以基于环境质量参数计算回波强度2的补偿值2，并利用该补偿值2对回波强度2进行补偿，进而，利用补偿后的回波强度2’来计算得到目标反射率2。之后，处理器就可以基于目标反射率1与目标反射率2，确定探测目标的类型。
而基于多个不同波长对应的目标反射率，来确定目标类型的实现方式，可以有效提升目标识别准确率。当目标距离和形貌信息相似时，主要依靠反射率信息进行目标识别分类。目标反射率与入射波长、目标材质、角度等相关，在某个单一波段的不同材质反射率可能相同，利用相同目标在不同波长下的差异化反射率信息可以有效增强目标识别效果。
示例性的，图15示出了利用单波长光信号与多波长光信号分别识别同一个目标时的目标识别效果示意图。其中，15A为采用单波长光信号进行目标探测的效果示意图，15B则为采用多波长光信号进行目标探测的效果示意图。明显地，15B中包含的点云数目更多，所能识别出的有效信息也更多。
在前述任意一种实施例中,雷达系统的处理器,还可以用于确定雷达系统的探测参数;并根据探测参数探测目标。其中,这里所涉及到的探测参数可以包括但不限于:多种波长的多波长光信号的发射参数与雷达系统的配置参数。其中,发射参数可以参考前文说明,这里不再重复。而配置参数则用于实现对雷达系统中各模块的配置。例如,配置参数可以包括但不限于:扫描频率、探测频率等。
示例性的一种实施例中，雷达系统的探测参数可以由控制器确定。也即，控制器可以确定雷达系统的探测参数，并向雷达系统的处理器发送第一信息，而第一信息用于指示雷达系统的发射参数和/或配置参数。如此，处理器确定探测参数时，可以接收来自于控制器的第一信息，并基于接收到的第一信息，确定探测参数。例如，处理器可以直接将第一信息中携带的发射参数和/或配置参数确定为探测参数。或者，处理器还可以在第一信息中携带的发射参数和/或配置参数的基础上，按照预设算法进行自定义调整，并将调整后的发射参数和/或配置参数确定为探测参数。
或者,雷达系统的探测参数也可以由处理器自行确定。
示例性的一种可能的实施例中,雷达系统的探测参数已经被提前配置好,处理器可以自行提取预配置的数据,以作为探测参数。
示例性的另一种可能的实施例中,雷达系统也可以获取当前环境的环境参数,并根据环境参数,确定发射参数和/或配置参数。其中,环境参数可以包括但不限于:当前环境的天气类型、当前世界坐标或当前环境的图像数据等中的一种或多种。由此,环境参数可以来源于与雷达系统通信连接的其他传感器或通信模组。
示例性的,传感器可以包括但不限于:毫米波雷达、图像采集装置、全球定位系统接收器(Global Positioning System,GPS)、惯性量测单元、人机交互接口中的一种或多种。
例如,车辆中搭载有雷达系统与摄像头(一种图像采集装置),且雷达系统与摄像头通信连接,摄像头可以将采集到的图像数据向雷达系统发送。如此,雷达系统接收到这些图像数据后,具体而言,处理器接收到图像数据后,可以通过图像识别技术确定当前环境的天气类型,进而,根据预设的天气类型与探测参数(发射参数和/或配置参数)之间的对应关系,并按照当前天气类型对应的一种探测参数来探测目标。
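“预设的天气类型与探测参数之间的对应关系”同样可以实现为查表。以下天气类型与参数取值均为假设，仅演示按当前天气类型选取探测参数的过程：

```python
# 假设的天气类型 → 探测参数（发射参数/配置参数）对应表
PRESETS = {
    "晴天": {"波长_nm": 905, "扫描频率_hz": 10},
    "雨雪": {"波长_nm": 1550, "扫描频率_hz": 20},
}
DEFAULT = {"波长_nm": 905, "扫描频率_hz": 10}

def detection_params(weather_type):
    """按识别出的天气类型选取探测参数；无匹配时回退到默认配置。"""
    return PRESETS.get(weather_type, DEFAULT)

print(detection_params("雨雪"))
```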
其中,通信模组可以连接到其他电子设备或连接到网络,如此,可以通过通信模组来获取到当前天气类型。需要说明的是,通信模组可以是雷达系统的通信模组,或者,可以是雷达系统所搭载的可移动设备的通信模组,此时,该通信模组与雷达系统直接或间接通信连接(例如,通过控制器)。
综上,本申请实施例所提供的雷达系统,能够通过不同波长的多波长光信号,同时或分时进行目标探测,并通过一个或多个2D扫描器在扫描FOV范围内实现3D的点云图。该技术方案可以弥补当前车载激光雷达产品,特别是基于2D扫描架构的激光雷达系统多种性能方面的缺陷,提升整体测量能力。具体体现在以下几个方面:
分辨率方面。多波长的采用可以在有限的接收FOV范围内突破TOF时间和扫描性能的限制,增加像素单元数,从而提升激光雷达分辨率;可利用各波长光束质量差异,在需要提高分辨率区域利用发散角较小的波长密集探测;并且可以利用不同波长人眼安全的差异,提升激光功率,将单个光束的发射能量分摊到多个探测器像素单元,从而进一步提升系统分辨率。
点云数方面。如前文所述,激光雷达点云数与收发通道数、TOF时间、激光器重频等相关,多波长是有效突破TOF限制的方法,同时,不同波长激光器的重频、脉宽、功率等性能不同,可以进一步提升单位时间和空间内的出光效率。
人眼安全和动态范围。不同波长激光的人眼安全阈值不同,可以利用人眼安全阈值高的长波波段输出高功率激光进行远距探测,用人眼安全阈值低的短波波段输出低功率探测近距离和中距离,同时弥补长波长高功率在短距离测量中的饱和现象和盲区限制,提升系统整体动态范围和全测量范围内的测距精度。
提升目标识别效果。对于装备激光雷达的车载感知系统,当目标距离和形貌信息相似时,主要依靠反射率信息进行目标识别分类,目标反射率与入射波长、目标材质、角度等相关,在某个单一波段的不同材质反射率可能相同,利用相同目标在不同波长下的差异化反射率信息可以有效增强目标识别效果。
提升不同天气适应性。利用不同波长在不同恶劣天气下(沙尘、雨、雪、雾等)光散射和光吸收特性的差异,可获得不同的后向散射回波噪声,判断obscurants(雾团)类型和程度,可用于信号回波强度、测距精度等补偿,也可有助于噪声回波峰的滤除。
本申请实施例还提供了一种可移动设备及其雷达探测方法。可以参考前述图1~图3，该可移动设备可以包括机身、搭载于机身上的雷达系统与控制器。其中，控制器用于控制可移动设备移动，雷达系统可以为本申请所提供的任意一种实施例所示的雷达系统，对此不再重复。雷达系统与控制器耦合，如此，控制器可以基于来自于雷达系统的数据（第一点云数据和/或雷达探测结果）控制可移动设备。
现对控制器做简要说明。
本申请实施例中,控制器可以用于控制雷达系统工作,以探测目标,还可以用于接收雷达系统输出的数据。
在一种可能的实施例中,雷达系统可以输出第一点云数据和/或雷达探测结果;而控制器则可以用于基于第一点云数据,控制可移动设备移动。
在另一种可能的实施例中，雷达系统可直接输出第一点云数据。如此，当雷达系统并未输出雷达探测结果的情况下，可移动设备的控制器可以接收来自于雷达系统的第一点云数据，并基于第一点云数据获取雷达探测结果。也即，控制器可以接收来自于所述雷达系统的第二信息，所述第二信息携带第一点云数据，然后，根据所述第一点云数据，获取每种波长对应的单波长探测结果，从而，根据多种波长的所述单波长探测结果，确定雷达探测结果。
示例性的,控制器可以利用所述第一点云数据中各波长对应的光信号的收发时刻,确定各波长对应的第一距离,以作为所述单波长探测结果,然后,根据多种波长对应的所述第一距离,确定第二距离,所述第二距离用于表征所述雷达系统与探测目标之间的距离。
示例性的,控制器可以利用所述第一点云数据中各波长对应的光信号的回波强度,确定各波长对应的目标反射率,以作为所述单波长探测结果,然后根据多种波长对应的所述目标反射率,确定探测目标的类型。
控制器基于第一点云数据获取雷达探测结果的具体实现方式,与前述雷达系统中的处理器基于第一点云数据获取雷达探测结果的方式相同,可以参考前文,这里不再重复。
除此之外,控制器在控制雷达系统执行目标探测任务时,还可以确定雷达系统的探测参数。也即,确定雷达系统的发射参数和/或所述配置参数。确定方式亦可参考前文。之后,控制器可以向雷达系统发送第一消息,所述第一消息用于指示所述雷达系统的所述发射参数和/或所述配置参数。
其中,控制器确定探测参数时,可以通过可移动设备中搭载的传感器和/或通信模组,获取环境参数,并根据所述环境参数,确定各波长的所述激光信号的所述发射参数和/或所述配置参数。可移动设备中搭载的传感器可以包括但不限于:毫米波雷达、图像采集装置、全球定位系统接收器、惯性量测单元、人机交互接口中的一种或多种。此处亦不再重复。
换言之，本申请实施例中，可移动设备可以承担部分雷达系统的计算功能，并基于多波长光信号得到的第一点云数据得到精确度较高的雷达探测结果。
可以理解的是,上述实施例中的部分或全部步骤或操作仅是示例,本申请实施例还可以执行其它操作或者各种操作的变形。此外,各个步骤可以按照上述实施例呈现的不同的顺序来执行,并且有可能并非要执行上述实施例中的全部操作。
可以理解的是,以上各个实施例中,由处理器实现的操作或者步骤,也可以由可用于处理器中的部件(例如芯片或者电路)实现,由控制器实现的操作或者步骤,也可以由可用于控制器中的部件(例如芯片或者电路)实现。
本申请实施例还进一步提供了一种电子设备。该电子设备可以用于实现上述方法实施例中描述的处理器侧或控制器侧对应部分的方法,具体参见上述实施例中的说明。
该电子设备可以包括一个或多个处理单元，所述处理单元也可以称为处理器（注意，此处的处理器是指前文雷达系统的处理器中的处理模块或处理单元），可以实现一定的控制功能。所述处理单元可以是通用处理单元或者专用处理单元等。
在一种可选地设计中,处理单元也可以存有指令,所述指令可以被所述处理单元运行,使得所述电子设备执行上述方法实施例中描述的对应于处理器侧或控制器侧的方法。
在又一种可能的设计中,电子设备可以包括电路,所述电路可以实现前述方法实施例中发送或接收或者通信的功能。
可选地,所述电子设备中可以包括一个或多个存储器,其上存有指令或者中间数据,所述指令可在所述处理单元上被运行,使得所述电子设备执行上述实施例中描述的方法。可选地,所述存储器中还可以存储有其他相关数据。可选地处理单元中也可以存储指令和/或数据。所述处理单元和存储器可以单独设置,也可以集成在一起。
可选地,所述电子设备还可以包括收发器。所述收发器可以称为收发单元、收发机、收发电路、或者收发器等,用于实现电子设备的收发功能。
若该电子设备为雷达系统中的处理器,则该电子设备中的处理单元用于在接收到多种波长对应的电信号时,确定并输出第一点云数据和/或雷达探测结果,而电子设备中的收发器可以用于接收来自于控制器的第一信息,又例如,收发器还可以用于向控制器发送第二信息等。收发器还可以进一步完成其他相应的通信功能。而处理单元用于完成相应的确定或者控制操作,可选的,还可以在存储器中存储相应的指令。各个部件的具体的处理方式可以参考前述实施例的相关描述。
若该电子设备为可移动设备中的控制器,则该电子设备中的处理单元可以用于在接收第一点云数据,并据此获取雷达探测结果,而电子设备中的收发器可以用于接收来自于雷达系统的第二信息,又例如,收发器还可以用于向雷达系统发送第一信息等。收发器还可以进一步完成其他相应的通信功能。而处理单元用于完成相应的确定或者控制操作,可选的,还可以在存储器中存储相应的指令。各个部件的具体的处理方式可以参考前述实施例的相关描述。
本申请中描述的处理单元和收发器可实现在集成电路（integrated circuit，IC）、模拟IC、射频集成电路RFIC、混合信号IC、专用集成电路（application specific integrated circuit，ASIC）、印刷电路板（printed circuit board，PCB）、电子设备等上。该处理单元和收发器也可以用各种IC工艺技术来制造，例如互补金属氧化物半导体（complementary metal oxide semiconductor，CMOS）、N型金属氧化物半导体（nMetal-oxide-semiconductor，NMOS）、P型金属氧化物半导体（positive channel metal oxide semiconductor，PMOS）、双极结型晶体管（Bipolar Junction Transistor，BJT）、双极CMOS（BiCMOS）、硅锗（SiGe）、砷化镓（GaAs）等。
可选的,电子设备可以是独立的设备或者可以是较大设备的一部分。例如所述设备可以是:(1)独立的集成电路IC,或芯片,或,芯片系统或子系统;(2)具有一个或多个IC的集合,可选地,该IC集合也可以包括用于存储数据和/或指令的存储部件;(3)ASIC,例如调制解调器(MSM);(4)可嵌入在其他设备内的模块;(5)接收机、终端、蜂窝电话、无线设备、手持机、移动单元,电子设备等等;(6)其他等等。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序,当其在计算机上运行时,使得计算机执行上述实施例中处理器或控制器所实现的雷达探测方法。
此外,本申请实施例还提供一种计算机程序产品,该计算机程序产品包括计算机程序,当其在计算机上运行时,使得计算机执行上述实施例中处理器或控制器所实现的雷达探测方法。
在上述实施例中，可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时，可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时，全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中，或者从一个计算机可读存储介质向另一个计算机可读存储介质传输，例如，所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线（例如同轴电缆、光纤、数字用户线）或无线（例如红外、无线、微波等）方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质（例如，软盘、硬盘、磁带）、光介质（例如，DVD）、或者半导体介质（例如固态硬盘Solid State Disk）等。

Claims (30)

  1. 一种雷达系统,其特征在于,包括:
    激光模组,用于生成多波长光信号;
    二维扫描器，用于利用所述多波长光信号进行二维扫描，并接收位于所述二维扫描器的接收视场内的回波信号，所述回波信号为被扫描物体在所述多波长光信号照射后形成的反射信号；其中，在所述二维扫描器的单位接收视场内包括多个所述回波信号；
    波分模组，用于对所述二维扫描器接收的所述回波信号作分光处理，得到多个单波长光信号；
    探测模组，用于将所述多个单波长光信号分别转换为与各波长对应的电信号；
    处理器，用于根据所述电信号获取第一点云数据。
  2. 根据权利要求1所述的系统,其特征在于,所述多波长光信号包括第一波长光信号与第二波长光信号;所述第一波长光信号与所述第二波长光信号的波长不同;
    当所述第一波长光信号与所述第二波长光信号的发射参数不同时,所述第一波长光信号与所述第二波长光信号的最小接收视场的参数不同;
    其中,所述发射参数包括:发散角、出射位置、出射时刻、出射角度、接收视场的位置、接收视场的尺寸或飞行时间中的一种或多种;
    所述最小接收视场的参数包括:所述最小接收视场的位置、尺寸或个数中的一种或多种。
  3. 根据权利要求1或2所述的系统,其特征在于,所述激光模组包括:一个或多个激光器;
    当所述激光模组包括一个所述激光器时,所述激光器为可调谐激光器;所述多波长光信号包括多个单一波长的光信号;
    当所述激光模组包括多个所述激光器时,多个所述激光器包括:可调谐激光器和/或单波长激光器;其中,任意两个所述单波长激光器生成的光信号的波长不同;所述多波长光信号包括:一个包括多种波长的光信号,或多个单一波长的光信号。
  4. 根据权利要求1-3任一项所述的系统,其特征在于,所述波分模组,具体用于:
    对所述回波信号作分光处理,得到多个所述单波长光信号;任意两个所述单波长光信号的波长不同。
  5. 根据权利要求1-4任一项所述的系统,其特征在于,所述波分模组还用于:
    对所述激光模组生成的所述多波长光信号作分光处理或聚光处理,并提供给所述二维扫描器进行二维扫描。
  6. 根据权利要求1-5任一项所述的系统,其特征在于,所述波分模组包括:分光器、光纤、透镜、棱镜、反射镜或衍射器件中的一种或多种。
  7. 根据权利要求1-6任一项所述的系统,其特征在于,所述探测模组包括:一个或多个探测器;
    当所述探测模组包括一个所述探测器时,所述探测器为多波长探测器,所述多波长探测器用于接收并处理多种波长的所述单波长光信号;
    当所述探测模组包括多个所述探测器时，多个所述探测器包括：多波长探测器和/或单波长探测器；其中，任意一个所述单波长探测器用于接收并处理一种波长的所述单波长光信号。
  8. 根据权利要求1-7任一项所述的系统,其特征在于,所述雷达系统为离轴光学系统或同轴光学系统。
  9. The system according to any one of claims 1-8, wherein the processor is specifically configured to:
    generate second point cloud data from the electrical signals corresponding to the respective wavelengths; and
    compensate the second point cloud data using an environment quality parameter to obtain the first point cloud data.
  10. The system according to claim 9, wherein the processor is further configured to:
    extract, from the electrical signals corresponding to the respective wavelengths, a noise parameter for each wavelength; and
    determine the environment quality parameter from the noise parameters of the plurality of wavelengths.
  11. The system according to claim 10, wherein the noise parameters comprise backscatter noise parameters;
    and the processor is specifically configured to:
    process the backscatter noise parameters of the plurality of wavelengths with a backscatter function to obtain a scattering coefficient;
    process the electrical signals and the backscatter noise parameters of the plurality of wavelengths with an atmospheric absorption function to obtain an absorption coefficient; and
    determine the environment quality parameter from the scattering coefficient and the absorption coefficient.
  12. The system according to any one of claims 9-11, wherein the environment quality parameter comprises one or more of: a fog type, a weather severity level, a particle concentration, a humidity, or a particle size distribution.
  13. The system according to claim 9, wherein the processor is specifically configured to:
    select, from preset compensation formulas, a target compensation formula matching the environment quality parameter; and
    compensate the second point cloud data with the target compensation formula to obtain the first point cloud data.
  14. The system according to any one of claims 1-13, wherein the processor is further configured to:
    obtain, from the first point cloud data, a single-wavelength detection result for each wavelength; and
    determine a radar detection result from the single-wavelength detection results of the plurality of wavelengths.
  15. The system according to claim 14, wherein the processor is specifically configured to:
    determine, from the transmit and receive times of the optical signal of each wavelength in the first point cloud data, a first distance for each wavelength as the single-wavelength detection result; and
    determine a second distance from the first distances of the plurality of wavelengths, the second distance characterizing the distance between the radar system and a detection target.
  16. The system according to claim 14 or 15, wherein the processor is specifically configured to:
    determine, from the echo intensity of the optical signal of each wavelength in the first point cloud data, a target reflectivity for each wavelength as the single-wavelength detection result; and
    determine a type of the detection target from the target reflectivities of the plurality of wavelengths.
  17. The system according to any one of claims 1-16, wherein the processor is further configured to:
    determine detection parameters of the radar system, the detection parameters comprising transmission parameters of the multi-wavelength optical signal of the plurality of wavelengths and configuration parameters of the radar system; and
    detect a target according to the detection parameters.
  18. The system according to claim 17, wherein the processor is further configured to:
    receive first information, the first information indicating the transmission parameters and/or the configuration parameters of the radar system.
  19. A movable device, characterized by comprising:
    the radar system according to any one of claims 1-18; and
    a controller, coupled to the radar system and configured to control the movable device to move based on the first point cloud data.
  20. A movable device, characterized by comprising:
    the radar system according to any one of claims 1-13; and
    a controller, coupled to the radar system and configured to obtain, based on the first point cloud data, a single-wavelength detection result for each wavelength, and to determine a radar detection result from the single-wavelength detection results of the plurality of wavelengths.
  21. The movable device according to claim 19 or 20, wherein the controller is specifically configured to:
    send a first message to the radar system, the first message indicating the transmission parameters and/or the configuration parameters of the radar system.
  22. The movable device according to claim 21, wherein the movable device further comprises a sensor and/or a communication module;
    and the controller is further configured to:
    obtain environment parameters through the sensor and/or the communication module; and
    determine, from the environment parameters, the transmission parameters and/or the configuration parameters of the laser signal of each wavelength.
  23. The movable device according to claim 22, wherein the sensor comprises one or more of: a millimeter-wave radar, an image capture apparatus, a global positioning system receiver, an inertial measurement unit, or a human-machine interface.
  24. The movable device according to any one of claims 19-23, wherein the controller is further configured to:
    receive second information from the radar system, the second information carrying the first point cloud data;
    obtain, from the first point cloud data, a single-wavelength detection result for each wavelength; and
    determine a radar detection result from the single-wavelength detection results of the plurality of wavelengths.
  25. The movable device according to claim 24, wherein the controller is specifically configured to:
    determine, from the transmit and receive times of the optical signal of each wavelength in the first point cloud data, a first distance for each wavelength as the single-wavelength detection result; and
    determine a second distance from the first distances of the plurality of wavelengths, the second distance characterizing the distance between the radar system and a detection target.
  26. The movable device according to claim 24 or 25, wherein the controller is specifically configured to:
    determine, from the echo intensity of the optical signal of each wavelength in the first point cloud data, a target reflectivity for each wavelength as the single-wavelength detection result; and
    determine a type of the detection target from the target reflectivities of the plurality of wavelengths.
  27. The movable device according to any one of claims 19-26, wherein the movable device comprises a vehicle, an unmanned aerial vehicle, or a ground robot.
  28. A radar detection method, characterized by comprising:
    generating a multi-wavelength optical signal;
    performing two-dimensional scanning with the multi-wavelength optical signal, and receiving echo signals within a receiving field of view of a two-dimensional scanner, the echo signals being reflected signals formed when a scanned object is illuminated by the multi-wavelength optical signal, wherein a unit receiving field of view of the two-dimensional scanner contains a plurality of the echo signals;
    splitting the echo signals to obtain a plurality of single-wavelength optical signals;
    converting the plurality of single-wavelength optical signals into electrical signals corresponding to the respective wavelengths; and
    obtaining first point cloud data from the electrical signals.
  29. A radar detection method, characterized by being applied to a controller in a movable device, the movable device further comprising a body and a radar system mounted on the body, and the controller being coupled to the radar system;
    the method comprising:
    receiving second information from the radar system, the second information carrying first point cloud data; and
    controlling the movable device to move based on the first point cloud data.
  30. A radar detection method, characterized by being applied to a controller in a movable device, the movable device further comprising a body and a radar system mounted on the body, and the controller being coupled to the radar system;
    the method comprising:
    receiving second information from the radar system, the second information carrying first point cloud data;
    obtaining, from the first point cloud data, a single-wavelength detection result for each wavelength; and
    determining a radar detection result from the single-wavelength detection results of the plurality of wavelengths.
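As a purely illustrative aside (not part of the claims), the per-wavelength ranging described in claims 15 and 25 can be sketched in a few lines: the transmit and receive times of each wavelength's optical signal yield a "first distance" via time of flight, and the first distances of the several wavelengths are then fused into the "second distance" to the detection target. All names below are hypothetical, and the mean is only one possible fusion rule; the claims do not prescribe one.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def first_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """One-way distance (m) from a round-trip time of flight."""
    return C * (t_receive_s - t_transmit_s) / 2.0

def second_distance(echoes: dict) -> float:
    """Fuse per-wavelength first distances into a single target distance.

    `echoes` maps wavelength (m) -> (transmit time, receive time) in seconds.
    The fusion rule here is a plain mean, chosen only for illustration.
    """
    distances = [first_distance(t_tx, t_rx) for t_tx, t_rx in echoes.values()]
    return sum(distances) / len(distances)

# Echoes at 905 nm and 1550 nm, both ~667 ns round trip (a target near 100 m).
print(round(second_distance({905e-9: (0.0, 667.1e-9),
                             1550e-9: (0.0, 667.3e-9)}), 1))  # → 100.0
```

The reflectivity-based target typing of claims 16 and 26 would hang off the same per-wavelength bookkeeping, using echo intensities instead of timestamps.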
PCT/CN2020/097435 2020-06-22 2020-06-22 Radar system, movable device and radar detection method WO2021258246A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202080102277.9A CN115702364A (zh) 2020-06-22 2020-06-22 一种雷达系统、可移动设备与雷达探测方法
EP20942163.5A EP4166988A4 (en) 2020-06-22 2020-06-22 RADAR SYSTEM, MOBILE DEVICE AND RADAR DETECTION METHOD
PCT/CN2020/097435 WO2021258246A1 (zh) 2020-06-22 2020-06-22 一种雷达系统、可移动设备与雷达探测方法


Publications (1)

Publication Number Publication Date
WO2021258246A1 (zh)

Family

ID=79282651


Country Status (3)

Country Link
EP (1) EP4166988A4 (zh)
CN (1) CN115702364A (zh)
WO (1) WO2021258246A1 (zh)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104251995A (zh) * 2013-06-27 2014-12-31 杭州中科天维科技有限公司 彩色激光三维扫描技术
CN108303701A (zh) * 2018-01-19 2018-07-20 上海禾赛光电科技有限公司 激光雷达系统、激光脉冲的发射方法及介质
CN108872968A (zh) * 2018-07-02 2018-11-23 广州市杜格科技有限公司 激光扫描目标物体中彩色点云的提取方法及其系统
CN109073757A (zh) * 2016-04-22 2018-12-21 欧普赛斯技术有限公司 多波长lidar系统
CN109557554A (zh) * 2018-12-03 2019-04-02 北京觉醒纪科技有限公司 激光雷达和车辆
US20190257927A1 (en) * 2018-02-16 2019-08-22 Xiaotian Steve Yao Optical sensing based on wavelength division multiplexed (wdm) light at different wavelengths in light detection and ranging lidar systems

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
KR102136401B1 (ko) * 2013-10-21 2020-07-21 한국전자통신연구원 다-파장 이미지 라이다 센서장치 및 이의 신호처리 방법
KR102457029B1 (ko) * 2016-09-20 2022-10-24 이노비즈 테크놀로지스 엘티디 Lidar 시스템 및 방법
WO2019134745A1 (de) * 2018-01-03 2019-07-11 Hybrid Lidar Systems Ag Anordnung und verfahren zur laufzeitmessung eines signals zwischen zwei ereignissen
CN110988846B (zh) * 2019-04-22 2023-07-18 威力登激光雷达美国有限公司 可用于激光雷达的噪点识别方法以及激光雷达系统


Non-Patent Citations (1)

Title
See also references of EP4166988A4 *

Cited By (7)

Publication number Priority date Publication date Assignee Title
CN115034324A (zh) * 2022-06-21 2022-09-09 同济大学 一种多传感器融合感知效能增强方法
CN115034324B (zh) * 2022-06-21 2023-05-02 同济大学 一种多传感器融合感知效能增强方法
CN115436962A (zh) * 2022-08-02 2022-12-06 探维科技(北京)有限公司 激光雷达及获取特征信息的方法
WO2024026742A1 (zh) * 2022-08-02 2024-02-08 探维科技(北京)有限公司 激光雷达及获取特征信息的方法
CN116106932A (zh) * 2023-04-13 2023-05-12 深圳煜炜光学科技有限公司 一种车载激光雷达装置及其控制方法
CN116819490A (zh) * 2023-08-31 2023-09-29 成都远望科技有限责任公司 基于云雷达与激光雷达的云与气溶胶分类的方法
CN116819490B (zh) * 2023-08-31 2023-11-17 成都远望科技有限责任公司 基于云雷达与激光雷达的云与气溶胶分类的方法

Also Published As

Publication number Publication date
EP4166988A4 (en) 2023-08-02
CN115702364A (zh) 2023-02-14
EP4166988A1 (en) 2023-04-19

Similar Documents

Publication Publication Date Title
WO2021258246A1 (zh) Radar system, movable device and radar detection method
USRE48763E1 (en) Multiple-field-of-view scannerless optical rangefinder in high ambient background light
US10094925B1 (en) Multispectral lidar system
US20240045038A1 (en) Noise Adaptive Solid-State LIDAR System
JP7086001B2 (ja) 適応性のある光レイダー受信機
US9234964B2 (en) Laser radar system and method for acquiring 3-D image of target
KR101980697B1 (ko) 객체 정보 획득 장치 및 방법
KR102086026B1 (ko) 라이다 장치, 라이다 신호 처리 장치 및 방법
JP7131180B2 (ja) 測距装置、測距方法、プログラム、移動体
US20210349192A1 (en) Hybrid detectors for various detection range in lidar
US20210333371A1 (en) Lidar system with fog detection and adaptive response
US20230221437A1 (en) Application specific integrated circuits for lidar sensor and multi-type sensor systems
WO2020107250A1 (zh) 一种激光接收电路及测距装置、移动平台
KR102154712B1 (ko) 차량용 라이다 장치
US20230358870A1 (en) Systems and methods for tuning filters for use in lidar systems
EP3982149A1 (en) Multispectral lidar systems and methods
Dai et al. Lidars for vehicles: from the requirements to the technical evaluation
US20230305160A1 (en) Multimodal detection with integrated sensors
US20220111863A1 (en) Multispectral lidar systems and methods
WO2024049500A2 (en) Multimodal detection with integrated sensors
US20230204740A1 (en) Lidar system and a method of calibrating the lidar system
RU2792948C2 (ru) Multispectral lidar systems and methods
US20230324526A1 (en) Method for accurate time-of-flight calculation on the cost-effective tof lidar system
US20220128689A1 (en) Optical systems and methods for controlling thereof
US20230366984A1 (en) Dual emitting co-axial lidar system with zero blind zone

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20942163

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020942163

Country of ref document: EP

Effective date: 20230116

NENP Non-entry into the national phase

Ref country code: DE