WO2022032516A1 - Lidar, detection method therefor, storage medium, and detection system


Info

Publication number
WO2022032516A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection, sub, point cloud, cloud data, window time
Prior art date
Application number
PCT/CN2020/108628
Other languages
English (en)
French (fr)
Inventor
王超
Original Assignee
深圳市速腾聚创科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市速腾聚创科技有限公司
Priority to CN202080004044.5A (CN112470026A)
Priority to PCT/CN2020/108628
Publication of WO2022032516A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements, of receivers alone
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves

Definitions

  • the embodiments of the present application relate to the field of radar technology, and in particular, to a lidar detection method, a storage medium, a detection system, and a lidar.
  • Lidar is a system that emits laser beams to detect the position, speed and other characteristic quantities of targets. It has been widely used in ranging systems, tracking and measurement of low-flying targets, weapon guidance, atmospheric monitoring, mapping, early warning, traffic management and other fields.
  • Lidar includes mechanical lidar, solid-state lidar, and hybrid solid-state lidar.
  • Flash lidar is generally an all-solid-state lidar.
  • the emission system and detection system do not need any mechanical movement and can record the entire detection scene simultaneously to obtain detection target distance information, grayscale imaging information, etc., avoiding the interference caused by movement of the target or the lidar itself during the scanning process; the system load is low, the opto-mechanical life is long, modularization is easy, and the assembly complexity is low.
  • flash lidar is mainly used in near-field blindness supplementation, auxiliary ranging, and near-field state detection in the field of autonomous driving.
  • because flash lidar uses an area-array detection array to receive echo signals, all pixels also accumulate a large number of background ambient photons while receiving the echo signals.
  • its immunity to ambient photons is therefore poor, and effective signal detection is almost impossible in an outdoor environment with a strong sunlight background.
  • the main purpose of the embodiments of the present application is to provide a detection method, storage medium, detection system and lidar, which solve the problem that the lidar in the prior art has poor immunity to ambient photons.
  • an embodiment of the present application provides a detection method for a lidar.
  • the detection array of the lidar is divided into N detection parts, and the detection window time is divided into N sub-window times, where N is an integer greater than 1, and the method includes:
  • within a detection window time, the i-th detection part is turned on at the first sub-window time to receive the echo laser, and the detection parts are turned on according to the preset order in each subsequent sub-window time, obtaining 1 group of original point cloud data;
  • i is a positive integer less than or equal to N;
  • the previous step is repeated, traversing all values of i, to obtain N groups of original point cloud data;
  • the N groups of original point cloud data are spliced to obtain one frame of detection point cloud data.
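The claimed steps reduce to a scheduling loop. The sketch below is a hypothetical illustration, not the patented implementation; `window_schedule` and `full_traversal` are invented names, and the cyclically increasing order is just one of the preset orders described later in the text.

```python
def window_schedule(N: int, i: int) -> list[int]:
    """One detection window time: return the detection part (1-based) opened
    in each of the N sub-window times, starting from part i and continuing
    cyclically (one of the preset orders described in the text)."""
    if not (1 <= i <= N):
        raise ValueError("i must be a positive integer less than or equal to N")
    return [(i - 1 + s) % N + 1 for s in range(N)]

def full_traversal(N: int) -> list[list[int]]:
    """Traverse all values of i: one detection window time per value,
    yielding N groups of original point cloud data (one schedule each)."""
    return [window_schedule(N, i) for i in range(1, N + 1)]
```

Each inner list records which detection part receives the echo laser in each sub-window time; every part opens exactly once per detection window.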
  • an embodiment of the present application provides a computer storage medium, where the computer storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the above method steps.
  • an embodiment of the present application provides a detection system for a lidar, where the detection window time is divided into N sub-window times, and the system includes:
  • the detection array is divided into N detection parts, where N is an integer greater than 1;
  • the control unit, within a detection window time, sends a control command to turn on the i-th detection part to receive the echo laser at the first sub-window time; sends control commands according to a preset time sequence so that the detection parts corresponding to each sub-window time are turned on in turn, obtaining 1 group of original point cloud data; i is a positive integer less than or equal to N; the previous step is repeated, traversing all values of i, to obtain N groups of the original point cloud data;
  • the processing unit splices the original point cloud data to obtain a frame of detection point cloud data.
  • an embodiment of the present application provides a lidar, where the lidar includes a transmitting system, a control system, and the detection system as described above;
  • the emission system is used for emitting outgoing laser light;
  • the detection system is configured to receive echo laser light, and obtain detection point cloud data of objects in the detection area based on the outgoing laser light and the echo laser light;
  • the control system is used for controlling the transmitting system to emit the outgoing laser light and the detection system to receive the echo laser light.
  • the beneficial effects of the embodiments of the present application are: in the embodiments of the present application, by dividing the detection array into N detection parts, within a detection window time, each detection part only works in one of the sub-window times, and does not work in the whole detection window.
  • the window time operation reduces the length of the time window of the signal collected by each detection part, reduces the noise accumulation when the detection part receives the echo laser, improves the signal-to-noise ratio of the system, and improves the immunity to environmental photons.
  • FIG. 1 is a schematic flowchart of a detection method for a lidar provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of time division of detection windows of a detection array in an embodiment of the present application
  • FIG. 3 is a schematic diagram of the composition of pixels of a detection array in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a detection array divided by columns into detection parts in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a detection array divided by columns into detection parts in another embodiment of the present application.
  • FIG. 6 is a schematic diagram of a detection array divided by columns into detection parts in another embodiment of the present application.
  • FIG. 7 is a schematic diagram of point cloud data splicing in the embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a detection method for a lidar provided by another embodiment of the present application.
  • FIG. 9 is a schematic flowchart of removing part of point cloud data in another embodiment of the present application.
  • FIG. 10 is a schematic block diagram of a detection system of a lidar provided by an embodiment of the present application.
  • FIG. 11 is a schematic block diagram of a lidar provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an optical path of a laser radar provided by an embodiment of the present application.
  • Lidar 100: transmitting system 1, detection system 2, control and signal processing system 3, receiving optical system 4;
  • Lidar detection system 700: detection array 701, control unit 702, and processing unit 703.
  • the terms "installed", "connected", "coupled", "fixed" and the like should be understood in a broad sense: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediate medium; it may also be the internal connection of two elements or the interaction relationship between two elements.
  • a first feature being "on" or "under" a second feature may mean that the two features are in direct contact, or in indirect contact through an intermediary.
  • the first feature being "above", "over" or "on top of" the second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature.
  • the first feature being "below", "under" or "beneath" the second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
  • the basic working principle of flash lidar is: the emission system illuminates the entire detection field of view with the outgoing laser at one time; the detection system continuously receives the echo laser from the detection field of view, recovers the photon flight time through a solution method, and finally obtains the distance information of the target.
  • the main advantages of flash lidar are: the emission system and detection system do not need any mechanical movement and can record the entire detection field of view simultaneously to obtain the distance information and grayscale imaging information of the target, avoiding the interference caused by movement of the target or the lidar itself during the scanning process; the system load is low, the opto-mechanical life is long, modularization is convenient, and the assembly complexity is low.
  • Flash lidar can use the time-of-flight method for distance measurement.
  • the light source in the emission system emits periodic short pulse signals as the outgoing laser.
  • the outgoing laser is emitted to the detection field of view and then reflected by objects in the detection field of view.
  • the reflected echo laser is also a short pulse signal.
  • the detection system receives the echo laser, and directly obtains the flight time of the photon through the time difference between the emitted laser and the received echo laser, and then obtains the distance information.
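The direct time-of-flight calculation described here reduces to one formula. A minimal sketch (the function name is illustrative, not from the patent):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(t_emit_s: float, t_receive_s: float) -> float:
    """Direct time-of-flight ranging: the photon travels to the target and
    back, so target distance = c * (t_receive - t_emit) / 2."""
    dt = t_receive_s - t_emit_s
    if dt < 0:
        raise ValueError("echo cannot arrive before emission")
    return C * dt / 2.0
```

For example, an echo arriving 1 microsecond after emission corresponds to a target roughly 150 m away.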
  • the detection system of flash lidar adopts area array reception, information such as the shape and outline of objects in the detection field of view can be obtained at the same time.
  • MEMS: Micro-Electro-Mechanical System.
  • because flash lidar directly uses an area-array detection array to receive the echo laser, all pixels also accumulate a large number of background ambient photons while receiving the echo laser, especially outdoors.
  • Flash lidar is generally suitable for the small target model of the radar equation.
  • the intensity of the echo laser decreases approximately with the square of the distance, and the integration time of the system is related to the designed farthest detection distance.
  • the background noise of sunlight in the environment is generally additive noise, and the intensity of ambient photons is proportional to the accumulation time and the receiving area of the detection array of the detection system.
  • a detection method for lidar is proposed, which can improve the immunity of the system to ambient photons.
  • the method can be realized by relying on a computer program, and can run on a lidar detection system or lidar based on the von Neumann system.
  • the computer program can be integrated into an application or run as a stand-alone utility application.
  • the detection method of the lidar provided by this embodiment includes:
  • Step 101: In a detection window time, the i-th detection part is turned on in the first sub-window time to receive the echo laser, and the detection parts are turned on in a preset order in each subsequent sub-window time to obtain one group of original point cloud data; i is a positive integer less than or equal to N.
  • the detection window time is the integration time of the lidar detection system and is related to the receiving distance of the detection system. It is easy to understand that when the emission system emits laser light, the detection system opens and enters the detection window time; the farther a photon flies, the longer it takes to return to the detection system; correspondingly, the farther the receiving distance of the detection system, the longer the detection window time. The detection window time is therefore determined by the farthest receiving distance of the detection system. Photons returning from beyond the farthest receiving distance have a flight time exceeding the detection window time; the detection system is already turned off when they arrive, so they cannot be received.
  • the emission time interval at which the emission system emits the outgoing laser must be greater than the detection window time. It can be seen from the foregoing that the first detection window time is entered after the first outgoing laser is emitted; if the second outgoing laser is emitted before the first detection window time has ended, the detection system cannot distinguish whether a received echo laser comes from the first or the second outgoing laser, causing crosstalk and preventing accurate detection.
  • the emission time interval of the outgoing laser of the transmitting system is greater than the detection window time, so that the echo laser received at each detection window time comes from a uniquely determined outgoing laser, so as to avoid ranging errors.
  • the detection window time of the detection array is divided into N sub-window times, and the detection array receives echo laser light in each sub-window time.
  • Each sub-window time can receive the echo laser within the corresponding distance range.
  • the detection window time is divided into sub-window times 1, 2, ..., N; the first sub-window time receives the echo laser within the distance range 0-L1, the second sub-window time receives the echo laser within the distance range L1-L2, and so on, until the N-th sub-window time receives the echo laser within the distance range LN-1-LN, where LN is less than or equal to the range of the detection system.
  • the detection window time can be evenly divided into N sub-window times, so that the received distance range corresponding to each sub-window time is also the same.
  • evenly dividing the detection window time can simplify the design and control method of the detection system, making the system design more regular and easy to implement; at the same time, because the integration time is the same, the ambient photon noise introduced in each sub-window time is also the same or similar, which avoids the signal-to-noise ratio of each sub-window time being affected by different integration times and benefits the accuracy of the lidar.
  • the detection window time can also be unevenly divided into N sub-window times. For distance ranges where ambient photon noise has relatively little influence on the signal-to-noise ratio, the corresponding sub-window time can be appropriately extended to reduce the number of divisions of the detection window time, which simplifies the detection process.
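Under the even-division scheme, the mapping between sub-window times and distance ranges follows directly from round-trip time of flight. A sketch with assumed names (not from the patent):

```python
C = 299_792_458.0  # speed of light, m/s

def detection_window_time_s(max_range_m: float) -> float:
    """Detection window time fixed by the farthest receiving distance:
    round-trip flight time t = 2 * d / c."""
    return 2.0 * max_range_m / C

def even_subwindow_ranges(max_range_m: float, N: int) -> list[tuple[float, float]]:
    """Evenly divide the detection window time into N sub-window times; the
    k-th sub-window (1-based) then observes distances L_{k-1}..L_k, with
    L_k = k * max_range / N, because equal time slices map to equal
    distance slices."""
    step = max_range_m / N
    return [(k * step, (k + 1) * step) for k in range(N)]
```

For a 150 m range split three ways, the sub-windows cover 0-50 m, 50-100 m and 100-150 m respectively.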
  • the detection array is divided into N detection parts, such as the first detection part, the second detection part, ..., the Nth detection part. It can be seen from the foregoing that the detection system of flash lidar adopts area array reception, and the detection array of the detection system can be regularly arranged by m ⁇ k pixels. Each pixel of the detection array corresponds to a sub-field of view, and m ⁇ k sub-fields of view are connected together to form the entire detection field of view of the detection array.
  • the detection array is composed of 3×3 pixels, as shown in Figure 3; from left to right and top to bottom these are pixel 1, pixel 2, ..., pixel 9. The detection field of view of the detection array covers -6° to 6° in the horizontal direction and -10° to 2° in the vertical direction; then the sub-field of view of pixel 1 covers -6° to -2° in the horizontal direction and -2° to 2° in the vertical direction.
  • the sub-field of view of pixel 2 covers -2° to 2° in the horizontal direction and -2° to 2° in the vertical direction.
  • the sub-field of view of pixel 3 covers 2° to 6° in the horizontal direction and -2° to 2° in the vertical direction.
  • each pixel of the detection array corresponds to a coordinate range in the coordinate system perpendicular to the distance direction.
  • the detection array can be evenly divided into N detection parts, and each detection part contains the same number of pixels, so that the field of view covered by the sub-field of view corresponding to each detection part is the same, which is convenient for subsequent stitching of original point cloud data; At the same time, the number of pixels in each detection part is the same, and the accumulated sunlight interference is also the same, which is convenient for subsequent denoising processing.
  • the pixels contained in each detection part can be continuous or discrete.
  • the pixels contained in a detection part may be continuous: a plurality of pixels in a certain area of the detection array constitute a detection part, and these pixels are connected and work as a whole; the sub-fields of view of adjacent pixels are connected, so the sub-fields of view corresponding to the pixels of the detection part are also connected into a whole, which becomes the field of view corresponding to the detection part.
  • the pixels of the detection part can continuously detect a certain area in the overall field of view, which is convenient to obtain more information from the continuous original point cloud data, and can also reduce the complexity of the original point cloud data splicing.
  • the detection array can be divided into a plurality of detection parts by rows or columns.
  • a 3×3 detection array (as shown in Figure 4) can be divided into 3 detection parts by column: the first detection part contains the first column of pixels (i.e. pixel 1, pixel 4 and pixel 7), the second detection part contains the second column of pixels (i.e. pixel 2, pixel 5 and pixel 8), and the third detection part contains the third column of pixels (i.e. pixel 3, pixel 6 and pixel 9).
  • each detection section may also include multiple columns of pixels.
  • a 6 ⁇ 6 detection array is divided into 3 detection parts by column, and each detection part contains two columns of pixels (as shown in Figure 5).
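The column-wise division of Figures 4 and 5 can be sketched as index bookkeeping. Pixel numbering follows Figure 3 (left to right, top to bottom); the function name is an assumption for illustration:

```python
def divide_by_columns(rows: int, cols: int, N: int) -> list[list[int]]:
    """Divide a rows x cols detection array (pixels numbered 1..rows*cols,
    left to right, top to bottom) into N detection parts by column.
    Requires cols to be a multiple of N so every part has the same
    number of pixels."""
    if cols % N != 0:
        raise ValueError("even column division requires cols % N == 0")
    per = cols // N  # columns per detection part
    return [
        sorted(r * cols + c + 1
               for r in range(rows)
               for c in range(p * per, (p + 1) * per))
        for p in range(N)
    ]
```

For the 3×3 array this reproduces the three single-column parts of Figure 4; for the 6×6 array it yields three parts of two columns each, as in Figure 5.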
  • the division method of dividing the detection array into N detection parts is chosen according to the actual application requirements.
  • the pixels included in the detection part may also be discrete; a plurality of pixels are discretely distributed in the detection array, and the corresponding field of view corresponding to the detection part is also constituted by a plurality of subfields scattered in the overall field of view. Using the detection part with discrete distribution of pixels for detection can quickly detect different areas in the overall field of view at the same time, and obtain rough information of each area, which is convenient for the subsequent adjustment of detection parameters of the lidar.
  • the detection array can also be divided into N detection parts unevenly, with the number of pixels contained in each detection part not all the same; as mentioned above, the more pixels a detection part contains, the more adverse the effect on the signal-to-noise ratio.
  • the detection array can divide the detection part according to the surrounding environment to meet the detection requirements of different areas in the overall field of view, and has better adaptability and flexibility.
  • the detection array is divided into N detection parts, and the detection window time is divided into N sub-window times. Within a detection window time, one detection part is correspondingly turned on in each sub-window time. After a detection window time, the N detection parts of the detection array are respectively turned on to receive the echo laser once.
  • the i-th detection unit is turned on to receive the echo laser at the first sub-window time, and the detection units are turned on in a preset order in each subsequent consecutive sub-window time.
  • i is a positive integer less than or equal to N.
  • the preset sequence for opening the N detection parts can be selected from the following options:
  • the first sub-window time opens the first detection part, the second sub-window time opens the second detection part, and so on, until the N-th sub-window time opens the N-th detection part.
  • the detection array can be divided into N column detection parts by column; as shown in FIG. 6, the sequence from left to right is the first column detection part, the second column detection part, ..., the N-th column detection part.
  • the order in which the N detection parts of the detection array are turned on is: the first sub-window time is to turn on the first column of detection parts, the second sub-window time is to turn on the second column of detection parts... and so on, until the Nth sub-window Time to turn on the Nth column detection part.
  • the detection array includes 3×3 pixels, which are divided into 3 column detection parts by column; the detection window time is also divided into 3 sub-window times. The first sub-window time opens the first column detection part, i.e. turns on pixel 1, pixel 4 and pixel 7; the second sub-window time opens the second column detection part, i.e. turns on pixel 2, pixel 5 and pixel 8; the third sub-window time opens the third column detection part, i.e. turns on pixel 3, pixel 6 and pixel 9.
  • for detection parts divided by row, the preset opening sequence is similar to that of the column detection parts, which will not be repeated here.
  • the corresponding detection unit may also be turned on at each sub-window time in a manner of decreasing the sequence number of the detection unit.
  • the first sub-window time turns on the Nth detection part
  • the second sub-window time turns on the N-1th detection part... and so on, until the Nth sub-window time turns on the first detection part.
  • the first sub-window time turns on the i-th (1<i<N) detection part, the second sub-window time turns on the (i+1)-th detection part, ..., the (N-i+1)-th sub-window time turns on the N-th detection part, the (N-i+2)-th sub-window time turns on the 1st detection part, the (N-i+3)-th sub-window time turns on the 2nd detection part, and so on, until the N-th sub-window time turns on the (i-1)-th detection part.
  • the column detection parts can also be turned on in this preset order.
  • for detection parts divided by row, the preset opening sequence is similar to that of the column detection parts, which will not be repeated here.
  • the corresponding detection unit can also be turned on at each sub-window time in a manner of decreasing the sequence number of the detection unit.
  • for example, for an array divided into 5 column detection parts, the first sub-window time turns on the fourth column detection part, the second sub-window time turns on the third column detection part, the third sub-window time turns on the second column detection part, the fourth sub-window time turns on the first column detection part, and the fifth sub-window time turns on the fifth column detection part.
  • one detection unit is correspondingly turned on in each sub-window time.
  • when the N detection parts are turned on, they need not be opened sequentially in order of their sequence numbers; the detection parts may be opened in any order, as long as, within one detection window time, each detection part is turned on once and only one detection part is turned on in each sub-window time.
  • Step 102: Traverse all the values of i, and for each value execute: the first sub-window time opens the i-th detection part to receive the echo laser, and the detection parts are opened in the preset order in each subsequent sub-window time, obtaining one group of original point cloud data.
  • N groups of original point cloud data are thus obtained.
  • an example detection part only receives the echo laser within its corresponding sub-window time; that is to say, the example detection part only detects a certain distance range within its sub-field of view.
  • the distance range is the distance range corresponding to the sub-window time in which the example detection part is opened.
  • the example detection part only detects one distance range within one detection window time; only after detecting the remaining distance ranges within subsequent detection window times can all the original point cloud data in its sub-field of view be acquired. It can be seen from the foregoing that one detection window time is divided into N sub-window times, and the detection distance of each sub-window time corresponds to a distance range within the range of the detection system.
  • only after the example detection part has been turned on in all the sub-window times can the full extent of its sub-field of view be completely detected.
  • within each detection window time, the example detection part is only turned on once, in its corresponding sub-window time; it therefore takes at least N detection window times for the example detection part to obtain all the original point cloud data in its sub-field of view.
  • the example detection part needs to be turned on once in different sub-window times during consecutive N detection window times, without repetition.
  • an example detection part is any detection part in the detection array. When each detection part is turned on once in a different sub-window time within the same N consecutive detection window times, the complete detection point cloud data of the entire field of view can be obtained quickly.
  • in order to simplify the design of the detection system and the detection method, all values of i are traversed and step 101 is executed; for each detection window time, the detection part turned on at the first sub-window time is different. In step 101, each detection part is turned on at its corresponding sub-window time according to the preset sequence, and after N detection window times, each detection part has been opened in all sub-window times.
  • the detection array includes a first column detection part, a second column detection part, ..., an N-th column detection part.
  • in the first detection window time, the first sub-window time opens the first column detection part, the second sub-window time opens the second column detection part, and so on, until the N-th sub-window time opens the N-th column detection part.
  • in the second detection window time, the first sub-window time opens the second column detection part, the second sub-window time opens the third column detection part, and so on; the (N-1)-th sub-window time opens the N-th column detection part, and the N-th sub-window time opens the first column detection part.
  • continuing in this way, in the N-th detection window time, the first sub-window time opens the N-th column detection part, the second sub-window time opens the first column detection part, the third sub-window time opens the second column detection part, and so on, until the N-th sub-window time opens the (N-1)-th column detection part.
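The shift-by-one traversal just described forms a complete schedule: over N detection window times, every column detection part is opened exactly once in every sub-window position. A sketch verifying that property (names are illustrative, not from the patent):

```python
def traversal_schedule(N: int) -> list[list[int]]:
    """schedule[w][s] = column detection part (1-based) opened at sub-window
    s of detection window w; each detection window shifts the mapping of
    the previous one by one, as described above."""
    return [[(w + s) % N + 1 for s in range(N)] for w in range(N)]

def is_complete(schedule: list[list[int]]) -> bool:
    """True if each window opens every part once, and across the N windows
    every part appears exactly once in each sub-window position."""
    N = len(schedule)
    full = list(range(1, N + 1))
    rows_ok = all(sorted(row) == full for row in schedule)
    cols_ok = all(sorted(schedule[w][s] for w in range(N)) == full
                  for s in range(N))
    return rows_ok and cols_ok
```

The schedule is a Latin square, which is exactly the condition for every detection part to cover every distance range after N detection windows.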
  • the collection of N groups of original point cloud data is completed.
  • from one detection window time to the next, the sequence number of the column detection part opened at the sub-window time of the same sequence number is delayed by one. i can also take values from large to small.
  • in that case, from one detection window time to the next, the sequence number of the column detection part opened at the same sub-window time moves forward by one; the specific opening method is similar to taking i from small to large and will not be repeated here.
  • for detection parts divided by row, the preset opening sequence is similar to that of the column detection parts, which will not be repeated here.
  • Step 103 Splicing the original point cloud data to obtain a frame of detection point cloud data.
  • within one detection window time, each detection part only receives the echo laser in one sub-window time, i.e. it only detects one distance range. For a given detection part, after N consecutive detection window times, it has been opened in the sub-window times corresponding to all distance ranges within the range; according to the distance information in the original point cloud data, the original point cloud data of this detection part in different distance ranges are spliced to obtain the original point cloud data of the sub-field of view corresponding to this detection part.
  • the sub-field of view of each detection part corresponds to an area in the overall field of view. According to the orientation information in the original point cloud data, the original point cloud data of the N sub-fields of view corresponding to the N detection parts are spliced to obtain the complete detection point cloud data of the overall field of view.
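Splicing can be sketched as a two-level merge: per detection part across distance ranges (ordered by distance), then across sub-fields of view (keyed by pixel, which carries the orientation). The data layout and names below are assumptions for illustration, not the patented format:

```python
def splice_frame(groups: list[dict[int, list[tuple[float, float]]]]):
    """groups: N groups of original point cloud data; each group maps a
    pixel id to the (distance_m, intensity) points it collected in one
    detection window time. Returns one frame of detection point cloud
    data: pixel id -> all its points, ordered by distance."""
    frame: dict[int, list[tuple[float, float]]] = {}
    for group in groups:
        for pixel_id, points in group.items():
            frame.setdefault(pixel_id, []).extend(points)
    for points in frame.values():
        points.sort(key=lambda p: p[0])  # merge the distance ranges in order
    return frame
```

Keying by pixel id implicitly performs the orientation-based splice, since each pixel owns a fixed sub-field of view.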
  • by dividing the detection array into N detection parts, within a detection window time each detection part only works in one sub-window time rather than over the whole detection window time, which reduces the time-window length of the signal collected by each detection part, reduces the noise accumulated when the echo laser is received, improves the signal-to-noise ratio of the system, and improves the immunity to ambient photons.
  • the emission system adopts floodlight illumination, and the detection system does not continuously receive the echo laser over a long window time, which reduces the accumulation of ambient photons.
  • since each sub-window time only drives a specific detection part in the detection array to receive the echo laser, the receiving area for ambient photons is reduced. The interference of ambient sunlight with the echo laser is therefore reduced, and the signal-to-noise ratio of the system is improved. Under the same integration time condition, the accumulated noise can be reduced by a factor of N², corresponding to a signal-to-noise ratio improvement of 20·logN in decibels.
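A sketch of this bookkeeping, using the noise model stated earlier (ambient noise proportional to accumulation time times receiving area). The decibel figure assumes the echo signal itself is unchanged and reads the quoted gain as the decibel form 20·log₁₀N, which is equivalent to an N² reduction in noise power; both function names are invented for illustration:

```python
import math

def noise_reduction_factor(N: int) -> float:
    """Ambient noise ~ integration time x receiving area; opening 1/N of
    the array for 1/N of the detection window reduces both factors by N,
    so the accumulated noise drops by N squared."""
    return float(N * N)

def snr_gain_db(N: int) -> float:
    """SNR gain in decibels if the signal is unchanged and noise power
    falls by N**2: 10 * log10(N**2) = 20 * log10(N)."""
    return 10.0 * math.log10(N ** 2)
```

For N = 10, noise accumulation drops 100-fold and the SNR gain is 20 dB.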
  • FIG. 8 is a schematic flowchart of another embodiment of a detection method for a lidar proposed in the present application.
  • Step 201: In one detection window time, turn on the i-th detection part in the first sub-window time to receive the echo laser, and turn on the detection parts in a preset order in each successive sub-window time to obtain one group of original point cloud data; i is a positive integer less than or equal to N.
  • For details, refer to step 101; this is not repeated here.
  • Step 202: Traverse all values of i and execute the step of turning on the i-th detection part in the first sub-window time to receive the echo laser and turning on the detection parts in the preset order in each successive sub-window time to obtain one group of original point cloud data, so as to obtain N groups of original point cloud data.
  • For details, refer to step 102; this is not repeated here.
  • Step 203: Determine the distance information of the original point cloud data according to the sub-window time of the original point cloud data.
  • As described above, within one detection window time each detection part turns on to receive the echo laser only during one sub-window time; that is, it detects only one distance range.
  • From the sub-window time in which each piece of original point cloud data was acquired, the detection distance range corresponding to that data can be obtained; this distance range is the distance information of the original point cloud data.
  • Taking the foregoing embodiment as an example, the detection array includes 3×3 pixels, divided by column into 3 column detection parts. Within the first detection window time, the first sub-window time opens the 1st column detection part to obtain the original point cloud data D11, the second sub-window time opens the 2nd column detection part to obtain D12, and the third sub-window time opens the 3rd column detection part to obtain D13. After the first detection window time, all three column detection parts have been opened once, giving the first group of original point cloud data.
  • In the first group, the distance range corresponding to D11 is 0-L1, that corresponding to D12 is L1-L2, and that corresponding to D13 is L2-L3.
  • Similarly, in the second detection window time, the first sub-window time opens the 2nd column detection part to obtain the original point cloud data D22, with corresponding distance range 0-L1; the second sub-window time opens the 3rd column detection part to obtain D23, with corresponding distance range L1-L2; the third sub-window time opens the 1st column detection part to obtain D21, with corresponding distance range L2-L3. After the second detection window time, the second group of original point cloud data is obtained.
  • In the third detection window time, the first sub-window time opens the 3rd column detection part to obtain the original point cloud data D33, with corresponding distance range 0-L1;
  • the second sub-window time opens the 1st column detection part to obtain D31, with corresponding distance range L1-L2;
  • the third sub-window time opens the 2nd column detection part to obtain D32, with corresponding distance range L2-L3. After the third detection window time, the third group of original point cloud data is obtained.
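The three-window example above follows a simple round-robin: the part opened in a given sub-window advances by one each detection window, while the distance range is fixed by the sub-window index. A minimal sketch of that schedule (function and variable names are illustrative assumptions, not from the application):

```python
# Round-robin schedule for a detection array split into N = 3 column parts.
N = 3

def part_opened(window: int, sub_window: int) -> int:
    """1-indexed column part opened in sub_window of detection window."""
    return (window - 1 + sub_window - 1) % N + 1

# Reproduce the D_ws labels from the text: for each window w, map the
# opened part to its sub-window index, which is also its range index
# (sub-window s covers the distance range L_{s-1}..L_s).
schedule = {w: {part_opened(w, s): s for s in range(1, N + 1)}
            for w in range(1, N + 1)}
print(schedule[2])  # window 2: part 2 in sub-window 1, part 3 in 2, part 1 in 3
```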
  • Step 204: Determine the orientation information of the original point cloud data according to the position of the detection part in the detection array.
  • As described above, each pixel of the detection array corresponds to one sub-field of view.
  • Projected into a coordinate system perpendicular to the distance direction, each sub-field of view corresponds to one coordinate range, and the sub-fields of view joined together form the entire detection field of view of the detection array.
  • From the position, within the detection array, of the detection part that acquired a piece of original point cloud data, the position of that part's sub-field of view in the entire detection field of view can be known; this is the azimuth information.
  • Taking the foregoing embodiment as an example, the detection array includes 3×3 pixels, divided by column into 3 column detection parts, and the detection field of view of the detection array covers -6° to 6° horizontally and -10° to 2° vertically.
  • The 1st column detection part is located at the leftmost side of the detection array, and its sub-field of view is located at the same side of the entire detection field of view, covering -6° to -2° horizontally and -10° to 2° vertically; this is the azimuth information of the original point cloud data obtained by the 1st column detection part.
  • Similarly, the azimuth information of the original point cloud data detected by the 2nd column detection part covers -2° to 2° horizontally and -10° to 2° vertically, and that detected by the 3rd column detection part covers 2° to 6° horizontally and -10° to 2° vertically.
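When the field of view is split evenly between column parts, the horizontal sub-field of view follows directly from the column index. A hedged sketch using the angles of this example (the function name and signature are assumptions for illustration):

```python
# Horizontal sub-field of view of a column detection part, assuming an
# even split of the full horizontal FOV (-6..6 deg in the example above).
def column_azimuth(col: int, n_cols: int = 3,
                   h_fov: tuple = (-6.0, 6.0)) -> tuple:
    width = (h_fov[1] - h_fov[0]) / n_cols   # angular width per column
    left = h_fov[0] + (col - 1) * width
    return (left, left + width)

print(column_azimuth(1))  # -> (-6.0, -2.0), matching the 1st column part
```

The vertical extent is the full -10° to 2° for every column part, so only the horizontal range needs computing.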
  • Step 206: Splice and merge the distance information and orientation information corresponding to the original point cloud data to obtain one frame of detection point cloud data.
  • After steps 203 and 204, the distance information and azimuth information of each piece of original point cloud data can be obtained, i.e. which region's sub-field of view in the entire detection field of view the data correspond to, and which distance range within the measuring range they fall in.
  • After N×N sub-window times, splicing the N×N pieces of original point cloud data by their distance information and azimuth information yields point cloud data covering the entire detection field of view and the full measuring range, i.e. one frame of detection point cloud data.
  • In some embodiments, some original point cloud data may be eliminated in advance, before the point cloud data are spliced and merged. Therefore, as shown in FIG. 9, the following steps may also be included before step 206:
  • Step 2051: Record all original point cloud data with the same orientation information as a point cloud group.
  • As described above, original point cloud data with the same orientation information are the N pieces of original point cloud data obtained by the same detection part over N detection window times, each corresponding to a different distance range within the measuring range.
  • As an example, the detection array includes 3×3 pixels, divided by row into 3 row detection parts: the first row detection part corresponds to pixel 1, pixel 2 and pixel 3, the second row detection part corresponds to pixel 4, pixel 5 and pixel 6, and the third row detection part corresponds to pixel 7, pixel 8 and pixel 9.
  • The sub-field of view of each row detection part is fixed, so the original point cloud data it obtains share the same orientation information; all original point cloud data with the same orientation information are marked as one point cloud group.
  • Step 2052: Acquire the ranging range according to the orientation information.
  • The ranging range can be preset according to actual application requirements. For example, in some regions of the entire detection field of view the required ranging distance is short, such as sub-fields of view facing the sky or the ground, while in other regions the required ranging distance is long, such as sub-fields of view facing forward.
  • In short, the ranging range of each azimuth (corresponding to each detection part) can be preset according to detection requirements.
  • For example, the ranging range of the first row detection part, whose sub-field of view faces upward, can be set to 0-La, and that of the second row detection part, whose sub-field of view faces forward and needs farther, more precise obstacle information, to 0-Lb with Lb > La. The sub-field of view of the third row detection part faces downward: the outgoing laser in that sub-field of view quickly hits the ground and is reflected back as the echo laser, so its ranging range can be set to 0-Lc with Lb > Lc.
  • La and Lc can be equal or unequal, and can be set according to detection requirements.
  • Step 2053: Eliminate part of the original point cloud data in each point cloud group; the distance information of the eliminated original point cloud data is not contained in the ranging range.
  • The original point cloud data beyond the ranging range corresponding to a point cloud group are removed from that group. Eliminating unneeded data in this way reduces the time for subsequent data splicing and merging and improves splicing efficiency.
  • In the original point cloud data detected by the first row detection part, data outside the 0-La ranging range are excluded; in those detected by the second row detection part, data outside the 0-Lb ranging range are excluded.
  • The second row detection part is expected to cover a large range, so Lb can be the measuring range of the detection system.
  • In the original point cloud data detected by the third row detection part, data outside the 0-Lc ranging range are excluded.
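The pre-elimination of steps 2051-2053 amounts to a per-group range filter. The sketch below is illustrative only; the dictionary keys and the 20 m / 150 m / 15 m limits mirror the La / Lb / Lc example values given for the three rows in the description, and the data structure is an assumption:

```python
# Per-azimuth ranging ranges (meters); values follow the La = 20 m,
# Lb = 150 m, Lc = 15 m example for sky-, forward- and ground-facing rows.
ranging_range = {1: (0.0, 20.0), 2: (0.0, 150.0), 3: (0.0, 15.0)}

def filter_group(row: int, points: list) -> list:
    """Keep only points of a point cloud group whose distance lies
    inside the ranging range preset for that row's azimuth."""
    lo, hi = ranging_range[row]
    return [p for p in points if lo <= p["dist"] <= hi]

pts = [{"dist": 5.0}, {"dist": 40.0}, {"dist": 12.0}]
print(filter_group(3, pts))  # the 40.0 m point is eliminated
```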
  • By dividing the detection array into N detection parts, within one detection window time each detection part works only during one sub-window time rather than throughout the whole detection window time. This shortens the window time over which each detection part collects signal, reduces the noise accumulated when the echo laser is received, improves the signal-to-noise ratio of the system, and improves the immunity to ambient photons.
  • The original point cloud data outside the ranging range required for the field-of-view region corresponding to a point cloud group are removed from the group before the original point cloud data are spliced, reducing the computational load of subsequent splicing and merging and improving the efficiency of data processing.
  • An embodiment of the present application further provides a computer storage medium. The computer storage medium can store a plurality of instructions, and the instructions are suitable for being loaded by a processor to execute the lidar detection method of the embodiments shown in FIG. 1 to FIG. 9; for the specific execution process, refer to the specific descriptions of the embodiments shown in FIG. 1 to FIG. 9, which are not repeated here.
  • the computer storage medium is a non-volatile computer-readable storage medium.
  • The present application also provides a computer program product storing at least one instruction, the at least one instruction being loaded and executed by the processor as specifically described for the embodiments shown in FIG. 1 to FIG. 9; this is not repeated here.
  • FIG. 10 shows a schematic diagram of a detection system of a lidar provided by an exemplary embodiment of the present application.
  • the detection system can be implemented, in software, hardware or a combination of the two, as all or part of a device.
  • the detection system includes a detection array, a control unit and a processing unit.
  • the detection array 701 is divided into N detection parts, where N is an integer greater than 1;
  • the control unit 702, in one detection window time, sends a control command to turn on the i-th detection part in the first sub-window time to receive the echo laser; sends control commands according to a preset time sequence so that the detection part corresponding to each sub-window time is opened in turn, obtaining one group of original point cloud data; i is a positive integer less than or equal to N; the previous step is repeated, traversing all values of i, to obtain N groups of original point cloud data;
  • the processing unit 703 splices the original point cloud data to obtain a frame of detection point cloud data.
  • the control unit 702 is further configured to: when i = 1, turn on the 1st detection part in the first sub-window time;
  • the second sub-window time turns on the 2nd detection part;
  • and so on, until the Nth detection part is turned on in the Nth sub-window time.
  • the control unit 702 is further configured to: when i > 1, turn on the i-th detection part in the first sub-window time;
  • the second sub-window time turns on the (i+1)-th detection part;
  • and so on, until the (N-i+1)-th sub-window time turns on the Nth detection part;
  • the (N-i+2)-th sub-window time turns on the 1st detection part;
  • and so on, until the (i-1)-th detection part is turned on in the Nth sub-window time.
  • the processing unit 703 is further configured to:
  • determine the distance information of the original point cloud data according to their sub-window times; determine the azimuth information of the original point cloud data according to the positions of the detection parts in the detection array; and splice and merge the distance information and azimuth information corresponding to the original point cloud data to obtain one frame of detection point cloud data.
  • the processing unit 703 is further configured to:
  • record all original point cloud data with the same azimuth information as a point cloud group; acquire the ranging range according to the azimuth information; and eliminate part of the original point cloud data in the point cloud group, the distance information of the eliminated original point cloud data not being contained in the ranging range.
  • the detection array is divided into N column detection parts, which are, from one end of the detection array to the other along the direction perpendicular to the columns, the 1st column detection part, the 2nd column detection part, ..., the Nth column detection part;
  • or, the detection array is divided into N row detection parts, which are, from one end of the detection array to the other along the direction perpendicular to the rows, the 1st row detection part, the 2nd row detection part, ..., the Nth row detection part.
  • the column detection unit includes p columns of pixels, or the row detection unit includes p rows of pixels, where p is a positive integer.
  • the emission time interval of the outgoing laser light of the lidar is greater than the detection window time.
  • the detection window time is evenly divided into N sub-window times.
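An even split of the detection window maps directly to even distance bins, since an echo arriving after round-trip time t comes from distance c·t/2. A small illustrative sketch (the window length and N below are arbitrary example values, not from the application):

```python
C = 299_792_458.0  # speed of light, m/s

def sub_window_range(k: int, n: int, window_s: float) -> tuple:
    """Distance range (meters) covered by the k-th of n equal
    sub-windows of a detection window lasting window_s seconds."""
    max_range = C * window_s / 2.0          # farthest receivable distance
    return ((k - 1) / n * max_range, k / n * max_range)

# 1 us window split into 4 sub-windows: each covers ~37.5 m of range
lo, hi = sub_window_range(1, 4, 1e-6)
print(round(hi - lo, 3))  # -> 37.474
```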
  • By dividing the detection array into N detection parts, within one detection window time each detection part works only during one sub-window time rather than throughout the whole detection window time. This shortens the time window over which each detection part collects signal, reduces the noise accumulated when the detection part receives the echo laser, improves the signal-to-noise ratio of the system, and improves the immunity to ambient photons.
  • FIG. 11 and FIG. 12 show schematic diagrams of a lidar 100 provided by an exemplary embodiment of the present application.
  • the lidar includes an emission system 1, a detection system 2 and a control system 3.
  • the emission system 1 is used to emit the outgoing laser; the detection system 2 is used to receive the echo laser.
  • the echo laser is the laser returned after the outgoing laser is reflected by objects in the detection area; based on the outgoing laser and the echo laser, detection point cloud data of objects in the detection area are obtained.
  • the control system 3 is used to control the emission system 1 and the detection system 2.
  • the emission system 1 adopts flood illumination, usually using an area-array light source.
  • the emission system 1 includes an emission array composed of light sources. The light sources of the emission array can be of various types, such as one or a combination of a pulsed laser diode (PLD), a laser diode (Laser Diode, LD), a vertical-cavity surface-emitting laser (Vertical Cavity Surface Emitting Laser, VCSEL), an edge-emitting laser (Edge Emitting Laser, EEL), a light-emitting diode (Light Emitting Diode, LED) and other devices.
  • the emission system 1 further includes an emission control unit for controlling the emission array to emit outgoing laser light.
  • the lidar 100 generally also includes an emission optical system (not shown in the figures), which can use one or a combination of a lens, a lens group, an optical fiber, a micro-lens group, a micro-lens array, etc., to emit the outgoing laser from the emission system uniformly to the entire detection field of view at one time.
  • the detection system 2 includes a detection array composed of pixels, which may be a one-dimensional detection array or a two-dimensional detection array.
  • the pixels of the detection array can use one or a combination of integrating elements such as an avalanche photodiode (Avalanche Photodiode, APD), a multi-pixel photon counter (Multi-Pixel Photon Counter, MPPC), a silicon photomultiplier (Silicon Photomultiplier, SiPM), a fast charge-coupled device (Charge Coupled Device, CCD), a complementary metal-oxide-semiconductor (Complementary Metal Oxide Semiconductor, CMOS) sensor and other devices.
  • the control unit of the detection system controls the reception and sampling of the detection array: the integrating elements obtain the signal energy of the echo laser, an algorithm then restores the signal waveform, and the echo time is obtained after sampling.
  • the distance information is obtained according to the echo time, and the azimuth information is obtained according to the sub-field of view corresponding to the pixel in the detection array, and the original point cloud data is obtained.
  • the processing unit of the detection system splices and merges the original point cloud data to obtain a complete frame of detection point cloud data.
  • the detection array is divided into N detection parts, where N is an integer greater than 1. For the distribution of the detection parts, reference may be made to the foregoing method embodiments, which are not repeated here.
  • the lidar 100 generally also includes a receiving optical system 4 .
  • the receiving optical system 4 is used to converge the echo laser and direct the converged echo laser to each pixel in the detection system 2.
  • the receiving optical system 4 can adopt one or a combination of a lens, a lens group, an optical fiber, a micro-lens group, a micro-lens array, and the like.
  • the control system 3 may adopt one or a combination of a field programmable gate array (Field Programmable Gate Array, FPGA), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a system-on-chip (System on Chip, SoC), etc.
  • the control system 3 may include the emission control unit of the emission system 1 and the control unit of the detection system 2.


Abstract

A detection method for a lidar (100), a storage medium, a detection system (2, 700) and a lidar (100). The method includes: dividing the detection array (701) into N detection parts and the detection window time into N sub-window times, N being an integer greater than 1; in one detection window time, turning on the i-th detection part in the first sub-window time to receive the echo laser, and turning on the detection parts in a preset order in each successive sub-window time to obtain one group of original point cloud data, i being a positive integer less than or equal to N (101); traversing all values of i and executing the step of turning on the i-th detection part in the first sub-window time to receive the echo laser and turning on the detection parts in the preset order in each successive sub-window time to obtain one group of original point cloud data, so as to obtain N groups of original point cloud data (102); and splicing the original point cloud data to obtain one frame of detection point cloud data (103). The method improves the environmental immunity to solar background radiation.

Description

Lidar and Detection Method Therefor, Storage Medium and Detection System — Technical Field
The embodiments of the present application relate to the technical field of radar, and in particular to a lidar detection method, a storage medium, a detection system and a lidar.
Background Art
A lidar is a system that emits laser beams to detect characteristic quantities such as the position and velocity of a target, and has been widely used in ranging systems, tracking and measurement of low-flying targets, weapon guidance, atmospheric monitoring, surveying and mapping, early warning, traffic management and other fields.
Lidars include mechanical lidars, solid-state lidars and hybrid solid-state lidars. A flash lidar is generally an all-solid-state lidar: its emission system and detection system require no mechanical motion and can record the entire detection scene simultaneously to acquire target distance information, grayscale imaging information and the like, avoiding the interference caused by movement of the target or of the lidar itself during scanning; the system load is low, the optomechanical lifetime long, modularization easy and assembly complexity low. At present, flash lidars are mainly applied, in the autonomous-driving field, to scenarios such as near-field blind-spot supplementation, auxiliary ranging and near-field state detection.
Because a flash lidar uses an area-array detection array to receive echo signals, all of its pixels also accumulate more background ambient photons while receiving the echo signals. Its ambient-photon immunity is poor, and effective signal detection is almost impossible outdoors under strong solar background light.
Summary of the Invention
In view of the above defects of the prior art, the main object of the embodiments of the present application is to provide a lidar detection method, a storage medium, a detection system and a lidar, solving the problem of poor ambient-photon immunity of lidars in the prior art.
In a first aspect, an embodiment of the present application provides a lidar detection method. The detection array of the lidar is divided into N detection parts and the detection window time is divided into N sub-window times, N being an integer greater than 1. The method includes:
in one detection window time, turning on the i-th detection part in the first sub-window time to receive the echo laser, and turning on the detection parts in a preset order in each successive sub-window time to obtain one group of original point cloud data, i being a positive integer less than or equal to N;
traversing all values of i and executing the step of turning on the i-th detection part in the first sub-window time to receive the echo laser and turning on the detection parts in the preset order in each successive sub-window time to obtain one group of original point cloud data, so as to obtain N groups of the original point cloud data;
splicing the original point cloud data to obtain one frame of detection point cloud data.
In a second aspect, an embodiment of the present application provides a computer storage medium storing a plurality of instructions suitable for being loaded by a processor to execute the above method steps.
In a third aspect, an embodiment of the present application provides a lidar detection system in which the detection window time is divided into N sub-window times. The system includes:
a detection array, divided into N detection parts, N being an integer greater than 1;
a control unit which, in one detection window time, sends a control command to turn on the i-th detection part in the first sub-window time to receive the echo laser; sends control commands according to a preset time sequence so that the detection part corresponding to each sub-window time is opened in turn, obtaining one group of original point cloud data, i being a positive integer less than or equal to N; and repeats the previous step, traversing all values of i, to obtain N groups of the original point cloud data;
a processing unit which splices the original point cloud data to obtain one frame of detection point cloud data.
In a fourth aspect, an embodiment of the present application provides a lidar including an emission system, a control system and the detection system described above;
the emission system is used to emit the outgoing laser;
the detection system is used to receive the echo laser and to obtain, based on the outgoing laser and the echo laser, detection point cloud data of objects in the detection area;
the control system is used to control the emission system to emit the outgoing laser and the detection system to receive the echo laser.
The beneficial effects of the embodiments of the present application are as follows: by dividing the detection array into N detection parts, within one detection window time each detection part works only during one of the sub-window times rather than throughout the whole detection window time, which shortens the time window over which each detection part collects signal, reduces the noise accumulated when the detection part receives the echo laser, improves the signal-to-noise ratio of the system, and improves the immunity to ambient photons.
Brief Description of the Drawings
One or more embodiments are illustrated by the figures in the corresponding drawings; these illustrations do not limit the embodiments. Elements with the same reference numerals in the drawings denote similar elements; unless otherwise stated, the figures are not drawn to scale.
FIG. 1 is a schematic flowchart of a lidar detection method provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of the division of the detection window time of the detection array in an embodiment of the present application;
FIG. 3 is a schematic diagram of the pixel composition of the detection array in an embodiment of the present application;
FIG. 4 is a schematic diagram of the detection array divided into detection parts by column in an embodiment of the present application;
FIG. 5 is a schematic diagram of the detection array divided into detection parts by column in another embodiment of the present application;
FIG. 6 is a schematic diagram of the detection array divided into detection parts by column in yet another embodiment of the present application;
FIG. 7 is a schematic diagram of point cloud data splicing in an embodiment of the present application;
FIG. 8 is a schematic flowchart of a lidar detection method provided by another embodiment of the present application;
FIG. 9 is a schematic flowchart of eliminating part of the point cloud data in another embodiment of the present application;
FIG. 10 is a schematic block diagram of a lidar detection system provided by an embodiment of the present application;
FIG. 11 is a schematic block diagram of a lidar provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of the optical path of a lidar provided by an embodiment of the present application.
The reference numerals used in the detailed description are as follows:
lidar 100, emission system 1, detection system 2, control and signal processing system 3, receiving optical system 4;
lidar detection system 700, detection array 701, control unit 702, processing unit 703.
Detailed Description
Embodiments of the technical solutions of the present application will be described in detail below with reference to the drawings. The following embodiments are only intended to illustrate the technical solutions of the present application more clearly; they serve merely as examples and cannot be used to limit the scope of protection of the present application.
It should be noted that, unless otherwise stated, the technical or scientific terms used in the present application shall have the ordinary meanings understood by those skilled in the art to which the present application belongs.
In the description of the present application, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "perpendicular", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial" and "circumferential", are based on the orientations or positional relationships shown in the drawings, are used only to facilitate and simplify the description of the present application, and do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting the present application.
In addition, the terms "first", "second", etc. are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. In the description of the present application, "multiple" and "several" mean two or more (including two), unless otherwise expressly and specifically defined.
In the present application, unless otherwise expressly specified and defined, terms such as "mounted", "connected", "coupled" and "fixed" should be understood broadly: for example, a connection may be fixed, detachable or integral; mechanical or electrical; direct or indirect through an intermediate medium; it may be internal communication between two elements or an interaction between two elements. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present application according to the specific circumstances.
In the present application, unless otherwise expressly specified and defined, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intermediate medium. Moreover, the first feature being "over", "above" or "on top of" the second feature may mean that the first feature is directly or obliquely above the second feature, or merely that the first feature is at a higher level than the second feature. The first feature being "under", "below" or "beneath" the second feature may mean that the first feature is directly or obliquely below the second feature, or merely that the first feature is at a lower level than the second feature.
The basic working principle of a flash lidar is as follows: the emission system illuminates the entire detection field of view with the outgoing laser at one time; meanwhile the detection system continuously receives the echo laser from within the detection field of view, inverts the photon time of flight through a certain solving method, and finally obtains the distance information of the target. The main advantages of a flash lidar are that the emission system and detection system require no mechanical motion and can simultaneously record the scene in the entire detection field of view to acquire the target's distance information, grayscale imaging information and the like, avoiding the interference caused by movement of the target or of the lidar itself during scanning; the system load is low, the optomechanical lifetime long, modularization easy, and assembly complexity low.
A flash lidar can use the time-of-flight method for ranging: the light source in the emission system emits a periodic short-pulse signal as the outgoing laser; after reaching the detection field of view, the outgoing laser is reflected by objects in the field, and the reflected echo laser is also a short-pulse signal. The detection system receives the echo laser, obtains the photon time of flight directly from the time difference between emitting the outgoing laser and receiving the echo laser, and thereby obtains the distance information.
Because the detection system of a flash lidar uses area-array reception, it can simultaneously obtain information such as the shape and contour of objects within the detection field of view and, compared with mechanical lidars, micro-electro-mechanical system (MEMS) lidars and the like, can acquire information in more dimensions. However, compared with other, scanning lidars, a flash lidar directly uses an area-array detection array to receive the echo laser, so all pixels also accumulate more background ambient photons while receiving it. Especially under strong outdoor solar background irradiation, even if a filter with a narrower passband and a higher stopband attenuation coefficient is used to filter out ambient photons, the lidar's actual ambient-photon immunity remains poor, and effective signal detection is almost impossible outdoors under strong solar background light. The more pixels the detection array has, the larger the field of view of the system, and the farther the detection distance (i.e. the longer the integration time), the more noise photons are accumulated and the worse the signal-to-noise ratio of the system. A flash lidar generally fits the small-target model of the radar equation: the intensity of the echo laser falls approximately quadratically with increasing distance, and the integration time of the system is related to the designed maximum detection distance. The solar background noise in the environment is generally additive noise, and the ambient-photon intensity is proportional both to the accumulation time and to the receiving area of the detection array of the detection system.
In the prior art, achieving longer-range detection requires increasing the integration time or raising the optical power of the emission system's light source; the price is soaring thermal power consumption, reduced stability and reliability, lower safety, higher cost and even worse ambient-photon immunity.
The present application is described in detail below with reference to specific embodiments.
In one embodiment, as shown in FIG. 1, a lidar detection method is proposed that can improve the system's immunity to ambient photons. The method can be implemented by a computer program and can run on a von Neumann-based lidar detection system or lidar. The computer program can be integrated in an application or run as an independent utility application.
Referring to FIG. 1, the lidar detection method provided by this embodiment includes:
Step 101: In one detection window time, turn on the i-th detection part in the first sub-window time to receive the echo laser, and turn on the detection parts in a preset order in each successive sub-window time to obtain one group of original point cloud data; i is a positive integer less than or equal to N.
The detection window time is the integration time of the lidar detection system and is related to the receiving distance of the lidar detection system. It is easy to understand that when the emission system emits the outgoing laser, the detection system opens and enters the detection window time; the farther a photon flies, the longer it takes to return to the detection system; correspondingly, the farther the receiving distance of the detection system, the longer the detection window time. The detection window time is therefore determined by the maximum receiving distance of the detection system. Photons returning from beyond the maximum receiving distance have a flight time exceeding the detection window time; by the time they arrive, the detection system is already closed and cannot receive them. On this basis, the emission time interval at which the emission system emits the outgoing laser must be greater than the detection window time. From the foregoing, the first detection window time begins after the first outgoing laser is emitted; if the second outgoing laser were emitted before the first detection window time ended, the detection system could not tell whether a received echo laser came from the first or the second outgoing laser, and the resulting crosstalk would prevent accurate detection. Making the emission time interval of the outgoing laser greater than the detection window time ensures that the echo laser received in each detection window time comes from a uniquely determined outgoing laser, avoiding ranging errors.
As shown in FIG. 2, this embodiment divides the detection window time of the detection array into N sub-window times, and the detection array receives the echo laser in each sub-window time. Each sub-window time can receive the echo laser from a corresponding distance range. For example, the detection window time is divided from beginning to end into sub-window times 1, 2, ..., N; sub-window time 1 receives the echo laser from the distance range 0-L1, sub-window time 2 receives the echo laser from L1-L2, and so on, with sub-window time N receiving the echo laser from L(N-1)-LN, where LN is less than or equal to the measuring range of the detection system. The detection window time can be divided evenly into N sub-window times, so that the distance range corresponding to each sub-window time is also the same size. Evenly dividing the detection window time into N sub-window times simplifies the design and control method of the detection system, making the system design more regular and easy to implement; at the same time, because the integration times are equal, the ambient-photon noise introduced in each sub-window time is the same or similar, avoiding differences in integration time affecting the signal-to-noise ratio of the individual sub-window times, which benefits the accuracy of the lidar. The detection window time can also be divided unevenly into N sub-window times: for distance ranges where ambient-photon noise has a relatively small effect on the signal-to-noise ratio, the corresponding sub-window time can be appropriately lengthened, reducing the number of divisions of the detection window time and simplifying the detection process.
The detection array is divided into N detection parts, e.g. the 1st detection part, the 2nd detection part, ..., the Nth detection part. As described above, the detection system of a flash lidar uses area-array reception, and the detection array can be formed by a regular arrangement of m×k pixels. Each pixel of the detection array corresponds to one sub-field of view, and the m×k sub-fields of view joined together form the entire detection field of view of the detection array. For example, the detection array consists of 3×3 pixels; as shown in FIG. 3, from left to right and top to bottom these are pixel 1, pixel 2, ..., pixel 9. The detection field of view of the detection array covers -6° to 6° horizontally and -10° to 2° vertically. Then the sub-field of view of pixel 1 covers -6° to -2° horizontally and -2° to 2° vertically, that of pixel 2 covers -2° to 2° horizontally and -2° to 2° vertically, that of pixel 3 covers 2° to 6° horizontally and -2° to 2° vertically, that of pixel 4 covers -6° to -2° horizontally and -6° to -2° vertically, and so on, with the sub-field of view of pixel 9 covering 2° to 6° horizontally and -10° to -6° vertically. Each pixel of the detection array therefore corresponds to one coordinate range in a coordinate system perpendicular to the distance direction.
The detection array can be divided evenly into N detection parts, each containing the same number of pixels, so that the sub-fields of view of each detection part cover fields of the same size, which facilitates the subsequent splicing of the original point cloud data; at the same time, since each detection part has the same number of pixels, the accumulated solar interference is also the same, facilitating subsequent denoising. The pixels contained in each detection part can be contiguous or discrete. Preferably, the pixels of one detection part are contiguous: several pixels in one region of the detection array form one detection part and work joined together as a whole; likewise, since the sub-fields of view of adjacent pixels adjoin, the sub-fields of view of the pixels of that detection part also join into a whole, becoming the field of view corresponding to that detection part. Contiguous pixels allow a detection part to detect one region of the overall field of view, making it easier to obtain more information from contiguous original point cloud data and reducing the complexity of splicing the original point cloud data. Further, the detection array can be divided into multiple detection parts by row or by column. Taking column division as an example: a 3×3 detection array (as shown in FIG. 4) can be divided by column into 3 detection parts, the 1st containing the 1st column of pixels (pixels 1, 4 and 7), the 2nd containing the 2nd column (pixels 2, 5 and 8), and the 3rd containing the 3rd column (pixels 3, 6 and 9). When the detection array is divided by column, each detection part can also contain multiple columns of pixels; for example, a 6×6 detection array can be divided by column into 3 detection parts, each containing two columns of pixels (as shown in FIG. 5). The more pixels a detection part contains, the more ambient-photon noise is accumulated for the same integration time, which harms the signal-to-noise ratio; on the other hand, the more pixels a detection part contains, the simpler the system design and control method. The division of the detection array into N detection parts is therefore a trade-off based on actual application requirements. The pixels of a detection part can also be discrete: several pixels distributed discretely across the detection array, the field of view of that detection part correspondingly being composed of several sub-fields of view scattered across the overall field of view. Detecting with a detection part of discretely distributed pixels enables rapid simultaneous detection of different regions of the overall field of view, obtaining rough information about each region and facilitating subsequent adjustment of the lidar's detection parameters.
The detection array can also be divided unevenly into N detection parts, with the detection parts not all containing the same number of pixels. As before, on the one hand the more pixels a detection part contains, the worse for the signal-to-noise ratio; on the other hand, the more pixels, the simpler the system design and control method. The detection array can be divided into detection parts according to the surrounding environment to meet the detection needs of different regions of the overall field of view, giving better adaptability and flexibility.
The detection array is divided into N detection parts and the detection window time into N sub-window times. Within one detection window time, one detection part is opened in each sub-window time. After one detection window time, the N detection parts of the detection array have each been opened once to receive the echo laser.
In one detection window time, the i-th detection part is turned on in the first sub-window time to receive the echo laser, and the detection parts are turned on in a preset order in each subsequent sub-window time; i is a positive integer less than or equal to N. The preset order in which the N detection parts are opened can be chosen as follows:
1. The 1st detection part (i = 1) is opened in the first sub-window time, the 2nd detection part in the second sub-window time, and so on, until the Nth detection part is opened in the Nth sub-window time.
Further, when the detection array is divided by column into N column detection parts, as shown in FIG. 6, these are, from left to right, the 1st column detection part, the 2nd column detection part, ..., the Nth column detection part. The order in which the N detection parts are opened is then: the 1st column detection part in the first sub-window time, the 2nd column detection part in the second sub-window time, and so on, until the Nth column detection part is opened in the Nth sub-window time. Taking the foregoing embodiment as an example, as shown in FIG. 4, the detection array includes 3×3 pixels divided by column into 3 column detection parts, and the detection window time is likewise divided into 3 sub-window times. The first sub-window time opens the 1st column detection part, i.e. pixels 1, 4 and 7; the second sub-window time opens the 2nd column detection part, i.e. pixels 2, 5 and 8; the third sub-window time opens the 3rd column detection part, i.e. pixels 3, 6 and 9.
When the detection array is divided by row into N row detection parts, the preset opening order is similar to that of the column detection parts and is not repeated here.
Of course, the corresponding detection part can also be opened in each sub-window time in descending order of detection-part number: the Nth detection part in the first sub-window time, the (N-1)th in the second sub-window time, and so on, until the 1st detection part is opened in the Nth sub-window time.
2. The i-th detection part (1 < i ≤ N) is opened in the first sub-window time, the (i+1)-th in the second sub-window time, ..., the Nth in the (N-i+1)-th sub-window time, the 1st in the (N-i+2)-th sub-window time, the 2nd in the (N-i+3)-th sub-window time, and so on, until the (i-1)-th detection part is opened in the Nth sub-window time.
Preferably, similar to the above, when the detection array is divided by column into N column detection parts, the column detection parts can also be opened in this preset order. For example, the detection array includes 5×2 pixels divided by column into 5 column detection parts, each including 1×2 pixels, and the detection window time is divided into 5 sub-window times. The first sub-window time opens the 4th column detection part (i = 4), the second opens the 5th, the third opens the 1st, the fourth opens the 2nd, and the fifth opens the 3rd.
When the detection array is divided by row into N row detection parts, the preset opening order is similar to that of the column detection parts and is not repeated here.
Similarly, the corresponding detection part can also be opened in each sub-window time in descending order of detection-part number. Taking the foregoing embodiment as an example, the first sub-window time opens the 4th column detection part (i = 4), the second opens the 3rd, the third opens the 2nd, the fourth opens the 1st, and the fifth opens the 5th.
3. Opening in any order. As described above, within one detection window time one detection part is opened in each sub-window time. The N detection parts need not be opened in order of their numbers; they can be opened in any order, as long as, within one detection window time, every detection part is opened once and only one detection part is opened in each sub-window time.
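The "any order" condition above is exactly the requirement that the per-window opening sequence be a permutation of the N detection parts: every part opened once, one part per sub-window time. An illustrative check (function and variable names are assumptions for illustration):

```python
def is_valid_schedule(order: list, n: int) -> bool:
    """True iff `order` opens each of the n detection parts exactly once,
    one part per sub-window time, within one detection window."""
    return len(order) == n and sorted(order) == list(range(1, n + 1))

print(is_valid_schedule([3, 1, 2], 3))  # -> True: a permutation of 1..3
print(is_valid_schedule([1, 1, 3], 3))  # -> False: part 1 opened twice
```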
Within the first detection window time, once all detection parts of the detection array have completed their detection in their sub-window times, the acquisition of the first group of original point cloud data is complete.
Step 102: Traverse all values of i and execute the step of turning on the i-th detection part in the first sub-window time to receive the echo laser and turning on the detection parts in the preset order in each successive sub-window time to obtain one group of original point cloud data, so as to obtain N groups of original point cloud data.
After the acquisition of the first group of original point cloud data is completed, the next detection window time begins.
Any one detection part of the detection array is taken as an example detection part for explanation.
In step 101, within the first detection window time the example detection part opens to receive the echo laser only in its corresponding sub-window time; that is, it has detected only one distance range within its sub-field of view, namely the distance range corresponding to the sub-window time in which it opened. Since the example detection part detects only one distance range in one detection window time, the remaining distance ranges must all be detected in subsequent detection window times before all original point cloud data in its sub-field of view can be obtained. As described above, one detection window time is divided into N sub-window times, and the detection distance of each sub-window time corresponds to one distance range within the measuring range of the detection system. Therefore, once the example detection part has been opened in all the sub-window times, its sub-field of view has been completely detected within the measuring range. In one detection window time the example detection part opens only once, in its corresponding sub-window time; at least N detection window times are needed before the example detection part can obtain all original point cloud data in its sub-field of view. To improve the detection efficiency of the example detection part and the real-time performance of the output original point cloud data, within N consecutive detection window times the example detection part must open once in each different sub-window time, without repetition.
The example detection part is any detection part of the detection array. With every detection part opening once in each different sub-window time within the same N consecutive detection window times, complete detection point cloud data of the overall field of view can be obtained quickly.
To simplify the design of the detection system and the detection method, all values of i are traversed and step 101 is executed. In each detection window time, the detection part opened in the first sub-window time is different; in step 101 each detection part opens in its corresponding sub-window time according to the preset order, and after N detection window times every detection part has opened in all the sub-window times.
Further, taking the division of the detection array by column as an example, the detection array includes the 1st column detection part, the 2nd column detection part, ..., the Nth column detection part. When i takes values from small to large: in the first detection window time, the first sub-window time opens the 1st column detection part (i = 1), the second sub-window time opens the 2nd, ..., and the Nth sub-window time opens the Nth. In the second detection window time, the first sub-window time opens the 2nd column detection part, the second opens the 3rd, ..., the (N-1)-th opens the Nth, and the Nth opens the 1st. By analogy, in the Nth detection window time the first sub-window time opens the Nth column detection part, the second opens the 1st, the third opens the 2nd, ..., and the Nth opens the (N-1)-th. After N×N sub-window times, the acquisition of N groups of original point cloud data is complete. Compared with the previous detection window time, the number of the column detection part opened in the sub-window time of the same ordinal is shifted back by one. i can also take values from large to small, in which case, compared with the previous detection window time, the number of the column detection part opened in each sub-window time is shifted forward by one; the specific opening manner is similar to that when i goes from small to large and is not repeated here.
When the detection array is divided by row into N row detection parts, the preset opening order is similar to that of the column detection parts and is not repeated here.
Step 103: Splice the original point cloud data to obtain one frame of detection point cloud data.
After all the point cloud data are spliced according to the correspondence between distance information and azimuth information, one complete frame of detection point cloud data is obtained, as shown in FIG. 7.
As described above, within one detection window time each detection part opens to receive the echo laser only in one sub-window time, i.e. it detects only one distance range. Viewed from any one detection part, after N consecutive detection window times that part has opened in the sub-window times corresponding to all distance ranges within the measuring range; according to the distance information in the original point cloud data, splicing that part's original point cloud data of the different distance ranges yields the original point cloud data of the sub-field of view corresponding to that part. The sub-field of view of each detection part corresponds to one region of the overall field of view; according to the azimuth information in the original point cloud data, splicing the original point cloud data of the N sub-fields of view corresponding to the N detection parts yields the complete detection point cloud data of the overall field of view.
By dividing the detection array into N detection parts, within one detection window time each detection part works only during one of the sub-window times rather than throughout the whole detection window time. The embodiments of the present application thus shorten the window time over which each detection part collects signal, reduce the noise accumulated when the echo laser is received, improve the signal-to-noise ratio of the system, and improve the immunity to ambient photons.
The emission system adopts flood illumination, and the detection system does not receive the echo laser continuously over one long window time, which reduces the accumulation of ambient photons. At the same time, since each sub-window time drives only a specific detection part of the detection array to receive the echo laser, the receiving area for ambient photons is reduced. The interference of ambient sunlight with the echo laser is therefore reduced and the signal-to-noise ratio of the system improved: under the same integration time, the signal-to-noise ratio of the system can be raised by 2logN times and the accumulation of noise reduced by N² times.
Referring to FIG. 8, which is a schematic flowchart of another embodiment of a lidar detection method proposed by the present application.
Step 201: In one detection window time, turn on the i-th detection part in the first sub-window time to receive the echo laser, and turn on the detection parts in a preset order in each successive sub-window time to obtain one group of original point cloud data; i is a positive integer less than or equal to N.
For details, refer to step 101; this is not repeated here.
Step 202: Traverse all values of i and execute the step of turning on the i-th detection part in the first sub-window time to receive the echo laser and turning on the detection parts in the preset order in each successive sub-window time to obtain one group of original point cloud data, so as to obtain N groups of original point cloud data.
For details, refer to step 102; this is not repeated here.
Step 203: Determine the distance information of the original point cloud data according to the sub-window times of the original point cloud data.
As described above, within one detection window time each detection part opens to receive the echo laser only in one sub-window time, i.e. it detects only one distance range. From the sub-window time in which each piece of original point cloud data was acquired, the detection distance range corresponding to that data can be obtained; this distance range is the distance information of the original point cloud data. Taking the foregoing embodiment as an example, as shown in FIG. 4, the detection array includes 3×3 pixels divided by column into 3 column detection parts. In the first detection window time, the first sub-window time opens the 1st column detection part to obtain original point cloud data D11, the second sub-window time opens the 2nd column detection part to obtain D12, and the third sub-window time opens the 3rd column detection part to obtain D13; after the first detection window time, all three column detection parts have opened once, giving the first group of original point cloud data. In the first group, the distance range corresponding to D11 is 0-L1, that of D12 is L1-L2, and that of D13 is L2-L3. Similarly, in the second detection window time, the first sub-window time opens the 2nd column detection part to obtain D22 (distance range 0-L1), the second sub-window time opens the 3rd column detection part to obtain D23 (L1-L2), and the third sub-window time opens the 1st column detection part to obtain D21 (L2-L3); after the second detection window time, the second group of original point cloud data is obtained. In the third detection window time, the first sub-window time opens the 3rd column detection part to obtain D33 (0-L1), the second sub-window time opens the 1st column detection part to obtain D31 (L1-L2), and the third sub-window time opens the 2nd column detection part to obtain D32 (L2-L3); after the third detection window time, the third group of original point cloud data is obtained.
Step 204: Determine the azimuth information of the original point cloud data according to the positions of the detection parts in the detection array.
As described above, each pixel of the detection array corresponds to one sub-field of view; projected into a coordinate system perpendicular to the distance direction, each sub-field of view corresponds to one coordinate range, and the sub-fields of view joined together form the entire detection field of view of the detection array. From the position, within the detection array, of the detection part that acquired a piece of original point cloud data, the position of that part's sub-field of view in the entire detection field of view can be known; this is the azimuth information. Taking the foregoing embodiment as an example, as shown in FIG. 4, the detection array includes 3×3 pixels divided by column into 3 column detection parts, and the detection field of view of the detection array covers -6° to 6° horizontally and -10° to 2° vertically. The 1st column detection part is located at the leftmost side of the detection array, and its sub-field of view is located at the same side of the entire detection field of view, covering -6° to -2° horizontally and -10° to 2° vertically; this is the azimuth information of the original point cloud data obtained by the 1st column detection part. Similarly, the azimuth information of the original point cloud data detected by the 2nd column detection part covers -2° to 2° horizontally and -10° to 2° vertically, and that detected by the 3rd column detection part covers 2° to 6° horizontally and -10° to 2° vertically.
Step 206: Splice and merge the distance information and azimuth information corresponding to the original point cloud data to obtain one frame of detection point cloud data.
After steps 203 and 204, the distance information and azimuth information of each piece of original point cloud data can be obtained, i.e. which region's sub-field of view in the entire detection field of view the data correspond to, and which distance range within the measuring range they fall in. After N×N sub-window times, splicing the N×N pieces of original point cloud data by their distance information and azimuth information yields point cloud data covering the entire detection field of view and the full measuring range, i.e. one frame of detection point cloud data.
In some embodiments, some original point cloud data may be eliminated in advance, before the point cloud data are spliced and merged. Therefore, as shown in FIG. 9, the following steps may also be included before step 206:
Step 2051: Record all original point cloud data with the same azimuth information as a point cloud group.
As described above, original point cloud data with the same azimuth information are the N pieces of original point cloud data obtained by the same detection part over N detection window times, each corresponding to a different distance range within the measuring range. As shown in FIG. 3, the detection array includes 3×3 pixels divided by row into 3 row detection parts: the 1st row detection part corresponds to pixels 1, 2 and 3, the 2nd to pixels 4, 5 and 6, and the 3rd to pixels 7, 8 and 9. The sub-field of view of each row detection part is fixed, so the original point cloud data it obtains share the same azimuth information; all original point cloud data with the same azimuth information are marked as one point cloud group.
Step 2052: Acquire the ranging range according to the azimuth information.
In this step, the ranging range can be preset according to actual application requirements. For example, in some regions of the entire detection field of view the required ranging distance is short, such as sub-fields of view facing the sky or the ground, while in other regions the required ranging distance is long, such as sub-fields of view facing forward. In short, the ranging range of each azimuth (corresponding to each detection part) can be preset according to detection requirements.
Taking the embodiment in step 2051 as an example, the sub-field of view of the 1st row detection part faces upward and is used to detect obstacles above. Since obstacles above have little effect on the operation of a vehicle, a large measuring range is usually unnecessary; the ranging range of the 1st row detection part can be set to 0-La, for example La = 20 m. The sub-field of view of the 2nd row detection part faces forward and needs to obtain farther and more precise obstacle information, usually requiring a large range and high resolution; its ranging range can be set to 0-Lb, where Lb > La, for example Lb = 150 m. The sub-field of view of the 3rd row detection part faces downward; the outgoing laser in that sub-field of view quickly hits the ground and is reflected back as the echo laser, so a large range is likewise usually unnecessary; its ranging range can be set to 0-Lc, where Lb > Lc, for example Lc = 15 m. La and Lc can be equal or unequal and are set according to detection requirements.
Step 2053: Eliminate part of the original point cloud data in each point cloud group; the distance information of the eliminated original point cloud data is not contained in the ranging range.
The original point cloud data beyond the ranging range corresponding to a point cloud group are removed from that group. In this way, unneeded data are eliminated, the time for subsequent data splicing and merging is reduced, and splicing efficiency is improved. Taking the embodiments in steps 2051 and 2052 as an example: from the original point cloud data detected by the 1st row detection part, data outside the 0-La ranging range are eliminated. If La = L1, the elimination is even more convenient: simply discard the original point cloud data detected in the 2nd and 3rd sub-window times. Similarly, from the data detected by the 2nd row detection part, data outside the 0-Lb ranging range are eliminated; since the 2nd row detection part is usually expected to cover a large range, Lb can be the measuring range of the detection system. From the data detected by the 3rd row detection part, data outside the 0-Lc ranging range are eliminated.
By dividing the detection array into N detection parts, within one detection window time each detection part works only during one of the sub-window times rather than throughout the whole detection window time. The embodiments of the present application thus shorten the window time over which each detection part collects signal, reduce the noise accumulated when the echo laser is received, improve the signal-to-noise ratio of the system, and improve the immunity to ambient photons.
The embodiments of the present application also remove from each point cloud group, before splicing the original point cloud data, the original point cloud data outside the ranging range required for the field-of-view region corresponding to that group, reducing the computational load of subsequent data splicing and merging and improving the efficiency of data processing.
An embodiment of the present application further provides a computer storage medium. The computer storage medium can store a plurality of instructions, and the instructions are suitable for being loaded by a processor to execute the lidar detection method of the embodiments shown in FIG. 1 to FIG. 9; for the specific execution process, refer to the specific descriptions of the embodiments shown in FIG. 1 to FIG. 9, which are not repeated here. The computer storage medium is a non-volatile computer-readable storage medium.
The present application also provides a computer program product storing at least one instruction, the at least one instruction being loaded and executed by the processor as specifically described for the embodiments shown in FIG. 1 to FIG. 9; this is not repeated here.
The following are apparatus embodiments of the present application, which can be used to execute the method embodiments of the present application. For details not disclosed in the apparatus embodiments of the present application, please refer to the method embodiments of the present application.
Referring to FIG. 10, which shows a schematic diagram of a lidar detection system provided by an exemplary embodiment of the present application. The detection system can be implemented, in software, hardware or a combination of the two, as all or part of a device. The detection system includes a detection array, a control unit and a processing unit:
a detection array 701, divided into N detection parts, N being an integer greater than 1;
a control unit 702 which, in one detection window time, sends a control command to turn on the i-th detection part in the first sub-window time to receive the echo laser; sends control commands according to a preset time sequence so that the detection part corresponding to each sub-window time is opened in turn, obtaining one group of original point cloud data, i being a positive integer less than or equal to N; and repeats the previous step, traversing all values of i, to obtain N groups of original point cloud data;
a processing unit 703 which splices the original point cloud data to obtain one frame of detection point cloud data.
In some embodiments, the control unit 702 is further configured to: when i = 1, turn on the 1st detection part in the first sub-window time;
turn on the 2nd detection part in the second sub-window time;
and so on, until the Nth detection part is turned on in the Nth sub-window time.
In some embodiments, the control unit 702 is further configured to: when i > 1, turn on the i-th detection part in the first sub-window time;
turn on the (i+1)-th detection part in the second sub-window time;
and so on, until the Nth detection part is turned on in the (N-i+1)-th sub-window time;
turn on the 1st detection part in the (N-i+2)-th sub-window time;
and so on, until the (i-1)-th detection part is turned on in the Nth sub-window time.
在一些实施例中,处理单元703进一步用于:
根据原始点云数据的子窗口时间,确定原始点云数据的距离信息;
根据探测部在探测阵列中的位置,确定原始点云数据的方位信息;
将原始点云数据对应的距离信息和方位信息进行拼接合并,得到一帧探测点云数据。
In some embodiments, the processing unit 703 is further configured to:
record all the raw point cloud data with the same azimuth information as a point cloud group;
obtain a ranging range according to the azimuth information; and
remove part of the raw point cloud data from the point cloud group, the distance information of the removed raw point cloud data falling outside the ranging range.
In some embodiments, the detection array is divided into N column detection parts arranged, from one end of the detection array to the other along the direction perpendicular to the columns, as the 1st column detection part, the 2nd column detection part, ..., and the N-th column detection part;
or the detection array is divided into N row detection parts arranged, from one end of the detection array to the other along the direction perpendicular to the rows, as the 1st row detection part, the 2nd row detection part, ..., and the N-th row detection part.
In some embodiments, each column detection part includes p columns of pixels, or each row detection part includes p rows of pixels, p being a positive integer.
In some embodiments, the emission interval of the outgoing laser of the lidar is greater than the detection window time.
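A hedged sketch of the constraint just stated: the emission interval must exceed the detection window so that echoes of one pulse cannot fall into the window opened for the next pulse. The numeric values are illustrative assumptions, not figures from this application:

```python
# Range-ambiguity guard: the interval between successive laser emissions
# must exceed the detection window, so the full window belongs to exactly
# one pulse. Values below are illustrative only.

def interval_ok(emit_interval_s, window_s):
    """True when the emission interval leaves the whole detection
    window free of the next pulse."""
    return emit_interval_s > window_s

ok = interval_ok(2e-6, 1.5e-6)    # 2 us between pulses, 1.5 us window
bad = interval_ok(1e-6, 1.5e-6)   # next pulse would fire mid-window
```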
In some embodiments, the detection window time is evenly divided into N sub-window times.
By dividing the detection array into N detection parts, each detection part in the embodiments of the present application works in only one sub-window time of a detection window rather than throughout the entire window. This shortens the time window over which each detection part collects signal, reduces the noise accumulated by the detection part while receiving echo laser, improves the signal-to-noise ratio of the system, and improves immunity to ambient photons.
Referring to FIG. 11 and FIG. 12, schematic diagrams of a lidar 100 provided by an exemplary embodiment of the present application are shown. The lidar includes a transmitting system 1, a detection system 2 and a control system 3.
The transmitting system 1 is configured to emit outgoing laser. The detection system 2 is configured to receive echo laser, the echo laser being laser returned after the outgoing laser is reflected by an object in the detection region, and to obtain detection point cloud data of objects in the detection region based on the outgoing laser and the echo laser. The control system 3 is configured to control the transmitting system 1 and the detection system 2.
The transmitting system 1 uses flood illumination, typically with an area-array light source. The transmitting system 1 includes a transmitting array formed of light sources, which may be of various types, for example one of, or a combination of, a pulsed laser diode (PLD), a laser diode (LD), a vertical-cavity surface-emitting laser (VCSEL), an edge-emitting laser (EEL), a light-emitting diode (LED) and the like. The transmitting system 1 further includes a transmission control unit configured to control the transmitting array to emit outgoing laser.
The lidar 100 generally further includes a transmitting optical system (not shown), which may use one of, or a combination of, a lens, a lens group, an optical fiber, a micro-lens group, a micro-lens array and the like, to emit the outgoing laser from the transmitting system uniformly over the entire detection field of view at once.
The detection system 2 includes a detection array formed of pixels, which may be a one-dimensional or a two-dimensional detection array. The pixels of the detection array may use one of, or a combination of, integrating elements such as an avalanche photodiode (APD), a multi-pixel photon counter (MPPC), a silicon photomultiplier (SiPM), a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor and the like. The control unit of the detection system controls the reception and sampling of the detection array: the integrating elements obtain the signal energy of the echo laser, an algorithm is then used to recover the signal waveform, and the echo time is obtained after sampling. Distance information is obtained from the echo time, azimuth information is obtained from the sub-field of view corresponding to each pixel in the detection array, and raw point cloud data is thereby obtained. The processing unit of the detection system stitches and merges the raw point cloud data to obtain one complete frame of detection point cloud data.
The detection array is divided into N detection parts, N being an integer greater than 1. For the arrangement of the detection parts, refer to the foregoing method embodiments, which are not repeated here.
The lidar 100 generally further includes a receiving optical system 4, configured to converge the echo laser and direct the converged echo laser onto each pixel of the detection system 2. The receiving optical system 4 may use one of, or a combination of, a lens, a lens group, an optical fiber, a micro-lens group, a micro-lens array and the like.
The control system 3 may use one of, or a combination of, a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC) and the like. The control system 3 may include the transmission control unit of the transmitting system 1 and the control unit of the detection system 2.
Finally, it should be noted that the above embodiments are merely intended to illustrate, rather than to limit, the technical solutions of the present application. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application; they shall all be covered by the scope of the claims and the description of the present application. In particular, as long as no structural conflict exists, the technical features mentioned in the various embodiments may be combined in any manner. The present application is not limited to the specific embodiments disclosed herein, but includes all technical solutions falling within the scope of the claims.

Claims (12)

  1. A lidar detection method, wherein a detection array of the lidar is divided into N detection parts and a detection window time is divided into N sub-window times, N being an integer greater than 1, the method comprising:
    in one detection window time, opening the i-th detection part in the first sub-window time to receive echo laser, and opening the detection parts in a preset order in each successive sub-window time, to obtain one group of raw point cloud data, i being a positive integer less than or equal to N;
    traversing all values of i and performing the step of opening the i-th detection part in the first sub-window time to receive echo laser and opening the detection parts in a preset order in each successive sub-window time to obtain one group of raw point cloud data, so as to obtain N groups of the raw point cloud data; and
    stitching the raw point cloud data to obtain one frame of detection point cloud data.
  2. The method according to claim 1, wherein the opening of the i-th detection part in the first sub-window time to receive echo laser and the opening of the detection parts in a preset order in each successive sub-window time comprise:
    when i = 1, opening the 1st detection part in the first sub-window time;
    opening the 2nd detection part in the second sub-window time;
    and so on, until the N-th detection part is opened in the N-th sub-window time.
  3. The method according to claim 1, wherein the opening of the i-th detection part in the first sub-window time to receive echo laser and the opening of the detection parts in a preset order in each successive sub-window time comprise:
    when i > 1, opening the i-th detection part in the first sub-window time;
    opening the (i+1)-th detection part in the second sub-window time;
    and so on, until the N-th detection part is opened in the (N-i+1)-th sub-window time;
    opening the 1st detection part in the (N-i+2)-th sub-window time;
    and so on, until the (i-1)-th detection part is opened in the N-th sub-window time.
  4. The method according to claim 1, wherein the stitching of the raw point cloud data to obtain one frame of detection point cloud data comprises:
    determining distance information of the raw point cloud data according to the sub-window time of the raw point cloud data;
    determining azimuth information of the raw point cloud data according to the position of the detection part in the detection array; and
    stitching and merging the raw point cloud data according to the corresponding distance information and azimuth information to obtain one frame of detection point cloud data.
  5. The method according to claim 4, further comprising, before the stitching and merging of the raw point cloud data according to the corresponding distance information and azimuth information:
    recording all the raw point cloud data with the same azimuth information as a point cloud group;
    obtaining a ranging range according to the azimuth information; and
    removing part of the raw point cloud data from the point cloud group, the distance information of the removed raw point cloud data falling outside the ranging range.
  6. The method according to any one of claims 1 to 3, wherein the detection array is divided into N column detection parts arranged, from one end of the detection array to the other along the direction perpendicular to the columns, as the 1st column detection part, the 2nd column detection part, ..., and the N-th column detection part;
    or the detection array is divided into N row detection parts arranged, from one end of the detection array to the other along the direction perpendicular to the rows, as the 1st row detection part, the 2nd row detection part, ..., and the N-th row detection part.
  7. The method according to claim 6, wherein each column detection part comprises p columns of pixels, or each row detection part comprises p rows of pixels, p being a positive integer.
  8. The method according to any one of claims 1 to 7, wherein an emission interval of the outgoing laser of the lidar is greater than the detection window time.
  9. The method according to any one of claims 1 to 7, wherein the detection window time is evenly divided into N sub-window times.
  10. A computer storage medium storing a plurality of instructions, the instructions being suitable for being loaded by a processor to perform the method steps according to any one of claims 1 to 9.
  11. A lidar detection system, wherein a detection window time is divided into N sub-window times, the system comprising:
    a detection array divided into N detection parts, N being an integer greater than 1;
    a control unit that, within one detection window time, sends a control instruction to open the i-th detection part in the first sub-window time to receive echo laser, sends control instructions according to a preset timing so that the detection part corresponding to each sub-window time is opened in turn to obtain one group of raw point cloud data, i being a positive integer less than or equal to N, and repeats the previous step, traversing all values of i, to obtain N groups of the raw point cloud data; and
    a processing unit that stitches the raw point cloud data to obtain one frame of detection point cloud data.
  12. A lidar, comprising a transmitting system, a control system and the detection system according to claim 11;
    the transmitting system being configured to emit outgoing laser;
    the detection system being configured to receive echo laser and to obtain, based on the outgoing laser and the echo laser, detection point cloud data of objects in a detection region; and
    the control system being configured to control the transmitting system and the detection system.
PCT/CN2020/108628 2020-08-12 2020-08-12 Lidar, detection method therefor, storage medium and detection system WO2022032516A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080004044.5A CN112470026A (zh) 2020-08-12 2020-08-12 Lidar, detection method therefor, storage medium and detection system
PCT/CN2020/108628 WO2022032516A1 (zh) 2020-08-12 2020-08-12 Lidar, detection method therefor, storage medium and detection system


Publications (1)

Publication Number Publication Date
WO2022032516A1 (zh) 2022-02-17

Family

ID=74802676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/108628 WO2022032516A1 (zh) 2020-08-12 2020-08-12 激光雷达及其探测方法、存储介质和探测系统

Country Status (2)

Country Link
CN (1) CN112470026A (zh)
WO (1) WO2022032516A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116520288A (zh) * 2023-07-03 2023-08-01 中国人民解放军国防科技大学 Method and system for denoising laser point cloud ranging data

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023279375A1 (zh) * 2021-07-09 2023-01-12 华为技术有限公司 Detection control method and apparatus
CN114594455B (zh) * 2022-01-13 2022-11-18 杭州宏景智驾科技有限公司 Lidar system and control method therefor
WO2023197570A1 (zh) * 2022-04-14 2023-10-19 上海禾赛科技有限公司 Lidar, detection method therefor, and readable storage medium
CN116047470B (zh) * 2023-01-28 2023-06-02 深圳煜炜光学科技有限公司 Semi-solid-state lidar and control method therefor
CN117233787B (zh) * 2023-11-09 2024-01-26 北京亮道智能汽车技术有限公司 Point cloud image acquisition method and apparatus, and lidar

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150199458A1 (en) * 2014-01-14 2015-07-16 Energid Technologies Corporation Digital proxy simulation of robotic hardware
CN110221309A (zh) * 2019-04-30 2019-09-10 深圳市光鉴科技有限公司 3D imaging apparatus and electronic device based on asynchronous ToF discrete point cloud
CN110244318A (zh) * 2019-04-30 2019-09-17 深圳市光鉴科技有限公司 3D imaging method based on asynchronous ToF discrete point cloud
CN110471081A (zh) * 2019-04-30 2019-11-19 深圳市光鉴科技有限公司 3D imaging apparatus and electronic device based on synchronous ToF discrete point cloud
CN110888144A (zh) * 2019-12-04 2020-03-17 吉林大学 Sliding-window-based lidar data synthesis method
CN210835244U (zh) * 2019-04-30 2020-06-23 深圳市光鉴科技有限公司 3D imaging apparatus and electronic device based on synchronous ToF discrete point cloud



Also Published As

Publication number Publication date
CN112470026A (zh) 2021-03-09

Similar Documents

Publication Publication Date Title
WO2022032516A1 (zh) Lidar, detection method therefor, storage medium and detection system
US20210181317A1 (en) Time-of-flight-based distance measurement system and method
US9417326B2 (en) Pulsed light optical rangefinder
WO2021051478A1 (zh) Time-of-flight distance measurement system and measurement method with dual shared TDC circuits
CN110927734B (zh) Lidar system and anti-interference method therefor
US10739445B2 (en) Parallel photon counting
WO2021051479A1 (zh) Interpolation-based time-of-flight measurement method and measurement system
WO2021051481A1 (zh) Time-of-flight distance measurement method and measurement system based on dynamic histogram drawing
WO2021051480A1 (zh) Time-of-flight distance measurement method and measurement system based on dynamic histogram drawing
CN110221274B (zh) Time-of-flight depth camera and distance measurement method using multi-frequency modulation and demodulation
CN115210602A (zh) Noise filtering system and method for solid-state lidar
CN114096882A (zh) Adaptive multi-pulse lidar system
CN110221273B (zh) Time-of-flight depth camera and distance measurement method using single-frequency modulation and demodulation
WO2022062382A1 (zh) Lidar detection method and lidar
CN112305519B (zh) Fast lidar detection system based on silicon photomultiplier
US20220187430A1 (en) Time of flight calculation with inter-bin delta estimation
WO2022206031A1 (zh) Method for determining noise level, lidar, and ranging method
US20220099814A1 (en) Power-efficient direct time of flight lidar
CN114488173A (zh) Time-of-flight-based distance detection method and system
CN113050119B (zh) Decision method for interference with optical flash 3D imaging radar
WO2022126429A1 (zh) Ranging apparatus, ranging method and movable platform
WO2023133964A1 (zh) Lidar system and ambient-light denoising method therefor
WO2023197532A1 (zh) Lidar transceiver apparatus and lidar
WO2022036714A1 (zh) Laser ranging method, ranging apparatus and movable platform
WO2023133965A1 (zh) Lidar system and ambient-light sensing method therefor

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 20949015; country of ref document: EP; kind code of ref document: A1)
NENP Non-entry into the national phase (ref country code: DE)
32PN EP: public notification in the EP bulletin as the address of the addressee cannot be established (free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23/05/2023))
122 EP: PCT application non-entry in European phase (ref document number: 20949015; country of ref document: EP; kind code of ref document: A1)