CN113325388A - Method and device for filtering lidar flood noise in autonomous driving

Publication number: CN113325388A
Authority: CN (China)
Prior art keywords: data, point cloud, pixel, cloud data, point
Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202110626524.1A
Other languages: Chinese (zh)
Inventors: 张雨, 陈东
Current and original assignee: Beijing Qingzhou Zhihang Technology Co., Ltd. (the listed assignees may be inaccurate)
Application filed 2021-06-04 by Beijing Qingzhou Zhihang Technology Co., Ltd.; priority to CN202110626524.1A; published as CN113325388A

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06: Systems determining position data of a target
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates

Abstract

Embodiments of the invention relate to a method and device for filtering lidar flood noise in autonomous driving. The method comprises: performing laser scanning measurement of a first target environment with a lidar during autonomous driving to obtain a first point cloud data set; performing range-image conversion on the first point cloud data set to generate first image data; performing layered image processing on the first image data to generate second image data; performing foreground blob marking on the second image data with a blob detection model to obtain a first blob area; marking the outer-edge area of the first blob area to obtain a first outer-edge area; marking first outer-edge point cloud data; calculating first area depth data; identifying and marking flood noise points; and filtering the first point cloud data marked as flood noise points out of the first point cloud data set. The invention addresses the reduced accuracy of lidar measurement data in wet or rainy weather.

Description

Method and device for filtering lidar flood noise in autonomous driving
Technical Field
The invention relates to the field of data processing, and in particular to a method and device for filtering lidar flood noise in autonomous driving.
Background
In autonomous driving, a lidar (laser radar) rapidly measures environmental targets around the vehicle's route and sends the resulting point cloud data to the autonomous driving system for decisions about route, speed, and so on. The accuracy of lidar measurement data is therefore crucial for safe autonomous driving. In practice, it has been found that in wet or rainy weather, a lidar scanning a highly reflective object (such as a traffic sign) tends to generate noise points at the object's edges; these are known as flood noise points. As ambient humidity or rainfall increases, the number of flood noise points grows, which greatly reduces the accuracy of the lidar measurement data and thereby threatens safe autonomous driving.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a method, a device, an electronic device, and a computer-readable storage medium for filtering lidar flood noise in autonomous driving. This addresses the reduced accuracy of lidar measurement data in wet or rainy weather and further improves the environmental adaptability, safety, and stability of autonomous driving.
To achieve the above object, a first aspect of the embodiments of the invention provides a method for filtering lidar flood noise in autonomous driving, the method comprising:
during autonomous driving, an autonomous driving control system performs laser scanning measurement of a first target environment with a lidar and converts the scanning result into point cloud data to obtain a corresponding first point cloud data set, the first point cloud data set comprising a plurality of first point cloud data, each first point cloud data comprising first three-dimensional coordinate data and first reflection intensity data, and the first three-dimensional coordinate data comprising first depth data;
performing range-image conversion on the first point cloud data set according to a preset correspondence between depth and RGB pixel values to generate first image data, the first image data comprising a plurality of first pixel point data, the first point cloud data corresponding to the first pixel point data;
performing layered image processing on the first image data according to a preset correspondence between reflection intensity ranges and designated pixel values to generate second image data, the second image data comprising a plurality of second pixel point data, the first pixel point data corresponding to the second pixel point data;
performing foreground blob marking on the second image data with a blob detection model to obtain a first blob area;
marking the outer-edge area of the first blob area according to a preset outer-edge range parameter to obtain a first outer-edge area; marking the second pixel point data within the first outer-edge area as first outer-edge pixel point data; and, in the first point cloud data set, marking the first point cloud data corresponding to the first outer-edge pixel point data as first outer-edge point cloud data;
calculating the vertical depth of the first blob area to generate first area depth data;
identifying and marking flood noise points among the first outer-edge point cloud data according to the first area depth data; and
filtering the first point cloud data marked as flood noise points out of the first point cloud data set.
Preferably, the first three-dimensional coordinate data further comprises first width data and first height data;
the first pixel point data comprises first two-dimensional coordinate data and first pixel data, the first two-dimensional coordinate data comprising first abscissa data and first ordinate data; and
the second pixel point data comprises second two-dimensional coordinate data and second pixel data, the second two-dimensional coordinate data comprising second abscissa data and second ordinate data.
Preferably, performing range-image conversion on the first point cloud data set according to the preset correspondence between depth and RGB pixel values to generate first image data specifically comprises:
sequentially extracting each first point cloud data of the first point cloud data set as current point cloud data; converting the first width data of the current point cloud data into first abscissa data; converting the first height data of the current point cloud data into first ordinate data; polling the first correspondence records of a preset first correspondence table, which reflects the correspondence between depth and RGB pixel values, against the first depth data of the current point cloud data, and if the first depth range field of the currently polled first correspondence record matches the first depth data, extracting the first pixel value field of that record as the first pixel data; the first abscissa data and the first ordinate data forming the first two-dimensional coordinate data, and the first two-dimensional coordinate data and the first pixel data forming the first pixel point data; the first correspondence table comprising a plurality of first correspondence records, each comprising a first depth range field and a first pixel value field, the first depth range field being a depth value range or a single depth value; and
performing two-dimensional image conversion on the two-dimensional coordinates and pixel values of the plurality of first pixel point data to generate the first image data.
Preferably, performing layered image processing on the first image data according to the preset correspondence between reflection intensity ranges and designated pixel values to generate second image data specifically comprises:
sequentially extracting each first pixel point data of the first image data as current pixel point data; taking the first abscissa data of the current pixel point data as the second abscissa data; taking the first ordinate data as the second ordinate data; acquiring the first reflection intensity data of the first point cloud data corresponding to the current pixel point data as first current reflection intensity data; polling the second correspondence records of a preset second correspondence table, which reflects the correspondence between reflection intensity ranges and designated pixel values, against the first current reflection intensity data, and if the first reflection intensity range field of the currently polled second correspondence record matches the first current reflection intensity data, extracting the first designated pixel value field of that record as the second pixel data; if no record in the second correspondence table matches the first current reflection intensity data, taking the first pixel data of the current pixel point data as the second pixel data; the second abscissa data and the second ordinate data forming the second two-dimensional coordinate data, and the second two-dimensional coordinate data and the second pixel data forming the second pixel point data; the second correspondence table comprising two second correspondence records, each comprising a first reflection intensity range field and a first designated pixel value field, and only two designated pixel values appearing in the table; and
performing two-dimensional image conversion on the two-dimensional coordinates and pixel values of the plurality of second pixel point data to generate the second image data.
Preferably, performing foreground blob marking on the second image data with a blob detection model to obtain a first blob area specifically comprises:
inputting the second image data into the blob detection model and performing foreground blob identification according to the depth relationships of the image pixels to obtain a first blob coordinate set corresponding to the foreground blob; and incorporating the second pixel point data of the second image data that fall within the first blob coordinate set into the first blob area.
Preferably, identifying and marking flood noise points among the first outer-edge point cloud data according to the first area depth data specifically comprises:
polling each first outer-edge point cloud data and taking the currently polled first outer-edge point cloud data as current outer-edge point cloud data; extracting the first depth data of the current outer-edge point cloud data as current depth data and its first reflection intensity data as second current reflection intensity data; and, when the current depth data matches the first area depth data and the second current reflection intensity data is below a preset flood-noise reflection intensity threshold, marking the current outer-edge point cloud data as a flood noise point.
A second aspect of the embodiments of the invention provides a device for implementing the method of the first aspect for filtering lidar flood noise in autonomous driving. The device is applied to an autonomous driving control system that comprises the device and a lidar, and the device comprises a lidar data processing module, a range image processing module, a layered image processing module, a blob detection processing module, and a flood noise processing module;
the lidar data processing module is configured to perform laser scanning measurement of a first target environment with the lidar during autonomous driving and to convert the scanning result into point cloud data to obtain a corresponding first point cloud data set, the first point cloud data set comprising a plurality of first point cloud data, each comprising first three-dimensional coordinate data and first reflection intensity data, the first three-dimensional coordinate data comprising first depth data;
the range image processing module is configured to perform range-image conversion on the first point cloud data set according to a preset correspondence between depth and RGB pixel values to generate first image data, the first image data comprising a plurality of first pixel point data, the first point cloud data corresponding to the first pixel point data;
the layered image processing module is configured to perform layered image processing on the first image data according to a preset correspondence between reflection intensity ranges and designated pixel values to generate second image data, the second image data comprising a plurality of second pixel point data, the first pixel point data corresponding to the second pixel point data;
the blob detection processing module is configured to perform foreground blob marking on the second image data with a blob detection model to obtain a first blob area;
the flood noise processing module is configured to mark the outer-edge area of the first blob area according to a preset outer-edge range parameter to obtain a first outer-edge area; mark the second pixel point data within the first outer-edge area as first outer-edge pixel point data; mark, in the first point cloud data set, the first point cloud data corresponding to the first outer-edge pixel point data as first outer-edge point cloud data; calculate the vertical depth of the first blob area to generate first area depth data; identify and mark flood noise points among the first outer-edge point cloud data according to the first area depth data; and filter the first point cloud data marked as flood noise points out of the first point cloud data set.
A third aspect of an embodiment of the present invention provides an electronic device, including: a memory, a processor, and a transceiver;
the processor is configured to be coupled to the memory, read and execute instructions in the memory, so as to implement the method steps of the first aspect;
the transceiver is coupled to the processor, and the processor controls the transceiver to transmit and receive messages.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium storing computer instructions that, when executed by a computer, cause the computer to perform the method of the first aspect.
The embodiments of the invention provide a method, a device, an electronic device, and a computer-readable storage medium for filtering lidar flood noise in autonomous driving. They address the reduced accuracy of lidar measurement data in wet or rainy weather and further improve the environmental adaptability, safety, and stability of autonomous driving.
Drawings
Fig. 1 is a schematic diagram of a method for filtering lidar flood noise in autonomous driving according to a first embodiment of the present invention;
Fig. 2 is a block diagram of a device for filtering lidar flood noise in autonomous driving according to a second embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
A first embodiment of the present invention provides a method for filtering lidar flood noise in autonomous driving, shown schematically in fig. 1. The method mainly includes the following steps:
Step 1: during autonomous driving, the autonomous driving control system performs laser scanning measurement of a first target environment with a lidar and converts the scanning result into point cloud data to obtain a corresponding first point cloud data set.
The first point cloud data set comprises a plurality of first point cloud data; each first point cloud data comprises first three-dimensional coordinate data and first reflection intensity data; the first three-dimensional coordinate data includes first width data, first height data, and first depth data.
A lidar is a scanning measurement device that integrates laser scanning with a positioning and attitude-determination system. It emits light-pulse signals toward an environmental target through a laser and receives the reflected signals through a receiver. The depth distance between the lidar and each scanned point of the target is obtained from the time difference between the transmitted and received signals; the horizontal and vertical displacements between the lidar and each scanned point are obtained from the differences between the transmitted and received signals; and the reflected signal strength of each scanned point is obtained from the energy ratio of the transmitted and received signals. The autonomous driving control system is a control system installed on an autonomous vehicle; it statistically processes the scanning data of the lidar to obtain a point cloud data set corresponding to the environmental target, namely the first point cloud data set. Point cloud data are, in effect, scanning data calibrated to a three-dimensional coordinate system: each point cloud data comprises a set of three-dimensional coordinates, namely the first three-dimensional coordinate data, and a laser reflection intensity, namely the first reflection intensity data. The three-dimensional coordinates include coordinate components in three directions, an X-axis component, a Y-axis component, and a Z-axis component, where the X-axis component is the first width data, the Y-axis component is the first depth data, and the Z-axis component is the first height data. The first reflection intensity data is the laser reflection intensity calculated after the lidar transmits a laser signal to the target point defined by the three-dimensional coordinates and receives the signal reflected from that point. The per-point record is sketched below.
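For illustration only, the per-point record described above can be sketched as follows (Python; the class and field names are assumptions introduced here, not part of the invention):

```python
# A minimal sketch of one first point cloud data record, assuming the
# X/Y/Z component layout described above (X = width, Y = depth, Z = height).
from dataclasses import dataclass

@dataclass
class CloudPoint:
    x: float          # X-axis component: first width data
    y: float          # Y-axis component: first depth data (range to the target)
    z: float          # Z-axis component: first height data
    intensity: float  # first reflection intensity data (e.g. on a 0-255 scale)
```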
Step 2: range-image conversion is performed on the first point cloud data set according to a preset correspondence between depth and RGB pixel values to generate first image data.
The first image data comprises a plurality of first pixel point data; each first pixel point data comprises first two-dimensional coordinate data and first pixel data; the first two-dimensional coordinate data comprises first abscissa data and first ordinate data; the first point cloud data corresponds to the first pixel point data.
Here, a range image is an image in which the distance (depth) of each point in the scene serves as the pixel value; a range image directly reflects the geometric shape of the visible surfaces of the scene. By taking the first depth data of each first point cloud data as a pixel value, the autonomous driving control system obtains a range image that reflects the shape of the environmental target corresponding to the first point cloud data set, namely the first image data.
This step specifically comprises:
Step 21: sequentially extract each first point cloud data of the first point cloud data set as current point cloud data; convert the first width data of the current point cloud data into first abscissa data; convert the first height data of the current point cloud data into first ordinate data; poll the first correspondence records of the preset first correspondence table, which reflects the correspondence between depth and RGB pixel values, against the first depth data of the current point cloud data, and if the first depth range field of the currently polled first correspondence record matches the first depth data, extract the first pixel value field of that record as the first pixel data; the first abscissa data and the first ordinate data form the first two-dimensional coordinate data, and the first two-dimensional coordinate data and the first pixel data form the first pixel point data.
The first correspondence table comprises a plurality of first correspondence records; each first correspondence record comprises a first depth range field and a first pixel value field; the first depth range field is either a depth value range or a single depth value.
Here, each first correspondence record in the first correspondence table specifies the correspondence of one depth range to one pixel value. When point cloud data is converted into the pixel point data of a two-dimensional image, the X and Z components of the three-dimensional coordinates, namely the first width data and the first height data, are converted into the abscissa and ordinate of the two-dimensional coordinates, and the Y component, namely the first depth data, is converted into the corresponding pixel value by looking up the first correspondence table; the two-dimensional coordinates and the pixel value together constitute all the information of an individual pixel point of the two-dimensional image, namely the first pixel point data.
It should be noted that the first depth range field of the first correspondence table may be either a depth value range or a single depth value. If the first point cloud data is to be converted one-to-one into the first pixel point data of the range image, the first depth range field should be a single depth value, in which case the first point cloud data and the first pixel point data correspond one-to-one. If massive first point cloud data is to be downsampled, the first depth range field is set to a depth value range, in which case several first point cloud data correspond to one first pixel point data.
Step 22: perform two-dimensional image conversion on the obtained two-dimensional coordinates and pixel values of the plurality of first pixel point data to generate the first image data.
Here, combining all the obtained first pixel point data yields the range image described above, namely the first image data. A minimal sketch of steps 21 and 22 follows.
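For illustration only, assuming the CloudPoint record sketched in step 1, a single-channel stand-in for the depth-to-pixel-value mapping (the text describes RGB values; grayscale is used here purely for brevity), and an illustrative `scale` parameter that maps metric X/Z coordinates to pixel indices:

```python
import numpy as np

def point_cloud_to_range_image(points, depth_table, width, height, scale=10.0):
    """Project each point onto a 2-D grid and color it by depth.

    `depth_table` plays the role of the first correspondence table: a list of
    ((depth_min, depth_max), pixel_value) entries. All parameter names and
    defaults are illustrative assumptions.
    """
    image = np.zeros((height, width), dtype=np.uint8)
    for p in points:
        u = int(p.x * scale) + width // 2    # first abscissa data from X (width)
        v = int(p.z * scale) + height // 2   # first ordinate data from Z (height)
        if not (0 <= u < width and 0 <= v < height):
            continue  # point falls outside the image grid
        for (depth_min, depth_max), pixel_value in depth_table:
            if depth_min <= p.y < depth_max:  # first depth range field matches
                image[v, u] = pixel_value     # first pixel value field
                break
    return image
```

With a table of single depth values the conversion is one-to-one, as the text notes; using depth value ranges downsamples many points onto one pixel.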
Step 3: layered image processing is performed on the first image data according to a preset correspondence between reflection intensity ranges and designated pixel values to generate second image data.
The second image data comprises a plurality of second pixel point data; each second pixel point data comprises second two-dimensional coordinate data and second pixel data; the second two-dimensional coordinate data comprises second abscissa data and second ordinate data; the first pixel point data corresponds to the second pixel point data.
Here, the purpose of layering the range image, namely the first image data, is to highlight the boundaries between the foreground, middle-view, and distant-view (or background) layers of the image so that subsequent steps can effectively extract the foreground region.
This step specifically comprises:
Step 31: sequentially extract each first pixel point data of the first image data as current pixel point data; take the first abscissa data of the current pixel point data as the second abscissa data; take the first ordinate data as the second ordinate data; acquire the first reflection intensity data of the first point cloud data corresponding to the current pixel point data as first current reflection intensity data; poll the second correspondence records of the preset second correspondence table, which reflects the correspondence between reflection intensity ranges and designated pixel values, against the first current reflection intensity data, and if the first reflection intensity range field of the currently polled second correspondence record matches the first current reflection intensity data, extract the first designated pixel value field of that record as the second pixel data; if no record in the second correspondence table matches the first current reflection intensity data, take the first pixel data of the current pixel point data as the second pixel data; the second abscissa data and the second ordinate data form the second two-dimensional coordinate data, and the second two-dimensional coordinate data and the second pixel data form the second pixel point data.
The second correspondence table comprises two second correspondence records; each second correspondence record comprises a first reflection intensity range field and a first designated pixel value field; only two designated pixel values appear in the table.
Here, the first reflection intensity range field of one second correspondence record specifies the high-intensity range, i.e. intensities greater than a first reflection intensity threshold, and its first designated pixel value field is set to a designated pixel value A corresponding to that range; the first reflection intensity range field of the other record specifies the low-intensity range, i.e. intensities smaller than a second reflection intensity threshold, and its first designated pixel value field is set to a designated pixel value B corresponding to that range. The first reflection intensity threshold is much larger than the second: for example, if the reflection intensity of the lidar ranges from 0 to 255, the first reflection intensity threshold may be 250 and the second 10.
Here, the closer a target object is to the lidar, the greater the intensity of the reflected signal; conversely, the farther the target, the smaller the intensity. Point cloud data with high reflection intensity can therefore be regarded, from a visual standpoint, as foreground point cloud data, and the corresponding pixel points in the range image as foreground pixel points; point cloud data with low reflection intensity can be regarded as distant-view (or background) point cloud data, and the corresponding pixel points as distant-view (or background) pixel points.
When the embodiment of the invention performs layered image processing on the range image, namely the first image data, the foreground, middle view, and distant view (or background) are distinguished by the first reflection intensity data of the first point cloud data corresponding to each first pixel point data. By looking up the second correspondence table, all first pixel point data whose reflection intensity exceeds the first reflection intensity threshold are placed in the foreground layer, and all whose reflection intensity falls below the second reflection intensity threshold are placed in the distant-view (or background) layer; the pixel values of the foreground layer are then unified to the designated pixel value A and those of the distant-view (or background) layer to the designated pixel value B, while the original pixel values of the intermediate layer between them are left unchanged.
Step 32: perform two-dimensional image conversion on the two-dimensional coordinates and pixel values of the plurality of second pixel point data to generate the second image data.
Here, recombining the converted pixel points yields the layered image, namely the second image data. A minimal sketch of this step follows.
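For illustration only, assuming a per-pixel intensity image built alongside the range image (so that each pixel carries the reflection intensity of its source point), and using the example figures from the text (thresholds 250 and 10 on a 0-255 scale; A and B chosen here as 255 and 0):

```python
import numpy as np

def layer_image(range_image, intensity_image,
                high_thresh=250, low_thresh=10,
                value_a=255, value_b=0):
    """Sketch of step 3: force high-intensity pixels to the designated
    foreground value A and low-intensity pixels to the designated
    distant-view/background value B, leaving the intermediate layer
    unchanged. Thresholds and values are illustrative assumptions."""
    layered = range_image.copy()
    layered[intensity_image > high_thresh] = value_a  # foreground layer
    layered[intensity_image < low_thresh] = value_b   # distant-view/background
    return layered
```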
Step 4: foreground blob marking is performed on the second image data with a blob detection model to obtain a first blob area.
This step specifically comprises: input the second image data into the blob detection model and perform foreground blob identification according to the depth relationships of the image pixels to obtain a first blob coordinate set corresponding to the foreground blob; then incorporate all second pixel point data of the second image data that fall within the first blob coordinate set into the first blob area.
Here, a blob is a group of connected pixels in an image that share some attribute, such as gray value; the purpose of blob detection is to identify and mark these regions. Commonly used blob detection models include differential-method models, Markov Random Fields (MRF), and the like. Because the image was already layered into foreground and distant view (or background) in step 3, the blob detection model in this step yields better blob extraction, i.e. better foreground extraction. A simple stand-in follows.
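For illustration only, connected-component labeling is used below as a stand-in for the blob detection model (the text leaves the model open; differential methods or MRF-based models are equally possible), with an assumed minimum blob size to discard stray pixels:

```python
import numpy as np
from scipy import ndimage

def foreground_blob_mask(layered_image, value_a=255, min_pixels=20):
    """Sketch of step 4: label connected regions of foreground-valued pixels
    and keep those above a size floor. `min_pixels` is an assumption."""
    labels, count = ndimage.label(layered_image == value_a)
    mask = np.zeros(layered_image.shape, dtype=bool)
    for i in range(1, count + 1):
        region = labels == i
        if region.sum() >= min_pixels:
            mask |= region  # this region joins the first blob area
    return mask
```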
Step 5: the outer-edge area of the first blob area is marked according to a preset outer-edge range parameter to obtain a first outer-edge area; the second pixel point data within the first outer-edge area are marked as first outer-edge pixel point data; and, in the first point cloud data set, the first point cloud data corresponding to the first outer-edge pixel point data are marked as first outer-edge point cloud data.
Here, because flood noise appears at the edges of objects with high reflection intensity, a candidate flood-noise region must be delimited before the noise itself can be identified, so that the subsequent steps can examine the point cloud data inside it. The delimited region is the first outer-edge area, which is in effect a band surrounding the first blob area; the preset outer-edge range parameter defines the width of that band. A minimal sketch follows.
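For illustration only, morphological dilation is one simple way to obtain such a band; here the outer-edge range parameter is assumed to be a pixel radius:

```python
from scipy import ndimage

def outer_edge_band(blob_mask, band_width=3):
    """Sketch of step 5: dilate the blob mask by `band_width` pixels and
    subtract the blob itself, leaving the surrounding band (the first
    outer-edge area). `band_width` is an illustrative assumption."""
    dilated = ndimage.binary_dilation(blob_mask, iterations=band_width)
    return dilated & ~blob_mask
```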
Step 6: the vertical depth of the first blob area is calculated to generate first area depth data.
Here, the autonomous driving control system records the first depth data of the first point cloud data corresponding to each second pixel point data in the first blob area as first blob depth data, and averages all the first blob depth data to obtain the first area depth data, as sketched below.
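For illustration only, assuming a raw per-pixel depth image kept alongside the range image (the range image itself stores mapped pixel values, not raw depths):

```python
import numpy as np

def area_depth(depth_image, blob_mask):
    """Sketch of step 6: the first area depth data is the mean of the
    first depth data over the first blob area."""
    return float(np.mean(depth_image[blob_mask]))
```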
Step 7: flood noise points among the first outer-edge point cloud data are identified and marked according to the first area depth data.
This step specifically comprises: poll each first outer-edge point cloud data, taking the currently polled one as current outer-edge point cloud data; extract its first depth data as current depth data and its first reflection intensity data as second current reflection intensity data; and, when the current depth data matches the first area depth data and the second current reflection intensity data is below a preset flood-noise reflection intensity threshold, mark the current outer-edge point cloud data as a flood noise point.
Here, because flood noise appears at the edges of objects with high reflection intensity, the depth of a flood noise point should be substantially the same as that of the high-reflection-intensity area, i.e. the foreground area, i.e. the first blob area; and because flood noise is noise, its reflection intensity is weak. The embodiment of the invention therefore identifies flood noise in the first outer-edge area from both the depth data and the reflection intensity data of the corresponding point cloud data. The preset flood-noise reflection intensity threshold is a preconfigured reflection intensity threshold; for example, if the reflection intensity of the lidar ranges from 0 to 255, the flood-noise reflection intensity threshold may be 10. A minimal sketch follows.
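For illustration only, working at the pixel level with the masks and images from the previous sketches; the depth-match tolerance is an assumption (the text only says the depths must match):

```python
import numpy as np

def mark_flood_noise(depth_image, intensity_image, edge_band,
                     first_area_depth, depth_tol=0.2, noise_thresh=10):
    """Sketch of step 7: flag outer-edge pixels whose depth matches the blob
    area depth (within `depth_tol`) and whose reflection intensity falls
    below the flood-noise threshold (10 in the text's 0-255 example)."""
    depth_match = np.abs(depth_image - first_area_depth) <= depth_tol
    low_intensity = intensity_image < noise_thresh
    return edge_band & depth_match & low_intensity
```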
Step 8: the first point cloud data marked as flood noise points are filtered out of the first point cloud data set.
Here, once flood-noise marking is complete, the first point cloud data marked as flood noise points can simply be deleted from the first point cloud data set, as sketched below.
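For illustration only, assuming the pixel-level noise mask has been mapped back to a per-point boolean flag (that mapping is the inverse of the projection in step 2 and is kept abstract here):

```python
def filter_point_cloud(points, noise_flags):
    """Sketch of step 8: drop the points flagged as flood noise."""
    return [p for p, is_noise in zip(points, noise_flags) if not is_noise]
```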
Fig. 2 is a block diagram of a device for filtering lidar flood noise in autonomous driving according to a second embodiment of the present invention. The device is applied to the autonomous driving control system described above, which comprises the device and the lidar. The device is a terminal device or server that implements the foregoing method embodiment, or an apparatus that enables such a terminal device or server to implement it; for example, it may be a chip system of that terminal device or server. As shown in fig. 2, the device comprises: a lidar data processing module 101, a range image processing module 102, a layered image processing module 103, a blob detection processing module 104, and a flood noise processing module 105.
The lidar data processing module 101 is configured to perform laser scanning measurement of a first target environment with the lidar during autonomous driving and to convert the scanning result into point cloud data to obtain a corresponding first point cloud data set, the first point cloud data set comprising a plurality of first point cloud data, each comprising first three-dimensional coordinate data and first reflection intensity data, the first three-dimensional coordinate data comprising first depth data.
The range image processing module 102 is configured to perform range-image conversion on the first point cloud data set according to a preset correspondence between depth and RGB pixel values to generate first image data, the first image data comprising a plurality of first pixel point data, the first point cloud data corresponding to the first pixel point data.
The layered image processing module 103 is configured to perform layered image processing on the first image data according to a preset correspondence between reflection intensity ranges and designated pixel values to generate second image data, the second image data comprising a plurality of second pixel point data, the first pixel point data corresponding to the second pixel point data.
The blob detection processing module 104 is configured to perform foreground blob marking on the second image data with a blob detection model to obtain a first blob area.
The flood noise processing module 105 is configured to mark the outer-edge area of the first blob area according to a preset outer-edge range parameter to obtain a first outer-edge area; mark the second pixel point data within the first outer-edge area as first outer-edge pixel point data; mark, in the first point cloud data set, the first point cloud data corresponding to the first outer-edge pixel point data as first outer-edge point cloud data; calculate the vertical depth of the first blob area to generate first area depth data; identify and mark flood noise points among the first outer-edge point cloud data according to the first area depth data; and filter the first point cloud data marked as flood noise points out of the first point cloud data set.
The device for filtering lidar flood noise in autonomous driving provided by this embodiment of the invention can execute the method steps of the method embodiment above; its implementation principle and technical effect are similar and are not repeated here.
It should be noted that the division of the above device into modules is only a logical division; in an actual implementation the modules may be wholly or partially integrated into one physical entity or kept physically separate. The modules may all be implemented as software invoked by a processing element, all in hardware, or partly as software invoked by a processing element and partly in hardware. For example, the lidar data processing module may be a separately established processing element, or it may be integrated into a chip of the device, or it may be stored in the memory of the device as program code that a processing element of the device calls to execute the module's function. The other modules are implemented similarly. In addition, the modules may be integrated together in whole or in part, or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit in the hardware of a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented as program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of invoking the program code. As yet another example, these modules may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product comprises one or more computer instructions; when the computer program instructions are loaded and executed on a computer, the procedures or functions described in the foregoing method embodiments are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, Bluetooth, microwave). The computer-readable storage medium may be magnetic media (e.g., a floppy disk, hard disk, or magnetic tape), optical media (e.g., a DVD), or semiconductor media (e.g., a Solid State Disk (SSD)), among others.
Fig. 3 is a schematic structural diagram of an electronic device according to a third embodiment of the present invention. The electronic device may be the terminal device or the server, or may be a terminal device or a server connected to the terminal device or the server and implementing the method according to the embodiment of the present invention. As shown in fig. 3, the electronic device may include: a processor 301 (e.g., a CPU), a memory 302, a transceiver 303; the transceiver 303 is coupled to the processor 301, and the processor 301 controls the transceiving operation of the transceiver 303. Various instructions may be stored in memory 302 for performing various processing functions and implementing the processing steps described in the foregoing method embodiments. Preferably, the electronic device according to an embodiment of the present invention further includes: a power supply 304, a system bus 305, and a communication port 306. The system bus 305 is used to implement communication connections between the elements. The communication port 306 is used for connection communication between the electronic device and other peripherals.
The system bus 305 mentioned in fig. 3 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or one type of bus. The communication interface is used to enable communication between the database access apparatus and other devices (such as clients, read-write libraries, and read-only libraries). The memory may include Random Access Memory (RAM) and may also include non-volatile memory, such as at least one disk memory.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), a Graphics Processing Unit (GPU), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components.
It should be noted that the embodiment of the present invention also provides a computer-readable storage medium, which stores instructions that, when executed on a computer, cause the computer to execute the method and the processing procedure provided in the above-mentioned embodiment.
The embodiment of the present invention further provides a chip for executing the instructions, where the chip is configured to execute the processing steps described in the foregoing method embodiment.
In summary, the embodiments of the invention provide a method, a device, an electronic device, and a computer-readable storage medium for filtering lidar flood noise in autonomous driving. They address the reduced accuracy of lidar measurement data in wet or rainy weather and further improve the environmental adaptability, safety, and stability of autonomous driving.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), flash memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (9)

1. A method for filtering lidar flood noise in autonomous driving, characterized in that the method comprises:
during autonomous driving, an autonomous driving control system performs laser scanning measurement of a first target environment with a lidar and converts the scanning result into point cloud data to obtain a corresponding first point cloud data set, the first point cloud data set comprising a plurality of first point cloud data, each first point cloud data comprising first three-dimensional coordinate data and first reflection intensity data, and the first three-dimensional coordinate data comprising first depth data;
performing range-image conversion on the first point cloud data set according to a preset correspondence between depth and RGB pixel values to generate first image data, the first image data comprising a plurality of first pixel point data, the first point cloud data corresponding to the first pixel point data;
performing layered image processing on the first image data according to a preset correspondence between reflection intensity ranges and designated pixel values to generate second image data, the second image data comprising a plurality of second pixel point data, the first pixel point data corresponding to the second pixel point data;
performing foreground blob marking on the second image data with a blob detection model to obtain a first blob area;
marking the outer-edge area of the first blob area according to a preset outer-edge range parameter to obtain a first outer-edge area; marking the second pixel point data within the first outer-edge area as first outer-edge pixel point data; and, in the first point cloud data set, marking the first point cloud data corresponding to the first outer-edge pixel point data as first outer-edge point cloud data;
calculating the vertical depth of the first blob area to generate first area depth data;
identifying and marking flood noise points among the first outer-edge point cloud data according to the first area depth data; and
filtering the first point cloud data marked as flood noise points out of the first point cloud data set.
2. The method for filtering lidar flood noise in autonomous driving according to claim 1, characterized in that:
the first three-dimensional coordinate data further comprises first width data and first height data;
the first pixel point data comprises first two-dimensional coordinate data and first pixel data, the first two-dimensional coordinate data comprising first abscissa data and first ordinate data; and
the second pixel point data comprises second two-dimensional coordinate data and second pixel data, the second two-dimensional coordinate data comprising second abscissa data and second ordinate data.
3. The method for filtering lidar flood noise in autonomous driving according to claim 2, characterized in that performing range-image conversion on the first point cloud data set according to the preset correspondence between depth and RGB pixel values to generate first image data specifically comprises:
sequentially extracting each first point cloud data of the first point cloud data set as current point cloud data; converting the first width data of the current point cloud data into first abscissa data; converting the first height data of the current point cloud data into first ordinate data; polling the first correspondence records of a preset first correspondence table, which reflects the correspondence between depth and RGB pixel values, against the first depth data of the current point cloud data, and if the first depth range field of the currently polled first correspondence record matches the first depth data, extracting the first pixel value field of that record as the first pixel data; the first abscissa data and the first ordinate data forming the first two-dimensional coordinate data, and the first two-dimensional coordinate data and the first pixel data forming the first pixel point data; the first correspondence table comprising a plurality of first correspondence records, each comprising a first depth range field and a first pixel value field, the first depth range field being a depth value range or a single depth value; and
performing two-dimensional image conversion on the two-dimensional coordinates and pixel values of the plurality of first pixel point data to generate the first image data.
4. The method for filtering lidar flood noise in autonomous driving according to claim 2, characterized in that performing layered image processing on the first image data according to the preset correspondence between reflection intensity ranges and designated pixel values to generate second image data specifically comprises:
sequentially extracting each first pixel point data of the first image data as current pixel point data; taking the first abscissa data of the current pixel point data as the second abscissa data; taking the first ordinate data as the second ordinate data; acquiring the first reflection intensity data of the first point cloud data corresponding to the current pixel point data as first current reflection intensity data; polling the second correspondence records of a preset second correspondence table, which reflects the correspondence between reflection intensity ranges and designated pixel values, against the first current reflection intensity data, and if the first reflection intensity range field of the currently polled second correspondence record matches the first current reflection intensity data, extracting the first designated pixel value field of that record as the second pixel data; if no record in the second correspondence table matches the first current reflection intensity data, taking the first pixel data of the current pixel point data as the second pixel data; the second abscissa data and the second ordinate data forming the second two-dimensional coordinate data, and the second two-dimensional coordinate data and the second pixel data forming the second pixel point data; the second correspondence table comprising two second correspondence records, each comprising a first reflection intensity range field and a first designated pixel value field, and only two designated pixel values appearing in the table; and
performing two-dimensional image conversion on the two-dimensional coordinates and pixel values of the plurality of second pixel point data to generate the second image data.
5. The method for filtering lidar flood noise in autonomous driving according to claim 2, characterized in that performing foreground blob marking on the second image data with a blob detection model to obtain a first blob area specifically comprises:
inputting the second image data into the blob detection model and performing foreground blob identification according to the depth relationships of the image pixels to obtain a first blob coordinate set corresponding to the foreground blob; and incorporating the second pixel point data of the second image data that fall within the first blob coordinate set into the first blob area.
6. The method for filtering lidar flood noise in autonomous driving according to claim 2, characterized in that identifying and marking flood noise points among the first outer-edge point cloud data according to the first area depth data specifically comprises:
polling each first outer-edge point cloud data and taking the currently polled first outer-edge point cloud data as current outer-edge point cloud data; extracting the first depth data of the current outer-edge point cloud data as current depth data and its first reflection intensity data as second current reflection intensity data; and, when the current depth data matches the first area depth data and the second current reflection intensity data is below a preset flood-noise reflection intensity threshold, marking the current outer-edge point cloud data as a flood noise point.
7. A device for implementing the method for filtering floodlight noise of a laser radar in automatic driving according to any one of claims 1 to 6, wherein the device is applied to an automatic driving control system comprising the device and a laser radar, and the device comprises: a laser radar data processing module, a range image processing module, a layered image processing module, a spot detection processing module and a floodlight noise processing module;
the laser radar data processing module is used for performing laser scanning measurement on a first target environment through the laser radar in the automatic driving process and converting the scanning result into point cloud data, so as to obtain a corresponding first point cloud data set; wherein the first point cloud data set comprises a plurality of first point cloud data; the first point cloud data comprises first three-dimensional coordinate data and first reflection intensity data; and the first three-dimensional coordinate data comprises first depth data;
the range image processing module is used for performing range image conversion processing on the first point cloud data set according to a preset corresponding relation between depth and RGB pixel values to generate first image data; wherein the first image data includes a plurality of first pixel point data; the first point cloud data corresponds to the first pixel point data;
the layered image processing module is used for performing layered image processing on the first image data according to a corresponding relation between a preset reflection intensity range and a specified pixel value to generate second image data; wherein the second image data includes a plurality of second pixel point data; the first pixel point data corresponds to the second pixel point data;
the spot detection processing module is used for performing foreground spot marking processing on the second image data by using a spot detection model to obtain a first spot area;
the floodlight noise processing module is used for marking the outer edge area of the first spot area according to preset outer edge range parameters to obtain a first outer edge area; marking the second pixel point data in the first outer edge area as first outer edge pixel point data; in the first point cloud data set, marking the first point cloud data corresponding to the first outer edge pixel point data as first outer edge point cloud data; calculating the vertical depth of the first spot area to generate first region depth data; performing floodlight noise point identification and marking processing on each first outer edge point cloud data according to the first region depth data; and filtering the first point cloud data marked as floodlight noise points from the first point cloud data set.
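Editorial note: a sketch of how the floodlight noise processing module's first two duties could look — deriving the first outer edge area by dilating the spot mask, and computing the first region depth data. It assumes OpenCV/NumPy; the ring width, the use of the median as the spot's vertical depth, and the function name are assumptions, since the patent only names "preset outer edge range parameters".

```python
import cv2
import numpy as np

def outer_edge_and_depth(spot_mask, depth_image, edge_width=3):
    """Derive the first outer edge area and the first region depth data.

    spot_mask   : (H, W) uint8 mask of the first spot area (1 = spot pixel).
    depth_image : (H, W) float array of per-pixel depths from the range image.
    edge_width  : assumed outer edge range parameter, in pixels.
    Returns (edge_mask, region_depth): a boolean mask of the ring just
    outside the spot, and the median spot depth taken as its vertical depth.
    """
    kernel = np.ones((2 * edge_width + 1, 2 * edge_width + 1), np.uint8)
    dilated = cv2.dilate(spot_mask, kernel)
    edge_mask = (dilated > 0) & (spot_mask == 0)   # ring around the spot
    region_depth = float(np.median(depth_image[spot_mask > 0]))
    return edge_mask, region_depth
```

Pixels under `edge_mask` map back to their originating point cloud points (the first outer edge point cloud data), which are then screened by the depth-match and intensity-threshold rule of claim 6.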
8. An electronic device, comprising: a memory, a processor, and a transceiver;
the processor is configured to couple with the memory, and to read and execute the instructions in the memory so as to implement the method steps of any one of claims 1 to 6;
the transceiver is coupled to the processor, and the processor controls the transceiver to transmit and receive messages.
9. A computer-readable storage medium having stored thereon computer instructions which, when executed by a computer, cause the computer to perform the method of any of claims 1-6.
CN202110626524.1A 2021-06-04 2021-06-04 Method and device for filtering floodlight noise of laser radar in automatic driving Withdrawn CN113325388A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110626524.1A CN113325388A (en) 2021-06-04 2021-06-04 Method and device for filtering floodlight noise of laser radar in automatic driving

Publications (1)

Publication Number Publication Date
CN113325388A true CN113325388A (en) 2021-08-31

Family

ID=77419732

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110626524.1A Withdrawn CN113325388A (en) 2021-06-04 2021-06-04 Method and device for filtering floodlight noise of laser radar in automatic driving

Country Status (1)

Country Link
CN (1) CN113325388A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114966604A (en) * 2022-05-26 2022-08-30 苏州轻棹科技有限公司 Target detection processing method for partition point cloud
CN114966604B (en) * 2022-05-26 2024-05-03 苏州轻棹科技有限公司 Target detection processing method for partition point cloud
CN115825982A (en) * 2023-02-02 2023-03-21 深圳煜炜光学科技有限公司 Method and system for scanning point cloud data of unmanned aerial vehicle in rainy environment
CN116184357A (en) * 2023-03-07 2023-05-30 之江实验室 Ground point cloud data processing method and device, electronic device and storage medium
CN116184357B (en) * 2023-03-07 2023-08-15 之江实验室 Ground point cloud data processing method and device, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN113325388A (en) Method and device for filtering floodlight noise of laser radar in automatic driving
CN113012210B (en) Method and device for generating depth map, electronic equipment and storage medium
CN113657224B (en) Method, device and equipment for determining object state in vehicle-road coordination
CN112513679B (en) Target identification method and device
WO2022142628A1 (en) Point cloud data processing method and device
CN111080662A (en) Lane line extraction method and device and computer equipment
CN110782465B (en) Ground segmentation method and device based on laser radar and storage medium
CN115436910B (en) Data processing method and device for performing target detection on laser radar point cloud
WO2022206517A1 (en) Target detection method and apparatus
CN114296056A (en) Laser radar external parameter calibration method, device, equipment and storage medium
CN112683228A (en) Monocular camera ranging method and device
CN113421217A (en) Method and device for detecting travelable area
WO2023024087A1 (en) Method, apparatus and device for processing laser radar point cloud, and storage medium
CN114764885A (en) Obstacle detection method and device, computer-readable storage medium and processor
CN113052916A (en) Laser radar and camera combined calibration method using specially-made calibration object
CN117078767A (en) Laser radar and camera calibration method and device, electronic equipment and storage medium
CN116642490A (en) Visual positioning navigation method based on hybrid map, robot and storage medium
CN111336938A (en) Robot and object distance detection method and device thereof
CN113112551B (en) Camera parameter determining method and device, road side equipment and cloud control platform
CN115376105A (en) Method and device for determining travelable area, electronic device and storage medium
CN113409376A (en) Method for filtering laser radar point cloud based on depth estimation of camera
CN111239740A (en) Method and equipment for removing ray noise
CN115457283A (en) Method and device for processing point cloud noise point
CN117392000B (en) Noise removing method and device, electronic equipment and storage medium
CN113219486B (en) Positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210831