CN117751301A - Method, device, equipment and storage medium for processing laser radar point cloud

Info

Publication number
CN117751301A
Authority
CN
China
Prior art keywords
point
reflectivity
point cloud
cloud data
value
Prior art date
Legal status
Pending
Application number
CN202180100861.5A
Other languages
Chinese (zh)
Inventor
Song Yan (宋妍)
Current Assignee
Suteng Innovation Technology Co Ltd
Original Assignee
Suteng Innovation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suteng Innovation Technology Co Ltd filed Critical Suteng Innovation Technology Co Ltd
Publication of CN117751301A

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/931 — Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/02 — Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/89 — Lidar systems specially adapted for mapping or imaging
    • G01S7/497 — Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A method, device, equipment and storage medium for processing a laser radar point cloud. The method comprises: acquiring point cloud data detected by a laser radar (S101); judging whether the point cloud data contains a high-reflectivity object (S102); and, when it is determined that the point cloud data contains a high-reflectivity object, determining the pseudo point cloud in the point cloud data according to a preset discrimination condition and the position information and reflectivity corresponding to each point in the point cloud data (S103). Based on the preset discrimination condition and the position information and reflectivity of each point, the pseudo point cloud corresponding to the high-reflectivity object can be accurately identified, which improves the quality of the point cloud and the accuracy of laser radar measurement.

Description

Method, device, equipment and storage medium for processing laser radar point cloud

Technical Field
The present disclosure relates to the field of lidar technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing a lidar point cloud.
Background
A laser radar is an active remote-sensing device that uses a laser as its emitting light source and relies on photoelectric detection technology; it is an advanced detection approach that combines laser technology with modern photoelectric detection. The system consists of a detection and sensing system, a data processing system, and the like. Its working principle is to emit a detection signal (laser light) toward a target and then process the received echo signal, from which information such as the target's distance, size, speed and reflectivity can be obtained. It offers high resolution and sensitivity, strong anti-interference capability, and is unaffected by darkness. Laser radar is therefore widely used in fields such as autonomous driving, logistics vehicles, robots, vehicle-road coordination and intelligent public transportation.
In practical applications, however, a high-reflectivity expansion phenomenon often occurs: a ring of low-reflectivity points becomes attached around a highly reflective plate. This causes perception misjudgments, degrades the quality of the point cloud and reduces the accuracy of distance measurement.
Technical problem
One of the purposes of the embodiments of the present application is to provide a method, device, equipment and storage medium for processing a laser radar point cloud, so as to solve the technical problem in the prior art that the pseudo point cloud produced by the high-reflectivity expansion phenomenon cannot be accurately identified.
Technical solution
In a first aspect, an embodiment of the present application provides a method for processing a laser radar point cloud, including:
acquiring point cloud data detected by a laser radar;
judging whether the point cloud data contains a high-reflectivity object;
and, when it is determined that the point cloud data contains a high-reflectivity object, determining a pseudo point cloud in the point cloud data according to a preset discrimination condition and the position information and reflectivity corresponding to each point in the point cloud data.
In a second aspect, an embodiment of the present application provides an apparatus for processing a laser radar point cloud, including:
the acquisition unit is used for acquiring point cloud data detected by the laser radar;
the judging unit is used for judging whether the point cloud data contains a high-reflectivity object;
and the determining unit is used for determining the pseudo point cloud in the point cloud data according to a preset discrimination condition and the position information and reflectivity corresponding to each point in the point cloud data when it is determined that the point cloud data contains a high-reflectivity object.
In a third aspect, an embodiment of the present application provides an apparatus for processing a lidar point cloud, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the steps of the method for processing a lidar point cloud according to the first aspect are implemented when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium, which may be non-volatile or volatile, and which stores a computer program which, when executed by a processor, implements the steps of a method for processing a lidar point cloud as described in the first aspect above.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on a device for processing a lidar point cloud, causes the device to perform the steps of the method for processing a lidar point cloud according to the first aspect described above.
Advantageous effects
Compared with the prior art, the embodiments of the present application have the following beneficial effects: point cloud data detected by a laser radar is acquired; whether the point cloud data contains a high-reflectivity object is judged; and when it is determined that the point cloud data contains a high-reflectivity object, the pseudo point cloud in the point cloud data is determined according to a preset discrimination condition and the position information and reflectivity corresponding to each point in the point cloud data. Based on the preset discrimination condition and the position information and reflectivity of each point, the pseudo point cloud corresponding to the high-reflectivity object can be accurately determined and then accurately removed, which improves the quality of the point cloud and the accuracy of laser radar measurement.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings required for the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic flow chart of a method of processing a lidar point cloud provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic flow chart diagram illustrating another method of processing a lidar point cloud according to an example embodiment of the present application;
FIG. 3 is a specific flow chart of step S203 of a method of processing a lidar point cloud, as illustrated in an exemplary embodiment of the present application;
FIG. 4 is a schematic view of a point cloud location and orientation according to an exemplary embodiment of the present application;
FIGS. 5a and 5b are graphs of point cloud data effects in one scenario provided by embodiments of the present application;
FIGS. 6a and 6b are graphs of point cloud data effects in another scenario provided by embodiments of the present application;
FIGS. 7a and 7b are graphs of point cloud data effects in yet another scenario provided by embodiments of the present application;
FIGS. 8a and 8b are graphs of point cloud data effects in yet another scenario provided by embodiments of the present application;
FIGS. 9a and 9b are graphs of point cloud data effects in yet another scenario provided by embodiments of the present application;
FIG. 10 is a schematic diagram of an apparatus for processing a lidar point cloud according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an apparatus for processing a laser radar point cloud according to another embodiment of the present application.
Embodiments of the invention
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The flow diagrams depicted in the figures are merely illustrative and not necessarily all of the elements and operations/steps are included or performed in the order described. For example, some operations/steps may be further divided, combined, or partially combined, so that the order of actual execution may be changed according to actual situations.
A laser radar is an active remote-sensing device that uses a laser as its emitting light source and relies on photoelectric detection technology; it is an advanced detection approach combining laser technology with modern photoelectric detection. The system consists of a transmitting system, a receiving system, a scanning control system, a data processing system, and the like. Its working principle is to emit a detection signal (laser light) toward a target and then process the received echo signal, from which information such as the target's distance, size, speed and reflectivity can be obtained. It offers high resolution and sensitivity, strong anti-interference capability, and is unaffected by darkness. Laser radar is therefore widely used in fields such as autonomous driving, logistics vehicles, robots, vehicle-road coordination and intelligent public transportation.
However, the laser radar beam is not ideal, and its spot has a certain area. When a highly reflective object (high-reflectivity object) exists in the background of a detection target, the laser radar suffers from inaccurate measurement. For example, the diffusely reflected signal of a light spot that falls on a high-reflectivity object affects the received echo signal, so that the point cloud contour around the high-reflectivity object spreads outward and forms a pseudo point cloud (a false point cloud with low reflectivity). This degrades the quality of the point cloud, introduces errors into the laser radar's detection and reduces ranging accuracy. This is the so-called high-reflectivity expansion phenomenon.
In the prior art, the high-reflectivity expansion phenomenon cannot be accurately distinguished. If it is left unprocessed, the pseudo point cloud may form false target objects, seriously affecting the accuracy of the laser radar's detection result. Conversely, the prior art sometimes removes too many points near a high-reflectivity object, so that points belonging to real objects are removed by mistake, which also affects the accuracy of the detection result. A method for determining the pseudo point cloud in a laser radar point cloud is therefore urgently needed.
In view of the above, the present application provides a method for processing a laser radar point cloud: point cloud data detected by the laser radar is acquired; whether the point cloud data contains a high-reflectivity object is judged; and when it is determined that the point cloud data contains a high-reflectivity object, the pseudo point cloud in the point cloud data is determined according to a preset discrimination condition and the position information and reflectivity corresponding to each point. Based on the preset discrimination condition and the position information and reflectivity of each point, the pseudo point cloud corresponding to the high-reflectivity object can be accurately and effectively determined and then accurately removed, which improves the quality of the point cloud, improves the accuracy of laser radar measurement and ensures the stability of the point cloud image.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of a method for processing a laser radar point cloud according to an exemplary embodiment of the present application. The execution subject of the method is a device for processing a laser radar point cloud. The device includes, but is not limited to, mobile terminals such as smartphones, tablet computers, personal digital assistants (Personal Digital Assistant, PDA) and desktop computers, and may also include various types of servers. For example, the server may be a stand-alone server, or a cloud server providing basic cloud computing services such as cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), big data and artificial intelligence platforms.
The embodiments of the present application do not limit the specific type of the device for processing the laser radar point cloud. The method of processing a laser radar point cloud shown in FIG. 1 may include steps S101 to S103, which are specifically as follows:
s101: and acquiring point cloud data detected by the laser radar.
Illustratively, the laser radar in this embodiment consists of a detection and sensing system, a data processing system, and the like. When the laser radar includes an optical scanning element, its detection and sensing system includes a transmitting system, a receiving system, a scanning control system and a transmit-receive control system. A laser radar that includes an optical scanning element may be, for example, a mechanical laser radar, a Micro-Electro Mechanical System (MEMS) laser radar, or a rotating-mirror scanning laser radar. The scanning control system of the mechanical laser radar includes a mechanical rotation control system; the scanning control system of the MEMS laser radar includes a MEMS galvanometer control system; and the scanning control system of the rotating-mirror scanning laser radar includes a rotating-mirror scanning control system. When the laser radar does not include an optical scanning element, for example a Flash laser radar, its detection and sensing system includes a transmitting system, a receiving system and a transmit-receive control system. The data processing system processes the data obtained by the receiving system so as to output point cloud data; it may be integrated inside the laser radar or located outside it, which is not limited here.
Wherein, it is understood that the detection sensing system of the laser radar is in communication connection with the data processing system.
Wherein the transmitting system comprises a laser transmitter and the receiving system comprises a photodetector.
The laser transmitter is used as a light source to transmit a detection signal to the detected object, and the photoelectric detector receives an echo signal reflected by the detected object, so that the data processing system can obtain point cloud data corresponding to the detected object according to the echo signal and the detection signal.
For example, the laser transmitter is used as a light source to transmit light beams to the object to be detected, and the photoelectric detector receives echo signals reflected by the object to be detected, so that each data point corresponding to the object to be detected can be obtained according to the echo signals and the detection signals, and the laser radar point cloud corresponding to the object to be detected can be acquired.
It should be noted that, if the device for processing the point cloud of the laser radar is integrated with the scanning device such as the laser radar, the device for processing the point cloud of the laser radar can directly control the scanning device such as the laser radar to collect the point cloud data corresponding to the object to be detected. If the equipment for processing the point cloud of the laser radar is connected with the scanning device such as the laser radar, the laser radar detects the object to be detected to obtain the point cloud data, and then the point cloud data is sent to the equipment for processing the point cloud of the laser radar. This is merely illustrative and is not limiting.
S102: and judging whether the point cloud data contains a high-reflection object or not.
Illustratively, the point cloud data detected by the laser radar includes position information and reflectivity corresponding to each point. The position information corresponding to each point may include the coordinates of the point, for example coordinates whose origin is the mounting position of the laser radar, representing the offset of the point.
The reflectivity of each point in the point cloud data is compared with a first reflectivity threshold, and whether the point cloud data contains a high-reflectivity object is judged according to the comparison result.
Optionally, in one possible implementation manner, S102 may include S1021 to S1024, which are specifically as follows:
s1021: the reflectivity of each point in the point cloud data is determined.
The point cloud data detected by the laser radar are obtained, and the reflectivity corresponding to each point is included. Therefore, when the point cloud data detected by the laser radar is acquired, the reflectivity of each point in the point cloud data can be extracted.
S1022: the number of points having a reflectivity greater than the first reflectivity threshold is determined.
And comparing the reflectivity corresponding to each point in the point cloud data with a first reflectivity threshold value, wherein the obtained comparison result is that the reflectivity of the point is larger than the first reflectivity threshold value or the reflectivity of the point is smaller than or equal to the first reflectivity threshold value.
And counting the number of points with reflectivity larger than a first reflectivity threshold in the point cloud data according to the comparison result corresponding to each point in the point cloud data.
S1023: when the detected quantity reaches a preset quantity threshold value, the judgment result is that the point cloud data contains high-reflectivity objects.
The preset number threshold can be set and adjusted by a user according to actual conditions, and the preset number threshold is not limited. And detecting whether the number of points with reflectivity larger than a first reflectivity threshold in the point cloud data reaches a preset number threshold. When the detected quantity reaches a preset quantity threshold, the point cloud data is proved to contain the point cloud data corresponding to the high-reflection object, and the judgment result is marked as the point cloud data containing the high-reflection object.
S1024: and when the detected quantity does not reach the preset quantity threshold value, judging that the point cloud data does not contain the high-reflection object.
When the quantity is detected to not reach the preset quantity threshold, proving that the point cloud data does not contain the point cloud data corresponding to the high-reflection object, and marking the judgment result as that the point cloud data does not contain the high-reflection object.
Note that S1023 and S1024 are parallel alternatives in this embodiment: S1024 is not executed after S1023; rather, either S1023 or S1024 is executed depending on the scenario. This is not limited here.
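As a minimal illustrative sketch (not the patented implementation itself), the counting check of S1021 to S1024 can be written as follows. The array name refl and the concrete threshold values are assumptions chosen for illustration, since the application leaves both thresholds to be set according to the actual situation.

    import numpy as np

    # Assumed example values; the application leaves both thresholds configurable.
    FIRST_REFL_THRESHOLD = 0.8   # first reflectivity threshold
    QUANTITY_THRESHOLD = 10      # preset quantity threshold

    def contains_high_reflectivity_object(refl: np.ndarray) -> bool:
        """S1021-S1024 sketch: count the points whose reflectivity exceeds the
        first reflectivity threshold and compare the count with the preset
        quantity threshold."""
        num_high = int(np.count_nonzero(refl > FIRST_REFL_THRESHOLD))  # S1021/S1022
        return num_high >= QUANTITY_THRESHOLD                          # S1023/S1024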
S103: and when the judgment result is that the point cloud data contains the high-reflectivity object, determining the pseudo point cloud in the point cloud data according to the preset judgment condition and the position information and the reflectivity corresponding to each point in the point cloud data.
When the judgment result is that the point cloud data contains the high-reflectivity object, each point in the point cloud data is traversed, and whether each point meets the preset judgment condition is judged according to the position information and the reflectivity corresponding to each point. For each point in the point cloud data, when the point is detected to meet the preset judging condition, judging the point as a pseudo point cloud; and when the point is detected to not meet the preset judging condition, judging that the point does not belong to the pseudo point cloud.
In the above embodiment, the point cloud data detected by the laser radar is acquired; judging whether the point cloud data contains a high-reflection object or not; and when the judgment result is that the point cloud data contains the high-reflectivity object, determining the pseudo point cloud in the point cloud data according to the preset judgment condition and the position information and the reflectivity corresponding to each point in the point cloud data. Based on preset discrimination conditions and the position information and reflectivity corresponding to each point, the pseudo point clouds in the point cloud data corresponding to the high-reflection object can be accurately determined, and then the pseudo point clouds can be accurately removed, so that the quality of the point clouds is improved, the accuracy of laser radar measurement is improved, and the stability of the point cloud image is ensured.
Optionally, in one possible implementation manner, after S103, the method for processing a laser radar point cloud provided in the present application may further include: and eliminating the pseudo point cloud in the point cloud data.
Illustratively, when it is determined that a point belongs to a pseudo point cloud, the point is deleted, i.e., the point is removed from the point cloud data. Alternatively, each time it is determined that a point belongs to the pseudo point cloud, the point may be directly removed from the point cloud data.
Each time a point is determined to belong to a pseudo point cloud, the point may be marked, that is, the point is marked as a pseudo point cloud, and if the point is determined not to belong to a pseudo point cloud, the point is not marked. And after all the points in the point cloud data are processed, uniformly eliminating all the points marked as pseudo point clouds. Specifically, the coordinates of each point belonging to the pseudo point cloud are recorded during marking, each pseudo point cloud is found in the point cloud data according to the coordinates of the point, and the pseudo point clouds are removed uniformly. This is merely illustrative and is not limiting.
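The "mark first, then remove together" variant described above can be sketched as follows. This is only an assumed illustration in which the point cloud is stored as NumPy arrays points (N x 3 coordinates) and refl (N reflectivities), with a boolean mask is_pseudo filled in while the points are traversed.

    import numpy as np

    def remove_pseudo_points(points: np.ndarray, refl: np.ndarray,
                             is_pseudo: np.ndarray):
        """Remove all points previously marked as pseudo point cloud in one pass,
        returning only the remaining effective points."""
        keep = ~is_pseudo
        return points[keep], refl[keep]

A caller would set is_pseudo[i] = True whenever point i meets the preset discrimination condition, and invoke the function once after the traversal is finished.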
In the embodiment, the pseudo point cloud in the point cloud data is removed, so that the rest points in the point cloud data are all effective points, the accuracy of laser radar measurement can be improved by measuring based on the effective points, the interference of the pseudo point cloud is avoided, the quality of the point cloud is improved, and the stability of the point cloud image is ensured.
Referring to fig. 2, fig. 2 is a schematic flowchart of another method for processing a lidar point cloud according to an exemplary embodiment of the present application, and in some possible implementations of the present application, another method for processing a lidar point cloud as shown in fig. 2 may include: s201 to S205. It should be noted that, S201 to S202 and S205 in the present embodiment are the same as S101 to S102 and S103 in the embodiment corresponding to fig. 1, and reference may be made to descriptions of S101 to S102 and S103 in the embodiment corresponding to fig. 1, which are not repeated herein. S203 to S204 are specifically as follows:
s203: determining discrimination information according to the position information and reflectivity corresponding to each point in the point cloud data.
Illustratively, the discrimination information is used for assisting in generating preset discrimination conditions for discriminating whether each point in the point cloud data belongs to the pseudo point cloud. The discrimination information may include a number of quantity values and a number of discrimination sub-conditions. Each quantity value and each discrimination sub-condition are determined according to the position information and the reflectivity corresponding to each point in the point cloud data.
S204: based on the discrimination information, a preset discrimination condition is generated.
For example, the preset criterion may be generated by arbitrarily combining the number of quantity values and the number of criterion sub-conditions based on a preset rule. The preset discriminating condition may include a plurality of preset discriminating sub-conditions, for example, when a plurality of quantity values and a plurality of discriminating sub-conditions are arbitrarily combined based on a preset rule, each combination result corresponds to one preset discriminating sub-condition. Based on a preset rule, a plurality of quantity values and a plurality of judging sub-conditions are subjected to various combinations, and a plurality of corresponding preset judging sub-conditions are obtained.
Referring to fig. 3, fig. 3 is a specific flowchart illustrating step S203 of a method for processing a lidar point cloud according to an exemplary embodiment of the present application; optionally, in some possible implementations of the present application, S203 may include S2031 to S2037, which are specifically as follows:
s2031: and determining a neighborhood corresponding to the target point in the point cloud data.
Illustratively, the target point represents any point in the point cloud data. For example, in the processing procedure, each point in the point cloud data may be processed according to a preset sequence, and each processed point is the target point. The neighborhood is the area where the data analysis is performed on the target point.
By way of example, the points in the point cloud data are traversed sequentially with a sliding window. Taking a target point as an example, a two-dimensional window covering a preset range around the target point is selected as the data analysis area, i.e., the neighborhood corresponding to the target point. The preset range may be set according to the actual situation and is not limited here. For example, an L×L (e.g., 11×11) two-dimensional window around the target point may be selected as the data analysis area, i.e., as the neighborhood corresponding to the target point, where one L represents the window length in the lateral direction and the other L represents the window length in the longitudinal direction.
Optionally, when determining the neighborhood corresponding to the target point, the target point may be taken as a center, and a two-dimensional window around the target point l×l is selected as the neighborhood corresponding to the target point. The target point can also be used as the point of the upper left corner, the lower left corner, the upper right corner, the lower right corner and the like, and a two-dimensional window of L x L around the target point is selected as the neighborhood corresponding to the target point based on the point. This is merely illustrative and is not limiting.
Optionally, if the area corresponding to the pseudo point cloud is detected to be larger than a preset area, target points are selected by skipping at intervals so as to expand the coverage of the neighborhood, while the amount of data processed in a single pass remains L×L. For example, the area of the pseudo point cloud that a high-reflectivity object may generate is first estimated; if this area is judged to be larger than the preset area, target points are selected by skipping at intervals to expand the coverage of the neighborhood, which also speeds up the processing of the point cloud data.
Alternatively, the window lengths in the lateral and longitudinal directions may differ; for example, an L×H two-dimensional window around the target point is selected as the data analysis area, where L may represent the window length in the lateral direction and H the window length in the longitudinal direction, or H may be the horizontal window length and L the vertical window length. Alternatively, the processing of the target point may be a one-dimensional operation and may be adjusted according to the actual situation. This is merely illustrative and is not limiting.
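A minimal sketch of the window selection in S2031, assuming the point cloud is organized as a two-dimensional grid (one row per scan line) stored in a NumPy array; the window size of 11 and the centred placement are only example choices, since the text also allows corner-anchored windows and unequal side lengths.

    import numpy as np

    WINDOW = 11  # assumed L = 11; the application leaves the window size configurable

    def neighborhood(grid: np.ndarray, r: int, c: int, win: int = WINDOW) -> np.ndarray:
        """Return the L x L two-dimensional window centred on the target point
        (r, c), clipped at the borders of the organized point cloud."""
        half = win // 2
        r0, r1 = max(0, r - half), min(grid.shape[0], r + half + 1)
        c0, c1 = max(0, c - half), min(grid.shape[1], c + half + 1)
        return grid[r0:r1, c0:c1]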
S2032: and when the reflectivity corresponding to the target point is detected to be greater than or equal to a first reflectivity threshold, determining a first quantity value of the high-negative influence point in the neighborhood.
Illustratively, the discrimination information includes a first quantity value, a second quantity value, a third quantity value, a fourth quantity value, a fifth quantity value, a sixth quantity value, and a seventh quantity value. The first quantity value is used for representing the quantity of the high-negative influence points in the neighborhood corresponding to the target point.
Optionally, in one possible implementation, the high-negative impact point includes all points in the neighborhood corresponding to the target point having a reflectivity greater than or equal to the first reflectivity threshold. The reflectivity corresponding to the target point is obtained, the reflectivity corresponding to the target point is compared with a first reflectivity threshold, and when the reflectivity corresponding to the target point is larger than or equal to the first reflectivity threshold as a comparison result, points in the neighborhood corresponding to the target point are obtained, and the points are high-negative influence points. The number of the high-negative influence points is counted and is recorded as a first quantity value.
Optionally, in one possible implementation manner, the high-negative impact point includes all points in the neighborhood corresponding to the target point, where the reflectivity is greater than or equal to a preset value. The method includes the steps of obtaining a reflectivity corresponding to a target point, comparing the reflectivity corresponding to the target point with a first reflectivity threshold, obtaining a point in a neighborhood corresponding to the target point when the reflectivity corresponding to the target point is larger than or equal to the first reflectivity threshold as a comparison result, judging whether the point in the neighborhood is larger than or equal to a preset value, and marking the point as a high-negative influence point if the point in the neighborhood is larger than or equal to the preset value. And judging each point in the neighborhood, counting the number of points marked as high negative influence points, and marking the number of the high negative influence points as a first number value. The first reflectivity threshold and the preset value can be set according to practical situations, and are not limited.
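A sketch of the first quantity value of S2032 under the second implementation described above (only neighbourhood points whose reflectivity is at least a preset value are counted); the array names and threshold values are assumptions for illustration.

    import numpy as np

    FIRST_REFL_THRESHOLD = 0.8   # assumed first reflectivity threshold
    PRESET_VALUE = 0.8           # assumed preset value

    def first_quantity_value(target_refl: float, nbr_refl: np.ndarray) -> int:
        """Count the high-reflectivity influence points in the neighbourhood;
        the count is only taken when the target point itself is highly reflective."""
        if target_refl < FIRST_REFL_THRESHOLD:
            return 0
        # First implementation: every neighbourhood point is an influence point,
        # i.e. return int(nbr_refl.size).
        return int(np.count_nonzero(nbr_refl >= PRESET_VALUE))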
S2033: when the reflectivity corresponding to the target point is detected to be smaller than the second reflectivity threshold, determining a second quantity value of points with reflectivity larger than or equal to the first reflectivity threshold in the neighborhood, and determining a third quantity value of points with reflectivity smaller than the second reflectivity threshold in the neighborhood.
The second number value is used to represent the number of points in the neighborhood corresponding to the target point where the reflectivity is greater than or equal to the first reflectivity threshold when the reflectivity corresponding to the target point is less than the second reflectivity threshold.
The reflectivity corresponding to the target point is obtained, the reflectivity corresponding to the target point is compared with a second reflectivity threshold, when the reflectivity corresponding to the target point is smaller than the second reflectivity threshold as a comparison result, each point in the neighborhood corresponding to the target point is traversed, and the reflectivity corresponding to each point is compared with the first reflectivity threshold to obtain a comparison result. And acquiring points with reflectivity larger than or equal to a first reflectivity threshold value in a neighborhood corresponding to the target point according to the comparison result, counting the number of the points, and recording the number of the points as a second number value. Alternatively, the position information of these points may also be recorded.
The third number value is used for indicating the number of points in the neighborhood corresponding to the target point, the reflectivity of which is smaller than the second reflectivity threshold value, when the reflectivity corresponding to the target point is smaller than the second reflectivity threshold value.
The reflectivity corresponding to the target point is obtained, the reflectivity corresponding to the target point is compared with a second reflectivity threshold, when the reflectivity corresponding to the target point is smaller than the second reflectivity threshold as a comparison result, each point in the neighborhood corresponding to the target point is traversed, and the reflectivity corresponding to each point is compared with the second reflectivity threshold to obtain a comparison result. And acquiring points with reflectivity smaller than a second reflectivity threshold value in the neighborhood corresponding to the target point according to the comparison result, counting the number of the points, and recording the number of the points as a third number value.
Optionally, when the reflectivity corresponding to the target point is detected to be smaller than the second reflectivity threshold, determining points in the neighborhood with reflectivity greater than or equal to the first reflectivity threshold, and recording position information of the points. When the reflectivity corresponding to the target point is detected to be smaller than the second reflectivity threshold value, determining points in the neighborhood, the reflectivity of which is smaller than the second reflectivity threshold value, and recording position information of the points. The first reflectivity threshold value and the second reflectivity threshold value are different, and can be set according to actual conditions, which is not limited.
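The second and third quantity values of S2033 can be sketched in the same assumed representation (nbr_refl holds the reflectivities of the neighbourhood points); the threshold values are again illustrative.

    import numpy as np

    FIRST_REFL_THRESHOLD = 0.8    # assumed
    SECOND_REFL_THRESHOLD = 0.2   # assumed

    def second_and_third_quantity_values(target_refl: float, nbr_refl: np.ndarray):
        """S2033 sketch: both counts are taken only when the target point's
        reflectivity is below the second reflectivity threshold."""
        if target_refl >= SECOND_REFL_THRESHOLD:
            return 0, 0
        second = int(np.count_nonzero(nbr_refl >= FIRST_REFL_THRESHOLD))
        third = int(np.count_nonzero(nbr_refl < SECOND_REFL_THRESHOLD))
        return second, third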
S2034: and determining a fourth quantity value according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood and the position information of the target point.
Illustratively, position information of a point having a reflectivity greater than or equal to the first reflectivity threshold value in a neighborhood corresponding to the target point is recorded in advance, and the position information may include coordinates corresponding to the point. And acquiring position information of a point with reflectivity larger than or equal to a first reflectivity threshold value in a neighborhood corresponding to the target point, and acquiring position information corresponding to the target point.
The fourth number value is used for representing the number of points in the neighborhood corresponding to the target point, wherein the first absolute value is smaller than the first distance threshold.
Illustratively, a first absolute value of a distance difference between a point having a reflectivity greater than or equal to a first reflectivity threshold in the neighborhood and the target point is determined from the acquired position information of the point having the reflectivity greater than or equal to the first reflectivity threshold in the neighborhood and the position information of the target point; a fourth quantity value corresponding to a point where the first absolute value is less than the first distance threshold is determined.
The first absolute value represents an absolute value of a distance difference between a point in the neighborhood where the reflectivity is greater than or equal to the first reflectivity threshold and the target point. It should be noted that, when there are a plurality of points in the neighborhood corresponding to the target point, where the reflectivity is greater than or equal to the first reflectivity threshold, there are a plurality of first absolute values, where each first absolute value is used to represent an absolute value of a distance difference between one point in the neighborhood, where the reflectivity is greater than or equal to the first reflectivity threshold, and the target point.
Specifically, for each point in the neighborhood corresponding to the target point where the reflectance is greater than or equal to the first reflectance threshold, the coordinates of the point are obtained, and the coordinates of the target point are obtained, and the first absolute value of the difference in distance from the point to the target point can be calculated. And comparing the first absolute value with a first distance threshold value to obtain a comparison result. The statistical comparison result is the number of points whose first absolute value is smaller than the first distance threshold, and the number of points is noted as a fourth number value.
Optionally, in one possible implementation manner, the fourth number value may be initialized to 0, and when comparing, for each point in the neighborhood corresponding to the target point, where the reflectivity is greater than or equal to the first reflectivity threshold, the first absolute value corresponding to the point with the first distance threshold, if the comparison result is that the first absolute value is less than the first distance threshold, the fourth number value is increased by 1 until all the points in the neighborhood corresponding to the target point are processed.
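A sketch of the fourth quantity value of S2034. It assumes that the "distance" of a point means its measured range from the laser radar (the norm of its coordinates, with the radar at the origin), so that the first absolute value is the absolute difference of two ranges; the array names and threshold values are illustrative.

    import numpy as np

    FIRST_REFL_THRESHOLD = 0.8   # assumed
    FIRST_DIST_THRESHOLD = 0.2   # assumed, metres

    def fourth_quantity_value(target_xyz: np.ndarray, nbr_xyz: np.ndarray,
                              nbr_refl: np.ndarray) -> int:
        """Count neighbourhood points with reflectivity >= the first reflectivity
        threshold whose range differs from the target point's range by less than
        the first distance threshold."""
        target_range = np.linalg.norm(target_xyz)
        ranges = np.linalg.norm(nbr_xyz, axis=1)
        high = nbr_refl >= FIRST_REFL_THRESHOLD
        first_abs = np.abs(ranges[high] - target_range)   # first absolute values
        return int(np.count_nonzero(first_abs < FIRST_DIST_THRESHOLD))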
S2035: and acquiring a symmetrical point which is symmetrical about the center of a point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a fifth quantity value according to the symmetrical point and the position information of the target point.
The fifth number value is used to represent the number of points in the neighborhood corresponding to the target point, where the second absolute value is greater than the second distance threshold.
Coordinates of points in the neighborhood of the target point, the reflectivity of which is greater than or equal to the first reflectivity threshold value, and coordinates of the target point are acquired, and symmetric points which are centrosymmetric with respect to the points in the neighborhood, the reflectivity of which is greater than or equal to the first reflectivity threshold value, are determined according to the coordinates. Acquiring position information of a symmetry point; determining a second absolute value of the distance difference between the symmetry point and the target point according to the position information of the symmetry point and the position information of the target point; a fifth quantity value corresponding to a point where the second absolute value is greater than the second distance threshold is determined.
The second absolute value represents the absolute value of the distance difference between the symmetry point and the target point. It should be noted that, when there are a plurality of points in the neighborhood corresponding to the target point, where the reflectivity is greater than or equal to the first reflectivity threshold, there are a plurality of symmetry points, and correspondingly, there are a plurality of second absolute values, where each second absolute value is used to represent an absolute value of a distance difference between one symmetry point in the neighborhood and the target point.
Specifically, for each point in the neighborhood corresponding to the target point, the reflectivity of which is greater than or equal to the first reflectivity threshold, the coordinates of the point are obtained, the coordinates of the target point are obtained, the symmetrical point of the point symmetrical about the center of the target point can be calculated, and the coordinates of the symmetrical point are obtained. And calculating a second absolute value of the distance difference between the symmetrical point and the target point according to the coordinates of the target point and the coordinates of the symmetrical point. And comparing the second absolute value with a second distance threshold value to obtain a comparison result. The statistical comparison result is the number of points whose second absolute value is greater than the second distance threshold, and the number of points is noted as a fifth number value.
Optionally, in one possible implementation manner, the fifth quantity value may be initialized to 0, and when comparing, for each determined symmetry point, the second absolute value corresponding to the symmetry point with the second distance threshold, if the comparison result is that the second absolute value is greater than the second distance threshold, the fifth quantity value is increased by 1 until all points in the neighborhood corresponding to the target point are processed.
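A sketch of the fifth quantity value of S2035 on an organized grid, where the point that is centrally symmetric to a high-reflectivity point about the target is simply the cell at the mirrored row/column offset; the grid layout, the range-based reading of "distance difference" and the threshold values are assumptions.

    import numpy as np

    FIRST_REFL_THRESHOLD = 0.8    # assumed
    SECOND_DIST_THRESHOLD = 1.0   # assumed, metres

    def fifth_quantity_value(xyz: np.ndarray, refl: np.ndarray,
                             r: int, c: int, half: int = 5) -> int:
        """For every high-reflectivity point in the window, take the point that is
        centrally symmetric to it about the target (r, c) and count how many of
        those symmetric points differ from the target's range by more than the
        second distance threshold."""
        rows, cols = refl.shape
        target_range = np.linalg.norm(xyz[r, c])
        count = 0
        for dr in range(-half, half + 1):
            for dc in range(-half, half + 1):
                rr, cc = r + dr, c + dc
                if not (0 <= rr < rows and 0 <= cc < cols):
                    continue
                if refl[rr, cc] < FIRST_REFL_THRESHOLD:
                    continue
                sr, sc = r - dr, c - dc          # centrally symmetric grid position
                if not (0 <= sr < rows and 0 <= sc < cols):
                    continue
                sym_range = np.linalg.norm(xyz[sr, sc])
                if abs(sym_range - target_range) > SECOND_DIST_THRESHOLD:
                    count += 1
        return count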
S2036: and determining a sixth quantity value according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point.
The sixth number value is used to represent the number of points in the neighborhood corresponding to the target point, where the third absolute value is smaller than the first distance threshold.
Illustratively, determining a third absolute value of a distance difference between the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood corresponding to the target point and the target point according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point; a sixth number value corresponding to a point where the third absolute value is less than the first distance threshold is determined.
The third absolute value represents an absolute value of a distance difference between a point in the neighborhood having a reflectivity less than the second reflectivity threshold and the target point. It should be noted that, when there are a plurality of points in the neighborhood corresponding to the target point, where the reflectivity is smaller than the second reflectivity threshold, there are a plurality of third absolute values, each of which is used to represent an absolute value of a distance difference between one point in the neighborhood, where the reflectivity is smaller than the second reflectivity threshold, and the target point.
Specifically, for each point in the neighborhood corresponding to the target point where the reflectance is smaller than the second reflectance threshold, the coordinates of the point are obtained, and the coordinates of the target point are obtained, and the third absolute value of the difference in distance from the point to the target point can be calculated. And comparing the third absolute value with the first distance threshold value to obtain a comparison result. The statistical comparison result is the number of points whose third absolute value is smaller than the first distance threshold, and the number of points is noted as a sixth number value.
Optionally, in one possible implementation manner, the sixth number value may be initialized to 0, and when comparing, for each point in the neighborhood corresponding to the target point, the reflectivity of the point is smaller than the second reflectivity threshold, the third absolute value corresponding to the point with the first distance threshold, if the comparison result is that the third absolute value is smaller than the first distance threshold, the sixth number value is increased by 1 until all the points in the neighborhood corresponding to the target point are processed.
The first distance threshold and the second distance threshold may be the same or different, and may be set according to actual situations, which is not limited.
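The sixth quantity value of S2036 mirrors the fourth, but for low-reflectivity neighbours; the sketch below keeps the same assumed range-based reading of "distance difference" and illustrative thresholds.

    import numpy as np

    SECOND_REFL_THRESHOLD = 0.2   # assumed
    FIRST_DIST_THRESHOLD = 0.2    # assumed, metres

    def sixth_quantity_value(target_xyz: np.ndarray, nbr_xyz: np.ndarray,
                             nbr_refl: np.ndarray) -> int:
        """Count neighbourhood points with reflectivity below the second
        reflectivity threshold whose range differs from the target point's range
        by less than the first distance threshold."""
        target_range = np.linalg.norm(target_xyz)
        ranges = np.linalg.norm(nbr_xyz, axis=1)
        low = nbr_refl < SECOND_REFL_THRESHOLD
        third_abs = np.abs(ranges[low] - target_range)    # third absolute values
        return int(np.count_nonzero(third_abs < FIRST_DIST_THRESHOLD))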
S2037: and determining a seventh quantity value corresponding to the point meeting the first preset distance condition in the neighborhood according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood.
The first preset distance condition may include being closest to the target point, having the shortest distance to the target point, and the like. Here the distance refers to the coordinate distance in the two-dimensional space, not the Euclidean distance in the real three-dimensional point cloud space.
Illustratively, the position information of the points in the neighborhood corresponding to the target point whose reflectivity is greater than or equal to the first reflectivity threshold is acquired; among these points, the point closest to the target point is found according to the position information, the closest point is connected to the target point, and the position information of the points on the connecting line is acquired. A fourth absolute value of the distance difference between each point on the connecting line and the target point is determined according to the position information of these points. The fourth absolute value is compared with the second distance threshold to obtain a comparison result. The number of points whose fourth absolute value is greater than the second distance threshold is counted and recorded as the seventh quantity value.
Optionally, in one possible implementation, the seventh quantity value may be initialized to 0; during comparison, for each point on the connecting line, the fourth absolute value corresponding to that point is compared with the second distance threshold, and if the fourth absolute value is greater than the second distance threshold, the seventh quantity value is increased by 1, until all points on the connecting line have been processed.
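A sketch of the seventh quantity value of S2037: the nearest high-reflectivity point is found using the two-dimensional grid coordinate distance described above, the grid cells on the connecting line are walked, and points whose range differs from the target's by more than the second distance threshold are counted. The grid layout, the line-walking scheme and the threshold values are assumptions for illustration.

    import numpy as np

    FIRST_REFL_THRESHOLD = 0.8    # assumed
    SECOND_DIST_THRESHOLD = 1.0   # assumed, metres

    def seventh_quantity_value(xyz: np.ndarray, refl: np.ndarray,
                               r: int, c: int, half: int = 5) -> int:
        """Find the high-reflectivity window point closest to the target in the
        two-dimensional grid, walk the grid cells connecting it to the target,
        and count the points on that line whose range differs from the target's
        range by more than the second distance threshold."""
        rows, cols = refl.shape
        best, best_d2 = None, None
        for dr in range(-half, half + 1):
            for dc in range(-half, half + 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and refl[rr, cc] >= FIRST_REFL_THRESHOLD:
                    d2 = dr * dr + dc * dc            # 2-D grid distance, not 3-D
                    if best_d2 is None or d2 < best_d2:
                        best, best_d2 = (rr, cc), d2
        if best is None:
            return 0
        target_range = np.linalg.norm(xyz[r, c])
        steps = max(abs(best[0] - r), abs(best[1] - c))
        count = 0
        for s in range(1, steps + 1):                 # grid cells on the connecting line
            rr = r + round((best[0] - r) * s / steps)
            cc = c + round((best[1] - c) * s / steps)
            if abs(np.linalg.norm(xyz[rr, cc]) - target_range) > SECOND_DIST_THRESHOLD:
                count += 1
        return count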
In the implementation manner, a plurality of quantity values are determined in different manners, and the quantity values are helpful for determining preset discrimination conditions subsequently, so that the pseudo point cloud is accurately discriminated according to the preset discrimination conditions.
Optionally, in some possible implementations of the present application, the discrimination information may further include a first discrimination sub-condition and a second discrimination sub-condition. The first discrimination sub-condition includes: detecting whether the points corresponding to the sixth quantity value change monotonically in a preset direction.
The third absolute value is an absolute value of a distance difference between the point with the reflectivity smaller than the second reflectivity threshold value and the target point in the neighborhood. The preset direction may include a horizontal direction, a vertical direction, and the like. Monotonic changes may include monotonic increases and monotonic decreases.
For each point corresponding to the sixth quantity value in the neighborhood corresponding to the target point, all points of the neighborhood that lie in the horizontal direction of that point are acquired, and whether they monotonically increase or monotonically decrease in the horizontal direction is judged. Similarly, for each point corresponding to the sixth quantity value in the neighborhood corresponding to the target point, all points of the neighborhood that lie in the vertical direction of that point are acquired, and whether they monotonically increase or monotonically decrease in the vertical direction is judged.
Alternatively, an increment value and a decrement value may be set for each point. When a certain point monotonically increases in a preset direction, the increment value corresponding to the point can be marked as 1, and the decrement value can be marked as 0. When a certain point monotonically decreases in a preset direction, the increment value corresponding to the point can be marked as 0, and the decrement value can be marked as 1. When a certain point is not monotonic in the preset direction, the increment value and the decrement value corresponding to the point can be recorded as 0.
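A sketch of the monotonicity check used by the first discrimination sub-condition, including the increment/decrement flags described above. It assumes that the values examined along the horizontal and vertical directions are the ranges of the corresponding neighbourhood points, which is one plausible reading of the text.

    import numpy as np

    def monotonic_flags(horizontal_values: np.ndarray, vertical_values: np.ndarray):
        """Return increment/decrement flags (1 or 0) for the horizontal and
        vertical directions: 1/0 for a monotonic increase, 0/1 for a monotonic
        decrease, 0/0 when the values are not monotonic."""
        flags = {}
        for name, vals in (("horizontal", horizontal_values),
                           ("vertical", vertical_values)):
            d = np.diff(vals)
            increase = bool(np.all(d >= 0) and np.any(d > 0))
            decrease = bool(np.all(d <= 0) and np.any(d < 0))
            flags[name] = {"increment": int(increase), "decrement": int(decrease)}
        return flags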
The second discrimination sub-condition includes: detecting whether a corresponding high-reflectivity path exists for a point that meets a second preset distance condition in the neighborhood corresponding to the target point.
The second preset distance condition includes that a fifth absolute value of the distance difference between a point in the neighborhood and the target point is smaller than the first distance threshold. A high-reflectivity path means that the points satisfying the second preset distance condition form a path in a second preset direction, and the points on the path are marked. The second preset direction is user-defined; illustratively, it may be any angular direction centered on the target point, for example the 0 degree, 45 degree, 90 degree, 135 degree, 180 degree, 225 degree, 270 degree or 315 degree direction centered on the target point.
Illustratively, a fifth absolute value of the distance difference between each point in the neighborhood corresponding to the target point and the target point is determined, and the fifth absolute value is compared with the first distance threshold to obtain a comparison result. Points whose fifth absolute value is smaller than the first distance threshold are marked according to the comparison result; for example, a point whose fifth absolute value is less than the first distance threshold is marked as 1. This processing is applied to each point in the neighborhood corresponding to the target point until all points in the neighborhood have been processed.
If it is detected that the points on a path in the second preset direction are all marked as 1, it is determined that the points on the path have a corresponding high-reflectivity path, and at the same time it is determined that the target point has a corresponding high-reflectivity path.
Optionally, the absolute values of the distance differences between the boundary points of the neighborhood corresponding to the target point and the target point, and the reflectivities corresponding to the boundary points, may also be recorded. Referring to FIG. 4, FIG. 4 is a schematic view of point cloud positions and directions according to an exemplary embodiment of the present application. As shown in FIG. 4, U, RU, R, RD, D, LD, L and LU each represent a different direction: U represents the directly upward direction, i.e., the 0 degree direction; RU the upper-right direction, i.e., 45 degrees; R the directly right direction, i.e., 90 degrees; RD the lower-right direction, i.e., 135 degrees; D the directly downward direction, i.e., 180 degrees; LD the lower-left direction, i.e., 225 degrees; L the directly left direction, i.e., 270 degrees; and LU the upper-left direction, i.e., 315 degrees. The centermost point of the square represents the target point, and the remaining black points represent the boundary points of the neighborhood corresponding to the target point.
It should be noted that this is only an exemplary illustration, and the directions of the different angles may be determined by the counterclockwise direction, which is not limited thereto.
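A sketch of the high-reflectivity-path test of the second discrimination sub-condition on an organized grid. Points whose range differs from the target's by less than the first distance threshold count as marked, and the eight directions of FIG. 4 are walked from the target out to the window boundary; the direction offsets, grid layout and threshold value are assumptions.

    import numpy as np

    FIRST_DIST_THRESHOLD = 0.2   # assumed, metres

    # Assumed row/column step offsets for the eight directions of FIG. 4.
    DIRECTIONS = {
        "U": (-1, 0), "RU": (-1, 1), "R": (0, 1), "RD": (1, 1),
        "D": (1, 0), "LD": (1, -1), "L": (0, -1), "LU": (-1, -1),
    }

    def has_high_reflectivity_path(xyz: np.ndarray, r: int, c: int,
                                   half: int = 5) -> bool:
        """Return True if, in at least one of the eight directions, every grid
        cell between the target (r, c) and the window boundary is 'marked',
        i.e. its range differs from the target's range by less than the first
        distance threshold."""
        rows, cols = xyz.shape[:2]
        target_range = np.linalg.norm(xyz[r, c])
        for dr, dc in DIRECTIONS.values():
            path_ok = True
            for s in range(1, half + 1):
                rr, cc = r + dr * s, c + dc * s
                if not (0 <= rr < rows and 0 <= cc < cols):
                    break                      # stop at the border of the cloud
                if abs(np.linalg.norm(xyz[rr, cc]) - target_range) >= FIRST_DIST_THRESHOLD:
                    path_ok = False
                    break
            if path_ok:
                return True
        return False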
Optionally, in some possible implementations of the present application, the first quantity value, the second quantity value, the third quantity value, the fourth quantity value, the fifth quantity value, the sixth quantity value, the seventh quantity value, the first discriminant sub-condition and the second discriminant sub-condition are combined arbitrarily, based on a preset rule, to generate the preset discrimination condition.
For example, the preset discrimination condition may include a plurality of preset discriminant sub-conditions. The preset rule may include: the second quantity value is greater than or equal to a first statistical threshold, the fourth quantity value is greater than or equal to a second statistical threshold, the third quantity value is greater than or equal to a third statistical threshold, and the fifth quantity value is greater than or equal to a fourth statistical threshold.
Based on the preset rule, the first quantity value, the second quantity value, the third quantity value, the fourth quantity value, the fifth quantity value, the sixth quantity value, the seventh quantity value, the first discriminant sub-condition and the second discriminant sub-condition are combined to generate a first preset discriminant sub-condition.
The first preset discriminant sub-condition includes: the second quantity value is greater than or equal to the first statistical threshold, the fourth quantity value is greater than or equal to the second statistical threshold, the third quantity value is greater than or equal to the third statistical threshold, the fifth quantity value is greater than or equal to the fourth statistical threshold, the points corresponding to the sixth quantity value change monotonically in the preset direction, and a corresponding high-reflectivity path exists.
The first statistical threshold, the second statistical threshold, the third statistical threshold and the fourth statistical threshold can be set and adjusted according to actual conditions. The first statistical threshold, the second statistical threshold, the third statistical threshold, and the fourth statistical threshold may be the same or different. For example, the first statistical threshold may be 3, and the second statistical threshold, the third statistical threshold, and the fourth statistical threshold may be 2. This is merely illustrative and is not limiting.
In actual identification of the pseudo point cloud, the first quantity value, the second quantity value, the third quantity value, the fourth quantity value, the fifth quantity value, the sixth quantity value and the seventh quantity value corresponding to each point to be measured are determined according to the position information and reflectivity corresponding to that point; the determination manner follows the steps in S2031 to S2037 and is not repeated here. Whether each point to be measured in the detected point cloud data satisfies the first preset discriminant sub-condition is then judged, and any point to be measured that satisfies the first preset discriminant sub-condition is judged to be pseudo point cloud. It should be noted that a point to be measured is the same as the target point described above; the different name is used only to distinguish the scene, and the actual processing procedure and order for a point to be measured are the same as those for the target point.
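A minimal sketch of checking the first preset discriminant sub-condition for one point to be measured is given below. The quantity values are assumed to have been computed as in S2031 to S2037; the field names, threshold values and the dataclass itself are illustrative assumptions, not part of this application.

```python
from dataclasses import dataclass

@dataclass
class PointStats:
    n1: int                 # first quantity value
    n2: int                 # second quantity value
    n3: int                 # third quantity value
    n4: int                 # fourth quantity value
    n5: int                 # fifth quantity value
    n6: int                 # sixth quantity value
    n7: int                 # seventh quantity value
    monotonic: bool         # result of the first discriminant sub-condition
    high_refl_path: bool    # result of the second discriminant sub-condition

def first_sub_condition(s, t1=3, t2=2, t3=2, t4=2):
    """First preset discriminant sub-condition with example statistical thresholds."""
    return (s.n2 >= t1 and s.n4 >= t2 and s.n3 >= t3 and s.n5 >= t4
            and s.monotonic and s.high_refl_path)

# Example: a point with these statistics would be judged to be pseudo point cloud.
p = PointStats(n1=0, n2=4, n3=3, n4=2, n5=3, n6=2, n7=1,
               monotonic=True, high_refl_path=True)
is_pseudo = first_sub_condition(p)
```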
Optionally, the generated second preset discriminant sub-condition may include: the absolute value of the distance between at least one point directly to the left, to the upper left or to the lower left of the target point and the target point is smaller than a first distance threshold (for example, 0.1 m or 0.2 m); the reflectivity corresponding to at least one point directly to the left, to the upper left or to the lower left is greater than or equal to a first reflectivity threshold (for example, 70%, 80% or 90%); the absolute value of the distance between at least one centro-symmetric point directly to the right, to the lower right or to the upper right and the target point is greater than a second distance threshold (for example, 1 m or 1.5 m); the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction; and the seventh quantity value corresponding to the point to be measured is 0.
Optionally, the generated third preset discriminant sub-condition may include: the absolute value of the distance between at least one point directly to the right, to the upper right or to the lower right of the target point and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly to the right, to the upper right or to the lower right is greater than or equal to the first reflectivity threshold; the absolute value of the distance between at least one centro-symmetric point directly to the left, to the lower left or to the upper left and the target point is greater than the second distance threshold; the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction; and the seventh quantity value corresponding to the point to be measured is 0.
Optionally, the generated fourth preset discriminant sub-condition may include: the absolute value of the distance between at least one point directly above, to the upper left or to the upper right of the target point and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly above, to the upper left or to the upper right is greater than or equal to the first reflectivity threshold; the absolute value of the distance between at least one centro-symmetric point directly below, to the lower right or to the lower left and the target point is greater than the second distance threshold; the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction; and the seventh quantity value corresponding to the point to be measured is 0.
Optionally, the generated fifth preset discriminant sub-condition may include: the absolute value of the distance between at least one point directly below, to the lower right or to the lower left of the target point and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly below, to the lower right or to the lower left is greater than or equal to the first reflectivity threshold; the absolute value of the distance between at least one centro-symmetric point directly above, to the upper left or to the upper right and the target point is greater than the second distance threshold; the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction; and the seventh quantity value corresponding to the point to be measured is 0.
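The second to fifth preset discriminant sub-conditions above share one directional pattern: at least one near, highly reflective point on one side of the target, at least one far point on the centro-symmetric side, a monotonic range change, and a seventh quantity value of zero. The following sketch captures only that shared pattern; the data layout and threshold values are assumptions made here for illustration.

```python
def directional_sub_condition(near_side, far_side, monotonic_ok, n7,
                              refl_thresh=0.8, dist_thresh_1=0.2,
                              dist_thresh_2=1.0):
    """near_side / far_side are lists of (abs_range_diff, reflectivity) tuples
    for the points on the chosen side and on its centro-symmetric side."""
    near_ok = any(d < dist_thresh_1 and r >= refl_thresh for d, r in near_side)
    far_ok = any(d > dist_thresh_2 for d, _ in far_side)
    return near_ok and far_ok and monotonic_ok and n7 == 0

# Example of the second sub-condition: left-side points near and highly
# reflective, centro-symmetric right-side point far, range decreasing, n7 == 0.
left_side = [(0.05, 0.92), (0.30, 0.10)]
right_side = [(1.40, 0.05)]
flag = directional_sub_condition(left_side, right_side, monotonic_ok=True, n7=0)
```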
Optionally, the generated sixth preset discriminant sub-condition may include: the first quantity value is greater than or equal to a fifth statistical threshold (e.g., 10, 20, 30, etc.); the number of pseudo point clouds among the nearby already-processed points is greater than or equal to a sixth statistical threshold (e.g., 3, 5, 8, etc.); the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the sixth quantity value is greater than or equal to a seventh statistical threshold (e.g., 2, 3, 4, etc.); the points corresponding to the sixth quantity value of the point to be measured change monotonically in the preset direction; and the seventh quantity value corresponding to the point to be measured is 0. The fifth statistical threshold, the sixth statistical threshold and the seventh statistical threshold are given only as examples and are not limited thereto.
Optionally, the generated seventh preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly to the left, to the upper left or to the lower left and the target point is smaller than the first distance threshold; the absolute value of the distance between at least one centro-symmetric point directly to the right, to the lower right or to the upper right and the target point is greater than the second distance threshold; and the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction.
Optionally, the generated eighth preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly to the right, to the upper right or to the lower right and the target point is smaller than the first distance threshold; the absolute value of the distance between at least one centro-symmetric point directly to the left, to the lower left or to the upper left and the target point is greater than the second distance threshold; and the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction.
Optionally, the generated ninth preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly above, to the upper left or to the upper right and the target point is smaller than the first distance threshold; the absolute value of the distance between at least one centro-symmetric point directly below, to the lower left or to the lower right and the target point is greater than the second distance threshold; and the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction.
Optionally, the generated tenth preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly below, to the lower right or to the lower left and the target point is smaller than the first distance threshold; the absolute value of the distance between at least one point directly above, to the upper right or to the upper left and the target point is greater than the second distance threshold; and the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction.
Optionally, the generated eleventh preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly to the left, to the upper left or to the lower left and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly to the left, to the upper left or to the lower left is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction.
Optionally, the generated twelfth preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly to the right, to the upper right or to the lower right and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly to the right, to the upper right or to the lower right is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction.
Optionally, the generated thirteenth preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly above, to the upper left or to the upper right and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly above, to the upper left or to the upper right is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction.
Optionally, the generated fourteenth preset discriminant sub-condition may include: the first quantity value is greater than or equal to the fifth statistical threshold; the number of pseudo point clouds among the nearby already-processed points is greater than or equal to the sixth statistical threshold; the absolute value of the difference between the distance of the point to be measured and the average distance of the surrounding pseudo point clouds is smaller than the first distance threshold; the absolute value of the distance between at least one point directly below, to the lower right or to the lower left and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly below, to the lower right or to the lower left is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction.
Optionally, the generated fifteenth preset discriminant sub-condition may include: the point to be measured has a corresponding high-reflectivity path; the absolute value of the distance between at least one point directly to the left, to the upper left or to the lower left and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly to the left, to the upper left or to the lower left is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction.
Optionally, the generated sixteenth preset discriminant sub-condition may include: the point to be measured has a corresponding high-reflectivity path; the absolute value of the distance between at least one point directly to the right, to the upper right or to the lower right and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly to the right, to the upper right or to the lower right is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction.
Optionally, the generated seventeenth preset discriminant sub-condition may include: the point to be measured has a corresponding high-reflectivity path; the absolute value of the distance between at least one point directly above, to the upper left or to the upper right and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly above, to the upper left or to the upper right is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured decrease monotonically in the preset direction.
Optionally, the generated eighteenth preset discriminant sub-condition may include: the point to be measured has a corresponding high-reflectivity path; the absolute value of the distance between at least one point directly below, to the lower right or to the lower left and the target point is smaller than the first distance threshold; the reflectivity corresponding to at least one point directly below, to the lower right or to the lower left is greater than or equal to the first reflectivity threshold; and the points corresponding to the sixth quantity value of the point to be measured increase monotonically in the preset direction.
It should be noted that the preset discriminant sub-conditions listed above are only some of the possibilities; far more can be generated by actual combination. The preset discriminant sub-conditions can also be recombined and adjusted according to the actual scene; for example, when a boundary condition is encountered, the combination of each preset discriminant sub-condition can be modified according to the corresponding neighborhood.
In actual pseudo point cloud discrimination, it is illustratively judged whether each point to be measured in the point cloud data satisfies any one of the above preset discriminant sub-conditions, and any point to be measured that satisfies any one of the preset discriminant sub-conditions is judged to be pseudo point cloud.
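A hedged sketch of this overall decision is shown below: a point to be measured is judged a pseudo point as soon as any one of the generated preset discriminant sub-conditions holds. The callables in `sub_conditions` stand for whatever combination of sub-conditions was generated; their exact form is not fixed here.

```python
def is_pseudo_point(stats, sub_conditions):
    """stats: per-point statistics; sub_conditions: callables returning bool."""
    return any(cond(stats) for cond in sub_conditions)

def remove_pseudo_points(points, stats_per_point, sub_conditions):
    """Keep only the points that are not judged to be pseudo point cloud."""
    return [p for p, s in zip(points, stats_per_point)
            if not is_pseudo_point(s, sub_conditions)]
```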
In this embodiment, a plurality of different preset discrimination conditions are determined, which are suitable for discriminating pseudo point clouds in a plurality of different scenes. According to the different preset discrimination conditions, the pseudo point clouds in the point cloud data corresponding to a high-reflectivity object can be accurately distinguished and thus accurately removed, which improves the quality of the point cloud, improves the accuracy of laser radar measurement and ranging, and ensures the stability of the point cloud image.
For example, in order to show the effect of the method for processing a laser radar point cloud in practical application more intuitively, the present application provides point cloud data effect diagrams, in different scenes, after the pseudo point cloud has been removed. Fig. 5a and fig. 5b are point cloud data effect diagrams in one scenario provided by an embodiment of the present application. Fig. 5a is the point cloud data effect diagram of a high-reflectivity board before pseudo point cloud removal, and fig. 5b is the point cloud data effect diagram of the high-reflectivity board after pseudo point cloud removal; it can be clearly seen that the pseudo point cloud in the area pointed to by the arrow is successfully removed.
Fig. 6a and 6b are graphs of point cloud data effects in another scenario provided by an embodiment of the present application. Fig. 6a is a point cloud data effect diagram of the high-reflection card before the pseudo point cloud is removed, and fig. 6b is a point cloud data effect diagram of the high-reflection card after the pseudo point cloud is removed, so that it can be obviously seen that the pseudo point cloud in the area pointed by the arrow is successfully removed.
Fig. 7a and 7b are point cloud data effect diagrams in yet another scenario provided by an embodiment of the present application. Fig. 7a and fig. 7b show a misjudgment test of a low-reflectivity object beside a high-reflectivity object without high-reflectivity expansion (for example, a low-reflectivity object placed above a high-reflectivity board). Fig. 7a is the point cloud data effect diagram before pseudo point cloud removal, and fig. 7b is the point cloud data effect diagram after pseudo point cloud removal. It can be clearly seen that the point cloud data in the area pointed to by the arrow is not removed, i.e. the point cloud data of the low-reflectivity object is not misjudged as pseudo point cloud.
Fig. 8a and 8b are point cloud data effect diagrams in yet another scenario provided by an embodiment of the present application. Fig. 8a and 8b show a misjudgment test in which a low-reflectivity object is placed between two high-reflectivity objects. Fig. 8a is the point cloud data effect diagram before pseudo point cloud removal, and fig. 8b is the point cloud data effect diagram after pseudo point cloud removal. It can be clearly seen that the point cloud data in the area pointed to by the arrow is not removed: the point cloud data of the low-reflectivity object is not misjudged as pseudo point cloud and is well preserved.
Fig. 9a and 9b are point cloud data effect diagrams in yet another scenario provided by an embodiment of the present application. Fig. 9a and 9b show a pseudo point cloud test on a high-reflectivity guideboard. Fig. 9a is the point cloud data effect diagram before pseudo point cloud removal, and fig. 9b is the point cloud data effect diagram after pseudo point cloud removal. It can be clearly seen that the pseudo point cloud in the area pointed to by the arrow (the pseudo point cloud around the high-reflectivity guideboard) is successfully removed, while the point cloud corresponding to the upright pole connected to the high-reflectivity guideboard is not removed.
In the different scene applications above, the method for processing a laser radar point cloud provided by the present application shows good results: the pseudo point clouds in the point cloud data corresponding to high-reflectivity objects are accurately distinguished while the point cloud data corresponding to low-reflectivity objects is preserved, which improves the quality of the point cloud, improves the accuracy of laser radar measurement and ensures the stability of the point cloud image.
Optionally, in one possible implementation manner, before acquiring the point cloud data detected by the lidar, the method for processing the lidar point cloud provided by the application further includes: and determining the high-reflection object in the object to be detected.
For example, point cloud data corresponding to an object to be measured may be obtained, and whether the object to be measured is a high-reflectivity object may be determined according to the reflectivity corresponding to each point in the point cloud data.
Specifically, point cloud data corresponding to the object to be measured and the reflectivity corresponding to each point in that point cloud data are obtained in advance; the proportion, within a preset region, of points in the point cloud data whose reflectivity is greater than or equal to a preset threshold is determined; and when the proportion in the preset region is detected to satisfy a preset condition, the object to be measured is judged to be a high-reflectivity object.
For example, for each object to be measured, the corresponding point cloud data is obtained in advance; the manner of obtaining it may refer to the manner of obtaining point cloud data in S101 and is not repeated here. The obtained point cloud data includes the reflectivity corresponding to each point. The number of points in the point cloud data whose reflectivity is greater than or equal to the preset threshold and the number of all points in the preset region are counted, and the ratio of the former to the latter, i.e. the proportion of high-reflectivity points in the preset region, is calculated. When this proportion is detected to satisfy the preset condition, the object to be measured is judged to be a high-reflectivity object. The preset condition may be a specific proportion or a proportion range, and both the preset condition and the preset threshold may be set and adjusted according to the actual situation, which is not limited here.
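A minimal sketch of this pre-classification step is given below, assuming per-point reflectivity stored in a NumPy array and a boolean mask selecting the preset region; the threshold and proportion values are placeholders, not values fixed by this application.

```python
import numpy as np

def is_high_reflectivity_object(reflectivity, region_mask,
                                refl_thresh=0.8, ratio_thresh=0.5):
    """Classify an object from the proportion of high-reflectivity points
    inside the preset region of its point cloud."""
    region = reflectivity[region_mask]
    if region.size == 0:
        return False
    ratio = np.count_nonzero(region >= refl_thresh) / region.size
    return ratio >= ratio_thresh             # preset condition on the proportion

# Example: 70% of the points in the region are highly reflective, so the
# object would be classified as a high-reflectivity object.
refl = np.concatenate([np.full(70, 0.9), np.full(30, 0.1)])
mask = np.ones_like(refl, dtype=bool)
flag = is_high_reflectivity_object(refl, mask)
```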
In this embodiment, the high-reflectivity object is determined in advance, which eliminates the interference of low-reflectivity objects and facilitates more accurate subsequent determination of the pseudo point clouds in the point cloud data corresponding to the high-reflectivity object, so that the pseudo point clouds can be accurately removed, thereby improving the quality of the point cloud, improving the accuracy of laser radar measurement and ensuring the stability of the point cloud image.
Referring to fig. 10, fig. 10 is a schematic diagram of an apparatus for processing a laser radar point cloud according to an embodiment of the present application. The apparatus comprises means for performing the steps of the corresponding embodiments of fig. 1-3. Refer specifically to the related descriptions in the respective embodiments of fig. 1 to 3. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 10, comprising:
an acquiring unit 310, configured to acquire point cloud data detected by the laser radar;
a judging unit 320, configured to judge whether the point cloud data includes a high-reflectivity object;
and the determining unit 330 is configured to determine, when the determination result indicates that the point cloud data includes the high-reflectivity object, a pseudo point cloud in the point cloud data according to a preset determination condition and position information and reflectivity corresponding to each point in the point cloud data.
Optionally, the apparatus further comprises:
and the rejecting unit is used for rejecting the pseudo point cloud in the point cloud data.
Optionally, the apparatus further comprises:
the judging information determining unit is used for determining judging information according to the position information and the reflectivity corresponding to each point in the point cloud data;
and the generating unit is used for generating the preset judging conditions based on the judging information.
Optionally, the discrimination information includes a first quantity value, a second quantity value, a third quantity value, a fourth quantity value, a fifth quantity value, a sixth quantity value and a seventh quantity value, and the discrimination information determining unit is specifically configured to:
determining a neighborhood corresponding to a target point in the point cloud data;
when the reflectivity corresponding to the target point is detected to be greater than or equal to a first reflectivity threshold, determining a first quantity value of high-reflectivity influence points in the neighborhood;
when the reflectivity corresponding to the target point is detected to be smaller than a second reflectivity threshold value, determining a second quantity value of points with reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a third quantity value of points with reflectivity smaller than the second reflectivity threshold value in the neighborhood;
Determining a fourth quantity value according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood and the position information of the target point;
acquiring a symmetry point which is symmetrical about the center of a point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a fifth quantity value according to the symmetry point and the position information of the target point;
determining a sixth quantity value according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point;
and determining a seventh quantity value corresponding to a point meeting a first preset distance condition in the neighborhood according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold in the neighborhood.
Optionally, the discrimination information determining unit is further configured to:
determining a first absolute value of a distance difference between a point with reflectivity greater than or equal to the first reflectivity threshold value in the neighborhood and the target point according to the acquired position information of the point with reflectivity greater than or equal to the first reflectivity threshold value in the neighborhood and the position information of the target point;
And determining the fourth quantity value corresponding to the number of points of which the first absolute value is smaller than a first distance threshold.
Optionally, the discrimination information determining unit is further configured to:
acquiring the position information of the symmetrical points;
determining a second absolute value of the distance difference between the symmetrical point and the target point according to the position information of the symmetrical point and the position information of the target point;
and determining the fifth quantity value corresponding to the number of points of which the second absolute value is larger than a second distance threshold.
Optionally, the discrimination information determining unit is further configured to:
determining a third absolute value of the distance difference between the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the target point according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point;
determining the sixth number value corresponding to the number of points for which the third absolute value is less than the first distance threshold.
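For illustration only, the fourth, fifth and sixth quantity values described above could be computed on a range-image neighborhood as sketched below; the grid layout, neighborhood radius and all threshold values are assumptions of this sketch rather than details fixed by the apparatus.

```python
import numpy as np

def quantity_values(dist, refl, row, col, radius=2,
                    refl_hi=0.8, refl_lo=0.3,
                    dist_thresh_1=0.2, dist_thresh_2=1.0):
    """Return (fourth, fifth, sixth) quantity values for the target point."""
    target_d = dist[row, col]
    n4 = n5 = n6 = 0
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if not (0 <= r < dist.shape[0] and 0 <= c < dist.shape[1]):
                continue
            diff = abs(dist[r, c] - target_d)          # first / third absolute value
            if refl[r, c] >= refl_hi:
                if diff < dist_thresh_1:
                    n4 += 1                             # close high-reflectivity point
                sr, sc = row - dr, col - dc             # centro-symmetric counterpart
                if (0 <= sr < dist.shape[0] and 0 <= sc < dist.shape[1]
                        and abs(dist[sr, sc] - target_d) > dist_thresh_2):
                    n5 += 1                             # far symmetric point
            elif refl[r, c] < refl_lo and diff < dist_thresh_1:
                n6 += 1                                 # close low-reflectivity point
    return n4, n5, n6

# Example on synthetic data.
rng = np.random.default_rng(1)
d = rng.uniform(1.0, 30.0, size=(16, 32))
r = rng.uniform(0.0, 1.0, size=(16, 32))
n4, n5, n6 = quantity_values(d, r, 8, 16)
```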
Optionally, the discrimination information further includes a first discriminant sub-condition and a second discriminant sub-condition, and the first discriminant sub-condition includes: detecting whether the points corresponding to the sixth quantity value change monotonically in a preset direction; the second discriminant sub-condition includes: detecting whether a point in the neighborhood that satisfies a second preset distance condition has a corresponding high-reflectivity path.
Optionally, the generating unit is specifically configured to:
based on a preset rule, the first quantity value, the second quantity value, the third quantity value, the fourth quantity value, the fifth quantity value, the sixth quantity value, the seventh quantity value, the first judgment sub-condition and the second judgment sub-condition are combined arbitrarily to generate the preset judgment condition.
Optionally, the judging unit 320 is specifically configured to:
determining the reflectivity of each point in the point cloud data;
determining a number of points for which the reflectivity is greater than the first reflectivity threshold;
when the quantity is detected to reach a preset quantity threshold value, judging that the point cloud data contains the high-reflectivity object;
or when the quantity is detected to not reach the preset quantity threshold, judging that the point cloud data does not contain the high-reflectivity object.
Optionally, the apparatus further comprises:
and the high-reflection object determining unit is used for determining the high-reflection object in the object to be detected.
Optionally, the high-reflection object determination unit is specifically configured to:
acquiring, in advance, point cloud data corresponding to the object to be detected and the reflectivity corresponding to each point in the point cloud data corresponding to the object to be detected;
determining the proportion, in a preset area, of points with reflectivity greater than or equal to a preset threshold in the point cloud data corresponding to the object to be detected;
and when the proportion in the preset area is detected to meet the preset condition, judging that the object to be detected is the high-reflectivity object.
Alternatively, the determining unit 330 is specifically configured to:
judging whether each point meets the preset judging condition according to the position information and the reflectivity corresponding to each point;
and when any point in the point cloud data is detected to meet the preset judging condition, judging the any point as the pseudo point cloud.
Referring to fig. 11, fig. 11 is a schematic diagram of an apparatus for processing a laser radar point cloud according to another embodiment of the present application. As shown in fig. 11, the apparatus 4 of this embodiment includes: a processor 40, a memory 41 and a computer program 42 stored in the memory 41 and executable on the processor 40. The steps in the above-described embodiments of the method of processing a lidar point cloud are implemented by the processor 40 when executing the computer program 42, such as S101 to S103 shown in fig. 1. Alternatively, the processor 40 may perform the functions of the units in the embodiments described above, such as the units 310 to 330 shown in fig. 10, when executing the computer program 42.
Illustratively, the computer program 42 may be partitioned into one or more units that are stored in the memory 41 and executed by the processor 40 to complete the present application. The one or more units may be a series of computer instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 42 in the device 4. For example, the computer program 42 may be divided into a first acquisition unit, a second acquisition unit and a determination unit, the specific functions of each unit being as described above.
The device may include, but is not limited to, the processor 40 and the memory 41. It will be appreciated by those skilled in the art that fig. 11 is merely an example of the device 4 and does not constitute a limitation of the device 4; the device may include more or fewer components than shown, or combine certain components, or include different components, for example input and output devices, network access devices, buses and the like.
The processor 40 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the device, such as a hard disk or a memory of the device. The memory 41 may also be an external storage device of the device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card provided on the device. Further, the memory 41 may include both an internal storage unit and an external storage device of the device. The memory 41 is used for storing the computer instructions and other programs and data required by the device. The memory 41 may also be used for temporarily storing data that has been output or is to be output.
The embodiment of the application also provides a computer storage medium, which can be nonvolatile or volatile, and stores a computer program, and the computer program is executed by a processor to implement the steps in the method embodiments for processing the laser radar point cloud.
The present application also provides a computer program product which, when run on a device, causes the device to perform the steps of the respective method embodiments described above for processing a lidar point cloud.
The embodiment of the application also provides a chip or an integrated circuit, which comprises: and a processor for calling and running a computer program from the memory, so that the device provided with the chip or the integrated circuit executes the steps in the method embodiment of processing the laser radar point cloud.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (19)

  1. A method of processing a lidar point cloud, comprising:
    acquiring point cloud data detected by a laser radar;
    judging whether the point cloud data contains a high-reflection object or not;
    and when the judgment result is that the point cloud data contains the high-reflectivity object, determining a pseudo point cloud in the point cloud data according to a preset judgment condition and position information and reflectivity corresponding to each point in the point cloud data.
  2. The method of claim 1, wherein before determining the pseudo point cloud in the point cloud data according to the preset discrimination conditions and the position information and the reflectivity corresponding to each point, the method further comprises:
    determining discrimination information according to the position information and reflectivity corresponding to each point in the point cloud data;
    and generating the preset judging conditions based on the judging information.
  3. The method of claim 2, wherein the discrimination information includes a first quantity value, a second quantity value, a third quantity value, a fourth quantity value, a fifth quantity value, a sixth quantity value, and a seventh quantity value, and determining the discrimination information according to the reflectivity and the position information corresponding to each point in the point cloud data includes:
    Determining a neighborhood corresponding to a target point in the point cloud data;
    when the reflectivity corresponding to the target point is detected to be greater than or equal to a first reflectivity threshold, determining a first quantity value of high-reflectivity influence points in the neighborhood;
    when the reflectivity corresponding to the target point is detected to be smaller than a second reflectivity threshold value, determining a second quantity value of points with reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a third quantity value of points with reflectivity smaller than the second reflectivity threshold value in the neighborhood;
    determining a fourth quantity value according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood and the position information of the target point;
    acquiring a symmetry point which is symmetrical about the center of a point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a fifth quantity value according to the symmetry point and the position information of the target point;
    determining a sixth quantity value according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point;
    and determining a seventh quantity value corresponding to a point meeting a first preset distance condition in the neighborhood according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold in the neighborhood.
  4. A method according to claim 3, wherein said determining a fourth quantity value from the acquired location information of points in the neighborhood where the reflectivity is greater than or equal to the first reflectivity threshold and the location information of the target point comprises:
    determining a first absolute value of a distance difference between a point with reflectivity greater than or equal to the first reflectivity threshold value in the neighborhood and the target point according to the acquired position information of the point with reflectivity greater than or equal to the first reflectivity threshold value in the neighborhood and the position information of the target point;
    and determining the fourth quantity value corresponding to the number of points of which the first absolute value is smaller than a first distance threshold.
  5. A method according to claim 3, wherein the acquiring a symmetry point which is centrosymmetric with respect to a point in the neighborhood where the reflectivity is greater than or equal to the first reflectivity threshold value, and determining a fifth quantity value based on the symmetry point and the position information of the target point, comprises:
    acquiring the position information of the symmetrical points;
    determining a second absolute value of the distance difference between the symmetrical point and the target point according to the position information of the symmetrical point and the position information of the target point;
    And determining the fifth quantity value corresponding to the number of points of which the second absolute value is larger than a second distance threshold.
  6. The method of claim 4, wherein the determining a sixth quantity value based on the acquired location information of the point in the neighborhood having a reflectivity less than the second reflectivity threshold and the location information of the target point comprises:
    determining a third absolute value of the distance difference between the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the target point according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point;
    determining the sixth number value corresponding to the number of points for which the third absolute value is less than the first distance threshold.
  7. The method of claim 3, wherein the discriminant information further comprises a first discriminant sub-condition and a second discriminant sub-condition, the first discriminant sub-condition comprising: detecting whether the point corresponding to the sixth quantity value is monotonically changed in a preset direction; the second discriminant sub-condition includes: and detecting whether a point meeting a second preset distance condition in the neighborhood has a corresponding high-reflectivity path or not.
  8. The method of claim 7, wherein the generating the preset criteria based on the criteria information comprises:
    based on a preset rule, the first quantity value, the second quantity value, the third quantity value, the fourth quantity value, the fifth quantity value, the sixth quantity value, the seventh quantity value, the first judgment sub-condition and the second judgment sub-condition are combined arbitrarily to generate the preset judgment condition.
  9. The method of claim 3, wherein the determining whether the point cloud data includes a high-reflectivity object comprises:
    determining the reflectivity of each point in the point cloud data;
    determining a number of points for which the reflectivity is greater than the first reflectivity threshold;
    when the quantity is detected to reach a preset quantity threshold value, judging that the point cloud data contains the high-reflectivity object;
    or when the quantity is detected to not reach the preset quantity threshold, judging that the point cloud data does not contain the high-reflectivity object.
  10. The method of claim 1, wherein prior to the acquiring the point cloud data for lidar detection, the method further comprises:
    And determining the high-reflectivity object in the object to be detected.
  11. The method of claim 10, wherein said determining the high-reflectivity object among objects under test comprises:
    acquiring point cloud data corresponding to the object to be detected in advance, and reflectivity corresponding to each point in the point cloud data corresponding to the object to be detected;
    determining the proportion, in a preset area, of points with reflectivity greater than or equal to a preset threshold value in the point cloud data corresponding to the object to be detected;
    and when the proportion in the preset area is detected to meet the preset condition, judging that the object to be detected is the high-reflectivity object.
  12. The method of claim 1, wherein when the determination result is that the point cloud data includes the high-reflectivity object, determining a pseudo point cloud in the point cloud data according to a preset determination condition and position information and reflectivity corresponding to each point in the point cloud data, includes:
    when the judgment result is that the point cloud data contains the high-reflectivity object, judging whether each point meets the preset judgment condition according to the position information and the reflectivity corresponding to each point;
    and when any point in the point cloud data is detected to meet the preset judging condition, judging the any point as the pseudo point cloud.
  13. An apparatus for processing a lidar point cloud, comprising:
    the acquisition unit is used for acquiring point cloud data detected by the laser radar;
    the judging unit is used for judging whether the point cloud data contains a high-reflection object or not;
    and the determining unit is used for determining the pseudo point cloud in the point cloud data according to preset judging conditions and the position information and the reflectivity corresponding to each point in the point cloud data when the judging result is that the point cloud data contains the high-reflectivity object.
  14. An apparatus for processing a lidar point cloud, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing when the computer program is executed:
    acquiring point cloud data detected by a laser radar;
    judging whether the point cloud data contains a high-reflection object or not;
    and when the judgment result is that the point cloud data contains the high-reflectivity object, determining a pseudo point cloud in the point cloud data according to a preset judgment condition and position information and reflectivity corresponding to each point in the point cloud data.
  15. The apparatus of claim 14, wherein the processor, when executing the computer program, further implements:
    Determining discrimination information according to the position information and reflectivity corresponding to each point in the point cloud data;
    and generating the preset judging conditions based on the judging information.
  16. The apparatus of claim 15, wherein the discrimination information includes a first quantity value, a second quantity value, a third quantity value, a fourth quantity value, a fifth quantity value, a sixth quantity value, and a seventh quantity value, the processor further implementing when executing the computer program:
    determining a neighborhood corresponding to a target point in the point cloud data;
    when the reflectivity corresponding to the target point is detected to be greater than or equal to a first reflectivity threshold, determining a first quantity value of high-reflectivity influence points in the neighborhood;
    when the reflectivity corresponding to the target point is detected to be smaller than a second reflectivity threshold value, determining a second quantity value of points with reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a third quantity value of points with reflectivity smaller than the second reflectivity threshold value in the neighborhood;
    determining a fourth quantity value according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood and the position information of the target point;
    Acquiring a symmetry point which is symmetrical about the center of a point with the reflectivity larger than or equal to the first reflectivity threshold value in the neighborhood, and determining a fifth quantity value according to the symmetry point and the position information of the target point;
    determining a sixth quantity value according to the acquired position information of the point with the reflectivity smaller than the second reflectivity threshold value in the neighborhood and the position information of the target point;
    and determining a seventh quantity value corresponding to a point meeting a first preset distance condition in the neighborhood according to the acquired position information of the point with the reflectivity larger than or equal to the first reflectivity threshold in the neighborhood.
  17. The apparatus of claim 16, wherein the discriminant information further comprises a first discriminant sub-condition and a second discriminant sub-condition, the first discriminant sub-condition comprising: detecting whether the point corresponding to the sixth quantity value is monotonically changed in a preset direction; the second discriminant sub-condition includes: and detecting whether a point meeting a second preset distance condition in the neighborhood has a corresponding high-reflectivity path or not.
  18. The apparatus of claim 17, wherein the processor when executing the computer program further implements:
    based on a preset rule, the first quantity value, the second quantity value, the third quantity value, the fourth quantity value, the fifth quantity value, the sixth quantity value, the seventh quantity value, the first judgment sub-condition and the second judgment sub-condition are combined arbitrarily to generate the preset judgment condition.
  19. A computer readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the method of any one of claims 1 to 12.
CN202180100861.5A 2021-08-27 2021-08-27 Method, device, equipment and storage medium for processing laser radar point cloud Pending CN117751301A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/115071 WO2023024087A1 (en) 2021-08-27 2021-08-27 Method, apparatus and device for processing laser radar point cloud, and storage medium

Publications (1)

Publication Number Publication Date
CN117751301A 2024-03-22

Family

ID=85322387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180100861.5A Pending CN117751301A (en) 2021-08-27 2021-08-27 Method, device, equipment and storage medium for processing laser radar point cloud

Country Status (2)

Country Link
CN (1) CN117751301A (en)
WO (1) WO2023024087A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116777903B (en) * 2023-08-11 2024-01-26 北京斯年智驾科技有限公司 Box door detection method and system
CN116935199B (en) * 2023-09-18 2023-11-28 铁正检测科技有限公司 Intelligent detection method and system for levelness of highway construction

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6941187B2 (en) * 2017-12-06 2021-09-29 古野電気株式会社 Precipitation particle discrimination device, precipitation particle discrimination method, and precipitation particle discrimination program
US11041957B2 (en) * 2018-06-25 2021-06-22 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for mitigating effects of high-reflectivity objects in LiDAR data
CN109870706A (en) * 2019-01-31 2019-06-11 深兰科技(上海)有限公司 A kind of detection method of road surface identification, device, equipment and medium
CN110458055B (en) * 2019-07-29 2021-10-15 江苏必得科技股份有限公司 Obstacle detection method and system
CN112912756A (en) * 2019-09-17 2021-06-04 深圳市大疆创新科技有限公司 Point cloud noise filtering method, distance measuring device, system, storage medium and mobile platform
CN111832536B (en) * 2020-07-27 2024-03-12 北京经纬恒润科技股份有限公司 Lane line detection method and device
CN111965625B (en) * 2020-08-11 2023-02-21 上海禾赛科技有限公司 Correction method and device for laser radar and environment sensing system
CN112034481A (en) * 2020-09-02 2020-12-04 亿嘉和科技股份有限公司 Automatic cable identification method based on reflective sticker and laser radar
CN112147638B (en) * 2020-09-21 2023-10-20 知行汽车科技(苏州)股份有限公司 Ground information acquisition method, device and system based on laser point cloud reflection intensity

Also Published As

Publication number Publication date
WO2023024087A1 (en) 2023-03-02

Similar Documents

Publication Publication Date Title
CN109059902A (en) Relative pose determines method, apparatus, equipment and medium
CN110865393A (en) Positioning method and system based on laser radar, storage medium and processor
CN117751301A (en) Method, device, equipment and storage medium for processing laser radar point cloud
JP2006516728A (en) Target detection method
US10223793B1 (en) Laser distance measuring method and system
CN110443275B (en) Method, apparatus and storage medium for removing noise
CN109343037B (en) Device and method for detecting installation error of optical detector and terminal equipment
CN113111513B (en) Sensor configuration scheme determining method and device, computer equipment and storage medium
EP3721261B1 (en) Distance time-of-flight modules
CN110837077B (en) Sound source position verification method and device
CN116547562A (en) Point cloud noise filtering method, system and movable platform
AU2018373751A1 (en) Method and device for ascertaining an installation angle between a roadway on which a vehicle travels and a detection direction of a measurement or radar sensor
CN110765823A (en) Target identification method and device
CN112771575A (en) Distance determination method, movable platform and computer readable storage medium
CN115047472B (en) Method, device, equipment and storage medium for determining laser radar point cloud layering
CN114966651A (en) Drivable region detection method, computer device, storage medium, and vehicle
US10782409B2 (en) Technologies for LIDAR based moving object detection
CN115980718B (en) Lens dirt detection method and device, electronic equipment and readable storage medium
CN111766600A (en) Photon counting laser radar adaptive noise judgment and filtering method and device
CN113589394A (en) Method and equipment for identifying target detection object based on photoelectric sensor
CN111723797B (en) Method and system for determining bounding box of three-dimensional target
CN117677862A (en) Pseudo image point identification method, terminal equipment and computer readable storage medium
CN113052886A (en) Method for acquiring depth information of double TOF cameras by adopting binocular principle
WO2023279225A1 (en) Point cloud processing method and apparatus for laser radar, and storage medium and terminal device
KR20220128787A (en) Method and apparatus for tracking an object using LIDAR sensor, and recording medium for recording program performing the method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination