WO2021253429A1 - Data processing method and apparatus, lidar, and storage medium - Google Patents

Data processing method and apparatus, lidar, and storage medium

Info

Publication number
WO2021253429A1
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud
point
points
edge
cloud data
Prior art date
Application number
PCT/CN2020/097196
Other languages
English (en)
Chinese (zh)
Inventor
朱晏辰
刘政
李延召
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080006485.9A (published as CN114080545A)
Priority to PCT/CN2020/097196 (published as WO2021253429A1)
Publication of WO2021253429A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00

Definitions

  • This application generally relates to the field of laser detection technology, and more specifically to a data processing method and apparatus, a lidar, and a storage medium.
  • Traditional mechanical rotary-scanning lidars produce regular point cloud patterns, and relatively simple methods can complete point cloud feature extraction well.
  • With the emergence of semi-solid-state and solid-state lidars, irregular point cloud patterns appear in lidar data, and many feature extraction methods are no longer applicable or perform noticeably worse.
  • According to an aspect of the present application, a data processing method is provided, comprising: acquiring point cloud data of the current frame; and acquiring feature points from the point cloud points included in the point cloud data, the feature points including plane points and edge points, and the edge points including surface-intersection edge points and/or jump edge points, where a surface-intersection edge point corresponds to a point on the boundary line of intersecting surfaces in the three-dimensional space, and a jump edge point corresponds to a point on the edge of an isolated surface in the three-dimensional space.
  • According to another aspect of the present application, a data processing device is provided, which includes a memory and a processor; the memory stores a computer program to be run by the processor, and the above data processing method is executed when the program runs.
  • According to another aspect of the present application, a lidar is provided, which includes: a transmitting end device for transmitting an optical pulse signal; a receiving end device for receiving an echo signal corresponding to the optical pulse signal; and a processor configured to obtain point cloud data based on the echo signal and to execute the above data processing method on the point cloud data.
  • According to another aspect of the present application, a storage medium is provided, on which a computer program is stored; the computer program, when run, executes the above data processing method.
  • By extracting the edge points in the point cloud data and classifying them in a more refined manner, the feature extraction of point cloud data becomes applicable to various irregular point cloud patterns, which improves the accuracy of point cloud data feature extraction.
  • Fig. 1 shows a schematic flowchart of a data processing method according to an embodiment of the present application.
  • Fig. 2 shows an exemplary schematic diagram of a surface-intersection edge point obtained in a data processing method according to an embodiment of the present application.
  • Fig. 3 shows an exemplary schematic diagram of a jump edge point obtained in a data processing method according to an embodiment of the present application.
  • Fig. 4 shows a schematic flow chart of obtaining plane points in a data processing method according to an embodiment of the present application.
  • FIG. 5 shows a schematic flow chart of obtaining surface-intersection edge points in a data processing method according to an embodiment of the present application.
  • Fig. 6 shows a schematic flow chart of obtaining jump edge points in a data processing method according to an embodiment of the present application.
  • Fig. 7 shows a schematic flow chart of acquiring small object edge points in a data processing method according to an embodiment of the present application.
  • Fig. 8 shows a schematic block diagram of a data processing device according to an embodiment of the present application.
  • Fig. 9 shows a schematic block diagram of a lidar according to an embodiment of the present application.
  • FIG. 10 shows a schematic structural diagram of a distance measuring device that can be used to collect point cloud data in a data processing method according to an embodiment of the present application.
  • FIG. 11 shows a schematic diagram of an embodiment in which a distance measuring device that can be used to collect point cloud data in a data processing method according to an embodiment of the present application adopts a coaxial optical path.
  • FIG. 12 shows a schematic diagram of a scanning pattern of the distance measuring device shown in FIG. 11.
  • FIG. 1 shows a schematic flowchart of a data processing method 100 according to an embodiment of the present application.
  • the data processing method 100 according to the embodiment of the present application may include the following steps:
  • In step S110, the point cloud data of the current frame is acquired.
  • In step S120, feature points are acquired from the point cloud points included in the point cloud data; the feature points include plane points and edge points, and the edge points include surface-intersection edge points and/or jump edge points. A surface-intersection edge point corresponds to a point on the boundary line of intersecting surfaces in the three-dimensional space, and a jump edge point corresponds to a point on the edge of an isolated surface in the three-dimensional space.
  • the feature extraction operation can be performed on the acquired frame of point cloud data, that is, feature points are extracted from the point cloud points included in the frame of point cloud data.
  • the extracted feature points include plane points and edge points.
  • The plane points can be points located on a plane in the real scene, and the edge points can be points located on the edges of planes, objects, thin rods, and the like in the real scene.
  • The edge points can include surface-intersection edge points and/or jump edge points.
  • A surface-intersection edge point may be a point corresponding to the boundary line of intersecting surfaces in the three-dimensional space, such as the point cloud point (scan point) A on the boundary line of the two surfaces S1 and S2 shown in FIG. 2.
  • A jump edge point may be a point corresponding to the edge of an isolated surface in the three-dimensional space, such as the point cloud point (scan point) B shown in FIG. 3.
  • FIG. 3 is a top view, in which S3 is one surface in the three-dimensional space and S4 is another; S3 does not intersect S4 and is therefore an isolated surface. When the laser beam emitted by the lidar passes the edge point B of S3 and lands on S4, the point cloud point formed on S4 is not an edge point, so the point B on the edge of S3 is called a jump edge point.
  • In the embodiment of the present application, when the edge points in the point cloud data are extracted, the edge points are classified in a more refined manner (into surface-intersection edge points and/or jump edge points), so that the feature extraction of point cloud data can be applied to various irregular point cloud patterns, improving the accuracy of point cloud data feature extraction.
  • FIG. 4 shows a schematic flowchart of a process 400 of obtaining a plane point in a data processing method according to an embodiment of the present application. As shown in FIG. 4, the process 400 may include the following steps:
  • In step S410, a group of point cloud points is acquired in time sequence from the point cloud data, and it is determined whether the acquired group of point cloud points meets a first preset condition.
  • In step S420, when the acquired group of point cloud points meets the first preset condition, the group of point cloud points is determined as plane point candidate points, and the next group of point cloud points is acquired to perform the same determination, where the next group of point cloud points includes at least one point cloud point of the previous group.
  • In step S430, a final plane point extraction result of the current frame point cloud data is obtained based on the determined plane point candidate points.
  • When the plane points are obtained from the point cloud data, the points can be judged group by group to improve efficiency. For example, a group of point cloud points can be acquired at a time based on a sliding window, where the number of point cloud points in each group depends on the size of the sliding window. In the embodiment of the present application, it can be determined whether the acquired group of point cloud points meets the first preset condition; if so, the group is determined as plane point candidate points, and the next group of point cloud points is acquired for the same judgment. After the entire point cloud pattern has been traversed and all plane point candidate points obtained, the final plane point extraction result is obtained based on the plane point candidate points.
  • The aforementioned first preset condition may include: the spatial distribution of the acquired group of point cloud points is approximately a straight line, and the group of point cloud points is approximately centrally symmetric about its middle point.
  • The expression "the spatial distribution of the acquired group of point cloud points is approximately a straight line" means that the distribution need not coincide exactly with a straight line; approximate coincidence is sufficient, and a suitable calculation can be used to determine whether the required degree of coincidence is reached.
  • Similarly, the expression "the group of point cloud points is approximately centrally symmetric about the middle point" means that the group need not be perfectly centrally symmetric about its middle point; it only needs to be symmetric to a certain degree, which can likewise be judged by calculation.
  • Naturally, a group of point cloud points that is exactly collinear or exactly centrally symmetric about the middle point also satisfies the first preset condition. Through the limitation of the first preset condition, a more accurate plane point acquisition result can be obtained.
  • In an embodiment, Principal Component Analysis (PCA) can be used to determine whether the spatial distribution of the acquired group of point cloud points is approximately a straight line.
  • Other suitable methods can also be used for this determination; a minimal sketch of the PCA-based check is given below.
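  • The following is a minimal illustrative sketch of the first preset condition check, not part of the original disclosure; the function name, the thresholds line_ratio and sym_tol, and the particular symmetry metric are assumptions chosen for demonstration:

```python
import numpy as np

def is_plane_point_candidate(window, line_ratio=0.95, sym_tol=0.1):
    """First preset condition on one sliding-window group (a sketch).

    window: (N, 3) array of consecutive point cloud points.
    line_ratio and sym_tol are hypothetical thresholds.
    """
    pts = np.asarray(window, dtype=float)
    centered = pts - pts.mean(axis=0)
    # PCA: the group is "approximately a straight line" when the largest
    # eigenvalue dominates the covariance spectrum.
    eigvals = np.linalg.eigvalsh(np.cov(centered.T))
    if eigvals[-1] < 1e-12 or eigvals[-1] / eigvals.sum() < line_ratio:
        return False
    # Approximate central symmetry about the middle point: points mirrored
    # in scan order should nearly cancel around the middle point.
    mid = pts[len(pts) // 2]
    span = np.linalg.norm(pts[-1] - pts[0]) + 1e-12
    for a, b in zip(pts, pts[::-1]):
        if np.linalg.norm((a - mid) + (b - mid)) / span > sym_tol:
            return False
    return True
```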
  • When the acquired group of point cloud points meets the first preset condition, the group is determined as plane point candidate points, and the next group of point cloud points is acquired to continue the judgment.
  • When judging the next group of point cloud points, the next group should include at least one point cloud point of the previous group.
  • In an embodiment, the determined plane point candidate points can be directly used as the final plane point extraction result of the current frame point cloud data.
  • In this way, the plane point extraction result can be obtained efficiently.
  • In another embodiment, the determined plane point candidate points may be sorted according to the degree to which they satisfy the first preset condition, and based on the sorting result, a portion of the plane point candidate points is selected as the final plane point extraction result of the current frame point cloud data.
  • For example, the first preset condition requires that the spatial distribution of the acquired group of point cloud points is approximately a straight line; the degree to which the spatial distribution coincides with a straight line can be obtained by calculation and compared with a preset condition, such as a threshold set for the degree of coincidence.
  • Likewise, the first preset condition requires the acquired group of point cloud points to be approximately centrally symmetric about the middle point; the degree of central symmetry can be obtained by calculation and compared with a preset condition, such as a threshold set for the degree of satisfaction.
  • The degree to which each acquired group of point cloud points meets the first preset condition is therefore available. In this embodiment, the plane point candidate points can be sorted according to this degree, and the candidate points with the highest degree of satisfaction are taken as the final plane point extraction result of the current frame point cloud data, so as to obtain a more accurate plane point extraction result.
  • In still another embodiment, the point cloud data of the current frame may be divided into several regions; within each region, the plane point candidate points are sorted according to the degree to which they satisfy the first preset condition, and a portion of the plane point candidate points selected from each region based on the sorting result is used as the final plane point extraction result of the current frame point cloud data.
  • The slight difference from the previous embodiment is that, instead of sorting all plane point candidate points of a frame of point cloud data to obtain the final plane point extraction result, the frame is first divided into several regions, the sorting is performed within each region, and the top-ranked plane point candidate points of each region are taken together as the final plane point extraction result. In this way, clustering of feature points can be avoided, so the obtained plane points are distributed more evenly over the point cloud pattern, which benefits the subsequent per-region processing after feature extraction.
  • the regions can be divided according to the exit angle of the laser radar.
  • For example, if the lidar scans through 360 degrees, every 30 degrees can be treated as one region, yielding 12 regions in total, and so on.
  • the entire point cloud pattern can be divided into grids, and each grid serves as a region.
  • the area can also be divided in any other suitable manner, as long as the finally extracted plane points can be distributed throughout the point cloud pattern instead of being concentrated in one place.
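  • A minimal sketch of such per-region selection follows (illustrative only; the angular binning into 12 sectors, the score array, and the parameters n_regions and top_k are assumptions):

```python
import numpy as np

def select_plane_points_per_region(candidates, scores, n_regions=12, top_k=20):
    """Spread plane points over the frame: bin candidates by horizontal
    emission angle, then keep the best-scoring candidates in each bin."""
    pts = np.asarray(candidates, dtype=float)
    scores = np.asarray(scores, dtype=float)
    # Horizontal angle of each point, folded into [0, 360) degrees;
    # n_regions = 12 corresponds to 30-degree sectors of a full scan.
    angles = np.degrees(np.arctan2(pts[:, 1], pts[:, 0])) % 360.0
    region = (angles // (360.0 / n_regions)).astype(int)
    selected = []
    for r in range(n_regions):
        idx = np.flatnonzero(region == r)
        # Keep the top_k candidates of this region, best score first.
        best = idx[np.argsort(scores[idx])[::-1][:top_k]]
        selected.extend(best.tolist())
    return pts[selected]
```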
  • When the acquired group of point cloud points does not meet the first preset condition, the window can be moved back by one point cloud point in the time sequence, and a group of point cloud points starting from the point to which it was moved back is acquired for the judgment.
  • For example, suppose each group includes 5 point cloud points. When a group consisting of the 5th to the 9th point cloud points (in the time sequence of all point cloud points) is judged and does not satisfy the aforementioned first preset condition, the window is moved back by one point cloud point; that is, a group consisting of the 4th to the 8th point cloud points is acquired for the judgment, and so on.
  • In an embodiment, before groups of point cloud points are acquired in time sequence from the point cloud data, the point cloud points of the current frame can be sorted by depth value, the median depth selected as a scene scale threshold, and the size of the sliding window determined based on that threshold.
  • Because the size of the sliding window is selected according to the scene scale (a large sliding window for a large scene and a small sliding window for a small scene), the plane point acquisition process can obtain accurate results while improving efficiency.
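  • A small illustrative sketch of this window sizing (the cut-off depth and the two window sizes are assumptions, not values from the original disclosure):

```python
import numpy as np

def sliding_window_size(points, small=5, large=9, depth_split=30.0):
    """Pick the sliding-window size from the scene scale: the median depth
    of the frame serves as the scene scale threshold."""
    depths = np.linalg.norm(np.asarray(points, dtype=float), axis=1)
    scene_scale = np.median(depths)   # median depth of the sorted frame
    return large if scene_scale > depth_split else small
```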
  • FIG. 5 shows a schematic flow chart of the process 500 of obtaining surface-intersection edge points in the data processing method according to an embodiment of the present application.
  • the process 500 may include operations of performing the following steps for the point cloud points in the point cloud data:
  • In step S510, it is determined whether the two groups of point cloud points in which the current point cloud point is located (one before it and one after it) meet the first preset condition.
  • In step S520, when the two groups of point cloud points in which the current point cloud point is located meet the first preset condition, it is determined whether the two groups of point cloud points meet a second preset condition.
  • In step S530, when the two groups of point cloud points meet the second preset condition, the current point cloud point is determined as a surface-intersection edge point.
  • In this way, it can be determined whether a point cloud point is a surface-intersection edge point.
  • For example, suppose each group includes 5 point cloud points and the current point cloud point is the 5th point cloud point; it is then determined whether the previous group of point cloud points (the 1st to the 5th point cloud points) and the latter group of point cloud points (the 5th to the 9th point cloud points) meet the first preset condition.
  • the foregoing plane point acquisition process and the surface intersection edge point acquisition process may be performed simultaneously or sequentially.
  • The second preset condition may include: the maximum distance between any two points within each of the two groups of point cloud points in which the current point cloud point is located satisfies a first threshold range; the angle between the direction vectors formed by the two groups of point cloud points at the current point cloud point satisfies a second threshold range; and the angles between those direction vectors and the exit direction of the current point cloud point satisfy a third threshold range.
  • For example, the first threshold range may be [a1, +∞); that is, if a point cloud point is a surface-intersection edge point, then within each of the two groups of point cloud points in which it is located, the maximum distance between any two points should be greater than or equal to the threshold a1.
  • For example, the second threshold range may be [b1, b2]; that is, if a point cloud point is a surface-intersection edge point, the angle between the direction vectors formed by the two groups of point cloud points in which it is located should be greater than or equal to b1 and less than or equal to b2.
  • The third threshold range may consist of a single range, in which case, if a point cloud point is a surface-intersection edge point, the angles between the direction vectors formed by the two groups of point cloud points in which it is located and the exit direction of the current point cloud point should all satisfy that range. Alternatively, the third threshold range may consist of two ranges, in which case the angle between the direction vector formed by the previous group of point cloud points and the exit direction of the current point cloud point should satisfy one of the two ranges, and the angle between the direction vector formed by the latter group of point cloud points and the exit direction of the current point cloud point should satisfy the other.
  • When the current point cloud point and the two groups of point cloud points in which it is located satisfy both the first and the second preset conditions, the point cloud point is determined as a surface-intersection edge point.
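  • As a concrete illustration of the second preset condition, the following sketch is an assumption for demonstration; the function names, the single-range variant of the third threshold range, and all numeric thresholds are hypothetical. It checks the three sub-conditions for a candidate point whose two groups already passed the first preset condition:

```python
import numpy as np

def angle_deg(u, v):
    """Angle between two vectors in degrees."""
    u = np.asarray(u, float) / np.linalg.norm(u)
    v = np.asarray(v, float) / np.linalg.norm(v)
    return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

def is_surface_intersection_edge(prev_group, next_group, exit_dir,
                                 a1=0.2, b_range=(30.0, 150.0),
                                 c_range=(60.0, 120.0)):
    pg, ng = np.asarray(prev_group, float), np.asarray(next_group, float)
    for g in (pg, ng):
        # Max pairwise distance within each group must reach the threshold
        # a1 (first threshold range [a1, +inf)).
        d = np.linalg.norm(g[:, None, :] - g[None, :, :], axis=-1)
        if d.max() < a1:
            return False
    v_prev, v_next = pg[-1] - pg[0], ng[-1] - ng[0]
    # Angle between the two direction vectors (second threshold range).
    if not (b_range[0] <= angle_deg(v_prev, v_next) <= b_range[1]):
        return False
    # Angles between each direction vector and the exit direction of the
    # current point (third threshold range, single-range variant).
    return all(c_range[0] <= angle_deg(v, exit_dir) <= c_range[1]
               for v in (v_prev, v_next))
```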
  • FIG. 6 shows a schematic flowchart of a process 600 of obtaining jump edge points in a data processing method according to an embodiment of the present application.
  • the process 600 may include operations of performing the following steps for the point cloud points in the point cloud data:
  • In step S610, it is determined whether the difference between the distances from the current point cloud point to the two points before and after it is greater than a predetermined threshold, where the one of the two points that is closer to the current point cloud point is defined as the near side point.
  • In step S620, when the difference between the distances from the current point cloud point to the two points before and after it is greater than the predetermined threshold, the current point cloud point is determined as a jump candidate point.
  • In step S630, among the two groups of point cloud points in which the jump candidate point is located, the point cloud points of the group containing the near side point are defined as the near-side group point cloud points, and it is determined whether the near-side group point cloud points satisfy a third preset condition.
  • In step S640, when the near-side group point cloud points meet the third preset condition, the jump candidate point is determined as a jump edge point.
  • In this way, it can be determined whether a point cloud point is a jump edge point.
  • The judgment is mainly based on the two point cloud points immediately before and after the current point cloud point (in time sequence) and on the two groups of point cloud points in which the current point cloud point is located. For ease of description, assume the current point cloud point is the 5th point cloud point in time sequence, so the point cloud point before it is the 4th and the point cloud point after it is the 6th.
  • If the 4th point cloud point is the closer of the two, it is called the near side point of the 5th point cloud point, and the 6th point cloud point is called the far side point of the 5th point cloud point.
  • Taking groups of 5 point cloud points as an example, the two groups in which the 5th point cloud point is located are the previous group (the 1st to the 5th point cloud points) and the latter group (the 5th to the 9th point cloud points).
  • The group containing the near side point (the 4th point cloud point), namely the previous group (the 1st to the 5th point cloud points), is called the near-side group of point cloud points, and the group containing the far side point (the 6th point cloud point), namely the latter group, is called the far-side group of point cloud points.
  • If the distance difference in step S610 exceeds the predetermined threshold, the 5th point cloud point may be determined as a jump candidate point. Further, once the 5th point cloud point has been determined to be a jump candidate point, it can be determined whether the aforementioned near-side group of point cloud points (the 1st to the 5th point cloud points) meets the third preset condition; if so, the 5th point cloud point can be determined as a jump edge point.
  • The third preset condition may include: the near-side group of point cloud points satisfies the first preset condition; the angle between the direction vector formed by the near-side group of point cloud points and the direction vector formed by the group of point cloud points on the other side of the jump candidate point (that is, the far-side group of point cloud points) satisfies a fourth threshold range; the maximum distance between any two points in the near-side group of point cloud points satisfies a fifth threshold range; and the angle between the direction vector formed by the near-side group of point cloud points and the exit direction of the jump candidate point satisfies a sixth threshold range.
  • Continuing the example in which the current point cloud point is the 5th point cloud point in time sequence, the near-side group of point cloud points (the 1st to the 5th point cloud points) satisfying the third preset condition means: the near-side group should meet the aforementioned first preset condition, that is, it should first qualify as plane point candidate points; the angle between the direction vectors of the near-side group and the far-side group (the 5th to the 9th point cloud points) should satisfy the fourth threshold range; the maximum distance between any two points in the near-side group should satisfy the fifth threshold range; and the angle between the direction vector formed by the near-side group and the exit direction of the jump candidate point (the 5th point cloud point) should satisfy the sixth threshold range.
  • For example, the fourth threshold range may be [b3, b4]; that is, if a point cloud point is a jump edge point, the angle between the direction vectors formed by the two groups of point cloud points in which it is located should be greater than or equal to b3 and less than or equal to b4.
  • For example, the fifth threshold range may be (−∞, a2]; that is, if a point cloud point is a jump edge point, the maximum distance between any two points in the near-side group of point cloud points in which it is located should be less than or equal to the threshold a2.
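  • Pulling the four sub-conditions together, a sketch of the third preset condition check might look as follows (illustrative; the threshold values and the injected first-preset-condition callable, e.g. the is_plane_point_candidate sketch above, are assumptions):

```python
import numpy as np

def near_side_group_ok(near_group, far_group, exit_dir, is_plane_candidate,
                       b4_range=(20.0, 160.0), a2=0.5,
                       b6_range=(30.0, 150.0)):
    """Third preset condition sketch (step S630) for a jump candidate."""
    def angle(u, v):
        u, v = u / np.linalg.norm(u), v / np.linalg.norm(v)
        return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

    near = np.asarray(near_group, float)
    far = np.asarray(far_group, float)
    # 1) The near-side group must itself satisfy the first preset condition.
    if not is_plane_candidate(near):
        return False
    v_near, v_far = near[-1] - near[0], far[-1] - far[0]
    # 2) Angle between near- and far-side direction vectors (fourth range).
    if not (b4_range[0] <= angle(v_near, v_far) <= b4_range[1]):
        return False
    # 3) Max pairwise distance within the near-side group (fifth range,
    # i.e. at most a2).
    d = np.linalg.norm(near[:, None] - near[None, :], axis=-1)
    if d.max() > a2:
        return False
    # 4) Angle between the near-side direction vector and the exit
    # direction of the jump candidate (sixth range).
    return b6_range[0] <= angle(v_near, exit_dir) <= b6_range[1]
```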
  • In an embodiment, the aforementioned third preset condition may further include: the one of the two points before and after the jump candidate point that is farther from it (that is, the far side point described above) is a non-zero point or a non-blind-zone zero point.
  • In this embodiment, a zero-point check is also performed on the far side point of the jump candidate point (a zero point being a point with coordinates (0, 0, 0)): if the far side point of the jump candidate point is a non-zero point (that is, not a zero point), the jump candidate point can be determined as a jump edge point.
  • If the far side point of the jump candidate point is a zero point, it is further determined whether that zero point lies in the blind zone of the point cloud detection device (such as a lidar) or corresponds to a point at infinity. If the zero point lies in the blind zone, the jump candidate point is not determined as a jump edge point; if the zero point corresponds to a point at infinity, the jump candidate point is determined as a jump edge point.
  • This embodiment avoids misclassifying jump edge points because of the blind zone of the point cloud detection device (such as a lidar).
  • In an embodiment, the current frame point cloud data may be traversed in advance to find and mark the zero points used in the foregoing determination.
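  • A compact sketch of the candidate test of steps S610 and S620, including the far-side zero-point check (illustrative; the threshold and the precomputed blind-zone mask are assumptions):

```python
import numpy as np

def is_jump_candidate(points, i, blind_zone_zero, jump_thresh=0.5):
    """Sketch of steps S610-S620 plus the far-side zero-point check.

    points: (N, 3) points of the frame in time order; i: index of the
    current point (0 < i < N-1). blind_zone_zero: boolean array marking
    zero points (coordinates (0, 0, 0)) that were found to lie in the
    detector blind zone during a prior pass over the frame.
    """
    pts = np.asarray(points, dtype=float)
    d_prev = np.linalg.norm(pts[i] - pts[i - 1])
    d_next = np.linalg.norm(pts[i] - pts[i + 1])
    if abs(d_prev - d_next) <= jump_thresh:
        return False                     # S610/S620: not a jump candidate
    far_idx = i + 1 if d_next > d_prev else i - 1
    if np.allclose(pts[far_idx], 0.0) and blind_zone_zero[far_idx]:
        return False                     # zero point inside the blind zone
    return True                          # kept for the near-side group test
```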
  • the obtained edge points may also include edge points of small objects.
  • the edge point of a small object may correspond to a point on the edge of a small object in a three-dimensional space, such as a point on the edge of a small object such as a thin rod, a tree trunk, and so on.
  • the obtained edge points are classified in a more detailed manner, which can further improve the accuracy of point cloud data feature extraction.
  • FIG. 7 shows a schematic flowchart of a process 700 of obtaining small object edge points in a data processing method according to an embodiment of the present application. As shown in FIG. 7, the process 700 may include the following steps:
  • In step S710, a predetermined number of consecutive point cloud points in the point cloud data of the current frame are determined as small object edge point candidate points, where the maximum distance between any two of the predetermined number of point cloud points satisfies a seventh threshold range, and the predetermined number is less than the number of point cloud points in one of the aforementioned groups.
  • In step S720, based on the edge point extraction result of the previous frame of point cloud data, it is determined whether the small object edge point candidate points in the current frame point cloud data and the edge points in the previous frame point cloud data together form a slender edge.
  • In step S730, when the small object edge point candidate points in the current frame point cloud data and the edge points in the previous frame point cloud data together form a slender edge, the small object edge point candidate points in the current frame point cloud data are determined as small object edge points.
  • That is, a run of consecutive point cloud points whose number is smaller than that of the aforementioned group of point cloud points may be determined as small object edge point candidate points. For example, following the previous example in which a group includes 5 point cloud points, 4, 3, or 2 consecutive point cloud points (an isolated point cloud cluster) can be determined as small object edge point candidate points. Further, the feature point extraction results (especially the edge point extraction results) of the previous frame of point cloud data can be combined to determine whether these candidate points are really small object edge points.
  • If the small object edge point candidate points in the current frame point cloud data and the edge points in the previous frame point cloud data together form a slender edge, the small object edge point candidate points in the current frame point cloud data are determined as small object edge points.
  • the above shows the process of obtaining the edge points of the small object in the data processing method according to the embodiment of the present application. It should be understood that the process is only exemplary, and any other suitable method may be used to obtain the edge points of the small object.
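  • The cross-frame confirmation of step S720 could be sketched as follows (illustrative assumptions: the linking distance, the elongation test via PCA, and all parameter names are hypothetical):

```python
import numpy as np

def confirm_small_object_edges(cands, prev_edges, link_dist=0.5, elong=3.0):
    """Keep a candidate cluster only if, together with last frame's edge
    points nearby, it forms a slender (elongated) edge.

    cands: list of (M, 3) candidate clusters from the current frame;
    prev_edges: (K, 3) edge points from the previous frame.
    """
    prev_edges = np.asarray(prev_edges, dtype=float)
    confirmed = []
    for cluster in cands:
        cluster = np.asarray(cluster, dtype=float)
        # Gather previous-frame edge points close to this cluster.
        d = np.linalg.norm(prev_edges[:, None] - cluster[None, :], axis=-1)
        near = prev_edges[d.min(axis=1) < link_dist]
        if len(near) == 0:
            continue
        merged = np.vstack([cluster, near])
        # "Slender": the PCA extent along the main axis dominates the rest.
        eigvals = np.linalg.eigvalsh(np.cov((merged - merged.mean(0)).T))
        if eigvals[-1] > elong * max(eigvals[-2], 1e-12):
            confirmed.append(cluster)
    return confirmed
```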
  • corner points may be obtained from the point cloud points included in the point cloud data based on the obtained edge points.
  • The corner points can also be regarded as a kind of feature point. Obtaining the corner points based on the edge points further extends the applicability of the method according to the embodiments of the present application to irregular point cloud patterns, and at the same time provides richer information for the processing that follows feature point extraction, which benefits that subsequent processing.
  • Acquiring corner points from the point cloud points included in the point cloud data based on the edge points may include performing the following operations for each edge point: analyzing the neighborhood information of the edge point to determine the neighborhood points that form the same line; searching, based on the neighborhood points, for other lines that intersect that line; and determining the intersection of the line and the other lines as a corner point.
  • Specifically, the coordinates of at least two neighborhood points on the same line can be recorded; then a line feature that intersects the line (for example, perpendicularly) is searched for, and if one exists, the intersection of the two lines is calculated and marked as a characteristic corner point.
  • In addition, the coordinates of the corner point, together with the direction vectors and magnitudes of the same line and the other line, can be recorded as a descriptor of the corner point.
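  • A minimal sketch of computing such a corner point from two fitted neighborhood lines follows (illustrative only; the closest-point intersection formula is standard 3D line geometry, and the descriptor layout is an assumption):

```python
import numpy as np

def corner_from_lines(p1, d1, p2, d2, tol=1e-6):
    """Intersect two neighborhood lines (each given by a point p and a
    direction d) and package a corner descriptor. For nearly skew lines
    the midpoint of the shortest connecting segment is used."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d1 = np.asarray(d1, float); d1 /= np.linalg.norm(d1)
    d2 = np.asarray(d2, float); d2 /= np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a * c - b * b                 # ~0 when the lines are parallel
    if abs(denom) < tol:
        return None                       # no well-defined intersection
    s = (b * (d2 @ w0) - c * (d1 @ w0)) / denom
    t = (a * (d2 @ w0) - b * (d1 @ w0)) / denom
    q1, q2 = p1 + s * d1, p2 + t * d2     # closest points on the two lines
    return {
        "corner": 0.5 * (q1 + q2),        # corner point coordinates
        "dir1": d1, "dir2": d2,           # direction vectors of both lines
        "gap": float(np.linalg.norm(q1 - q2)),  # 0 if the lines truly meet
    }
```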
  • the point cloud data of the current frame may be traversed to obtain and mark the noise points in the point cloud data of the current frame.
  • In this way, feature points can be obtained from the point cloud points other than the noise points in the point cloud data, which not only reduces the amount of calculation but also avoids erroneous extraction results caused by noise interference, thereby further improving the accuracy of feature point extraction.
  • subsequent processing may be performed based on the feature points obtained from the point cloud points included in the point cloud data, such as at least one of mapping, object positioning, and object recognition. Based on the previously acquired feature points with high accuracy, the accuracy and reliability of subsequent processing are also improved.
  • the data processing method enables the feature extraction of point cloud data to be applicable to various irregular point cloud patterns, and improves the accuracy of point cloud data feature extraction.
  • FIG. 8 shows a schematic block diagram of a data processing device 800 for point cloud data according to an embodiment of the present application.
  • the data processing device 800 includes a memory 810 and a processor 820.
  • the memory 810 stores a program for implementing corresponding steps in the data processing method according to the embodiment of the present application.
  • the processor 820 is configured to run a program stored in the memory 810 to execute corresponding steps of the data processing method according to the embodiment of the present application.
  • Those skilled in the art can understand the operations performed by the processor 820 in combination with the foregoing description. For the sake of brevity, details are not described herein again.
  • FIG. 9 shows a schematic block diagram of a lidar 900 according to an embodiment of the present application.
  • the lidar 900 includes a transmitting end device 910, a receiving end device 920 and a processor 930.
  • the transmitting end device 910 is used to transmit an optical pulse signal.
  • the receiving end device 920 is configured to receive the echo signal corresponding to the optical pulse signal.
  • the processor 930 is configured to obtain point cloud data based on the echo signal, and execute the data processing method described above according to the embodiment of the present application on the point cloud data.
  • Those skilled in the art can understand the operations performed by the processor 930 in combination with the foregoing description. For the sake of brevity, details are not described herein again.
  • a storage medium is also provided, and program instructions are stored on the storage medium, and the program instructions are used to execute the data processing method of the embodiment of the present application when the program instructions are executed by a computer or a processor.
  • the storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a portable compact disk read-only memory (CD-ROM), USB memory, or any combination of the above storage media.
  • the computer-readable storage medium may be any combination of one or more computer-readable storage media.
  • By classifying the edge points in a more refined manner when extracting them, the feature extraction of point cloud data can be applied to various irregular point cloud patterns, improving the accuracy of point cloud data feature extraction.
  • the above point cloud data may be point cloud data obtained by the distance measuring device.
  • the distance measuring device may be electronic equipment such as laser radar and laser distance measuring equipment.
  • the distance measuring device is used to sense external environmental information, for example, distance information, orientation information, reflection intensity information, speed information, etc. of environmental targets.
  • One point cloud point in the point cloud data may include at least one of the external environment information measured by the distance measuring device.
  • The distance measuring device can detect the distance from a detected object to the distance measuring device by measuring the time of light propagation between the device and the object, that is, the time of flight (TOF).
  • Alternatively, the distance measuring device can use other techniques to detect the distance from the detected object to the device, such as ranging based on phase shift measurement or ranging based on frequency shift measurement; no restriction is imposed here.
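  • For the TOF case, the relationship between round-trip time and distance is simply distance = c * Δt / 2; a minimal illustrative helper (the function name and example values are assumptions):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(t_emit_s, t_receive_s):
    """Time-of-flight ranging: the pulse travels to the object and back,
    so the one-way distance is half the round-trip path length."""
    return C * (t_receive_s - t_emit_s) / 2.0

# Example: a round trip of about 666.7 ns corresponds to roughly 100 m.
print(tof_distance(0.0, 666.7e-9))
```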
  • the distance measuring device that generates the point cloud data mentioned herein will be described as an example in conjunction with the distance measuring device 1000 shown in FIG. 10.
  • the distance measuring device 1000 may include a transmitting circuit 1010, a receiving circuit 1020, a sampling circuit 1030, and an arithmetic circuit 1040.
  • the transmitting circuit 1010 may emit a light pulse sequence (for example, a laser pulse sequence).
  • the receiving circuit 1020 can receive the light pulse sequence reflected by the object to be detected, and perform photoelectric conversion on the light pulse sequence to obtain an electrical signal. After processing the electrical signal, the electrical signal can be output to the sampling circuit 1030.
  • the sampling circuit 1030 may sample the electrical signal to obtain the sampling result.
  • the arithmetic circuit 1040 may determine the distance between the distance measuring device 1000 and the detected object based on the sampling result of the sampling circuit 1030.
  • the distance measuring device 1000 may further include a control circuit 1050, which can control other circuits, for example, can control the working time of each circuit and/or set parameters for each circuit.
  • It should be understood that although the distance measuring device shown in FIG. 10 includes a transmitting circuit, a receiving circuit, a sampling circuit, and an arithmetic circuit for emitting one beam for detection, the embodiment of the present application is not limited to this:
  • the number of any one of the transmitting circuit, the receiving circuit, the sampling circuit, and the arithmetic circuit may also be at least two, used to emit at least two light beams in the same direction or in different directions, where the at least two light beams may be emitted simultaneously or at different times.
  • the light-emitting chips in the at least two transmitting circuits are packaged in the same module.
  • For example, each transmitting circuit includes a laser emitting chip, and the dies of the laser emitting chips in the at least two transmitting circuits are packaged together and housed in the same packaging space.
  • the distance measuring device 1000 may further include a scanning module for changing the propagation direction of at least one laser pulse sequence emitted by the transmitting circuit.
  • A module including the transmitting circuit 1010, the receiving circuit 1020, the sampling circuit 1030, and the arithmetic circuit 1040, or a module that further includes the control circuit 1050, may be referred to as a ranging module;
  • the ranging module can be independent of other modules, for example, the scanning module.
  • a coaxial optical path can be used in the distance measuring device, that is, the light beam emitted by the distance measuring device and the reflected light beam share at least part of the optical path in the distance measuring device.
  • the distance measuring device may also adopt an off-axis optical path, that is, the light beam emitted by the distance measuring device and the reflected light beam are respectively transmitted along different optical paths in the distance measuring device.
  • FIG. 11 shows a schematic diagram of an embodiment in which the distance measuring device of the present application adopts a coaxial optical path.
  • the ranging device 1100 includes a ranging module 1101.
  • The ranging module 1101 includes a transmitter 1103 (which may include the above-mentioned transmitting circuit), a collimating element 1104, a detector 1105 (which may include the above-mentioned receiving circuit, sampling circuit, and arithmetic circuit), and an optical path changing element 1106.
  • the ranging module 1101 is used to emit a light beam, receive the return light, and convert the return light into an electrical signal.
  • the transmitter 1103 can be used to transmit a sequence of light pulses.
  • the transmitter 1103 can emit a sequence of laser pulses.
  • the laser beam emitted by the transmitter 1103 is a narrow-bandwidth beam with a wavelength outside the visible light range.
  • The collimating element 1104 is arranged on the exit light path of the transmitter 1103 and is used to collimate the light beam emitted from the transmitter 1103 into parallel light and output it to the scanning module.
  • The collimating element 1104 is also used to converge at least a part of the return light reflected by the detected object.
  • the collimating element 1104 may be a collimating lens or other elements capable of collimating light beams.
  • The optical path changing element 1106 is used to combine the transmitting light path and the receiving light path in the distance measuring device before the collimating element 1104, so that the transmitting and receiving light paths can share the same collimating element, making the optical path more compact.
  • the transmitter 1103 and the detector 1105 may use their respective collimating elements, and the optical path changing element 1106 is arranged on the optical path behind the collimating element.
  • In an embodiment, the optical path changing element can use a small-area reflector to combine the transmitting light path and the receiving light path.
  • In another embodiment, the optical path changing element may also use a reflector with a through hole, where the through hole transmits the emitted light of the transmitter 1103 and the reflector reflects the return light to the detector 1105. This reduces the blocking of the return light by the mount of the small reflector that occurs when a small reflector is used.
  • In the embodiment shown in FIG. 11, the optical path changing element is offset from the optical axis of the collimating element 1104. In some other implementations, the optical path changing element may also be located on the optical axis of the collimating element 1104.
  • the distance measuring device 1100 further includes a scanning module 1102.
  • the scanning module 1102 is placed on the exit light path of the distance measuring module 1101.
  • the scanning module 1102 is used to change the transmission direction of the collimated beam 1119 emitted by the collimating element 1104 and project it to the external environment, and project the returned light to the collimating element 1104 .
  • the returned light is converged on the detector 1105 via the collimating element 1104.
  • The scanning module 1102 may include at least one optical element for changing the propagation path of the light beam, where the optical element may change the propagation path by, for example, reflecting, refracting, or diffracting the light beam.
  • the scanning module 1102 includes a lens, a mirror, a prism, a galvanometer, a grating, a liquid crystal, an optical phased array (Optical Phased Array), or any combination of the foregoing optical elements.
  • at least part of the optical element is moving, for example, the at least part of the optical element is driven to move by a driving module, and the moving optical element can reflect, refract, or diffract the light beam to different directions at different times.
  • the multiple optical elements of the scanning module 1102 can rotate or vibrate around a common axis 1109, and each rotating or vibrating optical element is used to continuously change the propagation direction of the incident light beam.
  • the multiple optical elements of the scanning module 1102 may rotate at different speeds or vibrate at different speeds.
  • at least part of the optical elements of the scanning module 1102 may rotate at substantially the same rotation speed.
  • the multiple optical elements of the scanning module may also rotate around different axes.
  • the multiple optical elements of the scanning module may also rotate in the same direction or in different directions; or vibrate in the same direction, or vibrate in different directions, which is not limited herein.
  • the scanning module 1102 includes a first optical element 1114 and a driver 1116 connected to the first optical element 1114.
  • The driver 1116 is used to drive the first optical element 1114 to rotate around the rotation axis 1109, which causes the first optical element 1114 to project the collimated beam 1119 in different directions.
  • the angle between the direction of the collimated light beam 1119 changed by the first optical element and the rotation axis 1109 changes as the first optical element 1114 rotates.
  • the first optical element 1114 includes a pair of opposite non-parallel surfaces through which the collimated light beam 1119 passes.
  • the first optical element 1114 includes a prism whose thickness varies in at least one radial direction.
  • In an embodiment, the first optical element 1114 includes a wedge prism that refracts the collimated beam 1119.
  • the scanning module 1102 further includes a second optical element 1115, the second optical element 1115 rotates around the rotation axis 1109, and the rotation speed of the second optical element 1115 is different from the rotation speed of the first optical element 1114.
  • the second optical element 1115 is used to change the direction of the light beam projected by the first optical element 1114.
  • the second optical element 1115 is connected to another driver 1117, and the driver 1117 drives the second optical element 1115 to rotate.
  • The first optical element 1114 and the second optical element 1115 can be driven by the same driver or by different drivers, so that their rotation speeds and/or rotation directions differ, thereby projecting the collimated light beam 1119 in different directions in the external space and scanning a larger spatial range.
  • the controller 1118 controls the drivers 1116 and 1117 to drive the first optical element 1114 and the second optical element 1115, respectively.
  • the rotational speeds of the first optical element 1114 and the second optical element 1115 can be determined according to the expected scanning area and pattern in actual applications.
  • the drivers 1116 and 1117 may include motors or other drivers.
  • the second optical element 1115 includes a pair of opposite non-parallel surfaces through which the light beam passes. In one embodiment, the second optical element 1115 includes a prism whose thickness varies along at least one radial direction. In one embodiment, the second optical element 1115 includes a wedge prism.
  • the scanning module 1102 further includes a third optical element (not shown) and a driver for driving the third optical element to move.
  • the third optical element includes a pair of opposite non-parallel surfaces, and the light beam passes through the pair of surfaces.
  • the third optical element includes a prism whose thickness varies in at least one radial direction.
  • the third optical element includes a wedge prism. At least two of the first, second, and third optical elements rotate at different rotation speeds and/or rotation directions.
  • each optical element in the scanning module 1102 can project light to different directions, such as the direction of the light 1111 and the direction of the light 1113, so that the space around the distance measuring device 1100 is scanned.
  • FIG. 12 is a schematic diagram of a scanning pattern of the distance measuring device 1100. It is understandable that when the speed of the optical element in the scanning module changes, the scanning pattern will also change accordingly.
  • When the light 1111 projected by the scanning module 1102 hits the detection object 1110, a part of the light is reflected by the detection object 1110 back to the distance measuring device 1100 in a direction opposite to that of the projected light 1111.
  • the return light 1112 reflected by the detection object 1110 is incident on the collimating element 1104 after passing through the scanning module 1102.
  • the detector 1105 and the transmitter 1103 are placed on the same side of the collimating element 1104, and the detector 1105 is used to convert at least part of the return light passing through the collimating element 1104 into electrical signals.
  • an anti-reflection coating is plated on each optical element.
  • the thickness of the antireflection coating is equal to or close to the wavelength of the light beam emitted by the transmitter 1103, which can increase the intensity of the transmitted light beam.
  • In some embodiments, a filter layer is plated on the surface of an element located on the beam propagation path in the distance measuring device, or a filter is provided on the beam propagation path, to transmit at least the waveband of the beam emitted by the transmitter 1103 and reflect other wavebands, so as to reduce the noise that ambient light causes at the receiver.
  • the transmitter 1103 may include a laser diode through which nanosecond laser pulses are emitted.
  • The laser pulse receiving time can be determined, for example, by detecting the rising edge time and/or the falling edge time of the electrical signal pulse.
  • In this way, the distance measuring device 1100 can calculate the TOF 1107 using the pulse receiving time information and the pulse sending time information, so as to determine the distance between the detection object 1110 and the distance measuring device 1100.
  • the distance and orientation detected by the distance measuring device 1100 can be used for remote sensing, obstacle avoidance, surveying and mapping, modeling, navigation, and so on.
  • the distance measuring device of the embodiment of the present application can be applied to a mobile platform, and the distance measuring device can be installed on the platform body of the mobile platform.
  • a mobile platform with a distance measuring device can measure the external environment, for example, measuring the distance between the mobile platform and obstacles for obstacle avoidance and other purposes, and for two-dimensional or three-dimensional surveying and mapping of the external environment.
  • the mobile platform includes at least one of an unmanned aerial vehicle, a car, a remote control car, a robot, and a camera.
  • When the ranging device is applied to an unmanned aerial vehicle, the platform body is the fuselage of the unmanned aerial vehicle.
  • When the distance measuring device is applied to a car, the platform body is the body of the car.
  • The car can be a self-driving car or a semi-self-driving car; there is no restriction here.
  • When the distance measuring device is applied to a remote control car, the platform body is the body of the remote control car.
  • When the distance measuring device is applied to a robot, the platform body is the robot.
  • When the distance measuring device is applied to a camera, the platform body is the camera itself.
  • the disclosed device and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • The division into units is only a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components can be combined or integrated into another device, or some features can be ignored or not implemented.
  • the various component embodiments of the present application may be implemented by hardware, or by software modules running on one or more processors, or by a combination of them.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some modules according to the embodiments of the present application.
  • This application can also be implemented as a device program (for example, a computer program and a computer program product) for executing part or all of the methods described herein.
  • Such a program for implementing the present application may be stored on a computer-readable storage medium, or may have the form of one or more signals.
  • Such a signal can be downloaded from an Internet website, or provided on a carrier signal, or provided in any other form.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

Provided are a data processing method and apparatus, and a storage medium. The method comprises: acquiring point cloud data of the current frame (S110); and acquiring feature points from the point cloud points included in the point cloud data, the feature points comprising plane points and edge points, and the edge points comprising surface-intersection edge points and/or jump edge points (S120). The surface-intersection edge points correspond to points on the boundary line of intersecting surfaces in three-dimensional space, and the jump edge points correspond to points on the edge of an isolated surface in three-dimensional space. By extracting the edge points in the point cloud data, the edge points are classified in a more refined manner, so that the feature extraction of point cloud data can be applied to various irregular point cloud patterns, improving the accuracy of point cloud data feature extraction.
PCT/CN2020/097196 2020-06-19 2020-06-19 Data processing method and apparatus, lidar, and storage medium WO2021253429A1

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080006485.9A 2020-06-19 2020-06-19 Data processing method and apparatus, lidar, and storage medium (CN114080545A, zh)
PCT/CN2020/097196 2020-06-19 2020-06-19 Data processing method and apparatus, lidar, and storage medium (WO2021253429A1, fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/097196 2020-06-19 2020-06-19 Data processing method and apparatus, lidar, and storage medium (WO2021253429A1, fr)

Publications (1)

Publication Number Publication Date
WO2021253429A1 (fr)

Family

ID=79269023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/097196 2020-06-19 2020-06-19 Data processing method and apparatus, lidar, and storage medium (WO2021253429A1, fr)

Country Status (2)

Country Link
CN (1) CN114080545A (fr)
WO (1) WO2021253429A1 (fr)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102136155A (zh) * 2010-01-27 2011-07-27 首都师范大学 Object facade vectorization method and system based on three-dimensional laser scanning data
CN104063860A (zh) * 2014-06-12 2014-09-24 北京建筑大学 Method for refining laser point cloud edges
CN106772425A (zh) * 2016-11-25 2017-05-31 北京拓维思科技有限公司 Data processing method and device
US20180211399A1 (en) * 2017-01-26 2018-07-26 Samsung Electronics Co., Ltd. Modeling method and apparatus using three-dimensional (3d) point cloud
CN109752701A (zh) * 2019-01-18 2019-05-14 中南大学 Road edge detection method based on laser point cloud
CN110223379A (zh) * 2019-06-10 2019-09-10 于兴虎 Three-dimensional point cloud reconstruction method based on lidar
CN110570511A (zh) * 2018-06-06 2019-12-13 阿里巴巴集团控股有限公司 Point cloud data processing method, device, system, and storage medium
CN110823252A (zh) * 2019-11-06 2020-02-21 大连理工大学 Automatic calibration method for multi-line lidar and monocular vision


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115308763A (zh) * 2022-07-06 2022-11-08 北京科技大学 Ice hockey elbow-pad angle measurement method based on lidar three-dimensional point cloud
CN115308763B (zh) * 2022-07-06 2023-08-22 北京科技大学 Ice hockey elbow-pad angle measurement method based on lidar three-dimensional point cloud

Also Published As

Publication number Publication date
CN114080545A (zh) 2022-02-22

Similar Documents

Publication Publication Date Title
WO2021253430A1 Absolute pose determination method, electronic device, and mobile platform
Liu et al. TOF lidar development in autonomous vehicle
WO2020243962A1 Object detection method, electronic device, and mobile platform
WO2021072710A1 Point cloud fusion method and system for a moving object, and computer storage medium
WO2021051281A1 Point cloud noise filtering method, distance measuring device, system, storage medium, and mobile platform
CN108132464A Solid-state area array lidar detection method
CN209728172U Indoor area measuring instrument
WO2021239054A1 Space measurement apparatus, method, and device, and computer-readable storage medium
US11703588B2 Reflection object position calculating device, reflection object position calculating method, and reflection object position calculating program
WO2021062581A1 Road marking recognition method and apparatus
WO2022198637A1 Point cloud noise filtering method and system, and mobile platform
US20210333375A1 Time measurement correction method and device
CN111999744A Multi-directional detection and multi-angle intelligent obstacle avoidance method for an unmanned aerial vehicle
WO2020215252A1 Point cloud denoising method for a distance measuring device, distance measuring device, and mobile platform
WO2021253429A1 Data processing method and apparatus, lidar, and storage medium
WO2021232227A1 Point cloud frame construction method, target detection method, ranging apparatus, mobile platform, and storage medium
WO2020237663A1 Multi-channel lidar point cloud interpolation method and ranging apparatus
WO2020155159A1 Method for increasing point cloud sampling density, point cloud scanning system, and readable storage medium
WO2020177076A1 Method and apparatus for calibrating the initial state of a detection apparatus
WO2020133384A1 Laser ranging device and mobile platform
US20210341588A1 Ranging device and mobile platform
WO2022170535A1 Distance measurement method, distance measuring device, system, and computer-readable storage medium
CN114777761A Cleaning machine and map construction method
WO2020220275A1 Detection circuit, detection method, ranging apparatus, and mobile platform
WO2020155142A1 Point cloud resampling method, device, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20940568

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20940568

Country of ref document: EP

Kind code of ref document: A1