CN115713600A - Method and device for generating digital elevation model of automatic driving scene - Google Patents


Info

Publication number: CN115713600A
Application number: CN202211486451.1A
Authority: CN (China)
Prior art keywords: slice, topographic, point, points, position information
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 王方建
Current and original assignee: Beijing Yikong Zhijia Technology Co Ltd
Application filed by Beijing Yikong Zhijia Technology Co Ltd; priority to CN202211486451.1A

Abstract

The method divides terrain points into corresponding slices, generates a slice unit for each slice, and records in each slice unit the elevation values of all terrain points in the corresponding slice. The digital elevation model therefore contains only the elevation values of the terrain points, not their plane position information, which effectively reduces the storage space the model occupies. In addition, because the order in which the elevation values of the terrain points in a slice unit are stored is associated with the positions of those points within the slice, the plane coordinate information of each terrain point can be restored from the storage order of its elevation value, so that the complete spatial information of the terrain points can be acquired.

Description

Method and device for generating digital elevation model of automatic driving scene
Technical Field
The disclosure relates to the technical field of automatic driving, and in particular relates to a method and a device for generating a digital elevation model of an automatic driving scene.
Background
With the development of automatic driving technology, laser radar hardware has steadily improved while its cost has fallen. Laser radar mounted on vehicles, aircraft, or other platforms can acquire high-density, high-precision point cloud data of a target area at low cost and high efficiency. The acquired point cloud data can be used to generate a digital elevation model, providing basic terrain data support for high-precision maps, terrain analysis, decision planning, and other tasks in an automatic driving scenario. However, the digital elevation model in the related art includes both the plane position information and the elevation value of each point, and therefore occupies a large amount of storage space.
Disclosure of Invention
In view of this, the embodiments of the present disclosure provide a method and an apparatus for generating a digital elevation model of an automatic driving scene.
In a first aspect of the embodiments of the present disclosure, a method for generating a digital elevation model of an automatic driving scene is provided, where the method includes: acquiring point cloud data of a target operation area of an automatic driving vehicle, and determining plane position information and elevation values of a plurality of topographic points in the target operation area based on the point cloud data; determining slices of the corresponding topographic points in a plurality of pre-divided slices based on the plane position information of each topographic point, wherein different slices correspond to different plane position ranges; generating a digital elevation model of the target operation area based on the elevation value of each topographic point and the slice to which each topographic point belongs, and storing the digital elevation model; the digital elevation model comprises a plurality of slice units, each slice unit corresponds to one slice and is used for recording the elevation value of each topographic point in the corresponding slice; the storage sequence of the elevation values of all topographic points in the same slice unit in the slice unit is related to the positions of all topographic points in the slice.
In a second aspect of the disclosed embodiments, there is provided an apparatus for generating a digital elevation model of an autonomous driving scenario, the apparatus comprising: the system comprises a first determination module, a second determination module and a third determination module, wherein the first determination module is used for acquiring point cloud data of a target operation area of an automatic driving vehicle and determining plane position information and elevation values of a plurality of topographic points in the target operation area based on the point cloud data; the second determining module is used for determining slices of the corresponding topographic points in the pre-divided multiple slices based on the plane position information of each topographic point, and different slices correspond to different plane position ranges; the storage module is used for generating a digital elevation model of the target operation area based on the elevation value of each topographic point and the slice to which each topographic point belongs, and storing the digital elevation model; the digital elevation model comprises a plurality of slice units, each slice unit corresponds to one slice and is used for recording the elevation value of each topographic point in the corresponding slice; the storage sequence of the elevation values of all topographic points in the same slice unit in the slice unit is related to the positions of all topographic points in the slice.
In a third aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, which when executed by a processor implements the method of the first aspect described above.
In a fourth aspect of the embodiments of the present disclosure, there is provided an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the method of the first aspect when executing the program.
The technical scheme adopted by the embodiments of the present disclosure can achieve at least the following beneficial effects: the terrain points are divided into corresponding slices, a slice unit is generated for each slice, and the elevation values of all terrain points in the corresponding slice are recorded in each slice unit, so that the digital elevation model contains only the elevation values of the terrain points and not their plane position information, effectively reducing the storage space the model occupies. In addition, since the storage order of the elevation values of the terrain points within a slice unit is associated with the positions of those points in the slice to which they belong, the plane coordinate information of each terrain point can be restored from the storage order of its elevation value, and the complete spatial information of the terrain points can then be acquired from the restored plane coordinates and the stored elevation values. That is, the embodiments of the present disclosure only need to store the elevation values of the terrain points in the digital elevation model to obtain the same effect as storing both the plane position information and the elevation values.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed for the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings can be obtained by those skilled in the art without inventive efforts.
Fig. 1 is a schematic diagram of an application scenario in one embodiment of the present disclosure.
FIG. 2 is a flow chart of a method of generating a digital elevation model for an autonomous driving scenario in one embodiment of the disclosure.
Fig. 3 is a schematic illustration of a slice in one embodiment of the present disclosure.
FIG. 4 is a schematic illustration of a digital elevation model in one embodiment of the disclosure.
FIG. 5 is a schematic diagram of an incremental update process in one embodiment of the present disclosure.
FIG. 6 is a block diagram of an apparatus for generating a digital elevation model for an autonomous driving scenario in one embodiment of the disclosure.
Fig. 7 is a schematic diagram of an electronic device in one embodiment of the disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination," depending on the context.
With the development of the automatic driving industry, laser radar (LiDAR) mounted on vehicles, aircraft, or other platforms can acquire high-density, high-precision point cloud data of a target area at low cost and high efficiency. The collected point cloud data may be used to generate a digital elevation model (DEM). A digital elevation model digitally simulates a terrain surface through a finite set of terrain elevations, that is, by rasterizing the terrain surface and recording the elevation value of each grid point, so as to represent the terrain features. The digital elevation model can provide basic terrain data support for high-precision maps, terrain analysis, decision planning, and the like in automatic driving scenarios.
Fig. 1 is a schematic diagram of an application scenario of the embodiment of the present disclosure. An autonomous vehicle 101 operating in a target working area R (e.g., a mine) may travel between loading point S1 and unloading point S2 along a pre-planned travel path tr to transport coal from loading point S1 to unloading point S2. A laser radar 101a may be installed on the autonomous vehicle 101 and may collect point cloud data of the target working area R while the vehicle moves. The point cloud data may include three-dimensional coordinate information of any point H within the sensing range of the laser radar 101a, including but not limited to ground points on the travel path tr, points on the tree 102 in the target working area R, points on the pedestrian 103 in the target working area R, points on the coal pile 104 at loading point S1, and the like. A processing unit may be deployed on the autonomous vehicle 101 to generate a digital elevation model of the target working area R from the point cloud data collected by the laser radar 101a. Alternatively, the autonomous vehicle 101 may send the collected point cloud data to the cloud, where the digital elevation model of the target working area R is generated from the received data. It is to be understood that the above application scenario is exemplary only and is not intended to limit the present disclosure; the solution of the embodiment of the present disclosure may also be used in other application scenarios.
In the related art, the digital elevation model includes both the plane position information and the elevation value of each point in the target working area R; the data volume is therefore large, and the model occupies substantial storage space. Based on this, the present disclosure provides a method of generating a digital elevation model of an autonomous driving scenario. The method of the embodiment of the present disclosure is described below with reference to the application scenario shown in fig. 1. Referring to fig. 2, the method includes:
step S201: acquiring point cloud data C of a target operation area R of the autonomous driving vehicle 101, and determining plane position information and elevation values of a plurality of topographic points H in the target operation area R based on the point cloud data C;
step S202: determining, based on the plane position information of each topographic point H, the slice to which the point belongs among a plurality of pre-divided slices, wherein different slices correspond to different plane position ranges;
step S203: and generating a digital elevation model DEM of the target operation area R based on the elevation value of each topographic point H and the slice to which each topographic point H belongs, and storing the digital elevation model DEM.
In step S201, the point cloud data C may be acquired by the laser radar 101a on the autonomous vehicle 101. The point cloud data C may include three-dimensional coordinate information of a plurality of topographical points H, and plane position information and an elevation value of the topographical points may be determined based on the three-dimensional coordinate information of the topographical points, where the plane position information may be denoted as (x, y) and is used to represent position information of the topographical points on a topographical surface, x represents a plane abscissa, and y represents a plane ordinate. The elevation value is the distance from the topographical point to the absolute base along the plumb line and can be designated as z.
In some embodiments, the point cloud data C may be obtained by preprocessing raw point cloud data of the target working area R collected by the laser radar 101 a. Wherein the pre-treatment may include, but is not limited to, at least one of: voxelization filtering, noise filtering and non-topographical point filtering. The following examples illustrate various pretreatment methods.
Voxelization filtering filters the original point cloud data through a voxelized grid, that is, it down-samples the point cloud to reduce its density per unit distance, thereby reducing the amount of point cloud data while preserving its shape characteristics. For example, the point cloud space may be divided into voxelized grid cells (e.g., cubes) of a given number and size, and the center-of-gravity point of each cell used to approximate all other points within it, achieving the data-reduction effect. The raw point cloud data may be voxelization-filtered based on the target resolution of the digital elevation model DEM, i.e., the size of the voxelized grid cell is determined from the target resolution. In general, to ensure the accuracy of the digital elevation model DEM, the side length parameter of the voxelized grid cell may be set to a value not lower than the target resolution. The target resolution of the digital elevation model DEM may be set based on actual needs: the smaller the target resolution, the higher the accuracy of the DEM, and the larger the data volume. Accuracy and data volume should therefore both be considered when determining the target resolution.
In one embodiment, assuming a target resolution of the digital elevation model DEM of 0.1 meter, the side length parameter of the voxelized grid may also be set to 0.1 meter. By the method, the data volume of the point cloud data can be effectively reduced under the condition of keeping the geometric characteristics and the precision requirements of the space object, and the processing and calculating efficiency of the point cloud data is improved.
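As an illustrative sketch only (not code from the patent), the voxelization step can be implemented by bucketing points into grid cells and replacing each bucket with its center-of-gravity point; the 0.1-meter voxel size matches the example above, and the function name is hypothetical:

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size=0.1):
    """Downsample a point cloud by replacing all points that fall into the
    same voxel with their centroid (the center-of-gravity point)."""
    buckets = defaultdict(list)
    for x, y, z in points:
        # Integer voxel index along each axis.
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    downsampled = []
    for pts in buckets.values():
        n = len(pts)
        downsampled.append((sum(p[0] for p in pts) / n,
                            sum(p[1] for p in pts) / n,
                            sum(p[2] for p in pts) / n))
    return downsampled

# Two points inside the same 0.1 m voxel collapse to one centroid.
cloud = [(0.01, 0.01, 0.0), (0.03, 0.02, 0.02), (0.51, 0.50, 0.1)]
print(len(voxel_downsample(cloud, 0.1)))  # 2
```

In practice the same routine would run once over the raw cloud, before the noise and non-topographic-point filtering described below.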
Noise filtering removes various noise points, such as dust, moving scattered points, isolated points, abnormally high points, abnormally low points, and abnormally distant points, from the original point cloud data through one or more filtering algorithms, including but not limited to histogram filtering, statistical filtering, and radius filtering. Histogram filtering counts the number of points in different coordinate ranges, filtering out noise such as abnormally high, abnormally low, and abnormally distant points. Statistical filtering and radius filtering determine the density of point cloud data in different regions, filtering out noise such as dust, moving scattered points, and isolated points. The filtering algorithms may be performed in series. Noise filtering effectively removes various kinds of noise and thus reduces calculation errors in generating the digital elevation model DEM.
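As a sketch of one of the filters mentioned above, radius filtering keeps only points that have enough neighbors within a given radius, discarding isolated points. This brute-force O(n²) version is for illustration only; the parameter values and function name are hypothetical:

```python
import math

def radius_filter(points, radius=0.5, min_neighbors=2):
    """Radius outlier filter: keep a point only if at least `min_neighbors`
    other points lie within `radius` of it. Isolated points (e.g. dust or
    stray returns) are discarded."""
    kept = []
    for i, p in enumerate(points):
        neighbors = 0
        for j, q in enumerate(points):
            if i != j and math.dist(p, q) <= radius:
                neighbors += 1
        if neighbors >= min_neighbors:
            kept.append(p)
    return kept

dense = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0)]
isolated = [(50.0, 50.0, 10.0)]
print(len(radius_filter(dense + isolated)))  # 3
```

A production implementation would use a spatial index (k-d tree or voxel hash) instead of the quadratic neighbor search.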
The non-topographic point filtering is to perform semantic segmentation on point cloud data to identify semantic information of the point cloud data so as to filter various non-topographic points such as vehicles, houses, pedestrians, roadblocks, traffic signs and the like in the original point cloud data, and only the topographic points such as pavements, hillsides, isolation retaining walls, steps and the like in the original point cloud data are adopted to participate in the generation of the digital elevation model DEM, so that the influence of the non-topographic points on the digital elevation model DEM is reduced, and the precision of the digital elevation model DEM is improved. Pattern recognition, machine learning algorithms, or a combination of both may be employed to achieve semantic segmentation of the point cloud data.
In step S202, unified slice organization rules and data structures may be set. As shown in fig. 3, the planar position range may be pre-divided into a plurality of slices, with different slices corresponding to different planar position ranges (different plane abscissa ranges and/or different plane ordinate ranges). For example, one slice covers the plane abscissa range [x1, x2] and the plane ordinate range [y1, y2]; the adjacent slice in the same row covers the abscissa range [x2, x3] and the ordinate range [y1, y2]; and the adjacent slice in the same column covers the abscissa range [x1, x2] and the ordinate range [y2, y3], where x1, x2 and x3 are mutually unequal real numbers, as are y1, y2 and y3.
The slices may be the same size or different sizes. For convenience of explanation, the following description assumes slices of the same size. The row side length CL and the column side length RL of a slice may be set based on data management efficiency and flexibility, and may be the same or different. For example, CL and RL may both be set to 10 meters.
In some embodiments, a fixed reference point B0 may be set based on the area of the target work area R, as shown in FIG. 3, with the coordinates of the reference point being (-10000,10000). After the reference point is determined, the slice with the reference point B0 as the top-left corner point may be used as the slice of the 0 th row and the 0 th column, so as to obtain the row number and the column number of each slice.
The row and column coordinates of the slices may be used to identify the individual slices, e.g., the slice in row 0, column 0 is identified by {0,0}, the slice in row 0, column 1 by {0,1}, and so on. Alternatively, slice numbers may be used to identify individual slices. For example, the slices may be numbered row by row: assuming each row includes u slices, the numbers of the slices in row 0 are sequentially 0, 1, 2, …, u-1; the numbers of the slices in row 1 are u, u+1, u+2, …, 2u-1; and so on.
It is to be understood that the above slice organization rules are merely exemplary. For example, in addition to the reference points shown in the figure, points of other coordinate positions may be employed as the reference point B0. For another example, the slice with the reference point B0 as the lower left corner, the lower right corner or the upper right corner may be used as the slice of the 0 th row and the 0 th column, so as to obtain the row number and the column number of each slice. The appropriate slice organization rules and data structures can be selected according to practical situations, and are not listed in this disclosure. Through the mode, the elevation values of the topographic points and the row numbers and the column numbers of the slices to which the topographic points belong are associated and bound, and the simple and efficient organization management of the digital elevation model based on the row numbers and the column numbers is realized. For convenience of explanation, the following describes a scheme of an embodiment of the present disclosure, taking the slice organization rule shown in fig. 3 as an example.
Based on the slice organization rule, the slice to which the topographic point H belongs may be determined based on the plane position information of the topographic point H, the plane position information of the preset reference point B0, and the size of each slice. The plane position information may include a plane abscissa x and a plane ordinate y, and the size of the slice may be represented by a row side length RL and a column side length CL of the slice, or may be represented by an area of the slice or a diagonal length of the slice in the case where the slice is square.
Specifically, the row number of the slice to which topographic point H belongs may be determined based on the plane ordinate of H, the plane ordinate of reference point B0, and the column side length (vertical extent) of each slice; the column number may be determined based on the plane abscissa of H, the plane abscissa of B0, and the row side length (horizontal extent) of each slice. In the embodiment shown in fig. 3, assuming that the row side length of each slice is CL and the column side length is RL, a first difference between the plane ordinate of the reference point and the plane ordinate of topographic point H may be obtained, a first ratio of the first difference to the column side length RL determined, and the first ratio rounded down to obtain the row number m of the slice to which H belongs. Similarly, a second difference between the plane abscissa of H and the plane abscissa of the reference point may be obtained, a second ratio of the second difference to the row side length CL determined, and the second ratio rounded down to obtain the column number n of the slice to which H belongs. The calculation formula for the row number m and the column number n in some embodiments can be written as:
m = floor((B0.y - H.y) / RL),  n = floor((H.x - B0.x) / CL)    (1.1)
where B0.x and B0.y respectively denote the plane abscissa and plane ordinate of the reference point B0, H.x and H.y respectively denote the plane abscissa and plane ordinate of the topographic point H, and floor denotes the rounding-down operation. In the case where the reference point B0 is the lower-left corner of the slice in row 0 and column 0, floor in equation (1.1) may be replaced by ceiling, i.e. a rounding-up operation. In other embodiments, the row number m and the column number n of the slice to which the topographic point H belongs may be adjusted according to actual conditions, not listed here. After the row number m and the column number n are obtained, they may be converted into the number of the slice to which the topographic point H belongs. Still assuming that each row in the slice organization rule shown in fig. 3 includes u slices, the number of the slice to which the topographic point H belongs is u × m + n.
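The row/column computation of equation (1.1) and the u × m + n numbering can be sketched as follows. The reference point (-10000, 10000) is taken from fig. 3, while the 10-meter side lengths and u = 1000 slices per row are illustrative assumptions:

```python
import math

B0 = (-10000.0, 10000.0)   # reference point: top-left corner of the slice in row 0, column 0 (fig. 3)
RL, CL = 10.0, 10.0        # column side length (vertical) and row side length (horizontal)

def slice_index(hx, hy, b0=B0, rl=RL, cl=CL):
    """Row and column number of the slice containing topographic point (hx, hy),
    per equation (1.1): rows count downward from the reference point."""
    m = math.floor((b0[1] - hy) / rl)   # row number
    n = math.floor((hx - b0[0]) / cl)   # column number
    return m, n

def slice_number(m, n, u):
    """Single slice number from row/column, assuming u slices per row."""
    return u * m + n

m, n = slice_index(-9975.0, 9985.0)  # 25 m right of and 15 m below the reference point
print(m, n)                          # 1 2
print(slice_number(m, n, u=1000))    # 1002
```

A point exactly at the reference point falls into slice (0, 0), since both differences divide to zero.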
In step S203, a digital elevation model DEM may be generated based on the elevation values of the respective topographical points H and the slice to which the respective topographical points belong. The generated digital elevation model only comprises the elevation values of all topographical points and does not comprise the plane position information of all topographical points, so that the data volume of the digital elevation model is effectively reduced, and the occupation of the storage space of the digital elevation model in the storage process is effectively reduced.
The elevation values of the topographical points H included in the digital elevation model DEM may be organized according to the slice to which the topographical points H belong. In particular, the digital elevation model DEM may comprise a plurality of slicing units, one for each slice, for recording the elevation values of each topographical point in the corresponding slice. For example, referring to fig. 4, in the case where the number of slices is v, the number of slice units is also v, and each slice is respectively referred to as slice 1, slice 2, … …, slice v, and each slice unit is respectively referred to as slice unit 1, slice unit 2, … …, slice unit v. The slicing unit 1 corresponds to the slice 1 and is used for recording the elevation value of each topographic point in the slice 1; the slicing unit 2 corresponds to the slice 2 and is used for recording the elevation value of each topographic point in the slice 2; and so on.
In some embodiments, plane position information of the feature point of the corresponding slice is recorded in each slice unit, and the identification information of each slice unit is determined based on the identification information of the slice corresponding to the slice unit (i.e., the row number and column number of the slice, or the number of the slice). For example, each slice unit may be organized into one slice file, and the identification information of the slice corresponding to each slice file may be the file name of the slice file. In this way, the slice corresponding to the slice unit may be determined based on the identification information of the slice unit.
In other embodiments, the planar position information of the feature point of the corresponding slice and the elevation value of each topographic point in the corresponding slice are recorded in each slice unit. The feature point of the slice may be a corner point of the slice (e.g., a corner point at the top left corner of the slice), a center point of the slice, or a midpoint of an edge of the slice. Taking the feature point as an example of the corner point at the upper left corner of the slice, as shown in fig. 3, the feature point of the slice in the mth row and nth column (the slice corresponding to the gray area in the figure) is shown as feature point B in the figure. In this way, the planar position information of the feature point B (including the planar abscissa and planar ordinate of the feature point B) and the elevation value of each topographic point in the slice of the mth row and nth column may be recorded in the slice unit corresponding to the slice of the mth row and nth column.
Wherein the plane position information of the feature points of the slice may be determined based on the plane position information of the preset reference point B0, the position information of the slice in the plurality of slices, and the size of each slice. In the embodiment shown in fig. 3, assuming that the row edge length of each slice is CL and the column edge length of each slice is RL, a first product of the column number of the slice and the row edge length of the slice may be obtained, and the plane abscissa B.x of the feature point B of the slice is determined based on the sum of the first product and the plane abscissa of the reference point. A second product of the row number of the slice and the column side length of the slice may also be obtained, and a plane ordinate B.y of the feature point B of the slice is determined based on a difference between the plane ordinate of the reference point and the second product. The calculation formulas for the plane abscissa B.x and the plane ordinate B.y of the feature point B of the slice of the mth row and nth column in some embodiments can be written as:
B.x = B0.x + n × CL,  B.y = B0.y - m × RL    (1.2)
in the case where the feature point B is the center point of the slice of the m-th row and n-th column, the plane abscissa B.x of the feature point B may be increased by the offset amount of CL/2 on the basis of the formula (1.2), and the plane ordinate B.y of the feature point B may be increased by the offset amount of RL/2 on the basis of the formula (1.2). In other embodiments, the calculation formulas of the plane abscissa B.x and the plane ordinate B.y of the feature point B can be adjusted according to actual conditions, and are not listed.
In some embodiments, the order in which the elevation values of the topographic points in a slice unit are stored is associated with the positions of those points within the slice. For example, a slice may be divided into a plurality of sub-slices based on the target resolution of the digital elevation model DEM, each sub-slice being one grid cell of the slice, and the position of a topographic point within its slice may be represented by the row number and column number of the sub-slice to which it belongs. For example, the grid cell in the 0th row and 0th column of the slice is one sub-slice, and the grid cell in the 0th row and 1st column is another. Each sub-slice may contain a plurality of topographic points. The storage order of the elevation values of the topographic points in the corresponding slice unit may then be determined from the row and column numbers of the sub-slices within the slice.
As shown in fig. 4, the elevation values of the topographic points in each sub-slice may be recorded row by row in a slice unit. Specifically, for each slice unit, the elevation value of each topographic point H(row 0, col 0) in the sub-slice in the 0th row and 0th column is recorded first, then the elevation value of each topographic point H(row 0, col 1) in the sub-slice in the 0th row and 1st column, and so on until the sub-slice in the last column of the 0th row has been recorded. Next, the elevation values of the topographic points H(row 1, col 0) in the sub-slice of the 1st row and 0th column are recorded, then those of H(row 1, col 1) in the sub-slice of the 1st row and 1st column, and so on until the sub-slice in the last column of the 1st row has been recorded. Recording continues row by row in this manner until the elevation values of the topographic points in all sub-slices of the slice have been recorded.
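The row-by-row recording order above can be sketched as follows. This is a simplified illustration in which each sub-slice contributes one elevation value; the names are assumptions.

```python
# Illustrative sketch of the row-major recording order: elevation values are
# appended sub-slice by sub-slice, completing each row before the next.

def record_row_major(elevations, n_rows, n_cols):
    """elevations maps (row, col) of a sub-slice to its elevation value;
    returns the flat storage order of the corresponding slice unit."""
    unit = []
    for row in range(n_rows):        # 0th row first, then 1st row, ...
        for col in range(n_cols):    # within a row: column 0, 1, ...
            unit.append(elevations[(row, col)])
    return unit
```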
Since the density of the point cloud data is usually high, the number of topographic points actually acquired in one sub-slice may be greater than 1. One topographic point may be screened out from each sub-slice, or one topographic point may be fitted from the plurality of topographic points the sub-slice contains, and the screened or fitted point then represents every topographic point in that sub-slice. Because the sub-slices are divided according to the target resolution of the digital elevation model DEM, screening or fitting one topographic point per sub-slice in this manner both reduces the data volume and meets the target-resolution requirement of the DEM. The screened or fitted topographic points may be referred to as the grid points of the corresponding sub-slices. On this basis, the elevation values recorded in a slice unit may include only the elevation values of the grid points in the corresponding slice.
Furthermore, the elevation values of the plurality of topographic points in the same sub-slice can be fused, and the fused value used as the elevation value of the sub-slice. Specifically, the elevation values of a number of neighborhood topographic points of the sub-slice's grid point can be determined and combined by a weighted average using an inverse distance weighting algorithm to obtain the grid point's elevation value. The neighborhood topographic points of a grid point are the topographic points whose distance to the grid point is smaller than a preset neighborhood radius.
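The weighted-average fusion can be sketched as follows. The 1/d weight is an assumption for the example: the text names inverse distance weighting without fixing the exponent, and all identifiers are illustrative.

```python
import math

# Hedged sketch of inverse-distance-weighted fusion: elevations of topographic
# points within the preset neighborhood radius of a grid point are averaged,
# with closer points weighted more heavily.

def idw_elevation(grid_x, grid_y, points, radius, eps=1e-9):
    """points: iterable of (x, y, z) topographic points. Returns the fused
    elevation of the grid point, or None if no neighborhood point exists."""
    num = den = 0.0
    for x, y, z in points:
        d = math.hypot(x - grid_x, y - grid_y)
        if d >= radius:
            continue                 # outside the preset neighborhood radius
        w = 1.0 / (d + eps)          # closer points receive larger weights
        num += w * z
        den += w
    return num / den if den > 0.0 else None
```

The small eps guards against division by zero when a topographic point coincides exactly with the grid point.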
In this way, the plane coordinate information of the topographic points can be restored from the storage order of their elevation values in the slice unit, so that the complete spatial information of each topographic point can be obtained from the restored plane coordinates and the stored elevation values. That is, the embodiments of the present disclosure need only store the elevation values of the topographic points in the digital elevation model, yet obtain the same effect as storing both the planar position information and the elevation values.
In the above embodiment, the sub-slice to which a topographic point belongs may be determined based on the planar position information of the point, the planar position information of the feature point of its slice, and the target resolution of the digital elevation model. A sub-slice may be identified either by its row number and column number within the slice, or by its sequential number within the slice. When identified by row and column number, a third difference between the plane ordinate of the corner point of the slice and the plane ordinate of the topographic point may be obtained, and the sub-slice's row number determined from the ratio of the third difference to the target resolution of the digital elevation model; a fourth difference between the plane abscissa of the topographic point and the plane abscissa of the corner point may be obtained, and the column number determined from the ratio of the fourth difference to the target resolution. In some embodiments, the formulas for the row number and column number of the sub-slice can be written as:
row = ⌊(B.y − P.y) / resolution⌋    (1.3)
col = ⌊(P.x − B.x) / resolution⌋    (1.4)

where P.x and P.y are the plane abscissa and plane ordinate of the topographic point, and B.x and B.y are those of the corner point of the slice to which it belongs.
assuming that the number of sub-slices per row in a slice is w, the number of a sub-slice in the corresponding slice can be denoted as w × row + col.
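The ratio computations above, together with the numbering w × row + col, can be sketched in one function. The names are illustrative assumptions.

```python
import math

# Sketch of locating a topographic point's sub-slice: row and column come
# from the offsets to the slice's corner point divided by the resolution,
# and the sub-slice number is w * row + col.

def sub_slice_index(px, py, bx, by, resolution, w):
    """(px, py): plane coordinates of the topographic point; (bx, by): the
    upper-left corner point B of its slice; w: sub-slices per row."""
    row = math.floor((by - py) / resolution)  # vertical offset, downward
    col = math.floor((px - bx) / resolution)  # horizontal offset, rightward
    return row, col, w * row + col            # number within the slice
```

For example, with the slice corner at (100, 200), a resolution of 1 and w = 10, the point (103.5, 196.0) falls in sub-slice row 4, column 3, number 43.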
After the digital elevation model DEM is generated, the plane position information of each topographic point can be recovered from it, and the elevation value of each topographic point can be extracted from it. Based on the plane position information and elevation values of the topographic points, a visualized topographic map may be rendered, or topographic factors of the target working area R may be calculated, such as elevation, slope, aspect, surface roughness, and topographic relief. A high-precision map may also be generated based on the plane position information and elevation values of the topographic points, and/or used for path planning and decision control during autonomous driving of the autonomous vehicle 101. The following describes a manner of recovering the plane position information of the topographic points.
In some embodiments, a target slice unit of the digital elevation model DEM may be read; the sub-slice to which a target topographic point belongs may be determined based on the recording order of the target topographic point in the target slice unit; and the plane position information of the target topographic point may be determined based on the sub-slice to which it belongs, the plane position information of the feature point in the target slice unit, and the target resolution of the digital elevation model DEM.
The target slice unit may be any one of slice units included in the digital elevation model DEM. The target topographic point may be any one topographic point in the target slice unit, or a grid point in any one sub-slice included in a slice corresponding to the target slice unit. Since the respective topographical points and the grid points are sequentially recorded in the slice unit, the sub-slice to which the target topographical point belongs can be determined based on the recording order index of the target topographical point.
Specifically, the total number of sub-slice columns sum_col in a slice may be determined based on the row edge length CL of the slice and the target resolution of the digital elevation model DEM:
sum_col = CL / resolution    (2.1)
Then, the row number row and column number col of the target topographic point in the corresponding slice are determined based on the recording order index of the target topographic point in the slice unit and the total column number sum_col of the sub-slices. Specifically, the ratio of the recording order index to sum_col may be obtained and rounded down to give the row number row; the remainder of the recording order index divided by sum_col gives the column number col. The formulas for the row number and column number of the target topographic point can be written as:
row = ⌊index / sum_col⌋    (2.2)
col = index mod sum_col    (2.3)

where index is the recording order index of the target topographic point in the slice unit.
Then, a third product of the column number of the target topographic point and the target resolution of the digital elevation model DEM may be determined, and the plane abscissa of the target topographic point determined based on the sum of the third product and the plane abscissa of the feature point B of the slice in which the point is located. A fourth product of the row number of the target topographic point and the target resolution may likewise be determined, and the plane ordinate of the target topographic point determined based on the difference between the plane ordinate of feature point B and the fourth product. The formulas for recovering the plane abscissa and plane ordinate of the target topographic point can be written as:
P.x = B.x + col × resolution    (2.4)
P.y = B.y − row × resolution    (2.5)
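The recovery steps above, from formula (2.1) through the final coordinates, can be combined into one sketch. The function and parameter names are illustrative assumptions.

```python
# Sketch of the recovery pipeline: from a value's recording-order index in a
# slice unit back to the plane coordinates of the topographic point.

def recover_xy(index, bx, by, cl, resolution):
    """(bx, by): feature point B of the slice; cl: row edge length CL."""
    sum_col = int(cl / resolution)   # total sub-slice columns, formula (2.1)
    row = index // sum_col           # ratio rounded down gives the row
    col = index % sum_col            # remainder gives the column
    px = bx + col * resolution       # abscissa right of the feature point
    py = by - row * resolution       # ordinate below the feature point
    return px, py
```

Note that this is the inverse of locating a point's sub-slice: index 43 with the feature point at (100, 200), CL = 10 and resolution 1 recovers (103, 196).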
In some embodiments, when updated point cloud data are acquired, the slice to which the updated point cloud data belong (referred to as the slice to be updated) may be determined, and the elevation values in the slice unit corresponding to that slice updated. For example, referring to fig. 5, where the slice to be updated includes the slice in the mth row and nth column, the updated elevation values of the topographic points in that slice may be determined based on the updated point cloud data, and the elevation values in the corresponding slice unit updated accordingly. This process is called incremental update. Because the embodiments of the present disclosure associate and bind the elevation values of topographic points with the slice in which they are located, incremental updates can be performed by slice row and column number, improving both the efficiency and the accuracy of incremental updating.
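The incremental update can be sketched with slice units stored in a dictionary keyed by row and column, a simplified illustration of the association described above; the storage layout and names are assumptions.

```python
# Hedged sketch of incremental update: slice units keyed by (row, column), so
# new point cloud data for slice (m, n) replaces only that unit's elevations.

def incremental_update(dem_units, m, n, new_elevations):
    """dem_units: dict mapping (m, n) -> list of elevation values."""
    dem_units[(m, n)] = list(new_elevations)  # other slice units untouched
    return dem_units
```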
It is understood that the solutions described in the above embodiments can be freely combined to obtain a new solution without conflict, and for reasons of space, they will not be introduced here.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 6 is a schematic structural diagram of an apparatus for generating a digital elevation model of an autopilot scenario according to an embodiment of the disclosure. As shown in fig. 6, the apparatus includes:
a first determining module 601, configured to acquire point cloud data of a target working area of an autonomous vehicle, and determine plane position information and elevation values of a plurality of topographic points in the target working area based on the point cloud data;
a second determining module 602, configured to determine, based on the plane position information of each topographic point, a slice to which the corresponding topographic point belongs in a plurality of pre-divided slices, where different slices correspond to different plane position ranges;
the storage module 603 is configured to generate a digital elevation model of the target operation area based on the elevation value of each topographical point and the slice to which each topographical point belongs, and store the digital elevation model;
the digital elevation model comprises a plurality of slice units, each slice unit corresponds to one slice and is used for recording the elevation value of each topographic point in the corresponding slice;
the storage sequence of the elevation values of all topographic points in the same slice unit in the slice unit is related to the positions of all topographic points in the slice.
The implementation process of the functions and actions of each module in the above device is detailed in the implementation process of the corresponding steps in the above method, and is not described herein again.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 7, the electronic apparatus 70 of this embodiment includes: a processor 701, a memory 702, and a computer program 703 stored in the memory 702 and executable on the processor 701. The steps in the various method embodiments described above are implemented when the computer program 703 is executed by the processor 701. Alternatively, the processor 701 implements the functions of each module/unit in each device embodiment described above when executing the computer program 703.
Illustratively, the computer program 703 may be partitioned into one or more modules/units, which are stored in the memory 702 and executed by the processor 701 to accomplish the present disclosure. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 703 in the electronic device 70.
The electronic device 70 may be an electronic device such as a desktop computer, a notebook, a palm computer, and a cloud server. The electronic device 70 may include, but is not limited to, a processor 701 and a memory 702. Those skilled in the art will appreciate that fig. 7 is merely an example of the electronic device 70, and does not constitute a limitation of the electronic device 70, and may include more or fewer components than shown, or combine certain components, or different components, e.g., the electronic device may also include input-output devices, network access devices, buses, etc.
The Processor 701 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 702 may be an internal storage unit of the electronic device 70, for example, a hard disk or internal memory of the electronic device 70. The memory 702 may also be an external storage device of the electronic device 70, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the electronic device 70. Further, the memory 702 may include both an internal storage unit and an external storage device of the electronic device 70. The memory 702 is used to store computer programs and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the present disclosure. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the above-described apparatus/electronic device embodiments are merely illustrative, and for example, a module or a unit may be divided into only one logical function, and may be implemented in other ways, and multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the methods in the above embodiments by instructing the related hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of the above method embodiments. The computer program may comprise computer program code in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be subject to appropriate additions or subtractions according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals.
The above examples are only intended to illustrate the technical solution of the present disclosure, not to limit it; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present disclosure, and are intended to be included within the scope of the present disclosure.

Claims (12)

1. A method of generating a digital elevation model of an autonomous driving scenario, the method comprising:
acquiring point cloud data of a target operation area of an automatic driving vehicle, and determining plane position information and elevation values of a plurality of topographic points in the target operation area based on the point cloud data;
determining slices of the corresponding topographic points in a plurality of pre-divided slices based on the plane position information of each topographic point, wherein different slices correspond to different plane position ranges;
generating a digital elevation model of the target operation area based on the elevation value of each topographic point and the slice to which each topographic point belongs, and storing the digital elevation model;
the digital elevation model comprises a plurality of slice units, each slice unit corresponds to one slice and is used for recording the elevation value of each topographic point in the corresponding slice;
the storage sequence of the elevation values of all topographic points in the same slice unit in the slice unit is related to the positions of all topographic points in the slice.
2. The method according to claim 1, wherein the determining a slice to which the corresponding topographic point belongs among the pre-divided slices based on the plane position information of the respective topographic point comprises:
and determining the slice to which the topographic point belongs based on the plane position information of the topographic point, the plane position information of a preset reference point and the size of each slice.
3. The method according to claim 1, wherein in each slice unit, plane position information of the feature point of the corresponding slice and elevation values of respective topographical points in the corresponding slice are recorded.
4. The method according to claim 3, wherein the plane position information of the feature point of the slice is determined based on the plane position information of a preset reference point, the position information of the slice in the plurality of slices, and the size of each slice.
5. The method according to claim 3 or 4, wherein each slice is divided into a plurality of sub-slices according to a target resolution of the digital elevation model; the slicing unit is used for recording the elevation values of the topographic points in each sub-slice of the corresponding slice line by line;
and determining the sub-slice to which the topographic points belong based on the plane position information of the topographic points, the plane position information of the characteristic points of the slice to which the topographic points belong and the target resolution of the digital elevation model.
6. The method of claim 5, further comprising:
reading a target slice unit of the digital elevation model;
determining a sub-slice to which the target topographic point belongs based on the recording sequence of the target topographic point in the target slice unit;
and determining the plane position information of the target topographic point based on the sub-slice to which the target topographic point belongs, the plane position information of the feature point in the target slice unit and the target resolution of the digital elevation model.
7. The method of claim 1, further comprising:
under the condition that updated point cloud data are collected, determining a slice to be updated to which the updated point cloud data belong, and determining an updated elevation value of each topographic point in the slice to be updated based on the updated point cloud data;
and updating the elevation value in the slice unit corresponding to the slice to be updated based on the updated elevation value of each topographic point in the slice to be updated.
8. The method of claim 1, further comprising:
preprocessing the original point cloud data of the target operation area to obtain the point cloud data; the pre-treatment comprises at least one of: voxelization filtering, noise filtering and non-topographical point filtering.
9. The method of claim 3, further comprising:
acquiring elevation values of a plurality of neighborhood topographic points of grid points of the sub-slices; the grid points of the sub-slice are used for fitting a plurality of topographic points in the sub-slice;
and fusing the elevation values of the plurality of neighborhood topographic points to obtain the elevation value of the grid point.
10. An apparatus for generating a digital elevation model of an autonomous driving scenario, the apparatus comprising:
the system comprises a first determination module, a second determination module and a third determination module, wherein the first determination module is used for acquiring point cloud data of a target operation area of an automatic driving vehicle and determining plane position information and elevation values of a plurality of topographic points in the target operation area based on the point cloud data;
the second determining module is used for determining slices of the corresponding topographic points in the pre-divided multiple slices based on the plane position information of each topographic point, and different slices correspond to different plane position ranges;
the storage module is used for generating a digital elevation model of the target operation area based on the elevation value of each topographic point and the slice to which each topographic point belongs, and storing the digital elevation model;
the digital elevation model comprises a plurality of slice units, each slice unit corresponds to one slice and is used for recording the elevation value of each topographic point in the corresponding slice;
the storage sequence of the elevation values of all topographic points in the same slice unit in the slice unit is related to the positions of all topographic points in the slice.
11. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method of any one of claims 1 to 9.
12. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 9 when executing the program.
CN202211486451.1A 2022-11-24 2022-11-24 Method and device for generating digital elevation model of automatic driving scene Pending CN115713600A (en)


Publications (1)

Publication Number Publication Date
CN115713600A 2023-02-24



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116340307A (en) * 2023-06-01 2023-06-27 北京易控智驾科技有限公司 Ramp layer generation method and device, high-precision map and electronic equipment
CN116340307B (en) * 2023-06-01 2023-08-08 北京易控智驾科技有限公司 Ramp layer generation method and device, high-precision map and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination