CN116468811A - Point cloud data processing method, device, equipment and storage medium - Google Patents

Point cloud data processing method, device, equipment and storage medium

Info

Publication number
CN116468811A
CN116468811A
Authority
CN
China
Prior art keywords
point cloud
cloud data
frame
point
compression processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310295055.9A
Other languages
Chinese (zh)
Inventor
杨宽
陈时远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Autonavi Software Co Ltd
Original Assignee
Autonavi Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autonavi Software Co Ltd filed Critical Autonavi Software Co Ltd
Priority to CN202310295055.9A priority Critical patent/CN116468811A/en
Publication of CN116468811A publication Critical patent/CN116468811A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 9/00 - Image coding
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides a point cloud data processing method, device, equipment and storage medium, wherein the method includes: acquiring a point cloud data set to be processed; slicing the point cloud data set frame by frame to obtain at least one frame of point cloud data; compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data; and storing the compression processing result. The method and device can improve the accuracy of subsequent data processing that uses the point cloud data.

Description

Point cloud data processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of map technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing point cloud data.
Background
A lidar may collect point cloud data of a road. Because the amount of point cloud data is generally huge, how to store it is a problem to be solved. Existing point cloud data processing methods mainly slice the point cloud data and then store the slices. However, because existing slicing divides the point cloud data by a preset road-segment length, the integrity of the point cloud data corresponding to an object cannot be guaranteed within each slice, and the accuracy of subsequent data processing that uses the sliced point cloud data may therefore be poor.
Disclosure of Invention
The application provides a point cloud data processing method, a device, equipment and a storage medium, which can improve the accuracy of data processing based on point cloud data.
In a first aspect, the present application provides a method for processing point cloud data, where the method includes:
acquiring a point cloud data set to be processed;
slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data;
compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data;
and storing the compression processing result of the at least one frame of point cloud data.
Optionally, the step of slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data includes:
for any point cloud data in the point cloud data set to be processed, acquiring the rotation angle at which the laser head of the lidar that collected the point cloud data was positioned at the time of acquisition;
and slicing the point cloud data set to be processed frame by frame according to the rotation angle corresponding to each point cloud data in the set, to obtain at least one frame of point cloud data.
Optionally, the step of slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data includes:
and slicing the point cloud data set to be processed frame by frame according to the time required for the laser head of the lidar to scan one circle and the acquisition time corresponding to each point cloud data in the set, to obtain at least one frame of point cloud data.
Optionally, the compressing the at least one frame of point cloud data to obtain a compression result of the at least one frame of point cloud data includes:
for any frame of point cloud data, performing compression processing on fields of the frame of point cloud data to obtain an intermediate compression processing result of the frame of point cloud data;
and performing byte compression processing on the intermediate compression processing result to obtain a final compression processing result of the frame point cloud data.
Optionally, the intermediate compression processing result includes: a first compression result, a second compression result, and a third compression result; the compressing the field of the frame point cloud data to obtain an intermediate compression result of the frame point cloud data, including:
acquiring a reference point of the frame point cloud data;
according to the space position and the acquisition time of the reference point, carrying out differential encoding on the space information and the time information of the frame point cloud data to obtain a first compression processing result of the frame point cloud data;
Performing median predictive coding on the reflectivity of the frame point cloud data to obtain a second compression processing result of the frame point cloud data;
dictionary coding is carried out on the target identification of the frame point cloud data to obtain a third compression processing result of the frame point cloud data, wherein the target identification is used for representing whether the point cloud data are ground point cloud data or not.
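The three per-field encoders named above can be sketched as follows. The window size, field layouts, and function names are illustrative assumptions, not the patent's exact scheme:

```python
import statistics

def differential_encode(values, reference):
    """Store each value as an offset from a reference value; offsets tend to
    be small in magnitude, which helps the later byte-compression stage."""
    return [v - reference for v in values]

def median_predictive_encode(reflectivities, window=3):
    """Predict each sample as the median of the previous `window` samples and
    store only the prediction residual."""
    residuals = []
    for i, r in enumerate(reflectivities):
        history = reflectivities[max(0, i - window):i]
        prediction = statistics.median(history) if history else 0
        residuals.append(r - prediction)
    return residuals

def dictionary_encode(target_ids):
    """Map each distinct target identifier (e.g. ground vs. non-ground) to a
    small integer code via a codebook."""
    codebook, codes = {}, []
    for t in target_ids:
        codes.append(codebook.setdefault(t, len(codebook)))
    return codebook, codes
```

All three transforms are lossless given the reference point and codebook, which is why the description stores the reference point's spatial position and acquisition time alongside each frame.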
Optionally, the acquiring the reference point of the frame point cloud data includes:
acquiring target acquisition time according to the acquisition time of each point cloud data in the frame of point cloud data;
and determining the reference point according to the distance between the point cloud data corresponding to the target acquisition time and the track point corresponding to the target acquisition time of the laser radar.
Optionally, the determining the reference point according to the distance between the point cloud data corresponding to the target acquisition time and the track point corresponding to the target acquisition time of the laser radar includes:
if the distance is greater than a preset distance threshold, determining the reference point according to the track point;
and if the distance is smaller than or equal to the preset distance threshold, taking the point cloud data corresponding to the target acquisition time as the reference point.
Optionally, the determining the reference point according to the track point includes:
taking the track point as the reference point;
or taking, as the reference point, the point cloud data in the frame of point cloud data whose spatial position is within a preset range of the track point.
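Under the assumption that points and track points are 3-D coordinate tuples and that the distance threshold is configurable, the reference-point rule above can be sketched as:

```python
import math

def pick_reference(point_at_t, track_point_at_t, threshold):
    """Choose the frame's reference point: use the point cloud data at the
    target acquisition time unless it lies farther than `threshold` from the
    lidar track point at that time, in which case fall back to the track
    point (one of the two variants described above)."""
    distance = math.dist(point_at_t, track_point_at_t)
    if distance > threshold:
        return track_point_at_t  # too far: derive the reference from the track point
    return point_at_t            # close enough: use the point cloud data itself
```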
Optionally, the storing the compression processing result of the at least one frame of point cloud data includes:
and storing the compression processing result of the at least one frame of point cloud data into at least one point cloud file according to the ordering of the point cloud frames and a preset maximum number of point cloud frames that a single point cloud file can store.
Optionally, the method further comprises:
and constructing and storing an index file of the point cloud file.
Optionally, the index file includes at least one of:
version identification of the index file;
the acquisition time range and the space range corresponding to the point cloud file;
acquiring a time range and a space range corresponding to each frame of point cloud data in the point cloud file;
the point cloud file comprises descriptive information of compression processing results of each frame of point cloud data, and the descriptive information comprises at least one of the following:
the relative position in the point cloud file, byte length, compression mode, the spatial position and acquisition time of the reference point and the laser head type of the corresponding laser radar.
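An illustrative layout covering the index fields listed above; every key name and value here is an assumption, since the patent does not fix an on-disk format:

```python
# Hypothetical index-file contents for one point cloud file.
index_file = {
    "version": "1.0",                                   # version identification
    "file_time_range": (1679000000.0, 1679000060.0),    # acquisition time range
    "file_space_range": ((0.0, 0.0, 0.0), (120.0, 30.0, 5.0)),
    "frames": [                                         # one entry per stored frame
        {
            "time_range": (1679000000.0, 1679000000.1),
            "space_range": ((0.0, 0.0, 0.0), (60.0, 30.0, 5.0)),
            "offset": 0,                                # relative position in the file
            "byte_length": 18432,
            "compression": "field+byte",                # compression mode
            "reference_point": {"xyz": (12.3, 4.5, 0.2), "time": 1679000000.05},
            "laser_head_type": "multi-line",
        },
    ],
}
```

An index of this shape lets a downstream node seek directly to one frame's bytes without decompressing the whole file.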
Optionally, before slicing the point cloud data set frame by frame to obtain at least one frame of point cloud data, the method further includes:
and preprocessing the point cloud data set according to the compression processing requirement.
In a second aspect, the present application provides a data processing apparatus, the apparatus comprising:
the acquisition module is used for acquiring the point cloud data set to be processed;
the slicing module is used for slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data;
the processing module is used for compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data;
and the storage module is used for storing the compression processing result of the at least one frame of point cloud data.
In a third aspect, the present application provides an electronic device, comprising: a processor and a memory; the processor is in communication with the memory;
the memory stores computer instructions;
the processor executes the computer instructions stored by the memory to implement the method of any one of the first aspects.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the method according to any of the first aspects.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the method of any of the first aspects.
According to the point cloud data processing method, device, equipment and storage medium of the present application, the point cloud data set is sliced frame by frame and each frame of point cloud data is treated as one rigid body. Compared with the prior art, which slices the point cloud data set by a preset road-segment length, this preserves the physical rigidity of the lidar scan and avoids splitting a rigid body across slices, thereby improving the accuracy of data processing that uses the sliced frames of point cloud data. By compressing the at least one frame of point cloud data and storing the compression processing result, the amount of data stored is reduced, so that less data needs to be acquired when the at least one frame of point cloud data is used later, which improves the efficiency of acquiring it.
Drawings
For a clearer description of the technical solutions of the present application or of the prior art, the drawings that are used in the description of the embodiments or of the prior art will be briefly described, it being obvious that the drawings in the description below are some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a point cloud data set;
fig. 2 is a schematic diagram of different point cloud data storage specifications of S laser radars;
FIG. 3 is a schematic diagram of a conventional point cloud data slicing method;
fig. 4 is a flow chart of a point cloud data processing method according to an embodiment of the present application;
fig. 5 is a flow chart of a method for obtaining a compression result of at least one frame of point cloud data;
FIG. 6 is a schematic diagram of a data processing apparatus according to the present application;
fig. 7 is a flow chart of another method for processing point cloud data provided in the present application;
FIG. 8 is a schematic diagram of a local coordinate transformation provided herein;
FIG. 9 is a schematic diagram of another data processing apparatus provided in the present application;
fig. 10 is a schematic hardware structure of an electronic device provided in the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is apparent that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The following first explains some of the noun concepts referred to in this application:
and (3) one-frame point cloud data: the point cloud data obtained by scanning the laser head of the laser radar for one circle can be called one frame of point cloud data. Taking the laser head as a multi-line laser head for example, the one frame of point cloud data may include point cloud data on a plurality of scan lines. Among them, the lidar provided with one multiline laser head may be referred to as a multiline lidar. Furthermore, the types of lidar may also include single-line lidar provided with one single-line laser head, or multi-laser head combination (e.g., including at least one single-line laser head and at least one multi-line laser head), and the like.
Taking map data acquisition as an example: when map data is collected for a road segment, the point cloud data of the segment can be acquired by a lidar and stored. Downstream point cloud processing nodes, such as automatic identification, data alignment correction, pyramid slicing, and Level of Detail (LOD) division, can then obtain the point cloud data from the device storing it for further processing.
Taking the above lidar as an example of a multi-laser-head combined lidar, fig. 1 is a schematic diagram of a point cloud data set. As shown in fig. 1, the lidar may collect point cloud data along a lidar collection track and store the point cloud data collected by different laser heads into different point cloud data sets. Typically, the amount of data per point cloud data set is large; in the example of fig. 1, the data volume of a point cloud data set may reach 4 gigabytes (GB). Therefore, how to store the point cloud data after acquisition becomes a problem to be solved.
In addition, the storage specifications (also referred to as storage formats) of point cloud data acquired by laser radars of different manufacturers are not uniform. For example, fig. 2 is a schematic diagram of different point cloud data storage specifications of S types of lidars. As shown in fig. 2, point cloud data acquired by the laser radar 1 is stored according to a point cloud storage specification 1; and storing the point cloud data acquired by the laser radar 2 according to the point cloud storage specification 2. However, the storage specifications of the point cloud data are not uniform, which may cause the point cloud data processing node to fail to process the point cloud data with different storage specifications. Therefore, the storage specification unification processing needs to be performed on the point cloud data, so that different point cloud data processing nodes can use the point cloud data in the unified format. At present, the unification of the storage specification of the point cloud data can be realized by slicing the original point cloud data set acquired by different laser radars.
Fig. 3 is a schematic diagram of a conventional point cloud data slicing method. As shown in fig. 3, in the existing point cloud data processing method, the point cloud data set is sliced according to a preset road segment length (or called a preset distance), so that the road segment length covered by the point cloud data included in each slice is equal to the preset road segment length. Then, the point cloud data in each slice can be compressed respectively, and the storage space occupied by the point cloud data is reduced and then stored.
However, when slicing the point cloud data set according to the preset road segment length, the integrity of the point cloud data corresponding object in each slice cannot be guaranteed. That is, an object may be divided among a plurality of different slices. Taking the arrow mark on the road surface as an example, when the point cloud data is sliced according to the preset road segment length, one arrow mark may be divided into two parts, one part is in slice 1, and the other part is in slice 2.
After the point cloud data slices are stored by the existing point cloud data processing method, a downstream processing node can acquire each slice and process the point cloud data in each slice separately. Taking a point cloud data rectification node as an example: for any slice, the node can rectify the coordinates of the point cloud data in that slice according to the point cloud data it contains. Continuing the earlier example, when one arrow mark is divided into two parts, one part in slice 1 and the other in slice 2, the rectification node obtains, from the point cloud data in slice 1, a rectification amount 1 for the coordinates of the point cloud data in slice 1 and rectifies those coordinates based on rectification amount 1. Likewise, it obtains, from the point cloud data in slice 2, a rectification amount 2 and rectifies the coordinates of the point cloud data in slice 2 based on rectification amount 2. However, if rectification amount 1 differs from rectification amount 2, the part of the arrow mark in slice 1 and the part in slice 2 may be misaligned after rectification, resulting in poor rectification accuracy.
That is, when the existing point cloud data set is used for slice storage according to the preset road segment length, the accuracy of the subsequent data processing using the sliced point cloud data may be poor.
In view of the above problems of the existing point cloud data processing method, the present application proposes a slicing method that avoids splitting an object across slices, thereby improving the accuracy of data processing that uses the sliced point cloud data.
It should be understood that the present application is not limited to the type of lidar described above. The lidar may be, for example, a single-line lidar, or a multi-laser head combined lidar, or the like.
Optionally, the execution body of the point cloud data processing method may be, for example, a data processing platform, or any electronic device with a processing function, such as a terminal, a server, or the like.
The following takes the execution body of the point cloud data processing method as an electronic device by way of example, and describes the technical scheme of the application in detail with reference to specific embodiments. The following embodiments may be combined with each other, and descriptions of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 4 is a flow chart of a point cloud data processing method according to an embodiment of the present application. As shown in fig. 4, the method may include the steps of:
s101, acquiring a point cloud data set to be processed.
The set of point cloud data may include, for example, raw point cloud data collected by a laser head. The laser head may be a multi-line laser head, or a single-line laser head. Taking the laser head as a multi-line laser head for example, the point cloud data set may include original point cloud data on a plurality of scan lines.
The electronic device may, for example, receive the above-described set of point cloud data entered by a user via an application program interface (Application Programming Interface, API), or a graphical user interface (Graphical User Interface, GUI), or the like. Alternatively, the electronic device may receive, for example, a point cloud data set transmitted to the electronic device by the map data collection vehicle after the point cloud data set is collected by the lidar. Still alternatively, the point cloud data set may be stored in the electronic device in advance. In this implementation manner, the electronic device may obtain the point cloud data set from the data stored in the electronic device.
S102, slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data.
For example, taking the above-mentioned lidar as a multi-line lidar as an example, one frame of point cloud data may include point cloud data on a plurality of scan lines obtained by scanning a laser head of the multi-line lidar one circle. Taking the laser radar as a single-line laser radar as an example, the one-frame point cloud data may include point cloud data on a scanning line obtained by scanning a circle by a laser head of the single-line laser radar. Taking the laser radar as the laser radar combined by Q laser heads as an example, optionally, the Q laser heads scan one circle to obtain Q frame point cloud data corresponding to the Q laser heads one by one.
Optionally, for any point cloud data in the point cloud data set to be processed, the electronic device may acquire the rotation angle at which the laser head of the lidar was positioned when it collected that point cloud data, and slice the point cloud data set to be processed frame by frame according to the rotation angle corresponding to each point cloud data in the set, so as to obtain at least one frame of point cloud data. For example, the point cloud data set (the point cloud data set and the point cloud data set to be processed are the same concept in this application) may record, for each point cloud data, the rotation angle of the laser head when that point cloud data was scanned. That is, the electronic device may obtain the rotation angle corresponding to each point cloud data from the point cloud data set. The electronic device may then sort the point cloud data in order of acquisition time and take, as one frame of point cloud data, the point cloud data whose corresponding laser-head rotation angles fall within one [0°, 360°] interval, thereby slicing the point cloud data set frame by frame.
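The angle-based framing described above can be sketched as follows; the `(acquisition_time, rotation_angle, payload)` tuple layout is an assumed representation, and a new frame is started whenever the rotation angle wraps back past 0°:

```python
def slice_by_rotation(points):
    """points: list of (acquisition_time, rotation_angle_deg, payload) tuples,
    possibly unsorted. Returns a list of frames (lists of points)."""
    points = sorted(points, key=lambda p: p[0])  # order by acquisition time
    frames, current, prev_angle = [], [], None
    for p in points:
        angle = p[1]
        # The angle decreasing means the head wrapped past 360°: new revolution.
        if prev_angle is not None and angle < prev_angle:
            frames.append(current)
            current = []
        current.append(p)
        prev_angle = angle
    if current:
        frames.append(current)
    return frames
```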
Alternatively, the electronic device may slice the point cloud data set to be processed frame by frame according to the time required for the laser head of the lidar to scan one circle and the acquisition time corresponding to each point cloud data in the set, so as to obtain at least one frame of point cloud data. The time required for each revolution of the laser head may be the same. For example, the electronic device may take, as one frame of point cloud data, all point cloud data whose acquisition times fall within one scan-period-long time window, thereby slicing the point cloud data set frame by frame.
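A sketch of the period-based variant, assuming the scan period is known and constant and points are `(acquisition_time, payload)` tuples: point `i` is assigned to frame `floor((t_i - t_0) / period)`.

```python
def slice_by_period(points, period):
    """points: list of (acquisition_time, payload) tuples; period: seconds
    per laser-head revolution. Returns a list of frames in time order."""
    points = sorted(points, key=lambda p: p[0])
    if not points:
        return []
    t0 = points[0][0]
    frames = {}
    for p in points:
        idx = int((p[0] - t0) // period)  # which revolution this point belongs to
        frames.setdefault(idx, []).append(p)
    return [frames[i] for i in sorted(frames)]
```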
S103, compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data.
As a possible implementation manner, the electronic device may compress each frame of point cloud data respectively, to obtain a compression processing result of each frame of point cloud data.
For example, for any frame of point cloud data, the electronic device may perform compression processing on a field of the frame of point cloud data to obtain an intermediate compression processing result of the frame of point cloud data, and perform byte compression processing on the intermediate compression processing result to obtain a final compression processing result of the frame of point cloud data. The fields of the frame point cloud data may include, for example: at least one field of coordinates, time stamp, reflectivity and the like of each point cloud object on each scanning line in the frame point cloud data. The above-described method of compressing the field of the frame point cloud data may be, for example, an encoding compression method such as arithmetic encoding.
Or, for any frame point cloud data, the electronic device may further directly perform byte compression processing on the frame point cloud data to obtain a final compression processing result of the frame point cloud data. Further, for example, the electronic device may use the intermediate compression result corresponding to any frame point cloud data as the compression result of the frame point cloud data.
As another possible implementation manner, the electronic device may further directly perform integral byte compression processing on the at least one frame of point cloud data after the at least one frame of point cloud data is acquired, so as to obtain a compression processing result of the at least one frame of point cloud data. In this implementation manner, after decompressing the compression processing result of the at least one frame of point cloud data, the obtained at least one frame of point cloud data may be the point cloud data after the processing of the point cloud data set by slicing frame by frame.
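The byte-compression stage mentioned above could, for instance, be realized with a general-purpose compressor such as zlib; JSON is used here only as a stand-in serialization, since the patent does not specify one:

```python
import json
import zlib

def byte_compress(result):
    """Serialize an (intermediate or raw) compression result and deflate the bytes."""
    return zlib.compress(json.dumps(result).encode("utf-8"))

def byte_decompress(blob):
    """Invert byte_compress: inflate the bytes and deserialize."""
    return json.loads(zlib.decompress(blob).decode("utf-8"))
```

Because the field-level encoders produce small residuals with many repeated values, a general-purpose byte compressor typically shrinks them well.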
S104, storing compression processing results of at least one frame of point cloud data.
Optionally, the electronic device may store the compression processing result of the at least one frame of point cloud data in at least one point cloud file according to the ordering order of the point cloud frames. Each point cloud file may include a compression processing result of one or more frames of point cloud data, so as to improve efficiency when the compression processing result of the multiple frames of point cloud data is obtained later. Alternatively, the number of frames of the compression processing results of the point cloud data included in the different point cloud files may be the same or different. Alternatively, the electronic device may add, for example, only a compression result of one frame of point cloud data to one point cloud file.
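Packing ordered frame compression results into point cloud files with a per-file cap, as described above, might look like this (the function and parameter names are assumptions):

```python
def pack_into_files(frame_blobs, max_frames_per_file):
    """frame_blobs: per-frame compression results in point-cloud-frame order.
    Returns a list of 'files', each holding at most max_frames_per_file blobs."""
    files = []
    for i in range(0, len(frame_blobs), max_frames_per_file):
        files.append(frame_blobs[i:i + max_frames_per_file])
    return files
```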
Alternatively, taking an example that the electronic device includes a storage device that can be used to store the compression processing result, the electronic device may store the compression processing result of the at least one frame of point cloud data in the storage device, for example. Or, the electronic device may store the compression result to the cloud platform, for example, through a storage service provided by the cloud platform.
It should be understood that the application is not limited to how the downstream processing node (i.e., the processing node that performs the data processing based on the compression processing result) obtains and uses the compression processing result, and the use of the compression processing result by the downstream processing node. Illustratively, the downstream processing node may perform recognition of lane lines or the like based on the compression processing result.
In this embodiment, slicing is performed on a frame-by-frame basis on the point cloud data set, and one frame of point cloud data is used as one rigid body. By compressing at least one frame of point cloud data and storing the compression result of the at least one frame of point cloud data, the data amount for storing the at least one frame of point cloud data is reduced, so that the data amount required to be acquired when the at least one frame of point cloud data is used later is less, and the efficiency of acquiring the at least one frame of point cloud data later is improved.
In some embodiments, before the electronic device slices the point cloud data set frame by frame to obtain at least one frame of point cloud data, the electronic device may further perform preprocessing on the point cloud data set according to a compression processing requirement.
By way of example, the compression processing requirements described above may include, for example, at least one of: compression processing is carried out on ground point cloud data, compression processing is carried out on non-ground point cloud data, denoising and thinning are carried out on point cloud data, or cutting and the like are carried out.
Taking compression processing requirement as an example to compress the ground point cloud data, the preprocessing may be to classify the point cloud data in the point cloud data set, and then reserve the ground point cloud data as the point cloud data set. Taking compression processing requirement as an example to compress non-ground point cloud data, the preprocessing may be to classify the point cloud data in the point cloud data set, and then reserve the non-ground point cloud data as the point cloud data set. Taking the compression processing requirement as an example of denoising the point cloud data, optionally, the preprocessing may be performing point cloud filtering on the point cloud data in the point cloud data set, filtering out noise points in the point cloud data, and obtaining denoised point cloud data as the point cloud data set.
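A sketch of requirement-driven preprocessing, with assumed requirement names and an assumed per-point `is_ground` flag (denoising, thinning, and clipping are omitted from this sketch):

```python
def preprocess(points, requirement):
    """points: list of dicts with at least an 'is_ground' key.
    Filters the set according to the stated compression processing requirement."""
    if requirement == "ground_only":
        return [p for p in points if p["is_ground"]]
    if requirement == "non_ground_only":
        return [p for p in points if not p["is_ground"]]
    return points  # other requirements pass the set through unchanged here
```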
In this embodiment, the electronic device may perform frame-by-frame slicing and compression storage on the point cloud data set after preprocessing the point cloud data set based on the compression processing requirement. By the method, the flexibility of the compression processing process of the point cloud data set is improved, and the compression processing result meeting the compression processing requirement is obtained. Furthermore, by the method, the compression processing result obtained based on the preprocessed point cloud data set can directly meet the use requirement of a downstream processing node, and further the efficiency of data processing by using the compression processing result is improved.
The following details how the electronic device performs compression processing on at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data:
Fig. 5 is a flowchart of a method for obtaining a compression processing result of at least one frame of point cloud data according to the present application. As shown in Fig. 5, as a possible implementation, the foregoing step S103 may include the following steps:
S201, compressing the fields of any frame of point cloud data to obtain an intermediate compression processing result of the frame of point cloud data.
By way of example, the fields of the frame of point cloud data may include, for each point on each scan line, at least one of: spatial position (for example, the coordinates of the point), acquisition time (also referred to as a timestamp), reflectivity, and a target identifier characterizing whether the point is ground point cloud data. Optionally, the compression modes applied to different types of fields of the frame of point cloud data (for example, spatial position, acquisition time, and reflectivity are fields of different types) may be the same or different.
For example, the electronic device may compress different fields of the frame of point cloud data separately through a plurality of different encoding modes. For any frame of point cloud data, the electronic device may first acquire a reference point of the frame, and then differentially encode the spatial information and time information of the frame according to the spatial position and acquisition time of the reference point to obtain a first compression processing result of the frame. The electronic device may perform median predictive coding on the reflectivity of the frame to obtain a second compression processing result, and perform dictionary coding on the target identifier characterizing whether each point is ground point cloud data to obtain a third compression processing result. That is, in this example, the intermediate compression processing result may include the first, second, and third compression processing results.
Optionally, when obtaining the first compression processing result, for any point on any scan line in the frame of point cloud data, the electronic device may obtain the spatial position offset of the point relative to the reference point from the spatial position of the point (for example, its coordinates) and the spatial position of the reference point. The spatial information of the frame of point cloud data may include the spatial position offset of each point in the frame. The electronic device may then encode the spatial position offsets of the points through differential encoding to obtain the encoded compression result of the spatial position offsets corresponding to the frame.
Similarly, for any point on any scan line in the frame, the electronic device may obtain the time offset of the point relative to the reference point from the acquisition time of the point and the acquisition time of the reference point. The time information of the frame may include the time offset of each point in the frame. The electronic device may then encode the time offsets through differential encoding to obtain the encoded compression result of the time offsets corresponding to the frame.
Then, the electronic device may use the encoded compression result of the spatial position offset and the encoded compression result of the temporal offset as a first compression result of the frame point cloud data.
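The offset computation that feeds the first compression processing result can be sketched as follows (the field names `xyz` and `t` for each point's coordinates and timestamp are illustrative):

```python
def offsets_from_reference(points, ref_xyz, ref_t):
    """Spatial position offset and time offset of each point in a frame
    relative to the frame's reference point; these offset sequences are
    what the differential encoding then operates on."""
    xyz_off = [tuple(c - r for c, r in zip(p["xyz"], ref_xyz)) for p in points]
    t_off = [p["t"] - ref_t for p in points]
    return xyz_off, t_off

frame = [{"xyz": (10.0, 20.0, 5.0), "t": 1.00},
         {"xyz": (10.5, 20.2, 5.1), "t": 1.01}]
xyz_off, t_off = offsets_from_reference(frame, (10.0, 20.0, 5.0), 1.00)
```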
When obtaining the second compression processing result, the electronic device may optionally, for example, form the reflectivities of the points in the frame into a reflectivity sequence ordered by acquisition time and input it to a median predictive coding algorithm, so as to obtain the second compression processing result of the frame of point cloud data.
When obtaining the third compression processing result, the electronic device may optionally, for example, form the target identifiers of the points in the frame into a target identifier sequence ordered by acquisition time and input it to a dictionary coding algorithm, so as to obtain the third compression processing result of the frame of point cloud data.
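The application does not fix a particular dictionary coding algorithm; a build-as-you-go symbol table is one simple variant, sketched here for the target identifier sequence:

```python
def dictionary_encode(symbols):
    """Replace each symbol with a small integer code, building the
    dictionary on the fly; returns the code list and the dictionary."""
    table, codes = {}, []
    for s in symbols:
        if s not in table:
            table[s] = len(table)
        codes.append(table[s])
    return codes, table

# Target identifiers ordered by acquisition time: ground / non-ground flags.
codes, table = dictionary_encode(["ground", "ground", "non_ground", "ground"])
```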
It should be understood that the order in which the electronic device obtains the first compression result, the second compression result, and the third compression result is not limited.
In this implementation, the spatial information and time information of the frame of point cloud data are compressed through differential coding, the reflectivity is compressed through median predictive coding, and the target identifier characterizing whether a point is ground point cloud data is compressed through dictionary coding, so that different fields of the frame of point cloud data are compressed in different modes. In this way, each field can be compressed with a coding mode suited to it, which guarantees the compression rate of each field while avoiding the excessively slow decoding that over-compression would cause. The flexibility of compressing the at least one frame of point cloud data is thereby improved, as are the efficiency of decoding the compression result and the efficiency of data processing based on it.
In some embodiments, for any frame of point cloud data, the electronic device may instead use any two of the first, second, and third compression processing results, or any one of them, as the intermediate compression processing result of the frame. For example, the electronic device may use the first compression processing result alone as the intermediate compression processing result. Because the spatial position and acquisition time of the point cloud data account for most of the data volume, while the reflectivity and target identifier account for little, using the first compression processing result as the intermediate compression processing result improves compression efficiency while still ensuring the compression rate of the point cloud data set.
In some embodiments, the intermediate compression processing result of the frame of point cloud data may further include compression results of other fields. The fields of the frame of point cloud data may, for example, also include the confidence of the spatial position of each point on each scan line, or the identifier of the map element corresponding to the point. The intermediate compression processing result may then additionally include a fourth compression processing result obtained by compressing the confidences corresponding to the frame, a fifth compression processing result obtained by compressing the identifiers of the map elements corresponding to the points in the frame, and so on.
S202, performing byte compression processing on the intermediate compression processing result to obtain a final compression processing result of the frame point cloud data.
Optionally, the electronic device may perform byte compression processing on the intermediate compression processing result by any existing byte compression method, for example, the Zstd byte compression algorithm (full name Zstandard, a lossless compression algorithm), GZIP (a byte compression format), and so on, to obtain the final compression processing result of the frame of point cloud data.
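As a sketch of this second-stage byte compression, using the Python standard library's `zlib` (DEFLATE, the algorithm underlying GZIP) as a stand-in for the named algorithms; a production system might use Zstandard instead:

```python
import zlib

def byte_compress(intermediate: bytes) -> bytes:
    """Byte compression of an intermediate compression processing result."""
    return zlib.compress(intermediate, level=6)

def byte_decompress(final: bytes) -> bytes:
    """Recover the intermediate result from the final one."""
    return zlib.decompress(final)

# A repetitive intermediate result compresses well.
blob = b"offset-coded point cloud fields " * 64
final_result = byte_compress(blob)
```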
In this embodiment, compressing the fields of the frame of point cloud data yields the intermediate compression processing result, compressing the fields once. Performing byte compression processing on the intermediate compression processing result then yields the final compression processing result, compressing the data a second time; this further improves the compression rate of the frame of point cloud data, further reduces the data volume of the compression processing result, and further reduces the storage space it occupies.
The following describes in detail how the electronic device obtains the above reference point of the frame point cloud data:
as a possible implementation manner, the electronic device may determine the reference point of the frame point cloud data based on the track point of the laser radar.
For example, the electronic device may first obtain the target acquisition time according to the acquisition time of each point cloud data in the frame of point cloud data. Then, the electronic device may determine the reference point according to the distance between the point cloud data corresponding to the target acquisition time and the track point corresponding to the target acquisition time of the laser radar.
For example, the point cloud data set collected by the laser radar may further include the trajectory point of the laser radar at the time each point was collected. In this implementation, the electronic device may determine, from the point cloud data set, the trajectory point of the laser radar corresponding to the target acquisition time. Alternatively, taking a laser radar mounted on a map acquisition vehicle as an example, the electronic device may use the trajectory point of the map acquisition vehicle at the target acquisition time as the trajectory point of the laser radar corresponding to the target acquisition time.
For example, the electronic device may determine the earliest and latest acquisition times among the points of the frame and use the midpoint between them as the target acquisition time. Alternatively, the electronic device may sort the acquisition times of the points in the frame and use the acquisition time in the middle of the ordering as the target acquisition time.
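Both strategies for picking the target acquisition time can be sketched briefly (a simple illustration, not the application's exact implementation):

```python
def midpoint_time(times):
    """Midpoint between the earliest and latest acquisition times."""
    return (min(times) + max(times)) / 2

def median_time(times):
    """Acquisition time of the point in the middle of the ordering."""
    return sorted(times)[len(times) // 2]

times = [0.0, 0.2, 1.0]  # acquisition times of the points in one frame
```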
In some embodiments, the electronic device may determine the reference point by judging whether the distance between the point cloud data corresponding to the target acquisition time and the trajectory point of the laser radar corresponding to that time is greater than a preset distance threshold. The preset distance threshold may, for example, be stored in the electronic device in advance.
Considering that one frame of point cloud data is the point cloud data acquired in one revolution of the laser radar, the position of the point cloud data corresponding to the target acquisition time should be close to the trajectory line of the laser radar. Therefore, if the distance is greater than the preset distance threshold, the position of the point cloud data corresponding to the target acquisition time has a large error, and the electronic device may optionally determine the reference point according to the trajectory point of the laser radar corresponding to the target acquisition time.
Alternatively, for example, the electronic device may take the trajectory point itself as the reference point, improving the efficiency of determining the reference point. Or the electronic device may use, as the reference point, a point of the frame whose spatial position lies within a preset range of the trajectory point. The preset range may, for example, be stored in the electronic device in advance. In this latter implementation, the reference point is determined from among the frame's own point cloud data.
If the distance between the point cloud data corresponding to the target acquisition time and the trajectory point of the laser radar corresponding to that time is less than or equal to the preset distance threshold, the position of that point cloud data has a small error. Therefore, the electronic device may optionally use the point cloud data corresponding to the target acquisition time as the reference point, or alternatively use the trajectory point of the laser radar corresponding to the target acquisition time as the reference point.
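The threshold-based selection can be sketched as follows (`choose_reference` is a hypothetical helper; `math.dist` computes the Euclidean distance):

```python
import math

def choose_reference(point_xyz, track_xyz, threshold):
    """If the point at the target acquisition time lies farther than
    `threshold` from the lidar trajectory point, its position is deemed
    unreliable and the trajectory point becomes the reference point;
    otherwise the point itself may serve as the reference point."""
    if math.dist(point_xyz, track_xyz) > threshold:
        return track_xyz
    return point_xyz
```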
In this implementation, the reference point of the frame of point cloud data is determined based on the target acquisition time and the distance to the trajectory point of the laser radar corresponding to that time, ensuring that the acquisition time and spatial position of the reference point lie near the middle of the frame. More points in the frame therefore have small offsets relative to the reference point, which improves the compression rate of the subsequent differential encoding of the frame's spatial information and time information.
As another possible implementation, after acquiring the target acquisition time, the electronic device may directly use the trajectory point of the laser radar corresponding to the target acquisition time as the reference point, improving the efficiency of acquiring the reference point of the frame of point cloud data.
The following describes in detail how the electronic device stores the compression processing result of at least one frame of point cloud data:
as a possible implementation, the electronic device may store the compression processing results of the at least one frame of point cloud data into at least one point cloud file according to the ordering of the point cloud frames and a preset maximum storable number of point cloud frames for a single point cloud file.
Taking the compression processing results of 25 frames of point cloud data as an example, and assuming the preset maximum storable number of point cloud frames for a single point cloud file is 10, the electronic device may, following the ordering of the point cloud frames, store the compression processing results of frames 1 to 10 into the 1st point cloud file, the compression processing results of frames 11 to 20 into the 2nd point cloud file, and the compression processing results of frames 21 to 25 into the 3rd point cloud file.
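The frame-count-based layout in the example above can be sketched as:

```python
def chunk_by_frame_count(frame_results, max_frames):
    """Assign per-frame compression results to point cloud files in frame
    order, at most `max_frames` results per file."""
    return [frame_results[i:i + max_frames]
            for i in range(0, len(frame_results), max_frames)]

# 25 frame results, at most 10 per file -> files of 10, 10 and 5 frames.
files = chunk_by_frame_count(list(range(1, 26)), 10)
```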
In this embodiment, the compression processing results of the at least one frame of point cloud data are stored into the at least one point cloud file according to the preset maximum storable number of point cloud frames for a single point cloud file, so that one point cloud file may contain the compression processing results of multiple frames. When point cloud files are read later, each file requires one connection to the storage device holding it: after the file is read the connection is released, and reading the next file requires establishing a new connection. Reading the compression processing results of multiple frames from a single file therefore requires only one connection to the storage device. That is, this method reduces the number of connections that must be established with the storage device when the point cloud files are subsequently read, improving the efficiency of subsequently acquiring the compression processing results.
As another possible implementation, the electronic device may store the compression processing results of the at least one frame of point cloud data into at least one point cloud file according to, for example, the ordering of the point cloud frames and a preset maximum storable data size for a single point cloud file.
For example, taking the compression processing results of 25 frames of point cloud data, each about 3 kB in size, as an example, and assuming the preset maximum storable data size of a single point cloud file is 28 kB, the electronic device may, following the ordering of the point cloud frames, store the compression processing results of frames 1 to 9 into the 1st point cloud file, the compression processing results of frames 10 to 18 into the 2nd point cloud file, and the compression processing results of frames 19 to 25 into the 3rd point cloud file.
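The size-based layout can be sketched as a greedy packer (assuming, as in the example, roughly 3 kB per frame and a 28 kB per-file limit):

```python
def chunk_by_size(frame_sizes, max_bytes):
    """Pack ordered per-frame compression results into files so that each
    file's total size stays within `max_bytes`; returns lists of 1-based
    frame numbers per file."""
    files, current, used = [], [], 0
    for n, size in enumerate(frame_sizes, start=1):
        if current and used + size > max_bytes:
            files.append(current)       # current file is full; start a new one
            current, used = [], 0
        current.append(n)
        used += size
    if current:
        files.append(current)
    return files

layout = chunk_by_size([3] * 25, 28)    # 25 frames of 3 kB, 28 kB per file
```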
In this embodiment, based on the preset maximum storable data size of a single point cloud file, the compression processing result of the at least one frame of point cloud data is stored in the at least one point cloud file, so that the data size of each point cloud file is ensured not to be too large, and the efficiency of subsequently acquiring the point cloud file is improved.
As a further possible implementation, the electronic device may also construct and store an index file for each point cloud file. With the index file, a downstream processing node can obtain the compression processing result of the required frame of point cloud data, improving the efficiency with which downstream processing nodes acquire the compression processing results of required frames.
In this implementation manner, optionally, the index file includes at least one of the following, for example: version identification of an index file, an acquisition time range and a space range corresponding to a point cloud file, an acquisition time range and a space range corresponding to each frame of point cloud data in the point cloud file, description information of compression processing results of each frame of point cloud data included in the point cloud file, and the like. Wherein the description information may include at least one of: the relative position in the point cloud file, byte length, compression mode, spatial position and acquisition time of the reference point, the type of laser head of the corresponding laser radar, and the like.
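The listed index contents can be sketched as a data structure (field names are illustrative; the application does not fix a serialization format):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FrameDescription:
    """Description information of one frame's compression processing result."""
    relative_position: int                     # byte offset within the file
    byte_length: int
    compression_mode: str
    reference_xyz: Tuple[float, float, float]  # spatial position of reference point
    reference_time: float                      # acquisition time of reference point
    laser_head_type: str

@dataclass
class PointCloudIndex:
    """Index file for one point cloud file."""
    version: str
    time_range: Tuple[float, float]            # acquisition time range of the file
    bbox: Tuple[float, float, float, float, float, float]  # spatial range
    frames: List[FrameDescription] = field(default_factory=list)

idx = PointCloudIndex(version="v1", time_range=(0.0, 2.5),
                      bbox=(0.0, 0.0, 0.0, 10.0, 10.0, 3.0))
idx.frames.append(FrameDescription(0, 2880, "delta+byte",
                                   (1.0, 2.0, 0.5), 0.1, "multi-line"))
```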
Taking the case where the index file includes the version identifier of the index file as an example, the downstream processing node may determine from the version identifier how to use the index file, and use it to index into the at least one point cloud file to obtain the compression processing results of the required at least one frame of point cloud data. Considering that the content of the index file may be updated, producing index files of different versions, this improves the universality and extensibility of the compression processing results obtained by the point cloud data processing method.
Taking the case where the index file includes the acquisition time range and spatial range corresponding to the point cloud file as an example, the acquisition time range corresponding to the point cloud file may be the range between the earliest and latest acquisition times of the point cloud data in the file. The spatial range corresponding to the point cloud file may be a bounding box (bbox) containing the spatial coordinates of all the point cloud data in the file. Optionally, the electronic device may obtain the spatial range from the spatial coordinates of all the point cloud data in the file by any existing bbox computation method. Because the index file includes the acquisition time range and spatial range of each point cloud file, the downstream processing node can index, from the at least one point cloud file, the point cloud file holding the compression processing results of the required at least one frame of point cloud data according to the spatial coordinates and/or acquisition times of the point cloud data to be queried.
Taking the case where the index file includes the acquisition time range and spatial range corresponding to each frame of point cloud data in the point cloud file as an example, these ranges may optionally be obtained in the same way as those of the point cloud file and are not described again here. Because the index file includes these per-frame ranges, the downstream processing node can obtain the required compression processing results of at least one frame of point cloud data from the at least one point cloud file according to the spatial coordinates and/or acquisition times of the point cloud data to be queried.
Taking the case where the index file includes the description information of the compression processing result of each frame of point cloud data in the point cloud file as an example, and assuming the description information includes the relative position and byte length within the point cloud file, the downstream processing node may, according to the relative position and byte length, locate the byte span of the required frame's compression processing result within the corresponding point cloud file, read the data at that span, and obtain the required compression processing result. The relative position of each frame's compression processing result may be its position relative to the compression processing result of the first frame in the file.
Assuming the description information includes the compression mode, the spatial position and acquisition time of the reference point, and the laser head type of the corresponding laser radar, the downstream processing node may determine the decompression mode from the compression mode, and the parameters required for decompression from the spatial position and acquisition time of the reference point, and thereby decompress the acquired compression processing result. In some embodiments, the downstream processing node may further process each frame of decompressed point cloud data according to the laser head type of the corresponding laser radar (for example, single-line and multi-line laser heads may be different laser head types), which is not limited in this application.
Fig. 6 is a flow chart of another point cloud data processing method provided in the present application. As shown in Fig. 6, the method comprises the following steps:
Step 1, configuring the laser radar of a map data acquisition vehicle.
The laser radar may be a single-line laser radar, a multi-line laser radar, a laser radar combining multiple laser heads, or another laser radar of a different specification.
Step 2, acquiring the point cloud data set collected by the laser radar, and performing trajectory analysis and frame-by-frame slicing on the point cloud data set.
Trajectory analysis may refer, for example, to obtaining the rotation angle of the laser radar's laser head corresponding to each point cloud datum. The point cloud data set is then framed based on the rotation angle, which preserves the physical rigidity of the laser scan rather than roughly segmenting the data by distance.
Slicing the point cloud data set frame by frame can be understood as converting the point cloud data of laser radars of different specifications into byte blocks at frame granularity. The laser point cloud data may be divided into a plurality of point cloud data sets. Fig. 7 is a schematic flow chart of another point cloud data processing method provided in the present application. The laser point cloud data shown in Fig. 7, for example, can be divided into point cloud data set 1 and point cloud data set 2, where each point cloud data set may include N frames of point cloud data (frame N data as shown in Fig. 7). Further, as shown in Fig. 7, one frame of point cloud data may include point cloud data of M scan lines (line M data as shown in Fig. 7), and each scan line may include K points (point K as shown in Fig. 7).
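The hierarchy described above (point cloud data set → N frames → M scan lines → K points) can be sketched as:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Point:
    xyz: Tuple[float, float, float]
    t: float                                            # acquisition time

@dataclass
class ScanLine:
    points: List[Point] = field(default_factory=list)   # K points per line

@dataclass
class Frame:
    lines: List[ScanLine] = field(default_factory=list) # M scan lines per frame

@dataclass
class PointCloudDataSet:
    frames: List[Frame] = field(default_factory=list)   # N frames per set

# One set with 2 frames of 3 lines of 4 points each.
data_set = PointCloudDataSet(frames=[
    Frame(lines=[ScanLine(points=[Point((0.0, 0.0, 0.0), 0.0)
                                  for _ in range(4)])
                 for _ in range(3)])
    for _ in range(2)])
```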
By performing trajectory analysis and frame-by-frame slicing on the point cloud data set, the method processes point clouds collected by multiple kinds of laser radar into a unified specification, making it convenient for subsequent downstream processing nodes, such as recognition nodes and data alignment and rectification nodes, to read the point cloud on demand.
Step 3, compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data.
In some embodiments, before the compression processing, the point cloud data in the point cloud data set may undergo preprocessing such as point cloud clipping, point cloud thinning, point cloud filtering, point cloud classification, and road surface frame identification. Optionally, any existing implementation of these preprocessing steps may be used and is not described here. In some embodiments, preprocessing such as point cloud clipping, point cloud thinning, and road surface frame identification may be skipped to improve point cloud data processing efficiency.
The electronic device may obtain the spatial position offset of each point relative to the reference point and its time offset relative to the reference point's acquisition time through local coordinate conversion. Fig. 8 is a schematic diagram of the local coordinate transformation provided in the present application. As shown in Fig. 8, for any point cloud datum, its coordinates in the engineering coordinate system may be the coordinates as originally acquired. "Establishing local coordinates for each slice" refers to determining the reference point corresponding to each frame of point cloud data (for the specific implementation, reference may be made to the foregoing embodiments, not repeated here). The local coordinate system is a coordinate system with the reference point as its origin (the slice origin is based on the reference point). In this coordinate system, the spatial position offset of each point relative to the reference point and its time offset relative to the reference point's acquisition time can be determined (for example, data stream K shown in Fig. 7).
Computing the spatial position offsets and time offsets of the point cloud data reduces the data volume used to represent their spatial positions and acquisition times. In addition, taking as an example a downstream processing node rectifying the coordinate values of the point cloud data, the coordinates of every point need not be corrected individually; only the coordinates of each frame's reference point need change. Because the coordinates of the points in a frame are offsets relative to the reference point, changing the reference point's coordinates is equivalent to correcting the coordinates of all the points, improving the efficiency with which downstream processing nodes subsequently process the point cloud data.
For any frame of point cloud data, after the spatial position offset and time offset of each point in the frame are obtained, they may be encoded and compressed through differential encoding to obtain the first compression processing result of the frame. Differential encoding (delta encoding) works as follows: let the value of some attribute of the nth data point equal a(n) and that of the (n-1)th data point equal a(n-1); then a(n) - a(n-1) is the differentially encoded result. Differential encoding reduces the dynamic range of slowly changing data attributes and is a simple, effective means of removing correlation between adjacent data points. The compression rate gain from differential encoding differs between attributes: it can reach more than 10% at best, but for attributes whose values change sharply between adjacent data points, such as reflectivity, differential encoding actually reduces the compression rate. The spatial position offsets and time offsets can therefore be compressed in this manner.
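The differential encoding described above, together with its inverse, can be sketched as:

```python
def delta_encode(seq):
    """Keep the first value, then store a(n) - a(n-1) for n >= 1."""
    return [seq[0]] + [b - a for a, b in zip(seq, seq[1:])]

def delta_decode(deltas):
    """Inverse: a running prefix sum restores the original sequence."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

# A slowly changing attribute yields small deltas (small dynamic range).
slow = [100, 101, 103, 104, 104, 106]
```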
Median predictive coding is performed on the reflectivity of the frame of point cloud data to obtain the second compression processing result of the frame. Median predictive coding works as follows: median prediction is a special form of predictive coding. For a given attribute of the point cloud data, median prediction maintains a sorted list of historical data of length 5, predicts the current value with the third element of the list (i.e., the one in the middle), and writes the prediction residual into the code stream file or passes it to the subsequent processing flow. Median prediction suits data with sharp instantaneous changes but a relatively stable long-term trend, such as differentially encoded coordinates or the raw values of attributes like reflectivity. Median prediction often works well when combined with other predictive coding.
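A sketch of median predictive coding as described (sorted 5-element sliding history, predict with the middle element, emit residuals). The edge rules before the history fills up — predicting with the middle of what is available, and 0 for the first value — are assumptions for this sketch:

```python
from collections import deque

WINDOW = 5

def _predict(history):
    """Median of the sliding history (0 while the history is empty)."""
    ordered = sorted(history)
    return ordered[len(ordered) // 2] if ordered else 0

def median_encode(seq):
    """Emit prediction residuals against the median of the last WINDOW values."""
    history, residuals = deque(maxlen=WINDOW), []
    for x in seq:
        residuals.append(x - _predict(history))
        history.append(x)
    return residuals

def median_decode(residuals):
    """Mirror the encoder to rebuild the original sequence."""
    history, out = deque(maxlen=WINDOW), []
    for r in residuals:
        x = r + _predict(history)
        out.append(x)
        history.append(x)
    return out

signal = [7, 9, 8, 30, 8, 7, 9, 8]  # spiky instantaneously, stable long-term
```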
The fields of the frame of point cloud data may also be compressed with any other existing encoding method, such as other types of predictive coding or arithmetic coding.
More generally, let the value of an attribute of the n-th data point be a(n) and its predicted value be p(n). Writing the residual a(n)-p(n), together with the parameters required to compute p(n), into a code stream file or a subsequent processing flow when compressing data is called predictive coding. Differential encoding can be viewed as the simplest predictive coding, with p(n)=a(n-1): the attribute a(n) of the data point numbered n is predicted directly from the same attribute a(n-1) of the adjacent data point numbered n-1. How the predicted value p(n) is computed depends on the correlation between the data points and the attribute in question. The target identifier characterizing whether the point cloud data is ground point cloud data may be compressed in this manner.
Arithmetic coding is a widely used lossless data compression method and a form of entropy coding. It subdivides the unit interval according to the occurrence probabilities of the symbols to be encoded and encodes the subinterval in which each symbol falls, so that symbols with higher occurrence probability receive shorter codes. Different attributes of point cloud data follow different probability distributions, so their occurrence probabilities must be counted separately; each such statistics process is called a context, and different attributes use different contexts. In particular, for the target identifier distinguishing ground points from non-ground points, the same attribute of ground points and of non-ground points also uses different contexts, which improves the compression rate by about 2%.
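The benefit of per-class contexts can be illustrated with an entropy estimate. This sketch is not the patent's coder: it only uses the fact that an arithmetic coder's output size approaches the Shannon entropy of its probability model, so splitting one attribute's statistics into ground and non-ground contexts cannot model worse than one shared context, and helps when the two distributions differ. The synthetic data below is an assumption for illustration.

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Total bits an ideal entropy coder needs for this symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values()) * n

# Synthetic attribute values: ground points cluster low, non-ground high.
ground = [0, 0, 1, 0, 1, 0, 0, 1] * 20
non_ground = [7, 6, 7, 7, 6, 7, 5, 7] * 20

shared = entropy_bits(ground + non_ground)                  # one shared context
split = entropy_bits(ground) + entropy_bits(non_ground)     # per-class contexts
assert split <= shared
print(f"shared context: {shared:.0f} bits, split contexts: {split:.0f} bits")
```

The gap between the two totals is the headroom that per-class contexts give the arithmetic coder.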
The electronic device may further perform byte compression on the intermediate compression processing result to obtain the final compression processing result of the frame of point cloud data (e.g., the frame 1 data stream in the XBC data body shown in fig. 7). Illustratively, table 1 below compares the compression effects of several existing byte compression algorithms:
TABLE 1
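Table 1's figures are not reproduced in this text, so as a hedged stand-in, the sketch below simply runs a few byte compressors from the Python standard library on synthetic residual-like data and reports the sizes they achieve; the choice of algorithms and test data is an assumption, not the patent's benchmark.

```python
import bz2
import lzma
import zlib

# Repetitive, low-dynamic-range bytes, loosely resembling prediction residuals.
raw = bytes(i % 7 for i in range(10_000))

for name, compress, decompress in [
    ("zlib", zlib.compress, zlib.decompress),
    ("bz2",  bz2.compress,  bz2.decompress),
    ("lzma", lzma.compress, lzma.decompress),
]:
    packed = compress(raw)
    assert decompress(packed) == raw   # byte compression must be lossless
    print(f"{name}: {len(raw)} -> {len(packed)} bytes")
```

Relative sizes on real point cloud residuals will differ; the point is that the field-level coding stage shrinks the dynamic range so that any general byte compressor works harder per byte.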
The encoding algorithms described above may be deployed in an encoder of the electronic device; an electronic device in which a decoder is deployed decompresses the compression result with the corresponding decoding algorithm.
This approach reduces the storage space of the point cloud, improves the compression rate, reduces the possibility of out-of-memory (OOM) failures in subsequent point cloud processing, and reduces the input/output (I/O) time of the disk storing the compression processing result.
And 4, storing the compression processing result of the at least one frame of point cloud data according to the mode that the compression processing result of the multi-frame point cloud data is stored in one point cloud file.
The electronic device may store the compression processing result of the at least one frame of point cloud data in at least one point cloud file according to the ordering of the point cloud frames and a preset maximum number of point cloud frames storable in a single point cloud file, and store the at least one point cloud file (such as the XBC data body shown in fig. 7) according to the XBC storage specification, together with an index file corresponding to the point cloud file (such as the POSLOG index shown in fig. 6 and fig. 7). XBC is a file format name, where X denotes something new or greater, B denotes binary Byte, and C denotes Compression.
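The batching rule above can be sketched as follows. The constant and function names are assumptions for illustration; the patent does not fix a particular maximum frame count per file.

```python
# Group per-frame compression results, in frame order, into point-cloud-file
# batches of at most MAX_FRAMES_PER_FILE frames each. Illustrative only.

MAX_FRAMES_PER_FILE = 4  # assumed value of the preset maximum

def chunk_frames(frame_results):
    """Split an ordered list of frame results into per-file batches."""
    return [
        frame_results[i:i + MAX_FRAMES_PER_FILE]
        for i in range(0, len(frame_results), MAX_FRAMES_PER_FILE)
    ]

frames = [f"frame-{n}".encode() for n in range(10)]
files = chunk_frames(frames)
assert len(files) == 3            # 4 + 4 + 2 frames
assert files[-1] == frames[8:]    # the last file may hold fewer frames
```

Fixed-size batches keep individual files small enough to transmit conveniently while preserving frame order.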
Compared with existing file formats such as PCD and PLY, which do not support spatio-temporal indexing and suffer from low compression rates, the XBC file constructed by this method supports spatio-temporal indexing, achieves a higher compression rate, and reduces the storage space occupied by compression processing results. PCD stands for Point Cloud Data; it supports two data storage types, ASCII and binary, and has a file header describing the overall information of the point cloud. PLY is a file format for storing 3D models, fully named the Polygon File Format or Stanford Triangle Format; it is mainly used to store three-dimensional scanning results and describes a three-dimensional object through a set of polygon faces.
And 5, constructing and storing an index file of the point cloud file.
The electronic device may build a frame-granularity spatio-temporal index (referred to simply as a frame spatio-temporal index; this is the index file of the point cloud file described in the previous embodiments). The content of the index file (the frame N index) may include, for example, the relative position (i.e., the offset shown in fig. 7) of each frame's compression processing result within the point cloud file, its byte length (i.e., the byte size shown in fig. 7), and so on.
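The offset/byte-length index described above can be sketched as follows; field and function names are illustrative assumptions, not the XBC/POSLOG layout itself.

```python
# Build a point-cloud-file body plus a frame-granularity index recording,
# for each frame's compressed bytes, its offset and byte length, so a reader
# can fetch one frame without scanning the whole file. Illustrative only.

def build_file_and_index(frame_blobs):
    body = bytearray()
    index = []                                   # one entry per frame
    for blob in frame_blobs:
        index.append({"offset": len(body), "byte_size": len(blob)})
        body.extend(blob)
    return bytes(body), index

def read_frame(body, index, n):
    """Random access to frame n via its index entry."""
    entry = index[n]
    return body[entry["offset"]: entry["offset"] + entry["byte_size"]]

blobs = [b"aaaa", b"bb", b"cccccc"]
body, index = build_file_and_index(blobs)
assert read_frame(body, index, 1) == b"bb"
assert index[2] == {"offset": 6, "byte_size": 6}
```

A real index entry would also carry the time range, spatial range, and compression mode per frame, enabling the spatio-temporal queries described here.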
In this embodiment, point cloud data compression and byte compression are performed at the granularity of point cloud frames, and point cloud files are stored at the granularity of a fixed batch of continuous frames (the preset maximum number of point cloud frames storable in a single point cloud file), which facilitates transmission. Meanwhile, one point cloud file can contain the compression processing results of multiple frames, each frame can contain multiple pieces of scan line information, and these results are written into the index file poslog, so that a downstream processing node can perform spatio-temporal indexing at frame granularity. In this way, a service user (downstream processing node) can index point cloud data directly; establishing temporal and spatial indexes, rather than track buffering (i.e., additionally acquiring the slices near a required slice in addition to that slice), reduces the flow of redundant data.
Fig. 9 is a schematic structural diagram of another data processing apparatus provided in the present application. As shown in fig. 9, the apparatus includes: an acquisition module 31, a slicing module 32, a processing module 33, and a storage module 34. Wherein:
the acquiring module 31 is configured to acquire a point cloud data set to be processed.
The slicing module 32 is configured to slice the point cloud data set to be processed frame by frame, so as to obtain at least one frame of point cloud data.
And the processing module 33 is configured to perform compression processing on the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data.
The storage module 34 is configured to store a compression processing result of the at least one frame of point cloud data.
Optionally, the slicing module 32 is specifically configured to acquire, for any point cloud data in the point cloud data set to be processed, the rotation angle, at the time of acquisition, of the laser head of the laser radar that collected the point cloud data, and to slice the point cloud data set to be processed frame by frame according to the rotation angle corresponding to each point cloud data in the set, to obtain at least one frame of point cloud data. Alternatively, the slicing module 32 is specifically configured to slice the point cloud data set to be processed frame by frame according to the time required for the laser head of the laser radar to complete one scan revolution and the acquisition time corresponding to each point cloud data in the set, to obtain at least one frame of point cloud data.
Optionally, the processing module 33 is specifically configured to compress, for any frame of point cloud data, a field of the frame of point cloud data to obtain an intermediate compression result of the frame of point cloud data; and performing byte compression processing on the intermediate compression processing result to obtain a final compression processing result of the frame point cloud data.
Optionally, the intermediate compression processing result includes: a first compression result, a second compression result, and a third compression result. Optionally, the processing module 33 is specifically configured to obtain a reference point of the frame point cloud data; according to the space position and the acquisition time of the reference point, carrying out differential encoding on the space information and the time information of the frame point cloud data to obtain a first compression processing result of the frame point cloud data; performing median predictive coding on the reflectivity of the frame point cloud data to obtain a second compression processing result of the frame point cloud data; and carrying out dictionary coding on the target identification of the frame point cloud data to obtain a third compression processing result of the frame point cloud data. Wherein the target identifier is used for representing whether the point cloud data is ground point cloud data.
Optionally, the processing module 33 is specifically configured to obtain a target acquisition time according to the acquisition time of each point cloud data in the frame of point cloud data; and determining the reference point according to the distance between the point cloud data corresponding to the target acquisition time and the track point corresponding to the target acquisition time of the laser radar.
Optionally, the processing module 33 is specifically configured to determine the reference point according to the track point when the distance is greater than a preset distance threshold; and when the distance is smaller than or equal to the preset distance threshold, taking the point cloud data corresponding to the target acquisition time as the reference point.
Optionally, the processing module 33 is specifically configured to take the track point as the reference point, or to take, as the reference point, the point cloud data in the frame of point cloud data whose spatial position lies within a preset range of the track point.
Optionally, the storage module 34 is specifically configured to store the compression processing result of the at least one frame of point cloud data in at least one point cloud file according to the ordering order of the point cloud frames and the preset maximum storable point cloud frame number of the single point cloud file.
Optionally, the apparatus may further include a construction module 35, configured to construct an index file of the point cloud file. Optionally, the storage module 34 is further configured to store an index file of the point cloud file. Optionally, the index file includes at least one of: version identification of the index file; the acquisition time range and the space range corresponding to the point cloud file; acquiring a time range and a space range corresponding to each frame of point cloud data in the point cloud file; and the point cloud file comprises descriptive information of compression processing results of each frame of point cloud data. Wherein the description information includes at least one of: the relative position in the point cloud file, byte length, compression mode, the spatial position and acquisition time of the reference point and the laser head type of the corresponding laser radar.
The data processing device provided by the application is used for executing the embodiment of the point cloud data processing method, and the implementation principle and the technical effect are similar, and are not repeated.
Fig. 10 is a schematic diagram of the hardware structure of an electronic device provided in the present application. The electronic device 40 shown in fig. 10 includes a memory 41, a processor 42, and a communication interface 43, which are communicatively connected to each other, for example via a network. Alternatively, the electronic device 40 may also include a bus 44 through which the memory 41, the processor 42, and the communication interface 43 are communicatively connected to each other, as shown in fig. 10.
The memory 41 may be a read-only memory (Read Only Memory, ROM), a static storage device, a dynamic storage device, or a random access memory (Random Access Memory, RAM). The memory 41 may store a program; when the program stored in the memory 41 is executed by the processor 42, the processor 42 and the communication interface 43 are used to carry out the point cloud data processing method of any of the preceding embodiments. The memory may also store the data required by that method.
The processor 42 may be a general-purpose CPU, a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a graphics processor (graphics processing unit, GPU), or one or more integrated circuits.
The processor 42 may also be an integrated circuit chip with signal processing capability. In implementation, the point cloud data processing functions of the present application may be accomplished by integrated logic circuits of hardware in the processor 42 or by instructions in the form of software. The processor 42 may also be a general-purpose processor, a digital signal processor (digital signal processing, DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in the embodiments may be executed directly by a hardware processor, or by a combination of hardware and software modules in the processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory 41; the processor 42 reads the information in the memory 41 and, in combination with its hardware, performs the point cloud data processing functions of the present application.
The communication interface 43 enables communication between the electronic device 40 and other devices or communication networks using a transceiver module such as, but not limited to, a transceiver. For example, the data set may be acquired through the communication interface 43.
When the electronic device 40 includes a bus 44, the bus 44 may include a path that communicates information between the various components of the electronic device 40 (e.g., memory 41, processor 42, communication interface 43).
The present application also provides a computer-readable storage medium, which may include a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or any other medium in which program code may be stored; in particular, the computer-readable storage medium stores program instructions for the methods of the above embodiments.
The present application also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the electronic device may read the execution instructions from the readable storage medium, and execution of the execution instructions by the at least one processor causes the electronic device to implement the point cloud data processing method provided by the various embodiments described above.
The term "plurality" herein refers to two or more. The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship; in a formula, the character "/" indicates that the associated objects are in a "division" relationship. It should also be understood that in the description of this application, the words "first", "second", and the like are used merely to distinguish between descriptions and do not indicate or imply relative importance or order.
It will be appreciated that the various numerical numbers referred to in the embodiments of the present application are merely for ease of description and are not intended to limit the scope of the embodiments of the present application.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present application. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and that such modifications and substitutions do not cause the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (12)

1. A method for processing point cloud data, the method comprising:
acquiring a point cloud data set to be processed;
slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data;
compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data;
and storing the compression processing result of the at least one frame of point cloud data.
2. The method according to claim 1, wherein slicing the set of point cloud data to be processed frame by frame to obtain at least one frame of point cloud data comprises:
for any point cloud data in the point cloud data set to be processed, acquiring the rotation angle, at the time of acquisition, of the laser head of the laser radar that collected the point cloud data;
and slicing the point cloud data set to be processed frame by frame according to the rotation angle corresponding to each point cloud data in the point cloud data set to be processed to obtain at least one frame of point cloud data.
3. The method according to claim 1, wherein slicing the set of point cloud data to be processed frame by frame to obtain at least one frame of point cloud data comprises:
slicing the point cloud data set to be processed frame by frame according to the time required for the laser head of the laser radar to complete one scan revolution and the acquisition time corresponding to each point cloud data in the point cloud data set to be processed, to obtain at least one frame of point cloud data.
4. A method according to any one of claims 1 to 3, wherein the compressing the at least one frame of point cloud data to obtain a compression result of the at least one frame of point cloud data includes:
for any frame of point cloud data, performing compression processing on fields of the frame of point cloud data to obtain an intermediate compression processing result of the frame of point cloud data;
and performing byte compression processing on the intermediate compression processing result to obtain a final compression processing result of the frame point cloud data.
5. The method of claim 4, wherein the intermediate compression processing result comprises: a first compression result, a second compression result, and a third compression result; the compressing the field of the frame point cloud data to obtain an intermediate compression result of the frame point cloud data, including:
acquiring a reference point of the frame point cloud data;
According to the space position and the acquisition time of the reference point, carrying out differential encoding on the space information and the time information of the frame point cloud data to obtain a first compression processing result of the frame point cloud data;
performing median predictive coding on the reflectivity of the frame point cloud data to obtain a second compression processing result of the frame point cloud data;
dictionary coding is carried out on the target identification of the frame point cloud data to obtain a third compression processing result of the frame point cloud data, wherein the target identification is used for representing whether the point cloud data are ground point cloud data or not.
6. The method of claim 5, wherein the obtaining the reference point of the frame point cloud data comprises:
acquiring target acquisition time according to the acquisition time of each point cloud data in the frame of point cloud data;
if the distance between the point cloud data corresponding to the target acquisition time and the track point corresponding to the target acquisition time of the laser radar is greater than a preset distance threshold, determining the reference point according to the track point;
and if the distance is smaller than or equal to the preset distance threshold, taking the point cloud data corresponding to the target acquisition time as the reference point.
7. The method of claim 6, wherein said determining said reference point from said trajectory point comprises:
Taking the track point as the reference point;
or taking the point cloud data of which the spatial position is positioned in the track point preset range in the frame of point cloud data as the reference point.
8. The method according to any one of claims 5-7, wherein storing the compression processing result of the at least one frame of point cloud data comprises:
and storing the compression processing result of the at least one frame of point cloud data into at least one point cloud file according to the ordering sequence of the point cloud frames and the preset maximum storable point cloud frame number of the single point cloud file.
9. The method of claim 8, wherein the method further comprises:
constructing and storing an index file of the point cloud file; the index file includes at least one of: version identification of the index file; the acquisition time range and the space range corresponding to the point cloud file; acquiring a time range and a space range corresponding to each frame of point cloud data in the point cloud file; the point cloud file comprises descriptive information of compression processing results of each frame of point cloud data, and the descriptive information comprises at least one of the following: the relative position in the point cloud file, byte length, compression mode, the spatial position and acquisition time of the reference point and the laser head type of the corresponding laser radar.
10. A data processing apparatus, the apparatus comprising:
the acquisition module is used for acquiring the point cloud data set to be processed;
the slicing module is used for slicing the point cloud data set to be processed frame by frame to obtain at least one frame of point cloud data;
the processing module is used for compressing the at least one frame of point cloud data to obtain a compression processing result of the at least one frame of point cloud data;
and the storage module is used for storing the compression processing result of the at least one frame of point cloud data.
11. An electronic device, comprising: a processor and a memory; the processor is in communication with the memory;
the memory stores computer instructions;
the processor executes the computer instructions stored by the memory to implement the method of any one of claims 1-9.
12. A computer readable storage medium having stored thereon computer executable instructions which, when executed by a processor, implement the method of any of claims 1-9.
CN202310295055.9A 2023-03-23 2023-03-23 Point cloud data processing method, device, equipment and storage medium Pending CN116468811A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310295055.9A CN116468811A (en) 2023-03-23 2023-03-23 Point cloud data processing method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116468811A true CN116468811A (en) 2023-07-21

Family

ID=87172624

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310295055.9A Pending CN116468811A (en) 2023-03-23 2023-03-23 Point cloud data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116468811A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116758174A (en) * 2023-08-16 2023-09-15 北京易控智驾科技有限公司 Compression transmission method and device for laser point cloud data, electronic equipment and storage medium
CN116758174B (en) * 2023-08-16 2023-11-10 北京易控智驾科技有限公司 Compression transmission method and device for laser radar point cloud data and electronic equipment
CN117056749A (en) * 2023-10-12 2023-11-14 深圳市信润富联数字科技有限公司 Point cloud data processing method and device, electronic equipment and readable storage medium
CN117056749B (en) * 2023-10-12 2024-02-06 深圳市信润富联数字科技有限公司 Point cloud data processing method and device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN116468811A (en) Point cloud data processing method, device, equipment and storage medium
CN111052189B (en) System, method, and computer-readable medium for generating a compressed point cloud
US20220130074A1 (en) Method, device, and storage medium for data encoding/decoding
US9384387B2 (en) Concept for encoding data defining coded positions representing a trajectory of an object
US20230086264A1 (en) Decoding method, encoding method, decoder, and encoder based on point cloud attribute prediction
CN106897454B (en) File classification method and device
CN115208414B (en) Data compression method, data compression device, computer device and storage medium
CN113792816B (en) Data encoding method, data encoding device, computer equipment and storage medium
US11922018B2 (en) Storage system and storage control method including dimension setting information representing attribute for each of data dimensions of multidimensional dataset
WO2022067775A1 (en) Point cloud encoding and decoding method, encoder, decoder and codec system
Pensiri et al. A lossless image compression algorithm using predictive coding based on quantized colors
EP4216553A1 (en) Point cloud decoding and encoding method, and decoder, encoder and encoding and decoding system
EP2469469A1 (en) Concept for encoding data defining coded orientations representing a reorientation of an object
Zimmer et al. PointCompress3D--A Point Cloud Compression Framework for Roadside LiDARs in Intelligent Transportation Systems
CN115412713B (en) Prediction coding and decoding method and device for point cloud depth information
WO2024149258A1 (en) Method, apparatus, and medium for point cloud coding
CN115412715B (en) Method and device for predicting coding and decoding of point cloud attribute information
WO2024149203A1 (en) Method, apparatus, and medium for point cloud coding
WO2024149309A1 (en) Method, apparatus, and medium for point cloud coding
US20240242393A1 (en) Method, apparatus and medium for point cloud coding
US20240348772A1 (en) Method, apparatus, and medium for point cloud coding
WO2023131131A1 (en) Method, apparatus, and medium for point cloud coding
WO2024074121A1 (en) Method, apparatus, and medium for point cloud coding
WO2024074123A1 (en) Method, apparatus, and medium for point cloud coding
WO2024213148A1 (en) Method, apparatus, and medium for point cloud coding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination