CN114415141A - Radar data processing method and device, vehicle and storage medium - Google Patents

Radar data processing method and device, vehicle and storage medium

Info

Publication number
CN114415141A
CN114415141A
Authority
CN
China
Prior art keywords
target
information
channel
radar
gray scale
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210101645.9A
Other languages
Chinese (zh)
Inventor
张良良
刘强
张倬睿
赖健明
黄雨其
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xiaopeng Autopilot Technology Co Ltd
Original Assignee
Guangzhou Xiaopeng Autopilot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Autopilot Technology Co Ltd filed Critical Guangzhou Xiaopeng Autopilot Technology Co Ltd
Priority to CN202210101645.9A priority Critical patent/CN114415141A/en
Publication of CN114415141A publication Critical patent/CN114415141A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411Identification of targets based on measurements of radar reflectivity

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiment of the invention provides a radar data processing method and device, a vehicle, and a storage medium. The method comprises: first, acquiring target radar point cloud data, the target radar point cloud data comprising target depth information and target azimuth information; then, generating a grayscale image according to the target depth information and the target azimuth information; and finally, generating a target output file according to the grayscale image and outputting the target output file. By converting the point cloud data collected by the radar into a plurality of grayscale images, the embodiment of the invention reduces the data volume of the file output by the radar, improves the efficiency of transmitting that file, and improves the efficiency with which subsequent systems use it. In addition, because the grayscale images are generated from both depth information and azimuth information, the data supplied to other systems remains rich in type, which in turn helps ensure the accuracy with which those systems process it.

Description

Radar data processing method and device, vehicle and storage medium
Technical Field
The present invention relates to the technical field of radar data processing, and in particular, to a method and an apparatus for processing radar data, a vehicle, and a storage medium.
Background
When a radar detects a target, it generates point cloud data for that target; point cloud data may refer to a set of vectors in a three-dimensional coordinate system.
After generating the point cloud data, the radar may output it for use by other systems, such as a vehicle's active obstacle avoidance system.
However, because the data volume of point cloud data is large, directly outputting the raw point cloud data and transmitting it to other systems may reduce both transmission efficiency and usage efficiency.
Disclosure of Invention
In view of the above problems, a radar data processing method and apparatus, a vehicle, and a storage medium are proposed that overcome, or at least partially solve, the above problems, including:
a radar data processing method, the method comprising:
acquiring target radar point cloud data, wherein the target radar point cloud data comprises target depth information and target azimuth information;
generating a grayscale image according to the target depth information and the target azimuth information;
and generating a target output file according to the grayscale image, and outputting the target output file.
Optionally, generating a grayscale image according to the target depth information and the target azimuth information comprises:
storing the target depth information and the target azimuth information into the R channel of an RGB image to obtain a first grayscale image corresponding to the R channel;
and storing the target depth information and the target azimuth information into the G channel of the RGB image to obtain a second grayscale image corresponding to the G channel.
Optionally, the target radar point cloud data further comprises target reflectivity information, and generating the target output file according to the grayscale image comprises:
storing the target reflectivity information into the B channel of the RGB image to obtain a third grayscale image corresponding to the B channel;
and generating the target output file according to the first grayscale image, the second grayscale image and the third grayscale image.
Optionally, storing the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image corresponding to the R channel comprises:
storing the upper 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image;
and storing the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image corresponding to the G channel comprises:
storing the lower 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image.
Optionally, acquiring the target radar point cloud data comprises:
acquiring collected initial radar point cloud data;
and extracting the target radar point cloud data from the initial radar point cloud data according to preset effective angle range information.
The embodiment of the invention further provides a radar data processing apparatus, the apparatus comprising:
an acquisition module, configured to acquire target radar point cloud data, the target radar point cloud data comprising target depth information and target azimuth information;
a grayscale image generation module, configured to generate a grayscale image according to the target depth information and the target azimuth information;
and an output module, configured to generate a target output file according to the grayscale image and output the target output file.
Optionally, the grayscale image generation module comprises:
a first grayscale image generation submodule, configured to store the target depth information and the target azimuth information into the R channel of an RGB image to obtain a first grayscale image corresponding to the R channel;
and a second grayscale image generation submodule, configured to store the target depth information and the target azimuth information into the G channel of the RGB image to obtain a second grayscale image corresponding to the G channel.
Optionally, the target radar point cloud data further comprises target reflectivity information, and the output module comprises:
a third grayscale image generation submodule, configured to store the target reflectivity information into the B channel of the RGB image to obtain a third grayscale image corresponding to the B channel;
and an output file generation submodule, configured to generate the target output file according to the first grayscale image, the second grayscale image and the third grayscale image.
Optionally, the first grayscale image generation submodule is configured to store the upper 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image;
and the second grayscale image generation submodule is configured to store the lower 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image.
Optionally, the acquisition module comprises:
an initial data acquisition submodule, configured to acquire collected initial radar point cloud data;
and an effective data extraction submodule, configured to extract the target radar point cloud data from the initial radar point cloud data according to preset effective angle range information.
The embodiment of the invention further provides a vehicle, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the radar data processing method described above.
The embodiment of the invention further provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the radar data processing method described above.
The embodiment of the invention has the following advantages:
In the embodiment of the invention, target radar point cloud data may first be acquired, the target radar point cloud data comprising target depth information and target azimuth information; a grayscale image is then generated according to the target depth information and the target azimuth information; and a target output file is generated according to the grayscale image and output. By converting the point cloud data collected by the radar into a plurality of grayscale images, the embodiment of the invention reduces the data volume of the file output by the radar, improves the efficiency of transmitting that file, and improves the efficiency with which subsequent systems use it.
In addition, because the grayscale images are generated from both depth information and azimuth information, the data supplied to other systems remains rich in type, which in turn helps ensure the accuracy with which those systems process it.
Drawings
In order to more clearly illustrate the technical solutions of the present invention, the drawings required in the description are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings based on these drawings without inventive effort.
FIG. 1 is a flow chart of steps of a method of processing radar data in accordance with an embodiment of the present invention;
FIG. 2 is a flow chart of steps of another method of radar data processing according to an embodiment of the present invention;
FIG. 3 is a block diagram of a radar data processing apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention more comprehensible, embodiments are described in further detail below with reference to the accompanying drawings. It should be understood that the embodiments described are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments herein without inventive effort shall fall within the protection scope of the present invention.
Referring to FIG. 1, a flowchart illustrating the steps of a radar data processing method according to an embodiment of the present invention is shown. The method includes the following steps:
Step 101, acquiring target radar point cloud data, wherein the target radar point cloud data comprises target depth information and target azimuth information;
The target radar point cloud data may be radar point cloud data obtained by a radar detecting the objects in the current environment; the radar may be a laser radar (lidar) or another type of radar, which is not limited in this embodiment of the present invention.
The target depth information may be used to characterize the distance of an object relative to the radar; the target azimuth information may be used to characterize the angle by which a beam emitted by the radar deviates from a reference line.
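As a small illustration of these two quantities, the following sketch assumes a planar x/y radar frame with the x-axis as the reference line; the patent itself does not specify the geometry:

```python
import math

def depth_and_azimuth(x: float, y: float) -> tuple[float, float]:
    """Depth: distance of the point from the radar.
    Azimuth: angle of the return relative to the reference line (here, the x-axis)."""
    depth = math.hypot(x, y)                      # distance to the radar
    azimuth_deg = math.degrees(math.atan2(y, x))  # deviation from the reference line
    return depth, azimuth_deg % 360.0             # azimuth normalized to [0, 360)
```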
In practical application, target radar point cloud data including target depth information and target azimuth information may be acquired first.
Step 102, generating a grayscale image according to the target depth information and the target azimuth information;
The target depth information and the target azimuth information can then be converted into a plurality of grayscale images. Specifically, the target depth information and the target azimuth information may be encoded to generate a three-dimensional matrix, from which at least two grayscale images are then generated.
Step 103, generating a target output file according to the grayscale images, and outputting the target output file.
A target output file can then be generated from the grayscale images; the target output file may include all of the generated grayscale images.
After the target output file is generated, the radar may output it so that other systems can use it. Compared with the original point cloud data, the grayscale images can reduce the data volume to 15% of the original, so the efficiency of subsequently transmitting the target output file and the efficiency with which other systems use it can both be effectively improved.
In the embodiment of the invention, target radar point cloud data may first be acquired, the target radar point cloud data comprising target depth information and target azimuth information; a grayscale image is then generated according to the target depth information and the target azimuth information; and a target output file is generated according to the grayscale image and output. By converting the point cloud data collected by the radar into a plurality of grayscale images, the embodiment of the invention reduces the data volume of the file output by the radar, improves the efficiency of transmitting that file, and improves the efficiency with which subsequent systems use it.
In addition, because the grayscale images are generated from both depth information and azimuth information, the data supplied to other systems remains rich in type, which in turn helps ensure the accuracy with which those systems process it.
Referring to FIG. 2, a flowchart of the steps of another radar data processing method according to an embodiment of the present invention is shown, including the following steps:
Step 201, acquiring collected initial radar point cloud data;
The initial radar point cloud data may refer to all the radar point cloud data obtained by the radar detecting the objects in the current environment; it may include both valid target radar point cloud data and invalid radar point cloud data.
Step 202, extracting target radar point cloud data from initial radar point cloud data according to preset effective angle range information;
In practical application, the radar can scan over a 360-degree range, but because of factors such as its installation position, not all of the radar point cloud data it collects is necessarily valid. For example, for a radar mounted on a vehicle, the radar point cloud data obtained by scanning the vehicle body may be invalid because the body blocks the beam.
To screen out the invalid radar point cloud data and further reduce the data volume of the file output by the radar, effective angle range information can be preset according to the installation position of the radar. For example, if the installation position means that the data collected by the radar between 0° and 180° are valid and the data collected between 180° and 360° are invalid, effective angle range information representing 0°-180° can be preset.
Furthermore, after the initial radar point cloud data collected by the radar is acquired, the valid target radar point cloud data can be extracted from it based on the preset effective angle range information; in the above example, the radar point cloud data corresponding to 0°-180° in the initial radar point cloud data may be taken as the target radar point cloud data.
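A minimal sketch of this filtering step is given below; the array layout, the field order and the 0°-180° range are illustrative assumptions rather than something fixed by the patent:

```python
import numpy as np

def extract_valid_points(initial_points: np.ndarray,
                         valid_range_deg=(0.0, 180.0)) -> np.ndarray:
    """Keep only the points whose azimuth falls inside the preset
    effective angle range.

    initial_points: N x 3 array of (depth, azimuth_deg, reflectivity),
    an assumed layout for this sketch.
    """
    low, high = valid_range_deg
    azimuth = initial_points[:, 1] % 360.0          # normalize to [0, 360)
    mask = (azimuth >= low) & (azimuth <= high)     # preset effective angle range
    return initial_points[mask]                     # target radar point cloud data
```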
Step 203, storing the target depth information and the target azimuth information into the R channel of an RGB image to obtain a first grayscale image corresponding to the R channel;
After the target radar point cloud data is acquired, the target depth information and the target azimuth information in it can be stored into the R channel of an RGB image to obtain a grayscale image corresponding to the R channel; for ease of distinction, the grayscale image corresponding to the R channel is referred to as the first grayscale image.
As an example, the first grayscale image may be generated as follows:
storing the upper 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image.
Specifically, the target depth information and the target azimuth information may be converted into binary information, and the upper 8 bits of that binary information are then stored into the R channel of the RGB image to obtain the first grayscale image.
Step 204, storing the target depth information and the target azimuth information into the G channel of the RGB image to obtain a second grayscale image corresponding to the G channel;
Likewise, after the target radar point cloud data is acquired, the target depth information and the target azimuth information in it can be stored into the G channel of the RGB image to obtain a grayscale image corresponding to the G channel; for ease of distinction, the grayscale image corresponding to the G channel is referred to as the second grayscale image.
As an example, the second grayscale image may be generated as follows:
storing the lower 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image.
Specifically, the lower 8 bits of the binary information may be stored into the G channel of the RGB image to obtain the second grayscale image.
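The byte split itself can be sketched as follows; quantizing depth and azimuth to 16-bit integers with a fixed scale factor is an assumption of this sketch, since the description only specifies that the upper 8 bits go to the R channel and the lower 8 bits to the G channel:

```python
import numpy as np

def split_high_low(values: np.ndarray, scale: float) -> tuple[np.ndarray, np.ndarray]:
    """Quantize physical values (e.g. depth in metres or azimuth in degrees)
    to 16-bit integers, then split them into upper and lower bytes.

    scale: assumed quantization step, e.g. 0.01 m per count for depth.
    """
    quantized = np.clip(np.round(values / scale), 0, 65535).astype(np.uint16)
    high = (quantized >> 8).astype(np.uint8)   # upper 8 bits -> R channel
    low = (quantized & 0xFF).astype(np.uint8)  # lower 8 bits -> G channel
    return high, low
```

On the receiving side the 16-bit value is recovered as (high << 8) | low and multiplied by the same scale factor.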
Step 205, where the target radar point cloud data further comprises target reflectivity information, storing the target reflectivity information into the B channel of the RGB image to obtain a third grayscale image corresponding to the B channel;
In practical application, the target radar point cloud data may include target depth information, target azimuth information and target reflectivity information, where the target reflectivity information may be used to identify the type of an object.
If the other systems also need to identify the type of an object, the target reflectivity information in the target radar point cloud data may likewise be output as part of the output file.
Specifically, the target reflectivity information can be stored into the B channel of the RGB image to obtain a grayscale image corresponding to the B channel; for ease of distinction, the grayscale image corresponding to the B channel is referred to as the third grayscale image.
For example, suppose the lidar has (m-1) beam lines plus one further line for the azimuth, and that, as each beam line rotates through the effective angle range, n depth values and n reflectivity values are generated, together with n azimuth values. The obtained information is then encoded: the upper 8 bits of the depth information of the (m-1) lines are stored in the R channel, the lower 8 bits in the G channel, and no depth information is stored in the B channel;
the reflectivity information is stored in the B channel, and no reflectivity information is stored in the R or G channel;
the upper 8 bits of the azimuth information are stored in the R channel, the lower 8 bits in the G channel, and no azimuth information is stored in the B channel.
After this data processing, the depth information, the reflectivity information and the azimuth information can be reorganized into an m × n × 3 matrix.
This matrix can then be regarded as an n × m picture in RGB format in which different information is stored in the R, G and B channels respectively; the three channels can then be saved as separate 8-bit grayscale images to obtain the first grayscale image corresponding to the R channel, the second grayscale image corresponding to the G channel and the third grayscale image corresponding to the B channel.
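A minimal sketch of this packing step is given below. The channel assignment (depth and azimuth high bytes in R, low bytes in G, reflectivity in B) follows the description above, while the exact row layout, the 16-bit quantization and the PNG output are illustrative assumptions:

```python
import numpy as np
from PIL import Image

def pack_to_grayscale_images(depth16: np.ndarray,
                             azimuth16: np.ndarray,
                             reflectivity8: np.ndarray):
    """depth16: uint16, shape (m-1, n); azimuth16: uint16, shape (n,);
    reflectivity8: uint8, shape (m-1, n) - an assumed layout with one
    azimuth row, as in the example above. Returns the three 8-bit planes."""
    lines, n = depth16.shape
    m = lines + 1                                    # one extra row for azimuth

    r = np.zeros((m, n), dtype=np.uint8)
    g = np.zeros((m, n), dtype=np.uint8)
    b = np.zeros((m, n), dtype=np.uint8)

    r[:lines] = (depth16 >> 8).astype(np.uint8)      # depth, upper 8 bits
    g[:lines] = (depth16 & 0xFF).astype(np.uint8)    # depth, lower 8 bits
    b[:lines] = reflectivity8                        # reflectivity only in B

    r[-1] = (azimuth16 >> 8).astype(np.uint8)        # azimuth, upper 8 bits
    g[-1] = (azimuth16 & 0xFF).astype(np.uint8)      # azimuth, lower 8 bits

    for name, plane in (("r.png", r), ("g.png", g), ("b.png", b)):
        Image.fromarray(plane, mode="L").save(name)  # lossless 8-bit grayscale
    return r, g, b
```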
Step 206, generating a target output file according to the first grayscale image, the second grayscale image and the third grayscale image, and outputting the target output file.
After the first, second and third grayscale images are obtained, the radar may generate a target output file containing the first grayscale image, the second grayscale image and the third grayscale image, and output the target output file to other systems for use.
As an example, if the other systems only need to perform positioning, the third grayscale image may be discarded and the target output file generated from the first and second grayscale images only, thereby reducing the data volume of the target output file.
If the other systems not only need to perform positioning according to the target output file but also need to know the type of an object, the third grayscale image may be retained and the target output file generated from the first, second and third grayscale images; this is not limited in this embodiment of the present invention.
In an embodiment of the present invention, after receiving the target output file, other systems that want to recover the relevant information from the grayscale images may apply the inverse of the transformation defined by the rules of the compression process, thereby obtaining the target depth information, the target azimuth information and the target reflectivity information.
After the target depth information, the target azimuth information and the target reflectivity information are obtained, the coordinates of the point cloud in the radar coordinate system can be calculated based on the radar's parameters, and the type of the object can be determined.
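A minimal sketch of the inverse transformation follows, under the same assumed quantization scales and row layout as the packing sketch above. It reconstructs only planar x/y coordinates from depth and azimuth; the per-line elevation angle needed for full 3D coordinates would come from the radar's own parameters and is omitted here:

```python
import numpy as np
from PIL import Image

def unpack_grayscale_images(r_path: str, g_path: str, b_path: str,
                            depth_scale: float = 0.01,
                            azimuth_scale: float = 0.01):
    """Recover depth, azimuth and reflectivity from the three grayscale
    images, then compute planar point coordinates in the radar frame.
    The scale factors are the assumed quantization steps used when packing."""
    r = np.asarray(Image.open(r_path), dtype=np.uint16)
    g = np.asarray(Image.open(g_path), dtype=np.uint16)
    b = np.asarray(Image.open(b_path), dtype=np.uint8)

    depth = ((r[:-1] << 8) | g[:-1]) * depth_scale        # metres (assumed scale)
    azimuth_deg = ((r[-1] << 8) | g[-1]) * azimuth_scale  # degrees (assumed scale)
    reflectivity = b[:-1]                                 # used to classify objects

    az = np.deg2rad(azimuth_deg)[np.newaxis, :]           # shared per column
    x = depth * np.cos(az)                                # radar coordinate system
    y = depth * np.sin(az)
    return x, y, reflectivity
```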
In the embodiment of the invention, the collected initial radar point cloud data can be acquired first, and the target radar point cloud data extracted from it according to the preset effective angle range information; the target depth information and the target azimuth information are then stored into the R channel of an RGB image to obtain the first grayscale image corresponding to the R channel, and into the G channel to obtain the second grayscale image corresponding to the G channel; the target reflectivity information is stored into the B channel to obtain the third grayscale image corresponding to the B channel; and the target output file is generated from the first, second and third grayscale images and output. By converting the radar's point cloud data into three grayscale images, the embodiment of the invention reduces the data volume of the file output by the radar, improves the efficiency of transmitting that file, and improves the efficiency with which subsequent systems use it.
Moreover, because only valid radar point cloud data is processed, the data volume of the file output by the radar can be reduced further.
It should be noted that, for simplicity of description, the method embodiments are described as a series of combined actions; however, those skilled in the art will recognize that the present invention is not limited by the order of actions described, since according to the embodiments of the present invention some steps may be performed in other orders or concurrently. Furthermore, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and the actions involved are not necessarily required by the present invention.
Referring to FIG. 3, a schematic structural diagram of a radar data processing apparatus according to an embodiment of the present invention is shown, comprising the following modules:
an acquisition module 301, configured to acquire target radar point cloud data, the target radar point cloud data comprising target depth information and target azimuth information;
a grayscale image generation module 302, configured to generate a grayscale image according to the target depth information and the target azimuth information;
and an output module 303, configured to generate a target output file according to the grayscale image and output the target output file.
In an optional embodiment of the present invention, the grayscale image generation module 302 comprises:
a first grayscale image generation submodule, configured to store the target depth information and the target azimuth information into the R channel of an RGB image to obtain a first grayscale image corresponding to the R channel;
and a second grayscale image generation submodule, configured to store the target depth information and the target azimuth information into the G channel of the RGB image to obtain a second grayscale image corresponding to the G channel.
In an optional embodiment of the present invention, the target radar point cloud data further comprises target reflectivity information, and the output module 303 comprises:
a third grayscale image generation submodule, configured to store the target reflectivity information into the B channel of the RGB image to obtain a third grayscale image corresponding to the B channel;
and an output file generation submodule, configured to generate the target output file according to the first grayscale image, the second grayscale image and the third grayscale image.
In an optional embodiment of the present invention, the first grayscale image generation submodule is configured to store the upper 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image;
and the second grayscale image generation submodule is configured to store the lower 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image.
In an optional embodiment of the present invention, the acquisition module 301 comprises:
an initial data acquisition submodule, configured to acquire collected initial radar point cloud data;
and an effective data extraction submodule, configured to extract the target radar point cloud data from the initial radar point cloud data according to preset effective angle range information.
In the embodiment of the invention, target radar point cloud data may first be acquired, the target radar point cloud data comprising target depth information and target azimuth information; a grayscale image is then generated according to the target depth information and the target azimuth information; and a target output file is generated according to the grayscale image and output. By converting the point cloud data collected by the radar into a plurality of grayscale images, the embodiment of the invention reduces the data volume of the file output by the radar, improves the efficiency of transmitting that file, and improves the efficiency with which subsequent systems use it.
In addition, because the grayscale images are generated from both depth information and azimuth information, the data supplied to other systems remains rich in type, which in turn helps ensure the accuracy with which those systems process it.
The embodiment of the invention further provides a vehicle, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the radar data processing method described above.
The embodiment of the invention further provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the radar data processing method described above.
As for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively brief; for relevant details, reference may be made to the corresponding parts of the description of the method embodiment.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the parts that are the same or similar between embodiments, the embodiments may be referred to one another.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The radar data processing method and apparatus, the vehicle and the storage medium provided above have been described in detail. Specific examples are used herein to explain the principles and implementation of the present invention, and the description of the above embodiments is only intended to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method of processing radar data, the method comprising:
acquiring target radar point cloud data, wherein the target radar point cloud data comprises target depth information and target azimuth information;
generating a grayscale image according to the target depth information and the target azimuth information;
and generating a target output file according to the grayscale image, and outputting the target output file.
2. The method of claim 1, wherein generating a grayscale image according to the target depth information and the target azimuth information comprises:
storing the target depth information and the target azimuth information into an R channel of an RGB image to obtain a first grayscale image corresponding to the R channel;
and storing the target depth information and the target azimuth information into a G channel of the RGB image to obtain a second grayscale image corresponding to the G channel.
3. The method of claim 2, wherein the target radar point cloud data further comprises target reflectivity information, and generating a target output file according to the grayscale image comprises:
storing the target reflectivity information into a B channel of the RGB image to obtain a third grayscale image corresponding to the B channel;
and generating the target output file according to the first grayscale image, the second grayscale image and the third grayscale image.
4. The method of claim 2, wherein storing the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image corresponding to the R channel comprises:
storing the upper 8 bits of binary information corresponding to the target depth information and the target azimuth information into the R channel of the RGB image to obtain the first grayscale image;
and wherein storing the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image corresponding to the G channel comprises:
storing the lower 8 bits of the binary information corresponding to the target depth information and the target azimuth information into the G channel of the RGB image to obtain the second grayscale image.
5. The method of any one of claims 1-4, wherein acquiring the target radar point cloud data comprises:
acquiring collected initial radar point cloud data;
and extracting the target radar point cloud data from the initial radar point cloud data according to preset effective angle range information.
6. An apparatus for processing radar data, the apparatus comprising:
an acquisition module, configured to acquire target radar point cloud data, wherein the target radar point cloud data comprises target depth information and target azimuth information;
a grayscale image generation module, configured to generate a grayscale image according to the target depth information and the target azimuth information;
and an output module, configured to generate a target output file according to the grayscale image and output the target output file.
7. The apparatus of claim 6, wherein the grayscale image generation module comprises:
a first grayscale image generation submodule, configured to store the target depth information and the target azimuth information into an R channel of an RGB image to obtain a first grayscale image corresponding to the R channel;
and a second grayscale image generation submodule, configured to store the target depth information and the target azimuth information into a G channel of the RGB image to obtain a second grayscale image corresponding to the G channel.
8. The apparatus of claim 7, wherein the target radar point cloud data further comprises target reflectivity information, and the output module comprises:
a third grayscale image generation submodule, configured to store the target reflectivity information into a B channel of the RGB image to obtain a third grayscale image corresponding to the B channel;
and an output file generation submodule, configured to generate the target output file according to the first grayscale image, the second grayscale image and the third grayscale image.
9. A vehicle comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing a method of processing radar data according to any one of claims 1 to 5.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements a method of processing radar data according to any one of claims 1 to 5.
CN202210101645.9A 2022-01-27 2022-01-27 Radar data processing method and device, vehicle and storage medium Pending CN114415141A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210101645.9A CN114415141A (en) 2022-01-27 2022-01-27 Radar data processing method and device, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210101645.9A CN114415141A (en) 2022-01-27 2022-01-27 Radar data processing method and device, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN114415141A true CN114415141A (en) 2022-04-29

Family

ID=81279723

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210101645.9A Pending CN114415141A (en) 2022-01-27 2022-01-27 Radar data processing method and device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN114415141A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150089364A (en) * 2014-01-27 2015-08-05 한국생산기술연구원 Real-time transmitting system and method for point cloud data, and apparatus applied to the same
CN110147706A (en) * 2018-10-24 2019-08-20 腾讯科技(深圳)有限公司 The recognition methods of barrier and device, storage medium, electronic device
CN113064179A (en) * 2021-03-22 2021-07-02 上海商汤临港智能科技有限公司 Point cloud data screening method and vehicle control method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
He Wei: "Research on 3D Video Compression and Transmission Based on H.264", China Master's Theses Full-text Database, Information Science and Technology series, no. 4, 15 April 2016 (2016-04-15), pages 29-32 *

Similar Documents

Publication Publication Date Title
CN107817502B (en) Laser point cloud data processing method and device
CN109118542B (en) Calibration method, device, equipment and storage medium between laser radar and camera
US10162358B2 (en) Unmanned vehicle, method, apparatus and system for positioning unmanned vehicle
CN112257605B (en) Three-dimensional target detection method, system and device based on self-labeling training sample
CN111009011B (en) Method, device, system and storage medium for predicting vehicle direction angle
CN111784776A (en) Visual positioning method and device, computer readable medium and electronic equipment
CN111307163B (en) Positioning method and positioning device of movable equipment and electronic equipment
CN110895821A (en) Image processing device, storage medium storing image processing program, and driving support system
CN112241978A (en) Data processing method and device
CN111862208B (en) Vehicle positioning method, device and server based on screen optical communication
CN114415141A (en) Radar data processing method and device, vehicle and storage medium
JP2018116452A (en) Data compression apparatus, data decompressor, control method, program and storage medium
CN113393508B (en) Laser ranging imaging method, device, electronic equipment and storage medium
CN115409861A (en) Laser radar ground point cloud segmentation method, device, equipment and medium based on image processing
CN110334657B (en) Training sample generation method and system for fisheye distortion image and electronic equipment
CN114913105A (en) Laser point cloud fusion method and device, server and computer readable storage medium
CN109711363B (en) Vehicle positioning method, device, equipment and storage medium
CN114067136A (en) Image matching method and device, electronic equipment, storage medium and related product
CN113840130A (en) Depth map generation method, device and storage medium
US20230230265A1 (en) Method and apparatus for patch gan-based depth completion in autonomous vehicles
CN111539361B (en) Noise identification method, device, storage medium, processor and carrier
CN115147738B (en) Positioning method, device, equipment and storage medium
CN110969578B (en) Quick splicing method, medium, terminal and device for local grid map
CN111325712B (en) Method and device for detecting image validity
CN110223388B (en) Three-dimensional reconstruction method and device based on spatial structured light, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination