CN117850651A - Radar data display method, device, equipment, medium and product


Info

Publication number: CN117850651A
Application number: CN202311845663.9A
Authority: CN (China)
Prior art keywords: radar data; radar; texture image; coordinate system; color
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventor: 张强 (Zhang Qiang)
Current and original assignee: Beijing Guoke Chuzhi Technology Co., Ltd.
Priority and filing date: 2023-12-28
Publication date: 2024-04-09

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The disclosure relates to a radar data display method, device, equipment, medium and product. The method comprises: when radar data are obtained by simulating a radar, creating, through a graphics engine, a window for drawing the radar data; creating, in the window, a texture image for drawing the radar data; and mapping the radar data onto the texture image so that the radar data are displayed on the texture image. Through these steps, radar data can be drawn directly by the graphics engine, which broadens the application range of existing display methods.

Description

Radar data display method, device, equipment, medium and product
Technical Field
The disclosure relates to the technical field of image drawing, and in particular to a radar data display method, device, equipment, medium and product.
Background
In the related art, autonomous driving simulation testing digitally reconstructs autonomous driving application scenarios through mathematical modeling, builds a system model as close to the real world as possible, and thereby tests and verifies autonomous driving systems and algorithms in software, without the simulation test having to be carried out directly on a real vehicle.
At the heart of autonomous driving simulation testing is the simulation platform, which generally includes a simulation framework, a physics engine, and a graphics engine. The simulation framework is the core of the platform software and supports sensor simulation, vehicle dynamics simulation, communication simulation, traffic environment simulation, and the like. Sensor simulation provides access to the perception data of the simulated scene and supplies simulation data for autonomous driving tests; the simulated sensors include cameras, lidar, millimeter wave radar, GPS/IMU, and the like.
When lidar and millimeter wave radar are simulated for autonomous driving tests, the radar data obtained by simulation need to be displayed. In existing display methods, the radar data must be transmitted to a client, which is responsible for the display; the graphics engine itself cannot display the radar data. Existing radar data display methods therefore suffer from a narrow application range.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a radar data display method, device, equipment, medium and product, which can draw radar data directly through a graphics engine and thereby broaden the application range of existing display methods.
According to a first aspect of embodiments of the present disclosure, there is provided a radar data display method, including:
when radar data are obtained by simulating a radar, creating, through a graphics engine, a window for drawing the radar data;
creating, in the window, a texture image for drawing the radar data; and
mapping the radar data onto the texture image so that the radar data are displayed on the texture image.
According to a second aspect of embodiments of the present disclosure, there is provided a radar data display apparatus, including:
a first creating module, configured to create, through a graphics engine, a window for drawing radar data when the radar data are obtained by simulating a radar;
a second creating module, configured to create, in the window, a texture image for drawing the radar data; and
a mapping module, configured to map the radar data onto the texture image so that the radar data are displayed on the texture image.
According to a third aspect of embodiments of the present disclosure, there is provided a vehicle storing a set of instructions, the instructions being executed by the vehicle to implement the radar data display method provided by the first aspect of the present disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided an electronic device, comprising: a processor; a memory for storing the processor-executable instructions; the processor is configured to read the executable instructions from the memory and execute the instructions to implement the radar data display method provided in the first aspect of the present disclosure.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the radar data display method provided by the first aspect of the present disclosure.
According to a sixth aspect of embodiments of the present disclosure, there is provided a computer program product, instructions in which, when executed by a processor of an electronic device, cause the electronic device to perform the steps of the radar data display method as provided in the first aspect.
The technical solution provided by the embodiments of the disclosure can have the following beneficial effects: a window for drawing radar data is created through a graphics engine, a texture image for drawing the radar data is created in the window, and the radar data are mapped onto the texture image so that the radar data are displayed on the texture image. In this way, radar data obtained by simulation can be displayed within the graphics engine itself, which removes the limitation that radar data can only be sent to a client for display; the radar data display method can thus be applied in more scenarios, and its application range is broadened.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a radar data display method according to an exemplary embodiment.
Fig. 2 is a block diagram illustrating a radar data display apparatus according to an exemplary embodiment.
FIG. 3 is a block diagram of a vehicle, according to an exemplary embodiment.
Fig. 4 is a block diagram of an electronic device, according to an example embodiment.
Detailed Description
Exemplary embodiments will be described in detail below with reference to the accompanying drawings.
It should be noted that the related embodiments and the drawings describe only exemplary embodiments provided by the present disclosure, not all of its embodiments, and the present disclosure should not be construed as limited by these exemplary embodiments.
It should be noted that the terms "first", "second", and the like are used in this disclosure merely to distinguish between different steps, devices, or modules; they indicate neither a particular technical meaning nor a necessary order or interdependence.
It should be noted that the modifications of the terms "one", "a plurality", "at least one" as used in this disclosure are intended to be illustrative rather than limiting. Unless the context clearly indicates otherwise, it should be understood as "one or more".
It should be noted that the term "and/or" is used in this disclosure to describe an association between associated objects, and generally indicates that at least three relationships may exist. For example, A and/or B may represent: A exists alone, A and B exist simultaneously, or B exists alone.
It should be noted that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. The scope of the present disclosure is not limited by the order of description of the steps in the related embodiments unless specifically stated.
It should be noted that all actions of acquiring signals, information, or data in the present disclosure are performed in compliance with the applicable data protection laws and policies of the relevant country, and with the authorization of the owner of the corresponding device.
Exemplary method
Fig. 1 is a flowchart illustrating a radar data display method according to an exemplary embodiment. The method is applied in the Unreal Engine and, as shown in Fig. 1, includes the following steps.
In step S110, when radar data are obtained by simulating a radar, a window for drawing the radar data is created through the graphics engine;
in step S120, a texture image for rendering radar data is created in the window;
in step S130, the radar data is mapped onto the texture image so that the radar data is displayed on the texture image.
In this embodiment, the graphics engine is the Unreal Engine. The Unreal Engine adopts advanced rendering technology and can achieve effects such as high-quality dynamic shadows, advanced shadow rendering, and a 64-bit color high-precision dynamic rendering pipeline. The Unreal Engine is therefore used in this embodiment to draw the radar data. The window is an interface for displaying the radar data, whose drawing and display functions are realized through the Unreal Engine.
When creating the radar data window, the creation can be performed with a C++ class in the Unreal Engine that inherits from UUserWidget. UUserWidget is a class in the Unreal Engine that provides the functionality to create and manage user interfaces. By inheriting from UUserWidget, the window obtains all the attributes and methods of a user widget, so that the drawing of radar data can be realized, as in the sketch below.
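As one example, the window class may be sketched as follows. This is a minimal illustration: the class name, its members, and the InitRadarTexture helper are assumptions of this sketch rather than names fixed by the disclosure.

```cpp
// RadarDataWidget.h -- hypothetical UUserWidget subclass for the radar window.
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "RadarDataWidget.generated.h"

class UImage;
class UTexture2D;

UCLASS()
class URadarDataWidget : public UUserWidget
{
    GENERATED_BODY()

public:
    // Creates the transient texture and binds it to the Image widget.
    void InitRadarTexture();

    // Image widget bound in the control blueprint; displays the radar texture.
    UPROPERTY(meta = (BindWidget))
    UImage* RadarImage = nullptr;

    // Transient texture onto which radar returns are drawn each frame.
    UPROPERTY()
    UTexture2D* RadarTexture = nullptr;
};
```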
The texture image is a data structure for representing and manipulating image data. After the texture image is created, the radar data can be associated with it, so that the radar data are visualized within the Unreal Engine and need not be sent to a client for display, which improves the user experience.
The radar data include millimeter wave radar data and lidar point cloud data, and can be obtained by simulating a radar. Specifically, the millimeter wave radar data can be obtained by the following steps (a code sketch follows the list):
1. Set the input control parameters of the millimeter wave radar, including the horizontal viewport range, the vertical viewport range, the detection range, and the number of scan points per second.
2. Calculate the maximum detection range in the horizontal and vertical directions from the horizontal viewport range, the vertical viewport range, and the detection range.
3. Multiply the number of scan points per second by the elapsed update time to obtain the number of vertices for the frame, i.e., the number of rays to cast.
4. For each ray, generate its radius and azimuth angle within the detection range using random numbers.
5. Trace each ray in the scene and find the intersection with the first object it hits.
6. Calculate the speed, azimuth, and depth of the hit intersection as the return result of the ray.
7. Collect the results of all rays to obtain the millimeter wave radar data.
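As one example, steps 3 to 7 may be sketched as follows using the Unreal Engine line trace API. The function and struct names are illustrative assumptions; for brevity, step 4's random radius is simplified to the fixed detection range and only the horizontal viewport is sampled.

```cpp
// Hypothetical per-frame millimeter wave radar scan (steps 3-7 above).
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "Engine/EngineTypes.h"
#include "GameFramework/Actor.h"

struct FRadarReturn
{
    float Speed = 0.f;    // speed of the hit object, cm/s
    float Azimuth = 0.f;  // azimuth of the ray, degrees
    float Depth = 0.f;    // distance from radar to hit point, cm
};

TArray<FRadarReturn> ScanMillimeterWaveRadar(
    UWorld* World, const FVector& RadarLocation, const FRotator& RadarRotation,
    float HorizontalFovDeg, float DetectionRange,
    float PointsPerSecond, float DeltaTime)
{
    TArray<FRadarReturn> Returns;
    const int32 NumRays = FMath::FloorToInt(PointsPerSecond * DeltaTime); // step 3

    for (int32 i = 0; i < NumRays; ++i)
    {
        // Step 4: random azimuth within the horizontal viewport range.
        const float AzimuthDeg = FMath::FRandRange(-HorizontalFovDeg * 0.5f,
                                                    HorizontalFovDeg * 0.5f);
        const FVector Dir = RadarRotation.RotateVector(
            FRotator(0.f, AzimuthDeg, 0.f).Vector());

        // Step 5: trace the ray and take the first object it hits.
        FHitResult Hit;
        if (World->LineTraceSingleByChannel(
                Hit, RadarLocation, RadarLocation + Dir * DetectionRange,
                ECC_Visibility))
        {
            // Step 6: speed, azimuth and depth of the intersection.
            FRadarReturn R;
            R.Speed = Hit.GetActor() ? Hit.GetActor()->GetVelocity().Size() : 0.f;
            R.Azimuth = AzimuthDeg;
            R.Depth = Hit.Distance;
            Returns.Add(R); // step 7: collect the results of all rays
        }
    }
    return Returns;
}
```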
The lidar point cloud data can be obtained by the following steps (see the sketch after the list):
1. Set the input control parameters of the lidar, including the number of emitted laser beams, the detection range, the total number of points found by all beams per second, the laser rotation scanning frequency, the viewport angle range, and the like.
2. Calculate the number of points each beam must find in a frame from the maximum number of points found by all beams per second and the time elapsed for the frame update (DeltaTime).
3. Set the direction of each beam according to the viewport range, the scanning frequency, and the points to be found by that beam.
4. Find the points detected by each beam according to the beam direction and the detection range.
5. Collect all points found by all beams to generate the point cloud data.
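As one example, steps 2 and 3 may be sketched as follows for a rotating multi-beam lidar. The names and the even angular spacing of the beams are illustrative assumptions, since the disclosure does not fix them.

```cpp
#include "CoreMinimal.h"

// Hypothetical direction computation for one frame of a rotating lidar:
// NumBeams vertical channels sweep horizontally at RotationHz revolutions/s.
TArray<FVector> ComputeLidarDirections(
    int32 NumBeams, float PointsPerSecond, float DeltaTime,
    float VerticalFovDeg, float RotationHz, float& InOutScanAngleDeg)
{
    TArray<FVector> Directions;
    // Step 2: points each beam must find this frame.
    const int32 PointsPerBeam =
        FMath::FloorToInt(PointsPerSecond * DeltaTime / NumBeams);
    const float SweepDeg = 360.f * RotationHz * DeltaTime; // horizontal sweep

    for (int32 Sample = 0; Sample < PointsPerBeam; ++Sample)
    {
        const float Yaw = InOutScanAngleDeg + SweepDeg * Sample / PointsPerBeam;
        for (int32 Beam = 0; Beam < NumBeams; ++Beam)
        {
            // Step 3: spread beams evenly over the vertical viewport range.
            const float Pitch = -VerticalFovDeg * 0.5f
                + VerticalFovDeg * Beam / FMath::Max(NumBeams - 1, 1);
            Directions.Add(FRotator(Pitch, Yaw, 0.f).Vector());
        }
    }
    InOutScanAngleDeg = FMath::Fmod(InOutScanAngleDeg + SweepDeg, 360.f);
    return Directions;
}
```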
In this embodiment, a window for drawing radar data is created through a graphics engine, a texture image for drawing the radar data is created in the window, and the radar data are mapped onto the texture image so that they are displayed on the texture image. In this way, radar data obtained by simulation can be displayed within the graphics engine itself, which removes the limitation that radar data can only be sent to a client for display; the display method can thus be applied in more scenarios, and its application range is broadened.
In an embodiment of the present application, the creating a texture image for rendering radar data in the window includes:
creating a control blueprint in the window, wherein the control blueprint inherits the functions of the window;
creating an initial image through the control blueprint;
and setting a texture for the initial image to obtain the texture image.
In this embodiment, a control blueprint, the Unreal Engine tool for implementing and editing interface elements, may be created in the Unreal Engine editor mode.
An "Image" component is added to the control blueprint. This component is used to display the initial picture, so the initial image can be created in the window by dragging the "Image" component into the blueprint. In addition, various interface elements, such as buttons, sliders, and text boxes, can be created and given properties and methods. The blueprint inherits from the C++ class described above.
After the initial image is obtained, the "Image" component may be selected and its "Anchor" attribute set to "Fill", so that the picture is automatically resized to fill the entire control interface. A texture object is then created through the C++ window class of the radar data and set onto the picture in the control blueprint, yielding the texture image.
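As one example, this texture setup may be sketched as follows, reusing the hypothetical URadarDataWidget from the earlier sketch; the texture size and pixel format are assumptions, while UTexture2D::CreateTransient and UImage::SetBrushFromTexture are standard Unreal Engine calls.

```cpp
// RadarDataWidget.cpp -- hypothetical texture setup for the radar window.
#include "RadarDataWidget.h"
#include "Components/Image.h"
#include "Engine/Texture2D.h"

void URadarDataWidget::InitRadarTexture()
{
    const int32 Size = 512; // assumed square radar display

    // Create a writable texture object and upload it to the GPU.
    RadarTexture = UTexture2D::CreateTransient(Size, Size, PF_B8G8R8A8);
    RadarTexture->UpdateResource();

    // Set the texture onto the "Image" component created in the control
    // blueprint; bMatchSize = false preserves the "Fill" anchor layout.
    RadarImage->SetBrushFromTexture(RadarTexture, /*bMatchSize=*/false);
}
```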
In this embodiment, by inheriting the functions of the window, creating an initial image, and setting a texture, the drawing efficiency of the radar data is improved; the implementation also remains easy to maintain and update, enhancing the user experience.
In an embodiment of the present application, the mapping the radar data onto the texture image includes:
calculating the position of the intersection point of the rays of the millimeter wave radar and the detected object on the texture image according to the azimuth angle and the detection distance;
obtaining a target color corresponding to a speed interval in which the speed is located according to a mapping relation between the speed interval and the color;
setting the color of the location to the target color.
In this embodiment, the radar data includes an azimuth angle and a detection distance corresponding to a ray emitted by the millimeter wave radar, and a speed of an intersection point at which the ray intersects with the detected object;
The speed intervals correspond one-to-one with colors. When a radar ray intersects a detected object, the speed of the detected object is obtained and converted into the target color, so that the detected object can be displayed on the texture image. The mapping between colors and speed intervals can be set by a person skilled in the art according to the actual situation, and is not limited in this embodiment.
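As one example, such a mapping may be sketched as follows; the interval boundaries and colors are illustrative assumptions, since the disclosure leaves them to the implementer.

```cpp
#include "CoreMinimal.h"

// Hypothetical speed-interval-to-color mapping (speed in cm/s).
FColor SpeedToColor(float Speed)
{
    if (Speed < 100.f)  return FColor::Green;   // near-static objects
    if (Speed < 500.f)  return FColor::Yellow;  // slow objects
    if (Speed < 1500.f) return FColor::Orange;  // medium-speed objects
    return FColor::Red;                         // fast objects
}
```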
In an embodiment, the calculating, according to the azimuth angle and the detection distance, a position of an intersection point of a ray of the millimeter wave radar and a detected object on the texture image includes:
calculating according to the azimuth angle and the detection distance to obtain a first coordinate of the intersection point in a first coordinate system, wherein the first coordinate system is a world coordinate system;
and determining the projection coordinate of the first coordinate in a second coordinate system as the position, wherein the second coordinate system is a three-dimensional coordinate system established with the vehicle to which the millimeter wave radar is bound as the origin, i.e., a local coordinate system of the vehicle.
In this embodiment, to determine the position on the texture image of the intersection point between a millimeter wave radar ray and the detected object, a geometric calculation is performed using the azimuth angle and the detection distance of the ray emitted by the radar, yielding the first coordinate of the intersection point in the world coordinate system.
In one example, the first coordinate may be calculated as follows. First, in the world coordinate system, define a point P(x, y, z) representing the position of the millimeter wave radar. Second, determine the direction vector of the ray from its azimuth angle; the azimuth is typically expressed as an angle, which can be converted into the corresponding unit vector by trigonometric functions. Next, the intersection of the ray with the object is determined by a geometric operation. Assuming the position of the object in the world coordinate system is Q(x', y', z'), the intersection point of the ray and the object satisfies:

P + t · direction_vector = Q

where t is a parameter representing the distance from the radar to the intersection point along the ray, and direction_vector is the unit direction vector of the ray. Solving this equation yields the value of t, i.e., the distance to the intersection point with the object under test; substituting t back into P + t · direction_vector gives the first coordinate of the intersection point in the world coordinate system.
After the first coordinate is obtained, its projection coordinate in the three-dimensional coordinate system established with the vehicle to which the millimeter wave radar is bound as the origin (a local coordinate system of the vehicle) is determined as the position of the intersection point on the texture image.
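As one example, the whole computation, from a radar return to a pixel position on the texture image, may be sketched as follows; the function name, the planar mapping onto the texture, and the scaling convention are illustrative assumptions.

```cpp
#include "CoreMinimal.h"

// Hypothetical conversion from a radar return (azimuth, distance) to a
// pixel position on the texture image.
FVector2D ReturnToTexturePosition(
    float AzimuthDeg, float Distance,
    const FTransform& RadarWorldTransform,   // radar pose in world space
    const FTransform& VehicleWorldTransform, // vehicle pose in world space
    float DetectionRange, int32 TextureSize)
{
    // First coordinate: intersection point in the world coordinate system,
    // i.e. P + t * direction_vector with t = Distance.
    const FVector Dir = RadarWorldTransform.GetRotation().RotateVector(
        FRotator(0.f, AzimuthDeg, 0.f).Vector());
    const FVector WorldPoint =
        RadarWorldTransform.GetLocation() + Distance * Dir;

    // Project into the vehicle-local (second) coordinate system and keep
    // the first two dimensions.
    const FVector Local =
        VehicleWorldTransform.InverseTransformPosition(WorldPoint);

    // Map [-DetectionRange, DetectionRange] onto [0, TextureSize) pixels.
    const float U = (Local.X / DetectionRange * 0.5f + 0.5f) * TextureSize;
    const float V = (Local.Y / DetectionRange * 0.5f + 0.5f) * TextureSize;
    return FVector2D(U, V);
}
```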
In this embodiment, the position on the texture image of the intersection between a ray of the millimeter wave radar and the detected object can be calculated accurately, which simplifies data processing and improves the drawing efficiency of the radar data.
In an embodiment of the present application, the mapping the radar data onto the texture image includes:
converting the point cloud data into a third coordinate system of the laser radar to obtain a second coordinate corresponding to the point cloud data, wherein the third coordinate system is a world coordinate system;
and determining the projection coordinates of the second coordinates in a second coordinate system as the positions of the radar data in the texture image, wherein the second coordinate system is a local coordinate system on the vehicle.
In this embodiment, the radar data include lidar point cloud data. Because point cloud data are three-dimensional, they can be converted between two three-dimensional coordinate systems to obtain the second coordinate of the point cloud data in the world coordinate system. As one example, the conversion of point cloud data to the world coordinate system may be implemented using a graphics library such as OpenGL or Direct3D.
After the coordinate conversion is completed, the second coordinates are projected into the second coordinate system, and the first two dimensions of the projected position are taken as the position of the radar data in the texture image, thereby mapping the point cloud data onto the texture image.
In this embodiment, by converting the point cloud data into the second coordinate system and taking the first two dimensions, the point cloud data can be mapped accurately into the texture image, and the drawing efficiency of the radar data is improved.
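As one example, this projection may be sketched as follows, reusing the scaling convention of the previous sketch; the names are illustrative assumptions.

```cpp
#include "CoreMinimal.h"

// Hypothetical projection of lidar points (already in world coordinates,
// the "second coordinate") into texture pixel positions.
TArray<FVector2D> PointCloudToTexturePositions(
    const TArray<FVector>& WorldPoints,
    const FTransform& VehicleWorldTransform,
    float DetectionRange, int32 TextureSize)
{
    TArray<FVector2D> Positions;
    Positions.Reserve(WorldPoints.Num());
    for (const FVector& WorldPoint : WorldPoints)
    {
        // Project into the vehicle-local coordinate system ...
        const FVector Local =
            VehicleWorldTransform.InverseTransformPosition(WorldPoint);
        // ... and keep only the first two dimensions as the pixel position.
        Positions.Emplace(
            (Local.X / DetectionRange * 0.5f + 0.5f) * TextureSize,
            (Local.Y / DetectionRange * 0.5f + 0.5f) * TextureSize);
    }
    return Positions;
}
```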
In an embodiment of the present application, after the setting a texture for the initial image to obtain a texture image, the method further includes:
setting the color of the texture image to a first color when the radar data is point cloud data;
after the projection coordinates of the second coordinates in the second coordinate system are determined as the position, the method further includes:
setting the color of the position from black to a second color, wherein the first color and the second color are different colors.
In this embodiment, to ensure that the final radar data display is distinguishable in the Unreal Engine, the color of the texture image is made distinct from the color at each point position, thereby displaying the radar data. The first color and the second color may be strongly contrasting colors, such as black and white.
In this embodiment, by setting different colors, the radar data are easier to observe, which improves the user experience.
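As one example, writing the point colors into the texture may be sketched as follows. The helper name is an illustrative assumption; GetPlatformData() is the UE5 accessor (older engine versions expose the PlatformData member directly).

```cpp
#include "Engine/Texture2D.h"

// Hypothetical pixel write: paint each projected point with the second
// color onto a texture pre-filled with the first color.
void DrawPointsOnTexture(UTexture2D* Texture, int32 TextureSize,
                         const TArray<FVector2D>& Positions, FColor SecondColor)
{
    FTexture2DMipMap& Mip = Texture->GetPlatformData()->Mips[0];
    FColor* Pixels = static_cast<FColor*>(Mip.BulkData.Lock(LOCK_READ_WRITE));

    for (const FVector2D& Pos : Positions)
    {
        const int32 X = FMath::Clamp(FMath::RoundToInt(Pos.X), 0, TextureSize - 1);
        const int32 Y = FMath::Clamp(FMath::RoundToInt(Pos.Y), 0, TextureSize - 1);
        Pixels[Y * TextureSize + X] = SecondColor;
    }

    Mip.BulkData.Unlock();
    Texture->UpdateResource(); // re-upload the modified pixels to the GPU
}
```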
Exemplary apparatus
Fig. 2 is a block diagram of a radar data display apparatus according to an exemplary embodiment. Referring to fig. 2, the apparatus 200 includes a first creation module 210, a second creation module 220, and a mapping module 230.
A first creating module 210, configured to create, through a graphics engine, a window for drawing radar data when the radar data are obtained by simulating a radar;
a second creating module 220 for creating a texture image for rendering radar data in the window;
a mapping module 230, configured to map the radar data onto the texture image, so that the radar data is displayed on the texture image.
Optionally, the second creating module 220 includes:
the first creation sub-module is used for creating a control blueprint in the window, and the control blueprint inherits the functions of the window;
the second creation sub-module is used for creating an initial image through the control blueprint;
and the first setting submodule is used for setting textures for the initial image to obtain a texture image.
Optionally, the mapping module 230 includes:
the calculation sub-module is used for calculating the position of the intersection point of the rays of the millimeter wave radar and the detected object on the texture image according to the azimuth angle and the detection distance;
the first acquisition submodule is used for acquiring a target color corresponding to a speed interval in which the speed is positioned according to the mapping relation between the speed interval and the color;
and the second setting submodule is used for setting the color of the position to be the target color.
Optionally, the computing submodule includes:
the calculation unit is used for calculating according to the azimuth angle and the detection distance to obtain a first coordinate of the intersection point in a first coordinate system, wherein the first coordinate system is a world coordinate system;
and the determining unit is used for determining the projection coordinate of the first coordinate under a second coordinate system as the position, wherein the second coordinate system is a three-dimensional coordinate system established by taking the vehicle bound by the millimeter wave radar as an origin.
Optionally, the mapping module 230 includes:
the conversion sub-module is used for converting the point cloud data into a third coordinate system of the laser radar to obtain a second coordinate corresponding to the point cloud data, wherein the third coordinate system is a world coordinate system;
and the determining submodule is used for determining the projection coordinate of the second coordinate under a second coordinate system as the position of the radar data in the texture image, and the second coordinate system is a three-dimensional coordinate system established by taking a vehicle bound by the laser radar as an origin.
Optionally, the apparatus 200 is specifically configured to:
setting the color of the texture image to a first color when the radar data is point cloud data;
setting the color of the position from black to a second color, wherein the first color and the second color are different colors.
Exemplary vehicle
Fig. 3 is a block diagram of a vehicle 300, according to an exemplary embodiment. The vehicle 300 may be a fuel-powered vehicle, a hybrid vehicle, an electric vehicle, a fuel cell vehicle, or other type of vehicle.
Referring to fig. 3, a vehicle 300 may include a plurality of subsystems, such as a drive system 310, a control system 320, a perception system 330, a communication system 340, an information display system 350, and a computing processing system 360. The vehicle 300 may also include more or fewer subsystems, and each subsystem may also include multiple components, which are not described in detail herein.
The drive system 310 includes components that provide powered movement of the vehicle 300. Such as an engine, energy source, transmission, etc.
The control system 320 includes components that provide control for the vehicle 300. Such as vehicle control, cabin equipment control, driving assistance control, etc.
The perception system 330 includes components that provide the vehicle 300 with a perception of the surrounding environment. For example, a vehicle positioning system, a laser sensor, a voice sensor, an ultrasonic sensor, an image pickup apparatus, and the like.
The communication system 340 includes components that provide communication connectivity for the vehicle 300. For example, mobile communication networks (e.g., 3G, 4G, 5G), Wi-Fi, Bluetooth, Internet of Vehicles, and the like.
The information display system 350 includes components that provide various information displays for the vehicle 300. For example, vehicle information display, navigation information display, entertainment information display, and the like.
The computing processing system 360 includes components that provide data computing and processing capabilities for the vehicle 300. The computing processing system 360 may include at least one processor 361 and a memory 362. The processor 361 may execute instructions stored in the memory 362.
The processor 361 may be any conventional processor, such as a commercially available CPU. The processor may also be, for example, a graphics processing unit (GPU), a field-programmable gate array (FPGA), a system on chip (SOC), an application-specific integrated circuit (ASIC), or a combination thereof.
The memory 362 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
In the disclosed embodiment, the memory 362 has stored therein a set of instructions that can be executed by the processor 361 to implement all or part of the steps of the radar data display method described in any of the above exemplary embodiments.
Exemplary electronic device
Fig. 4 is a block diagram of an electronic device 400, shown in accordance with an exemplary embodiment. The electronic device 400 may be a vehicle controller, an in-vehicle terminal, an in-vehicle computer, or other type of electronic device.
Referring to fig. 4, an electronic device 400 may include at least one processor 410 and memory 420. Processor 410 may execute instructions stored in memory 420. The processor 410 is communicatively coupled to the memory 420 via a data bus. In addition to memory 420, processor 410 may be communicatively coupled with input device 430, output device 440, and communication device 450 via a data bus.
The processor 410 may be any conventional processor, such as a commercially available CPU. The processor may also be, for example, a graphics processing unit (GPU), a field-programmable gate array (FPGA), a system on chip (SOC), an application-specific integrated circuit (ASIC), or a combination thereof.
The memory 420 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
In the embodiment of the present disclosure, the memory 420 stores executable instructions, and the processor 410 may read the executable instructions from the memory 420 and execute the instructions to implement all or part of the steps of the radar data display method according to any one of the above exemplary embodiments.
Exemplary computer-readable storage Medium
In addition to the methods and apparatus described above, exemplary embodiments of the present disclosure may also be a computer program product or a computer readable storage medium storing the computer program product. The computer program product comprises computer program instructions executable by a processor to perform all or part of the steps described in any of the methods of the exemplary embodiments described above.
Program code for carrying out the operations of the embodiments of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, conventional procedural programming languages such as the "C" language, and scripting languages (e.g., Python). The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the readable storage medium include: an electrical connection having one or more wires, a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic or optical disk, or any suitable combination of the foregoing.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A radar data display method, the method comprising:
when radar data are obtained by simulating a radar, creating, through a graphics engine, a window for drawing the radar data;
creating, in the window, a texture image for drawing the radar data; and
mapping the radar data onto the texture image so that the radar data are displayed on the texture image.
2. The radar data display method of claim 1, wherein creating a texture image for rendering radar data in the window comprises:
creating a control blueprint in the window, wherein the control blueprint inherits the functions of the window;
creating an initial image through the control blueprint;
and setting a texture for the initial image to obtain the texture image.
3. The radar data display method according to claim 2, wherein the radar data includes an azimuth angle and a detection distance corresponding to a ray emitted by a millimeter wave radar, and a speed of an intersection point at which the ray intersects with a detected object;
the mapping the radar data onto the texture image includes:
calculating the position of the intersection point of the rays of the millimeter wave radar and the detected object on the texture image according to the azimuth angle and the detection distance;
obtaining a target color corresponding to a speed interval in which the speed is located according to a mapping relation between the speed interval and the color;
setting the color of the location to the target color.
4. The radar data display method according to claim 3, wherein the calculating the position of the intersection of the ray of the millimeter wave radar and the detected object on the texture image based on the azimuth angle and the detection distance includes:
calculating according to the azimuth angle and the detection distance to obtain a first coordinate of the intersection point in a first coordinate system, wherein the first coordinate system is a world coordinate system;
and determining the projection coordinate of the first coordinate in a second coordinate system as the position, wherein the second coordinate system is a three-dimensional coordinate system established with the vehicle to which the millimeter wave radar is bound as the origin.
5. The radar data display method of claim 2, wherein the radar data includes point cloud data of a lidar;
the mapping the radar data onto the texture image includes:
converting the point cloud data into a third coordinate system of the laser radar to obtain a second coordinate corresponding to the point cloud data, wherein the third coordinate system is a world coordinate system;
and determining the projection coordinates of the second coordinates in a second coordinate system as the positions of the radar data in the texture image, wherein the second coordinate system is a three-dimensional coordinate system established with the vehicle to which the lidar is bound as the origin.
6. The radar data display method according to claim 5, wherein after the setting of the texture for the initial image to obtain the texture image, the method further comprises:
setting the color of the texture image to a first color when the radar data is point cloud data;
after the projection coordinates of the second coordinates in the second coordinate system are determined as the position, the method further includes:
setting the color of the position from black to a second color, wherein the first color and the second color are different colors.
7. A radar data display apparatus, comprising:
a first creating module, configured to create, through a graphics engine, a window for drawing radar data when the radar data are obtained by simulating a radar;
a second creating module, configured to create, in the window, a texture image for drawing the radar data; and
a mapping module, configured to map the radar data onto the texture image so that the radar data are displayed on the texture image.
8. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the radar data display method of any one of claims 1-6.
9. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, perform the steps of the radar data display method of any of claims 1-6.
10. A computer program product, characterized in that instructions in the computer program product, when executed by a processor of an electronic device, cause the electronic device to perform the radar data display method according to any of claims 1-6.


Legal Events

  • PB01 — Publication
  • SE01 — Entry into force of request for substantive examination