CN112381705B - Method for rapidly drawing point cloud in Unity3d - Google Patents
Classifications
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Image Generation (AREA)
Abstract
The invention relates to a method for rapidly drawing a point cloud in Unity3d, which comprises the following steps: for each data point in the point cloud data C, generating four vertex coordinates and four UV data in one-to-one correspondence with them, and acquiring the position Pc of the current camera; transmitting the generated data to a GPU, which distributes the coordinates of the four vertices of the quadrangle Q corresponding to each data point to different GPU threads for calculation; for each data point, first calculating the distance between the camera position Pc and the data point coordinate P; if the distance does not exceed a set threshold, expanding the four vertices to form a square and passing it into the rasterization pipeline of the GPU, otherwise culling the data point. The invention realizes point cloud drawing under a programmable GPU hardware architecture and can draw point cloud data more quickly.
Description
Technical Field
The invention relates to a method for rapidly drawing a point cloud in Unity3d and belongs to the technical field of automatic driving.
Background
A point cloud is a set of data points in a coordinate system. The data points carry rich information, including three-dimensional coordinates (X, Y, Z), color, classification value, intensity value, time, and the like. Point clouds are commonly used to describe three-dimensional scenes and can be generated in many ways; among them, lidar point clouds are produced by lidar scanning and post-processing, play an important role in the fields of automatic driving and remote control and telemetry, and can be used to generate high-precision maps for automatic driving. It is therefore important that point cloud data be rendered quickly, efficiently, and correctly.
In fields such as automatic driving, remote control and telemetry, and agricultural archaeology, Unity3d is increasingly used as a general-purpose development tool. Although several methods for drawing point clouds already exist in Unity3d, they all complete a large amount of calculation in the CPU. On a computer equipped with mainstream GPU hardware, GPU resources therefore sit idle, computing power is wasted, and drawing efficiency is low.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method for rapidly drawing a point cloud in Unity3d that makes full use of GPU hardware resources and operates with high efficiency.
In order to solve this technical problem, the technical scheme provided by the invention is as follows: a method for rapidly drawing a point cloud in Unity3d, comprising the following steps:
step one, acquiring point cloud data C to be displayed, wherein the point cloud data C comprises N data points; for each data point, generating four vertex coordinates and four UV data in one-to-one correspondence with them, filling all four vertex coordinates with the coordinate P of the data point in space, and filling the four UV data with (0,0), (0,1), (1,0) and (1,1) respectively; then acquiring the position Pc of the current camera;
step two, transmitting the vertex coordinates and the UV data of each data point generated in the step one to a GPU, and setting a preset data point drawing size S as a parameter into the GPU;
step three, the GPU distributes the four vertex coordinates of the quadrangle Q corresponding to each data point to different GPU threads for calculation; when a data point is calculated, the distance between the camera position Pc and the data point coordinate P is computed first, and if the distance is greater than a set threshold, the data point is culled and no longer displayed;
if the distance between the camera position Pc and the data point coordinate P is less than or equal to the set threshold, the four vertices are expanded according to their corresponding UV data to form a planar square with side length S, and the expanded vertex coordinates are transformed into projection space and passed into the rasterization pipeline of the GPU.
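The CPU-side preparation described in the steps above can be sketched as follows. This is an illustrative Python sketch, not the patent's actual Unity/C# implementation; the function name `build_buffers` and the flat list layout are assumptions made for clarity:

```python
# Illustrative sketch of step one: for each point, emit four coincident
# vertices (all equal to the point's coordinate P) plus the four corner
# UVs (0,0), (0,1), (1,0), (1,1). In Unity these buffers would be set on
# a Mesh or compute buffer and uploaded to the GPU (step two).

UVS = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]

def build_buffers(points):
    """points: list of (x, y, z) tuples -> (vertices, uvs) flat lists."""
    vertices, uvs = [], []
    for p in points:
        for uv in UVS:
            vertices.append(p)   # four coincident copies of P
            uvs.append(uv)       # one UV corner per copy
    return vertices, uvs
```

A point cloud of N data points thus yields 4N vertices and 4N UVs, one quadrangle Q per point.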
In the traditional Unity3d point cloud rendering method, the coordinate transformation of every data point in the point cloud data must be completed in the CPU. This usually consumes a large amount of CPU time, during which the GPU is completely idle: it can do nothing until the CPU finishes the calculation and transmits the results to the GPU side, so the GPU's powerful computing performance is wasted.
By implementing point cloud drawing with the programmable GPU hardware architecture, the method makes full use of GPU hardware resources, lets the CPU and GPU work cooperatively, achieves higher operating efficiency, and draws point cloud data more quickly in Unity3d.
Drawings
The invention will be further explained with reference to the drawings.
Fig. 1 illustrates the difference between the method of the present invention and the conventional point cloud drawing method.
Fig. 2 is a schematic diagram of the square with side length S formed by expanding the four vertices.
Detailed Description
Examples
As shown in fig. 1, the method of this embodiment differs from the conventional point cloud drawing method. In the conventional algorithm (left side of fig. 1), the coordinate transformation of all point data in the point cloud must be completed in the CPU. In the method of this embodiment (right side of fig. 1), the CPU only sends the initial data and parameters to the GPU, and most of the per-point calculation is completed efficiently on the GPU in a multithreaded manner, which greatly shortens the execution time.
The method for fast point cloud drawing in Unity3d of the embodiment comprises the following steps:
the method comprises the steps of firstly, acquiring point cloud data C to be displayed, wherein the point cloud data C comprises N data points, each data point generates four vertex coordinates and four UV data corresponding to the four vertex coordinates one by one, the four vertex coordinates are all filled as coordinates P of the data point in a space, and the four UV data are respectively filled as (0,0), (0,1), (1,0) and (1,1), namely, each data point corresponds to a quadrangle Q consisting of two triangles in the space, so that the point cloud data C generates a data structure of N quadrangles Q in total; then the current camera position Pc is acquired.
UV data are a set of data that record how a map (texture) should be attached to the model. UV coordinates can be understood as two-dimensional coordinates, each corresponding to one of the vertex coordinates of Q, with each component in the range [0,1] representing a fractional position within the map. In this embodiment, one map should exactly fill one square, so the UV data of the four vertices of each data point are set to (0,0) lower left corner, (0,1) lower right corner, (1,0) upper left corner, and (1,1) upper right corner, corresponding to the four corners of the map.
Step two, transmit the vertex coordinates and UV data of each data point generated in step one to the GPU, and set the preset data point drawing size S into the GPU as a parameter.
Step three, the GPU distributes the four vertex coordinates of the quadrangle Q corresponding to each data point to different GPU threads for calculation and computes, for each data point, the distance between the camera position Pc and its coordinate P. If the distance is greater than the set threshold, the four vertex coordinates of the data point are not expanded: the four vertices remain coincident at one point when they enter the next rendering stage and are passed to the rasterization stage of the GPU in this coincident state. At that stage, by the GPU's built-in rule, a triangle whose three vertices coincide is discarded by the raster pipeline and takes no part in subsequent calculation. A data point that need not be displayed is thus removed from rasterization without extra work, achieving the effect of culling invisible point cloud data according to the visible-distance threshold.
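The culling rule of step three can be illustrated with a small sketch. Python is used here purely for illustration; on the real GPU this test runs per vertex inside the shader, and `cull_by_distance` is a hypothetical name, not a function from the patent:

```python
import math

def cull_by_distance(point, camera_pos, threshold):
    """True when the point lies beyond the visible-distance threshold.

    In that case the four vertices of quad Q are left coincident, the two
    triangles have zero area, and the rasterizer's degenerate-triangle
    rule discards them, so the point is never drawn."""
    return math.dist(point, camera_pos) > threshold
```

Note that the decision needs only the Euclidean distance between the camera position Pc and the point coordinate P, so each GPU thread can evaluate it independently.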
If the distance between the camera position Pc and the data point coordinate P is less than or equal to the set threshold, the four vertices are expanded according to their corresponding UV data to form a square with side length S. The expansion is shown in fig. 2: the four vertices V0, V1, V2 and V3 coincide before the operation starts, and after the operation the four vertices of Q form a square with side length S. The expanded vertex coordinates are then transformed into projection space and passed into the rasterization pipeline of the GPU, so the required point cloud data can be displayed on the display device.
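The expansion of the four coincident vertices into a square of side S can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the real method offsets the vertices in a camera-facing plane inside the vertex shader, whereas this sketch expands in the x/y plane, and `expand_vertex` is a hypothetical name:

```python
def expand_vertex(p, uv, s):
    """Offset one of the four coincident copies of point P by its UV
    corner, so that together the four vertices form a square of side s
    centred on P. UV components are 0 or 1, giving offsets of +/- s/2."""
    u, v = uv
    return (p[0] + (u - 0.5) * s, p[1] + (v - 0.5) * s, p[2])
```

With the UV corners (0,0), (0,1), (1,0) and (1,1), the four expanded vertices are the corners of a square of side s centred on the original point, which is exactly the shape shown schematically in fig. 2.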
Preferably, the expanded four vertex coordinates of each data point are calculated in the GPU in a vertex shader (VertexShader). The vertex shader is very flexible, so operations such as vertex blending, deformation, and coordinate-system conversion can be carried out conveniently and quickly, with accurate and reliable results.
By implementing point cloud drawing under the programmable GPU hardware architecture, the method of this embodiment makes full use of GPU hardware resources, lets the CPU and GPU work cooperatively, achieves high operating efficiency, and draws point cloud data more quickly in Unity3d.
The present invention is not limited to the specific technical solutions described in the above embodiments, and other embodiments may be made in the present invention in addition to the above embodiments. It will be understood by those skilled in the art that various changes, substitutions of equivalents, and alterations can be made without departing from the spirit and scope of the invention.
Claims (2)
1. A method of fast point cloud rendering in Unity3d, comprising the steps of:
step one, acquiring point cloud data C to be displayed, wherein the point cloud data C comprises N data points; for each data point, generating four vertex coordinates and four UV data in one-to-one correspondence with them, filling all four vertex coordinates with the coordinate P of the data point in space, and filling the four UV data with (0,0), (0,1), (1,0) and (1,1) respectively; then acquiring the position Pc of the current camera;
step two, transmitting the vertex coordinates and the UV data of each data point generated in the step one to a GPU, and setting a preset data point drawing size S as a parameter into the GPU;
step three, the GPU distributes the four vertex coordinates of the quadrangle Q corresponding to each data point to different GPU threads for calculation; when a data point is calculated, the distance between the camera position Pc and the data point coordinate P is computed first, and if the distance is greater than a set threshold, the data point is culled and no longer displayed;
if the distance between the camera position Pc and the data point coordinate P is less than or equal to the set threshold, the four vertices are expanded according to their corresponding UV data to form a planar square with side length S, and the expanded vertex coordinates are transformed into projection space and passed into the rasterization pipeline of the GPU.
2. The method of fast point cloud rendering in Unity3d as claimed in claim 1, wherein: in the third step, four vertex coordinates of each data point after expansion are calculated in the GPU in a VertexShader mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110045882.3A CN112381705B (en) | 2021-01-14 | 2021-01-14 | Method for rapidly drawing point cloud in Unity3d |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112381705A CN112381705A (en) | 2021-02-19 |
CN112381705B true CN112381705B (en) | 2021-03-26 |
Family
ID=74590089
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
CP02 | Change in the address of a patent holder

Address after: 210012 room 401-404, building 5, chuqiaocheng, No. 57, Andemen street, Yuhuatai District, Nanjing, Jiangsu Province. Patentee: AUTOCORE INTELLIGENT TECHNOLOGY (NANJING) Co.,Ltd.

Address before: 211800 building 12-289, 29 buyue Road, Qiaolin street, Pukou District, Nanjing City, Jiangsu Province. Patentee: AUTOCORE INTELLIGENT TECHNOLOGY (NANJING) Co.,Ltd.