CN113034660A - Laser radar simulation method based on PBR reflection model - Google Patents

Laser radar simulation method based on PBR reflection model

Info

Publication number
CN113034660A
CN113034660A (application CN202110330512.4A)
Authority
CN
China
Prior art keywords
scene
texture
laser radar
point cloud
textures
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110330512.4A
Other languages
Chinese (zh)
Other versions
CN113034660B (en)
Inventor
李红
吕攀
辛越
杨国青
吴朝晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202110330512.4A priority Critical patent/CN113034660B/en
Publication of CN113034660A publication Critical patent/CN113034660A/en
Application granted granted Critical
Publication of CN113034660B publication Critical patent/CN113034660B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G06T 1/20 Processor architectures; Processor configuration, e.g. pipelining

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a laser radar simulation method based on a PBR (Physically Based Rendering) reflection model, which can fully utilize the acceleration capability of a rendering engine and a graphics processor and obtain realistic point cloud intensity information by using material information of obstacle surfaces such as normal direction, roughness and specular coefficient. The realistic laser radar simulation process comprises two rendering stages: the first rendering stage outputs the surface position, surface normal and surface material information of the scene model under the view angle of the laser radar in the form of scene textures; the second rendering stage calculates the position and intensity of each point in the point cloud from the surface information of the scene model and outputs them in the form of a point cloud texture. The simulation method can thus realistically simulate the intensity information of the laser radar and improve computational efficiency by means of the graphics rendering pipeline of the GPU.

Description

Laser radar simulation method based on PBR reflection model
Technical Field
The invention belongs to the technical field of sensor simulation, and particularly relates to a laser radar simulation method based on a PBR reflection model.
Background
Sensors are the main way for a robot to perceive its surrounding environment, and sensor simulation is a crucial part of a mobile robot simulation system; whether the simulation system can provide, in real time, high-quality sensor data similar to the real world is key to the normal operation of the whole simulation system.
A laser radar can provide a mobile robot with accurate depth measurements of the environment and is therefore widely used in environment perception and positioning tasks. In recent years, even with the appearance of simulation systems built on professional graphics engines, the realism of laser radar simulation still lags far behind that of camera simulation, while progress in perception and positioning algorithms places ever higher demands on laser radar simulation.
At present, mainstream laser radar simulation methods include ray casting and scene depth restoration. The ray casting method imitates the scanning pattern of the laser radar and computes the intersection of each virtual ray with scene objects to obtain the coordinate information of the point cloud; the scene depth restoration method obtains point cloud coordinate information by processing and correcting the depth buffer of a virtual camera in the simulation environment.
Chinese patent publication No. CN103400003A provides a laser radar scene simulation method implemented with GPU programming, which stores BRDF parameters as a DDS data texture, samples the data texture in a fragment program, and calculates the laser intensity value according to a BRDF reflection model; this technique generates the scene BRDF texture file on the CPU, which requires the number of materials to be set in advance and has low applicability in scenes with complex materials. Chinese patent publication No. CN110133625A provides a fast spherical-coordinate laser radar simulation method, in which the CPU and GPU cooperate and the fragment shader performs ray intersection tests against triangles to compute the coordinates of the hit points. Chinese patent publication No. CN109814093A provides a laser radar simulation method and device based on CPU multi-core computation, which can simulate the point cloud offset errors of a laser radar when the vehicle travels at high speed.
Disclosure of Invention
In view of the above, the present invention provides a laser radar simulation method based on a PBR (Physically Based Rendering) reflection model, which can fully utilize the acceleration capabilities of a rendering engine and a graphics processor and obtain realistic point cloud intensity information by using material information of obstacle surfaces such as normal direction, roughness and specular coefficient.
A laser radar simulation method based on a PBR reflection model comprises the following steps:
(1) rendering scene position, normal direction and material information under the visual angle of the laser radar into three scene textures respectively;
(2) creating a point cloud texture for storing the laser radar simulation result, and mapping its texture coordinates to the three scene textures in an equiangular sampling manner; for each texel of the point cloud texture, sampling the scene textures at the equiangularly mapped coordinates, wherein the resolution of the point cloud texture is the same as that of the laser radar and each texel corresponds to one point of the laser radar simulation result;
(3) for each texel of the point cloud texture, calculating the intensity information of the point cloud according to the sampling result of the scene texture and the PBR reflection model, and storing the calculation result into the point cloud texture;
(4) reading the point cloud data from the point cloud texture, discarding points whose intensity values are too small to obtain the laser radar simulation data, and storing the data in the laser radar simulation structure array.
Further, in the rendering process of step (1), a laser radar simulation module captures, according to the pose of the robot, the scene information within the 360-degree view angle of the laser radar in that pose and stores it into scene textures, wherein the scene textures comprise a scene position texture, a scene normal texture and a scene material texture; the R, G, B channels of each texel in the scene position texture are respectively the x, y and z coordinates of the corresponding point in the laser radar coordinate system, the R, G, B channels of each texel in the scene normal texture are respectively the x, y and z components of the unit normal vector of the corresponding point in the laser radar coordinate system, and the R, G, B channels of each texel in the scene material texture are respectively the base color R channel, roughness and specular coefficient of the material.
Further, in step (1), the position and view angle of the camera in the simulation software are adjusted to coincide with those of the laser radar, and the position information, surface normal and material information of the scene under that camera view angle are then rendered into the three scene textures respectively, wherein the material information is derived from the materials used for image rendering.
Further, the scene textures are generated using the vertex shader and pixel shader of the GPU rendering pipeline, where the vertex shader is used to calculate the attributes of the model vertices and the pixel shader is used to calculate the RGBA channel values of each pixel in the render target, specifically: first, the transformation matrix from model space to world space is used in the vertex shader to calculate the position of each model vertex in the world coordinate system and the normal at that vertex; rasterization by the rendering pipeline then yields, for each texel of the render target texture, the world-space position and normal of the corresponding model point together with the texture coordinates of its texture map. In the pixel shader, for each texel of the scene position texture, the RGB channel values are the coordinates of the point in the temporary laser radar coordinate system; for each texel of the scene normal texture, the RGB channel values are the unit normal vector of the surface at that point in the temporary laser radar coordinate system, which equals the unit surface normal in the world coordinate system; before generating the scene material texture, the model material texture must be sampled at the texture map coordinates to obtain the material attributes, and for each texel of the scene material texture the RGB channel values then correspond respectively to the base color R channel, roughness and specular coefficient of the material.
Further, the scene texture sampling in step (2) performs cylindrical sampling of the scene position texture, the scene normal texture and the scene material texture according to the equiangular sampling mode of the laser radar in the horizontal direction; the sampling result is the position, surface normal and material attribute information of each point in the point cloud. This gives the mapping between scene texture coordinates and point cloud texture coordinates under the laser radar equiangular sampling model, namely the scene texture coordinates mapped from the point cloud texture coordinates (u_p, v_p) are
u_c = 1/2 + (1/2)·tan α
v_c = 1/2 + (v_p − 1/2) / cos α
wherein α = (u_p − 1/2)·π/2.
Further, step (3) adopts a BRDF (Bidirectional Reflectance Distribution Function) shading model based on Lambertian diffuse reflection and Cook-Torrance microfacet specular reflection, obtained by adding a diffuse reflection component and a specular reflection component; considering that for the laser radar the light source is the observer, and taking into account the relationship between the reflection intensity and the incidence angle of the laser, a PBR reflection model with the material roughness, base color and specular coefficient as parameters is obtained from this BRDF shading model, and the intensity information of the laser point cloud is then calculated in the global shader according to the scene textures and the PBR reflection model.
Further, steps (1) to (3) are performed on the GPU; in step (4) the CPU reads the point cloud data from GPU memory, where the RGBA channels of each texel of the point cloud texture correspond respectively to the XYZ coordinates and intensity value of the point, and the CPU discards abnormal data to obtain the laser radar simulation result.
Based on the technical scheme, the invention has the following beneficial technical effects:
1. The material information used by the method is derived from the materials used for image rendering; it can be used directly in the simulation software, and complex scene material information is generated automatically without manual processing.
2. The invention uses the graphics rendering pipeline of the GPU, and only calculates the intensity value in the fragment shader, thereby improving the simulation calculation efficiency.
Drawings
Fig. 1 is a schematic flow diagram of a laser radar simulation rendering stage according to the present invention.
Fig. 2 is a schematic flow chart of a laser radar simulation method according to the present invention.
Fig. 3 is a top view of an equiangular sampling model of the laser radar.
FIG. 4 is a side view of an equiangular sampling model of a lidar.
Fig. 5 is a scene schematic diagram of an experimental simulation sample.
Fig. 6 is a diagram illustrating a visualization result of simulation data.
Detailed Description
In order to more specifically describe the present invention, the following detailed description is provided for the technical solution of the present invention with reference to the accompanying drawings and the specific embodiments.
The realistic laser radar simulation process provided by the invention comprises two rendering stages, as shown in fig. 1: the first rendering stage outputs the surface position, surface normal and surface material information of the scene model under the view angle of the laser radar in the form of scene textures; the second rendering stage calculates the position and intensity of each point in the point cloud from the surface information of the scene model and outputs them in the form of a point cloud texture. The specific simulation flow is shown in fig. 2.
(1) In the first rendering stage, the laser radar simulation module captures, according to the pose of the robot, the scene information within the 360-degree view angle of the laser radar in that pose and stores it into textures, where the scene textures mainly comprise a scene position texture, a scene normal texture and a scene material texture. The R, G, B channels of each texel in the scene position texture are the x, y and z coordinates of the corresponding point in the laser radar coordinate system; the R, G, B channels of each texel in the scene normal texture are the x, y and z components of the unit normal vector of the corresponding point in the laser radar coordinate system; and the R, G, B channels of each texel in the scene material texture are the base color R channel, roughness and specular attributes of the material. The data format of the scene textures is shown in Table 1.
TABLE 1
Scene texture            R channel              G channel              B channel
Scene position texture   x coordinate           y coordinate           z coordinate
Scene normal texture     normal x component     normal y component     normal z component
Scene material texture   base color R channel   roughness              specular coefficient
The scene texture generation procedure takes as input the scene model Model, the model-space-to-world-space transformation matrix M_M→W and the material map Tex_m of the model, and outputs the scene position texture RT_p, the scene normal texture RT_n and the scene material texture RT_m.
The scene textures are generated using the vertex shader and pixel shader of the GPU rendering pipeline: the vertex shader calculates the attributes of the model vertices, and the pixel shader calculates the RGBA channel values of each pixel in the render target. First, the transformation matrix M_M→W is used in the vertex shader to calculate the position of each model vertex in the world coordinate system and the normal at that vertex; rasterization by the rendering pipeline then yields, for each texel of the render target texture RT, the world-space position p and normal n of the corresponding model point together with the texture map coordinates uv_m. In the pixel shader, for the scene position texture RT_p, the RGB channel values of each texel are the coordinates of the point in the temporary laser radar coordinate system; for the scene normal texture RT_n, the RGB channel values of each texel are the unit normal vector of the surface at that point in the temporary laser radar coordinate system, which equals the unit surface normal in the world coordinate system; for the scene material texture RT_m, the model material texture must first be sampled at the texture map coordinates uv_m to obtain the material attributes, and the RGB channel values of each texel then correspond respectively to the base color R channel, roughness and specular coefficient of the material.
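By way of illustration only, the per-texel logic of this first rendering stage can be sketched in plain C++ as follows; the structure names, the assumed lidar origin and the constant material lookup are illustrative assumptions and not the shader source of the invention.

// Minimal sketch of the per-texel logic of the first rendering stage,
// written as plain C++ for illustration only.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct MaterialSample { float baseColorR, roughness, specular; };

// Assumption: the temporary lidar frame shares the world axes and differs
// only by the lidar position (an assumed sensor height is used here).
static const Vec3 kLidarOrigin = {0.0f, 0.0f, 1.8f};

Vec3 worldToLidar(const Vec3& p) {
    return {p.x - kLidarOrigin.x, p.y - kLidarOrigin.y, p.z - kLidarOrigin.z};
}

// Placeholder material lookup; a real implementation samples the model
// material texture Tex_m at the rasterized coordinates uv_m.
MaterialSample sampleMaterialTexture(float /*u*/, float /*v*/) {
    return {0.6f, 0.5f, 0.3f};
}

struct SceneTexel {
    Vec3 position;           // texel of RT_p: x, y, z in the lidar frame
    Vec3 normal;             // texel of RT_n: unit surface normal
    MaterialSample material; // texel of RT_m: base color R, roughness, specular
};

// Per-texel "pixel shader": inputs are the rasterized world-space position,
// world-space unit normal and material texture coordinates of the model point.
SceneTexel shadeSceneTexel(const Vec3& pWorld, const Vec3& nWorld, float um, float vm) {
    SceneTexel t;
    t.position = worldToLidar(pWorld);            // written to RT_p
    t.normal   = nWorld;                          // written to RT_n (equal in world and lidar frames)
    t.material = sampleMaterialTexture(um, vm);   // written to RT_m
    return t;
}

int main() {
    SceneTexel t = shadeSceneTexel({3.0f, 1.0f, 0.5f}, {0.0f, 0.0f, 1.0f}, 0.25f, 0.75f);
    std::printf("position in lidar frame: %.2f %.2f %.2f\n", t.position.x, t.position.y, t.position.z);
}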
(2) A point cloud texture is created to store the laser radar simulation result; the resolution of this texture is the same as that of the laser radar, and each texel corresponds to one point of the laser simulation result.
(3) The second rendering stage begins: the scene position texture, scene normal texture and scene material texture are sampled on a cylindrical surface according to the equiangular sampling mode of the laser radar in the horizontal direction, and the sampling result is the position, surface normal and material attribute information of each point in the point cloud.
As shown in FIG. 3, taking the 90-degree horizontal view-angle region directly in front of the lidar as an example, let r be the distance from the lidar projection cylinder to the projection center O, let (u_c, v_c) be the scene texture coordinates corresponding to a sampling point, and let (u_p, v_p) be the corresponding point cloud texture coordinates. The forward angle α between the sampling ray and the x-axis is then:
α = (u_p − 1/2)·π/2
and because, for a perspective projection with a 90-degree horizontal field of view:
tan α = 2·u_c − 1
therefore:
u_c = 1/2 + (1/2)·tan α
the lidar cylinder intersects exactly the edge of the camera view cone, as shown in fig. 4, so that the height h of the lidar cylinder ispHeight h of projection plane of cameracThe relationship of (1) is:
Figure BDA0002989254980000064
vpand vcThere is a similar relationship, namely:
Figure BDA0002989254980000065
therefore:
Figure BDA0002989254980000071
Thus the mapping between scene texture coordinates and point cloud texture coordinates under the laser radar equiangular sampling model is obtained: the scene texture sampling coordinates corresponding to the point cloud texture coordinates (u_p, v_p) are
u_c = 1/2 + (1/2)·tan α
v_c = 1/2 + (v_p − 1/2) / cos α
wherein α = (u_p − 1/2)·π/2.
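For illustration, the mapping reconstructed above can be written as a small C++ routine; the function name, the assumed 90-degree field of view per scene capture and the example coordinates are illustrative assumptions rather than the exact implementation of the invention.

// Sketch of the equiangular (cylindrical) sampling mapping: point cloud texture
// coordinates (u_p, v_p) -> scene texture coordinates (u_c, v_c) for the
// 90-degree horizontal view-angle region in front of the lidar.
#include <cmath>
#include <cstdio>

struct Uv { float u, v; };

Uv pointCloudToSceneUv(float up, float vp) {
    const float pi = 3.14159265358979f;
    float alpha = (up - 0.5f) * (pi / 2.0f);          // forward angle to the x-axis
    float uc = 0.5f + 0.5f * std::tan(alpha);         // horizontal: perspective projection of the cylinder ray
    float vc = 0.5f + (vp - 0.5f) / std::cos(alpha);  // vertical: similar-triangles relation derived above
    return {uc, vc};
}

int main() {
    // A texel one quarter of the way across the 90-degree sector, at mid height.
    Uv uv = pointCloudToSceneUv(0.25f, 0.5f);
    std::printf("scene texture coordinates: u_c = %.3f, v_c = %.3f\n", uv.u, uv.v);
}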
(4) The reflection intensity of each point is calculated from the point attribute sampling results of the previous step and the PBR reflection model, some abnormal points are removed, and the final calculation results are stored into the point cloud texture.
The photorealistic laser radar simulation of the invention uses a BRDF (Bidirectional Reflectance Distribution Function) shading model combining Lambertian diffuse reflection with Cook-Torrance microfacet specular reflection; the BRDF shading model of the laser radar is obtained by adding a diffuse reflection component and a specular reflection component:
f(l, v) = f_diff(l, v) + f_spec(l, v)
in the formula: f_diff and f_spec are the diffuse and specular BRDF components respectively, and l and v are the unit vector from the reflection point toward the light source and the unit vector from the reflection point toward the observer, respectively.
According to the Lambertian diffuse reflection theory, the BRDF diffuse reflection component of the laser radar is as follows:
f_diff(l, v) = c_base / π
in the formula: c_base is the base color of the material.
The general Cook-Torrance microfacet specular shading model is:
f_spec(l, v) = D(h)·F(v, h)·G(l, v, h) / (4·(n·l)·(n·v))
in the formula: d is a normal distribution function and describes the found distribution condition of the micro surface; f is a Fresnel coefficient and describes the sum of surface highlight conditions; g is a geometric function and describes the shielding condition between the micro surfaces; n is the unit normal to the surface, h is defined as:
h = (l + v) / |l + v|
Considering that for the laser radar described above the light source is also the observer, i.e. l = v = h, a variant of the microfacet model is obtained:
f_spec(l) = D(l)·F(l)·G(l) / (4·(n·l)^2)
The normal distribution function term uses a variant of Disney's GGX/Trowbridge-Reitz model:
D(l) = α^2 / (π·((n·l)^2·(α^2 − 1) + 1)^2)
in the formula: α is defined as the square of the material's Roughness parameter, i.e. α = Roughness^2.
To balance simulation fidelity and computational efficiency, the Fresnel term uses the approximate Fresnel coefficient calculation proposed by Schlick, with a spherical Gaussian approximation replacing the exponential term; the Fresnel term is defined as:
F(l) = F_0 + (1 − F_0)·2^((−5.55473·(n·l) − 6.98316)·(n·l))
in the formula: f0Is a high coefficient of material.
The geometry function uses a variant of the Disney geometry function:
G(l) = ((n·l) / ((n·l)·(1 − k) + k))^2
k = (Roughness + 1)^2 / 8
wherein: Roughness is the roughness parameter of the surface.
Considering the relationship between the reflection intensity of the laser and the incident angle of the laser beam, the PBR reflection model of the laser radar is:
L = f(l)·(n·l) = (c_base/π + f_spec(l))·(n·l)
wherein: l represents the ratio of the intensity of the laser light reflected by a point in the scene to the intensity of the laser beam emitted by the lidar.
This analysis yields a PBR reflection model parameterized by the material's roughness, base color and specular coefficient; the intensity of each point in the laser point cloud can then be calculated in the global shader from the scene textures and this reflection model.
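For illustration, the reflection model reconstructed above can be sketched as a C++ function that returns the reflected-to-emitted intensity ratio for one point; the constants of the Fresnel and geometry terms follow the standard GGX/Schlick formulations named in the text, and the clamping behavior, parameter values and function name are illustrative assumptions rather than the definitive shader code of the invention.

// Sketch of the lidar PBR reflection model: returns the ratio L of reflected
// to emitted intensity for one point, given the material parameters and the
// cosine of the laser incidence angle (n . l).
#include <cmath>
#include <cstdio>

float lidarIntensity(float baseColorR, float roughness, float specular, float nDotL) {
    const float pi = 3.14159265358979f;
    if (nDotL <= 0.0f) return 0.0f;                // surface facing away from the beam

    // Diffuse component (Lambertian).
    float fDiff = baseColorR / pi;

    // Normal distribution term D (GGX / Trowbridge-Reitz), alpha = Roughness^2.
    float a  = roughness * roughness;
    float a2 = a * a;
    float denom = nDotL * nDotL * (a2 - 1.0f) + 1.0f;
    float D = a2 / (pi * denom * denom);

    // Fresnel term F (Schlick with spherical Gaussian approximation).
    float F0 = specular;
    float expo = (-5.55473f * nDotL - 6.98316f) * nDotL;
    float F = F0 + (1.0f - F0) * std::pow(2.0f, expo);

    // Geometry term G (Disney-style, with the light source equal to the observer).
    float k  = (roughness + 1.0f) * (roughness + 1.0f) / 8.0f;
    float g1 = nDotL / (nDotL * (1.0f - k) + k);
    float G  = g1 * g1;

    // Specular component with l = v, so the 4 (n.l)(n.v) denominator becomes 4 (n.l)^2.
    float fSpec = D * F * G / (4.0f * nDotL * nDotL);

    // Reflected-to-emitted intensity ratio.
    return (fDiff + fSpec) * nDotL;
}

int main() {
    // Example: a wall-like material seen head-on versus at a grazing angle.
    std::printf("head-on : %.4f\n", lidarIntensity(0.6f, 0.7f, 0.2f, 1.0f));
    std::printf("grazing : %.4f\n", lidarIntensity(0.6f, 0.7f, 0.2f, 0.2f));
}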
(5) The CPU reads each texel of the point cloud texture and stores the results in the laser radar simulation structure array.
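For illustration, the read-back and filtering of step (5) can be sketched in C++ as follows; the texel layout (RGBA corresponding to x, y, z and intensity) follows the description, while the container types, the intensity threshold and the stub readPointCloudTexture function are illustrative assumptions.

// Sketch of step (5): the CPU reads the point cloud texture, discards points
// with too small an intensity, and fills the lidar simulation structure array.
#include <vector>
#include <cstdio>

struct Texel { float r, g, b, a; };           // one texel of the point cloud texture
struct LidarPoint { float x, y, z, intensity; };

// Stub standing in for the GPU read-back; a real implementation copies the
// render target from GPU memory through the engine's read-back interface.
std::vector<Texel> readPointCloudTexture() {
    return { {1.2f, 0.0f, 0.3f, 0.45f}, {5.0f, 2.0f, 0.1f, 0.002f} };
}

std::vector<LidarPoint> buildSimulationResult(float minIntensity) {
    std::vector<LidarPoint> points;
    for (const Texel& t : readPointCloudTexture()) {
        if (t.a < minIntensity) continue;     // drop abnormal or too-weak returns
        points.push_back({t.r, t.g, t.b, t.a});
    }
    return points;
}

int main() {
    std::vector<LidarPoint> cloud = buildSimulationResult(0.01f);
    std::printf("kept %zu of 2 points\n", cloud.size());
}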
The realistic laser radar simulation model is implemented in the Unreal Engine. It places certain requirements on the scene information, and the simulation process relies on the following premises and assumptions: (1) the scene models, materials, textures and other information are imported into the engine before simulation; (2) the material information of the scene models includes PBR material parameters such as Base Color, Roughness, Metallic and Specular.
The laser radar simulation method is deployed on a computer with high graphics performance; the graphics processor used in the implementation is an NVIDIA GTX 1080 Ti with 11 GB of video memory. The operating system of the simulation computer is Windows 10, and it is connected via a network cable to a test computer running Ubuntu 18.04 and ROS (version Melodic); after the simulation computer starts the simulation software, the ROS visualization tool Rviz is used to display the laser radar simulation results.
The simulation is divided into two rendering stages. The first stage generates the three scene textures of position, normal and material, storing the position, normal and material attribute values of the perspective-projection sampling points under the view angle of the laser radar for use in the second-stage calculation. The second stage samples the scene textures according to the laser radar equiangular sampling model, calculates the point cloud intensity according to the PBR reflection model, and stores the position and intensity information of the point cloud in the point cloud texture. After the second stage finishes, the CPU reads the point cloud texture, converts the point cloud data into ROS messages and publishes them, and the visualization of the point cloud can be viewed in Rviz.
In the experiment, the number of vertical scanning lines of the laser radar is set to 64, the horizontal resolution to 0.2 degrees, and the working frequency to 10 Hz. The simulation sample scene is shown in fig. 5: area A1 is the wall surface directly facing the robot, area A2 is the wall surface to the right of the robot, and area A3 is a supporting column; areas A1, A2 and A3 all use the wall material, denoted material A. Area B1 is the window frame on the left side of the robot and area B2 is the window frame behind the robot; B1 and B2 use a black painted surface, denoted material B. Area C1 is window glass made of a transparent glass material C, which the laser passes straight through in the simulation; the ground material is denoted D. Table 2 shows the attribute parameters of the two materials that affect the simulated laser radar intensity values.
TABLE 2 (material attribute parameters of materials A and B that affect the simulated intensity values)
The point cloud simulation data visualized in Rviz is shown in fig. 6; in terms of geometric distribution, it conforms to the point cloud characteristics of a 64-line laser radar. Analyzing the intensity values: the regions A1, A2, B1 and B2 that directly face the laser radar have higher intensity values than other regions of the same material, reflecting the influence of the laser incidence angle on the reflection intensity. The specular coefficient of material B is larger than that of material A, so the reflection intensity of the facing regions B1 and B2 is higher than that of A1 and A2, while the reflection intensity of the non-facing regions of material B is lower than that of the non-facing regions of material A; the reflection intensity of the ground is lower than that of the material A regions but higher than that of the non-facing regions of material B, reflecting the influence of the material properties on the reflection intensity.
The foregoing description of the embodiments is provided to enable those of ordinary skill in the art to understand and use the invention. It will be readily apparent to those skilled in the art that various modifications to these embodiments can be made and the generic principles described herein can be applied to other embodiments without inventive effort. Therefore, the present invention is not limited to the above embodiments, and improvements and modifications made by those skilled in the art according to the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (8)

1. A laser radar simulation method based on a PBR reflection model comprises the following steps:
(1) rendering scene position, normal direction and material information under the visual angle of the laser radar into three scene textures respectively;
(2) creating a point cloud texture for storing the laser radar simulation result, and mapping its texture coordinates to the three scene textures in an equiangular sampling manner; for each texel of the point cloud texture, sampling the scene textures at the equiangularly mapped coordinates, wherein the resolution of the point cloud texture is the same as that of the laser radar and each texel corresponds to one point of the laser radar simulation result;
(3) for each texel of the point cloud texture, calculating the intensity information of the point cloud according to the sampling result of the scene texture and the PBR reflection model, and storing the calculation result into the point cloud texture;
(4) reading the point cloud data from the point cloud texture, discarding points whose intensity values are too small to obtain the laser radar simulation data, and storing the data in the laser radar simulation structure array.
2. The lidar simulation method of claim 1, wherein: in the rendering process of step (1), a laser radar simulation module captures, according to the pose of the robot, the scene information within the 360-degree view angle of the laser radar in that pose and stores it into scene textures, wherein the scene textures comprise a scene position texture, a scene normal texture and a scene material texture; the R, G, B channels of each texel in the scene position texture are respectively the x, y and z coordinates of the corresponding point in the laser radar coordinate system, the R, G, B channels of each texel in the scene normal texture are respectively the x, y and z components of the unit normal vector of the corresponding point in the laser radar coordinate system, and the R, G, B channels of each texel in the scene material texture are respectively the base color R channel, roughness and specular coefficient of the material.
3. The lidar simulation method of claim 1, wherein: in step (1), the position and view angle of the camera in the simulation software are adjusted to coincide with those of the laser radar, and the position information, surface normal and material information of the scene under that camera view angle are then rendered into the three scene textures respectively, wherein the material information is derived from the materials used for image rendering.
4. The lidar simulation method of claim 1, wherein: the scene textures are generated using the vertex shader and pixel shader of the GPU rendering pipeline, where the vertex shader is used to calculate the attributes of the model vertices and the pixel shader is used to calculate the RGBA channel values of each pixel in the render target, specifically: first, the transformation matrix from model space to world space is used in the vertex shader to calculate the position of each model vertex in the world coordinate system and the normal at that vertex; rasterization by the rendering pipeline then yields, for each texel of the render target texture, the world-space position and normal of the corresponding model point together with the texture coordinates of its texture map; in the pixel shader, for each texel of the scene position texture, the RGB channel values are the coordinates of the point in the temporary laser radar coordinate system; for each texel of the scene normal texture, the RGB channel values are the unit normal vector of the surface at that point in the temporary laser radar coordinate system, which equals the unit surface normal in the world coordinate system; before generating the scene material texture, the model material texture must be sampled at the texture map coordinates to obtain the material attributes, and for each texel of the scene material texture the RGB channel values then correspond respectively to the base color R channel, roughness and specular coefficient of the material.
5. The lidar simulation method of claim 1, wherein: the scene texture sampling in step (2) performs cylindrical sampling of the scene position texture, the scene normal texture and the scene material texture according to the equiangular sampling mode of the laser radar in the horizontal direction, and the sampling result is the position, surface normal and material attribute information of each point in the point cloud; this gives the mapping between scene texture coordinates and point cloud texture coordinates under the laser radar equiangular sampling model, namely the scene texture coordinates mapped from the point cloud texture coordinates (u_p, v_p) are
u_c = 1/2 + (1/2)·tan α
v_c = 1/2 + (v_p − 1/2) / cos α
wherein α = (u_p − 1/2)·π/2.
6. The lidar simulation method of claim 1, wherein: step (3) adopts a BRDF shading model based on Lambertian diffuse reflection BRDF and Cook-Torrance microfacet specular reflection, obtained by adding a diffuse reflection component and a specular reflection component; considering that for the laser radar the light source is the observer, and taking into account the relationship between the reflection intensity and the incidence angle of the laser, a PBR reflection model with the material roughness, base color and specular coefficient as parameters is obtained from this BRDF shading model, and the intensity information of the laser point cloud is then calculated in the global shader according to the scene textures and the PBR reflection model.
7. The lidar simulation method of claim 1, wherein: steps (1) to (3) are performed on the GPU; in step (4) the CPU reads the point cloud data from GPU memory, where the RGBA channels of each texel of the point cloud texture correspond respectively to the XYZ coordinates and intensity value of the point, and the CPU discards abnormal data to obtain the laser radar simulation result.
8. The lidar simulation method of claim 1, wherein: the simulation method fully utilizes the acceleration capability of a rendering engine and a graphics processor and obtains realistic point cloud intensity information by using material information of obstacle surfaces such as normal direction, roughness and specular coefficient; the simulation flow comprises two rendering stages: the first rendering stage outputs the surface position, surface normal and surface material information of the scene model under the view angle of the laser radar in the form of scene textures, and the second rendering stage calculates the position and intensity of each point in the point cloud from the surface information of the scene model and outputs them in the form of a point cloud texture.
CN202110330512.4A 2021-03-23 2021-03-23 Laser radar simulation method based on PBR reflection model Active CN113034660B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110330512.4A CN113034660B (en) 2021-03-23 2021-03-23 Laser radar simulation method based on PBR reflection model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110330512.4A CN113034660B (en) 2021-03-23 2021-03-23 Laser radar simulation method based on PBR reflection model

Publications (2)

Publication Number Publication Date
CN113034660A true CN113034660A (en) 2021-06-25
CN113034660B CN113034660B (en) 2022-06-14

Family

ID=76473304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110330512.4A Active CN113034660B (en) 2021-03-23 2021-03-23 Laser radar simulation method based on PBR reflection model

Country Status (1)

Country Link
CN (1) CN113034660B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114895316A (en) * 2022-07-11 2022-08-12 之江实验室 Rapid numerical simulation method and device for multi-laser radar ranging
CN117437345A (en) * 2023-12-22 2024-01-23 山东捷瑞数字科技股份有限公司 Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5179638A (en) * 1990-04-26 1993-01-12 Honeywell Inc. Method and apparatus for generating a texture mapped perspective view
CN103400003A (en) * 2013-07-22 2013-11-20 西安电子科技大学 Method for achieving laser radar scene simulation on basis of GPU programming
US20190302259A1 (en) * 2018-03-27 2019-10-03 The Mathworks, Inc. Systems and methods for generating synthetic sensor data
CN110824443A (en) * 2019-04-29 2020-02-21 当家移动绿色互联网技术集团有限公司 Radar simulation method and device, storage medium and electronic equipment
CN111366153A (en) * 2020-03-19 2020-07-03 浙江大学 Positioning method for tight coupling of laser radar and IMU

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5179638A (en) * 1990-04-26 1993-01-12 Honeywell Inc. Method and apparatus for generating a texture mapped perspective view
CN103400003A (en) * 2013-07-22 2013-11-20 西安电子科技大学 Method for achieving laser radar scene simulation on basis of GPU programming
US20190302259A1 (en) * 2018-03-27 2019-10-03 The Mathworks, Inc. Systems and methods for generating synthetic sensor data
CN110824443A (en) * 2019-04-29 2020-02-21 当家移动绿色互联网技术集团有限公司 Radar simulation method and device, storage medium and electronic equipment
CN111366153A (en) * 2020-03-19 2020-07-03 浙江大学 Positioning method for tight coupling of laser radar and IMU

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
吕攀 et al., "Tightly coupled IMU and lidar localization method based on MSCKF", Chinese Journal of Scientific Instrument, vol. 41, no. 8, 31 August 2020 *
郑维欣 et al., "Lightweight real-time photorealistic WebGL rendering algorithm based on PBR", Journal of System Simulation, no. 11, 8 November 2017 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114895316A (en) * 2022-07-11 2022-08-12 之江实验室 Rapid numerical simulation method and device for multi-laser radar ranging
CN117437345A (en) * 2023-12-22 2024-01-23 山东捷瑞数字科技股份有限公司 Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine
CN117437345B (en) * 2023-12-22 2024-03-19 山东捷瑞数字科技股份有限公司 Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine

Also Published As

Publication number Publication date
CN113034660B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN111508052B (en) Rendering method and device of three-dimensional grid body
US10467805B2 (en) Image rendering of laser scan data
US11024077B2 (en) Global illumination calculation method and apparatus
EP4242973A1 (en) Image processing method and related apparatus
US7583264B2 (en) Apparatus and program for image generation
CN113034660B (en) Laser radar simulation method based on PBR reflection model
JPH0757117A (en) Forming method of index to texture map and computer control display system
CN110276791B (en) Parameter-configurable depth camera simulation method
CN102831634B (en) Efficient accurate general soft shadow generation method
CN116310018A (en) Model hybrid rendering method based on virtual illumination environment and light query
CN105976423B (en) A kind of generation method and device of Lens Flare
JP4584956B2 (en) Graphics processor and drawing processing method
KR101118597B1 (en) Method and System for Rendering Mobile Computer Graphic
KR20190067070A (en) System and method for projection mapping in virtual space
Popovski et al. Comparison of rendering processes on 3D model
CN117437345B (en) Method and system for realizing rendering texture specular reflection effect based on three-dimensional engine
CN117333598B (en) 3D model rendering system and method based on digital scene
WO2023197689A1 (en) Data processing method, system, and device
US20230090732A1 (en) System and method for real-time ray tracing in a 3d environment
CN116993894B (en) Virtual picture generation method, device, equipment, storage medium and program product
JP2952585B1 (en) Image generation method
JP2518712B2 (en) Method and apparatus for producing high quality rendering drawing in computer graphic
KIM REAL-TIME RAY TRACING REFLECTIONS AND SHADOWS IMPLEMENTATION USING DIRECTX RAYTRACING
Yuan et al. Near-Surface Atmospheric Scattering Rendering Method
JPH0729034A (en) Shadowing processor and picture generation device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant