CN113176557A - Virtual laser radar online simulation method based on projection - Google Patents
Virtual laser radar online simulation method based on projection
- Publication number: CN113176557A (application CN202110472858.8A)
- Authority: CN (China)
- Prior art keywords: virtual, laser radar, projection, lidar, camera
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G01S7/497 — Means for monitoring or calibrating (G — Physics; G01S — systems using the reflection or reradiation of radio or other waves; G01S7/48 — details of systems according to group G01S17/00)
Abstract
The invention belongs to the field of laser radar simulation, particularly relates to a virtual laser radar online simulation method based on projection, and aims to solve the problems that the virtual laser radar simulation processing flow is complicated and cannot be integrated into the existing deep learning training flow in the prior art. The invention comprises the following steps: setting virtual laser radar parameters and camera internal and external parameters; establishing a virtual laser radar equivalent projection plane by utilizing the internal and external parameters of the camera; extracting surface point cloud coordinate information of a target to be scanned; projecting the point cloud of the target surface to be scanned to the equivalent projection plane of the laser radar by utilizing the internal and external parameters of the camera; and determining the information of the object surface points which can be scanned by the laser radar according to the laser radar line beam and the projection points of the target to be scanned on the equivalent projection plane, and finishing the on-line simulation of the virtual laser radar. The virtual laser radar simulation processing flow is simple and can be integrated into the existing deep learning training flow.
Description
Technical Field
The invention belongs to the field of laser radar simulation, and particularly relates to a virtual laser radar online simulation method based on projection.
Background
With the development of artificial intelligence and sensor technology, three-dimensional perception is becoming a key task in fields such as autonomous driving and robotics. Data-driven perception relies on large amounts of labeled data, but because lidar sensors are expensive, data acquisition and labeling face problems such as high cost and low acquisition speed. Sensor simulation and data generation based on virtual scenes are new options for reducing the cost of algorithm development. Existing lidar simulation requires complex computations between object surface patches and laser beams, and therefore depends on professional three-dimensional software. This greatly limits the automation of the virtual point cloud acquisition process, and at the same time the simulation cannot be integrated with the training of a deep learning model, which reduces algorithm development efficiency.
Therefore, there is still a need in the art for a virtual lidar simulation method that can be directly integrated into deep learning data preprocessing without relying on large-scale computer graphics software, thereby achieving on-line virtual lidar simulation.
Disclosure of Invention
In order to solve the above problems in the prior art, that is, the problems in the prior art that the virtual lidar simulation processing flow is complicated and cannot be integrated into the existing deep learning training flow, the invention provides a virtual lidar online simulation method based on projection, which comprises the following steps:
step S10, setting virtual laser radar parameters and proxy camera internal and external parameters;
step S20, establishing a virtual laser radar equivalent projection plane by using the virtual laser radar parameters and the internal and external parameters of the proxy camera; the virtual laser radar equivalent projection plane consists of a camera projection plane, projection points of virtual laser radar beam endpoints and corresponding perception radiuses;
step S30, extracting the coordinate information of the point cloud on the surface of the target to be scanned, and projecting the point cloud on the surface of the target to be scanned to the equivalent projection plane of the laser radar by using the parameters of the proxy camera;
and step S40, determining the information of the target surface points which can be scanned by the laser radar according to the distance between the laser radar line beam and the projection points of the target to be scanned on the equivalent projection plane, and completing the on-line simulation of the virtual laser radar.
In some preferred embodiments, the setting of the virtual lidar parameters and the proxy camera internal and external parameters in step S10 includes:
setting the horizontal field of view FoV_h of the virtual lidar to the range [-α_0, α_0] with horizontal resolution r_h, and the vertical field of view FoV_v to the range [-β_0, β_0] with vertical resolution r_v; and setting the proxy camera intrinsic matrix

K = [[f_x, s, p_x], [0, f_y, p_y], [0, 0, 1]]

and extrinsic matrix C_w = [R | T];

wherein f_x and f_y are the focal lengths of the camera in the x and y directions in pixels, (p_x, p_y) is the pixel position of the camera principal point on the virtual lidar equivalent projection plane, s is the camera distortion (skew) parameter, R is the rotation matrix between the virtual lidar coordinate system and the camera coordinate system, and T is the position of the camera origin in the virtual lidar coordinate system.
In some preferred embodiments, step S20 includes:
step S21, constructing angle information of a virtual laser radar beam [ i, j ]; wherein i is the horizontal scanning id of the laser radar beam, and j is the line beam id of the laser radar beam;
step S22, defining the coordinates of the end points of the virtual laser radar beams [ i, j ] in a virtual laser radar coordinate system according to the angle information of the virtual laser radar beams [ i, j ] respectively;
step S23, projecting the end point of the virtual laser radar beam [ i, j ] to a camera imaging plane based on the coordinate of the end point in a virtual laser radar coordinate system;
and step S24, calculating the perception range of each projection point, thereby constructing a camera projection plane, the projection points of the virtual laser radar beam end points and a virtual laser radar equivalent projection plane formed by corresponding perception radiuses.
In some preferred embodiments, constructing the angle information of each virtual lidar beam in step S21 includes:
the angle information of the virtual lidar beam [i, j] is defined as:

D_{i,j} = [α_{i,j}, β_{i,j}]

wherein α_{i,j} = -α_0 + i × r_h is the heading angle of virtual lidar beam [i, j], and β_{i,j} = -β_0 + j × r_v is its pitch angle.
In some preferred embodiments, step S22 includes:
the coordinates of the endpoint of virtual lidar beam [i, j] in the virtual lidar coordinate system are defined as:

P_{i,j} = [x_{i,j}, y_{i,j}, z_{i,j}, 1]

wherein x_{i,j} = I_0 × cos(-β_{i,j}) × sin(α_{i,j}), y_{i,j} = I_0 × cos(-β_{i,j}) × cos(α_{i,j}) and z_{i,j} = I_0 × sin(β_{i,j}) are the x, y and z coordinates of the endpoint of virtual lidar beam [i, j] in the virtual lidar coordinate system, and I_0 is the distance from the beam endpoint to the origin of that coordinate system.
In some preferred embodiments, step S23 includes:
calculating the projection of the endpoint of virtual lidar beam [i, j] on the camera imaging plane from its coordinates in the virtual lidar coordinate system using the projection formula:

q_{i,j} = K × C_w × P_{i,j}

wherein K is the camera intrinsic matrix and C_w is the camera extrinsic matrix.
In some preferred embodiments, step S24 includes:
according to the average distance d_{i,j} from each projection point to its effective neighboring projection points, the effective perception radius of the projection point is calculated as:

s_{i,j} = γ × d_{i,j} / 2

wherein γ ∈ [0, 1] controls the perception range;
the camera projection plane, the projection points of the virtual laser radar beam end points and the corresponding perception radius form a virtual laser radar equivalent projection plane.
In some preferred embodiments, the extracting of the surface point cloud coordinate information of the object to be scanned in step S30 includes:
uniformly sampling the surface patches of the target to be scanned at intervals of 1 cm between points to obtain an N × 3 matrix S formed by the target surface point cloud coordinates;
wherein, N is the number of sampled target surface points, and 3 represents that each row in the matrix S stores the three-dimensional coordinates of a point in the virtual lidar coordinate system.
In some preferred embodiments, the projecting the surface point cloud of the target to be scanned to the lidar equivalent projection plane using the proxy camera parameters in step S30 includes:
for the k-th point S_k in the surface point cloud of the target to be scanned, its projection coordinates q_k on the equivalent projection plane are calculated as:

q_k = K × C_w × S̃_k

wherein S̃_k denotes S_k in homogeneous coordinates, K is the camera intrinsic matrix, and C_w is the camera extrinsic matrix.
In some preferred embodiments, step S40 includes:
step S41, for the projection point of each target surface point, finding its nearest neighbor among the beam projection points on the equivalent projection plane, with beam index [i_k, j_k]; if the distance between the two is less than the corresponding perception radius s_{i_k,j_k}, retaining the surface point and setting s_{i_k,j_k} to -1;
and step S42, traversing the projection points of the target surface points to obtain the target surface point information which can be scanned by the laser radar, and completing the on-line simulation of the virtual laser radar.
The invention has the beneficial effects that:
(1) the virtual laser radar online simulation method based on projection does not depend on large-scale computer graphics software, effectively solves the problem of complicated processing flow in the laser radar simulation field, reduces resource consumption, and greatly improves processing efficiency.
(2) The virtual laser radar online simulation method based on projection can be directly integrated into deep learning data preprocessing, so that online laser radar simulation is realized, and the efficiency of subsequent model training and the universality of the model are improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a schematic diagram of a scheme and a flow chart of the projection-based virtual lidar online simulation method of the invention;
FIG. 2 is a schematic diagram of an equivalent projection plane of the projection-based virtual lidar online simulation method of the present invention;
FIG. 3 is a schematic diagram of object surface point cloud extraction in the projection-based virtual lidar online simulation method of the present invention;
fig. 4 is a schematic diagram of cloud projection of object surface points and judgment of effective sensing points in the projection-based virtual lidar online simulation method of the invention.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
The invention relates to a virtual laser radar online simulation method based on projection, which comprises the following steps:
step S10, setting virtual laser radar parameters and proxy camera internal and external parameters;
step S20, establishing a virtual laser radar equivalent projection plane by using the virtual laser radar parameters and the internal and external parameters of the proxy camera; the virtual laser radar equivalent projection plane consists of a camera projection plane, projection points of virtual laser radar beam endpoints and corresponding perception radiuses;
step S30, extracting the coordinate information of the point cloud on the surface of the target to be scanned, and projecting the point cloud on the surface of the target to be scanned to the equivalent projection plane of the laser radar by using the parameters of the proxy camera;
and step S40, determining the information of the target surface points which can be scanned by the laser radar according to the distance between the laser radar line beam and the projection points of the target to be scanned on the equivalent projection plane, and completing the on-line simulation of the virtual laser radar.
In order to more clearly describe the projection-based virtual lidar online simulation method of the present invention, the following describes in detail the steps in the embodiment of the present invention with reference to fig. 1.
The projection-based virtual lidar online simulation method of the first embodiment of the invention comprises the steps of S10-S40, and the steps are described in detail as follows:
step S10, setting virtual laser radar parameters and proxy camera internal and external parameters, respectively including:
As shown in the middle-left part of fig. 1, which is a schematic diagram of the virtual lidar parameters in the projection-based virtual lidar online simulation method of the present invention, the parameters include four items: horizontal viewing angle, horizontal resolution, vertical viewing angle, and vertical resolution. The specific settings are as follows:
setting the horizontal field of view FoV_h of the virtual lidar to the range [-α_0, α_0] with horizontal resolution r_h, and the vertical field of view FoV_v to the range [-β_0, β_0] with vertical resolution r_v; and setting the proxy camera intrinsic matrix

K = [[f_x, s, p_x], [0, f_y, p_y], [0, 0, 1]]

and extrinsic matrix C_w = [R | T];

wherein f_x and f_y are the focal lengths of the camera in the x and y directions in pixels, (p_x, p_y) is the pixel position of the camera principal point on the virtual lidar equivalent projection plane, s is the camera distortion (skew) parameter, R is the rotation matrix between the virtual lidar coordinate system and the camera coordinate system, and T is the position of the camera origin in the virtual lidar coordinate system.
As shown in the upper left of fig. 1, which is a schematic diagram of a virtual lidar coordinate system of the virtual lidar online simulation method based on projection of the present invention, a horizontal forward direction is an X-axis direction of the virtual lidar coordinate system, a horizontal leftward direction (corresponding to an arrow in the right direction in the figure) is a Y-axis direction of the virtual lidar coordinate system, a vertical upward direction is a Z-axis direction of the virtual lidar coordinate system, an included angle between a projection line of a lidar beam on an X0Y plane and the Y-axis is a heading angle, and an included angle between a projection line of the lidar beam on a Z0Y plane and the Y-axis is a pitch angle.
Step S20, establishing a virtual laser radar equivalent projection plane by using the virtual laser radar parameters and the internal and external parameters of the proxy camera; the virtual laser radar equivalent projection plane consists of a camera projection plane, projection points of virtual laser radar beam endpoints and corresponding perception radiuses.
Step S21, constructing angle information of the virtual laser radar beam [ i, j ], including:
the angle information of the virtual lidar beam [i, j] is defined as shown in equation (1):

D_{i,j} = [α_{i,j}, β_{i,j}]    (1)

wherein α_{i,j} = -α_0 + i × r_h is the heading angle of virtual lidar beam [i, j], β_{i,j} = -β_0 + j × r_v is its pitch angle, i is the horizontal scanning id of the lidar beam, and j is the line-beam id of the lidar beam.
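Equation (1) amounts to laying out a regular grid of heading and pitch angles. A minimal Python sketch (not part of the patent; the function name `beam_angles` and the sample field-of-view values are assumptions for illustration):

```python
import numpy as np

def beam_angles(alpha0, r_h, beta0, r_v):
    """Angle table for every virtual lidar beam, per Eq. (1):
    alpha_ij = -alpha0 + i * r_h (heading), beta_ij = -beta0 + j * r_v (pitch)."""
    n_h = int(round(2 * alpha0 / r_h)) + 1   # number of horizontal scan positions i
    n_v = int(round(2 * beta0 / r_v)) + 1    # number of line beams j
    alphas = -alpha0 + np.arange(n_h) * r_h  # heading angles, degrees
    betas = -beta0 + np.arange(n_v) * r_v    # pitch angles, degrees
    return alphas, betas

# Illustrative parameters: 180° horizontal FoV at 0.2°, ±15° vertical FoV at 2°.
alphas, betas = beam_angles(alpha0=90.0, r_h=0.2, beta0=15.0, r_v=2.0)
```

For these sample values the grid has 901 horizontal positions and 16 line beams, spanning exactly [-α_0, α_0] and [-β_0, β_0].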
Step S22, defining coordinates of the end points of the virtual lidar beams [ i, j ] in a virtual lidar coordinate system according to the angle information of the virtual lidar beams [ i, j ], respectively, including:
the coordinates of the endpoint of virtual lidar beam [i, j] in the virtual lidar coordinate system are defined as shown in equation (2):

P_{i,j} = [x_{i,j}, y_{i,j}, z_{i,j}, 1]    (2)

wherein x_{i,j} = I_0 × cos(-β_{i,j}) × sin(α_{i,j}), y_{i,j} = I_0 × cos(-β_{i,j}) × cos(α_{i,j}) and z_{i,j} = I_0 × sin(β_{i,j}) are the x, y and z coordinates of the endpoint in the virtual lidar coordinate system, and I_0 is the distance from the beam endpoint to the origin of the coordinate system.
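Equation (2) is a spherical-to-Cartesian conversion, so every endpoint lies at distance I_0 from the origin. A hedged Python sketch (function name and the default I_0 = 100 are illustrative assumptions, not values from the patent):

```python
import numpy as np

def beam_endpoint(alpha_deg, beta_deg, I0=100.0):
    """Homogeneous endpoint P = [x, y, z, 1] of a beam with heading alpha
    and pitch beta at distance I0 from the lidar origin, per Eq. (2)."""
    a, b = np.radians(alpha_deg), np.radians(beta_deg)
    x = I0 * np.cos(-b) * np.sin(a)   # lateral component
    y = I0 * np.cos(-b) * np.cos(a)   # forward component
    z = I0 * np.sin(b)                # vertical component
    return np.array([x, y, z, 1.0])

# A beam with zero heading and zero pitch points straight along the y axis.
p = beam_endpoint(0.0, 0.0)
q = beam_endpoint(30.0, 10.0)
```

Because cos²(-β) (sin²α + cos²α) + sin²β = 1, the Euclidean norm of [x, y, z] is always I_0, which is a quick sanity check on the formula.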
Step S23, projecting the end point of the virtual lidar beam [ i, j ] to a camera imaging plane based on the coordinates of the end point in a virtual lidar coordinate system, comprising:
calculating the projection of the endpoint of virtual lidar beam [i, j] on the camera imaging plane from its coordinates in the virtual lidar coordinate system, as shown in equation (3):

q_{i,j} = K × C_w × P_{i,j}    (3)

wherein K is the camera intrinsic matrix and C_w is the camera extrinsic matrix.
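The rendered projection formula did not survive extraction; assuming the standard pinhole model implied by the intrinsic/extrinsic matrices defined elsewhere in the document, the step can be sketched as follows (the K, R, T values here are illustrative stand-ins, not parameters from the patent):

```python
import numpy as np

def project_endpoint(P, K, R, T):
    """Project a homogeneous lidar-frame point P = [x, y, z, 1] to a pixel
    (u, v) through extrinsics [R | T] and intrinsics K (pinhole model)."""
    Cw = np.hstack([R, T.reshape(3, 1)])  # 3x4 extrinsic matrix [R | T]
    uvw = K @ Cw @ P                      # homogeneous image coordinates
    return uvw[:2] / uvw[2]               # perspective divide

# Illustrative proxy-camera parameters (assumed):
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)       # camera axes aligned with the lidar frame (assumption)
T = np.zeros(3)     # camera at the lidar origin (assumption)

# A point 50 m straight down the optical axis lands on the principal point.
uv = project_endpoint(np.array([0.0, 0.0, 50.0, 1.0]), K, R, T)
```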
Step S24, calculating the perception range of each projection point, thereby constructing a virtual lidar equivalent projection plane composed of a camera projection plane, projection points of virtual lidar beam endpoints and corresponding perception radii, comprising:
according to the average distance d_{i,j} from each projection point to its effective neighboring projection points, the effective perception radius of the projection point is calculated as shown in equation (4):

s_{i,j} = γ × d_{i,j} / 2    (4)

wherein γ ∈ [0, 1] controls the perception range;
the camera projection plane, the projection points of the virtual laser radar beam end points and the corresponding perception radius form a virtual laser radar equivalent projection plane.
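A hedged sketch of the perception-radius computation of equation (4), taking the four grid neighbors as the "effective neighboring projection points" (the patent does not pin down the neighborhood, so this choice and the function name are assumptions):

```python
import numpy as np

def perception_radii(proj, gamma=0.8):
    """s_ij = gamma * d_ij / 2 (Eq. 4), where d_ij is the average pixel
    distance from each beam projection point to its in-grid neighbors.
    `proj` is an (n_v, n_h, 2) array of projected beam endpoints."""
    n_v, n_h, _ = proj.shape
    s = np.zeros((n_v, n_h))
    for j in range(n_v):
        for i in range(n_h):
            dists = []
            for dj, di in ((0, -1), (0, 1), (-1, 0), (1, 0)):
                jj, ii = j + dj, i + di
                if 0 <= jj < n_v and 0 <= ii < n_h:   # neighbor exists in grid
                    dists.append(np.linalg.norm(proj[j, i] - proj[jj, ii]))
            s[j, i] = gamma * np.mean(dists) / 2.0
    return s

# On a regular 10-pixel grid every radius is gamma * 10 / 2 = 4 pixels.
xs, ys = np.meshgrid(np.arange(5) * 10.0, np.arange(3) * 10.0)
radii = perception_radii(np.stack([xs, ys], axis=-1), gamma=0.8)
```

With γ < 1 the perception circles of adjacent beams never overlap, which matches the schematic of fig. 2 where each beam produces a separate dotted circle.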
Fig. 2 is a schematic view of an equivalent projection plane of the projection-based virtual lidar online simulation method of the present invention, and fig. 2 is a schematic view of an equivalent projection plane of a three-line lidar online simulation example, in which three rows of dotted circles respectively represent scanning effects generated by rotation of three lidar beams.
And step S30, extracting the surface point cloud coordinate information of the target to be scanned, and projecting the surface point cloud of the target to be scanned to the equivalent projection plane of the laser radar by using the proxy camera parameters.
The method for extracting the surface point cloud coordinate information of the target to be scanned comprises the following steps:
uniformly sampling the surface patches of the target to be scanned at intervals of 1 cm between points to obtain an N × 3 matrix S formed by the target surface point cloud coordinates;
wherein, N is the number of sampled target surface points, and 3 represents that each row in the matrix S stores the three-dimensional coordinates of a point in the virtual lidar coordinate system.
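As a minimal illustration of the 1 cm uniform sampling, the sketch below samples a flat rectangular patch; the patent targets general object surfaces, so this planar case and the function name are simplifying assumptions:

```python
import numpy as np

def sample_plane_patch(width_m, height_m, origin, step=0.01):
    """Uniformly sample a rectangular patch at 1 cm spacing (the patent's
    interval), returning an N x 3 matrix S of lidar-frame coordinates."""
    us = np.arange(0.0, width_m + 1e-9, step)    # samples along the patch width
    vs = np.arange(0.0, height_m + 1e-9, step)   # samples along the patch height
    uu, vv = np.meshgrid(us, vs, indexing="ij")
    # Patch spans the y-z plane at fixed x = origin[0], facing the lidar.
    S = np.stack([np.full(uu.size, origin[0]),
                  origin[1] + uu.ravel(),
                  origin[2] + vv.ravel()], axis=1)
    return S

# A 10 cm x 10 cm patch 5 m in front of the lidar: 11 x 11 = 121 points.
S = sample_plane_patch(0.1, 0.1, origin=(5.0, 0.0, 0.0))
```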
As shown in fig. 3, a schematic diagram of extracting object surface point clouds in the virtual lidar online simulation method based on projection is shown, where the right side is an object model and the left side is an object point cloud corresponding to the object model.
Projecting the surface point cloud of the target to be scanned to a laser radar equivalent projection plane by using the proxy camera parameters comprises:
for the k-th point S_k in the surface point cloud of the target to be scanned, its projection coordinates q_k are calculated as shown in equation (5):

q_k = K × C_w × S̃_k    (5)

wherein S̃_k denotes S_k in homogeneous coordinates, K is the camera intrinsic matrix, and C_w is the camera extrinsic matrix.
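Since N can be large, the per-point projection is naturally vectorized. A hedged sketch (the K, R, T values are illustrative stand-ins, not parameters from the patent):

```python
import numpy as np

def project_cloud(S, K, R, T):
    """Project an (N, 3) matrix of lidar-frame surface points to (N, 2)
    pixel coordinates, applying Eq. (5) to every row at once."""
    N = S.shape[0]
    Sh = np.hstack([S, np.ones((N, 1))])   # homogenize: (N, 4)
    Cw = np.hstack([R, T.reshape(3, 1)])   # 3x4 extrinsic matrix [R | T]
    uvw = (K @ Cw @ Sh.T).T                # (N, 3) homogeneous pixels
    return uvw[:, :2] / uvw[:, 2:3]        # perspective divide per row

# Illustrative camera (assumed): identity pose, 500 px focal length.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 10.0],
                [1.0, 0.0, 10.0]])
q = project_cloud(pts, K, np.eye(3), np.zeros(3))
```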
Step S40, according to the distance between the laser radar beam and the projection point of the target to be scanned on the equivalent projection plane, determining the target surface point information which can be scanned by the laser radar, and completing the virtual laser radar on-line simulation, including:
step S41, for the projection point of each target surface point, finding its nearest neighbor among the beam projection points on the equivalent projection plane, with beam index [i_k, j_k]; if the distance between the two is less than the corresponding perception radius s_{i_k,j_k}, retaining the surface point and setting s_{i_k,j_k} to -1, so that each lidar line beam scans at most one point;
and step S42, traversing the projection points of the target surface points to obtain the target surface point information which can be scanned by the laser radar, and completing the on-line simulation of the virtual laser radar.
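Steps S41 and S42 can be sketched together as a single pass over the projected surface points; the function name and the flat array layout of the beam grid are assumptions for illustration:

```python
import numpy as np

def simulate_scan(surface_px, surface_pts, beam_px, s):
    """Steps S41-S42: each projected surface point claims its nearest beam
    projection if it falls inside that beam's perception radius; the radius
    is then set to -1 so each beam scans at most one point.
    surface_px: (N, 2) projected surface points; surface_pts: (N, 3) their
    3-D coordinates; beam_px: (M, 2) beam projections; s: (M,) radii."""
    s = s.copy()
    hits = []
    for px, pt in zip(surface_px, surface_pts):
        d = np.linalg.norm(beam_px - px, axis=1)
        k = int(np.argmin(d))              # nearest beam projection point
        if s[k] >= 0 and d[k] < s[k]:      # inside an unconsumed perception circle?
            hits.append(pt)
            s[k] = -1.0                    # beam consumed: scans only this point
    return np.array(hits)

# Two beams, three candidate surface points: the second candidate is shadowed
# because the first already consumed its nearest beam.
beam_px = np.array([[0.0, 0.0], [10.0, 0.0]])
radii = np.array([2.0, 2.0])
surf_px = np.array([[0.5, 0.0], [0.6, 0.0], [10.5, 0.0]])
surf_pts = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
hits = simulate_scan(surf_px, surf_pts, beam_px, radii)
```

The shadowing behavior mirrors a real scanner: a beam returns one point, so surface points behind (or crowded around) an already-claimed beam are not reported.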
As shown in fig. 4, which is a schematic diagram of object surface point cloud projection and effective perception point determination in the projection-based virtual lidar online simulation method of the present invention, when the object surface point cloud is projected onto the lidar equivalent projection plane, the object points falling within the perception range of a lidar projection point are captured by the lidar.
Although the foregoing embodiments describe the steps in the above sequential order, those skilled in the art will understand that, in order to achieve the effect of the present embodiments, the steps may not be executed in such an order, and may be executed simultaneously (in parallel) or in an inverse order, and these simple variations are within the scope of the present invention.
The virtual laser radar online simulation system based on projection of the second embodiment of the invention comprises the following modules:
the parameter definition module is configured to set virtual laser radar parameters and proxy camera internal and external parameters;
the virtual laser radar equivalent projection plane construction module is configured to establish a virtual laser radar equivalent projection plane by using the virtual laser radar parameters and the internal and external parameters of the proxy camera; the virtual laser radar equivalent projection plane consists of a camera projection plane, projection points of virtual laser radar beam endpoints and corresponding perception radiuses;
the projection module of the target to be scanned is configured to extract the coordinate information of the point cloud of the surface of the target to be scanned, and project the point cloud of the surface of the target to be scanned to the equivalent projection plane of the laser radar by using the parameters of the proxy camera;
and the virtual laser radar online simulation module is configured to determine target surface point information which can be scanned by the laser radar according to the distance between the laser radar wiring harness and the projection point of the target to be scanned on the equivalent projection plane, and complete the virtual laser radar online simulation.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process and related description of the system described above may refer to the corresponding process in the foregoing method embodiments, and will not be described herein again.
It should be noted that, the projection-based virtual lidar online simulation system provided in the foregoing embodiment is only illustrated by the division of the functional modules, and in practical applications, the functions may be allocated to different functional modules according to needs, that is, the modules or steps in the embodiment of the present invention are further decomposed or combined, for example, the modules in the foregoing embodiment may be combined into one module, or may be further split into multiple sub-modules, so as to complete all or part of the functions described above. The names of the modules and steps involved in the embodiments of the present invention are only for distinguishing the modules or steps, and are not to be construed as unduly limiting the present invention.
An electronic apparatus according to a third embodiment of the present invention includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the processor, which, when executed by the processor, implement the projection-based virtual lidar online simulation method described above.
A computer-readable storage medium of a fourth embodiment of the present invention stores computer instructions for being executed by the computer to implement the projection-based virtual lidar online simulation method described above.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes and related descriptions of the storage device and the processing device described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Those of skill in the art would appreciate that the various illustrative modules, method steps, and modules described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that programs corresponding to the software modules, method steps may be located in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. To clearly illustrate this interchangeability of electronic hardware and software, various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The terms "first," "second," and the like are used for distinguishing between similar elements and not necessarily for describing or implying a particular order or sequence.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.
Claims (10)
1. A virtual laser radar online simulation method based on projection is characterized by comprising the following steps:
step S10, setting virtual laser radar parameters and proxy camera internal and external parameters;
step S20, establishing a virtual laser radar equivalent projection plane by using the virtual laser radar parameters and the internal and external parameters of the proxy camera; the virtual laser radar equivalent projection plane consists of a camera projection plane, projection points of virtual laser radar beam endpoints and corresponding perception radiuses;
step S30, extracting the coordinate information of the point cloud on the surface of the target to be scanned, and projecting the point cloud on the surface of the target to be scanned to the equivalent projection plane of the laser radar by using the parameters of the proxy camera;
and step S40, determining the information of the target surface points which can be scanned by the laser radar according to the distance between the laser radar line beam and the projection points of the target to be scanned on the equivalent projection plane, and completing the on-line simulation of the virtual laser radar.
2. The projection-based virtual lidar online simulation method of claim 1, wherein the setting of the virtual lidar parameters and the proxy camera internal and external parameters in step S10 respectively comprises:
setting the horizontal field of view FoV_h of the virtual lidar to the range [-α_0, α_0] with horizontal resolution r_h, and the vertical field of view FoV_v to the range [-β_0, β_0] with vertical resolution r_v; and setting the proxy camera intrinsic matrix

K = [[f_x, s, p_x], [0, f_y, p_y], [0, 0, 1]]

and extrinsic matrix C_w = [R | T];

wherein f_x and f_y are the focal lengths of the camera in the x and y directions in pixels, (p_x, p_y) is the pixel position of the camera principal point on the virtual lidar equivalent projection plane, s is the camera distortion (skew) parameter, R is the rotation matrix between the virtual lidar coordinate system and the camera coordinate system, and T is the position of the camera origin in the virtual lidar coordinate system.
3. The projection-based virtual lidar online simulation method of claim 2, wherein step S20 comprises:
step S21, constructing the angle information of each virtual lidar beam [i, j]; wherein i is the horizontal scanning id of the lidar beam and j is the line-beam id of the lidar beam;
step S22, defining the coordinates of the end point of each virtual lidar beam [i, j] in the virtual lidar coordinate system according to its angle information;
step S23, projecting the end point of the virtual lidar beam [i, j] onto the camera imaging plane based on its coordinates in the virtual lidar coordinate system;
and step S24, calculating the perception range of each projection point, thereby constructing the virtual lidar equivalent projection plane formed by the camera projection plane, the projection points of the virtual lidar beam end points, and the corresponding perception radii.
4. The method for projection-based virtual lidar online simulation of claim 3, wherein the constructing of the angle information for each virtual lidar beam in step S21 comprises:
the angle information of the virtual lidar beam [i, j] is defined as:

D_{i,j} = [α_{i,j}, β_{i,j}]

wherein α_{i,j} = −α_0 + i × r_h is the heading angle of the virtual lidar beam [i, j], and β_{i,j} = −β_0 + j × r_v is the pitch angle of the virtual lidar beam [i, j].
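The claim-4 angle table can be sketched as follows; the field-of-view ranges and resolutions are illustrative values, not ones specified by the patent.

```python
import numpy as np

# Illustrative ranges: horizontal FoV [-90, 90] deg at 1 deg,
# vertical FoV [-15, 15] deg at 2 deg.
alpha0, r_h = np.deg2rad(90.0), np.deg2rad(1.0)
beta0, r_v = np.deg2rad(15.0), np.deg2rad(2.0)

n_h = int(round(2 * alpha0 / r_h)) + 1   # horizontal scan positions i
n_v = int(round(2 * beta0 / r_v)) + 1    # line beams j

alpha = -alpha0 + np.arange(n_h) * r_h   # heading angle alpha_{i,j}
beta = -beta0 + np.arange(n_v) * r_v     # pitch angle beta_{i,j}

# D[i, j] = [alpha[i], beta[j]] for every beam [i, j]:
D = np.stack(np.meshgrid(alpha, beta, indexing="ij"), axis=-1)
```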
5. The projection-based virtual lidar online simulation method of claim 4, wherein the step S22 comprises:
defining the coordinates of the end point of the virtual lidar beam [i, j] in the virtual lidar coordinate system as:

P_{i,j} = [x_{i,j}, y_{i,j}, z_{i,j}, 1]

wherein x_{i,j} = I_0 × cos(−β_{i,j}) × sin(α_{i,j}) is the x-coordinate of the end point in the virtual lidar coordinate system, y_{i,j} = I_0 × cos(−β_{i,j}) × cos(α_{i,j}) is the y-coordinate, z_{i,j} = I_0 × sin(β_{i,j}) is the z-coordinate, and I_0 is the length of the virtual lidar beam [i, j], i.e. the distance from the end point to the origin of the virtual lidar coordinate system.
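The claim-5 endpoint formulas translate directly into code; `beam_endpoint` below is a hypothetical helper name, and the unit beam length I_0 = 1 is an illustrative choice.

```python
import numpy as np

def beam_endpoint(alpha_ij, beta_ij, I0=1.0):
    """Endpoint of virtual lidar beam [i, j] in the lidar frame (claim 5),
    as homogeneous coordinates [x, y, z, 1]. I0 is the beam length."""
    x = I0 * np.cos(-beta_ij) * np.sin(alpha_ij)
    y = I0 * np.cos(-beta_ij) * np.cos(alpha_ij)
    z = I0 * np.sin(beta_ij)
    return np.array([x, y, z, 1.0])

# A beam with zero heading and zero pitch points along the +y axis:
P = beam_endpoint(alpha_ij=0.0, beta_ij=0.0)
```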
6. The projection-based virtual lidar on-line simulation method of claim 5, wherein the step S23 comprises:
calculating the projection of the end point of the virtual lidar beam [i, j] on the camera imaging plane by applying a projection formula to the coordinates of the end point in the virtual lidar coordinate system.
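The projection formula itself is not reproduced in this text; the sketch below assumes the standard pinhole projection built from the claim-2 parameters, i.e. image coordinates proportional to K [R | t] P. The example rotation R (mapping the lidar forward axis y to the camera optical axis z) and zero translation are assumptions for illustration.

```python
import numpy as np

def project(P_lidar_h, K, R, t):
    """Project a homogeneous lidar-frame point onto the camera image plane
    (assumed pinhole model with intrinsics K and extrinsics [R | t])."""
    Rt = np.hstack([R, t.reshape(3, 1)])   # 3x4 extrinsic matrix
    uvw = K @ Rt @ P_lidar_h               # homogeneous pixel coordinates
    return uvw[:2] / uvw[2]                # (u, v)

K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
R = np.array([[1., 0., 0.], [0., 0., -1.], [0., 1., 0.]])  # lidar y -> camera z
t = np.zeros(3)

# The forward beam endpoint lands on the principal point (320, 240):
uv = project(np.array([0., 1., 0., 1.]), K, R, t)
```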
7. The method for projection-based virtual lidar online simulation of claim 6, wherein step S24 comprises:
calculating the effective perception radius of each projection point from the average distance d_{i,j} between the projection point and its effective neighboring projection points:

s_{i,j} = γ × d_{i,j} / 2

wherein γ ∈ [0, 1] controls the perception range;
the camera projection plane, the projection points of the virtual lidar beam end points, and the corresponding perception radii form the virtual lidar equivalent projection plane.
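A minimal sketch of the claim-7 radius computation follows. The patent does not define which neighbors are "effective"; the 4-neighborhood on the beam grid used here is an assumption, as is the helper name `perception_radii`.

```python
import numpy as np

def perception_radii(proj, gamma=0.8):
    """proj: (n_h, n_v, 2) array of projected beam endpoints (u, v).
    Returns s[i, j] = gamma * d[i, j] / 2, where d is the average distance
    to the (assumed) 4-neighborhood on the beam grid."""
    n_h, n_v = proj.shape[:2]
    s = np.zeros((n_h, n_v))
    for i in range(n_h):
        for j in range(n_v):
            nbrs = [proj[i + di, j + dj]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < n_h and 0 <= j + dj < n_v]
            d = np.mean([np.linalg.norm(proj[i, j] - p) for p in nbrs])
            s[i, j] = gamma * d / 2
    return s

# On a unit-spaced grid every neighbor distance is 1, so s == 0.5 for gamma=1:
grid = np.stack(np.meshgrid(np.arange(4.), np.arange(3.), indexing="ij"), -1)
s = perception_radii(grid, gamma=1.0)
```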
8. The projection-based virtual lidar online simulation method of claim 1, wherein the extracting the surface point cloud coordinate information of the target to be scanned in step S30 comprises:
uniformly sampling the surface patches of the target to be scanned at a spacing of 1 cm between points to obtain an N × 3 matrix S of target-surface point cloud coordinates;
wherein N is the number of sampled target surface points, and each row of the matrix S stores the three-dimensional coordinates of one point in the virtual lidar coordinate system.
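The claim-8 sampling step can be sketched for a planar patch; the 1 m × 1 m patch placed 5 m ahead of the sensor is an illustrative target, not one from the patent.

```python
import numpy as np

step = 0.01                                   # 1 cm sampling interval
xs = np.arange(-0.5, 0.5 + step / 2, step)    # patch extent in x
zs = np.arange(-0.5, 0.5 + step / 2, step)    # patch extent in z
X, Z = np.meshgrid(xs, zs, indexing="ij")

# N x 3 matrix S: one 3-D point per row, in the virtual lidar frame.
S = np.column_stack([X.ravel(),               # x coordinate
                     np.full(X.size, 5.0),    # patch 5 m ahead along y
                     Z.ravel()])              # z coordinate
N = S.shape[0]                                # number of sampled surface points
```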
9. The method of claim 8, wherein the step of projecting the point cloud of the surface of the target to be scanned onto the equivalent projection plane of the lidar using the proxy camera parameters in step S30 comprises:
for the k-th point S_k in the surface point cloud of the target to be scanned, calculating its projection coordinates on the virtual lidar equivalent projection plane using the proxy camera parameters.
10. The method for projection-based virtual lidar online simulation of claim 9, wherein step S40 comprises:
step S41, for the projection point of each target surface point, finding its nearest neighbor among the projection points of the virtual lidar beam end points; if the distance between the target projection point and this nearest neighbor, belonging to beam [i_k, j_k], is less than the perception radius s_{i_k,j_k} of that beam, retaining the target surface point and setting s_{i_k,j_k} to −1;
and step S42, traversing the projection points of the target surface points to obtain the target surface point information which can be scanned by the laser radar, and completing the on-line simulation of the virtual laser radar.
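Steps S41 and S42 can be sketched as a greedy nearest-neighbor match. The assumption that the target projections are processed in order of increasing distance to the sensor (so that setting the radius to −1 keeps only the closest return per beam) is an interpretation of claim 10, and `simulate_scan` is a hypothetical helper name.

```python
import numpy as np

def simulate_scan(target_uv, beam_uv, s):
    """target_uv: (N, 2) projections of surface points (assumed sorted by
    distance to the sensor); beam_uv: (M, 2) beam projections; s: (M,)
    perception radii. Returns indices of surface points the lidar reports."""
    s = s.copy()
    hits = []
    for k, p in enumerate(target_uv):
        m = int(np.argmin(np.linalg.norm(beam_uv - p, axis=1)))  # nearest beam
        if s[m] >= 0 and np.linalg.norm(beam_uv[m] - p) < s[m]:
            hits.append(k)
            s[m] = -1.0        # beam consumed: at most one return per beam
    return hits

beams = np.array([[0., 0.], [10., 0.]])
radii = np.array([1.0, 1.0])
targets = np.array([[0.1, 0.0], [0.2, 0.0], [10.1, 0.0]])
hits = simulate_scan(targets, beams, radii)  # second point occluded
```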
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110472858.8A CN113176557B (en) | 2021-04-29 | 2021-04-29 | Virtual laser radar online simulation method based on projection |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113176557A true CN113176557A (en) | 2021-07-27 |
CN113176557B CN113176557B (en) | 2023-03-24 |
Family
ID=76925254
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110472858.8A Active CN113176557B (en) | 2021-04-29 | 2021-04-29 | Virtual laser radar online simulation method based on projection |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113176557B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108596979A (en) * | 2018-03-27 | 2018-09-28 | 深圳市智能机器人研究院 | A kind of caliberating device and method for laser radar and depth camera |
CN111127563A (en) * | 2019-12-18 | 2020-05-08 | 北京万集科技股份有限公司 | Combined calibration method and device, electronic equipment and storage medium |
CN111427026A (en) * | 2020-02-21 | 2020-07-17 | 深圳市镭神智能系统有限公司 | Laser radar calibration method and device, storage medium and self-moving equipment |
US10838049B1 (en) * | 2019-12-17 | 2020-11-17 | The Boeing Company | Calibration procedure for establishing an extrinsic relationship between lidar and camera sensors |
CN111965624A (en) * | 2020-08-06 | 2020-11-20 | 北京百度网讯科技有限公司 | Calibration method, device and equipment for laser radar and camera and readable storage medium |
US20210003683A1 (en) * | 2019-07-03 | 2021-01-07 | DeepMap Inc. | Interactive sensor calibration for autonomous vehicles |
CN112581505A (en) * | 2020-12-24 | 2021-03-30 | 天津师范大学 | Simple automatic registration method for laser radar point cloud and optical image |
Non-Patent Citations (1)
Title |
---|
HUANG, Zhiqing et al.: "Research on extrinsic parameter calibration of a 2D lidar and a visible-light camera", Chinese Journal of Scientific Instrument (《仪器仪表学报》) *
Also Published As
Publication number | Publication date |
---|---|
CN113176557B (en) | 2023-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107292927B (en) | Binocular vision-based symmetric motion platform pose measurement method | |
CN108765584B (en) | Laser point cloud data set augmentation method, device and readable storage medium | |
US20170308736A1 (en) | Three dimensional object recognition | |
Zuo et al. | Devo: Depth-event camera visual odometry in challenging conditions | |
CN111046776A (en) | Mobile robot traveling path obstacle detection method based on depth camera | |
CN114419147A (en) | Rescue robot intelligent remote human-computer interaction control method and system | |
JP2021168143A (en) | System and method for efficiently scoring probe in image by vision system | |
CN112414403A (en) | Robot positioning and attitude determining method, equipment and storage medium | |
CN112329846A (en) | Laser point cloud data high-precision marking method and system, server and medium | |
CN116193108B (en) | Online self-calibration method, device, equipment and medium for camera | |
CN112700498A (en) | Wind driven generator blade tip positioning method and system based on deep learning | |
CN116379915A (en) | Building mapping method, device, system and storage medium | |
CN115984766A (en) | Rapid monocular vision three-dimensional target detection method for underground coal mine | |
CN113176557B (en) | Virtual laser radar online simulation method based on projection | |
Cai et al. | Improving CNN-based planar object detection with geometric prior knowledge | |
CN113763478A (en) | Unmanned vehicle camera calibration method, device, equipment, storage medium and system | |
CN116630411A (en) | Mining electric shovel material surface identification method, device and system based on fusion perception | |
CN116642490A (en) | Visual positioning navigation method based on hybrid map, robot and storage medium | |
CN113269803B (en) | Scanning positioning method, system and equipment based on 2D laser and depth image fusion | |
CN115239822A (en) | Real-time visual identification and positioning method and system for multi-module space of split type flying vehicle | |
CN113920020A (en) | Human point cloud real-time repairing method based on depth generation model | |
CN114694106A (en) | Extraction method and device of road detection area, computer equipment and storage medium | |
Wang et al. | Full Period Three-dimensional (3-D) Reconstruction Method for a Low Cost Singlelayer Lidar. | |
Ahmad Yusri et al. | Preservation of cultural heritage: a comparison study of 3D modelling between laser scanning, depth image, and photogrammetry methods | |
CN115937457B (en) | Real-time topography sketch method based on DEM image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||