CN116644616B - Point cloud distortion effect reduction method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN116644616B (application CN202310913118.2A)
- Authority
- CN
- China
- Prior art keywords
- laser
- simulation
- simulation environment
- frame
- point cloud
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/06—Ray-tracing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Hardware Design (AREA)
- Remote Sensing (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Radar, Positioning & Navigation (AREA)
- Evolutionary Computation (AREA)
- Electromagnetism (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Traffic Control Systems (AREA)
Abstract
The application provides a point cloud distortion effect restoration method and device, an electronic device, and a storage medium, relating to the technical field of autonomous driving. The method comprises the following steps: if the autonomous driving simulation environment is a low-frame-rate simulation environment, controlling the lidar to simultaneously emit multiple groups of lasers along different angles according to the horizontal angular resolution, and determining the groups of lasers captured by each simulation frame; in each simulation frame, using a ray tracing algorithm to determine the collision point coordinates corresponding to each laser beam in that frame, correcting each beam's collision point coordinates according to the host vehicle's current velocity vector and the laser emission time difference corresponding to that beam, and determining the corrected collision point coordinates; and obtaining a point cloud with a distortion effect from the corrected collision point coordinates corresponding to each laser beam. By adding a distortion effect to the point cloud through the motion characteristics of the host vehicle, the application makes the autonomous driving simulation environment more realistic and improves the accuracy of subsequent operations.
Description
Technical Field
The present application relates to the field of autonomous driving technologies, and in particular, to a point cloud distortion effect restoration method, a point cloud distortion effect restoration apparatus, an electronic device, and a storage medium.
Background
In autonomous driving, the lidar point cloud plays a very important role in perception; the density of the point cloud it generates has made the lidar one of the most important sensing tools in the field. In developing autonomous driving simulation software, the lidar point cloud therefore needs to be simulated to provide data for verifying control algorithms and the perception and fusion algorithms used in real-world perception.
Ray tracing is a key technology for generating lidar point cloud output and has been widely applied to lidar simulation. One shortcoming of point clouds produced by ray tracing today is that they cannot reflect the distortion phenomenon: the ray tracing algorithm must operate within a fixed simulation frame, whereas a real scene is dynamic and the laser pulse emission of a scanning lidar is likewise dynamic. As a result, the point cloud generated by the ray tracing algorithm is too perfect to reflect a realistic simulation environment.
Disclosure of Invention
In view of the above, the present application aims to provide at least a point cloud distortion effect restoration method, an apparatus, an electronic device, and a storage medium that add a distortion effect to the point cloud through the motion characteristics of the host vehicle in both high and low frame rate simulation, thereby improving the accuracy of subsequent operations.
The application mainly comprises the following aspects:
in a first aspect, an embodiment of the present application provides a method for recovering a point cloud distortion effect, where the method includes:
acquiring a simulation frame rate corresponding to an autonomous driving simulation environment and judging, according to the simulation frame rate, whether the environment is an ultra-high frame rate simulation environment, wherein the environment contains a host vehicle carrying a lidar; if the environment is an ultra-high frame rate simulation environment, controlling the lidar to rotate and sequentially emit multiple lasers to obtain a point cloud with a distortion effect; if the environment is a low-frame-rate simulation environment, controlling the lidar to simultaneously emit multiple groups of lasers along different angles according to the horizontal angular resolution, and determining the groups of lasers captured by each simulation frame; in each simulation frame, using a ray tracing algorithm to determine the collision point coordinates corresponding to each laser beam in that frame, correcting each beam's collision point coordinates according to the host vehicle's current velocity vector and the laser emission time difference corresponding to that beam, and determining the corrected collision point coordinates; and obtaining a point cloud with a distortion effect from the corrected collision point coordinates corresponding to each laser beam.
In a preferred embodiment, whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment is determined by: acquiring the rotation frequency and horizontal angular resolution corresponding to the lidar; determining the laser emission time difference corresponding to the lidar according to the rotation frequency and the horizontal angular resolution; calculating the simulation time difference according to the simulation frequency; if the laser emission time difference is greater than or equal to the simulation time difference, determining that the environment is an ultra-high frame rate simulation environment; and if the laser emission time difference is smaller than the simulation time difference, determining that the environment is a low-frame-rate simulation environment.
In a preferred embodiment, the step of determining the laser emission time difference corresponding to the laser radar according to the rotation frequency and the horizontal angle resolution comprises: determining the total rotation time for the laser radar to rotate for one circle according to the rotation frequency corresponding to the laser radar; determining the total quantity of emitted laser of the laser radar rotating for one circle according to the horizontal angle resolution; the ratio between the total rotation time and the total number of emitted lasers is determined as the laser emission time difference.
In a preferred embodiment, if the autonomous driving simulation environment is an ultra-high frame rate simulation environment, the point cloud with the added distortion effect is obtained by: sequentially emitting lasers according to the laser emission time difference; for each emitted laser, performing the following processing: extracting the host vehicle surroundings information and the host vehicle position recorded by the simulation frame corresponding to the emitted laser; calculating, using a ray tracing algorithm together with the host vehicle surroundings information and the host vehicle position, the ray tracing data generated by the emitted laser in its corresponding simulation frame, the ray tracing data comprising the distance information and emission angle information between the host vehicle and the collision point in the surroundings; and after the lidar completes one revolution, performing polar coordinate conversion according to the distance information and emission angle information corresponding to each emitted laser to obtain the point cloud with the added distortion effect.
In a preferred embodiment, the corrected collision point coordinates for each laser beam are determined by: determining the product of the host vehicle's current velocity vector corresponding to the laser beam and the laser emission time difference as the host vehicle displacement vector corresponding to the laser beam; and determining the difference between the collision point coordinates and the host vehicle displacement vector as the corrected collision point coordinates corresponding to the laser beam.
In a preferred embodiment, the step of obtaining the point cloud with the distortion effect from the corrected collision point coordinates corresponding to each laser beam includes: performing polar coordinate conversion on the collision point coordinates corrected by each laser beam to obtain corresponding distance information and emission angle information under the laser beam; and outputting the point cloud with the added distortion effect by utilizing the distance information and the emission angle information corresponding to each laser beam.
In a second aspect, an embodiment of the present application further provides a point cloud distortion effect reduction apparatus, where the apparatus includes: a frame rate simulation environment judging module, configured to acquire the simulation frame rate corresponding to the autonomous driving simulation environment and judge, according to the simulation frame rate, whether the environment is an ultra-high frame rate simulation environment, wherein the environment contains a host vehicle carrying a lidar and the host vehicle is in a traveling state; a high frame rate distortion reduction module, configured to control the lidar to rotate and sequentially emit multiple lasers to obtain a point cloud with a distortion effect if the environment is an ultra-high frame rate simulation environment; a first low frame rate distortion reduction module, configured to control the lidar to simultaneously emit multiple groups of lasers along different angles according to the horizontal angular resolution and determine the groups of lasers captured by each simulation frame if the environment is a low-frame-rate simulation environment; a correction module, configured to determine, in each simulation frame and using a ray tracing algorithm, the collision point coordinates corresponding to each laser beam in that frame, correct each beam's collision point coordinates according to the host vehicle's current velocity vector and the laser emission time difference corresponding to that beam, and determine the corrected collision point coordinates; and a second low frame rate distortion reduction module, configured to obtain a point cloud with a distortion effect from the corrected collision point coordinates corresponding to each laser beam.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor, a memory, and a bus, where the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the point cloud distortion effect restoration method in the first aspect or any possible implementation thereof.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, performs the steps of the point cloud distortion effect restoration method in the first aspect or any possible implementation thereof.
According to the point cloud distortion effect restoration method and device, the electronic device, and the storage medium, if the autonomous driving simulation environment is a low-frame-rate simulation environment, the lidar is controlled to emit multiple groups of lasers along different angles according to the horizontal angular resolution, and the groups of lasers captured by each simulation frame are determined; in each simulation frame, a ray tracing algorithm is used to determine the collision point coordinates corresponding to each laser beam in that frame, each beam's collision point coordinates are corrected according to the host vehicle's current velocity vector and the laser emission time difference corresponding to that beam, and the corrected collision point coordinates are determined; and a point cloud with a distortion effect is obtained from the corrected collision point coordinates corresponding to each laser beam. In both high and low frame rate simulation, the application adds a distortion effect to the point cloud through the motion characteristics of the host vehicle, making the autonomous driving environment more realistic and improving the accuracy of subsequent operations.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 shows a flowchart of a point cloud distortion effect reduction method provided by an embodiment of the present application;
FIG. 2 shows a schematic diagram of the ultra-high frame rate simulation environment lidar action of the present application;
fig. 3 is a schematic structural diagram of a point cloud distortion effect reduction device according to an embodiment of the present application;
fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for the purpose of illustration and description only and are not intended to limit the scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this disclosure, illustrates operations implemented according to some embodiments of the present application. It should be appreciated that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to or removed from the flow diagrams by those skilled in the art under the direction of the present disclosure.
In addition, the described embodiments are only some, but not all, embodiments of the application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art based on embodiments of the application without making any inventive effort, fall within the scope of the application.
The point cloud output is an important sensor output and is consumed by a large number of algorithms. Most notably, ray tracing is used to detect the collision point along each laser pulse direction and output its coordinates to generate the point cloud; ray tracing is a key technology of current lidar point cloud output and has been widely applied to lidar simulation.
Ray tracing computes against the rendered environment and is grounded in computer graphics. One shortcoming of point clouds produced by ray tracing today is that they cannot reflect the distortion phenomenon: the ray tracing algorithm must operate within a fixed simulation frame, whereas the real scene is dynamic and the laser pulse emission of a scanning lidar is likewise dynamic. The resulting point cloud is therefore too perfect, which hinders verification of real-world perception algorithms.
Based on this, the embodiment of the application provides a point cloud distortion effect restoration method that adds a distortion effect to the point cloud through the motion characteristics of the host vehicle, making the autonomous driving environment more realistic and improving the accuracy of subsequent operations. The details are as follows:
referring to fig. 1, fig. 1 shows a flowchart of a method for recovering a point cloud distortion effect according to an embodiment of the present application. As shown in fig. 1, the method for restoring the point cloud distortion effect provided by the embodiment of the application comprises the following steps:
s100, acquiring a simulation frame rate corresponding to the automatic driving simulation environment.
And S200, judging whether the automatic driving simulation environment is an ultrahigh frame rate simulation environment according to the simulation frame rate.
The autonomous driving simulation environment includes a host vehicle carrying the lidar, and the host vehicle is in a traveling state.
And S300, if the automatic driving simulation environment is an ultra-high frame rate simulation environment, controlling the lidar to rotate and sequentially emit a plurality of lasers so as to obtain a point cloud with a distortion effect.
S400, if the automatic driving simulation environment is a low-frame-rate simulation environment, controlling the laser radar to simultaneously emit a plurality of groups of lasers along different angles according to the horizontal angle resolution, and determining a plurality of groups of lasers captured by each simulation frame.
S500, in each simulation frame, collision point coordinates corresponding to each beam of laser in the simulation frame are respectively determined by utilizing a ray tracing algorithm.
S600, in each simulation frame, correcting the collision point coordinate corresponding to each laser according to the current speed vector of the host vehicle corresponding to each laser and the laser emission time difference, and determining the corrected collision point coordinate.
And S700, obtaining a point cloud with a distortion effect from the corrected collision point coordinates corresponding to each laser beam.
In step S200, it is determined whether the autopilot simulation environment is an ultra-high frame rate simulation environment by:
the method comprises the steps of obtaining rotation frequency and horizontal angle resolution corresponding to a laser radar, determining laser emission time difference corresponding to the laser radar according to the rotation frequency and the horizontal angle resolution, calculating simulation time difference according to simulation frequency, determining that an automatic driving simulation environment is an ultrahigh frame rate simulation environment if the laser emission time difference is larger than or equal to the simulation time difference, and determining that the automatic driving simulation environment is a low frame rate simulation environment if the laser emission time difference is smaller than the simulation time difference.
Specifically, the rotation frequency of the lidar indicates the number of rotations it completes per second, and the horizontal angular resolution refers to the angular interval between two adjacent lasers of the lidar.
In another preferred embodiment, the step of determining the laser emission time difference corresponding to the lidar according to the rotation frequency and the horizontal angle resolution comprises:
and determining the total rotation time for the laser radar to rotate for one circle according to the corresponding rotation frequency of the laser radar, determining the total laser emission quantity of the laser radar to rotate for one circle according to the horizontal angle resolution, and determining the ratio of the total rotation time to the total laser emission quantity as the laser emission time difference.
The laser emission time difference can be calculated by the following formula:

$$\Delta t = \frac{1}{N \cdot f}$$

where $\Delta t$ denotes the laser emission time difference, $N$ denotes the total number of lasers emitted in one revolution of the lidar, and $f$ denotes the rotation frequency of the lidar.

The total number of lasers emitted in one revolution of the lidar is determined by the following formula:

$$N = \frac{360^\circ}{\theta}$$

where $\theta$ denotes the horizontal angular resolution of the lidar.

The simulation time difference is the inverse of the simulation frequency; specifically, $\Delta t_{sim} = 1/f_{sim}$, where $f_{sim}$ denotes the simulation frequency.
In the application, for different simulation frame rates, a suitable algorithm can be selected to reconstruct the point cloud distortion of the lidar. The judgment conditions are as follows. When

$$\Delta t \geq \Delta t_{sim},$$

the autonomous driving simulation environment is an ultra-high frame rate simulation environment, i.e., the simulation frequency is high enough that each laser beam can be computed in its own frame. When

$$\Delta t < \Delta t_{sim},$$

the autonomous driving simulation environment is a low-frame-rate simulation environment, i.e., the simulation frequency cannot keep up with the lidar's frame requirement, meaning some lasers cannot be captured by their own simulation frame.
Step S300 includes:
the laser light is emitted in turn according to the laser light emission time difference, and for each emitted laser light, the following processing is performed: and extracting host vehicle surrounding environment information and host vehicle position recorded by a simulation frame corresponding to the emitted laser, respectively calculating optical tracking data generated by the emitted laser in the corresponding simulation frame by using an optical tracking algorithm, the host vehicle surrounding environment information and the host vehicle position, wherein the optical tracking data comprises distance information and emission angle information between the host vehicle and surrounding environment collision points, and performing polar coordinate conversion according to the distance information and the emission angle information corresponding to each emitted laser after the laser radar rotates for one circle so as to obtain point clouds with added distortion effects.
The host vehicle surroundings information may include other vehicles or obstacles located around the host vehicle, which is not particularly limited herein.
Specifically, in prior-art ray tracing algorithms all laser rays are emitted synchronously and computed simultaneously, so no distortion effect arises. In the application, the lidar emits lasers sequentially according to its horizontal angular resolution; referring to fig. 2, fig. 2 shows a schematic diagram of the lidar's operation in the ultra-high frame rate simulation environment. As shown in fig. 2, the horizontal angular resolution of the lidar is 45°, giving laser beams 1-8: after the first beam is emitted, the second beam is emitted one laser emission time difference later, and so on until the lidar completes one revolution. Because of the laser emission time difference, the host vehicle carrying the lidar is displaced between two adjacent beams, ensuring that the host vehicle and its surroundings remain in dynamic change, so the point cloud output by the lidar is more realistic.
The distance information and emission angle information corresponding to each emitted laser beam are stored together, and after the lidar completes one revolution, the distance and emission angle information of all emitted lasers are output as a point cloud (i.e., polar coordinates are converted into rectangular coordinates).
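A sketch of this accumulate-then-convert step under the usual spherical-to-Cartesian convention (names are illustrative assumptions):

```python
import math

def polar_to_cartesian(distance: float, azimuth_rad: float,
                       elevation_rad: float) -> tuple:
    """Convert one (range, azimuth, elevation) return into x, y, z."""
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return x, y, z

# Accumulate (distance, azimuth, elevation) tuples beam by beam during one
# revolution, then emit the whole point cloud once the rotation completes.
returns = []  # filled per emitted beam while the lidar rotates
point_cloud = [polar_to_cartesian(d, az, el) for d, az, el in returns]
```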
In the application, since a laser's emission instant may fall between two simulation frames (the ultra-high frame rate condition guarantees at least one simulation frame per emitted laser), the simulation frame nearest to each emission instant is taken as the approximating frame for that beam.
In step S400, the insufficient simulation frame rate means the lidar cannot emit each laser in its own frame, so the algorithm for the ultra-high frame rate simulation environment above does not apply. To handle this case, the lidar is controlled to emit multiple groups of lasers along different angles according to the horizontal angular resolution, and an allocation index is then calculated, where the allocation index indicates the number of simulation frames generated in the autonomous driving simulation environment during one rotation of the lidar. Once the allocation index is obtained, all lasers emitted in one rotation of the lidar can be evenly allocated to the different simulation frames, determining the groups of lasers captured by each simulation frame.
The allocation index is determined by the following formula:

$$k = \frac{f_{sim}}{f}$$

where $k$ denotes the allocation index, $f_{sim}$ denotes the simulation frequency, and $f$ denotes the rotation frequency of the lidar.
At this time, each simulation frame computes not one laser beam but several, and if the ray tracing algorithm alone were used to calculate the point cloud, the result would carry no distortion effect.
In a preferred embodiment, in step S600, the corrected collision point coordinates corresponding to each laser beam are determined by:
and determining the product of the current speed vector of the host vehicle corresponding to the laser beam and the laser emission time difference as a host vehicle position vector corresponding to the laser beam, and determining the difference between the collision point coordinate and the host vehicle position vector as a corrected collision point coordinate corresponding to the laser beam.
Specifically, the corrected collision point coordinates corresponding to each laser beam are determined by the following formula:

$$P'_{i,j} = P_{i,j} - \vec{v}_j \cdot i\,\Delta t$$

where $P'_{i,j}$ denotes the corrected collision point coordinates corresponding to the $i$-th laser beam in the $j$-th simulation frame, $P_{i,j}$ denotes the collision point coordinates obtained after the $i$-th laser's collision in the $j$-th simulation frame, $\vec{v}_j$ denotes the current velocity vector of the host vehicle extracted in the $j$-th simulation frame, and $\Delta t$ denotes the laser emission time difference, so that $i\,\Delta t$ is the emission time offset of the $i$-th beam within the frame.
In the application, compared with directly using the collision point coordinates computed by ray tracing in each simulation frame, the method takes the host vehicle's velocity vector and the laser emission time difference into account and corrects the finally output collision point coordinates, so that a distortion effect can be added to the subsequent point cloud output.
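A sketch of this per-beam correction (the vector names and the per-beam offset $i\,\Delta t$ follow the formula above; all identifiers are assumptions):

```python
import numpy as np

def correct_collision_points(hits: np.ndarray, v_host: np.ndarray,
                             dt: float) -> np.ndarray:
    """Shift each ray-traced hit by the host displacement accrued since the
    frame start: P'_i = P_i - v_j * (i * dt)."""
    offsets = np.arange(len(hits))[:, None] * dt   # i * dt for beam i
    return hits - v_host[None, :] * offsets        # subtract host displacement

# Example: three hits, host moving 20 m/s along +x, 0.1 ms between beams.
hits = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [-10.0, 0.0, 0.0]])
corrected = correct_collision_points(hits, np.array([20.0, 0.0, 0.0]), 1e-4)
print(corrected)  # beam 0 unchanged; later beams shifted along -x
```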
In a preferred embodiment, step S700 includes:
and performing polar coordinate conversion on the collision point coordinates corrected by each laser beam to obtain corresponding distance information and emission angle information under the laser beam, and outputting a point cloud with a distortion effect by utilizing the distance information and the emission angle information corresponding to each laser beam.
Distortion handling remains important verification content for autonomous driving perception algorithms; if the point cloud generated by the lidar carries no distortion effect, the algorithm's distortion handling cannot be verified. Based on the principle of distortion, the application builds on the classical method of constructing point clouds with a ray tracing algorithm and applies different strategies according to the simulation environment indicated by the simulation frame rate, generating point clouds that carry a distortion effect and thus allowing the distortion handling of autonomous driving perception algorithms to be verified.
Based on the same inventive concept, the embodiment of the application also provides a point cloud distortion effect reduction apparatus corresponding to the method provided above. Since the principle by which the apparatus solves the problem is similar to that of the method, its implementation may refer to the implementation of the method, and repeated description is omitted.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a point cloud distortion effect reduction device according to an embodiment of the present application. As shown in fig. 3, the apparatus includes:
the frame rate simulation environment judging module 800 is configured to obtain a simulation frame rate corresponding to the autopilot simulation environment and judge whether the autopilot simulation environment is an ultra-high frame rate simulation environment according to the simulation frame rate, where a host vehicle carrying a laser radar is included in the autopilot simulation environment, and the host vehicle is in a traveling state;
the high frame rate distortion reduction module 810 is configured to control the laser radar to rotate and sequentially emit a plurality of lasers to obtain a point cloud with a distortion effect if the autopilot simulation environment is an ultra-high frame rate simulation environment;
the first low-frame-rate distortion recovery module 820 is configured to control the laser radar to simultaneously emit multiple groups of lasers along different angles according to the horizontal angle resolution and determine multiple groups of lasers captured by each simulation frame if the autopilot simulation environment is a low-frame-rate simulation environment;
the correction module 830 is configured to determine, in each simulation frame, a collision point coordinate corresponding to each laser beam in the simulation frame by using a light tracking algorithm, correct the collision point coordinate corresponding to each laser beam according to a current vehicle speed vector of the host vehicle corresponding to each laser beam and a laser emission time difference, and determine a corrected collision point coordinate;
the second low frame rate distortion recovery module 840 is configured to obtain a point cloud with a distortion effect from the corrected collision point coordinates corresponding to each laser beam.
Preferably, the frame rate simulation environment judging module 800 is further configured to: acquiring rotation frequency and horizontal angle resolution corresponding to the laser radar; determining a laser emission time difference corresponding to the laser radar according to the rotation frequency and the horizontal angle resolution; calculating a simulation time difference according to the simulation frequency; if the laser emission time difference is greater than or equal to the simulation time difference, determining that the automatic driving simulation environment is an ultrahigh frame rate simulation environment; and if the laser emission time difference is smaller than the simulation time difference, determining that the automatic driving simulation environment is a low-frame-rate simulation environment.
Preferably, the frame rate simulation environment judging module 800 is further configured to: determining the total rotation time for the laser radar to rotate for one circle according to the rotation frequency corresponding to the laser radar; determining the total quantity of emitted laser of the laser radar rotating for one circle according to the horizontal angle resolution; the ratio between the total rotation time and the total number of emitted lasers is determined as the laser emission time difference.
Preferably, the high frame rate distortion reduction module 810 is further configured to: sequentially emit lasers according to the laser emission time difference and, for each emitted laser, perform the following processing: extract the host vehicle surroundings information and the host vehicle position recorded by the simulation frame corresponding to the emitted laser; calculate, using a ray tracing algorithm together with the host vehicle surroundings information and the host vehicle position, the ray tracing data generated by the emitted laser in its corresponding simulation frame, where the ray tracing data include the distance information and emission angle information between the host vehicle and the collision point in the surroundings; and after the lidar completes one revolution, perform polar coordinate conversion according to the distance information and emission angle information corresponding to each emitted laser to obtain the point cloud with the added distortion effect.
Preferably, the correction module 830 is further configured to: determine the product of the host vehicle's current velocity vector corresponding to the laser beam and the laser emission time difference as the host vehicle displacement vector corresponding to that beam, and determine the difference between the collision point coordinates and the host vehicle displacement vector as the corrected collision point coordinates corresponding to that beam.
Preferably, the second low frame rate distortion reduction module 840 is further configured to: performing polar coordinate conversion on the collision point coordinates corrected by each laser beam to obtain corresponding distance information and emission angle information under the laser beam; and outputting the point cloud with the added distortion effect by utilizing the distance information and the emission angle information corresponding to each laser beam.
Based on the same inventive concept, please refer to fig. 4, which shows a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 900 includes a processor 910, a memory 920, and a bus 930. The memory 920 stores machine-readable instructions executable by the processor 910; when the electronic device runs, the processor 910 and the memory 920 communicate via the bus 930, and the machine-readable instructions, when executed by the processor 910, perform the steps of the point cloud distortion effect restoration method provided in any of the above embodiments.
Based on the same inventive concept, the embodiment of the application also provides a computer-readable storage medium storing a computer program, where the computer program, when run by a processor, performs the steps of the point cloud distortion effect restoration method provided by the above embodiments.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily appreciate variations or alternatives within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.
Claims (10)
1. A method for recovering a point cloud distortion effect, the method comprising:
acquiring a simulation frame rate corresponding to an automatic driving simulation environment, and judging whether the automatic driving simulation environment is an ultrahigh frame rate simulation environment according to the simulation frame rate, wherein a host vehicle carrying a laser radar is arranged in the automatic driving simulation environment, and the host vehicle is in a traveling state;
if the automatic driving simulation environment is an ultrahigh frame rate simulation environment, controlling the laser radar to rotate and sequentially emit a plurality of lasers so as to obtain a point cloud with a distortion effect;
if the automatic driving simulation environment is a low-frame-rate simulation environment, controlling the laser radar to simultaneously emit a plurality of groups of lasers along different angles according to the horizontal angle resolution, and determining a plurality of groups of lasers captured by each simulation frame;
in each simulation frame, utilizing a ray tracing algorithm to respectively determine the corresponding collision point coordinates of each beam of laser in the simulation frame, correcting the corresponding collision point coordinates of each beam of laser according to the current speed vector of the host vehicle corresponding to each beam of laser and the laser emission time difference, and determining the corrected collision point coordinates;
and obtaining the point cloud with the distortion effect by the corrected collision point coordinates corresponding to each laser beam.
2. The method of claim 1, wherein whether the automatic driving simulation environment is an ultrahigh frame rate simulation environment is determined by:
acquiring rotation frequency and horizontal angle resolution corresponding to the laser radar;
determining a laser emission time difference corresponding to the laser radar according to the rotation frequency and the horizontal angle resolution;
calculating a simulation time difference according to the simulation frequency;
if the laser emission time difference is larger than or equal to the simulation time difference, determining that the automatic driving simulation environment is an ultrahigh frame rate simulation environment;
and if the laser emission time difference is smaller than the simulation time difference, determining that the automatic driving simulation environment is a low-frame-rate simulation environment.
3. The method of claim 2, wherein determining a laser firing time difference corresponding to the lidar based on the rotational frequency and the horizontal angle resolution comprises:
determining the total rotation time for the laser radar to rotate for one circle according to the rotation frequency corresponding to the laser radar;
determining the total quantity of emitted laser of the laser radar rotating for one circle according to the horizontal angle resolution;
the ratio between the total rotation time and the total number of emitted lasers is determined as a laser emission time difference.
4. The method of claim 1, wherein, if the automatic driving simulation environment is an ultrahigh frame rate simulation environment, the point cloud with added distortion effects is obtained by:
sequentially emitting laser according to the laser emission time difference;
for each emitted laser light, the following processing is performed:
extracting the surrounding environment information of the host vehicle and the host vehicle position recorded by the simulation frame corresponding to the emitted laser;
respectively calculating ray tracing data generated by the emitted laser in the corresponding simulation frame by using a ray tracing algorithm, the host vehicle surrounding environment information and the host vehicle position, wherein the ray tracing data comprises distance information and emission angle information between the host vehicle and a collision point in the surrounding environment;
and after the laser radar rotates for one circle, performing polar coordinate conversion according to the distance information and the emission angle information corresponding to each emission laser to obtain a point cloud with the added distortion effect.
5. The method of claim 1, wherein the corrected collision point coordinates for each laser beam are determined by:
determining the product of the current speed vector of the host vehicle corresponding to the laser beam and the laser emission time difference as a host vehicle displacement vector corresponding to the laser beam;
and determining the difference value between the collision point coordinates and the host vehicle displacement vector as corrected collision point coordinates corresponding to the laser beam.
6. The method of claim 1, wherein the step of obtaining a point cloud with added distortion effects from the corrected collision point coordinates for each laser beam comprises:
performing polar coordinate conversion on the collision point coordinates corrected by each laser beam to obtain corresponding distance information and emission angle information under the laser beam;
and outputting the point cloud with the added distortion effect by utilizing the distance information and the emission angle information corresponding to each laser beam.
7. A point cloud distortion effect reduction apparatus, the apparatus comprising:
the frame rate simulation environment judging module is used for acquiring a simulation frame rate corresponding to an automatic driving simulation environment and judging whether the automatic driving simulation environment is an ultrahigh frame rate simulation environment according to the simulation frame rate, wherein a host vehicle carrying a laser radar is arranged in the automatic driving simulation environment, and the host vehicle is in a traveling state;
the high-frame-rate distortion reduction module is used for controlling the laser radar to rotate and sequentially emit a plurality of lasers to obtain a point cloud with a distortion effect if the automatic driving simulation environment is an ultrahigh-frame-rate simulation environment;
the first low-frame-rate distortion reduction module is used for controlling the laser radar to simultaneously emit a plurality of groups of lasers along different angles according to the horizontal angle resolution if the automatic driving simulation environment is a low-frame-rate simulation environment, and determining a plurality of groups of lasers captured by each simulation frame;
the correction module is used for respectively determining, in each simulation frame by using a ray tracing algorithm, the collision point coordinates corresponding to each beam of laser in the simulation frame, correcting the collision point coordinates corresponding to each beam of laser according to the current speed vector of the host vehicle corresponding to each beam of laser and the laser emission time difference, and determining the corrected collision point coordinates;
and the second low-frame-rate distortion reduction module is used for obtaining a point cloud with a distortion effect by the corrected collision point coordinates corresponding to each laser beam.
8. The apparatus of claim 7, wherein the frame rate simulation environment determination module is further configured to:
acquiring a rotation frequency and a horizontal angle resolution corresponding to the laser radar, wherein the rotation frequency indicates the number of rotations of the laser radar per second;
determining a laser emission time difference corresponding to the laser radar according to the rotation frequency and the horizontal angle resolution;
calculating a simulation time difference according to the simulation frequency;
if the laser emission time difference is larger than or equal to the simulation time difference, determining that the automatic driving simulation environment is an ultrahigh frame rate simulation environment;
and if the laser emission time difference is smaller than the simulation time difference, determining that the automatic driving simulation environment is a low-frame-rate simulation environment.
9. An electronic device, comprising: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating via said bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the point cloud distortion effect reduction method according to any of claims 1 to 6.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the point cloud distortion effect reduction method according to any of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310913118.2A CN116644616B (en) | 2023-07-25 | 2023-07-25 | Point cloud distortion effect reduction method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310913118.2A CN116644616B (en) | 2023-07-25 | 2023-07-25 | Point cloud distortion effect reduction method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116644616A CN116644616A (en) | 2023-08-25 |
CN116644616B true CN116644616B (en) | 2023-09-22 |
Family
ID=87623366
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310913118.2A Active CN116644616B (en) | 2023-07-25 | 2023-07-25 | Point cloud distortion effect reduction method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116644616B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111221334A (en) * | 2020-01-17 | 2020-06-02 | 清华大学 | Environmental sensor simulation method for simulating automatic driving automobile |
CN115081240A (en) * | 2022-07-14 | 2022-09-20 | 浙江大学 | Point cloud data processing method for improving authenticity of simulated laser radar data |
CN115081195A (en) * | 2022-06-06 | 2022-09-20 | 北京易航远智科技有限公司 | Laser radar simulation method and device, electronic equipment and storage medium |
CN115308754A (en) * | 2022-07-18 | 2022-11-08 | 襄阳达安汽车检测中心有限公司 | Laser radar point cloud simulation time delay test method and system |
CN115731350A (en) * | 2022-11-23 | 2023-03-03 | 北京宾理信息科技有限公司 | Simulation method and device for virtual laser radar of vehicle |
CN115997234A (en) * | 2020-12-31 | 2023-04-21 | 华为技术有限公司 | Pose estimation method and related device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11016496B2 (en) * | 2019-04-10 | 2021-05-25 | Argo AI, LLC | Transferring synthetic LiDAR system data to real world domain for autonomous vehicle training applications |
- 2023-07-25: CN application CN202310913118.2A filed; patent CN116644616B active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111221334A (en) * | 2020-01-17 | 2020-06-02 | 清华大学 | Environmental sensor simulation method for simulating automatic driving automobile |
CN115997234A (en) * | 2020-12-31 | 2023-04-21 | 华为技术有限公司 | Pose estimation method and related device |
CN115081195A (en) * | 2022-06-06 | 2022-09-20 | 北京易航远智科技有限公司 | Laser radar simulation method and device, electronic equipment and storage medium |
CN115081240A (en) * | 2022-07-14 | 2022-09-20 | 浙江大学 | Point cloud data processing method for improving authenticity of simulated laser radar data |
CN115308754A (en) * | 2022-07-18 | 2022-11-08 | 襄阳达安汽车检测中心有限公司 | Laser radar point cloud simulation time delay test method and system |
CN115731350A (en) * | 2022-11-23 | 2023-03-03 | 北京宾理信息科技有限公司 | Simulation method and device for virtual laser radar of vehicle |
Also Published As
Publication number | Publication date |
---|---|
CN116644616A (en) | 2023-08-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |