WO2020133230A1 - Radar simulation method, device and system - Google Patents

Radar simulation method, device and system

Info

Publication number
WO2020133230A1
Authority
WO
WIPO (PCT)
Prior art keywords
lidar sensor
target pixel
target
depth value
detection result
Prior art date
Application number
PCT/CN2018/124924
Other languages
English (en)
French (fr)
Inventor
郑石真
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201880042394.3A (published as CN110809723A)
Priority to PCT/CN2018/124924 (published as WO2020133230A1)
Publication of WO2020133230A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications, for mapping or imaging

Definitions

  • the invention relates to the technical field of radar, and in particular to a radar simulation method, device and system.
  • lidar is a radar system that uses a laser beam to detect a target's position, speed, and other characteristics. It is widely used on aircraft and also in the latest autonomous driving technology. At present, the development of algorithms related to lidar sensors relies on real lidar sensors, which are expensive, come in diverse specifications, are difficult to install, and make data collection inconvenient.
  • Embodiments of the present invention provide a radar simulation method, device, and system to solve the prior-art problem that algorithm development depends on real lidar sensors and therefore has a long development cycle.
  • an embodiment of the present invention provides a radar simulation method, including:
  • determining, according to the emission direction of each laser ray among the multiple laser rays of the lidar sensor, the target pixel sampled by each laser ray in a target image, where the target image corresponds to the detection range of the lidar sensor at the current sampling time;
  • the target image includes a depth map, and the depth value of a pixel in the depth map represents the distance between the pixel and the lidar sensor;
  • outputting the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels.
  • an embodiment of the present invention provides a radar simulation device, including: a processor and a memory;
  • the memory is used to store program codes
  • the processor calls the program code, and when the program code is executed, it is used to perform the following operations:
  • determining, according to the emission direction of each laser ray among the multiple laser rays of the lidar sensor, the target pixel sampled by each laser ray in a target image, where the target image corresponds to the detection range of the lidar sensor at the current sampling time;
  • the target image includes a depth map, and the depth value of a pixel in the depth map represents the distance between the pixel and the lidar sensor;
  • outputting the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels.
  • an embodiment of the present invention provides a simulation system, including: the radar simulation device according to any one of the above-mentioned second aspects.
  • an embodiment of the present invention provides a computer-readable storage medium that stores a computer program.
  • the computer program includes at least one piece of code.
  • the at least one piece of code can be executed by a computer to control the computer
  • the computer executes the radar simulation method according to any one of the first aspect.
  • an embodiment of the present invention provides a computer program that, when executed by a computer, is used to implement the radar simulation method according to any one of the first aspects described above.
  • the radar simulation method, device, and system provided by the embodiments of the present invention determine, according to the emission direction of each laser ray among the multiple laser rays of the lidar sensor, the target pixel sampled by each laser ray in the target image, and output the detection result of the lidar sensor according to the depth value of each of those target pixels. The point sampled by each laser ray of the lidar sensor can thus be obtained through simulation, and the detection result of the lidar sensor can be further simulated, realizing the simulation of the lidar sensor and avoiding the long development cycle caused by relying on a real lidar sensor during algorithm development.
  • Figure 1A is a top view of a physical model of a real lidar sensor
  • Figure 1B is a side view of a physical model of a real lidar sensor
  • FIG. 2 is a schematic flowchart of a radar simulation method provided by an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a radar simulation method provided by another embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention.
  • FIG. 5 is a schematic diagram of N images provided by an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of target pixel points in an image provided by an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a radar simulation device provided by an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a simulation system provided by an embodiment of the present invention.
  • the radar simulation method provided by the embodiment of the present invention can realize the simulation of the lidar sensor, and the detection result of the lidar sensor can be obtained through simulation by means of software calculation.
  • the radar simulation method provided by the embodiments of the present invention can be applied to any algorithm development scenario that relies on lidar sensors, which avoids dependence on real lidar sensors during development, thereby solving the long-development-cycle problem caused by that dependence.
  • FIG. 1A is a top view of a physical model of a real lidar sensor
  • FIG. 1B is a side view of a physical model of a real lidar sensor.
  • a real lidar sensor rotates around the Z axis at a fixed angular velocity.
  • multiple laser beams are emitted for detection, and the distribution of the multiple laser beams may be as shown in FIG. 1B.
  • the emission directions of the multiple laser rays are different, so as to detect a certain field of view (FOV, Field of View) range centered on the lidar sensor; the X-axis direction lies within that field of view.
  • n sets of sampling points scanned at equal angular intervals within a unit time T can be obtained; according to the rotation angular velocity of the lidar sensor, the sweep angle a of the lidar sensor per unit time T can be obtained.
  • the following description takes as an example a lidar sensor that emits 16 laser rays, with an angle of 2 degrees between two adjacent laser rays and an FOV of 30 degrees.
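  • As a rough sketch of the geometry above, the following Python snippet computes unit emission-direction vectors for the example sensor (16 rays, 2° spacing, 30° FOV, rotating about the Z axis). The function name and parameters are illustrative, not from the patent:

```python
import math

def ray_directions(num_rays=16, vertical_spacing_deg=2.0, fov_deg=30.0,
                   azimuth_deg=0.0):
    """Unit direction vectors for one azimuth position of a spinning lidar.

    Rays are spread symmetrically about the X axis across the vertical FOV;
    the sensor rotates about the Z axis, so `azimuth_deg` selects the
    current sweep position. All names here are assumptions for illustration.
    """
    assert (num_rays - 1) * vertical_spacing_deg <= fov_deg
    start = -(num_rays - 1) * vertical_spacing_deg / 2.0
    az = math.radians(azimuth_deg)
    dirs = []
    for i in range(num_rays):
        el = math.radians(start + i * vertical_spacing_deg)
        # spherical-to-Cartesian: elevation measured from the XY plane,
        # azimuth measured about the Z axis
        dirs.append((math.cos(el) * math.cos(az),
                     math.cos(el) * math.sin(az),
                     math.sin(el)))
    return dirs
```

  • With the example figures, the 16 elevations run from -15° to +15°, spanning exactly the 30° FOV.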
  • FIG. 2 is a schematic flowchart of a radar simulation method according to an embodiment of the present invention.
  • the execution subject of this embodiment may be a radar simulation device for implementing radar simulation, and may specifically be a processor of the radar simulation device.
  • the method of this embodiment may include:
  • Step 201 according to the emission direction of each laser beam among the multiple laser beams of the lidar sensor, determine the target pixel point sampled by each laser beam in the target image.
  • the target image is an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image includes a depth map, and the depth values of pixels in the depth map represent the distances between the pixels and the lidar sensor.
  • the target image corresponding to the detection range of the lidar sensor at the current sampling time may be understood as: the target image may include an image corresponding to the detection range of the lidar sensor at the current sampling time.
  • the laser beam emitted by the lidar sensor is centered on the lidar sensor, so the target image corresponding to the detection range is also centered on the lidar sensor.
  • the target pixel point sampled by each laser beam in the target image can be determined.
  • the target pixel points sampled by the laser beams in the target image are the points sampled by the laser beams of the laser radar sensor obtained by simulation.
  • the target images corresponding to the detection ranges at multiple consecutive sampling moments may be the same.
  • the processing can be simplified.
  • the current sampling time can be determined according to the above sampling frequency.
  • Sampling frequency also known as sampling speed or sampling rate, defines the number of samplings per second and can be expressed in Hertz (Hz).
  • for example, a sampling frequency of 1 Hz corresponds to one sampling time every second.
  • the number of target pixels may be the same as the number of laser rays of the lidar sensor.
  • the present invention may not limit the acquisition method of the target image.
  • it can be obtained by camera shooting, or it can also be obtained by rendering.
  • step 202 the detection result of the lidar sensor is output according to the depth value of each of the plurality of target pixels.
  • according to the depth value of each target pixel, the distance between each target pixel and the lidar sensor can be obtained.
  • since a target pixel is the point sampled by a laser ray of the simulated lidar sensor, the distance between each target pixel and the lidar sensor is the distance between the point sampled by each laser ray of the simulated lidar sensor and the lidar sensor.
  • the output detection result of the lidar sensor may include the distance between each target pixel and the lidar sensor.
  • the detection result of the lidar sensor may further include the intensity of the reflected signal of each target pixel point to the laser beam. Further optionally, the intensity of the reflected signal of each target pixel point to the laser beam may be the same, for example, all are the preset intensity.
  • the depth value of a target pixel can be used directly as the distance between the target pixel and the lidar sensor; alternatively, the depth value of the target pixel can be mathematically converted to obtain the distance between the target pixel and the lidar sensor.
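  • One common case where the mathematical conversion (rather than direct use) applies is a renderer's plane-parallel depth map, whose values measure distance along the camera axis rather than along the pixel's view ray. A sketch of that conversion, assuming illustrative pinhole-camera intrinsics (`cx`, `cy`, `focal_px`) that are not specified in the patent:

```python
import math

def pixel_range(planar_depth, px, py, cx, cy, focal_px):
    """Convert a plane-parallel depth value to a Euclidean range.

    A renderer's z-buffer often stores distance along the camera axis; a
    lidar return is the straight-line range, so we rescale by the length
    of the pixel's view ray for unit axial depth.
    """
    ray_len = math.sqrt(1.0 + ((px - cx) / focal_px) ** 2
                            + ((py - cy) / focal_px) ** 2)
    return planar_depth * ray_len
```

  • At the image center the two quantities coincide; off-axis pixels get a range larger than the stored depth.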
  • in this embodiment, the target pixel sampled by each laser ray in the target image is determined according to the emission direction of each laser ray, and the detection result of the lidar sensor is output according to the depth value of each of the multiple target pixels. The point sampled by each laser ray of the lidar sensor can thus be obtained through simulation, and the detection result can be further obtained through simulation, realizing the simulation of the lidar sensor and avoiding problems such as the long development cycle caused by relying on a real lidar sensor during algorithm development.
  • FIG. 3 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention. Based on the embodiment shown in FIG. 2, this embodiment mainly describes an optional implementation manner of step 202. As shown in FIG. 3, the method of this embodiment may include:
  • Step 301 Determine a target pixel point sampled by each laser beam in a target image according to the emission direction of each laser beam among the multiple laser beams of the lidar sensor.
  • step 301 is similar to step 201 and will not be repeated here.
  • Step 302 Output the detection result of the lidar sensor according to the depth value of each of the plurality of target pixels and the intensity of the reflected signal of each target pixel to the laser beam.
  • the detection result of the lidar sensor that is output may include the distance between each target pixel and the lidar sensor, and the reflected signal intensity of each target pixel for the laser ray.
  • the correspondence between each pixel in the target image and the intensity of the reflected signal may be stored.
  • the intensity of the reflected signal corresponding to a pixel is the intensity of the reflected signal of the pixel for the laser beam.
  • the correspondence between each pixel point and the intensity of the reflected signal may be set in advance, or may be set by the user.
  • the target image further includes: a reflectivity map, where the reflectivity of a pixel in the reflectivity map represents the reflectivity of the pixel for the laser ray.
  • the method of this embodiment may further include the following steps A and B.
  • Step A according to the reflectance map, determine the reflectance of each of the plurality of target pixels to the laser beam.
  • for example, assume that the target image includes 100 pixels, pixel 1 to pixel 100;
  • the depth map includes the depth value of each of the 100 pixels;
  • the reflectivity map includes the reflectivity of each of the 100 pixels.
  • if the target pixels are pixel 10, pixel 20, pixel 30, and pixel 40,
  • then the reflectivity of each of pixel 10, pixel 20, pixel 30, and pixel 40 for the laser ray can be obtained from the reflectivity map.
  • Step B: according to the depth value of each target pixel among the multiple target pixels and the reflectivity of each target pixel for the laser ray, determine the reflected signal intensity of each target pixel for the laser ray.
  • for a real lidar sensor, the reflected signal intensity of a sampled point for a laser ray is related to the distance between the sampled point and the lidar sensor and to the reflectivity of the sampled point for the laser ray; therefore, the reflected signal intensity of a target pixel for the laser ray can be determined from the depth value of the target pixel and the reflectivity of the target pixel for the laser ray.
  • the specific method for determining the intensity of the reflected signal of the target pixel to the laser beam is not limited by the present invention.
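  • Since the patent deliberately leaves the intensity formula open, the following is only one plausible sketch: reflected intensity modeled as reflectivity scaled by inverse-square range falloff. The function name and the constant `emitted_power` are assumptions, not from the patent:

```python
def reflected_intensity(depth, reflectivity, emitted_power=1.0, eps=1e-6):
    """Toy return-intensity model combining the two factors named above:
    distance (via inverse-square falloff) and surface reflectivity.
    `eps` guards against division by zero at zero depth.
    """
    return emitted_power * reflectivity / max(depth, eps) ** 2
```

  • Any monotone-decreasing function of range would satisfy the qualitative relationship described in the text; the inverse-square choice merely mirrors free-space power decay.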
  • the above output result may also include the type of obstacle to which the target pixel belongs.
  • the correspondence between each pixel in the target image and the type of obstacle may be stored, and the type of obstacle corresponding to a pixel is the type of obstacle to which the pixel belongs. Further optionally, the correspondence between each pixel and the type of obstacle may be set in advance, or may be set by the user.
  • the target image further includes: an identification map, where the identifier of a pixel in the identification map indicates the type of obstacle to which the pixel belongs.
  • for example, assume that the target image includes 100 pixels, pixel 1 to pixel 100;
  • the depth map includes the depth value of each of the 100 pixels, and the identification map includes the type of obstacle to which each of the 100 pixels belongs.
  • if the target pixels are pixel 10, pixel 20, pixel 30, and pixel 40, the type of obstacle to which each of pixel 10, pixel 20, pixel 30, and pixel 40 belongs can be obtained from the identification map.
  • the outputting of the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels includes: outputting the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels and the type of obstacle to which each target pixel belongs.
  • the output detection result of the lidar sensor may include the distance between each target pixel and the radar sensor, and the type of obstacle to which each target pixel belongs.
  • the outputting of the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray includes: outputting the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels, the reflected signal intensity of each target pixel for the laser ray, and the type of obstacle to which each target pixel belongs.
  • the output detection result of the lidar sensor may include the distance between each target pixel and the radar sensor, the intensity of the reflected signal of each target pixel to the laser beam, and the type of obstacle to which each target pixel belongs.
  • in this embodiment, the target pixel sampled by each laser ray in the target image is determined, and the detection result of the lidar sensor is output according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray. This realizes the simulation of the reflectivity of the points sampled by a real lidar sensor, so that the simulated detection result of the lidar sensor includes both the distance between each target pixel and the lidar sensor and the reflected signal intensity of each target pixel for the laser ray.
  • FIG. 4 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention. Based on the embodiments shown in FIG. 2 and FIG. 3, this embodiment mainly describes how to simulate the impact of noise on the detection result of the lidar sensor. As shown in FIG. 4, the method of this embodiment may include:
  • Step 401 according to the emission direction of each laser beam among the multiple laser beams of the lidar sensor, determine the target pixel point sampled by each laser beam in the target image.
  • step 401 is similar to step 201 and will not be repeated here.
  • step 402: according to a noise signal, add noise to the depth value of each target pixel among the multiple target pixels to obtain the noised depth value of each target pixel.
  • the actual detection result of a real lidar sensor is not completely accurate; it is usually affected by noise.
  • the depth value of a target pixel indicates the accurate distance between the target pixel and the lidar sensor; therefore, to improve the realism of the simulation, noise can be added to the depth value of the target pixel. It can be understood that after noise is added to the depth value of the target pixel, the detection result output according to that depth value includes the influence of the noise.
  • step 402 may specifically include: according to the noise signal, adding noise to the depth value of each target pixel among the multiple target pixels and to the reflected signal intensity of each target pixel for the laser ray, to obtain the noised depth value of each target pixel and the noised reflected signal intensity of each target pixel for the laser ray.
  • for the depth value and the reflected signal intensity, the same noise signal can be used, or different noise signals can be used; the present invention does not limit this.
  • the noise signal may specifically be a Gaussian white noise signal.
  • the detection result includes the distance after noise addition.
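  • The noise-addition step can be sketched as follows, using the zero-mean Gaussian white noise suggested above; the function name and the standard deviation `sigma` are assumed tuning choices, not values from the patent:

```python
import random

def add_gaussian_noise(values, sigma, seed=None):
    """Add zero-mean Gaussian white noise to each depth (or reflected
    intensity) value. A fixed `seed` makes a simulation run repeatable.
    """
    rng = random.Random(seed)
    return [v + rng.gauss(0.0, sigma) for v in values]
```

  • The same helper can be applied to the depth values and the intensities, with the same or different `sigma`, matching the "same or different noise signals" option above.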
  • Step 403: output the detection result of the lidar sensor according to the noised depth value of each target pixel.
  • the noised depth value of each target pixel can be used as the distance between each target pixel and the lidar sensor in the detection result; alternatively, the noised depth value of each target pixel can be mathematically converted to obtain the distance between each target pixel and the lidar sensor in the detection result.
  • step 403 may specifically include: outputting the detection result of the lidar sensor according to the noised depth value of each target pixel and the noised reflected signal intensity of each target pixel for the laser ray.
  • the detection result may be encapsulated into a result satisfying the data protocol according to the data protocol of the lidar sensor.
  • step 403 may specifically include: packaging the noised depth value of each target pixel according to the data protocol of the lidar sensor to obtain a packaged result, and outputting the packaged result as the detection result of the lidar sensor.
  • alternatively, step 403 may specifically include: packaging, according to the data protocol of the lidar sensor, the noised depth value of each target pixel and the noised reflected signal intensity of each target pixel for the laser ray to obtain a packaged result, and outputting the packaged result as the detection result of the lidar sensor.
  • similarly, when no noise is added, the depth value of each target pixel can be packaged according to the data protocol of the lidar sensor to obtain a packaged result, and the packaged result is output as the detection result of the lidar sensor.
  • the data protocol of the lidar sensor is not limited by the present invention.
  • a specific header can be added.
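  • Since the patent does not fix a data protocol, the sketch below invents a minimal layout purely for illustration: a 4-byte header, a point count, then little-endian float pairs of (distance, intensity). The header bytes and field order are assumptions, not any real sensor's protocol:

```python
import struct

def package_points(points, header=b"LIDR"):
    """Package (distance, intensity) pairs with an illustrative header.

    Layout (all invented for illustration): 4-byte magic, uint32 point
    count, then one little-endian float pair per point.
    """
    payload = struct.pack("<4sI", header, len(points))
    for dist, inten in points:
        payload += struct.pack("<ff", dist, inten)
    return payload
```

  • A consumer following the same layout can unpack the count from bytes 4..8 and each pair at offset 8 + 8*i.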
  • in this embodiment, the noised depth value of each target pixel among the multiple target pixels is obtained, and the detection result of the lidar sensor is output according to the noised depth value of each target pixel, which improves the realism of the simulation.
  • the outputting of the detection result of the lidar sensor includes: sending the detection result of the lidar sensor to an algorithm system; or storing the detection result of the lidar sensor as a local file.
  • the algorithm system may specifically refer to an algorithm system under development or an algorithm system under test.
  • the detection results including the distance between each target pixel and the lidar sensor can be sent to the algorithm system or stored as a local file.
  • the detection result including the distance between each target pixel after noise addition and the lidar sensor may be sent to the algorithm system or stored as a local file.
  • the multiple laser rays of the lidar sensor referred to above may be all of the laser rays of the lidar sensor, or may be a subset of all the laser rays of the lidar sensor.
  • when the multiple laser rays are a subset of all the laser rays of the lidar sensor, the damage of one or more laser rays in a real lidar sensor can be simulated, improving the flexibility of the simulation.
  • the target image may correspond to a certain stereo scene.
  • specifically, the target image may be at least one of N images corresponding to all detection ranges traversed in one scanning cycle at the position of the lidar sensor in the stereo scene at the current sampling time, where N is an integer greater than 2.
  • N may be equal to 4.
  • the boundary pixels of two adjacent images may overlap.
  • when N is equal to 4, the horizontal field angle of each of the N images is equal to the sum of 90° and an offset angle, where the offset angle is a positive number.
  • the N images may specifically be image 1, image 2, image 3, and image 4.
  • the following takes the N images being planar images as an example.
  • the N images may also be stereoscopic images, such as fan-shaped stereoscopic images or spherical stereoscopic images.
  • all the target pixels sampled by the lidar sensor on one of the N images may be as shown in FIG. 6.
  • the target pixel point of each column may specifically be the target pixel point obtained at a sampling time.
  • here, the horizontal field angle of the image is taken to be equal to the field angle of the detection range of the lidar sensor, namely 30 degrees, as an example.
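  • The column-per-sampling-instant layout above implies a mapping from a ray's angles to the pixel it samples. A sketch under the assumption of equal angular spacing across the image (the 30° figures are the example values from the text; the function itself is an assumption):

```python
def angle_to_pixel(azimuth_deg, elevation_deg, width, height,
                   h_fov_deg=30.0, v_fov_deg=30.0):
    """Map a ray's angles (relative to the image center) to a pixel.

    Assumes angles are spread uniformly across the image, so one column
    corresponds to one sampling instant and one row to one laser ray.
    """
    col = int((azimuth_deg / h_fov_deg + 0.5) * (width - 1) + 0.5)
    row = int((elevation_deg / v_fov_deg + 0.5) * (height - 1) + 0.5)
    return row, col
```

  • A ray at the center of the FOV lands on the center pixel; the extreme angles land on the first and last columns and rows.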
  • the target image may be two of the N images.
  • the above output of the detection result of the lidar sensor according to the depth value of each of the plurality of target pixels can specifically include the following steps C and D.
  • Step C Determine the joint depth value of each target pixel according to the depth value of each target pixel of the two images.
  • Step D according to the joint depth value of each target pixel, output the detection result of the lidar sensor.
  • the output detection result of the lidar sensor may include the distance from the lidar sensor indicated by the joint depth value of each target pixel.
  • it may include the joint depth value of each target pixel.
  • when the detection result of the lidar sensor is also output according to the reflected signal intensity of each target pixel for the laser ray, the following step may be further included before step D: determining, according to the reflected signal intensity of each target pixel for the laser ray in each of the two images, the joint reflected signal intensity of each target pixel for the laser ray.
  • step D may specifically include: outputting the detection result of the lidar sensor according to the joint depth value of each target pixel and the joint reflection signal intensity of the laser beam corresponding to each target pixel.
  • the output detection result of the lidar sensor may include the distance from the lidar sensor indicated by the joint depth value of each target pixel, and the joint reflection signal strength of each target pixel for the laser beam.
  • the joint depth value and the joint reflection signal strength may be denoised according to the noise signal to obtain the joint depth value after noise addition and the joint reflection signal strength after noise addition.
  • the present invention may not limit the specific method for determining the joint depth value of a target pixel.
  • the depth values of the target pixels in the two images may be weighted and summed to obtain a joint depth value.
  • the depth values of the target pixels in the two images may be averaged to obtain a joint depth value, that is, the joint depth value is an average value of depth values.
  • the present invention may not limit the specific manner of determining the combined reflected signal intensity of a target pixel.
  • the reflected signal strength of the target pixel in the two images may be weighted and summed to obtain a joint reflected signal strength.
  • the reflected signal intensity value of the target pixel in the two images may be averaged to obtain a combined reflected signal intensity, that is, the combined reflected signal intensity is an average value of the reflected signal intensity.
  • the specific method of determining the joint reflected signal intensity of a target pixel may be the same as or different from the specific method of determining the joint depth value of the target pixel, which is not limited by the present invention.
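  • With equal weights, the weighted sum described above reduces to the plain average; a minimal sketch covering both the joint depth value and the joint reflected signal intensity (the weight parameters are assumptions):

```python
def joint_value(v1, v2, w1=0.5, w2=0.5):
    """Combine two per-image measurements (depth or reflected intensity)
    for the same target pixel by a weighted sum. With the default equal
    weights this is the plain average mentioned in the text.
    """
    assert abs(w1 + w2 - 1.0) < 1e-9  # weights should sum to 1
    return w1 * v1 + w2 * v2
```

  • The same helper can then be reused for both quantities, whether or not the two combination methods are chosen to be identical.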
  • the offset angle may be equal to 2°.
  • the position of the lidar sensor in the stereo scene may be fixed or may change. It can be understood that when the position of the lidar sensor in the stereo scene at the current sampling time is changed relative to the position of the lidar sensor in the stereo scene at the last sampling time, the position of the lidar sensor The detection range may change, so the N images change accordingly.
  • the movement of the radar sensor can be simulated by simulating the movement of the carrier of the radar sensor. Specifically, the position of the lidar sensor in the stereo scene changes with the movement of the carrier of the lidar sensor.
  • the carrier may specifically be any device capable of carrying a lidar sensor.
  • the carrier includes a drone or a car.
  • a computer-readable storage medium is also provided in an embodiment of the present invention.
  • the computer-readable storage medium stores program instructions; when the program is executed, some or all of the steps of the radar simulation methods in the foregoing method embodiments may be performed.
  • An embodiment of the present invention provides a computer program which, when executed by a computer, is used to implement the radar simulation method in any of the above method embodiments.
  • the radar simulation device 700 of this embodiment may include: a memory 701 and a processor 702, which are connected.
  • the memory 701 may include a read-only memory and a random access memory, and provide instructions and data to the processor 702.
  • a portion of the memory 701 may also include non-volatile random access memory.
  • the memory 701 is used to store program codes.
  • the processor 702 calls the program code, and when the program code is executed, it is used to perform the following operations:
  • determine, according to the emission direction of each laser ray among the multiple laser rays of the lidar sensor, the target pixel sampled by each laser ray in a target image, where the target image corresponds to the detection range of the lidar sensor at the current sampling time;
  • the target image includes a depth map, and the depth value of a pixel in the depth map represents the distance between the pixel and the lidar sensor;
  • output the detection result of the lidar sensor according to the depth value of each target pixel among the multiple target pixels.
  • the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the target pixels specifically includes:
  • the detection result of the lidar sensor is output according to the depth value of each of the plurality of target pixels and the reflected signal intensity of each target pixel for the laser ray.
  • the target image further includes a reflectance map, where the reflectance of a pixel in the reflectance map represents the pixel's reflectance for the laser ray;
  • the processor 702 is also used to:
  • determine, according to the reflectance map, the reflectance of each of the plurality of target pixels for the laser ray;
  • determine the reflected signal intensity of each target pixel for the laser ray according to the depth value of each of the plurality of target pixels and the reflectance for the laser ray corresponding to each target pixel.
  • the target image further includes a label map, where the label of a pixel in the label map indicates the type of obstacle to which the pixel belongs;
  • the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the plurality of target pixels and the reflected signal intensity of each target pixel for the laser ray specifically includes:
  • the detection result of the lidar sensor is output according to the depth value of each of the plurality of target pixels, the reflected signal intensity of each target pixel for the laser ray, and the type of obstacle to which each target pixel belongs.
  • the processor 702 is further configured to add noise, according to a noise signal, to the depth value of each of the plurality of target pixels and to the reflected signal intensity of each target pixel for the laser ray, to obtain the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray;
  • the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the plurality of target pixels and the reflected signal intensity of each target pixel for the laser ray specifically includes:
  • the detection result of the lidar sensor is output according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray.
  • the noise signal is a white Gaussian noise signal.
  • the processor 702 being configured to output the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray specifically includes:
  • the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel are packaged according to the data protocol of the lidar sensor to obtain a packaged result;
  • the detection result of the lidar sensor is output, the detection result of the lidar sensor being the packaged result.
  • the processor 702 being configured to output the detection result of the lidar sensor specifically includes: sending the detection result of the lidar sensor to an algorithm system; or storing the detection result of the lidar sensor as a local file.
  • the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in a stereo scene, of the lidar sensor at the current sampling time, where N is an integer greater than 2.
  • the N images are plane images or stereo images.
  • N is equal to 4.
  • the horizontal field angle of each of the N images is equal to the sum of 90° and the offset angle, and the offset angle is a positive number.
  • the offset angle is equal to 2°.
  • the target image is two of the N images;
  • the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the plurality of target pixels and the reflected signal intensity of each target pixel for the laser ray specifically includes:
  • the joint depth value of each target pixel is determined according to the depth values of each target pixel in the two images; the joint reflected signal intensity of each target pixel for the laser ray is determined according to the reflected signal intensities of each target pixel in the two images; and the detection result of the lidar sensor is output according to the joint depth value of each target pixel and the joint reflected signal intensity of the laser ray corresponding to each target pixel.
  • the joint depth value is an average value of depth values.
  • the joint reflected signal intensity is an average value of the reflected signal intensities.
  • when the position of the lidar sensor in the stereo scene at the current sampling time changes relative to its position at the previous sampling time, the N images change accordingly.
  • the position of the lidar sensor in the stereo scene changes with the movement of the carrier of the lidar sensor.
  • the carrier includes a drone or a vehicle.
  • the plurality of laser rays are a part of laser rays among all laser rays of the lidar sensor.
  • the radar simulation device provided in this embodiment can be used to execute the technical solutions of the above method embodiments of the present invention, and its implementation principles and technical effects are similar, and are not repeated here.
  • FIG. 8 is a schematic structural diagram of a simulation system provided by an embodiment of the present invention.
  • the simulation system 800 of this embodiment includes: a radar simulation device 801.
  • the radar simulation device 801 can adopt the structure of the embodiment shown in FIG. 7, and accordingly, the technical solutions of the above method embodiments can be executed.
  • the implementation principles and technical effects are similar, and are not repeated here.
  • the simulation system 800 further includes: a rendering device 802, configured to render the target image.
  • the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in the stereo scene, of the lidar sensor at the current sampling time, where N is an integer greater than 2.
  • the rendering device 802 is specifically used to render the N images.
  • the simulation system 800 further includes a movement device 803, and the movement device 803 is used to simulate the movement of the carrier of the lidar sensor.
  • the rendering device 802 is specifically configured to render the N images according to the motion of the carrier. Specifically, the position of the radar sensor in the stereo scene at the current sampling time may be determined according to the motion of the carrier, and the N images are rendered according to the position of the radar sensor in the stereo scene at the current sampling time.
  • the radar simulation device 801 is configured to output the detection result of the lidar sensor, which specifically includes: transmitting the detection result of the lidar sensor to the algorithm system 900.
  • the motion device 803 is further configured to receive a control instruction sent by the algorithm system 900 according to the detection result, the control instruction being used to control the motion of the carrier, and to simulate the motion of the carrier according to the control instruction.
  • time synchronization between the simulation system 800 and the algorithm system 900 can be achieved through timestamp synchronization.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A radar simulation method, device (700), and system (800). The method includes: determining, according to the emission direction of each of multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image (201, 301, 401); the target image is an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image includes a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor; and outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels (202). Simulation of the lidar sensor is thereby achieved.

Description

Radar Simulation Method, Device, and System — Technical Field
The present invention relates to the field of radar technology, and in particular to a radar simulation method, device, and system.
Background
At present, with the continuous improvement of automation, radar sensors are being used more and more widely.
In the prior art, lidar is a radar system that detects characteristic quantities such as a target's position and velocity by emitting laser beams. It is widely used on aircraft and has also been applied in the latest autonomous driving technology. Currently, developing algorithms related to lidar sensors depends on real lidar sensors, yet real lidar sensors are expensive, come in diverse specifications, are difficult to install, and make data collection inconvenient.
Therefore, algorithm development that depends on lidar sensors currently suffers from long development cycles.
Summary of the Invention
Embodiments of the present invention provide a radar simulation method, device, and system to solve the prior-art problem of long development cycles caused by algorithm development relying on lidar sensors.
In a first aspect, an embodiment of the present invention provides a radar simulation method, including:
determining, according to the emission direction of each of multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image; the target image is an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image includes a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor;
outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels.
In a second aspect, an embodiment of the present invention provides a radar simulation device, including a processor and a memory;
the memory is used to store program code;
the processor calls the program code, and when the program code is executed, is used to perform the following operations:
determining, according to the emission direction of each of multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image; the target image is an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image includes a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor;
outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels.
In a third aspect, an embodiment of the present invention provides a simulation system, including the radar simulation device according to any implementation of the above second aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, the computer program containing at least one piece of code executable by a computer to control the computer to perform the radar simulation method according to any implementation of the first aspect.
In a fifth aspect, an embodiment of the present invention provides a computer program which, when executed by a computer, is used to implement the radar simulation method according to any implementation of the above first aspect.
According to the radar simulation method, device, and system provided by the embodiments of the present invention, the target pixel sampled by each laser ray in the target image is determined according to the emission direction of each of the multiple laser rays of the lidar sensor, and the detection result of the lidar sensor is output according to the depth value of each of the multiple target pixels. In this way, the points sampled by the laser rays of the lidar sensor can be obtained by simulation, and the detection result of the lidar sensor can be further simulated. This realizes simulation of the lidar sensor and avoids problems such as long development cycles caused by the need to rely on real lidar sensors during algorithm development.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the following briefly introduces the drawings required for describing the embodiments or the prior art. Obviously, the drawings described below show some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1A is a top view of the physical model of a real lidar sensor;
FIG. 1B is a side view of the physical model of a real lidar sensor;
FIG. 2 is a schematic flowchart of a radar simulation method provided by an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a radar simulation method provided by another embodiment of the present invention;
FIG. 4 is a schematic flowchart of a radar simulation method provided by yet another embodiment of the present invention;
FIG. 5 is a schematic diagram of N images provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of the target pixels in one image provided by an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a radar simulation device provided by an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a simulation system provided by an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The radar simulation method provided by the embodiments of the present invention can simulate a lidar sensor, obtaining the sensor's detection result through software computation. The method can be applied to any algorithm development scenario that depends on a lidar sensor, removing the dependence on real lidar sensors during development and thereby solving the problem of long algorithm development cycles caused by that dependence.
FIG. 1A is a top view of the physical model of a real lidar sensor, and FIG. 1B is a side view. As shown in FIGS. 1A and 1B, a real lidar sensor rotates about the Z axis at a fixed angular velocity. While rotating, it detects at a certain acquisition frequency by emitting multiple laser rays, which may be distributed as shown in FIG. 1B. As can be seen, the laser rays have different emission directions, so as to detect within a certain field of view (FOV) centered on the lidar sensor, the X-axis direction lying within this field of view.
From the acquisition frequency, n groups of sampling points scanned at equal angular intervals within a unit time T can be obtained; from the angular velocity of the lidar sensor's rotation, the angle a swept by the lidar sensor within the unit time T can be obtained.
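The relationship above can be sketched with a small worked example; the numbers below are illustrative only and are not taken from the original text:

```python
def samples_and_sweep(sample_rate_hz, rotation_deg_per_s, unit_time_s):
    """Return (number of sample groups n, swept angle a in degrees)
    accumulated over a unit time T, per the relationship in the text."""
    n = sample_rate_hz * unit_time_s        # equally spaced sample groups
    a = rotation_deg_per_s * unit_time_s    # angle swept during T
    return n, a

# Illustrative numbers: a lidar spinning at 10 rev/s (3600 deg/s),
# sampled at 20 kHz, over a unit time T of 0.01 s:
n, a = samples_and_sweep(20000, 3600, 0.01)
assert (round(n), round(a)) == (200, 36)  # 200 groups across a 36° sweep
```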
It should be noted that FIG. 1B takes as an example a lidar sensor emitting 16 laser rays, with adjacent rays separated by an angle of 2 degrees and an FOV of 30 degrees.
Some embodiments of the present invention are described in detail below with reference to the drawings. The embodiments below, and the features within them, may be combined with each other when no conflict arises.
FIG. 2 is a schematic flowchart of a radar simulation method provided by an embodiment of the present invention. The execution subject of this embodiment may be a radar simulation device for implementing radar simulation, specifically the processor of the radar simulation device. As shown in FIG. 2, the method of this embodiment may include:
Step 201: determine, according to the emission direction of each of the multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image.
In this step, the target image is an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image includes a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor.
Here, "the target image corresponding to the detection range of the lidar sensor at the current sampling time" can be understood as: the target image may include the image corresponding to the detection range of the lidar sensor at the current sampling time.
Since the detection range of the lidar sensor is determined by the laser rays it emits, and those rays are centered on the lidar sensor, the target image corresponding to the detection range is also centered on the lidar sensor.
Further, for a specific lidar sensor to be simulated, the emission directions of its laser rays are known; and since both the target image and the ray emission directions are centered on the lidar sensor, the positional relationship between each pixel in the target image and the lidar sensor is also known. Therefore, from the emission direction of each laser ray and the target image, the target pixel sampled by each laser ray in the target image can be determined. The target pixels sampled by the laser rays in the target image are the simulated counterparts of the points sampled by the laser rays of the lidar sensor.
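A minimal sketch of this lookup step follows. The linear angle-to-pixel mapping and the function names are assumptions for illustration; the text does not fix a particular camera model:

```python
def ray_to_pixel(azimuth_deg, elevation_deg, width, height,
                 h_fov_deg, v_fov_deg):
    """Map a laser ray's emission direction to the (col, row) pixel it
    samples in the target image, assuming a simple linear angle-to-pixel
    mapping centered on the lidar's viewing direction."""
    col = int((azimuth_deg / h_fov_deg + 0.5) * (width - 1))
    row = int((0.5 - elevation_deg / v_fov_deg) * (height - 1))
    return col, row

def sample_depth(depth_map, azimuth_deg, elevation_deg,
                 h_fov_deg=90.0, v_fov_deg=30.0):
    """Look up the depth value (distance to the lidar) for one laser ray."""
    rows, cols = len(depth_map), len(depth_map[0])
    c, r = ray_to_pixel(azimuth_deg, elevation_deg, cols, rows,
                        h_fov_deg, v_fov_deg)
    return depth_map[r][c]

# A ray fired straight ahead samples the central pixel of the depth map:
depth = [[0.0] * 5 for _ in range(5)]
depth[2][2] = 7.5
assert sample_depth(depth, 0.0, 0.0) == 7.5
```

The same indices can then be used against the reflectance map and label map described later, since all per-pixel maps share the target image's geometry.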
Optionally, the target images corresponding to the detection ranges at multiple consecutive sampling times may be the same, which simplifies processing.
The current sampling time may be determined according to the sampling frequency. The sampling frequency, also called the sampling speed or sampling rate, defines the number of samples per second and can be expressed in hertz (Hz). For example, a sampling frequency of 1 Hz means one sampling time every second.
Optionally, the number of target pixels may equal the number of laser rays of the lidar sensor.
It should be noted that the present invention does not limit how the target image is obtained. For example, it may be captured by a camera, or obtained by rendering.
Step 202: output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels.
In this step, since the depth value of a target pixel represents the distance between that pixel and the lidar sensor, the distance between each target pixel and the radar sensor can be obtained from the depth values of the multiple target pixels.
Further, since the target pixels are the simulated points sampled by the laser rays of the lidar sensor, the distance between each target pixel and the radar sensor is the simulated distance between each sampled point and the radar sensor.
Therefore, the detection result of the lidar sensor, output according to the depth value of each of the multiple target pixels, may include the distance between each target pixel and the lidar sensor.
It should be noted that, besides distance, the detection result of the lidar sensor may optionally also include the reflected signal intensity of each target pixel for the laser ray. Further optionally, the reflected signal intensities of the target pixels may all be the same, for example a preset intensity.
Optionally, the depth value of a target pixel may be used directly as the distance between the target pixel and the lidar sensor; alternatively, a mathematical operation may be applied to the depth value to obtain that distance.
In this embodiment, by determining the target pixel sampled by each laser ray in the target image according to the emission direction of each of the multiple laser rays of the lidar sensor, and outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels, the points sampled by the laser rays can be obtained by simulation and the detection result can be further simulated. This realizes simulation of the lidar sensor and thus avoids problems such as long development cycles caused by the need to rely on real lidar sensors during algorithm development.
FIG. 3 is a schematic flowchart of a radar simulation method provided by another embodiment of the present invention. On the basis of the embodiment shown in FIG. 2, this embodiment mainly describes an optional implementation of step 202. As shown in FIG. 3, the method of this embodiment may include:
Step 301: determine, according to the emission direction of each of the multiple laser rays of the lidar sensor, the target pixel sampled by each laser ray in the target image.
It should be noted that step 301 is similar to step 201 and is not repeated here.
Step 302: output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray.
In this step, since the target pixels are the simulated points sampled by the laser rays of the lidar sensor, the reflected signal intensity of each target pixel for the laser ray is the simulated reflected signal intensity of each sampled point. Therefore, the output detection result may include the distance between each target pixel and the radar sensor, as well as the reflected signal intensity of each target pixel for the laser ray.
Optionally, a correspondence between the pixels of the target image and reflected signal intensities may be stored, the reflected signal intensity corresponding to a pixel being the reflected signal intensity of that pixel for the laser ray. Further optionally, this correspondence may be preset, or set by the user.
Alternatively and optionally, the target image further includes a reflectance map, in which the reflectance of a pixel represents that pixel's reflectance for the laser ray. Accordingly, the method of this embodiment may further include the following steps A and B.
Step A: determine, according to the reflectance map, the reflectance of each of the multiple target pixels for the laser ray.
Here, suppose the target image includes 100 pixels, pixel 1 through pixel 100. The depth map then contains the depth value of each of these 100 pixels, and the reflectance map contains the reflectance of each of them. Further, suppose the target pixels are pixels 10, 20, 30, and 40; then the reflectance of each of these pixels for the laser ray can be obtained from the reflectance map.
Step B: determine the reflected signal intensity of each target pixel for the laser ray according to the depth value of each of the multiple target pixels and the reflectance for the laser ray corresponding to each target pixel.
Here, for a real lidar sensor, the reflected signal intensity of a sampled point is related to the distance between the sampled point and the lidar sensor and to the sampled point's reflectance for the laser ray. Therefore, from a target pixel's depth value and its reflectance for the laser ray, the reflected signal intensity of that target pixel can be determined.
It should be noted that the present invention does not limit the specific manner of determining the reflected signal intensity of a target pixel from its depth value and its reflectance for the laser ray.
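Since the text leaves the exact formula open, one plausible stand-in is an inverse-square attenuation scaled by reflectance; the function and parameters below are illustrative assumptions, not the patented method:

```python
def reflected_intensity(depth, reflectance, emitted_power=1.0, eps=1e-6):
    """One possible model for a target pixel's reflected-signal intensity:
    emitted power scaled by the pixel's reflectance and attenuated with
    the square of the distance. The text deliberately does not fix the
    formula; this inverse-square law is only a plausible choice."""
    return emitted_power * reflectance / max(depth * depth, eps)

# In this model, doubling the range quarters the returned intensity:
near = reflected_intensity(10.0, 0.8)
far = reflected_intensity(20.0, 0.8)
assert abs(near - 4.0 * far) < 1e-12
```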
Optionally, to facilitate algorithm development, the output result may also include the type of obstacle to which each target pixel belongs.
Further optionally, a correspondence between the pixels of the target image and obstacle types may be stored, the obstacle type corresponding to a pixel being the type of obstacle to which that pixel belongs. This correspondence may be preset, or set by the user.
Alternatively and optionally, the target image further includes a label map, in which the label of a pixel indicates the type of obstacle to which the pixel belongs. Here, suppose the target image includes 100 pixels, pixel 1 through pixel 100. The depth map then contains the depth value of each of these 100 pixels, and the label map contains the obstacle type of each of them. Further, suppose the target pixels are pixels 10, 20, 30, and 40; then the obstacle type of each of these pixels can be obtained from the label map.
Accordingly, outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels includes: outputting the detection result of the lidar sensor according to the depth value of each target pixel and the type of obstacle to which each target pixel belongs. Here, the output detection result may include the distance between each target pixel and the radar sensor, and the obstacle type of each target pixel.
Further optionally, outputting the detection result of the lidar sensor according to the depth value of each target pixel and the reflected signal intensity of each target pixel for the laser ray includes: outputting the detection result according to the depth value of each target pixel, the reflected signal intensity of each target pixel for the laser ray, and the obstacle type of each target pixel. Here, the output detection result may include the distance between each target pixel and the radar sensor, the reflected signal intensity of each target pixel for the laser ray, and the obstacle type of each target pixel.
In this embodiment, by determining the target pixel sampled by each laser ray in the target image according to the emission direction of each of the multiple laser rays of the lidar sensor, and outputting the detection result according to the depth value and the reflected signal intensity of each target pixel, the reflectance of the points sampled by a real lidar sensor is simulated, so that the simulated detection result includes both the distance between each target pixel and the radar sensor and the reflected signal intensity of each target pixel for the laser ray.
FIG. 4 is a schematic flowchart of a radar simulation method provided by yet another embodiment of the present invention. On the basis of the embodiments shown in FIGS. 2 and 3, this embodiment mainly describes taking the influence of noise on the detection result of the lidar sensor into account in the simulation. As shown in FIG. 4, the method of this embodiment may include:
Step 401: determine, according to the emission direction of each of the multiple laser rays of the lidar sensor, the target pixel sampled by each laser ray in the target image.
It should be noted that step 401 is similar to step 201 and is not repeated here.
Step 402: add noise, according to a noise signal, to the depth value of each of the multiple target pixels to obtain the noise-added depth value of each target pixel.
In this step, considering that the detection result of a real lidar sensor is usually not perfectly accurate but affected by noise, while the distance represented by a target pixel's depth value is the correct distance, noise may be added to the depth values of the target pixels to improve the realism of the simulation. It can be understood that, after noise is added to the depth values, the detection result output according to them includes the influence of noise.
Optionally, step 402 may specifically include: adding noise, according to the noise signal, to the depth value of each of the multiple target pixels and to the reflected signal intensity of each target pixel for the laser ray, to obtain the noise-added depth value and the noise-added reflected signal intensity of each target pixel. It should be noted that the same noise signal may be used for the depth values and the reflected signal intensities, or different noise signals may be used; the present invention does not limit this.
Optionally, the noise signal may specifically be a white Gaussian noise signal.
It should be noted that, when the distance between a target pixel and the lidar sensor is obtained by applying a mathematical operation to the target pixel's depth value, the noise may alternatively be added to the distance after it is obtained from the depth value, yielding a noise-added distance; in that case, the detection result includes the noise-added distance.
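The noise-addition step can be sketched as follows; the sigma values are illustrative assumptions and not taken from the text:

```python
import random

def add_gaussian_noise(depths, intensities, depth_sigma=0.02, int_sigma=0.001):
    """Perturb the simulated depth values and reflected-signal intensities
    with white Gaussian noise, as the embodiment suggests. Intensities are
    clamped at zero since a negative reflected signal is unphysical."""
    noisy_d = [d + random.gauss(0.0, depth_sigma) for d in depths]
    noisy_i = [max(0.0, i + random.gauss(0.0, int_sigma)) for i in intensities]
    return noisy_d, noisy_i

random.seed(0)  # deterministic for the sketch
noisy_depths, noisy_intensities = add_gaussian_noise([5.0, 12.5], [0.03, 0.008])
```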
Step 403: output the detection result of the lidar sensor according to the noise-added depth value of each target pixel.
In this step, optionally, the noise-added depth value of each target pixel may be used as the noise-added distance between that target pixel and the lidar sensor in the detection result; alternatively, a mathematical operation may be applied to the noise-added depth values to obtain those distances.
Optionally, step 403 may specifically include: outputting the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray.
Further optionally, to facilitate algorithm development, the detection result may be packaged according to the data protocol of the lidar sensor into a result satisfying that protocol. Accordingly, step 403 may specifically include: packaging the noise-added depth value of each target pixel according to the data protocol of the lidar sensor to obtain a packaged result; and outputting the detection result of the lidar sensor, the detection result being the packaged result.
Optionally, when the detection result of the lidar sensor includes the noise-added reflected signal intensity of each target pixel for the laser ray, step 403 may specifically include: packaging, according to the data protocol of the lidar sensor, the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray, to obtain a packaged result; and outputting the detection result of the lidar sensor, the detection result being the packaged result.
It can be understood that, when the influence of noise on the lidar sensor is not considered, the depth value of each target pixel may alternatively be packaged according to the data protocol of the lidar sensor to obtain a packaged result, and the detection result of the lidar sensor, being the packaged result, is output.
It should be noted that the present invention does not limit the data protocol of the lidar sensor. For example, a specific packet header may be added.
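A minimal sketch of such packaging follows. The packet layout (magic header, point count, float32 fields) is entirely invented for illustration, since the text does not specify the actual protocol:

```python
import struct

# Hypothetical packet layout: a 4-byte magic header, a point count, then
# a float32 distance and a float32 reflected-signal intensity per point.
MAGIC = b"LSIM"

def package_detection(depths, intensities):
    """Package the noise-added depths and intensities into one binary
    packet satisfying the (invented) protocol above."""
    assert len(depths) == len(intensities)
    packet = struct.pack("<4sI", MAGIC, len(depths))
    for d, i in zip(depths, intensities):
        packet += struct.pack("<ff", d, i)
    return packet

pkt = package_detection([5.0, 12.5], [0.03, 0.008])
assert len(pkt) == 24  # 8-byte header + 2 points * 8 bytes each
```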
In this embodiment, adding noise to the depth value of each of the multiple target pixels according to a noise signal to obtain the noise-added depth values, and outputting the detection result of the lidar sensor according to those noise-added depth values, improves the realism of the simulation.
Optionally, in the above embodiments, outputting the detection result of the lidar sensor includes: sending the detection result of the lidar sensor to an algorithm system; or storing the detection result of the lidar sensor as a local file. The algorithm system may specifically be an algorithm system under development, or one under test.
For example, a detection result including the distance between each target pixel and the lidar sensor may be sent to the algorithm system or stored as a local file. As another example, a detection result including the noise-added distance between each target pixel and the lidar sensor may be sent to the algorithm system or stored as a local file.
Optionally, in the above embodiments, the multiple laser rays may be all of the laser rays of the lidar sensor, or a subset of them. Using a subset of all the laser rays makes it possible to simulate a real lidar sensor in which one or more laser rays are damaged, improving the flexibility of the simulation.
Optionally, in the above embodiments, the target image may correspond to a certain stereo scene. The target image may be at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in the stereo scene, of the lidar sensor at the current sampling time, N being an integer greater than 2.
It can be understood that when N is too large, each of the N images corresponds to fewer sampling times and contains fewer pixels, but the simulation becomes excessively complex; when N is too small, each image corresponds to more sampling times and contains more pixels, but the depth values of the pixels become inaccurate, making the simulation insufficiently accurate. Therefore, N must be chosen reasonably. Optionally, N may equal 4.
Optionally, to avoid inaccurate detection results caused by abrupt changes in, for example, the depth values at the boundary pixels of two adjacent images among the N images, the boundary pixels of adjacent images may overlap. Specifically, when N equals 4, the horizontal field of view of each of the N images equals the sum of 90° and an offset angle, the offset angle being a positive number. For example, as shown in FIG. 5, the N images may specifically be image 1, image 2, image 3, and image 4.
It should be noted that FIG. 5 takes the N images as planar images as an example. Alternatively, the N images may be stereoscopic images, for example fan-shaped or spherical stereoscopic images. When the N images are planar images, all the target pixels of the lidar sensor on one of the N images may be as shown in FIG. 6, where each column of target pixels corresponds to the target pixels obtained at one sampling time. It should be noted that FIG. 6 takes as an example a horizontal field of view of the image equal to the field of view of the lidar sensor's detection range, both being 30 degrees.
It can be understood that, when the boundary pixels of two adjacent images overlap, the target image may be two of the N images. Accordingly, outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels may specifically include the following steps C and D.
Step C: determine the joint depth value of each target pixel according to the depth values of each target pixel in the two images.
Step D: output the detection result of the lidar sensor according to the joint depth value of each target pixel.
Here, the output detection result may include the distance, represented by the joint depth value, between each target pixel and the lidar sensor; for example, it may include the joint depth value of each target pixel.
Further optionally, when the detection result is also output according to the reflected signal intensity of each target pixel for the laser ray, the following step may precede step D: determine the joint reflected signal intensity of each target pixel for the laser ray according to the reflected signal intensities of each target pixel in the two images.
Accordingly, step D may specifically include: outputting the detection result of the lidar sensor according to the joint depth value of each target pixel and the joint reflected signal intensity of the laser ray corresponding to each target pixel. Here, the output detection result may include the distance represented by the joint depth value of each target pixel, and the joint reflected signal intensity of each target pixel for the laser ray.
It can be understood that, when noise is considered, noise may be added to the joint depth value and the joint reflected signal intensity according to the noise signal, yielding their noise-added counterparts.
It should be noted that the present invention does not limit the specific manner of determining the joint depth value of a target pixel. For example, the depth values of the target pixel in the two images may be weighted and summed to obtain the joint depth value; or they may be averaged, i.e., the joint depth value is the average of the depth values.
Likewise, the present invention does not limit the specific manner of determining the joint reflected signal intensity of a target pixel. For example, the reflected signal intensities of the target pixel in the two images may be weighted and summed to obtain the joint reflected signal intensity; or they may be averaged, i.e., the joint reflected signal intensity is the average of the reflected signal intensities.
It should be noted that the manner of determining a target pixel's joint depth value and the manner of determining its joint reflected signal intensity may be the same or different; the present invention does not limit this.
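The joint-value computation above can be sketched in a few lines; the function name is an assumption for illustration:

```python
def joint_value(v_img1, v_img2, weight1=0.5):
    """Combine a target pixel's value (depth or reflected-signal
    intensity) from the two overlapping images. The text allows either a
    weighted sum or a plain average; weight1=0.5 reduces to the average
    mentioned in the text."""
    return weight1 * v_img1 + (1.0 - weight1) * v_img2

assert joint_value(10.0, 11.0) == 10.5  # plain average of the two depths
```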
Optionally, the offset angle may equal 2°.
Optionally, the position of the lidar sensor in the stereo scene may be fixed, or may change. It can be understood that, when the position of the lidar sensor in the stereo scene at the current sampling time changes relative to its position at the previous sampling time, the detection range of the lidar sensor may change, so the N images change accordingly.
Optionally, considering that a lidar sensor is usually mounted on a carrier for detection, the motion of the radar sensor may be simulated by simulating the motion of its carrier. Specifically, the position of the lidar sensor in the stereo scene changes with the motion of the carrier of the lidar sensor.
It should be noted that the carrier may be any device capable of carrying a lidar sensor. Optionally, the carrier includes a drone, a vehicle, or the like.
An embodiment of the present invention further provides a computer-readable storage medium storing program instructions; when the program is executed, some or all of the steps of the radar simulation method in the above method embodiments may be performed.
An embodiment of the present invention provides a computer program which, when executed by a computer, is used to implement the radar simulation method in any of the above method embodiments.
FIG. 7 is a schematic structural diagram of a radar simulation device provided by an embodiment of the present invention. As shown in FIG. 7, the radar simulation device 700 of this embodiment may include a memory 701 and a processor 702, which may be connected via a bus. The memory 701 may include a read-only memory and a random access memory, and provides instructions and data to the processor 702. A portion of the memory 701 may also include non-volatile random access memory.
The memory 701 is used to store program code.
The processor 702 calls the program code, and when the program code is executed, is used to perform the following operations:
determining, according to the emission direction of each of multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image; the target image is an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image includes a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor;
outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels.
In a possible implementation, the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels specifically includes:
outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray.
In a possible implementation, the target image further includes a reflectance map, in which the reflectance of a pixel represents that pixel's reflectance for the laser ray;
the processor 702 is further configured to:
determine, according to the reflectance map, the reflectance of each of the multiple target pixels for the laser ray;
determine the reflected signal intensity of each target pixel for the laser ray according to the depth value of each of the multiple target pixels and the reflectance for the laser ray corresponding to each target pixel.
In a possible implementation, the target image further includes a label map, in which the label of a pixel indicates the type of obstacle to which the pixel belongs;
the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray specifically includes:
outputting the detection result of the lidar sensor according to the depth value of each target pixel, the reflected signal intensity of each target pixel for the laser ray, and the type of obstacle to which each target pixel belongs.
In a possible implementation, the processor 702 is further configured to add noise, according to a noise signal, to the depth value of each of the multiple target pixels and to the reflected signal intensity of each target pixel for the laser ray, to obtain the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray;
the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray specifically includes:
outputting the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray.
In a possible implementation, the noise signal is a white Gaussian noise signal.
In a possible implementation, the processor 702 being configured to output the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray specifically includes:
packaging, according to the data protocol of the lidar sensor, the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray, to obtain a packaged result;
outputting the detection result of the lidar sensor, the detection result being the packaged result.
In a possible implementation, the processor 702 being configured to output the detection result of the lidar sensor specifically includes:
sending the detection result of the lidar sensor to an algorithm system;
or storing the detection result of the lidar sensor as a local file.
In a possible implementation, the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in a stereo scene, of the lidar sensor at the current sampling time, N being an integer greater than 2.
In a possible implementation, the N images are planar images or stereoscopic images.
In a possible implementation, N equals 4.
In a possible implementation, the horizontal field of view of each of the N images equals the sum of 90° and an offset angle, the offset angle being a positive number.
In a possible implementation, the offset angle equals 2°.
In a possible implementation, the target image is two of the N images;
the processor 702 being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray specifically includes:
determining the joint depth value of each target pixel according to the depth values of each target pixel in the two images;
determining the joint reflected signal intensity of each target pixel for the laser ray according to the reflected signal intensities of each target pixel in the two images;
outputting the detection result of the lidar sensor according to the joint depth value of each target pixel and the joint reflected signal intensity of the laser ray corresponding to each target pixel.
In a possible implementation, the joint depth value is the average of the depth values.
In a possible implementation, the joint reflected signal intensity is the average of the reflected signal intensities.
In a possible implementation, when the position of the lidar sensor in the stereo scene at the current sampling time changes relative to its position at the previous sampling time, the N images change accordingly.
In a possible implementation, the position of the lidar sensor in the stereo scene changes with the motion of the carrier of the lidar sensor.
In a possible implementation, the carrier includes a drone or a vehicle.
In a possible implementation, the multiple laser rays are a subset of all the laser rays of the lidar sensor.
The radar simulation device provided in this embodiment can be used to execute the technical solutions of the above method embodiments of the present invention; its implementation principles and technical effects are similar and are not repeated here.
FIG. 8 is a schematic structural diagram of a simulation system provided by an embodiment of the present invention. As shown in FIG. 8, the simulation system 800 of this embodiment includes a radar simulation device 801. The radar simulation device 801 may adopt the structure of the embodiment shown in FIG. 7 and, accordingly, may execute the technical solutions of the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
Optionally, when the target image is obtained by rendering, as shown in FIG. 8, the simulation system 800 further includes a rendering device 802 configured to render the target image.
Optionally, the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in a stereo scene, of the lidar sensor at the current sampling time, N being an integer greater than 2.
Accordingly, the rendering device 802 is specifically configured to render the N images.
Optionally, when the motion of the radar sensor is simulated by simulating the motion of its carrier, as shown in FIG. 8, the simulation system 800 further includes a motion device 803 configured to simulate the motion of the carrier of the lidar sensor.
The rendering device 802 is specifically configured to render the N images according to the motion of the carrier. Specifically, the position of the radar sensor in the stereo scene at the current sampling time may be determined according to the motion of the carrier, and the N images rendered according to that position.
Optionally, as shown in FIG. 8, the radar simulation device 801 being configured to output the detection result of the lidar sensor specifically includes: transmitting the detection result of the lidar sensor to an algorithm system 900.
The motion device 803 is further configured to receive a control instruction sent by the algorithm system 900 according to the detection result, the control instruction being used to control the motion of the carrier, and to simulate the motion of the carrier according to the control instruction.
Optionally, to synchronize the algorithm system and the simulation system in time, time synchronization between the simulation system 800 and the algorithm system 900 may be achieved through timestamp synchronization.
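The closed loop between the simulation system 800 and the algorithm system 900 can be sketched as below. All class and method names are invented for illustration; the text only specifies the data flow (detection result out, control instruction back) and the use of timestamps to keep the two systems in step:

```python
class SimulationLoop:
    """One step of the render -> detect -> control -> move cycle."""

    def __init__(self, radar_sim, motion_device, algorithm_system):
        self.radar_sim = radar_sim    # device 801: samples and outputs results
        self.motion = motion_device   # device 803: simulates the carrier
        self.algo = algorithm_system  # system 900: the algorithm under test

    def step(self, timestamp):
        detection = self.radar_sim.detect(timestamp)       # detection result
        command = self.algo.process(detection, timestamp)  # same timestamp
        self.motion.apply(command)                         # move the carrier
        return detection, command
```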
A person of ordinary skill in the art will understand that all or some of the steps of the above method embodiments may be completed by hardware related to program instructions. The aforementioned program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are merely intended to illustrate, not limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some or all of their technical features equivalently replaced, without thereby departing from the scope of the technical solutions of the embodiments of the present invention.

Claims (47)

  1. A radar simulation method, characterized by comprising:
    determining, according to the emission direction of each of multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image; the target image being an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image comprising a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor;
    outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels.
  2. The method according to claim 1, characterized in that the outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels comprises:
    outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray.
  3. The method according to claim 2, characterized in that the target image further comprises a reflectance map, the reflectance of a pixel in the reflectance map representing the pixel's reflectance for the laser ray;
    the method further comprising:
    determining, according to the reflectance map, the reflectance of each of the multiple target pixels for the laser ray;
    determining the reflected signal intensity of each target pixel for the laser ray according to the depth value of each of the multiple target pixels and the reflectance for the laser ray corresponding to each target pixel.
  4. The method according to claim 2 or 3, characterized in that the target image further comprises a label map, the label of a pixel in the label map indicating the type of obstacle to which the pixel belongs;
    the outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray comprising:
    outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels, the reflected signal intensity of each target pixel for the laser ray, and the type of obstacle to which each target pixel belongs.
  5. The method according to any one of claims 2-4, characterized in that, before the outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray, the method further comprises:
    adding noise, according to a noise signal, to the depth value of each of the multiple target pixels and to the reflected signal intensity of each target pixel for the laser ray, to obtain the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray;
    the outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray comprising:
    outputting the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray.
  6. The method according to claim 5, characterized in that the noise signal is a white Gaussian noise signal.
  7. The method according to claim 5 or 6, characterized in that the outputting the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray comprises:
    packaging, according to the data protocol of the lidar sensor, the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray, to obtain a packaged result;
    outputting the detection result of the lidar sensor, the detection result of the lidar sensor being the packaged result.
  8. The method according to any one of claims 1-7, characterized in that the outputting the detection result of the lidar sensor comprises:
    sending the detection result of the lidar sensor to an algorithm system;
    or, storing the detection result of the lidar sensor as a local file.
  9. The method according to any one of claims 1-8, characterized in that the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in a stereo scene, of the lidar sensor at the current sampling time, N being an integer greater than 2.
  10. The method according to claim 9, characterized in that the N images are planar images or stereoscopic images.
  11. The method according to claim 9 or 10, characterized in that N equals 4.
  12. The method according to claim 11, characterized in that the horizontal field of view of each of the N images equals the sum of 90° and an offset angle, the offset angle being a positive number.
  13. The method according to claim 12, characterized in that the offset angle equals 2°.
  14. The method according to claim 12 or 13, characterized in that the target image is two of the N images;
    the outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray comprising:
    determining the joint depth value of each target pixel according to the depth values of each target pixel in the two images;
    determining the joint reflected signal intensity of each target pixel for the laser ray according to the reflected signal intensities of each target pixel in the two images;
    outputting the detection result of the lidar sensor according to the joint depth value of each target pixel and the joint reflected signal intensity of the laser ray corresponding to each target pixel.
  15. The method according to claim 14, characterized in that the joint depth value is the average of the depth values.
  16. The method according to claim 14 or 15, characterized in that the joint reflected signal intensity is the average of the reflected signal intensities.
  17. The method according to any one of claims 9-16, characterized in that, when the position of the lidar sensor in the stereo scene at the current sampling time changes relative to the position of the lidar sensor in the stereo scene at the previous sampling time, the N images change accordingly.
  18. The method according to claim 17, characterized in that the position of the lidar sensor in the stereo scene changes with the motion of the carrier of the lidar sensor.
  19. The method according to claim 18, characterized in that the carrier comprises a drone or a vehicle.
  20. The method according to any one of claims 1-19, characterized in that the multiple laser rays are a subset of all the laser rays of the lidar sensor.
  21. A radar simulation device, characterized by comprising a processor and a memory;
    the memory being used to store program code;
    the processor calling the program code and, when the program code is executed, being used to perform the following operations:
    determining, according to the emission direction of each of multiple laser rays of a lidar sensor, the target pixel sampled by each laser ray in a target image; the target image being an image corresponding to the detection range of the lidar sensor at the current sampling time; the target image comprising a depth map, in which the depth value of a pixel represents the distance between the pixel and the lidar sensor;
    outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels.
  22. The device according to claim 21, characterized in that the processor being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels specifically comprises:
    outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray.
  23. The device according to claim 22, characterized in that the target image further comprises a reflectance map, the reflectance of a pixel in the reflectance map representing the pixel's reflectance for the laser ray;
    the processor being further configured to:
    determine, according to the reflectance map, the reflectance of each of the multiple target pixels for the laser ray;
    determine the reflected signal intensity of each target pixel for the laser ray according to the depth value of each of the multiple target pixels and the reflectance for the laser ray corresponding to each target pixel.
  24. The device according to claim 22 or 23, characterized in that the target image further comprises a label map, the label of a pixel in the label map indicating the type of obstacle to which the pixel belongs;
    the processor being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray specifically comprising:
    outputting the detection result of the lidar sensor according to the depth value of each of the multiple target pixels, the reflected signal intensity of each target pixel for the laser ray, and the type of obstacle to which each target pixel belongs.
  25. The device according to any one of claims 22-24, characterized in that the processor is further configured to add noise, according to a noise signal, to the depth value of each of the multiple target pixels and to the reflected signal intensity of each target pixel for the laser ray, to obtain the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray;
    the processor being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray specifically comprising:
    outputting the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray.
  26. The device according to claim 25, characterized in that the noise signal is a white Gaussian noise signal.
  27. The device according to claim 25 or 26, characterized in that the processor being configured to output the detection result of the lidar sensor according to the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray specifically comprises:
    packaging, according to the data protocol of the lidar sensor, the noise-added depth value of each target pixel and the noise-added reflected signal intensity of each target pixel for the laser ray, to obtain a packaged result;
    outputting the detection result of the lidar sensor, the detection result of the lidar sensor being the packaged result.
  28. The device according to any one of claims 21-27, characterized in that the processor being configured to output the detection result of the lidar sensor specifically comprises:
    sending the detection result of the lidar sensor to an algorithm system;
    or, storing the detection result of the lidar sensor as a local file.
  29. The device according to any one of claims 21-28, characterized in that the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in a stereo scene, of the lidar sensor at the current sampling time, N being an integer greater than 2.
  30. The device according to claim 29, characterized in that the N images are planar images or stereoscopic images.
  31. The device according to claim 29 or 30, characterized in that N equals 4.
  32. The device according to claim 31, characterized in that the horizontal field of view of each of the N images equals the sum of 90° and an offset angle, the offset angle being a positive number.
  33. The device according to claim 32, characterized in that the offset angle equals 2°.
  34. The device according to claim 32 or 33, characterized in that the target image is two of the N images;
    the processor being configured to output the detection result of the lidar sensor according to the depth value of each of the multiple target pixels and the reflected signal intensity of each target pixel for the laser ray specifically comprising:
    determining the joint depth value of each target pixel according to the depth values of each target pixel in the two images;
    determining the joint reflected signal intensity of each target pixel for the laser ray according to the reflected signal intensities of each target pixel in the two images;
    outputting the detection result of the lidar sensor according to the joint depth value of each target pixel and the joint reflected signal intensity of the laser ray corresponding to each target pixel.
  35. The device according to claim 34, characterized in that the joint depth value is the average of the depth values.
  36. The device according to claim 34 or 35, characterized in that the joint reflected signal intensity is the average of the reflected signal intensities.
  37. The device according to any one of claims 29-36, characterized in that, when the position of the lidar sensor in the stereo scene at the current sampling time changes relative to the position of the lidar sensor in the stereo scene at the previous sampling time, the N images change accordingly.
  38. The device according to claim 37, characterized in that the position of the lidar sensor in the stereo scene changes with the motion of the carrier of the lidar sensor.
  39. The device according to claim 38, characterized in that the carrier comprises a drone or a vehicle.
  40. The device according to any one of claims 21-39, characterized in that the multiple laser rays are a subset of all the laser rays of the lidar sensor.
  41. A simulation system, characterized by comprising the radar simulation device according to any one of claims 21-40.
  42. The system according to claim 41, characterized in that the simulation system further comprises a rendering device configured to render the target image.
  43. The system according to claim 42, characterized in that the target image is at least one of N images corresponding to all detection ranges traversed in one scanning period at the position, in a stereo scene, of the lidar sensor at the current sampling time, N being an integer greater than 2;
    the rendering device being specifically configured to render the N images.
  44. The system according to claim 43, characterized in that the simulation system further comprises a motion device configured to simulate the motion of the carrier of the lidar sensor;
    the rendering device being specifically configured to render the N images according to the motion of the carrier.
  45. The system according to claim 44, characterized in that the radar simulation device being configured to output the detection result of the lidar sensor specifically comprises: transmitting the detection result of the lidar sensor to an algorithm system;
    the motion device being further configured to receive a control instruction sent by the algorithm system according to the detection result, the control instruction being used to control the motion of the carrier, and to simulate the motion of the carrier according to the control instruction.
  46. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, the computer program containing at least one piece of code executable by a computer to control the computer to perform the radar simulation method according to any one of claims 1-20.
  47. A computer program, characterized in that, when executed by a computer, it is used to implement the radar simulation method according to any one of claims 1-20.
PCT/CN2018/124924 2018-12-28 2018-12-28 雷达仿真方法、装置及系统 WO2020133230A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880042394.3A CN110809723A (zh) 2018-12-28 2018-12-28 雷达仿真方法、装置及系统
PCT/CN2018/124924 WO2020133230A1 (zh) 2018-12-28 2018-12-28 雷达仿真方法、装置及系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/124924 WO2020133230A1 (zh) 2018-12-28 2018-12-28 雷达仿真方法、装置及系统

Publications (1)

Publication Number Publication Date
WO2020133230A1 true WO2020133230A1 (zh) 2020-07-02

Family

ID=69487895

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/124924 WO2020133230A1 (zh) 2018-12-28 2018-12-28 雷达仿真方法、装置及系统

Country Status (2)

Country Link
CN (1) CN110809723A (zh)
WO (1) WO2020133230A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113820694A (zh) * 2021-11-24 2021-12-21 腾讯科技(深圳)有限公司 一种仿真测距的方法、相关装置、设备以及存储介质

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111624583B (zh) * 2020-07-30 2020-10-30 之江实验室 一种考虑速度因素的激光雷达测距的快速数值仿真方法
CN112269182B (zh) * 2020-09-24 2022-08-12 北京一径科技有限公司 目标雷达信号的确定方法和装置、存储介质、电子装置
CN112061131B (zh) * 2020-11-13 2021-01-15 奥特酷智能科技(南京)有限公司 一种基于道路数据的仿真车避开障碍行驶的方法
WO2023010540A1 (zh) * 2021-08-06 2023-02-09 深圳市大疆创新科技有限公司 激光雷达的扫描结果的验证方法、装置、设备及存储介质
CN114898037B (zh) * 2022-04-24 2023-03-10 哈尔滨方聚科技发展有限公司 激光三维动态场景建模系统及建模方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100164948A1 (en) * 2008-12-29 2010-07-01 Samsung Electronics Co., Ltd. Apparatus and method of enhancing ray tracing speed
CN104268323A (zh) * 2014-09-17 2015-01-07 西安电子科技大学 基于光线跟踪的激光雷达场景仿真方法
CN106526605A (zh) * 2016-10-28 2017-03-22 北京康力优蓝机器人科技有限公司 激光雷达与深度相机的数据融合方法及系统
CN107255821A (zh) * 2017-06-07 2017-10-17 旗瀚科技有限公司 一种基于多台深度相机拼接模拟激光雷达数据的方法
CN108519590A (zh) * 2018-03-26 2018-09-11 北京理工大学 激光成像雷达目标回波信号模拟方法及模拟器
CN108564615A (zh) * 2018-04-20 2018-09-21 驭势(上海)汽车科技有限公司 模拟激光雷达探测的方法、装置、系统及存储介质

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108387907B (zh) * 2018-01-15 2020-05-29 上海机电工程研究所 闪光式激光雷达回波信号物理图像模拟系统和方法
CN108226891B (zh) * 2018-01-26 2021-09-03 中国电子科技集团公司第三十八研究所 一种扫描雷达回波计算方法
CN109031253A (zh) * 2018-08-27 2018-12-18 森思泰克河北科技有限公司 激光雷达标定系统及标定方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100164948A1 (en) * 2008-12-29 2010-07-01 Samsung Electronics Co., Ltd. Apparatus and method of enhancing ray tracing speed
CN104268323A (zh) * 2014-09-17 2015-01-07 西安电子科技大学 基于光线跟踪的激光雷达场景仿真方法
CN106526605A (zh) * 2016-10-28 2017-03-22 北京康力优蓝机器人科技有限公司 激光雷达与深度相机的数据融合方法及系统
CN107255821A (zh) * 2017-06-07 2017-10-17 旗瀚科技有限公司 一种基于多台深度相机拼接模拟激光雷达数据的方法
CN108519590A (zh) * 2018-03-26 2018-09-11 北京理工大学 激光成像雷达目标回波信号模拟方法及模拟器
CN108564615A (zh) * 2018-04-20 2018-09-21 驭势(上海)汽车科技有限公司 模拟激光雷达探测的方法、装置、系统及存储介质

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113820694A (zh) * 2021-11-24 2021-12-21 腾讯科技(深圳)有限公司 一种仿真测距的方法、相关装置、设备以及存储介质
CN113820694B (zh) * 2021-11-24 2022-03-01 腾讯科技(深圳)有限公司 一种仿真测距的方法、相关装置、设备以及存储介质

Also Published As

Publication number Publication date
CN110809723A (zh) 2020-02-18

Similar Documents

Publication Publication Date Title
WO2020133230A1 (zh) 雷达仿真方法、装置及系统
US20230177819A1 (en) Data synthesis for autonomous control systems
US11455565B2 (en) Augmenting real sensor recordings with simulated sensor data
US10877152B2 (en) Systems and methods for generating synthetic sensor data
US11487988B2 (en) Augmenting real sensor recordings with simulated sensor data
WO2019127445A1 (zh) 三维建图方法、装置、系统、云端平台、电子设备和计算机程序产品
US9342890B2 (en) Registering of a scene disintegrating into clusters with visualized clusters
Cerqueira et al. A novel GPU-based sonar simulator for real-time applications
CN102763420B (zh) 深度相机兼容性
CN111080662A (zh) 车道线的提取方法、装置及计算机设备
CN113366341B (zh) 点云数据的处理方法、装置、存储介质及激光雷达系统
US11941888B2 (en) Method and device for generating training data for a recognition model for recognizing objects in sensor data of a sensor, in particular, of a vehicle, method for training and method for activating
US20220206117A1 (en) System and method for emulating echo signals for a lidar sensor
JP7060157B2 (ja) データ圧縮装置、データ圧縮方法、及びプログラム
WO2022135594A1 (zh) 目标物体的检测方法及装置、融合处理单元、介质
CN114814758B (zh) 相机-毫米波雷达-激光雷达的联合标定方法、设备
US20150235410A1 (en) Image processing apparatus and method
KR20160000084A (ko) 이미징 소나의 이미지 예측 시뮬레이션 방법 및 이를 이용한 장치
CN114072697B (zh) 一种模拟连续波lidar传感器的方法
CN116503566B (zh) 一种三维建模方法、装置、电子设备及存储介质
CN112630798B (zh) 用于估计地面的方法和装置
CN117250956A (zh) 一种多观测源融合的移动机器人避障方法和避障装置
US9245346B2 (en) Registering of a scene disintegrating into clusters with pairs of scans
CN112651405B (zh) 目标检测方法及装置
Almanza-Medina et al. Imaging sonar simulator for assessment of image registration techniques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18944723

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18944723

Country of ref document: EP

Kind code of ref document: A1