CN111316119A - Radar simulation method and device - Google Patents


Info

Publication number: CN111316119A
Application number: CN201880072069.1A
Authority: CN (China)
Prior art keywords: depth, value, camera, depth map, radar sensor
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 黎晓键
Current Assignee: SZ DJI Technology Co Ltd; Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee: SZ DJI Technology Co Ltd
Application filed by SZ DJI Technology Co Ltd
Publication of CN111316119A

Classifications

    • G01S7/40 Means for monitoring or calibrating
    • G01B11/03 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras


Abstract

A radar simulation method and device. The radar simulation method comprises: determining a plurality of pixel points in a depth map according to the depth map of a current frame (101), wherein the depth value of a pixel point in the depth map represents the distance between the pixel point and a camera, and the distances represented by the depth values of the plurality of pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map; and outputting a detection result of a radar sensor according to the depth value of each of the plurality of pixel points (102). Simulation of a radar sensor that emits a cone-shaped beam is thereby achieved.

Description

Radar simulation method and device
Technical Field
The invention relates to the technical field of radars, in particular to a radar simulation method and device.
Background
With the continuous improvement of automation levels, radar sensors are finding increasingly wide application.
In the prior art, a radar sensor that emits electromagnetic waves as a cone-shaped beam, for example a millimeter wave radar, may be applied in scenarios with high reliability requirements owing to advantages such as its reliability. For example, now that autonomous driving technology is maturing, the millimeter wave radar, as a sensor offering accurate ranging, long detection distance and all-weather operation, has become an indispensable sensor in autonomous driving.
However, development that relies on a real radar sensor emitting a cone-shaped beam can hardly avoid cumbersome field tests and high development costs.
Disclosure of Invention
Embodiments of the invention provide a radar simulation method and device, which are used to solve the prior-art problems of cumbersome field tests and high development costs caused by relying on a real radar sensor emitting a cone-shaped beam during development.
In a first aspect, an embodiment of the present invention provides a radar simulation method, including:
determining a plurality of pixel points in the depth map according to the depth map of the current frame, wherein the depth value of a pixel point in the depth map represents the distance between the pixel point and the camera, and the distances represented by the depth values of the plurality of pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map;
outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points;
wherein the depth map satisfies a field angle FOV condition of the camera, the FOV condition of the camera corresponding to a detection range of the radar sensor.
In a second aspect, an embodiment of the present invention provides a radar simulation apparatus, including: a processor and a memory;
the memory for storing program code;
the processor, invoking the program code, when executed, is configured to:
determining a plurality of pixel points in the depth map according to the depth map of the current frame, wherein the depth value of a pixel point in the depth map represents the distance between the pixel point and the camera, and the distances represented by the depth values of the plurality of pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map;
outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points;
wherein the depth map satisfies a field angle FOV condition of the camera, the FOV condition of the camera corresponding to a detection range of the radar sensor.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and the computer program includes at least one piece of code, the at least one piece of code being executable by a computer to control the computer to execute the radar simulation method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer program, which is configured to, when executed by a computer, implement the radar simulation method according to any one of the first aspect.
According to the radar simulation method and device provided by the embodiments of the invention, a plurality of pixel points in the depth map are determined according to the depth map of the current frame, and the detection result of the radar sensor is output according to the depth value of each of the plurality of pixel points. In this way, the target points detected by a radar sensor emitting a cone-shaped beam, and hence the detection result of the radar sensor, can be obtained through simulation. Simulation of a radar sensor emitting a cone-shaped beam is thus realized, avoiding problems such as cumbersome field tests and high development costs caused by relying on a real radar sensor emitting a cone-shaped beam during development.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a radar simulation method according to an embodiment of the present invention;
FIG. 2A is a schematic diagram of a detection range according to an embodiment of the present invention;
FIG. 2B is a schematic diagram of a depth map according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention;
fig. 4 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention;
fig. 5 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention;
FIG. 6 is a schematic diagram of an output detection architecture according to an embodiment of the present invention;
fig. 7 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention;
fig. 8 is a schematic structural diagram of a radar simulation apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The radar simulation method provided by the embodiments of the invention can simulate a radar sensor that emits electromagnetic waves as a cone-shaped beam, obtaining the detection result of such a radar sensor purely through software. The method can be applied to any development scenario that would otherwise depend on a radar sensor emitting a cone-shaped beam, removing the dependence on a real sensor during development and thereby solving the problems of cumbersome field tests and high development costs caused by that dependence.
Optionally, the radar sensor emitting the cone-shaped beam may be specifically a millimeter wave radar sensor.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Fig. 1 is a schematic flowchart of a radar simulation method according to an embodiment of the present invention. The execution subject of this embodiment may be a radar simulation apparatus for implementing radar simulation, and may specifically be a processor of the radar simulation apparatus. As shown in fig. 1, the method of this embodiment may include:
step 101, determining a plurality of pixel points in a depth map according to the depth map of a current frame.
In this step, the depth value of a pixel point in the depth map represents the distance between the pixel point and the camera, and the distances represented by the depth values of the plurality of determined pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map. For example, if the depth map includes 12 pixel points, pixel 1 to pixel 12, and the distances represented by their depth values increase from pixel 1 to pixel 12, the plurality of pixel points determined from the depth map may specifically be pixel 1 to pixel 3 among the 12 pixel points.
Wherein the depth map satisfies a Field of View (FOV) condition of the camera, the FOV condition of the camera corresponding to a detection range of the radar sensor.
Here, since the beam emitted by the radar sensor is a cone-shaped beam whose detection range is similar to the shooting range of a camera, when the depth map satisfies the field angle condition corresponding to the detection range of the radar sensor, the distance to the radar sensor can be simulated by the distance to the camera represented by the depth value of a pixel point in the depth map. A plurality of pixel points closest to the radar sensor can therefore be determined according to the depth values of the pixel points in the depth map. These pixel points can be understood as the target points detected by the simulated radar sensor, with the pixel points corresponding one-to-one to the target points. It should be understood that the FOV condition of the camera may be consistent with, or mapped to, the detection range of the radar sensor.
Optionally, that the FOV condition of the camera corresponds to the detection range of the radar sensor may specifically mean that the FOV condition of the camera is exactly the same as, or approximately the same as, the detection range of the radar sensor.
For example, when a larger depth value indicates a closer distance to the camera, the depth map corresponding to the detection range in front of the vehicle X1 in the scene shown in fig. 2A may be as shown in fig. 2B.
Optionally, the number of the plurality of pixel points may be a preset number, for example 64. Assuming that the depth map includes 128 pixel points, pixel 1 to pixel 128, whose depth values decrease in turn, and that a larger depth value indicates a closer distance to the camera, then with a preset number of 64 the plurality of pixel points may specifically be pixel 1 to pixel 64.
The present invention does not limit how the depth map is acquired. For example, it may be obtained by shooting with a camera or by rendering.
It should be noted that the radar sensor detects at a certain sampling frequency. Specifically, the detection range of the radar sensor at the previous sampling moment may correspond to the depth map of the previous frame, from which a plurality of pixel points may be determined; the detection range at the current sampling moment may correspond to the depth map of the current frame, from which a plurality of pixel points may be determined; and the detection range at the next sampling moment may correspond to the depth map of the next frame, from which a plurality of pixel points may be determined. It is to be understood that the depth map of the current frame may be the same as or different from that of the previous frame, and the same as or different from that of the next frame.
Step 102, outputting a detection result of the radar sensor according to the depth value of each of the plurality of pixel points.
In this step, since the depth value of a pixel point can represent the distance between the pixel point and the camera, the distance between each pixel point and the camera can be obtained according to the depth value of each of the plurality of pixel points. Further, since the plurality of pixel points are the target points detected by the simulated radar sensor, and the detection range of the radar sensor corresponds to the FOV of the camera, the distance between a pixel point and the camera serves as the distance between the corresponding simulated target point and the radar sensor.
Therefore, according to the depth value of each of the plurality of pixel points, the output detection result of the radar sensor may include the distance between each pixel point and the camera (which may be understood as the distance between each target point detected by the radar sensor and the radar sensor).
It should be noted that, in addition to the distance, the detection result of the radar sensor may optionally further include motion information of each pixel point relative to the radar sensor. Further optionally, the motion information of each pixel point relative to the radar sensor may be the same, for example preset motion information. The motion information may include, for example, a motion direction and a motion speed.
Optionally, the depth value of a pixel point may be used directly as the distance between the pixel point and the radar sensor; alternatively, the distance between the pixel point and the radar sensor may be obtained by a mathematical conversion of the depth value.
In this embodiment, a plurality of pixel points in the depth map are determined according to the depth map of the current frame, and the detection result of the radar sensor is output according to the depth value of each of the plurality of pixel points. The target points detected by a radar sensor emitting a cone-shaped beam, and hence the detection result of the radar sensor, can thereby be obtained through simulation, realizing the simulation of a radar sensor emitting a cone-shaped beam and avoiding problems such as cumbersome field tests and high development costs caused by relying on a real radar sensor emitting a cone-shaped beam during development.
Fig. 3 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention, and this embodiment mainly describes an alternative implementation manner of step 102 on the basis of the embodiment shown in fig. 1. As shown in fig. 3, the method of this embodiment may include:
step 301, determining a plurality of pixel points in the depth map according to the depth map of the current frame.
It should be noted that step 301 is similar to step 101, and is not described herein again.
Step 302, outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the motion information of each pixel point relative to the camera.
In this step, since the plurality of pixel points are the target points detected by the simulated radar sensor, and the detection range of the radar sensor corresponds to the FOV of the camera, the motion information of a pixel point relative to the camera serves as the motion information of the corresponding simulated target point relative to the radar sensor. Therefore, according to the depth value of each of the plurality of pixel points and the motion information of each pixel point relative to the camera, the output detection result of the radar sensor may include the distance between each pixel point and the camera (which may be understood as the distance between each target point detected by the radar sensor and the radar sensor) and the motion information of each pixel point relative to the camera (which may be understood as the motion information of each target point relative to the radar sensor).
Optionally, a correspondence between each pixel point in the depth map and motion information may be stored, where the motion information corresponding to a pixel point is the motion information of that pixel point relative to the camera. Further optionally, this correspondence may be preset, or may be set by a user.
Alternatively, and optionally, the motion information of a pixel point relative to the camera may be determined according to the label map corresponding to the depth map. The label map is used to indicate the object to which each pixel point in the depth map belongs, and each object corresponds to motion information. For example, the label map may indicate the object to which each pixel point belongs through different color labels: when the color labels of two pixel points in the label map are both a first color, the two pixel points belong to the same object, namely the object represented by the first color. For example, in the label map corresponding to the depth map of fig. 2B, the color label of all pixel points belonging to the road rail X2 may be dark green, and the color label of all pixel points belonging to the distant house X3 may be light green.
Further optionally, the method of this embodiment may further include the following step a and step B.
And step A, determining an object to which each pixel point in the plurality of pixel points belongs according to the label graph corresponding to the depth graph.
Here, it is assumed that the depth map includes pixel points 1 to 100 and the depth value of each of these 100 pixel points; the label map then includes the object to which each of the 100 pixel points belongs. Further, assuming that the plurality of pixel points determined from the depth map are pixel 10, pixel 20, pixel 30 and pixel 40, the object to which each of these pixel points belongs can be obtained from the label map.
The object may specifically be any object that can be detected by the radar sensor, such as the ground, a building on the ground, and the like.
And step B, determining the motion information of each pixel point according to the object to which each pixel point in the plurality of pixel points belongs.
Here, since each object corresponds to motion information, the motion information of each pixel point relative to the camera, i.e. relative to the radar sensor, can be obtained from the object to which the pixel point belongs.
Optionally, when the depth value needs to be converted into a distance, step 302 may specifically include: determining the distance between each of the plurality of pixel points and the camera according to their respective depth values; and outputting the detection result of the radar sensor according to the distance between each pixel point and the camera and the motion information of each pixel point relative to the camera.
Optionally, for convenience of development, the detection result may further include identification information indicating the object to which each pixel point belongs. Further optionally, the objects may correspond one-to-one to target identification numbers; in that case the method further includes: determining the target identification number of each pixel point (which may be understood as the target identification number of each target point) according to the object to which each of the plurality of pixel points belongs.
Correspondingly, the outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points includes: and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the target identification number of each pixel point. Here, the output detection result of the radar sensor may include a distance between each target point and the camera, and a target identification number of each target point.
Further optionally, the outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the motion information of each pixel point relative to the camera includes:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points, the motion information of each pixel point relative to the camera and the target identification number of each pixel point. Here, the output detection result of the radar sensor may include a distance between each target point and the radar sensor, movement information of each target point with respect to the radar sensor, and a target identification number of each target point.
In this embodiment, a plurality of pixel points in the depth map are determined according to the depth map of the current frame, and the detection result of the radar sensor is output according to the depth value of each of the plurality of pixel points and the motion information of each pixel point relative to the camera, thereby simulating the motion information of target points sampled by a real radar sensor, so that the simulated detection result of the radar sensor includes both the distance between each target point and the radar sensor and the motion information of each target point relative to the radar sensor.
Fig. 4 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention, and this embodiment mainly describes, on the basis of the embodiments shown in fig. 1 and fig. 3, the influence of the accuracy loss of the radar sensor on the detection result of the radar sensor in the simulation. As shown in fig. 4, the method of this embodiment may include:
Step 401, according to the correspondence between depth levels and depth value ranges, updating the depth value of each pixel point in the depth map of the current frame to the maximum value of the depth value range corresponding to the depth level to which that depth value belongs, to obtain an updated depth map.
Here, updating the depth value of each pixel point to the maximum value of the depth value range of the depth level to which it belongs reflects the loss of precision within that depth level: all depth values within the same depth level range are updated to a fixed depth value within that range, namely its maximum value. It will be appreciated that, depending on the nature of the precision loss of the radar sensor, the depth values within a depth level range may instead all be updated to another depth value within that range, for example its minimum value.
Optionally, the depth value of a pixel point in the depth map may be any one of the 256 integers from 0 to 255.
It can be understood that the higher the detection precision of the radar sensor, the finer the granularity of the division into depth levels and depth value ranges can be; for example, six depth levels 1 to 6 may be divided, with depth level 1 corresponding to the depth value range 0 to 42, depth level 2 to 43 to 85, depth level 3 to 86 to 128, depth level 4 to 129 to 171, depth level 5 to 172 to 214, and depth level 6 to 215 to 255. Conversely, the lower the detection precision of the radar sensor, the coarser the granularity of the division can be; for example, four depth levels 1 to 4 may be divided, with depth level 1 corresponding to the depth value range 0 to 63, depth level 2 to 64 to 127, depth level 3 to 128 to 191, and depth level 4 to 192 to 255.
Optionally, for convenience of calculation, step 401 may be preceded by: normalizing the depth value of each pixel point in the depth map of the current frame. Correspondingly, step 401 may specifically include: according to the correspondence between depth levels and depth value ranges, updating the depth value of each pixel point in the normalized depth map to the maximum value of the depth value range corresponding to its depth level, to obtain the updated depth map. It is to be understood that the depth value ranges here are likewise the normalized ranges.
Step 402, determining a plurality of pixel points in the depth map according to the updated depth map.
It should be noted that, the specific manner of determining the plurality of pixel points in the updated depth map according to the updated depth map in step 402 is similar to the specific manner of determining the plurality of pixel points in the depth map according to the depth map in step 101, and is not described herein again.
Step 403, outputting a detection result of the radar sensor according to the depth value of each of the plurality of pixel points.
It should be noted that step 403 is similar to step 102 or step 302, and is not described herein again.
In this embodiment, the depth value of each pixel point in the depth map of the current frame is updated, according to the correspondence between depth levels and depth value ranges, to the maximum value of the depth value range corresponding to its depth level, giving an updated depth map, and a plurality of pixel points in the depth map are determined according to the updated depth map, thereby simulating the precision loss of the radar sensor.
Fig. 5 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention, and this embodiment mainly describes an alternative implementation manner for obtaining the depth map of the current frame in simulation based on the foregoing embodiment. As shown in fig. 5, the method of this embodiment may include:
step 501, obtaining an original depth map of a current frame.
In this step, optionally, the original depth map of the current frame may be obtained through image rendering. Further optionally, the original depth map of the current frame in a certain three-dimensional scene may be obtained through image rendering according to the motion of the radar sensor, where the motion of the radar sensor may specifically include the motion of a carrier carrying the radar sensor.
In the above embodiments, each of a plurality of frames, which are consecutive frames relating to one three-dimensional scene, may be used as the current frame in turn at a target frequency. When the original depth map is obtained through image rendering, the original depth maps of the plurality of consecutive frames in the scene can specifically be obtained by rendering. The target frequency may be equal to the sampling frequency of a real radar sensor, so as to simulate the sampling frequency of the radar sensor. Optionally, the target frequency is 20 Hz.
Step 502, determining a yaw angle and a pitch angle of each pixel point in the original depth map under a camera coordinate system of the camera according to the depth value and the two-dimensional coordinate of each pixel point in the original depth map and the parameters of the camera.
In this step, the position of each pixel point in the camera coordinate system may specifically be determined according to the parameters of the camera and the two-dimensional coordinates of each pixel point in the original depth map, where the parameters of the camera may include the intrinsic parameters of the camera. Further, the yaw angle and pitch angle of each pixel point in the camera coordinate system can be determined from its position in the camera coordinate system and its depth value. Here, the yaw angle and pitch angle of a pixel point in the camera coordinate system may be understood as the yaw angle and pitch angle of the pixel point relative to the radar sensor.
Step 503, according to the pitch angle and the yaw angle of each pixel point in the original depth map in the camera coordinate system, setting the depth value of the pixel point in the original depth map, of which the pitch angle and the yaw angle do not meet the FOV condition, as a preset value, and obtaining the depth map of the current frame.
In this step, the preset value may be used to indicate a distance from the camera greater than the maximum detection distance of the radar sensor. When the pitch angle and yaw angle of a pixel point do not satisfy the FOV condition, the pixel point lies outside the detection range of the radar sensor, and the radar sensor would not detect it as a target point. Setting the depth value of such a pixel point to the preset value can therefore be understood as excluding it from the target points that the radar sensor can detect.
Optionally, in the embodiment of the present invention, when the real radar sensor has a limitation on both the horizontal FOV and the vertical FOV, the FOV condition may include a horizontal FOV condition and a vertical FOV condition. Wherein the horizontal FOV condition may be used to represent a limit for the radar sensor to the horizontal FOV. The vertical FOV condition may be used to represent a limit for the radar sensor to the vertical FOV.
Optionally, the horizontal FOV condition follows the rule that the farther from the camera, the smaller the horizontal FOV value, and the closer to the camera, the larger the horizontal FOV value. Further optionally, the horizontal FOV condition includes:
a horizontal FOV value is equal to a first FOV value when the distance from the camera is less than or equal to a first distance threshold;
a horizontal FOV value is equal to a second FOV value when the distance from the camera is greater than the first distance threshold and less than or equal to a second distance threshold;
a horizontal FOV value is equal to a third FOV value when the distance from the camera is greater than the second distance threshold and less than or equal to a third distance threshold;
a horizontal FOV value is equal to a fourth FOV value when the distance from the camera is greater than the third distance threshold;
wherein the second distance threshold is greater than the first distance threshold and less than the third distance threshold; the second FOV value is greater than the third FOV value and less than the first FOV value, and the third FOV value is greater than the fourth FOV value.
Further optionally, the first distance threshold is equal to 70m, the second distance threshold is equal to 160 m, and the third distance threshold is equal to 250 m.
Further optionally, the first FOV value is equal to 60 °, the second FOV value is equal to 8 °, the third FOV value is equal to 4 °, and the fourth FOV value is equal to 0 °.
Optionally, the vertical FOV condition may be independent of the distance from the camera. Further optionally, the vertical FOV condition comprises: the vertical FOV value is equal to 10°.
It should be noted that those skilled in the art may flexibly design the horizontal FOV condition and the vertical FOV condition according to the detection range of a real radar sensor, and the invention is not limited in this respect.
Optionally, as shown in fig. 6, when the original depth map is obtained through image rendering, the depth map may be obtained by a depth map rendering step and the label map by a label map rendering step; steps 501 to 503 may be understood as the depth map rendering step. The detection result of the radar sensor may then be output from the rendered depth map and label map in the manner described in the foregoing embodiments. It should be noted that the rendering of the depth map and the label map of the same frame may be based on the same shooting range of the camera in the same three-dimensional scene, where the shooting range is the detection range of the radar sensor.
In this embodiment, the original depth map of the current frame is obtained, the yaw angle and pitch angle of each pixel point in the original depth map in the camera coordinate system are determined according to the depth value and two-dimensional coordinates of each pixel point and the parameters of the camera, and the depth values of the pixel points whose pitch angle and yaw angle do not satisfy the FOV condition are then set to the preset value, to obtain the depth map of the current frame. A depth map satisfying the FOV condition of the camera, i.e. a depth map corresponding to the detection range of the radar sensor, is thereby obtained.
Fig. 7 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention. This embodiment mainly describes, on the basis of the foregoing embodiments, how the influence of ground clutter on the detection result of the radar sensor is taken into account in the simulation. As shown in fig. 7, the method of this embodiment may include:
step 701, determining a plurality of pixel points in a depth map according to the depth map of a current frame.
It should be noted that step 701 is similar to step 101, and is not described herein again.
Step 702, determining that a ground point in the current frame interferes with the detection result of the current frame.
In this step, the ground point is used to simulate the influence of ground clutter on the detection result of the radar sensor, i.e. the radar sensor identifying a ground point as a target point because of ground clutter. Since the effect of ground clutter is not always present, whether a ground point interferes with the detection result of the current frame may be determined via step 702.
Considering that once a real radar sensor has identified a ground point as a target point, it keeps identifying that ground point as a target point for as long as the point remains within its detection range, step 702 may optionally, to make the simulated influence of ground clutter more realistic, specifically include: if a ground point in the previous frame of the current frame interfered with the detection result of the previous frame, the ground point in the current frame interferes with the detection result of the current frame.
Further optionally, when the detection result of the previous frame was not interfered with by a ground point, the detection result of the current frame may or may not be interfered with by a ground point; in this case, whether a ground point interferes with the detection result of the current frame may be determined according to a target probability. Specifically, the method of this embodiment further includes: if no ground point interfered with the detection result of the previous frame, determining according to the target probability whether a ground point in the current frame interferes with the detection result of the current frame.
Optionally, the target probability may be a preset probability, or may be positively correlated with a target frame number. Further optionally, the larger the target frame number, the greater the target probability, where the target frame number is the number of consecutive frames, continuing up to the current frame, in which no ground point has interfered with the detection result.
For example, assume that frames 1 to 11 are 11 consecutive frames and that the detection results of frames 1 to 3 are interfered with by ground points. If the ground points that interfered with frames 1 to 3 are no longer present in frame 4, then the target frame number corresponding to frame 4 is 1 and the target probability is probability 1; if the detection result of frame 4 is not interfered with by a ground point, the target frame number corresponding to frame 5 is 2 and the target probability is probability 2; if the detection result of frame 5 is not interfered with, the target frame number corresponding to frame 6 is 3 and the target probability is probability 3; if the detection result of frame 6 is not interfered with, the target frame number corresponding to frame 7 is 4 and the target probability is probability 4, where probability 4 > probability 3 > probability 2 > probability 1.
Further, assume that the detection result of frame 7 is interfered with by a ground point and that frames 8 and 9 still contain the ground point that interfered with frame 7, while frame 10 does not. Then the target frame number corresponding to frame 10 is 1 and the target probability is probability 5; if the detection result of frame 10 is not interfered with by a ground point, the target frame number corresponding to frame 11 is 2 and the target probability is probability 6, where probability 6 > probability 5.
Considering that the influence of ground clutter on the radar sensor is random, the ground point may optionally be a randomly selected point, to further improve the realism of the simulation. It can be understood that, if the randomness of ground clutter is not considered, the ground point may also be a preset point.
Step 703, determining the distance between the ground point and the camera.
In this step, the ground point can be understood as a pixel point corresponding to the ground in the current frame, so the distance between the ground point and the camera can be determined from the depth value of the ground point.
It should be noted that there is no restriction on the order between step 702, step 703 and step 701.
Step 704, outputting a detection result of the radar sensor according to the depth value of each pixel point of the plurality of pixel points and the distance between the ground point and the camera.
In this step, according to the depth value of each of the plurality of pixel points and the distance between the ground point and the camera, the output detection result of the radar sensor may include the distance between each pixel point and the camera as well as the distance between the ground point and the camera (which may be understood as the distance between the radar sensor and a target point detected by the radar sensor because of ground clutter interference).
Optionally, in step 704 the detection result of the radar sensor may also be output according to the motion information of each pixel point relative to the camera; that is, the output detection result may further include the motion information of each pixel point relative to the camera and the motion information of the ground point relative to the camera. Since the absolute velocity of the ground point is 0, its motion information relative to the camera can be derived from the motion state of the camera.
In this embodiment, it is determined that a ground point in the current frame interferes with the detection result of the current frame, the distance between the ground point and the camera is determined, and the detection result of the radar sensor is output according to the depth value of each of the plurality of pixel points and the distance between the ground point and the camera. The detection result of the radar sensor can thus include the influence of ground clutter, simulating the effect of ground clutter on the radar sensor and improving the realism of the simulation.
In the foregoing embodiments, optionally, outputting the detection result of the radar sensor according to the depth value of each of the plurality of pixel points includes: outputting the detection results of the radar sensor for the plurality of pixel points in ascending order of the distances represented by their depth values.
The embodiment of the present invention further provides a computer-readable storage medium, in which program instructions are stored, and when the program is executed, some or all of the steps of the radar simulation method in the above method embodiments may be included.
An embodiment of the present invention provides a computer program, which is used to implement the radar simulation method in any one of the above method embodiments when the computer program is executed by a computer.
Fig. 8 is a schematic structural diagram of a radar simulation apparatus according to an embodiment of the present invention. As shown in fig. 8, a radar simulation apparatus 800 provided by this embodiment may include: a memory 801 and a processor 802, which may be connected by a bus. The memory 801 may include a read-only memory and a random access memory, and provides instructions and data to the processor 802. A portion of the memory 801 may also include non-volatile random access memory.
The memory 801 is used for storing program codes.
The processor 802, invoking the program code, when executed, is configured to: determine a plurality of pixel points in the depth map according to the depth map of the current frame, wherein the depth value of a pixel point in the depth map represents the distance between the pixel point and the camera, and the distances represented by the depth values of the plurality of pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map; and output a detection result of the radar sensor according to the depth value of each of the plurality of pixel points, wherein the depth map satisfies a field angle FOV condition of the camera, and the FOV condition of the camera corresponds to the detection range of the radar sensor.
in a possible implementation, the processor 802 is configured to output a detection result of the radar sensor according to a depth value of each of the plurality of pixel points, and specifically includes:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the motion information of each pixel point relative to the camera.
In one possible implementation, the processor 802 is further configured to:
determining an object to which each pixel point in the plurality of pixel points belongs according to the label map corresponding to the depth map; the label map is used to indicate the object to which each pixel point in the depth map belongs, and the object corresponds to motion information;
and determining the motion information of each pixel point according to the object to which each pixel point belongs in the plurality of pixel points.
In one possible implementation, the objects correspond to the target identification numbers one by one; the processor 802 is further configured to:
determining a target identification number of each pixel point according to an object to which each pixel point in the plurality of pixel points belongs;
the outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points includes:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the target identification number of each pixel point.
In a possible implementation, the processor 802 is configured to output a detection result of a radar sensor according to a depth value of each of the plurality of pixels and motion information of each pixel relative to the camera, and specifically includes:
determining the distance between each pixel point in the plurality of pixel points and the camera according to the respective depth value of the plurality of pixel points;
and outputting the detection result of the radar sensor according to the distance between each pixel point in the pixel points and the camera and the motion information of each pixel point relative to the camera.
In one possible implementation, the processor 802 is further configured to:
according to the corresponding relation between the depth level and the depth value range, updating the depth value of each pixel point in the depth map of the current frame to be the maximum value of the depth value range corresponding to the depth level to obtain the updated depth map;
the processor 802 is configured to determine a plurality of pixel points in the depth map according to the depth map of the current frame, and specifically includes:
and determining a plurality of pixel points in the depth map according to the updated depth map.
In one possible implementation, the processor 802 is further configured to:
and normalizing the depth value of each pixel point in the depth map of the current frame.
In one possible implementation, the processor 802 is further configured to:
obtaining an original depth map of a current frame;
determining a yaw angle and a pitch angle of each pixel point in the original depth map under a camera coordinate system of the camera according to the depth value and the two-dimensional coordinate of each pixel point in the original depth map and the parameters of the camera;
according to the pitch angle and the yaw angle of each pixel point in the original depth map in the camera coordinate system, setting the depth values of the pixel points in the original depth map whose pitch angle and yaw angle do not satisfy the FOV condition to a preset value, to obtain the depth map of the current frame; the preset value is used to indicate a distance from the camera greater than the maximum detection distance of the radar sensor.
In one possible implementation, the processor 802 is configured to obtain an original depth map of a current frame, and specifically includes:
and obtaining an original depth map of the current frame through image rendering.
In one possible implementation, the processor 802 is further configured to: and sequentially taking each frame in a plurality of frames as the current frame by using the target frequency, wherein the plurality of frames are a plurality of continuous frames related to a three-dimensional scene.
In one possible implementation, the target frequency is 20 hertz.
In one possible implementation, the processor 802 is further configured to:
determining that a ground point in the current frame interferes with the detection result of the current frame;
determining a distance between the ground point and the camera;
the processor 802 is configured to output a detection result of the radar sensor according to a depth value of each of the plurality of pixel points, and specifically includes:
and outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the distance between the ground point and the camera.
In one possible implementation, the processor 802 is configured to determine that a ground point in a current frame interferes with a detection result of the current frame, and specifically includes:
and if the ground point in the previous frame of the current frame interferes with the detection result of the previous frame, the ground point in the current frame interferes with the detection result of the current frame.
In one possible implementation, the processor 802 is further configured to:
and if the ground point in the previous frame does not interfere with the detection result of the previous frame, determining the detection result of the ground point in the current frame interfering with the current frame according to the target probability.
In one possible implementation, the larger the target frame number, the greater the target probability, where the target frame number is the number of consecutive frames, continuing up to the current frame, in which no ground point has interfered with the detection result.
In one possible implementation, the ground points are obtained by random selection.
In a possible implementation, the processor 802 is configured to output a detection result of the radar sensor according to a depth value of each of the plurality of pixel points, and specifically includes:
and sequentially outputting the detection results of the radar sensor according to the depth values of the pixels in the plurality of pixels and the sequence of the distances represented by the depth values from small to large.
In one possible implementation, the FOV condition includes a horizontal FOV condition and a vertical FOV condition.
In one possible implementation, the horizontal FOV condition comprises:
a horizontal FOV value is equal to a first FOV value when the distance from the camera is less than or equal to a first distance threshold;
a horizontal FOV value is equal to a second FOV value when the distance from the camera is greater than the first distance threshold and less than or equal to a second distance threshold;
a horizontal FOV value is equal to a third FOV value when the distance from the camera is greater than the second distance threshold and less than or equal to a third distance threshold;
a horizontal FOV value is equal to a fourth FOV value when the distance from the camera is greater than the third distance threshold;
wherein the second distance threshold is greater than the first distance threshold and less than the third distance threshold; the second FOV value is greater than the third FOV value and less than the first FOV value, and the third FOV value is greater than the fourth FOV value.
In one possible implementation, the radar sensor is a millimeter wave radar sensor.
The radar simulation apparatus provided in this embodiment may be used to implement the technical solution of the above method embodiment of the present invention, and the implementation principle and the technical effect are similar, which are not described herein again.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (42)

1. A radar simulation method, comprising:
determining a plurality of pixel points in the depth map according to the depth map of the current frame, wherein the depth value of a pixel point in the depth map represents the distance between the pixel point and the camera, and the distances represented by the depth values of the plurality of pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map;
outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points;
wherein the depth map satisfies a field angle FOV condition of the camera, the FOV condition of the camera corresponding to a detection range of the radar sensor.
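To make the selection step of claim 1 concrete, a minimal sketch follows. It assumes the depth map is a NumPy array whose values already encode the distance to the camera; the number of returned points and the output format are illustrative assumptions, not part of the claim.

```python
import numpy as np

def nearest_pixel_points(depth_map: np.ndarray, num_points: int = 32):
    """Select the pixel points whose depth values represent the smallest
    distances to the camera, i.e. the points the simulated radar reports first."""
    flat_idx = np.argsort(depth_map, axis=None)[:num_points]       # indices of the smallest depths
    rows, cols = np.unravel_index(flat_idx, depth_map.shape)
    return [((int(r), int(c)), float(depth_map[r, c])) for r, c in zip(rows, cols)]

# Hypothetical usage: one simulated detection per selected pixel point.
# for (px, d) in nearest_pixel_points(depth_map_of_current_frame):
#     print(f"detection at pixel {px}: range {d:.2f} m")
```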
2. The method of claim 1, wherein outputting the detection result of the radar sensor according to the depth value of each of the plurality of pixel points comprises:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the motion information of each pixel point relative to the camera.
3. The method of claim 2, further comprising:
determining an object to which each pixel point in the plurality of pixel points belongs according to a label map corresponding to the depth map; the label map indicates the object to which each pixel point in the depth map belongs, and the object corresponds to motion information;
and determining the motion information of each pixel point according to the object to which each pixel point belongs in the plurality of pixel points.
4. The method of claim 3, wherein the objects correspond one-to-one to object identification numbers; the method further comprises:
determining a target identification number of each pixel point according to an object to which each pixel point in the plurality of pixel points belongs;
the outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points includes:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the target identification number of each pixel point.
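Claims 3 and 4 can be read as a per-pixel lookup: the label map gives each selected pixel point an object, the object has an identification number, and the object carries motion information. The data structures below are assumptions made for illustration only.

```python
import numpy as np

def annotate_points(points, label_map: np.ndarray, object_motion: dict):
    """Attach an identification number and motion information to each selected
    pixel point, using the label map that corresponds to the depth map.

    points        -- [((row, col), depth_value), ...] from the depth-map selection
    label_map     -- integer array with the same shape as the depth map; value = object id
    object_motion -- {object_id: (vx, vy, vz)} velocity of each object relative to the camera
    """
    annotated = []
    for (r, c), depth in points:
        obj_id = int(label_map[r, c])
        motion = object_motion.get(obj_id, (0.0, 0.0, 0.0))   # treat unknown objects as static
        annotated.append({"pixel": (r, c), "depth": depth,
                          "object_id": obj_id, "motion": motion})
    return annotated
```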
5. The method according to any one of claims 2-4, wherein outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the motion information of each pixel point relative to the camera comprises:
determining the distance between each pixel point in the plurality of pixel points and the camera according to the respective depth values of the plurality of pixel points;
and outputting the detection result of the radar sensor according to the distance between each pixel point in the pixel points and the camera and the motion information of each pixel point relative to the camera.
6. The method according to any one of claims 1-5, wherein before determining a plurality of pixel points in the depth map according to the depth map of the current frame, the method further comprises:
according to a correspondence between depth levels and depth value ranges, updating the depth value of each pixel point in the depth map of the current frame to the maximum value of the depth value range corresponding to the depth level of that pixel point, to obtain an updated depth map;
the determining a plurality of pixel points in the depth map according to the depth map of the current frame includes:
and determining a plurality of pixel points in the depth map according to the updated depth map.
7. The method according to claim 6, wherein before the updating the depth value of each pixel point in the depth map of the current frame to the maximum value of the depth value range corresponding to the depth level according to the correspondence between the depth level and the depth value range to obtain the updated depth map, the method further comprises:
and normalizing the depth value of each pixel point in the depth map of the current frame.
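One possible reading of claims 6 and 7, sketched under assumed parameters: depth values are first normalized, then snapped to the maximum of the depth value range their depth level corresponds to. The number of levels and the maximum range below are placeholders, not values from this application.

```python
import numpy as np

def quantize_depth_map(depth_map: np.ndarray, num_levels: int = 256,
                       max_range_m: float = 100.0) -> np.ndarray:
    """Normalize a depth map (claim 7) and replace every depth value with the
    maximum value of the depth value range its depth level maps to (claim 6)."""
    normalized = np.clip(depth_map / max_range_m, 0.0, 1.0)           # normalization
    levels = np.minimum((normalized * num_levels).astype(int),        # depth level per pixel
                        num_levels - 1)
    return (levels + 1) / num_levels * max_range_m                    # maximum of that level's range
```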
8. The method according to any one of claims 1-7, wherein before determining a plurality of pixel points in the depth map according to the depth map of the current frame, the method further comprises:
obtaining an original depth map of a current frame;
determining a yaw angle and a pitch angle of each pixel point in the original depth map under a camera coordinate system of the camera according to the depth value and the two-dimensional coordinate of each pixel point in the original depth map and the parameters of the camera;
according to the pitch angle and the yaw angle of each pixel point in the original depth map in the camera coordinate system, setting the depth value of each pixel point in the original depth map whose pitch angle and yaw angle do not satisfy the FOV condition to a preset value, to obtain the depth map of the current frame; the preset value indicates a distance from the camera that is greater than the maximum detection distance of the radar sensor.
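A hedged sketch of the per-pixel angle test in claim 8, assuming a pinhole camera model with intrinsics (fx, fy, cx, cy) and a depth map storing z-depth; both assumptions are mine, not taken from this application.

```python
import numpy as np

def mask_outside_fov(depth_map: np.ndarray, fx: float, fy: float,
                     cx: float, cy: float,
                     h_fov_deg: float, v_fov_deg: float,
                     preset_value: float = 1e6) -> np.ndarray:
    """Compute the yaw and pitch of every pixel in the camera coordinate system
    and set the depth of pixels outside the FOV condition to a preset value
    standing for 'beyond the radar sensor's maximum detection distance'."""
    h, w = depth_map.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_map
    x = (u - cx) / fx * z                         # back-projection with the camera parameters
    y = (v - cy) / fy * z
    yaw = np.degrees(np.arctan2(x, z))            # horizontal angle to the optical axis
    pitch = np.degrees(np.arctan2(y, z))          # vertical angle to the optical axis
    outside = (np.abs(yaw) > h_fov_deg / 2) | (np.abs(pitch) > v_fov_deg / 2)
    out = depth_map.copy()
    out[outside] = preset_value
    return out
```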
9. The method of claim 8, wherein obtaining the original depth map of the current frame comprises:
and obtaining an original depth map of the current frame through image rendering.
10. The method according to any one of claims 1 to 9, wherein each of a plurality of frames is sequentially taken as the current frame at a target frequency, the plurality of frames being consecutive frames related to one three-dimensional scene.
11. The method of claim 10, wherein the target frequency is 20 hertz.
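Claims 10 and 11 amount to stepping through the consecutive frames of one three-dimensional scene at a target frequency (20 Hz in claim 11). A minimal driver loop under that reading might look like the following; render_depth_frame and process_frame are hypothetical placeholders, not functions disclosed here.

```python
import time

TARGET_FREQUENCY_HZ = 20.0                 # claim 11
FRAME_PERIOD_S = 1.0 / TARGET_FREQUENCY_HZ

def run_simulation(num_frames, render_depth_frame, process_frame):
    """Take each of the consecutive frames as the current frame at the target frequency."""
    for frame_idx in range(num_frames):
        start = time.monotonic()
        depth_map = render_depth_frame(frame_idx)   # e.g. obtained through image rendering (claim 9)
        process_frame(depth_map)                    # pixel selection and radar output for this frame
        time.sleep(max(0.0, FRAME_PERIOD_S - (time.monotonic() - start)))
```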
12. The method according to claim 10 or 11, characterized in that the method further comprises:
determining that a ground point in the current frame interferes with the detection result of the current frame;
determining a distance between the ground point and the camera;
the outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points includes:
and outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the distance between the ground point and the camera.
13. The method of claim 12, wherein the determining that a ground point in the current frame interferes with the detection result of the current frame comprises:
and if the ground point in the previous frame of the current frame interferes with the detection result of the previous frame, the ground point in the current frame interferes with the detection result of the current frame.
14. The method of claim 13, further comprising:
and if the ground point in the previous frame does not interfere with the detection result of the previous frame, determining, according to a target probability, whether the ground point in the current frame interferes with the detection result of the current frame.
15. The method of claim 14, wherein the larger a target frame number is, the larger the target probability is, the target frame number being the number of consecutive frames, ending at the current frame, in which the ground point does not interfere with the detection result.
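Claims 13-15 describe a simple frame-to-frame state machine for ground-clutter interference: once interference occurs it carries over to the next frame, and otherwise it starts with a probability that grows with the number of consecutive interference-free frames. The probability curve below is an assumed placeholder, not a disclosed formula.

```python
import random

class GroundClutterModel:
    """Decide, frame by frame, whether a ground point interferes with the
    radar detection result (claims 12-15)."""

    def __init__(self, base_probability: float = 0.01, growth: float = 0.02):
        self.base_probability = base_probability   # assumed parameters
        self.growth = growth
        self.clean_frames = 0                      # consecutive frames without interference

    def interferes_this_frame(self, interfered_last_frame: bool) -> bool:
        if interfered_last_frame:                  # claim 13: interference carries over
            self.clean_frames = 0
            return True
        # claim 15: the longer the interference-free streak, the larger the target probability
        p = min(1.0, self.base_probability + self.growth * self.clean_frames)
        if random.random() < p:
            self.clean_frames = 0
            return True
        self.clean_frames += 1
        return False
```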
16. The method of any one of claims 12-15, wherein the ground point is randomly selected.
17. The method according to any one of claims 1 to 16, wherein outputting the detection result of the radar sensor according to the depth value of each of the plurality of pixels comprises:
and sequentially outputting the detection results of the radar sensor according to the depth values of the plurality of pixel points, in ascending order of the distances represented by the depth values.
18. The method of any of claims 1-17, wherein the FOV conditions include a horizontal FOV condition and a vertical FOV condition.
19. The method of claim 18, wherein the horizontal FOV condition comprises:
a horizontal FOV value is equal to a first FOV value when the distance from the camera is less than or equal to a first distance threshold;
a horizontal FOV value is equal to a second FOV value when the distance from the camera is greater than the first distance threshold and less than or equal to a second distance threshold;
a horizontal FOV value is equal to a third FOV value when the distance from the camera is greater than the second distance threshold and less than or equal to a third distance threshold;
a horizontal FOV value is equal to a fourth FOV value when the distance from the camera is greater than the third distance threshold;
wherein the second distance threshold is greater than the first distance threshold and less than the third distance threshold; the second FOV value is greater than the third FOV value and less than the first FOV value, and the third FOV value is greater than the fourth FOV value.
20. The method of any one of claims 1-19, wherein the radar sensor is a millimeter wave radar sensor.
21. A radar simulation apparatus, comprising: a processor and a memory;
the memory for storing program code;
the processor, invoking the program code, when executed, is configured to:
determining a plurality of pixel points in the depth map according to the depth map of the current frame, wherein a depth value of a pixel point in the depth map represents a distance between the pixel point and a camera, and the distances represented by the depth values of the plurality of pixel points are smaller than the distances represented by the depth values of the other pixel points in the depth map;
outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points;
wherein the depth map satisfies a field angle FOV condition of the camera, the FOV condition of the camera corresponding to a detection range of the radar sensor.
22. The apparatus of claim 21, wherein the processor is configured to output the detection result of the radar sensor according to the depth value of each of the plurality of pixels, and specifically comprises:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the motion information of each pixel point relative to the camera.
23. The apparatus of claim 22, wherein the processor is further configured to:
determining an object to which each pixel point in the plurality of pixel points belongs according to a label map corresponding to the depth map; the label map indicates the object to which each pixel point in the depth map belongs, and the object corresponds to motion information;
and determining the motion information of each pixel point according to the object to which each pixel point belongs in the plurality of pixel points.
24. The apparatus of claim 23, wherein the objects have a one-to-one correspondence with object identification numbers; the processor is further configured to:
determining a target identification number of each pixel point according to an object to which each pixel point in the plurality of pixel points belongs;
the outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points includes:
and outputting the detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the target identification number of each pixel point.
25. The apparatus according to any one of claims 22-24, wherein the processor is configured to output the detection result of the radar sensor according to the depth value of each of the plurality of pixels and the motion information of each pixel relative to the camera, and specifically includes:
determining the distance between each pixel point in the plurality of pixel points and the camera according to the respective depth values of the plurality of pixel points;
and outputting the detection result of the radar sensor according to the distance between each pixel point in the pixel points and the camera and the motion information of each pixel point relative to the camera.
26. The apparatus according to any of claims 21-25, wherein the processor is further configured to:
according to a correspondence between depth levels and depth value ranges, updating the depth value of each pixel point in the depth map of the current frame to the maximum value of the depth value range corresponding to the depth level of that pixel point, to obtain an updated depth map;
the processor is configured to determine a plurality of pixel points in the depth map according to the depth map of the current frame, and specifically includes:
and determining a plurality of pixel points in the depth map according to the updated depth map.
27. The apparatus of claim 26, wherein the processor is further configured to:
and normalizing the depth value of each pixel point in the depth map of the current frame.
28. The apparatus according to any of claims 21-27, wherein the processor is further configured to:
obtaining an original depth map of a current frame;
determining a yaw angle and a pitch angle of each pixel point in the original depth map under a camera coordinate system of the camera according to the depth value and the two-dimensional coordinate of each pixel point in the original depth map and the parameters of the camera;
according to the pitch angle and the yaw angle of each pixel point in the original depth map in the camera coordinate system, setting the depth value of each pixel point in the original depth map whose pitch angle and yaw angle do not satisfy the FOV condition to a preset value, to obtain the depth map of the current frame; the preset value indicates a distance from the camera that is greater than the maximum detection distance of the radar sensor.
29. The apparatus of claim 28, wherein the processor is configured to obtain an original depth map of the current frame, and specifically comprises:
and obtaining an original depth map of the current frame through image rendering.
30. The apparatus according to any one of claims 21-29, wherein the processor is further configured to: sequentially take each frame in a plurality of frames as the current frame at a target frequency, the plurality of frames being consecutive frames related to a three-dimensional scene.
31. The apparatus of claim 30, wherein the target frequency is 20 hertz.
32. The apparatus of claim 30 or 31, wherein the processor is further configured to:
determining that a ground point in the current frame interferes with the detection result of the current frame;
determining a distance between the ground point and the camera;
the processor is configured to output a detection result of the radar sensor according to a depth value of each of the plurality of pixel points, and specifically includes:
and outputting a detection result of the radar sensor according to the depth value of each pixel point in the plurality of pixel points and the distance between the ground point and the camera.
33. The apparatus of claim 32, wherein the processor is configured to determine that a ground point in the current frame interferes with the detection result of the current frame, which specifically includes:
and if the ground point in the previous frame of the current frame interferes with the detection result of the previous frame, the ground point in the current frame interferes with the detection result of the current frame.
34. The apparatus of claim 33, wherein the processor is further configured to:
and if the ground point in the previous frame does not interfere with the detection result of the previous frame, determining, according to a target probability, whether the ground point in the current frame interferes with the detection result of the current frame.
35. The apparatus of claim 34, wherein the larger a target frame number is, the larger the target probability is, the target frame number being the number of consecutive frames, ending at the current frame, in which the ground point does not interfere with the detection result.
36. The apparatus of any one of claims 32-35, wherein the ground point is randomly selected.
37. The apparatus according to any one of claims 21 to 36, wherein the processor is configured to output the detection result of the radar sensor according to the depth value of each of the plurality of pixels, and specifically includes:
and sequentially outputting the detection results of the radar sensor according to the depth values of the plurality of pixel points, in ascending order of the distances represented by the depth values.
38. The apparatus of any of claims 21-37, wherein the FOV condition comprises a horizontal FOV condition and a vertical FOV condition.
39. The apparatus of claim 38, wherein the horizontal FOV condition comprises:
a horizontal FOV value is equal to a first FOV value when the distance from the camera is less than or equal to a first distance threshold;
a horizontal FOV value is equal to a second FOV value when the distance from the camera is greater than the first distance threshold and less than or equal to a second distance threshold;
a horizontal FOV value is equal to a third FOV value when the distance from the camera is greater than the second distance threshold and less than or equal to a third distance threshold;
a horizontal FOV value is equal to a fourth FOV value when the distance from the camera is greater than the third distance threshold;
wherein the second distance threshold is greater than the first distance threshold and less than the third distance threshold; the second FOV value is greater than the third FOV value and less than the first FOV value, and the third FOV value is greater than the fourth FOV value.
40. The apparatus of any one of claims 21-39, wherein the radar sensor is a millimeter wave radar sensor.
41. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising at least one piece of code executable by a computer for controlling the computer to perform the radar simulation method according to any one of claims 1-20.
42. A computer program which, when executed by a computer, implements the radar simulation method according to any one of claims 1-20.
CN201880072069.1A 2018-12-28 2018-12-28 Radar simulation method and device Pending CN111316119A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/124822 WO2020133206A1 (en) 2018-12-28 2018-12-28 Radar simulation method and apparatus

Publications (1)

Publication Number Publication Date
CN111316119A true CN111316119A (en) 2020-06-19

Family

ID=71127383

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880072069.1A Pending CN111316119A (en) 2018-12-28 2018-12-28 Radar simulation method and device

Country Status (2)

Country Link
CN (1) CN111316119A (en)
WO (1) WO2020133206A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113820694B (en) * 2021-11-24 2022-03-01 腾讯科技(深圳)有限公司 Simulation ranging method, related device, equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102168954A (en) * 2011-01-14 2011-08-31 浙江大学 Monocular-camera-based method for measuring depth, depth field and sizes of objects
US20110279648A1 (en) * 2010-05-12 2011-11-17 Microsoft Corporation Scanned-beam depth mapping to 2d image
CN103456038A (en) * 2013-08-19 2013-12-18 华中科技大学 Method for rebuilding three-dimensional scene of downhole environment
CN105894492A (en) * 2015-01-06 2016-08-24 三星电子株式会社 T-O-F depth imaging device rendering depth image of object and method thereof
CN107766847A (en) * 2017-11-21 2018-03-06 海信集团有限公司 A kind of method for detecting lane lines and device
CN107966693A (en) * 2017-12-05 2018-04-27 成都合纵连横数字科技有限公司 A kind of mobile lidar emulation mode rendered based on depth
CN108280401A (en) * 2017-12-27 2018-07-13 达闼科技(北京)有限公司 A kind of pavement detection method, apparatus, cloud server and computer program product
CN108419446A (en) * 2015-08-24 2018-08-17 高通股份有限公司 System and method for the sampling of laser depth map
CN108564615A (en) * 2018-04-20 2018-09-21 驭势(上海)汽车科技有限公司 Method, apparatus, system and the storage medium of simulated laser radar detection

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010031068A1 (en) * 2000-04-14 2001-10-18 Akihiro Ohta Target detection system using radar and image processing
JP5115912B2 (en) * 2001-02-23 2013-01-09 独立行政法人日本原子力研究開発機構 High-speed gate sweep type 3D laser radar system
CN104965202B (en) * 2015-06-18 2017-10-27 奇瑞汽车股份有限公司 Obstacle detection method and device
CN105261039B (en) * 2015-10-14 2016-08-17 山东大学 A kind of self-adaptative adjustment target tracking algorism based on depth image
CN105607635B (en) * 2016-01-05 2018-12-14 东莞市松迪智能机器人科技有限公司 Automatic guided vehicle panoramic optical vision navigation control system and omnidirectional's automatic guided vehicle

Also Published As

Publication number Publication date
WO2020133206A1 (en) 2020-07-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200619