WO2020133206A1 - Radar simulation method and apparatus - Google Patents

Radar simulation method and apparatus

Info

Publication number
WO2020133206A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
depth
value
pixels
depth map
Prior art date
Application number
PCT/CN2018/124822
Other languages
English (en)
Chinese (zh)
Inventor
黎晓键
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201880072069.1A priority Critical patent/CN111316119A/zh
Priority to PCT/CN2018/124822 priority patent/WO2020133206A1/fr
Publication of WO2020133206A1 publication Critical patent/WO2020133206A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/03 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring coordinates of points
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras

Definitions

  • The invention relates to the technical field of radar, and in particular to a radar simulation method and device.
  • a radar sensor that emits electromagnetic waves with a cone beam, for example, a millimeter wave radar
  • Millimeter wave radar, as a sensor with accurate ranging, long detection range, and all-weather operation, has become an indispensable sensor in autonomous driving technology.
  • Embodiments of the present invention provide a radar simulation method and device to solve the prior-art problem that development relying on a real radar sensor that emits a cone beam cannot avoid cumbersome field testing and high development cost.
  • In a first aspect, an embodiment of the present invention provides a radar simulation method, including:
  • determining, according to the depth map of the current frame, a plurality of pixels in the depth map, where the depth value of a pixel in the depth map represents the distance between the pixel and the camera, and the distances represented by the depth values of the plurality of pixels are less than the distances represented by the depth values of other pixels in the depth map; and outputting the detection result of the radar sensor according to the depth value of each of the plurality of pixels;
  • wherein the depth map satisfies the field of view (FOV) condition of the camera, and the FOV condition of the camera corresponds to the detection range of the radar sensor.
  • In a second aspect, an embodiment of the present invention provides a radar simulation device, including: a processor and a memory;
  • the memory is used to store program codes
  • the processor calls the program code, and when the program code is executed, it is used to perform the following operations:
  • determining, according to the depth map of the current frame, a plurality of pixels in the depth map, where the depth value of a pixel in the depth map represents the distance between the pixel and the camera, and the distances represented by the depth values of the plurality of pixels are less than the distances represented by the depth values of other pixels in the depth map; and outputting the detection result of the radar sensor according to the depth value of each of the plurality of pixels;
  • wherein the depth map satisfies the field of view (FOV) condition of the camera, and the FOV condition of the camera corresponds to the detection range of the radar sensor.
  • In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, the computer program includes at least one piece of code, and the at least one piece of code can be executed by a computer to control the computer to execute the radar simulation method according to the first aspect.
  • In a fourth aspect, an embodiment of the present invention provides a computer program which, when executed by a computer, is used to implement the radar simulation method according to the first aspect.
  • The radar simulation method and device provided in the embodiments of the present invention determine multiple pixels in the depth map according to the depth map of the current frame, and output the detection result of the radar sensor according to the depth value of each of the multiple pixels.
  • In this way, the target points detected by a radar sensor that emits a cone beam can be obtained through simulation, and the detection result of such a radar sensor can be simulated. This realizes the simulation of a cone-beam radar sensor and thereby avoids the cumbersome field testing and high development cost that result from relying on a real cone-beam radar sensor during development.
  • FIG. 1 is a schematic flowchart of a radar simulation method provided by an embodiment of the present invention
  • FIG. 2A is a schematic diagram of a detection range provided by an embodiment of the present invention.
  • FIG. 2B is a schematic diagram of a depth map provided by an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a radar simulation method provided by another embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention.
  • FIG. 6 is a schematic diagram of an output detection result provided by an embodiment of the present invention.
  • FIG. 7 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a radar simulation device provided by an embodiment of the present invention.
  • the radar simulation method provided by the embodiment of the present invention can realize the simulation of a radar sensor whose emitted electromagnetic wave is a cone beam, and the detection result of the radar sensor that emits the cone beam can be obtained through simulation by means of software calculation.
  • The radar simulation method provided by the embodiments of the present invention can be applied to any development scenario that relies on a radar sensor emitting a cone beam; it avoids dependence on real cone-beam radar sensors during development and thus solves the problem that field testing is cumbersome and development cost is high when development relies on such real sensors.
  • the radar sensor emitting the cone beam may specifically be a millimeter wave radar sensor.
  • FIG. 1 is a schematic flowchart of a radar simulation method provided by an embodiment of the present invention.
  • the execution subject of this embodiment may be a radar simulation device for implementing radar simulation, and may specifically be a processor of the radar simulation device.
  • the method of this embodiment may include:
  • Step 101 Determine multiple pixels in the depth map according to the depth map of the current frame.
  • the depth value of the pixel in the depth map represents the distance between the pixel and the camera, and the distance represented by the depth value of the plurality of pixels is less than that of other pixels in the depth map The distance indicated by the depth value.
  • For example, assume the depth map includes 12 pixels, namely pixel 1 to pixel 12, and the depth values of pixel 1 to pixel 12 increase in sequence; then, according to the depth map, the plurality of pixels may specifically be pixel 1 to pixel 3 of the 12 pixels.
  • the depth map satisfies the field of view (FOV, Field of View) condition of the camera, and the FOV condition of the camera corresponds to the detection range of the radar sensor.
  • The detection range of the radar sensor is similar to the shooting range of the camera, so when the depth map satisfies the FOV condition corresponding to the detection range of the radar sensor, the distance from the camera represented by the depth value of a pixel in the depth map simulates the distance from the radar sensor; therefore, according to the depth values of the pixels in the depth map, the multiple pixels closest to the radar sensor can be determined.
  • the multiple pixel points may be understood as multiple target points detected by the simulated radar sensor, and the multiple pixel points may correspond to the multiple target points one-to-one.
  • the FOV condition of the camera here may correspond to the detection range of the radar sensor or may have a mapping relationship.
  • The FOV condition of the camera corresponding to the detection range of the radar sensor may specifically mean that the FOV of the camera is exactly the same as the detection range of the radar sensor, or that the FOV of the camera is approximately the same as the detection range of the radar sensor.
  • a depth map corresponding to the detection range in front of the vehicle X1 in the scene shown in FIG. 2A may be as shown in FIG. 2B.
  • The number of the plurality of pixels may be a preset number, for example, 64. Assuming that the depth map includes 128 pixels, pixel 1 to pixel 128, whose depth values decrease in sequence, then when the preset number is 64 and a larger depth value indicates a closer distance, the plurality of pixels may specifically be pixel 1 to pixel 64.
  • the present invention may not limit the manner of acquiring the depth map.
  • it can be obtained by camera shooting, or it can also be obtained by rendering.
  • the radar sensor can detect with a certain sampling frequency.
  • the detection range of the radar sensor at the previous sampling time can correspond to the depth map of the previous frame, and multiple pixels can be determined according to the depth map of the previous frame;
  • the detection range of the radar sensor at the current sampling time can correspond to the depth map of the current frame, and multiple pixels can be determined according to the depth map of the current frame;
  • the detection range of the radar sensor at the next sampling time can correspond to the depth map of the next frame, and multiple pixels can be determined according to the depth map of the next frame.
  • the depth map of the current frame may be the same as or different from the depth map of the previous frame, and the depth map of the current frame may be the same as or different from the depth map of the next frame.
  • Step 102 Output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels.
  • Since the depth value of a pixel represents the distance between the pixel and the camera, the distance between each pixel and the camera can be obtained according to the depth value of each of the multiple pixels.
  • Since the multiple pixels are the target points detected by the simulated radar sensor, and the detection range of the radar sensor corresponds to the FOV of the camera, the distance between a pixel and the camera obtained in this way simulates the distance between the target point detected by the radar sensor and the radar sensor.
  • The output detection result of the radar sensor may include the distance between each pixel and the camera (which can be understood as the distance between each target point detected by the radar sensor and the radar sensor).
  • the detection result of the radar sensor may also include motion information of each pixel relative to the radar sensor.
  • the motion information of each pixel relative to the radar sensor may be the same, for example, all are preset motion information.
  • the movement information may include, for example, movement direction, movement speed and the like.
  • the depth value of the pixel can be directly used as the distance between the pixel and the radar sensor; or, the depth value of the pixel can be mathematically calculated to obtain the distance between the pixel and the radar sensor.
  • FIG. 3 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention. Based on the embodiment shown in FIG. 1, this embodiment mainly describes an optional implementation manner of step 102. As shown in FIG. 3, the method of this embodiment may include:
  • Step 301 Determine multiple pixels in the depth map according to the depth map of the current frame.
  • step 301 is similar to step 101 and will not be repeated here.
  • Step 302 Output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels and the motion information of each pixel relative to the camera.
  • The motion information of a pixel relative to the camera simulates the motion information of the corresponding target point relative to the radar sensor.
  • the correspondence between each pixel in the depth map and the motion information may be stored, and the motion information corresponding to one pixel is the motion information of the pixel relative to the camera. Further optionally, the correspondence between each pixel point and the motion information may be set in advance, or may be set by the user.
  • the motion information of the pixels relative to the camera may be determined according to the label map corresponding to the depth map.
  • the label map is used to indicate the object to which each pixel in the depth map belongs, and the object corresponds to the motion information.
  • the label map can indicate the object to which each pixel belongs by different color labels.
  • If the color labels corresponding to two pixels in the label map are both a first color, it indicates that the two pixels belong to the same object, namely the object represented by the first color.
  • the color labels corresponding to all pixels belonging to the road railing X2 may be dark green
  • the color labels corresponding to all pixels belonging to the distant house X3 may be light green.
  • the method of this embodiment may further include the following steps A and B.
  • Step A According to the label map corresponding to the depth map, determine the object to which each pixel of the plurality of pixels belongs.
  • For example, assuming the depth map includes 100 pixels, the label map indicates the object to which each of the 100 pixels belongs.
  • If the plurality of pixels determined according to the depth map are pixel 10, pixel 20, pixel 30, and pixel 40, then the object to which each of pixel 10, pixel 20, pixel 30, and pixel 40 belongs can be obtained according to the label map.
  • the object may specifically be any object that can be detected by the radar sensor, such as the ground, buildings on the ground, and so on.
  • Step B Determine the motion information of each pixel according to the object to which each pixel of the plurality of pixels belongs.
  • In this way, the motion information of each pixel relative to the camera, that is, the motion information of each pixel relative to the radar sensor, can be obtained.
  • Optionally, step 302 may specifically include: determining the distance between each of the plurality of pixels and the camera according to the respective depth values of the plurality of pixels, and outputting the detection result of the radar sensor according to the distance between each of the plurality of pixels and the camera and the motion information of each pixel relative to the camera.
  • the detection result may further include identification information indicating the object to which each pixel belongs.
  • Optionally, the objects may correspond one-to-one with target identification numbers; the method further includes: determining the target identification number of each pixel according to the object to which each of the plurality of pixels belongs (which can be understood as the target identification number of each target point).
  • the output of the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels includes: according to the depth value of each pixel in the plurality of pixels, and the target recognition of each pixel No., output the detection result of the radar sensor.
  • the output detection result of the radar sensor may include the distance between each target point and the camera, and the target identification number of each target point.
  • the outputting the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels and the motion information of each pixel relative to the camera includes:
  • the output detection result of the radar sensor may include the distance between each target point and the radar sensor, the movement information of each target point relative to the radar sensor, and the target identification number of each target point.
  • In this embodiment, multiple pixels in the depth map are determined according to the depth map of the current frame, and the detection result of the radar sensor is output according to the depth value of each of the multiple pixels and the motion information of each pixel relative to the camera.
  • This realizes the simulation of the motion information of the target points sampled by a real radar sensor, so that the detection result of the radar sensor obtained by simulation includes both the distance between each target point and the radar sensor and the motion speed of each target point relative to the radar sensor.
  • FIG. 4 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention. This embodiment is based on the embodiments shown in FIGS. 1 and 3, and mainly describes that in the simulation, the accuracy loss of the radar sensor is considered for The effect of the detection result of the radar sensor. As shown in FIG. 4, the method of this embodiment may include:
  • Step 401 According to the correspondence between depth levels and depth value ranges, update the depth value of each pixel in the depth map of the current frame to the maximum value of the depth value range corresponding to the depth level to which the depth value belongs, to obtain an updated depth map.
  • Updating the depth value of each pixel in the depth map of the current frame to the maximum value of the depth value range corresponding to the depth level to which it belongs can represent the loss of accuracy of the depth value within that depth level range: the depth values within the same depth level range are all updated to a fixed depth value within the range, namely its maximum value. It can be understood that, according to the characteristics of the accuracy loss of the radar sensor, the depth values within the same depth level range can alternatively be updated to another depth value within the range, such as the minimum value.
  • The depth value of a pixel in the depth map may specifically be any one of the 256 integers from 0 to 255.
  • depth level 1 corresponds to Depth value range 0 to 63
  • depth level 2 corresponds to depth value range 64 to 127
  • depth level 3 corresponds to depth value range 128 to 192
  • depth level 4 corresponds to depth value range 193 to 255.
  • step 401 may further include: normalizing the depth value of each pixel in the depth map of the current frame.
  • In this case, step 401 may specifically include: updating, according to the correspondence between depth levels and depth value ranges, the depth value of each pixel in the normalized depth map to the maximum value of the depth value range corresponding to its depth level, to obtain the updated depth map.
  • the depth value range is also the range after normalization.
  • Step 402 Determine a plurality of pixels in the depth map according to the updated depth map.
  • The specific way of determining multiple pixels in the updated depth map according to the updated depth map in step 402 is similar to the specific way of determining multiple pixels in the depth map according to the depth map in step 101, and will not be repeated here.
  • Step 403 Output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels.
  • step 403 is similar to step 102 or step 302 and will not be repeated here.
  • In this embodiment, the updated depth map is obtained by updating the depth value of each pixel in the depth map of the current frame to the maximum value of the depth value range corresponding to its depth level according to the correspondence between depth levels and depth value ranges; multiple pixels in the depth map are then determined based on the updated depth map, which realizes the simulation of the accuracy loss of the radar sensor.
  • FIG. 5 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention.
  • This embodiment, based on the foregoing embodiments, mainly describes an optional way of obtaining the depth map of the current frame in the simulation.
  • the method of this embodiment may include:
  • Step 501 Obtain the original depth map of the current frame.
  • the original depth map of the current frame can be obtained by image rendering. Further optionally, the original depth map of the current frame in a certain stereo scene can be obtained through image rendering according to the motion of the radar sensor.
  • the movement of the radar sensor may specifically include the movement of the carrier carrying the radar sensor.
  • each frame in multiple frames may be used as the current frame in sequence at the target frequency, where the multiple frames are consecutive multiple frames related to a stereoscopic scene.
  • the target frequency may be equal to the sampling frequency of the real radar sensor to simulate the sampling frequency of the radar sensor.
  • the target frequency is 20 Hz.
  • Step 502 Determine the yaw angle and pitch angle of each pixel in the original depth map in the camera coordinate system of the camera according to the depth value and two-dimensional coordinates of each pixel in the original depth map and the parameters of the camera.
  • the position of each pixel in the camera coordinate system can be determined according to the camera parameters and the two-dimensional coordinates of each pixel in the original image.
  • the parameters of the camera may include internal parameters of the camera.
  • the yaw angle and pitch angle of each pixel point in the camera coordinate system can be determined according to the position of each pixel point in the original depth map under the camera coordinate system and the depth value of each pixel point.
  • the yaw angle and pitch angle of each pixel point in the camera coordinate system can be understood as the yaw angle and pitch angle of each pixel point relative to the radar sensor.
  • Step 503 According to the pitch angle and yaw angle of each pixel in the original depth map in the camera coordinate system, set the depth values of the pixels in the original depth map that do not satisfy the FOV condition to a preset value, to obtain the depth map of the current frame.
  • the preset value may be used to indicate that the distance to the camera is greater than the maximum detection distance of the radar sensor.
  • If the pitch angle and yaw angle of a pixel do not satisfy the FOV condition, the pixel is outside the detection range of the radar sensor, and the radar sensor will not detect that point as a target point.
  • setting the depth value of a pixel to a preset value can be understood as excluding the pixel from the target point that can be detected by the radar sensor.
  • the FOV conditions may include horizontal FOV conditions and vertical FOV conditions.
  • the horizontal FOV condition can be used to express the restriction on the horizontal FOV for the radar sensor.
  • the vertical FOV condition can be used to express the restriction on the vertical FOV for the radar sensor.
  • The horizontal FOV condition may satisfy the rule that the farther from the camera, the smaller the horizontal FOV value, and the closer to the camera, the larger the horizontal FOV value.
  • the horizontal FOV condition includes:
  • the horizontal FOV value is equal to the first FOV value
  • the horizontal FOV value is equal to the second FOV value
  • the horizontal FOV value is equal to the third FOV value
  • the horizontal FOV value is equal to the fourth FOV value
  • the second distance threshold is greater than the first distance threshold and less than the third distance threshold; the second FOV value is greater than the third FOV value and less than the first FOV value, and the third FOV value is greater than the fourth FOV value.
  • the first distance threshold is equal to 70 meters
  • the second distance threshold is equal to 160 meters
  • the third distance threshold is equal to 250 meters.
  • the first FOV value is equal to 60°
  • the second FOV value is equal to 8°
  • the third FOV value is equal to 4°
  • the fourth FOV value is equal to 0°.
  • The vertical FOV condition may be independent of the distance from the camera. Further optionally, the vertical FOV condition includes: the vertical FOV value is equal to 10°.
  • The above depth map may be obtained through a depth map rendering step, and the above label map may be obtained by rendering the label map.
  • the above steps 501 to 503 can be understood as steps for rendering a depth map.
  • The detection result of the radar sensor may be output according to the depth map and the label map obtained by rendering, in the manner described in the foregoing embodiments. It should be noted that the rendering of the depth map and the label map of the same frame may be based on the same shooting range of the same stereo scene, and the shooting range is the detection range of the radar sensor.
  • In this embodiment, the yaw angle and pitch angle of each pixel in the original depth map in the camera coordinate system of the camera are determined; further, the depth values of the pixels in the original depth map that do not meet the FOV condition are set to the preset value to obtain the depth map of the current frame, thereby obtaining a depth map that satisfies the FOV condition of the camera, that is, a depth map corresponding to the detection range of the radar sensor.
  • FIG. 7 is a schematic flowchart of a radar simulation method according to another embodiment of the present invention. Based on the foregoing embodiment, this embodiment mainly describes the influence of ground clutter on the detection result of the radar sensor in the simulation. As shown in FIG. 7, the method of this embodiment may include:
  • Step 701 Determine multiple pixels in the depth map according to the depth map of the current frame.
  • step 701 is similar to step 101 and will not be repeated here.
  • Step 702 Determine that the ground point in the current frame interferes with the detection result of the current frame.
  • The ground point is used to simulate the influence of ground clutter on the detection result of the radar sensor, that is, the radar sensor recognizes the ground point as a target point due to the influence of ground clutter. Since the influence of ground clutter is not always present, whether the ground point of the current frame interferes with the detection result of the current frame can be determined through step 702.
  • Optionally, step 702 may specifically include: if the ground point in the previous frame of the current frame interferes with the detection result of the previous frame, determining that the ground point in the current frame interferes with the detection result of the current frame.
  • When the detection result of the previous frame is not interfered by the ground point, the detection result of the current frame may or may not be interfered by the ground point; in this case, a probability can be used to determine whether the detection result of the current frame is disturbed by the ground point.
  • Optionally, the method of this embodiment further includes: if the ground point in the previous frame does not interfere with the detection result of the previous frame, determining, according to the target probability, that the ground point in the current frame interferes with the detection result of the current frame.
  • the target probability may be a preset probability, or may be positively related to the target frame number. Further optionally, the greater the number of target frames, the greater the target probability; the target frame number is the number of consecutive frames that continue to the current frame and the ground point does not interfere with the detection result.
  • For example, assume that a ground point interferes with the detection results of frames 1 to 3, and that frame 4 does not include the ground point that interfered with the detection results of frames 1 to 3. The target frame number corresponding to frame 4 is then 1 and the target probability is probability 1; if the detection result of frame 4 is not disturbed by the ground point, the target frame number corresponding to frame 5 is 2 and the target probability is probability 2; if the detection result of frame 5 is not disturbed by the ground point, the target frame number corresponding to frame 6 is 3 and the target probability is probability 3; if the detection result of frame 6 is not disturbed by the ground point, the target frame number corresponding to frame 7 is 4 and the target probability is probability 4, where probability 4 > probability 3 > probability 2 > probability 1.
  • Assume further that the detection result of frame 7 is disturbed by the ground point, that frames 8 and 9 include the ground point that interfered with the detection result in frame 7, and that frame 10 does not include that ground point. The target frame number corresponding to frame 10 is then 1 and the target probability is probability 5; if the detection result of frame 10 is not disturbed by the ground point, the target frame number corresponding to frame 11 is 2 and the target probability is probability 6, where probability 6 > probability 5.
  • the ground point may be a randomly selected point. It can be understood that, when the randomness of the ground clutter is not considered, the ground point may also be a preset point.
  • Step 703 Determine the distance between the ground point and the camera.
  • The ground point is a pixel corresponding to the ground in the current frame; therefore, the distance between the ground point and the camera can be determined from the depth value of the ground point.
  • Step 704 Output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels and the distance between the ground point and the camera.
  • The output detection result of the radar sensor may include the distance between each pixel and the camera, and the distance between the ground point and the camera (which can be understood as the distance, relative to the radar sensor, of the target point that the radar sensor detects due to ground clutter interference).
  • the detection result of the radar sensor may also be output according to the motion information of each pixel relative to the camera. That is, the output radar detection result may further include: motion information of each pixel point relative to the camera, and motion information of the ground point relative to the camera. Since the absolute speed of the ground point is 0, the motion information of the ground point relative to the camera can be obtained according to the motion state of the camera.
  • In this embodiment, when it is determined that the ground point in the current frame interferes with the detection result of the current frame, the distance between the ground point and the camera is determined, and the detection result of the radar sensor is output according to the depth value of each of the multiple pixels and the distance between the ground point and the camera. The detection result of the radar sensor can thus include the influence of ground clutter, which realizes the simulation of ground clutter affecting the radar sensor and thereby improves the authenticity of the simulation.
  • Optionally, outputting the detection result of the radar sensor according to the depth value of each of the plurality of pixels includes: outputting the detection results of the radar sensor in order of the distances indicated by the depth values, from small to large.
  • a computer-readable storage medium is also provided in an embodiment of the present invention.
  • the computer-readable storage medium stores program instructions, and when the program is executed, it may include some or all of the radar simulation methods in the foregoing method embodiments step.
  • An embodiment of the present invention provides a computer program which, when executed by a computer, is used to implement the radar simulation method in any of the above method embodiments.
  • FIG. 8 is a schematic structural diagram of a radar simulation device provided by an embodiment of the present invention.
  • The radar simulation device 800 of this embodiment may include: a memory 801 and a processor 802, the memory 801 and the processor 802 being connected.
  • The memory 801 may include a read-only memory and a random access memory, and provides instructions and data to the processor 802.
  • A part of the memory 801 may further include a non-volatile random access memory.
  • the memory 801 is used to store program codes.
  • the processor 802 calls the program code, and when the program code is executed, it is used to perform the following operations:
  • the processor 802 is configured to output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels, specifically including:
  • the detection result of the radar sensor is output.
  • the processor 802 is further used to:
  • the label map is used to indicate the object to which each pixel in the depth map belongs, the object and the motion information correspond;
  • the motion information of each pixel is determined according to the object to which each pixel of the plurality of pixels belongs.
  • the objects correspond to the target identification numbers in one-to-one correspondence; the processor 802 is also used to:
  • the outputting the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels includes:
  • the detection result of the radar sensor is output according to the depth value of each pixel in the plurality of pixels and the target identification number of each pixel.
  • the processor 802 is used to output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels and the motion information of each pixel relative to the camera, This includes:
  • the distance between each pixel of the plurality of pixels and the camera and the motion information of each pixel relative to the camera are output as the detection result of the radar sensor.
  • the processor 802 is further used to:
  • the processor 802 is configured to determine a plurality of pixels in the depth map according to the depth map of the current frame, specifically including:
  • the processor 802 is further used to:
  • the depth value of each pixel in the depth map of the current frame is normalized.
  • the processor 802 is further used to:
  • The preset value, to which the depth value of a pixel in the original depth map that does not satisfy the FOV condition is set, is used to indicate that the distance from the camera is greater than the maximum detection distance of the radar sensor.
  • the processor 802 is used to obtain an original depth map of the current frame, which specifically includes:
  • the processor 802 is further configured to sequentially use each frame in multiple frames as the current frame at a target frequency, where the multiple frames are consecutive multiple frames related to a stereoscopic scene .
  • the target frequency is 20 Hz.
  • the processor 802 is further used to:
  • the processor 802 is configured to output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels, specifically including:
  • the detection result of the radar sensor is output according to the depth value of each pixel in the plurality of pixels and the distance between the ground point and the camera.
  • the processor 802 is used to determine that a ground point in the current frame interferes with the detection result of the current frame, specifically including:
  • the processor 802 is further used to:
  • if the ground point in the previous frame does not interfere with the detection result of the previous frame, it is determined according to the target probability that the ground point in the current frame interferes with the detection result of the current frame.
  • the greater the number of target frames, the greater the target probability.
  • the target frame number is the number of consecutive frames that continue to the current frame and the ground point does not interfere with the detection result.
  • the ground point is randomly selected.
  • the processor 802 is configured to output the detection result of the radar sensor according to the depth value of each pixel in the plurality of pixels, specifically including:
  • the detection results of the radar sensor are output in order according to the order of the distance indicated by the depth value from small to large.
  • the FOV conditions include horizontal FOV conditions and vertical FOV conditions.
  • the horizontal FOV condition includes:
  • the horizontal FOV value is equal to the first FOV value
  • the horizontal FOV value is equal to the second FOV value
  • the horizontal FOV value is equal to the third FOV value
  • the horizontal FOV value is equal to the fourth FOV value
  • the second distance threshold is greater than the first distance threshold and less than the third distance threshold; the second FOV value is greater than the third FOV value and less than the first FOV value, and the third FOV value is greater than the fourth FOV value.
  • the radar sensor is a millimeter wave radar sensor.
  • the radar simulation device provided in this embodiment can be used to execute the technical solutions of the above method embodiments of the present invention, and its implementation principles and technical effects are similar, and are not repeated here.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A radar simulation method and apparatus are provided. The radar simulation method includes the following steps: determining, according to a depth map of a current frame, multiple pixels in the depth map (101), where the depth value of a pixel in the depth map represents the distance between the pixel and a camera, and the distances represented by the depth values of the multiple pixels are less than the distances represented by the depth values of the other pixels in the depth map; and outputting a detection result of a radar sensor according to the depth value of each of the multiple pixels (102). The method realizes the simulation of a radar sensor that emits a cone-shaped beam.
PCT/CN2018/124822 2018-12-28 2018-12-28 Procédé et appareil de simulation radar WO2020133206A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880072069.1A CN111316119A (zh) 2018-12-28 2018-12-28 雷达仿真方法及装置
PCT/CN2018/124822 WO2020133206A1 (fr) 2018-12-28 2018-12-28 Procédé et appareil de simulation radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/124822 WO2020133206A1 (fr) 2018-12-28 2018-12-28 Procédé et appareil de simulation radar

Publications (1)

Publication Number Publication Date
WO2020133206A1 true WO2020133206A1 (fr) 2020-07-02

Family

ID=71127383

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/124822 WO2020133206A1 (fr) 2018-12-28 2018-12-28 Procédé et appareil de simulation radar

Country Status (2)

Country Link
CN (1) CN111316119A (fr)
WO (1) WO2020133206A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113820694A (zh) * 2021-11-24 2021-12-21 腾讯科技(深圳)有限公司 一种仿真测距的方法、相关装置、设备以及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010031068A1 (en) * 2000-04-14 2001-10-18 Akihiro Ohta Target detection system using radar and image processing
US20020118352A1 (en) * 2001-02-23 2002-08-29 Japan Atomic Energy Research Institute Fast gate scanning three-dimensional laser radar apparatus
CN102168954A (zh) * 2011-01-14 2011-08-31 浙江大学 基于单目摄像机的深度、深度场及物体大小的测量方法
CN104965202A (zh) * 2015-06-18 2015-10-07 奇瑞汽车股份有限公司 障碍物探测方法和装置
CN105261039A (zh) * 2015-10-14 2016-01-20 山东大学 一种基于深度图像的自适应调整目标跟踪算法
CN105607635A (zh) * 2016-01-05 2016-05-25 东莞市松迪智能机器人科技有限公司 自动导引车全景光学视觉导航控制系统及全向自动导引车

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8330804B2 (en) * 2010-05-12 2012-12-11 Microsoft Corporation Scanned-beam depth mapping to 2D image
CN103456038A (zh) * 2013-08-19 2013-12-18 华中科技大学 一种井下环境三维场景重建方法
US10061029B2 (en) * 2015-01-06 2018-08-28 Samsung Electronics Co., Ltd. Correction of depth images from T-O-F 3D camera with electronic-rolling-shutter for light modulation changes taking place during light integration
US10282591B2 (en) * 2015-08-24 2019-05-07 Qualcomm Incorporated Systems and methods for depth map sampling
CN107766847B (zh) * 2017-11-21 2020-10-30 海信集团有限公司 一种车道线检测方法及装置
CN107966693B (zh) * 2017-12-05 2021-08-13 成都合纵连横数字科技有限公司 一种基于深度渲染的车载激光雷达仿真方法
CN108280401B (zh) * 2017-12-27 2020-04-07 达闼科技(北京)有限公司 一种路面检测方法、装置、云端服务器及计算机程序产品
CN108564615B (zh) * 2018-04-20 2022-04-29 驭势(上海)汽车科技有限公司 模拟激光雷达探测的方法、装置、系统及存储介质


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113820694A (zh) * 2021-11-24 2021-12-21 腾讯科技(深圳)有限公司 一种仿真测距的方法、相关装置、设备以及存储介质
CN113820694B (zh) * 2021-11-24 2022-03-01 腾讯科技(深圳)有限公司 一种仿真测距的方法、相关装置、设备以及存储介质

Also Published As

Publication number Publication date
CN111316119A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
US10970864B2 (en) Method and apparatus for recovering point cloud data
US11455565B2 (en) Augmenting real sensor recordings with simulated sensor data
US11487988B2 (en) Augmenting real sensor recordings with simulated sensor data
CN108875804B (zh) 一种基于激光点云数据的数据处理方法和相关装置
CN111324115B (zh) 障碍物位置检测融合方法、装置、电子设备和存储介质
WO2020133230A1 (fr) Procédé, appareil et système de simulation de radar
US11568654B2 (en) Object recognition method and object recognition device performing the same
US20200311985A1 (en) Radio coverage map generation
US10140722B2 (en) Distance measurement apparatus, distance measurement method, and non-transitory computer-readable storage medium
CN111339876B (zh) 用于识别场景中各区域类型的方法和装置
US11092690B1 (en) Predicting lidar data using machine learning
CN108169729A (zh) 激光雷达的视场的调整方法、介质、激光雷达系统
KR20210074163A (ko) 공동 검출 및 기술 시스템 및 방법
CN111354022A (zh) 基于核相关滤波的目标跟踪方法及系统
CN115147333A (zh) 一种目标检测方法及装置
CN115984637A (zh) 时序融合的点云3d目标检测方法、系统、终端及介质
CN113820694B (zh) 一种仿真测距的方法、相关装置、设备以及存储介质
WO2020133206A1 (fr) Procédé et appareil de simulation radar
JP2018116004A (ja) データ圧縮装置、制御方法、プログラム及び記憶媒体
CN109035390B (zh) 基于激光雷达的建模方法及装置
CN113920273B (zh) 图像处理方法、装置、电子设备和存储介质
CN115407302A (zh) 激光雷达的位姿估计方法、装置和电子设备
CN116047537B (zh) 基于激光雷达的道路信息生成方法及系统
CN116577762B (zh) 仿真雷达数据生成方法、装置、设备及存储介质
EP4206723A1 (fr) Procédé et dispositif de télémétrie, support d'informations et lidar

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18944233

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18944233

Country of ref document: EP

Kind code of ref document: A1