WO2022156276A1 - Target detection method and apparatus - Google Patents

Target detection method and apparatus Download PDF

Info

Publication number
WO2022156276A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
millimeter
wave radar
target
time
Prior art date
Application number
PCT/CN2021/124194
Other languages
French (fr)
Chinese (zh)
Inventor
Huang Ziliang (黄梓亮)
Zheng Yonghao (郑永豪)
Wang Can (王灿)
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2022156276A1 publication Critical patent/WO2022156276A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02: Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50: Systems of measurement based on relative movement of target
    • G01S13/58: Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66: Radar-tracking systems; Analogous systems
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications

Definitions

  • the present application relates to the technical field of automatic driving, and in particular, to a target detection method and device.
  • The existing technology mainly tracks targets with the millimeter-wave radar and the forward-looking camera first and then fuses the results, under flat and straight road conditions, with only minor modifications to the fusion method; for curves, it fits curved lane lines to remove invalid targets and performs continuous-frame tracking of targets within the drivable space.
  • However, because sensors such as cameras and millimeter-wave radars have fixed detection ranges, blind spots may still arise in curved road conditions, causing the target obstacle to be considered lost and creating the illusion that the vehicle has no target obstacle ahead.
  • In one prior solution, the vehicle obtains the speed of the car in the blind spot through the vehicle-to-vehicle (V2V) system of the Internet of Vehicles and fuses that speed with the blind-spot boundary to form a virtual, speed-aware blind-spot boundary following model.
  • However, this solution relies on expensive V2X equipment to obtain information about the preceding vehicle, so the cost is extremely high; moreover, current V2X network deployment is immature, making the whole solution very difficult to implement.
  • the present application provides a target detection method and device for effectively identifying and tracking a target vehicle in a curve with low cost and high efficiency, thereby improving the safety and stability of the vehicle during driving.
  • a target detection method which can be applied to a vehicle or a chip inside the vehicle.
  • The method includes: the first vehicle detects the speed of the preceding vehicle (taking the second vehicle as an example) based on a millimeter-wave radar, where the speed includes linear velocity and angular velocity; the first vehicle acquires the road conditions where the first vehicle and the second vehicle are located; the first vehicle determines a first threshold according to the acquired road conditions; the first vehicle then compares the speed of the second vehicle with the first threshold, and if the speed of the second vehicle exceeds the first threshold, the first vehicle determines that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state; otherwise, the first vehicle determines that the millimeter-wave radar is in neither state.
  • In the above solution, the first vehicle determines the first threshold according to the road conditions and compares the real-time speed of the preceding vehicle (i.e., the second vehicle) with the first threshold to determine whether the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
  • In this way, the first vehicle can promptly and effectively identify the loss of the millimeter-wave radar target caused by the road conditions, avoiding the illusion that the first vehicle has no target obstacle ahead and thereby improving the safety and stability of the vehicle during driving.
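The comparison step described above can be sketched as follows. The numeric threshold values here are invented placeholders for illustration only; in the patent, the first threshold N is derived from the road geometry and the radar parameters.

```python
# Hypothetical first-threshold values per road condition (placeholders only;
# the patent derives N from radar beam geometry and curve parameters).
THRESHOLDS = {
    "ego_straight_target_curve": 0.12,  # rad/s (target's angular velocity)
    "ego_curve_target_straight": 25.0,  # m/s   (target's linear velocity)
    "both_in_curve": 0.10,              # rad/s (target's angular velocity)
}

def radar_target_state(road_condition: str, target_speed: float) -> str:
    """Compare the measured speed of the preceding (second) vehicle with the
    road-condition-dependent first threshold N and classify the radar state."""
    n = THRESHOLDS[road_condition]
    if target_speed > n:
        return "target lost or about to be lost"
    return "target tracked"
```

If the returned state indicates loss, the first vehicle can fall back to camera-based tracking and adjust the radar attitude, as the description explains below.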
  • the first vehicle may use a camera to acquire an RGB image in front of the first vehicle, and then determine the road condition where the second vehicle is located according to the RGB image.
  • the road condition in front of the first vehicle can be identified efficiently and quickly, that is, the road condition where the second vehicle is located can be determined.
  • the first vehicle determines the road condition on which the second vehicle is located according to the RGB image.
  • Specifically, feature points may be extracted from the lane lines in the far field of view included in the RGB image; the turning point and direction of the road where the second vehicle is located are then determined from the extracted feature points, yielding the road condition where the second vehicle is located.
  • Because the embodiment of the present application only processes the lane lines in the far field of view, the amount of computation is reduced and computational efficiency is improved while accuracy is maintained.
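As an illustration of this step (not the patent's exact procedure), the turning direction can be inferred from the sign of the discrete curvature of the lane-line feature points' lateral offsets:

```python
def curve_direction(points):
    """Infer the road's turning direction from far-field lane-line feature
    points, given as (x, y) image coordinates ordered from near to far.

    Uses the mean second difference of the lateral (x) coordinate as a
    discrete-curvature estimate; the sign convention (positive = right)
    assumes x grows to the right of the image.  Illustrative sketch only.
    """
    if len(points) < 3:
        return "straight"
    xs = [p[0] for p in points]
    second_diffs = [xs[i + 1] - 2 * xs[i] + xs[i - 1]
                    for i in range(1, len(xs) - 1)]
    mean = sum(second_diffs) / len(second_diffs)
    if mean > 0.5:
        return "right"
    if mean < -0.5:
        return "left"
    return "straight"
```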
  • the first vehicle may acquire location information of the first vehicle based on the positioning device, and then determine the road condition where the first vehicle is located according to the location information.
  • The positioning device may be GPS, the BeiDou system, or another positioning system, and is used to receive satellite signals and locate the current position of the first vehicle.
  • the positioning system may also be visual positioning, millimeter wave radar positioning, fusion positioning, etc., which is not limited in this application.
  • After obtaining the location information of the first vehicle, the first vehicle determines the road conditions where it is located based on a map, such as being on a curve, on a straight road, or on an uphill or downhill section.
  • the first vehicle can quickly and accurately obtain the road conditions where it is located.
  • The first threshold may be determined according to the critical speed at which the second vehicle passes from inside the detection range of the millimeter-wave radar to outside it; for example, the first threshold is less than or equal to this speed threshold.
  • When the speed of the second vehicle is much greater than the speed threshold, the second vehicle exceeds the detection range of the millimeter-wave radar and enters its detection blind spot, so the millimeter-wave radar is in a target-lost state; when the speed of the second vehicle is near the critical value, the second vehicle may exceed the detection range of the millimeter-wave radar at any moment and is about to enter the detection blind spot, so the millimeter-wave radar is in a target-about-to-be-lost state.
  • the design of the first threshold may be different according to different road conditions, and three specific examples are given below for description.
  • Example 1: the road conditions are that the first vehicle is on a straight road and the second vehicle is in a curve. Detecting the speed of the second vehicle based on the millimeter-wave radar may then include: detecting, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle making a circular motion around the curve; the first threshold N determined according to the road conditions conforms to:
  • ω_r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1; the formula also involves the Euclidean distance between the first vehicle and the second vehicle at time t0, the displacement of the second vehicle from time t0 to time t1, and half of the detection beam angle of the millimeter-wave radar; time t0 is the time when the millimeter-wave radar collects the first frame of data, and time t1 is the time when the millimeter-wave radar collects the second frame of data.
  • The first frame of data and the second frame of data are two consecutive frames; K is a coefficient greater than 0 and less than or equal to 1; the formula also involves the central angle formed at the center O of the curve between the position A of the first vehicle and the position B of the second vehicle at t0.
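For concreteness, the angular velocity of the second vehicle between two consecutive radar frames can be estimated from its positions relative to the curve center. This is an illustrative sketch (the curve center O and the frame interval are assumed known), not the patent's formula for the threshold N:

```python
import math

def angular_velocity(pos_t0, pos_t1, center, dt):
    """Estimate the angular velocity (rad/s) of a vehicle moving on a
    circular arc from its (x, y) positions at two consecutive radar
    frames, given the curve center and the frame interval dt = t1 - t0."""
    a0 = math.atan2(pos_t0[1] - center[1], pos_t0[0] - center[0])
    a1 = math.atan2(pos_t1[1] - center[1], pos_t1[0] - center[0])
    # wrap the angle difference into (-pi, pi] before dividing by dt
    dtheta = math.atan2(math.sin(a1 - a0), math.cos(a1 - a0))
    return dtheta / dt
```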
  • Example 2: the road conditions are that the first vehicle is in a curve and the second vehicle is on a straight road. Detecting the speed of the second vehicle based on the millimeter-wave radar may then include: detecting, based on the millimeter-wave radar, the instantaneous running speed of the second vehicle on the straight road; the first threshold N determined according to the road conditions conforms to:
  • Example 3: the road conditions are that both the first vehicle and the second vehicle are in a curve. Detecting the speed of the second vehicle based on the millimeter-wave radar may then include: detecting, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle making a circular motion around the curve; the first threshold N determined according to the road conditions conforms to:
  • ω_r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1;
  • v is the instantaneous speed of the first vehicle;
  • the formula also involves half of the detection beam angle of the millimeter-wave radar; time t0 is the time when the millimeter-wave radar collects the first frame of data, and time t1 is the time when it collects the second frame of data;
  • Q is a coefficient greater than 0 and less than or equal to 1; the formula also involves the Euclidean distance between the first vehicle and the second vehicle at time t0.
  • The road conditions are not limited to curve scenarios; for road scenarios such as uphill/downhill and acceleration/deceleration, the same idea can be used to design preset conditions.
  • The first vehicle may also control the millimeter-wave radar to rotate around the Z_R axis and/or the X_R axis according to the road conditions, so that the millimeter-wave radar can detect the second vehicle again; the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the driving direction of the first vehicle.
  • The degrees of freedom of the above two directions are independent of each other (i.e., decoupled) both in the mechanical structure and in the control logic, so the first vehicle can adjust only one of the degrees of freedom or adjust both at the same time.
  • The first vehicle may control the millimeter-wave radar to rotate around the Z_R axis by a first angle.
  • The first vehicle may control the millimeter-wave radar to rotate around the X_R axis by a second angle.
  • The first vehicle may control the millimeter-wave radar to rotate around the Z_R axis by a third angle and around the X_R axis by a fourth angle.
  • The rotation of the millimeter-wave radar around the Z_R axis and its rotation around the X_R axis are mutually independent, which can improve the accuracy and efficiency of the radar's attitude adjustment, thereby further improving the accuracy and efficiency of target detection and tracking.
  • After the rotation, the calibration matrix of the millimeter-wave radar can be updated in real time according to the rotation angles.
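The decoupled two-axis adjustment and the calibration update can be sketched as follows; the convention that the calibration matrix is post-multiplied by the rotation is an assumption for illustration, not specified by the patent:

```python
import math

def rot_z(yaw):
    """Rotation about the vertical Z_R axis (left-right adjustment)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(pitch):
    """Rotation about the horizontal X_R axis (up-down adjustment)."""
    c, s = math.cos(pitch), math.sin(pitch)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def update_calibration(calib, yaw, pitch):
    """Update the radar calibration matrix after rotating the radar by
    `yaw` about Z_R and `pitch` about X_R.  The two degrees of freedom are
    decoupled: either angle may be zero and the other applied alone."""
    return matmul(calib, matmul(rot_z(yaw), rot_x(pitch)))
```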
  • The first vehicle can detect and track targets based on the camera and the millimeter-wave radar respectively, and perform target fusion according to IoU fusion rules. Specifically, the first vehicle acquires the RGB image in front of it based on the camera and performs target recognition on the RGB image with a target recognition model to obtain a first recognition result, which includes the location and type of the second vehicle; the input of the target recognition model is an RGB image, and its output is the position and type of the target. The first vehicle obtains the radar trace data in front of it based on the millimeter-wave radar and processes the radar trace data to obtain a second recognition result.
  • The second recognition result includes the position and speed of the second vehicle. When the first vehicle determines that the IoU of the area where the second vehicle is located in the RGB image and the area where the second vehicle is located in the radar trace data is greater than a second threshold M, the first recognition result and the second recognition result are fused to obtain fusion data; otherwise, they are not fused.
  • The second threshold M, the curvature of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle satisfy the following relationship:
  • a and b are preset coefficients.
  • The second threshold M may also be related to other factors, such as the acceleration of the first vehicle, which is not limited here.
  • Because the second threshold M is related to the road condition (the curvature of the curve where the first vehicle and/or the second vehicle is located) and to the driving state of the first vehicle (i.e., the driving speed V and the driving distance L), the accuracy of fusion recognition can be improved, thereby further improving the safety and stability of the vehicle during driving.
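The IoU gate can be sketched as follows; the axis-aligned box format (x1, y1, x2, y2) and any concrete threshold value are assumptions for illustration:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2),
    e.g. a camera detection box and a radar-trace ROI projected into the image."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def should_fuse(camera_box, radar_box, second_threshold_m):
    """Fuse the two recognition results only when the IoU of the camera box
    and the radar-trace box exceeds the second threshold M."""
    return iou(camera_box, radar_box) > second_threshold_m
```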
  • In a possible design, when the millimeter-wave radar is in the target-lost or target-about-to-be-lost state, the first vehicle may track the second vehicle based on the first recognition result obtained when the target was lost or about to be lost; alternatively, when the millimeter-wave radar is not in such a state, the first vehicle may track the second vehicle based on continuous multi-frame fusion data.
  • the first vehicle adopts different tracking mechanisms according to different millimeter wave radar states, which can improve the accuracy of target tracking, thereby further improving the safety and stability of the vehicle during driving.
  • The first vehicle performs target recognition on the RGB image based on a target recognition model, which may specifically include: when the millimeter-wave radar is in the target-lost or target-about-to-be-lost state, using a lightweight target recognition model to perform target recognition on the RGB images collected by the camera; when the millimeter-wave radar is not in such a state, using a heavyweight target recognition model to perform target recognition on the RGB images collected by the camera.
  • The recognition speed of the lightweight target recognition model is higher than that of the heavyweight target recognition model, while its recognition accuracy is lower.
  • two different target recognition models are designed for camera target recognition.
  • When the millimeter-wave radar is in the target-lost or target-about-to-be-lost state, the lightweight recognition model is used to perform target recognition on the data collected by the camera to improve the recognition speed.
  • When the millimeter-wave radar is not in such a state, the heavyweight recognition model is used to perform target recognition on the data collected by the camera to improve the recognition accuracy, so that both the speed and the accuracy of fusion recognition can be achieved.
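The state-dependent selection of recognition model and tracking source can be sketched as follows (the return labels are placeholders, not names from the patent):

```python
def select_recognition_model(radar_lost: bool) -> str:
    """Pick the camera recognition model by radar state: a fast lightweight
    model while the target is lost (or about to be lost), a slower but more
    accurate heavyweight model otherwise."""
    return "lightweight" if radar_lost else "heavyweight"

def tracking_source(radar_lost: bool) -> str:
    """Track from camera-only recognition results while the radar target is
    lost; otherwise track from continuous multi-frame fusion data."""
    return "camera-only" if radar_lost else "fused multi-frame"
```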
  • In a second aspect, a target detection device is provided, and the device is located in a first vehicle, for example, as a chip provided inside the vehicle.
  • The device includes modules, units, or means corresponding to the steps performed by the first vehicle in the first aspect or any possible design of the first aspect; these functions can be implemented by software, by hardware, or by hardware executing corresponding software.
  • For example, the apparatus may include: a detection module for detecting the speed of the second vehicle based on a millimeter-wave radar, where the second vehicle is located in front of the first vehicle and the speed includes a linear velocity and an angular velocity; and a processing module for obtaining the road conditions where the first vehicle and the second vehicle are located, determining the first threshold according to the road conditions, and, if the speed of the second vehicle exceeds the first threshold, determining that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
  • In a third aspect, a computer-readable storage medium is provided, comprising a program or instructions; when the program or instructions are run on a computer, the method in the first aspect or any possible design of the first aspect is performed.
  • In a fourth aspect, a target detection device is provided, comprising a processor and a memory; the memory stores instructions executable by the processor, and the processor executes the instructions stored in the memory to cause the device to perform the method in the first aspect or any possible design of the first aspect.
  • In a fifth aspect, a chip is provided, which is coupled to a memory and is used to read and execute program instructions stored in the memory, so as to implement the method in the first aspect or any possible design of the first aspect.
  • In a sixth aspect, a computer program product is provided; the computer program product stores instructions that, when run on a computer, cause the computer to perform the method in the first aspect or any possible design of the first aspect.
  • In a seventh aspect, a vehicle is provided, comprising a target detection device, a millimeter-wave radar, and a camera; the target detection device is configured to control the millimeter-wave radar and the camera to implement the method in the first aspect or any possible design of the first aspect.
  • In the above aspects, the target detection device can detect the speed of another vehicle in front of the current vehicle through the millimeter-wave radar and obtain the road conditions where the first vehicle and the second vehicle are located through the camera; the target detection device further determines the first threshold according to the road conditions, and if the speed of the other vehicle exceeds the first threshold, it determines that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
  • FIG. 1 is a schematic diagram of the relationship between the detection beam angle and the detection distance of the millimeter-wave radar;
  • FIG. 2 is a schematic diagram of the detection range of the camera and the detection range of the millimeter-wave radar;
  • FIG. 3A is a possible application scenario applicable to the embodiment of the present application;
  • FIG. 3B is another possible application scenario applicable to the embodiment of the present application;
  • FIG. 3C is another possible application scenario applicable to the embodiment of the present application;
  • FIG. 3D is another possible application scenario applicable to the embodiment of the present application;
  • FIG. 4 is a possible vehicle architecture diagram;
  • FIG. 5 is a flowchart of a target detection method provided by an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a first vehicle on a straight road and a second vehicle in a curve;
  • FIG. 7 is a schematic diagram of a first vehicle in a curve and a second vehicle on a straight road;
  • FIG. 8 is a schematic diagram of a first vehicle in a curve and a second vehicle in a curve;
  • FIG. 9 is a schematic diagram of the three-dimensional coordinate system of the millimeter-wave radar;
  • FIG. 10 is a schematic diagram of adjusting the attitude of the millimeter-wave radar;
  • FIG. 11 is a schematic diagram of the millimeter-wave radar coordinate system and the camera coordinate system;
  • FIG. 12 is a flowchart of a target tracking method provided by an embodiment of the present application;
  • FIG. 13 is a schematic diagram of the ROI corresponding to the detection result of the millimeter-wave radar and the ROI corresponding to the detection result of the camera;
  • FIG. 14 is a schematic diagram of the intersection and union of the millimeter-wave radar detection results and the camera detection results;
  • FIG. 15 and FIG. 16 are schematic diagrams of an uphill scene;
  • FIG. 17 is a system architecture diagram provided by an embodiment of the present application;
  • FIG. 18 is a schematic structural diagram of a target detection apparatus 180 provided by an embodiment of the present application;
  • FIG. 19 is a schematic structural diagram of another target detection apparatus 190 provided by an embodiment of the present application.
  • The camera can basically cover short-range target obstacles, while long-range target obstacles must rely on the millimeter-wave radar.
  • FIG. 1 is a schematic diagram of the relationship between the detection beam angle and the detection distance of the millimeter wave radar.
  • The detection beam angle changes with the detection distance: the detection angle of the millimeter-wave radar in the far field of view is smaller than that in the near field of view. Therefore, in curved road conditions, the target obstacle (such as the preceding vehicle) detected and tracked by the following vehicle is likely to exceed the detection area of the following vehicle's millimeter-wave radar, as shown in FIG. 2.
  • The detection range of the camera is generally a sector, and the field of view (FOV) of commonly used cameras is 60°, 90°, 150°, and so on.
  • The detection distance of a camera is inversely related to its field of view; for example, a camera with a 90° FOV has a maximum detection distance of 80 meters, while a camera with a 150° FOV has a detection distance much shorter than 80 meters.
  • In short, the detection range of the camera (i.e., the detection area of the camera) is wide, but its detection distance is shorter than that of the radar.
  • The accurate position and speed of target obstacles come from the detection data of the millimeter-wave radar. Therefore, when the target exceeds the detection range of the millimeter-wave radar, the following vehicle cannot obtain the target obstacle's accurate position, speed, and other information; the target obstacle is then considered lost by the following vehicle, resulting in the illusion of no target obstacle.
  • The embodiments of the present application provide a target detection method and device, which introduce consideration of road detection blind spots, quickly identify special road conditions according to the positioning information and visual information of the vehicle, and use kinematic physical quantities (such as the target's angular velocity or velocity) to determine whether sensors such as the millimeter-wave radar are in a target-lost state or a target-about-to-be-lost state, so as to avoid the illusion that the vehicle has no target obstacle.
  • The embodiment of the present application compensates and corrects the attitude of the radar in two degrees of freedom (left-right and up-down rotation).
  • the adjustment of each degree of freedom is independent and does not depend on each other, so the accuracy and efficiency of the adjustment can be improved.
  • a plurality of means two or more. At least one means one or more.
  • "And/or" which describes the association relationship of the associated objects means that there can be three kinds of relationships, for example, A and/or B, which can mean that A exists alone, A and B exist at the same time, and B exists alone.
  • the character “/” generally indicates that the associated objects are an "or” relationship. Words such as “first” and “second” are only used for the purpose of distinguishing and describing, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
  • FIG. 3A is a possible application scenario to which the embodiment of the present application is applicable, and the scenario may be scenarios such as automatic driving, driving assistance, and manual driving.
  • the scene includes at least two vehicles, and FIG. 3A takes two vehicles as an example.
  • a radar sensor such as a millimeter-wave radar sensor, is also arranged in front of the rear vehicle (ie, the self-vehicle), and a camera is also arranged, and the rear vehicle can detect and track the preceding vehicle through the millimeter-wave radar and the camera.
  • both the leading and trailing vehicles are in a curve.
  • FIG. 3B is another possible application scenario to which the embodiment of the present application is applicable, and the scenario may be scenarios such as automatic driving, driving assistance, and manual driving.
  • In this scenario, the preceding vehicle is in a curve, and the following vehicle (i.e., the ego vehicle) is on a straight road.
  • FIG. 3C is another possible application scenario to which the embodiment of the present application is applicable, and the scenario may be scenarios such as automatic driving, driving assistance, and manual driving.
  • In this scenario, the vehicle behind (i.e., the ego vehicle) is in a curve, and the vehicle in front is on a straight road.
  • FIGS. 3A, 3B, 3C, and 3D take curved road conditions as examples; the embodiments of the present application can also be applied to other road conditions that may lead to vehicle detection blind spots, such as uphill/downhill and acceleration/deceleration, which this application does not limit.
  • the target detection method provided by the embodiment of the present application can be applied to the vehicle in the above-mentioned scenario.
  • The method can be carried, through software, in a computing device of the vehicle; the computing device is, for example, a separate vehicle-mounted device (such as a vehicle-mounted computer or a driving control device), or one or more processing chips or integrated circuits, which are not limited in this application.
  • FIG. 4 is a possible vehicle architecture diagram, including a computing device, a control device, and vehicle-mounted sensors. The various components in the vehicle are described below.
  • Vehicle-mounted sensors are used to collect various sensor data of the vehicle in real time.
  • The sensors installed on the vehicle in the embodiments of the present application include, for example, a camera, a positioning system, a radar sensor, an attitude sensor, and the like.
  • other sensors may also be included, such as a shaft rotational speed sensor, a wheel speed sensor, etc., which are not limited in this application.
  • the radar sensor may be referred to as radar.
  • The radar can measure radar trace data of the surrounding environment and can also measure information about obstacles in a set direction of the vehicle, such as the position and speed (including linear velocity and angular velocity) of the preceding vehicle.
  • the radar in the embodiments of the present application mainly takes a millimeter-wave radar as an example.
  • A millimeter-wave radar is a radar operating in the millimeter-wave band; millimeter wave generally refers to the 30-300 GHz frequency band (wavelength 1-10 mm). The wavelength of the millimeter wave lies between the centimeter wave and the light wave, so millimeter waves combine some of the advantages of microwave guidance and photoelectric guidance. Installing a millimeter-wave radar on a car makes it possible to measure the distance, angle, and relative speed from the radar to the object being measured.
  • Millimeter-wave radar can realize Advanced Driver Assistance System (ADAS) features such as adaptive cruise control, forward collision warning, blind spot detection, parking aid, lane change assist, and autonomous cruise control.
  • the millimeter-wave radar may be linked with the follower mechanism, and when the follower mechanism rotates, the attitude of the millimeter-wave radar changes accordingly.
  • the follow-up mechanism can be a component of the millimeter-wave radar, or it can be a supporting device of the millimeter-wave radar, which is not limited here.
  • An attitude sensor is a high-performance three-dimensional motion attitude measurement system based on Micro-Electro-Mechanical Systems (MEMS) technology.
  • the attitude sensor is installed on the radar and is used to detect the attitude of the radar.
  • Cameras can be deployed around the vehicle and collect environmental parameters around the vehicle.
  • at least one camera may be installed on the front and rear bumpers, side mirrors, windshield, and roof of the vehicle, respectively.
  • The camera can at least acquire an image in front of the vehicle, so that the computing device can determine from the image the type of the target obstacle (for example, the vehicle in front) in the driving direction of the vehicle, as well as the distance between the vehicle and the target obstacle, the location of the target obstacle, and the road conditions.
  • the embodiments of the present application take a combination of a camera and a millimeter-wave radar as an example of the vehicle-mounted sensors.
  • this application does not limit the specific type of sensor.
  • the sensors of a vehicle can also be a combination of solid-state lidar and camera, a combination of millimeter-wave radar and solid-state lidar, a combination of two cameras, a combination of millimeter-wave radar, solid-state lidar, and camera, etc.
  • as long as a sensor with a large detection range, short detection distance, and less accurate detection results is used in conjunction with a sensor with a small detection range, long detection distance, and accurate detection results, the embodiments of the present application are applicable.
  • the positioning system is used to locate the current position of the vehicle.
  • the positioning system may be a global positioning system (Global Positioning System, GPS), a Beidou system or other positioning systems, for receiving satellite signals and locating the current position of the vehicle.
  • the positioning system may also be visual positioning, radar positioning, fusion positioning, etc., which are not limited in this application, and GPS is mainly used as an example in the following text.
  • the computing device is responsible for the computing function: during driving, it collects in real time the parameter information of the surrounding environment of the vehicle from the various sensors installed on the vehicle (such as cameras, radars, and the positioning system), calculates and analyzes the parameter information, and can determine the vehicle control command according to the analysis result.
  • Various computing functions may be provided in the computing device, for example: perceiving the surrounding environment during driving according to the parameter information collected by the camera and the radar; determining the geographic location of the vehicle according to the parameter information collected by the positioning system; analyzing the parameter information collected by the camera, radar, positioning system, and other sensors to determine the state of the vehicle, for example, whether the radar is in a target loss state or a state where the target is about to be lost; and generating control instructions according to the collected parameter information and sending them to the control device so that the control device controls the corresponding sensors. For example, when the radar is in, or about to be in, the target loss state, the computing device can generate an instruction to rotate the radar's follow-up mechanism and send the instruction to the control device; the control device then controls the rotation of the follow-up mechanism, adjusting the attitude of the radar so that the radar can regain the target.
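As an illustrative sketch only, the decision flow described above might be structured as follows; the names `Command`, `radar_lost_target`, and `decide` are hypothetical and not taken from this application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    """Follow-up-mechanism command sent from the computing device to the control device."""
    yaw_delta: float    # rotation around the Z_R axis, degrees
    pitch_delta: float  # rotation around the X_R axis, degrees

def radar_lost_target(target_speed: float, first_threshold: float) -> bool:
    """Loss condition: the tracked vehicle's speed reaches the first threshold."""
    return target_speed >= first_threshold

def decide(target_speed: float, first_threshold: float,
           yaw_err: float, pitch_err: float) -> Optional[Command]:
    """Return a rotation command when the target is (about to be) lost, else None."""
    if radar_lost_target(target_speed, first_threshold):
        return Command(yaw_delta=yaw_err, pitch_delta=pitch_err)
    return None  # target still tracked; no attitude adjustment needed
```

The control device (MCU) would receive such a command and translate it into motor voltages, as described later.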
  • the computing device may be a Mobile Data Center (MDC).
  • the MDC is the local computing platform of the autonomous vehicle.
  • the autonomous driving software running on the MDC includes the camera perception algorithm, the millimeter-wave radar perception algorithm, the target fusion tracking algorithm of this solution (that is, the algorithm corresponding to the target tracking method based on the fusion of radar information and camera information), etc.
  • the MDC also runs some simple operations, such as motor control.
  • the control device may be a micro-control unit (MCU), also called a single-chip microcomputer (SCM).
  • the micro-control unit (MCU) can appropriately reduce the frequency and specifications of the central processing unit (CPU) and integrate memory, a counter (timer), universal serial bus (USB), analog-to-digital (AD) conversion, a universal asynchronous receiver-transmitter (UART), a programmable logic controller (PLC), direct memory access (DMA), at least one other peripheral interface, and even a liquid crystal display (LCD) driver circuit on a single chip, forming a chip-level computing device that can perform different combinations of control for different applications.
  • after receiving the vehicle control command sent by the computing device, the control device can control the vehicle sensors through the vehicle control interface (for example, control the attitude of the radar), so as to realize auxiliary control of the vehicle.
  • the structure of the in-vehicle device shown in FIG. 4 does not constitute a limitation on the in-vehicle device; the in-vehicle device provided in the embodiments of the present application may include more or fewer modules than those shown in the figure, may combine certain modules, or may use a different arrangement of components, which is not limited in this application.
  • the in-vehicle device may also include braking mechanisms (such as brakes, accelerators, gears, etc.), human-computer interaction input and output components (such as display screens, etc.), wireless communication modules, communication interfaces, and the like.
  • FIG. 5 is a flowchart of a target detection method provided in an embodiment of the present application.
  • the method can be applied to the vehicle shown in FIG. 4 , and the method includes:
  • the first vehicle acquires the road conditions where the first vehicle (own vehicle) and the second vehicle (the preceding vehicle) are located, and acquires the speed of the second vehicle.
  • the first vehicle is equipped with sensors such as a camera, a millimeter-wave radar, and a positioning system
  • the computing device collects sensor data collected by each sensor in real time.
  • the camera can obtain the image information in front of the first vehicle in real time and detect and track the second vehicle (for example, detect the type and position of the second vehicle)
  • the positioning system can locate the position of the first vehicle in real time
  • the millimeter-wave radar can detect and track the target obstacle (taking the second vehicle as an example herein), e.g., detect the position and speed of the second vehicle.
  • for the specific structure of the first vehicle, reference may be made to the structure shown in FIG. 4, which will not be described in detail here.
  • the computing device of the first vehicle may determine the road condition in which the first vehicle is located based on the positioning system. Specifically, the computing device of the first vehicle can obtain the position of the first vehicle on the map based on the positioning system, so as to determine the road condition on which it is located, such as a curve, a straight road, an uphill, or a downhill, based on the map.
  • the computing device of the first vehicle may also determine the distance traveled by the first vehicle based on the positioning system. For example, referring to FIG. 6, FIG. 7, or FIG. 8, at time t0 the positioning system locates the first vehicle at point A, and at time t1 it locates the first vehicle at point A'. From the positions of points A and A', the computing device can calculate the distance from point A to point A', that is, the displacement of the first vehicle from time t0 to time t1.
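As a minimal sketch, assuming the positioning fixes A and A' have already been projected into a planar (x, y) coordinate system in metres (raw GPS output would first require such a projection, which is omitted here), the displacement could be computed as:

```python
import math

def displacement(a, a_prime):
    """Straight-line distance between two planar positioning fixes (x, y), in metres.

    `a` is the fix at time t0 (point A); `a_prime` the fix at time t1 (point A').
    """
    return math.hypot(a_prime[0] - a[0], a_prime[1] - a[1])
```

For example, fixes at (0, 0) and (3, 4) give a displacement of 5 metres.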
  • the computing device of the first vehicle may also acquire the road conditions where the second vehicle is located through the camera. Specifically, the computing device of the first vehicle controls the camera to capture an image in front of the first vehicle; the image includes information about the second vehicle and the road where the second vehicle is located, such as the curvature radius of the lane line or other road signs. Based on the image, the computing device determines the road condition of the second vehicle, such as a curve, a straight road, an uphill, or a downhill.
  • the computing device of the first vehicle may also comprehensively determine information such as the position and speed of the second vehicle according to the data detected by the camera and the millimeter-wave radar.
  • the specific implementation can refer to the target tracking method based on the fusion of millimeter-wave radar information and camera information, which is described later, and will not be introduced in detail here.
  • the first vehicle determines that the millimeter-wave radar of the first vehicle is in a target loss state or a state where the target is about to be lost.
  • the speed of the second vehicle includes two kinds, angular velocity and linear velocity, and the preset condition includes whether the speed of the second vehicle is greater than or equal to a set first threshold.
  • the set first threshold should be less than or equal to the speed of the second vehicle when it travels on the boundary of the detection range of the millimeter-wave radar (that is, the critical speed separating the second vehicle being within the detection range of the millimeter-wave radar from being outside it).
  • if the first threshold is equal to the speed of the second vehicle when it travels on the boundary of the detection range of the millimeter-wave radar, then when the speed of the second vehicle exceeds the first threshold, the second vehicle leaves the detection range and enters the blind spot of the millimeter-wave radar; at this time, the millimeter-wave radar is in the target loss state. When the speed of the second vehicle is equal to the first threshold (or is less than the first threshold with a difference smaller than a preset value), the millimeter-wave radar is in the state where the target is about to be lost.
  • if the first threshold is smaller than the speed of the second vehicle when it travels on the boundary of the detection range of the millimeter-wave radar, and the difference between that critical speed and the first threshold is set to ΔX, then: when the speed of the second vehicle exceeds the first threshold and the difference between the speed and the first threshold is less than or equal to ΔX, or the speed of the second vehicle is equal to the first threshold, the millimeter-wave radar is in the state where the target is about to be lost; when the speed of the second vehicle exceeds the first threshold and the difference is greater than ΔX, the millimeter-wave radar is in the target loss state.
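A minimal sketch of the threshold logic above, assuming the first threshold has been set ΔX below the critical boundary speed; the function name and the returned state labels are illustrative, not terminology from this application:

```python
def radar_state(v_target: float, first_threshold: float, delta_x: float) -> str:
    """Classify the millimeter-wave radar tracking state.

    v_target:        speed of the second vehicle (angular or linear, consistent units)
    first_threshold: the set first threshold, assumed delta_x below the critical speed
    delta_x:         margin between the first threshold and the critical boundary speed
    """
    excess = v_target - first_threshold
    if excess > delta_x:
        return "target lost"             # beyond the critical boundary speed
    if 0 <= excess <= delta_x:
        return "target about to be lost" # at or above the threshold, within the margin
    return "tracking"                    # below the threshold
```

With `first_threshold = 6` and `delta_x = 3`, a target speed of 10 classifies as lost, 8 as about to be lost, and 5 as still tracked.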
  • alternatively, the state where the target is about to be lost can also be treated as the target loss state (that is, the target loss state then includes both actual loss and impending loss).
  • when the millimeter-wave radar is in the target loss state or the state where the target is about to be lost, the millimeter-wave radar can also be described, for example, as being in a frame-loss state (this state can be understood as the millimeter-wave radar failing to detect a data frame containing the target), which is not limited in this application.
  • under different road conditions, the preset conditions may differ; for example, the first threshold takes different values. In the examples below, the computing device of the first vehicle determines whether the millimeter-wave radar of the first vehicle is in the target loss state or the state where the target is about to be lost; the detection range of the millimeter-wave radar is fixed relative to the first vehicle, and the first threshold differs between examples.
  • Example 1: Referring to FIG. 6, when the first vehicle is on a straight road and the second vehicle is on a curve, the preset conditions include:
  • Time t0 is the time when the millimeter-wave radar collects the first frame of data
  • time t1 is the time when the millimeter-wave radar collects the second frame of data
  • the first frame of data and the second frame of data can be two consecutive frames of data, or two frames of data separated by a small number of frames (for example, separated by 1 or 2 frames). In the embodiments of the present application, two consecutive frames of data are mainly used as an example.
  • the boundary position may be an area or a range.
  • the position within the ⁇ L area from the boundary line of the radar detection range may be defined as the boundary position of the millimeter wave radar detection range.
  • ⁇ r ' is the instantaneous angular velocity of the second vehicle making a circular motion around the curve at time t2; N is the first threshold;
  • ω r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1 (the average angular velocity; when the interval between t0 and t1 is extremely short (for example, the time to collect one frame of data), it can also be regarded as the instantaneous angular velocity), t2≥t1>t0; is the displacement of the first vehicle from time t0 to time t1;
  • is the angle value of the central angle formed between A, B, and O at time t0;
  • is half of the detection beam angle of the millimeter-wave radar, which is determined by the characteristics of the millimeter-wave radar;
  • the coordinates of A and A' can be obtained by the positioning system, and the position coordinates of B and B' can be obtained by the target tracking method based on the fusion of millimeter-wave radar information and camera information.
  • the target tracking method will be introduced in further detail later.
  • ⁇ , ⁇ r , etc. can be calculated from the geometric points of the position coordinates of A, A', B, B' output by the target tracking method.
  • ⁇ r ' may be detected by the millimeter-wave radar or calculated according to the linear velocity v r ' of the second vehicle detected by the millimeter-wave radar.
  • K is the first preset coefficient, and the value range is (0, 1].
  • the value of K may be related to the execution time for the first vehicle to adjust the follow-up mechanism of the millimeter-wave radar and/or the sharpness of the curve. The longer the execution time, the smaller the K; the shorter the execution time, the larger the K;
  • t is the execution time
  • R is the radius of the curve
  • a is the undetermined coefficient
  • the first vehicle is on a straight road, and the target vehicle is on a curve.
  • the first vehicle is at position A
  • the target vehicle is at position B
  • the position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A.
  • the first vehicle is at A' and the target vehicle is at B'
  • the B' position is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at the A' position.
  • O is the center of the curve (the center of curvature). Analyzing the figure, the line segments connecting O with B and with B' are perpendicular to the tangent directions at B and B', respectively (because the time from t0 to t1 is very short, the two segments can be considered perpendicular to the respective tangents).
  • the turning radius of the second vehicle in the curve is:
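The turning-radius and angular-velocity formulas themselves are rendered as images in the original and are not reproduced above. As an illustrative sketch only, under assumed planar coordinates with the curve centre O known (for example, from map data), the average angular velocity ω_r between t0 and t1 could be computed from the position fixes B and B' as:

```python
import math

def avg_angular_velocity(b, b_prime, center, dt):
    """Average angular velocity (rad/s) of the target about the curve centre O.

    b, b_prime: planar (x, y) fixes of the target at times t0 and t1
    center:     planar (x, y) position of the curve centre O
    dt:         t1 - t0, in seconds
    """
    ang0 = math.atan2(b[1] - center[1], b[0] - center[0])
    ang1 = math.atan2(b_prime[1] - center[1], b_prime[0] - center[0])
    # wrap the swept angle into (-pi, pi] so crossing the +/-pi cut is handled
    dtheta = (ang1 - ang0 + math.pi) % (2 * math.pi) - math.pi
    return dtheta / dt
```

If the curve radius R is also known, the corresponding linear speed follows as v_r = ω_r · R.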
  • Example 2: Referring to FIG. 7, when the first vehicle is on a curve and the second vehicle is on a straight road, the preset conditions include:
  • Time t0 is the time when the millimeter-wave radar collects the first frame of data
  • time t1 is the time when the millimeter-wave radar collects the second frame of data
  • the first frame of data and the second frame of data can be two consecutive frames of data, or two frames of data separated by a small number of frames (for example, separated by 1 or 2 frames). In the embodiments of the present application, two consecutive frames of data are mainly used as an example.
  • the first vehicle is at position A
  • the second vehicle is at position B
  • position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A.
  • the first vehicle is at the A' position
  • the second vehicle is at the B' position
  • the B' position is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at the A' position.
  • the boundary position may be an area or a range.
  • v r ' is the instantaneous traveling speed of the second vehicle on the straight road at time t2 (the traveling speed in this paper refers to the linear speed), t2 ⁇ t1>t0; N is the first threshold;
  • v r is the traveling speed of the second vehicle on the straight road from time t0 to time t1 (traveling speed herein refers to linear speed; v r is the average linear speed, and when the interval between t0 and t1 is very short (for example, the time to collect one frame of data), it can also be regarded as the instantaneous linear speed);
  • the coordinates of A and A' can be obtained by the positioning system, and the position coordinates of B and B' can be obtained by the target tracking method based on the fusion of millimeter-wave radar information and camera information.
  • the target tracking method will be introduced in detail later.
  • v r can be calculated from the geometric relationships among the position coordinates of A, A', B, and B' output by the target tracking method.
  • v r ' can be detected by millimeter wave radar.
  • P is the second preset coefficient, and the value range is (0, 1].
  • P may be the same as or different from K in Example 1.
  • the value of P may be related to the execution time for the first vehicle to adjust the follow-up mechanism of the millimeter-wave radar and/or the sharpness of the curve. The longer the execution time, the smaller the P; the shorter the execution time, the larger the P;
  • t is the execution time
  • R is the radius of the curve
  • a is the undetermined coefficient
  • is equal to ⁇ , which is half of the detection beam angle of the radar.
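As a minimal sketch consistent with Example 2 (planar coordinates assumed; the original formula image is not reproduced), the average linear speed v_r between t0 and t1 follows directly from the fixes B and B':

```python
import math

def avg_linear_velocity(b, b_prime, dt):
    """Average straight-road travel speed of the target between t0 and t1.

    b, b_prime: planar (x, y) fixes of the second vehicle at times t0 and t1
    dt:         t1 - t0, in seconds
    """
    return math.hypot(b_prime[0] - b[0], b_prime[1] - b[1]) / dt
```

For very short dt (e.g., one data frame), this average can be treated as the instantaneous linear speed, as noted above.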
  • Example 3: Referring to FIG. 8, when the first vehicle is on a curve and the second vehicle is on a curve, the preset conditions include:
  • Time t0 is the time when the millimeter-wave radar collects the first frame of data
  • time t1 is the time when the millimeter-wave radar collects the second frame of data
  • the first frame of data and the second frame of data can be two consecutive frames of data, or two frames of data separated by a small number of frames (for example, separated by 1 or 2 frames). In the embodiments of the present application, two consecutive frames of data are mainly used as an example.
  • the first vehicle is at position A
  • the second vehicle is at position B
  • position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A.
  • the first vehicle is at the A' position
  • the second vehicle is at the B' position
  • the B' position is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at the A' position.
  • ⁇ r ' is the instantaneous angular velocity of the second vehicle making a circular motion around the curve at time t2;
  • ω r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1 (the average angular velocity; when the interval between t0 and t1 is extremely short (for example, the time to collect one frame of data), it can also be regarded as the instantaneous angular velocity), t2≥t1>t0; N is the first threshold;
  • v is the traveling speed (instantaneous linear speed) of the first vehicle, which can be obtained from the chassis information of the first vehicle or the first vehicle positioning system;
  • is half of the detection beam angle of the millimeter-wave radar, which is determined by the characteristics of the millimeter-wave radar;
  • the coordinates of A and A' can be obtained by the positioning system, and the position coordinates of B and B' can be obtained by the target tracking method based on the fusion of millimeter-wave radar information and camera information.
  • the target tracking method will be introduced in detail later.
  • ⁇ r ' may be detected by the millimeter-wave radar or calculated according to the linear velocity v r ' of the second vehicle detected by the millimeter-wave radar.
  • Q is the third preset coefficient, with value range (0, 1]; Q may be the same as or different from P and K.
  • the value of Q may be related to the execution time for the first vehicle to adjust the follow-up mechanism of the millimeter-wave radar and/or the sharpness of the curve. The longer the execution time, the smaller the Q; the shorter the execution time, the larger the Q; the sharper the curve, the smaller the Q; the gentler the curve, the larger the Q.
  • t is the execution time
  • R is the radius of the curve
  • a is the undetermined coefficient
  • the first vehicle in the embodiments of the present application can quickly identify special road conditions from the positioning information and visual information it collects itself, and by monitoring the kinematic physical quantities of the second vehicle under those special road conditions (such as angular velocity or linear velocity), it can quickly judge whether sensors such as the millimeter-wave radar of the vehicle are in the target loss state or the state where the target is about to be lost (at the fastest, this requires only two consecutive frames of data collected by the millimeter-wave radar), thereby preventing the vehicle from forming the illusion that there is no target obstacle and improving the accuracy of target detection.
  • the following describes a scheme in which the first vehicle adjusts the attitude of the millimeter-wave radar after the millimeter-wave radar enters the target loss state or the state where the target is about to be lost (in other words, after the millimeter-wave radar enters a frame-loss state).
  • attitude sensors are implemented by one of, or a combination of, three types of sensors: acceleration sensors (i.e., accelerometers), angular velocity sensors (i.e., gyroscopes), and magnetic induction sensors (i.e., magnetometers).
  • Attitude sensors include three-axis attitude sensors (or three-dimensional attitude sensors), six-axis attitude sensors (or six-dimensional attitude sensors), nine-axis attitude sensors (or nine-dimensional attitude sensors), and the like.
  • a three-axis attitude sensor is implemented by one type of sensor (such as a three-axis accelerometer, a three-axis gyroscope, or a three-axis magnetometer); a six-axis attitude sensor is generally implemented by two types of sensors (such as a three-axis accelerometer + a three-axis gyroscope); a nine-axis attitude sensor generally consists of a three-axis gyroscope + a three-axis accelerometer + a three-axis geomagnetometer, though combinations such as a six-axis accelerometer + a three-axis gyroscope or a six-axis gyroscope + a three-axis accelerometer also exist.
  • the attitude sensor in the embodiment of the present application may be a six-dimensional attitude sensor or a nine-dimensional attitude sensor, etc., which is not limited in this application.
  • the attitude sensor can detect the attitude of the millimeter wave radar in real time.
  • the embodiment of the present application establishes a three-dimensional coordinate system for the millimeter-wave radar, taking the center of the millimeter-wave radar when the vehicle is driving on a straight road as the origin, wherein the X R axis is parallel to the ground (taking the rightward direction of the millimeter-wave radar as its positive direction).
  • the attitude of the millimeter-wave radar can be represented by the following three parameters:
  • the rotation angle φ around the Z R axis, the rotation angle ω around the X R axis, and the roll angle (Roll).
  • the attitude sensor mainly acquires the φ and ω of the millimeter-wave radar. Assuming that when the vehicle is driving on a straight road the attitude sensor detects that the attitude parameters φ and ω of the millimeter-wave radar are both 0, then when the vehicle is driving on a curve and/or a ramp, the φ and ω detected by the attitude sensor may be greater than or less than 0.
  • the attitude of the millimeter-wave radar is compensated and corrected by comprehensively considering the degrees of freedom of the millimeter-wave radar that are closely related to road characteristics.
  • the attitude of the millimeter-wave radar can be adjusted from the left and right rotational degrees of freedom:
  • Left and right rotational degrees of freedom are the degrees of freedom of the radar to rotate around the Z R axis, and the corresponding rotation angle is ⁇ .
  • the attitude of the millimeter-wave radar can also be adjusted from the up and down swing degree of freedom: the up and down swing degree of freedom is the degree of freedom of the radar to rotate around the X R axis, and the corresponding rotation angle is ω.
  • the degrees of freedom in the above two directions are independent (or decoupled) in the mechanical structure and independent (or decoupled) in the control logic, so the computing device can adjust only one of the degrees of freedom.
  • the computing device can also adjust the attitude of the millimeter-wave radar from the left and right rotation degrees of freedom and the up and down swing degrees of freedom at the same time.
  • FIG. 10 it is a schematic diagram of adjusting the attitude of the millimeter-wave radar by the computing device of the first vehicle (the MDC is taken as an example in FIG. 10 ).
  • the target detection method described in the embodiment shown in FIG. 5 runs on the computing device and can identify whether the millimeter-wave radar is in the target loss state or the state where the target is about to be lost.
  • the computing device also runs a target tracking method based on the fusion of millimeter-wave radar information and camera information (the target tracking method may be a method in the prior art, or may be the target tracking method introduced later in the embodiment related to FIG. 12, which is not limited here).
  • the follow-up mechanism of the millimeter-wave radar includes a Z- R -axis rotating motor and an X- R -axis rotating motor.
  • the Z R -axis rotating motor can rotate under the control of voltage to drive the millimeter-wave radar to rotate around the Z R axis, and the X R -axis rotating motor can rotate under the control of voltage to drive the millimeter-wave radar to rotate around the X R axis.
  • the process that the computing device controls and adjusts the attitude of the millimeter-wave radar includes:
  • based on the target detection method, the computing device detects whether the millimeter-wave radar is in the target loss state or the state where the target is about to be lost (i.e., whether frames are lost); the computing device determines the first angle adjustment value of the millimeter-wave radar according to the current attitude of the millimeter-wave radar (for example, the millimeter-wave radar needs to rotate around the Z R axis by an angle of φ and around the X R axis by an angle of ω); the computing device sends the angle adjustment value to the control device through an instruction (FIG. 10 takes the MCU as an example); the control device saves the mapping relationship between angle adjustment values and voltage adjustment values; after receiving the instruction, the control device converts the first angle adjustment value, according to the mapping relationship, into a first voltage V1 to be delivered to the Z R -axis rotating motor and a second voltage V2 to be delivered to the X R -axis rotating motor; the control device sets the input voltage of the Z R -axis rotating motor to V1 and the input voltage of the X R -axis rotating motor to V2, so that the Z R -axis rotating motor rotates to drive the millimeter-wave radar around the Z R axis by the angle φ, and the X R -axis rotating motor rotates to drive the millimeter-wave radar around the X R axis by the angle ω; or,
  • the computing device saves the mapping relationship between angle adjustment values and voltage adjustment values; the computing device determines the first angle adjustment value of the millimeter-wave radar according to the current attitude of the millimeter-wave radar, converts the first angle adjustment value into a first voltage V1 to be delivered to the Z R -axis rotating motor and a second voltage V2 to be delivered to the X R -axis rotating motor, and sends V1 and V2 to the control device through an instruction; the control device sets the input voltage of the Z R -axis rotating motor to V1 and the input voltage of the X R -axis rotating motor to V2, so that the Z R -axis rotating motor rotates to drive the millimeter-wave radar around the Z R axis by the angle φ, and the X R -axis rotating motor rotates to drive the millimeter-wave radar around the X R axis by the angle ω.
  • after each attitude adjustment of the millimeter-wave radar (for example, rotating it by φ around the Z R axis and by ω around the X R axis), the computing device re-detects the attitude of the millimeter-wave radar and judges whether the left-right rotation degree of freedom and the up-down swing degree of freedom of the millimeter-wave radar meet the corresponding angular closed-loop control standards; if so, the computing device determines that the attitude adjustment is complete; if not, the computing device continues to adjust the attitude of the millimeter-wave radar, repeating this cycle until both the left-right rotation degree of freedom and the up-down swing degree of freedom of the millimeter-wave radar meet the corresponding angular closed-loop control standards.
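The closed-loop adjustment described above can be sketched as follows; `read_attitude` and `rotate` stand in for the attitude-sensor readout and the two motor commands, and the 0.5-degree tolerance and iteration cap are assumptions, not values from this application:

```python
def adjust_until_aligned(read_attitude, rotate, tol=0.5, max_iters=20):
    """Closed-loop attitude correction sketch.

    read_attitude() -> (phi_err, omega_err): remaining angular errors, degrees
    rotate(dphi, domega): commands the Z_R- and X_R-axis motors independently
    Returns True once both degrees of freedom are within `tol` degrees.
    """
    for _ in range(max_iters):
        phi_err, omega_err = read_attitude()
        if abs(phi_err) <= tol and abs(omega_err) <= tol:
            return True  # both closed-loop control standards are met
        rotate(-phi_err, -omega_err)  # correct each degree of freedom independently
    return False  # failed to converge within the iteration budget
```

Because the two degrees of freedom are mechanically and logically decoupled, each correction term affects only its own axis.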
  • when the left-right rotation degree of freedom and the up-down swing degree of freedom of the millimeter-wave radar meet the corresponding angular closed-loop control standards, the millimeter-wave radar can accurately detect the target again.
  • in addition, the computing device updates the calibration matrix of the millimeter-wave radar in real time according to the angle adjustment value of the millimeter-wave radar, which ensures the reliability of the information fusion of the millimeter-wave radar and the camera during the attitude adjustment of the millimeter-wave radar.
  • the millimeter-wave radar coordinate system can describe the position of an object relative to the millimeter-wave radar, expressed as (X R , Y R , Z R ), and the camera coordinate system can describe the position of an object relative to the camera, expressed as (X C , Y C , Z C ). Since the camera and the millimeter-wave radar are installed at different positions on the vehicle, the position data of each feature point collected by the camera is the coordinate of that feature point in the camera coordinate system, and the position data of each feature point collected by the millimeter-wave radar is the coordinate of that feature point in the millimeter-wave radar coordinate system.
  • the same object has different coordinate parameters in the camera coordinate system and the millimeter-wave radar coordinate system, so a calibration matrix is required to map the data collected by the millimeter-wave radar and the data collected by the camera into the same coordinate system, which makes it convenient for the computing device to perform data operations.
  • the computing device can convert the data collected by the millimeter-wave radar into the camera coordinate system; in this case, the calibration matrix can include the rotation matrix R, translation matrix T, etc. required for conversion between the millimeter-wave radar coordinate system and the camera coordinate system.
  • alternatively, the computing device can convert both the data collected by the millimeter-wave radar and the data collected by the camera into the world coordinate system (or image coordinate system or other coordinate system); in this case, the calibration matrix can include the rotation matrix R, the translation matrix T, etc. required for conversion between the millimeter-wave radar coordinate system and the world coordinate system (or image coordinate system or other coordinate system).
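As an illustration of how such a calibration matrix might be applied, the sketch below maps a radar point into the camera frame with p_camera = R·p_radar + T; the function name and the example values of R and T are assumptions for illustration, not values from the embodiment:

```python
import numpy as np

def radar_to_camera(p_radar, R, T):
    """Convert a point from the millimeter-wave radar coordinate system
    (X R, Y R, Z R) into the camera coordinate system (X C, Y C, Z C)
    using the calibration rotation matrix R and translation matrix T:
    p_camera = R @ p_radar + T."""
    return R @ np.asarray(p_radar, dtype=float) + np.asarray(T, dtype=float)

# Illustrative example: the two frames share an orientation, and the radar
# origin sits 1 m along the camera Z axis (both values are assumed).
R = np.eye(3)
T = np.array([0.0, 0.0, 1.0])
p_cam = radar_to_camera([2.0, 0.5, 0.0], R, T)
```

The same function serves the second option as well: with R and T describing the radar-to-world transform, the output lies in the world coordinate system instead.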
  • the embodiment of the present application combines the road-condition characteristics of the road: when the millimeter-wave radar loses the target or is about to lose the target, the attitude of the millimeter-wave radar is compensated and corrected in its two degrees of freedom, left-right rotation and up-down swing, and the two degrees of freedom are adjusted independently without depending on each other, which can improve the accuracy and efficiency of the attitude adjustment of the millimeter-wave radar, thereby further improving the accuracy and efficiency of target detection and tracking.
  • the embodiment of the present application can also compensate and correct the calibration matrix of the millimeter-wave radar in real time, so as to achieve a fast and accurate response of the information fusion between the millimeter-wave radar and the camera, and to ensure the reliability of that fusion during the attitude adjustment of the millimeter-wave radar.
  • the above takes the curved road conditions as an example to introduce the process of adjusting the attitude of the millimeter-wave radar by the first vehicle when the millimeter-wave radar is in the target loss state or the target is about to be lost (or the millimeter-wave radar is in the frame-loss state).
  • the attitude of the millimeter-wave radar can also be adjusted in the same way for other road scenarios (such as up/downhill, acceleration/deceleration, etc.), which is not limited in this application.
  • an embodiment of the present application also provides a target tracking method based on the fusion of millimeter-wave radar information and camera information.
  • the target tracking method can be applied to the vehicle shown in FIG. 4 , including:
  • the first vehicle acquires a red-green-blue (RGB) image collected by a camera, and performs rough fitting on the lane lines in the far field of view included in the RGB image.
  • the computing device of the first vehicle acquires the RGB image collected by the camera, analyzes the RGB image, and performs rough fitting on the lane lines in the far field of view included in the RGB image.
  • the processing chip in the camera analyzes the RGB image, performs rough fitting on the lane lines in the far field of view included in the RGB image, and then transmits the fitting result to the computing device.
  • the rough fitting here refers to projecting certain pixels of the lane line onto a bird's-eye view and extracting several points at equal intervals to determine the inflection point and direction of the curve, without spending a lot of computing power and time fitting complex equations such as cubic curves, and without extensive lane-line smoothing post-processing.
  • since the target (such as the second vehicle) is generally not easily lost in the near field of view, processing only the far field of view reduces the amount of calculation and improves the calculation efficiency.
  • the far field of view is the region where the distance from the vehicle exceeds a preset distance (e.g., 50 m, 60 m, or 100 m).
  • the preset distance may be determined according to the characteristics of the detection beam angle and detection distance of the millimeter wave radar.
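A minimal sketch of this rough fitting, assuming the lane-line pixels have already been projected onto the bird's-eye view (the function name, the sample count, and the heading-change heuristic used to locate the inflection point are illustrative assumptions, not the embodiment's exact method):

```python
import numpy as np

def rough_fit_lane(bev_points, n_samples=5):
    """Rough fit of a lane line: given lane points already projected onto
    the bird's-eye view (ordered near -> far), take n_samples equally
    spaced points and compute the heading of each successive segment.
    The index of the largest heading change is a crude indicator of the
    curve's inflection/turning point; no cubic-curve fitting is done."""
    pts = np.asarray(bev_points, dtype=float)
    idx = np.linspace(0, len(pts) - 1, n_samples).astype(int)
    samples = pts[idx]
    # heading (direction) of each segment between sampled points
    headings = np.arctan2(np.diff(samples[:, 1]), np.diff(samples[:, 0]))
    turn = np.diff(headings)  # heading change per segment pair
    k = int(np.argmax(np.abs(turn))) + 1 if len(turn) else 0
    return samples, k, headings
```

With only a handful of sampled points, the turning point and direction of the far-field lane line are obtained at a fraction of the cost of a cubic fit.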
  • the first vehicle simultaneously performs target recognition based on the millimeter-wave radar and the camera.
  • the computing device of the first vehicle performs target recognition on the RGB image obtained by the camera to obtain the first recognition result; or, the processing chip in the camera performs target recognition on the RGB image and, after obtaining the first recognition result, transmits it to the computing device (i.e., S1202.1, camera target recognition).
  • the computing device of the first vehicle performs target recognition on the radar trace data obtained by the millimeter-wave radar to obtain the second recognition result; or, the processing chip in the millimeter-wave radar performs target recognition on the radar trace data and transmits the second recognition result to the computing device (i.e., S1202.2, millimeter-wave radar target recognition).
  • the computing device saves a trained target recognition model corresponding to the camera. After obtaining the RGB image collected by the camera, the computing device inputs the RGB image into the target recognition model corresponding to the camera, so as to obtain the target recognition result corresponding to the camera, that is, the first recognition result. After obtaining the radar spot trace data collected by the millimeter-wave radar, the computing device can use a neural network model or traditional clustering and tracking algorithms to process the radar spot trace data, so as to obtain the target recognition result corresponding to the millimeter-wave radar, that is, the second recognition result.
  • the first target recognition result is represented as (Xc, Yc, C) T , where Xc and Yc represent the position data of the target, and C represents the type of the target.
  • the second target recognition result is represented as (X R , Y R , v, w) T , where X R and Y R represent the position data of the target, v represents the linear velocity of the target, and w represents the angular velocity of the target.
  • the position data of the millimeter-wave radar target recognition and of the camera target recognition are taken as position data in a two-dimensional plane as an example, that is, the first recognition result only includes data in the two directions Xc and Yc, and the second recognition result only includes data in the two directions X R and Y R .
  • the position data of the target recognition of the millimeter-wave radar and the target recognition of the camera may also include more parameters, which are not limited here.
  • different target recognition models may be designed according to different recognition scenarios.
  • the computing device uses a lightweight recognition model to identify the data collected by the camera, and the lightweight recognition model focuses on the recognition speed (that is, the algorithm processing delay is small).
  • the lightweight recognition model can use You Only Look Once (YOLO) v3 (the version number, representing the third edition).
  • the heavyweight recognition model is used to identify the data collected by the camera, and the heavyweight recognition model focuses on the accuracy of the algorithm (that is, the accuracy of the position, speed, and type of the target output by the algorithm should be high), so that the computing device can adjust the attitude of the millimeter-wave radar as soon as possible based on the type and position of the target, and re-detect the target.
  • the heavyweight recognition model can use a region-based convolutional neural network (RCNN).
  • the first vehicle performs time alignment and target alignment on the first recognition result and the second recognition result.
  • the so-called time alignment is to perform time synchronization between the first recognition result and the second recognition result.
  • the so-called target alignment is to perform spatial synchronization between the first recognition result and the second recognition result, for example, converting the position data into the same coordinate system as described in the above-mentioned embodiment of FIG. 11 .
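Time alignment could be sketched as nearest-timestamp pairing of camera and radar frames; the function name, the (timestamp, data) tuple format, and the max_dt tolerance are assumptions for illustration, not the embodiment's exact mechanism:

```python
import bisect

def time_align(cam_frames, radar_frames, max_dt=0.05):
    """Pair each camera frame with the radar frame closest in time,
    keeping only pairs whose timestamp gap is within max_dt seconds.
    Frames are (timestamp, data) tuples; radar_frames must be sorted
    by timestamp."""
    radar_ts = [t for t, _ in radar_frames]
    pairs = []
    for t_c, cam in cam_frames:
        i = bisect.bisect_left(radar_ts, t_c)
        # candidates: the radar frames just before and just after t_c
        cand = [j for j in (i - 1, i) if 0 <= j < len(radar_ts)]
        j = min(cand, key=lambda j: abs(radar_ts[j] - t_c))
        if abs(radar_ts[j] - t_c) <= max_dt:
            pairs.append((cam, radar_frames[j][1]))
    return pairs
```

Target alignment then applies the calibration matrix to each paired result so both detections live in one coordinate system, as described for FIG. 11.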
  • the computing device performs target fusion according to the IoU fusion rule.
  • after the computing device of the first vehicle performs target recognition on the RGB image to obtain the first recognition result, it can also generate a region of interest (ROI) (that is, a target frame) in the RGB image; and, after performing target recognition on the radar spot trace data to obtain the second recognition result, it may also generate an ROI in the radar spot trace data.
  • in machine vision and image processing, the ROI refers to an area to be processed, outlined on the image in the form of a box, circle, ellipse, irregular polygon, etc., called the region of interest; here, the ROI is the area where the target (i.e., the second vehicle) is located.
  • the ROI in the embodiment of the present application takes a rectangle as an example; FIG. 13 is a schematic diagram of the ROI corresponding to the detection result of the millimeter-wave radar (the rectangular area indicated by A) and the ROI corresponding to the detection result of the camera (the rectangular area indicated by B).
  • the value of IoU lies in the interval [0, 1].
  • the larger the value of IoU, the higher the probability that the target detected based on the camera and the target detected based on the radar are the same target; the smaller the value of IoU, the lower that probability.
  • the IoU fusion rule may include: if IoU > M (or IoU ≥ M), the computing device fuses the first recognition result and the second recognition result and outputs the third recognition result, expressed as (X*, Y*, C*, v*, w*) T ; if IoU ≤ M (or IoU < M), the computing device does not fuse the first recognition result and the second recognition result, and outputs the first recognition result and the second recognition result separately.
  • M is the set second threshold, and the value range is between (0, 1).
  • the second threshold M may be related to factors such as the curvature ρ of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle (that is, the distance to the second vehicle).
  • the second threshold M, the curvature ρ of the curve, the driving speed V, and the driving distance L satisfy the following formula:
  • a and b are coefficients, which can be set by technicians according to experiments or experience.
  • the second threshold M may also be related to other factors, such as the acceleration of the first vehicle.
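The IoU computation and the fusion decision described above can be sketched as follows; the (x1, y1, x2, y2) box format, the position-averaging used to form (X*, Y*, C*, v*, w*), and the default M are illustrative assumptions, not the patent's exact combination rule:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2);
    the result lies in [0, 1]."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

def fuse(cam_result, radar_result, cam_box, radar_box, M=0.5):
    """IoU fusion rule: if IoU > M, the two detections are treated as the
    same target and merged into a third result (X*, Y*, C*, v*, w*);
    otherwise both results are returned unfused.
    cam_result = (Xc, Yc, C); radar_result = (X R, Y R, v, w)."""
    if iou(cam_box, radar_box) > M:
        xc, yc, c = cam_result
        xr, yr, v, w = radar_result
        # illustrative merge: average the two position estimates
        return ((xc + xr) / 2.0, (yc + yr) / 2.0, c, v, w)
    return cam_result, radar_result
```

In practice M would be computed from the curvature ρ, speed V, and distance L per the embodiment's formula rather than fixed at 0.5.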
  • the first vehicle executes the millimeter-wave radar target-not-lost tracking mechanism: the computing device of the first vehicle performs target tracking based on continuous multi-frame fusion data;
  • the computing device performs target tracking based on consecutive multiple frames (X*, Y*, C*, v*, w*) T.
  • the first vehicle executes the millimeter-wave radar target-lost tracking mechanism: the computing device of the first vehicle performs target tracking based on the recognition result obtained while the radar is in the target-lost state or the target is about to be lost, together with the last frame or multiple frames of fusion data from before the radar entered that state.
  • for example, the computing device tracks the target based on the last frame of fusion data before the target is lost (that is, the original frame (X*, Y*, C*, v*, w*) T ) and the first recognition result after the target is lost (that is, consecutive frames (X C , Y C , C) T ).
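A minimal sketch of switching between the two tracking mechanisms; the choice to reuse the last known velocity and take the position from the camera-only result is an illustrative assumption about how the lost-state data could be combined:

```python
def track_step(radar_lost, last_fused, cam_obs, fused_obs=None):
    """One tracking step under the two mechanisms. When the radar has lost
    (or is about to lose) the target, predict from the last fused frame
    (X*, Y*, C*, v*, w*) and take the position from the camera-only
    observation (Xc, Yc, C); otherwise simply use the new fused frame."""
    if not radar_lost:
        return fused_obs  # target-not-lost mechanism: multi-frame fusion data
    x, y, c, v, w = last_fused  # last fusion frame before the loss
    xc, yc, _ = cam_obs         # the camera still sees the target
    # keep the last known linear/angular velocity, update the position
    return (xc, yc, c, v, w)
```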
  • the embodiment of the present application only roughly fits the lane lines in the far field of view, which can reduce the amount of calculation and improve the calculation efficiency on the premise of ensuring accuracy.
  • the embodiments of the present application design at least two target recognition models for camera target recognition: when the millimeter-wave radar loses frames, a lightweight recognition model is used to perform target recognition on the data collected by the camera to improve the recognition speed; when the millimeter-wave radar does not lose frames, the heavyweight recognition model is used to identify the data collected by the camera to improve the recognition accuracy, thereby achieving both the speed and accuracy of fusion recognition.
  • the embodiment of the present application also designs the second threshold M of the IoU in combination with the curve scene, which can further improve the recognition accuracy.
  • the above takes the blind spot of a curve as an example to introduce how to detect whether the target is lost and how to re-detect the target, but in practical applications, the technical solutions of the embodiments of the present application are also applicable to other similar blind-spot scenarios.
  • for example, when the vehicle travels on a ramp, there will also be a detection blind spot; moreover, uneven road sections or the moments when a car starts/stops also have a great influence on the pitch angle of the millimeter-wave radar, so the millimeter-wave radar can also be controlled in its up-down swing degree of freedom.
  • FIG. 15 is a scene where the vehicle is on an uphill slope.
  • at time t0, both the first vehicle and the second vehicle are on a flat road, and the second vehicle is within the detection range of the millimeter-wave radar of the first vehicle; at time t1, the second vehicle is driving uphill while the attitude of the millimeter-wave radar is consistent with its attitude at time t0, so, due to the existence of the ramp, the second vehicle exceeds (or is about to exceed) the detection range of the millimeter-wave radar of the first vehicle.
  • therefore, the computing device of the first vehicle adjusts the attitude of the millimeter-wave radar so that the pitch angle of the millimeter-wave radar is adjusted upward, and the millimeter-wave radar of the first vehicle then re-detects the second vehicle.
  • the embodiments of this application mainly take millimeter-wave radar as an example to introduce the scheme of how to detect target loss and how to re-detect the target in special scenarios such as curves, ramps, and acceleration/deceleration, but in practical applications, the millimeter-wave radar can also be replaced with other sensors, such as laser radar (lidar); in particular, in mass-production autonomous driving solutions that adopt solid-state laser radar, the laser radar also has a definition similar to the field-of-view angle.
  • FIG. 17 is a system architecture diagram provided in an embodiment of the present application.
  • the system can execute the solutions in the foregoing method embodiments.
  • the system includes: a positioning device, a camera, a millimeter-wave radar, a follow-up mechanism, and a computing device.
  • the system is applicable to the vehicle shown in FIG. 4 , and is used to implement the above-mentioned target detection method in the embodiment of the present application.
  • the positioning device is used to obtain the positioning information of the vehicle.
  • the camera is used to obtain visual information in front of the vehicle.
  • the millimeter-wave radar is used to obtain radar detection information in front of the vehicle.
  • the target detection in the present application takes the combination of a camera and a millimeter-wave radar as an example.
  • this application does not limit the specific types of sensors: as long as one sensor has a large detection range, a short detection distance, and less accurate detection results (obstacle position, speed, etc.), and the other has a small detection range, a long detection distance, and accurate detection results, the embodiments of the present application are applicable.
  • the follower mechanism is set on the millimeter-wave radar and used to control the attitude of the millimeter-wave radar.
  • a computing device is a device with computing/processing capabilities. Specifically, it may be a vehicle-mounted device, or one or more processing chips, or an integrated circuit, etc., which is not limited here.
  • the computing device may include the following modules:
  • the road condition identification module is used to identify the road conditions where the vehicle and the preceding vehicle are located according to the positioning information collected by the positioning device and the visual information collected by the camera, and to start the corresponding threshold comparator according to the road conditions (different road conditions correspond to different threshold comparators; FIG. 12 takes threshold comparators A, B, and C as an example, and the actual number can be more or less).
  • the threshold comparator is used to determine whether the speed of the target obstacle (including the linear velocity or the angular velocity) exceeds the threshold of the threshold comparator; for the specific implementation of each threshold comparator, refer to the related introduction of the embodiments shown in FIGS. 6 to 8 above.
  • the image mark in FIG. 12 is an RGB image that carries a target mark (such as a target frame) output by the camera when the threshold comparator confirms as "Yes”.
  • the millimeter-wave radar camera fusion target detection module is used to fuse the detection results of the millimeter-wave radar and the detection results of the camera.
  • the target loss tracking module is used to implement the millimeter wave radar target loss tracking mechanism.
  • the target not lost tracking module is used to implement the millimeter wave radar target not lost tracking mechanism.
  • the final output result can have various forms; for example, outputting the specific position, type, speed, and other data of the target (such as X*, Y*, C*, v*, w*, etc.), or outputting the RGB image with the target frame, etc., which is not limited here.
  • an embodiment of the present application further provides a target detection device 180, where the device 180 is located inside the first vehicle, for example, a chip arranged inside the first vehicle.
  • the apparatus 180 includes modules or units or means (means) for executing the steps in the methods shown in FIG. 5 , FIG. 10 , and FIG. 12 .
  • these functions or units or means may be implemented by software, or by hardware, or by hardware executing corresponding software.
  • the apparatus 180 may include: a detection module 1801 for detecting the speed of the second vehicle based on a millimeter-wave radar; wherein the second vehicle is located in front of the first vehicle, and the speed includes a linear velocity and an angular velocity; a processing module 1802 , used to obtain the road conditions where the first vehicle and the second vehicle are located; the first threshold is determined according to the road conditions; if the speed of the second vehicle exceeds the first threshold, it is determined that the millimeter-wave radar is in the target loss state or the target is about to be lost.
  • the device 180 determines the first threshold according to the road conditions, and compares the speed of the second vehicle with the first threshold, thereby determining whether the millimeter-wave radar is in the target-lost state or the state where the target is about to be lost. In this way, under special road conditions (such as curves, ramps, etc.), the device 180 can timely and effectively identify millimeter-wave radar target loss caused by the road conditions, so as to prevent the first vehicle from forming the illusion that there is no target obstacle, and to further improve the safety and stability of the vehicle during driving.
  • when acquiring the road conditions where the first vehicle and the second vehicle are located, the processing module 1802 is specifically configured to: use a camera to acquire an RGB image in front of the first vehicle, and then determine the road conditions where the second vehicle is located according to the RGB image. In this way, the road conditions ahead can be identified efficiently and quickly, that is, the road conditions where the second vehicle is located can be determined.
  • when determining the road conditions where the second vehicle is located according to the RGB image, the processing module 1802 is specifically configured to: extract feature points from the lane lines in the far field of view included in the RGB image, and then determine the inflection point and direction of the road where the second vehicle is located according to the extracted feature points, so as to obtain the road conditions where the second vehicle is located. Since the detection range of the millimeter-wave radar in the near field of view is larger than that in the far field of view, a target in the near field of view is generally not easy to lose; the processing module 1802 only processes the lane lines in the far field of view, which can reduce the amount of calculation and improve the calculation efficiency on the premise of ensuring accuracy.
  • the processing module 1802 may also acquire location information of the first vehicle based on the positioning device, and then determine the road conditions where the first vehicle is located according to the location information.
  • the positioning device may be a GPS, a Beidou system, or other positioning systems, and is used for receiving satellite signals and positioning the current position of the first vehicle.
  • the positioning system may also be visual positioning, millimeter wave radar positioning, fusion positioning, etc., which is not limited in this application.
  • the processing module 1802 determines the road conditions where the first vehicle is located based on the map, for example, whether it is on a curve, a straight road, an uphill, or a downhill section.
  • the device 180 can quickly and accurately obtain the road conditions where the first vehicle is located.
  • the first threshold may be determined according to the critical speed value at which the second vehicle passes from within the detection range of the millimeter-wave radar to outside it; for example, the first threshold is less than or equal to the critical speed value. When the speed of the second vehicle is much greater than the critical speed value, the second vehicle exceeds the detection range of the millimeter-wave radar and enters its detection blind spot, so the millimeter-wave radar is in the target-lost state; when the speed of the second vehicle is near the critical value, the second vehicle may exceed the detection range of the millimeter-wave radar at any time and is about to enter the detection blind spot, so the millimeter-wave radar is in the state where the target is about to be lost.
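The comparison of the target speed against the first threshold might be sketched as below; the margin used to separate "about to be lost" from "lost" is an assumed parameter for illustration, not a value from the embodiment:

```python
def radar_state(target_speed, first_threshold, margin=0.1):
    """Decide the millimeter-wave radar state from the target speed and the
    road-condition-dependent first threshold: well above the threshold the
    target is lost; just above it (within `margin`, a fractional band
    around the critical value) the target is about to be lost."""
    if target_speed > first_threshold * (1.0 + margin):
        return "target lost"
    if target_speed > first_threshold:
        return "target about to be lost"
    return "target tracked"
```

Each road condition (curve, ramp, acceleration/deceleration) would supply its own first_threshold, matching the per-scenario threshold comparators described above.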
  • the design of the first threshold may be different according to different road conditions.
  • Example 1 The road conditions where the first vehicle and the second vehicle are located are: the first vehicle is on a straight road, and the second vehicle is on a curve. Then, when the detection module 1801 detects the speed of the second vehicle based on the millimeter-wave radar, it is specifically used for:
  • the instantaneous angular velocity of the circular motion of the second vehicle around the curve is detected based on the millimeter-wave radar; wherein, the first threshold N determined by the processing module 1802 according to the road conditions conforms to:
  • ω r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1; the formula also uses the Euclidean distance between the first vehicle and the second vehicle at time t0, the displacement of the second vehicle from time t0 to time t1, and half of the detection beam angle of the millimeter-wave radar; time t0 is the time when the millimeter-wave radar collects the first frame of data, and time t1 is the time when the millimeter-wave radar collects the second frame of data; the first frame data and the second frame data are two consecutive frames of data; K is a coefficient greater than 0 and less than or equal to 1; and the remaining angle is the value of the central angle formed at the center O of the curve between the position A of the first vehicle and the position B of the second vehicle at time t0.
  • Example 2 The road conditions where the first vehicle and the second vehicle are located are: the first vehicle is on a curve, and the second vehicle is on a straight road. Then, when the detection module 1801 detects the speed of the second vehicle based on the millimeter-wave radar, it is specifically used for:
  • the first threshold N determined by the processing module 1802 according to the road conditions conforms to:
  • Example 3 The road conditions where the first vehicle and the second vehicle are located are: both the first vehicle and the second vehicle are in a curve. Then, when the detection module 1801 detects the speed of the second vehicle based on the millimeter-wave radar, it is specifically used for:
  • the first threshold N determined by the processing module 1802 according to the road conditions conforms to:
  • ω r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1; v is the instantaneous speed of the first vehicle; the formula also uses half of the detection beam angle of the millimeter-wave radar and the Euclidean distance between the first vehicle and the second vehicle at time t0; time t0 is the time when the millimeter-wave radar collects the first frame of data; Q is a coefficient greater than 0 and less than or equal to 1.
  • the road conditions are not limited to curve scenarios; for other road scenarios, such as up/downhill and acceleration/deceleration, the same idea can also be used to design the preset conditions.
  • the processing module 1802 may also be used to: after determining that the millimeter-wave radar is in the target-lost state or the target is about to be lost, control the millimeter-wave radar to rotate around the Z R axis and/or around the X R axis according to the road conditions, so that the millimeter-wave radar detects the second vehicle again; wherein the Z R axis is perpendicular to the horizontal plane, and the X R axis is parallel to the horizontal plane and perpendicular to the traveling direction of the first vehicle.
  • the degrees of freedom of the above two directions are independent of each other (or decoupled) in both the mechanical structure and the control logic, so the processing module 1802 can control and adjust only one of the degrees of freedom, or control and adjust both degrees of freedom at the same time.
  • the processing module 1802 may control the millimeter-wave radar to rotate around the Z R axis by a first angle.
  • the processing module 1802 may control the millimeter wave radar to rotate around the X R axis by a second angle.
  • the processing module 1802 may control the millimeter-wave radar to rotate around the Z R axis by a third angle and around the X R axis by a fourth angle.
  • since the rotation of the millimeter-wave radar around the Z R axis and the rotation around the X R axis are independent of each other and do not depend on each other, the accuracy and efficiency of the attitude adjustment of the millimeter-wave radar can be improved, thereby further improving the accuracy and efficiency of target detection and tracking.
  • the processing module 1802 may further update the calibration matrix of the millimeter-wave radar in real time according to the angle value of the rotation after each time the millimeter-wave radar is controlled to rotate around the Z R axis and/or around the X R axis. In this way, a fast and accurate response of the information fusion of the millimeter-wave radar and the camera can be achieved, and the reliability of the information fusion of the millimeter-wave radar and the camera during the attitude adjustment of the millimeter-wave radar can be guaranteed.
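Folding an attitude adjustment into the calibration matrix could look like the following sketch; the composition order R_old · Rz(yaw) · Rx(pitch) and the function names are assumptions for illustration, not the embodiment's exact update rule:

```python
import numpy as np

def rot_z(a):
    """Rotation by angle a (radians) about the Z R axis (left-right)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(a):
    """Rotation by angle a (radians) about the X R axis (up-down swing)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def update_calibration(R_old, yaw=0.0, pitch=0.0):
    """After the radar is rotated by `yaw` about Z R and/or `pitch` about
    X R, fold the same rotations into the radar-to-camera rotation matrix
    so the fusion pipeline keeps a valid calibration. Because the two
    axes are adjusted independently, either angle may be zero."""
    return R_old @ rot_z(yaw) @ rot_x(pitch)
```

Applying the update after every closed-loop adjustment step keeps the fused radar/camera data consistent while the attitude is still changing.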
  • the processing module 1802 can also be used to: detect and track the target based on the camera and the millimeter-wave radar respectively, and perform target fusion according to the IoU fusion rule.
  • the specific target fusion process includes:
  • after acquiring the RGB image in front of the first vehicle based on the camera, the processing module 1802 performs target recognition on the RGB image based on the target recognition model to obtain a first recognition result, where the first recognition result includes the position and type of the second vehicle; the input of the target recognition model is an RGB image, and the output is the position and type of the target. After obtaining the radar spot trace data in front of the first vehicle based on the millimeter-wave radar, the processing module 1802 processes the radar spot trace data to obtain a second recognition result.
  • the second recognition result includes the position and speed of the second vehicle. When the processing module 1802 determines that the IoU of the area where the second vehicle is located in the RGB image and the area where the second vehicle is located in the radar trace data is greater than the second threshold M, the first recognition result and the second recognition result are fused to obtain fusion data; otherwise, the first recognition result and the second recognition result are not fused.
  • the second threshold M, the curvature ρ of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle satisfy the following relationship:
  • a and b are preset coefficients.
  • the second threshold M may also be related to other factors, such as the acceleration of the first vehicle, etc., which is not limited here.
  • since the second threshold M is related to the road conditions where the vehicle is located (the curvature ρ of the curve where the first vehicle and/or the second vehicle is located) and to the driving state of the first vehicle (that is, the driving speed V and the driving distance L), it can improve the accuracy of fusion recognition, thereby further improving the safety and stability of the vehicle during driving.
  • the processing module 1802 can also be used to: when the millimeter-wave radar is in the target-lost state or the target is about to be lost, track the second vehicle based on the fusion data from before the millimeter-wave radar entered that state together with the first recognition result obtained while in that state; or, when the millimeter-wave radar is not in the target-lost state and the target is not about to be lost, track the second vehicle based on continuous multi-frame fusion data. Since the device 180 adopts different tracking mechanisms according to the different millimeter-wave radar states, the accuracy of target tracking can be improved, thereby further improving the safety and stability of the vehicle during driving.
  • when the processing module performs target recognition on the RGB image based on the target recognition model, it is specifically configured to: when the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, use a lightweight target recognition model to perform target recognition on the RGB images collected by the camera; when the millimeter-wave radar is not in that state, use a heavyweight target recognition model instead. The recognition speed of the lightweight target recognition model is greater than that of the heavyweight target recognition model, while the recognition accuracy of the lightweight model is lower than that of the heavyweight model.
  • two different target recognition models are designed and switched according to the scenario: when the millimeter-wave radar drops frames, the lightweight recognition model is used on the data collected by the camera to improve recognition speed; when it does not, the heavyweight recognition model is used to improve recognition accuracy. This balances the speed and accuracy of fusion recognition.
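The model-switching logic above reduces to a small dispatch, sketched here under the assumption that each model is a callable taking an RGB image; the function and parameter names are illustrative, not the application's actual API.

```python
def recognize(rgb_image, radar_target_lost, light_model, heavy_model):
    """Select the recognition model according to the millimeter-wave radar
    state: the lightweight model (faster, less accurate) when the target
    is lost or about to be lost, the heavyweight model otherwise."""
    model = light_model if radar_target_lost else heavy_model
    return model(rgb_image)
```

A caller would re-evaluate the radar state each frame, so the pipeline automatically regains accuracy once the radar reacquires the target.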
  • an embodiment of the present application further provides a target detection apparatus 190, including a processor 1901 and a memory 1902. The memory 1902 stores instructions executable by the processor 1901, and by executing the instructions stored in the memory 1902, the processor 1901 causes the apparatus 190 to execute the methods shown in FIG. 5, FIG. 10, and FIG. 12.
  • the processor 1901 and the memory 1902 may be coupled through an interface circuit, or may be integrated together, which is not limited here.
  • the connection medium between the processor 1901 and the memory 1902 is not limited in this embodiment of the present application.
  • the processor 1901 and the memory 1902 are connected through a bus in FIG. 19, and the bus is represented by a thick line in FIG. 19. The connection mode between other components is only for schematic illustration and is not limited.
  • the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of presentation, only one thick line is shown in FIG. 19, but it does not mean that there is only one bus or one type of bus.
  • the processor mentioned in the embodiments of the present application may be implemented by hardware or software.
  • when implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like.
  • when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
  • the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory mentioned in the embodiments of the present application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • the non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory.
  • volatile memory may be random access memory (RAM), which acts as an external cache. By way of example rather than limitation, many forms of RAM are available, such as:
  • static random access memory (Static RAM, SRAM)
  • dynamic random access memory (Dynamic RAM, DRAM)
  • synchronous dynamic random access memory (Synchronous DRAM, SDRAM)
  • double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDR SDRAM)
  • enhanced synchronous dynamic random access memory (Enhanced SDRAM, ESDRAM)
  • synchronous link dynamic random access memory (Synchlink DRAM, SLDRAM)
  • direct Rambus random access memory (Direct Rambus RAM, DR RAM)
  • it should be noted that when the processor is a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component, the memory (storage module) may be integrated in the processor. The memory described herein is intended to include, but not be limited to, these and any other suitable types of memory.
  • the embodiments of the present application also provide a computer-readable storage medium, including a program or an instruction; when the program or instruction is run on a computer, the methods shown in FIG. 5, FIG. 10, and FIG. 12 are implemented.
  • an embodiment of the present application further provides a chip, which is coupled to a memory and used to read and execute program instructions stored in the memory to implement the methods shown in FIG. 5 , FIG. 10 , and FIG. 12 .
  • the embodiments of the present application also provide a computer program product containing instructions. When the computer program product runs on a computer, the computer executes the methods shown in FIG. 5, FIG. 10, and FIG. 12.
  • an embodiment of the present application also provides a vehicle. The vehicle includes a target detection device, a millimeter-wave radar, and a camera, and the target detection device is configured to perform the methods shown in FIG. 5, FIG. 10, and FIG. 12.
  • the structure of the vehicle may be as shown in FIG. 11 .
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • these computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided in the embodiments of the present application are a target detection method and apparatus. The method comprises: a first vehicle detecting the speed of a second vehicle on the basis of a millimeter wave radar; acquiring a road condition where the first vehicle and the second vehicle are located; determining a first threshold value according to the acquired road condition; then comparing the speed of the second vehicle with the first threshold value; and if the speed of the second vehicle exceeds the first threshold value, the first vehicle determining that the millimeter wave radar is in a state where a target is lost or a state where the target is about to be lost, otherwise, the first vehicle determining that the millimeter wave radar is not in the state where the target is lost or the state where the target is about to be lost. Therefore, under special road conditions (such as a curve or a ramp), a first vehicle can effectively identify, in a timely manner, the situation of losing a target of a millimeter wave radar due to the road condition, thereby preventing the first vehicle from producing an illusion of there being no target obstacle, and further improving the safety and stability of a vehicle during a driving process.

Description

A target detection method and device

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to the Chinese patent application No. 202110089080.2, entitled "A target detection method and device", filed with the China Patent Office on January 22, 2021, the entire contents of which are incorporated herein by reference.
Technical Field

The present application relates to the technical field of automatic driving, and in particular, to a target detection method and device.
Background

With the rapid development of the economy and society, autonomous driving technology is gradually maturing and moving toward deployment. However, functions under some especially dangerous road conditions, such as the detection and tracking of targets at curves, remain a major challenge in the field of autonomous driving perception.

The prior art mainly performs tracking first and then fusion of the millimeter-wave radar and the forward-looking camera under flat, straight road conditions, making only minor modifications to the fusion method, or additionally installs sensors to cover each dead-zone range; other approaches combine curve lane lines to remove invalid targets and perform continuous-frame tracking of targets within the drivable space. However, since the detection ranges of sensors such as the camera and the millimeter-wave radar are fixed, blind zones that cannot be detected still appear under curved road conditions, so a target obstacle may be considered lost, giving the vehicle the illusion that no target obstacle exists.

Some techniques propose vehicle-to-vehicle (V2V) technology based on the Internet of Vehicles to effectively identify the tracked target vehicle in a curve. For example, when the rear of the preceding vehicle coincides with the blind-zone boundary line, the ego vehicle obtains the speed of the vehicle in the blind zone through the V2V system of the Internet of Vehicles and fuses that speed with the blind-zone boundary to form a virtual blind-zone boundary following model with speed. This solution relies on costly V2X communication to obtain information about the preceding vehicle, and current V2X network deployment is still incomplete, so the whole solution is very difficult to implement.

Therefore, how to effectively identify and track the target vehicle in a curve at low cost and with high efficiency, and thereby improve vehicle safety, is the technical problem to be solved by the present application.
Summary

The present application provides a target detection method and device for effectively identifying and tracking a target vehicle in a curve at low cost and with high efficiency, thereby improving the safety and stability of the vehicle during driving.
In a first aspect, a target detection method is provided, which can be applied to a vehicle or to a chip inside the vehicle. Taking application to a first vehicle as an example, the method includes: the first vehicle detects the speed of the preceding vehicle (taken here as a second vehicle) based on a millimeter-wave radar, where the speed includes linear velocity and angular velocity; the first vehicle acquires the road conditions where the first vehicle and the second vehicle are located; the first vehicle determines a first threshold according to the acquired road conditions; the first vehicle compares the speed of the second vehicle with the first threshold, and if the speed of the second vehicle exceeds the first threshold, the first vehicle determines that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state; otherwise, the first vehicle determines that the millimeter-wave radar is not in either state.

In the embodiment of the present application, the first vehicle determines the first threshold according to the road conditions and compares the real-time speed of the preceding vehicle (i.e., the second vehicle) with the first threshold to judge whether the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state. In this way, under special road conditions (such as curves and ramps), the first vehicle can promptly and effectively identify loss of the millimeter-wave radar target caused by the road conditions, avoiding the illusion that no target obstacle exists, thereby improving the safety and stability of the vehicle during driving.
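The core decision of the first aspect can be sketched as follows. The condition labels and threshold table are illustrative assumptions; in the application, the first threshold N is computed from the road conditions per the examples below rather than looked up.

```python
def radar_target_state(second_vehicle_speed, first_threshold):
    """Return True when the millimeter-wave radar should be judged to be
    in the target-lost or target-about-to-be-lost state: the detected
    speed of the preceding vehicle exceeds the road-condition-dependent
    first threshold."""
    return second_vehicle_speed > first_threshold

def detect_state(radar_speed, road_condition, threshold_table):
    """End-to-end flow: look up the first threshold for the acquired
    road condition, then compare the radar-detected speed against it."""
    n = threshold_table[road_condition]  # first threshold N
    if radar_target_state(radar_speed, n):
        return "lost_or_about_to_be_lost"
    return "tracking"
```

Depending on the road condition, `radar_speed` is an angular velocity (preceding vehicle on a curve) or a linear velocity (preceding vehicle on a straight road), matching the three examples given below.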
In a possible design, the first vehicle may use a camera to acquire an RGB image of the area in front of the first vehicle, and then determine the road condition where the second vehicle is located according to the RGB image.

In this way, the road condition in front of the first vehicle, that is, the road condition where the second vehicle is located, can be identified efficiently and quickly.
In a possible design, determining the road condition of the second vehicle from the RGB image may specifically involve extracting feature points from the lane lines in the far field of view included in the RGB image, and then determining the inflection point and direction of the road where the second vehicle is located according to the extracted feature points, thereby obtaining the road condition.

Since the detection range of the millimeter-wave radar in the near field of view is larger than in the far field of view, targets in the near field of view are generally not easily lost. Therefore, the embodiment of the present application processes only the lane lines in the far field of view, which reduces the amount of computation and improves computational efficiency while ensuring accuracy.
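A rough stand-in for the inflection/direction analysis on far-field lane-line feature points is sketched below. The point format (x ahead, y lateral) and the sign convention (positive y = left) are assumptions, and a real system would fit the full lane-line model rather than use three points.

```python
def road_direction(far_field_points):
    """Estimate the bend direction of the road ahead from lane-line
    feature points, using the sign of the cross product of two chord
    vectors through the first, middle, and last points."""
    (x0, y0) = far_field_points[0]
    (x1, y1) = far_field_points[len(far_field_points) // 2]
    (x2, y2) = far_field_points[-1]
    # Cross product of the chords (P0->P1) x (P0->P2): its sign tells
    # which side the lane line bends towards.
    cross = (x1 - x0) * (y2 - y0) - (y1 - y0) * (x2 - x0)
    if abs(cross) < 1e-6:
        return "straight"
    return "left" if cross > 0 else "right"
```

The returned direction, together with the detected inflection point, would classify the road ahead as a curve or a straight segment for the threshold selection described above.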
In a possible design, the first vehicle may acquire its position information based on a positioning device, and then determine the road condition where the first vehicle is located according to the position information.

Exemplarily, the positioning device may be GPS, the BeiDou system, or another positioning system for receiving satellite signals and locating the current position of the first vehicle. In addition, the positioning may also be visual positioning, millimeter-wave radar positioning, fusion positioning, and the like, which is not limited in this application.

After obtaining its position information, the first vehicle determines, based on a map, the road condition at its current position, for example, whether it is on a curve, a straight road, an uphill slope, or a downhill slope.

In this way, the first vehicle can quickly and accurately obtain the road condition where it is located.
In a possible design, the first threshold may be determined according to the critical speed at which the second vehicle transitions from being within the detection range of the millimeter-wave radar to being outside it; for example, the first threshold is less than or equal to this critical speed. When the speed of the second vehicle is much greater than the critical speed, the second vehicle exceeds the detection range of the millimeter-wave radar and enters its detection blind zone, so the millimeter-wave radar is in the target-lost state; when the speed of the second vehicle is near the critical speed, the second vehicle may exceed the detection range at any time and is about to enter the blind zone, so the millimeter-wave radar is in the target-about-to-be-lost state. In the embodiments of the present application, the design of the first threshold may differ according to the road conditions; three specific examples are given below.
Example 1: The first vehicle is on a straight road and the second vehicle is on a curve. Detecting the speed of the second vehicle based on the millimeter-wave radar may include: detecting, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle moving in a circle around the curve. The first threshold N determined according to the road conditions satisfies the relation below.
[Formula image: Figure PCTCN2021124194-appb-000001]

where ω_r is the angular velocity of the second vehicle moving in a circle around the curve from time t0 to time t1; [Figure PCTCN2021124194-appb-000002] denotes the Euclidean distance between the first vehicle and the second vehicle at time t0; [Figure PCTCN2021124194-appb-000003] denotes the displacement of the second vehicle from time t0 to time t1; α is half of the detection beam angle of the millimeter-wave radar; time t0 is the moment the millimeter-wave radar collects the first frame of data and time t1 the moment it collects the second frame, the two frames being consecutive; K is a coefficient greater than 0 and less than or equal to 1; and ε is the angle of the central angle formed at time t0 by the position A of the first vehicle, the position B of the second vehicle, and the center O of the curve.
Example 2: The first vehicle is on a curve and the second vehicle is on a straight road. Detecting the speed of the second vehicle based on the millimeter-wave radar may include: detecting, based on the millimeter-wave radar, the instantaneous driving speed of the second vehicle on the straight road. The first threshold N determined according to the road conditions satisfies the relation below.
[Formula image: Figure PCTCN2021124194-appb-000004]

where [Figure PCTCN2021124194-appb-000005] denotes the displacement of the second vehicle from time t0 to time t1; v_r is the driving speed of the second vehicle on the straight road from time t0 to time t1; time t0 is the moment the millimeter-wave radar collects the first frame of data and time t1 the moment it collects the second frame, the two frames being consecutive; and P is a coefficient greater than 0 and less than or equal to 1.
Example 3: Both the first vehicle and the second vehicle are on a curve. Detecting the speed of the second vehicle based on the millimeter-wave radar may include: detecting, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle moving in a circle around the curve. The first threshold N determined according to the road conditions satisfies the relation below.
[Formula image: Figure PCTCN2021124194-appb-000006]

where ω_r is the angular velocity of the second vehicle moving in a circle around the curve from time t0 to time t1; v is the instantaneous driving speed of the first vehicle; α is half of the detection beam angle of the millimeter-wave radar; time t0 is the moment the millimeter-wave radar collects the first frame of data and time t1 the moment it collects the second frame, the two frames being consecutive; Q is a coefficient greater than 0 and less than or equal to 1; and [Figure PCTCN2021124194-appb-000007] denotes the Euclidean distance between the first vehicle and the second vehicle at time t0.
It should be noted that the above three examples are illustrative rather than limiting. In practical applications, road conditions are not limited to curve scenarios; for other road scenarios (such as up/downhill, acceleration/deceleration, etc.), the same idea can be used to design the preset conditions.
In a possible design, after determining that the millimeter-wave radar is in the target-lost or target-about-to-be-lost state, the first vehicle may further control the millimeter-wave radar to rotate around the Z_R axis and/or around the X_R axis according to the road conditions, so that the millimeter-wave radar detects the second vehicle again; the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the driving direction of the first vehicle.

It should be understood that the degrees of freedom of these two directions (rotation around the Z_R axis and rotation around the X_R axis) are mechanically independent (decoupled) and also independent in the control logic, so the first vehicle may adjust only one of them or both at the same time. For example, when the first vehicle and/or the second vehicle is on a curve, the first vehicle may control the millimeter-wave radar to rotate around the Z_R axis by a first angle; when the first vehicle and/or the second vehicle is on a slope, it may control the radar to rotate around the X_R axis by a second angle; and when the first vehicle and/or the second vehicle is on a combined curve-and-slope section, it may control the radar to rotate around the Z_R axis by a third angle and around the X_R axis by a fourth angle.

In the embodiment of the present application, the rotation of the millimeter-wave radar around the Z_R axis and around the X_R axis are independent of each other, which can improve the accuracy and efficiency of the radar's attitude adjustment, further improving the accuracy and efficiency of target detection and tracking.
In a possible design, after each time the first vehicle controls the millimeter-wave radar to rotate around the Z_R axis and/or the X_R axis, it may also update the calibration matrix of the millimeter-wave radar in real time according to the angle of that rotation.

In this way, a fast and accurate response of the millimeter-wave radar and camera information fusion can be achieved, ensuring the reliability of the fusion while the radar's attitude is being adjusted.
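The calibration update can be sketched with standard rotation matrices about the z and x axes. This is a minimal sketch: the composition order (Z_R rotation then X_R rotation, right-multiplied onto the old rotation) is an assumption, and a real system must match its own extrinsic-calibration convention.

```python
import math

def rot_z(theta):
    """Rotation matrix about the Z_R axis (perpendicular to the horizontal plane)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(theta):
    """Rotation matrix about the X_R axis (parallel to the horizontal plane)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def update_calibration(r_old, dz, dx):
    """After a commanded rotation (dz about Z_R, dx about X_R), refresh
    the rotation part of the radar's calibration matrix in real time."""
    return matmul(matmul(r_old, rot_z(dz)), rot_x(dx))
```

Because the two axes are decoupled, passing 0 for one angle leaves that degree of freedom untouched, matching the independent adjustment described above.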
In a possible design, the first vehicle may detect and track targets based on the camera and the millimeter-wave radar respectively, and fuse the targets according to an IoU fusion rule. Specifically, the first vehicle acquires an RGB image of the area in front of it based on the camera and performs target recognition on the RGB image based on a target recognition model to obtain a first recognition result, where the first recognition result includes the position and type of the second vehicle; the input of the target recognition model is the RGB image and its output is the position and type of the target. The first vehicle acquires radar trace data of the area in front of it based on the millimeter-wave radar and processes the trace data to obtain a second recognition result, where the second recognition result includes the position and speed of the second vehicle. When the first vehicle determines that the IoU between the region of the second vehicle in the RGB image and the region of the second vehicle in the radar trace data is greater than a second threshold M, it fuses the first recognition result and the second recognition result to obtain fusion data; otherwise, the two recognition results are not fused. The second threshold M, the curvature ρ of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle satisfy the following relationship:

M = a²ρ + bV + L;

where a and b are preset coefficients.
It should be understood that the above formula is only an example; in specific implementations, the second threshold M may also be related to other factors, such as the acceleration of the first vehicle, which is not limited here.

Since the second threshold M in the embodiment of the present application is related to the road conditions (the curvature ρ of the curve where the first vehicle and/or the second vehicle is located) and to the driving state of the first vehicle (the driving speed V and the driving distance L), the accuracy of fusion recognition can be improved, further improving the safety and stability of the vehicle during driving.
In a possible design, when the millimeter-wave radar is in the target-lost or target-about-to-be-lost state, the first vehicle may track the second vehicle based on the fusion data obtained before the radar entered that state and on the first recognition result obtained while the radar is in that state; or, when the millimeter-wave radar is not in that state, the first vehicle may track the second vehicle based on consecutive multi-frame fusion data.

In this embodiment of the application, the first vehicle adopts different tracking mechanisms according to different millimeter-wave radar states, which can improve the accuracy of target tracking, further improving the safety and stability of the vehicle during driving.
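The tracking-mechanism switch above can be sketched as follows; the dictionary shapes and mode names are assumptions for illustration, not the application's actual data structures.

```python
def track_second_vehicle(radar_lost, fusion_history, camera_result):
    """When the radar is in (or entering) the target-lost state, fall
    back to the last pre-loss fusion data seeded with the camera-only
    first recognition result; otherwise track from consecutive
    multi-frame fusion data."""
    if radar_lost:
        last_fusion = fusion_history[-1] if fusion_history else None
        return {"mode": "camera_fallback",
                "seed": last_fusion,
                "camera": camera_result}
    return {"mode": "multi_frame_fusion", "frames": fusion_history}
```

Once the radar reacquires the target (for example after the attitude adjustment described earlier), the tracker returns to the multi-frame fusion branch.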
在一种可能的设计中,第一车辆基于目标识别模型对RGB图像进行目标识别,具体可以包括:当毫米波雷达处于目标丢失状态或目标即将丢失状态时,使用轻量化的目标识别模型对相机采集的RGB图像进行目标识别;当毫米波雷达未处于目标丢失状态或目标即将丢失状态时,使用重量化的目标识别模型对相机采集的RGB图像进行目标识别;其中,轻量化的目标识别模型的识别速度大于重量化的目标识别模型的识别速度,轻量化的目标识别模型的识别精度小于重量化的目标识别模型的识别精度。In a possible design, the first vehicle performs target recognition on the RGB image based on the target recognition model, which may specifically include: when the millimeter-wave radar is in the target loss state or the target is about to be lost, using the lightweight target recognition model to identify the camera The collected RGB images are used for target recognition; when the millimeter-wave radar is not in the target loss state or the target is about to be lost, the weighted target recognition model is used to perform target recognition on the RGB images collected by the camera; among them, the lightweight target recognition model is used. The recognition speed is higher than that of the heavyweight target recognition model, and the recognition accuracy of the lightweight target recognition model is lower than that of the heavyweight target recognition model.
本申请实施例针对相机目标识别，设计了两种不同的目标识别模型，在毫米波雷达丢帧的情况下，采用轻量化识别模型对相机采集的数据进行目标识别，提高识别速度，而在毫米波雷达没有丢帧的情况下，采用重量化识别模型对相机采集的数据进行目标识别，提高识别精度，如此可以实现兼顾融合识别的速度和准确度。In the embodiments of the present application, two different target recognition models are designed for camera target recognition. When the millimeter-wave radar drops frames, the lightweight recognition model is used to perform target recognition on the data collected by the camera, which improves the recognition speed; when the millimeter-wave radar does not drop frames, the heavyweight recognition model is used, which improves the recognition accuracy. In this way, both the speed and the accuracy of fusion recognition can be taken into account.
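The selection rule above can be sketched as a single branch; the two "models" here are stand-ins for any pair of recognizers with the stated speed/accuracy trade-off, and all names and fields are illustrative assumptions rather than part of the application:

```python
# Illustrative sketch: pick the recognition model by radar frame-loss state.

def pick_recognition_model(radar_dropping_frames, light_model, heavy_model):
    # Frame loss (target lost / about to be lost): prioritise speed.
    # No frame loss: prioritise accuracy.
    return light_model if radar_dropping_frames else heavy_model

# Hypothetical model descriptors illustrating the trade-off.
light = {"name": "light", "speed": "fast", "accuracy": "low"}
heavy = {"name": "heavy", "speed": "slow", "accuracy": "high"}
```

Calling `pick_recognition_model(True, light, heavy)` returns the lightweight model, matching the frame-loss case described above.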
第二方面，提供一种目标检测装置，该装置位于第一车辆，例如可以为设置在车辆内部的芯片。该装置包括用于执行上述第一方面或第一方面任一种可能的设计中第一车辆所执行的步骤所对应的模块或单元或手段，该功能或单元或手段可以通过软件实现，或者通过硬件实现，也可以通过硬件执行相应的软件实现。In a second aspect, a target detection apparatus is provided. The apparatus is located in the first vehicle, and may be, for example, a chip disposed inside the vehicle. The apparatus includes modules, units, or means corresponding to the steps performed by the first vehicle in the first aspect or any possible design of the first aspect; the functions, units, or means may be implemented by software, by hardware, or by hardware executing corresponding software.
示例性的，该装置可以包括：检测模块，用于基于毫米波雷达检测第二车辆的速度；其中，第二车辆位于第一车辆的前方，速度包括线速度和角速度；处理模块，用于获取第一车辆和第二车辆所处的路况；根据路况确定第一阈值；若第二车辆的速度超过第一阈值，则确定毫米波雷达处于目标丢失状态或目标即将丢失状态。Exemplarily, the apparatus may include: a detection module, configured to detect the speed of the second vehicle based on the millimeter-wave radar, where the second vehicle is located in front of the first vehicle and the speed includes a linear velocity and an angular velocity; and a processing module, configured to obtain the road conditions where the first vehicle and the second vehicle are located, determine a first threshold according to the road conditions, and, if the speed of the second vehicle exceeds the first threshold, determine that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
第三方面,提供一种计算机可读存储介质,包括程序或指令,当程序或指令在计算机上运行时,使得如上述第一方面或第一方面任一种可能的设计中的方法被执行。In a third aspect, a computer-readable storage medium is provided, comprising a program or an instruction, when the program or instruction is executed on a computer, the method as described above in the first aspect or any possible design of the first aspect is performed.
第四方面，提供一种目标检测装置，包括处理器和存储器；其中，存储器存储有可被处理器执行的指令，处理器通过执行存储器存储的指令，使得装置执行如上述第一方面或第一方面任一种可能的设计中的方法。In a fourth aspect, a target detection apparatus is provided, including a processor and a memory, where the memory stores instructions executable by the processor, and the processor executes the instructions stored in the memory to cause the apparatus to perform the method in the first aspect or any possible design of the first aspect.
第五方面,提供一种芯片,芯片与存储器耦合,用于读取并执行存储器中存储的程序指令,以实现如上述第一方面或第一方面任一种可能的设计中的方法。In a fifth aspect, a chip is provided, which is coupled to a memory and used to read and execute program instructions stored in the memory, so as to implement the method in the first aspect or any possible design of the first aspect.
第六方面，提供一种包含指令的计算机程序产品，计算机程序产品中存储有指令，当其在计算机上运行时，使得计算机执行如上述第一方面或第一方面任一种可能的设计中的方法。In a sixth aspect, a computer program product containing instructions is provided. The computer program product stores instructions that, when run on a computer, cause the computer to perform the method in the first aspect or any possible design of the first aspect.
第七方面，提供一种车辆，该车辆包括目标检测装置、毫米波雷达以及相机；目标检测装置用于通过控制毫米波雷达和相机来实现如上述第一方面或第一方面任一种可能的设计中的方法。In a seventh aspect, a vehicle is provided, including a target detection apparatus, a millimeter-wave radar, and a camera; the target detection apparatus is configured to control the millimeter-wave radar and the camera to implement the method in the first aspect or any possible design of the first aspect.
具体地，目标检测装置可以通过毫米波雷达来检测当前车辆前方的另一车辆的速度，通过相机获取第一车辆和第二车辆所处的路况，目标检测装置进一步根据路况确定第一阈值；若当前车辆前方的另一车辆的速度超过第一阈值，则确定毫米波雷达处于目标丢失状态或目标即将丢失状态。Specifically, the target detection apparatus may detect, through the millimeter-wave radar, the speed of another vehicle in front of the current vehicle, and obtain, through the camera, the road conditions where the first vehicle and the second vehicle are located; the target detection apparatus further determines the first threshold according to the road conditions, and if the speed of the other vehicle in front of the current vehicle exceeds the first threshold, determines that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
上述第二方面至第七方面的有益效果参见第一方面中对应设计的有益效果,这里不再赘述。For the beneficial effects of the above-mentioned second to seventh aspects, refer to the beneficial effects of the corresponding designs in the first aspect, which will not be repeated here.
附图说明Description of drawings
图1为毫米波雷达的检测波束角与检测距离的关系示意图;Figure 1 is a schematic diagram of the relationship between the detection beam angle and the detection distance of the millimeter-wave radar;
图2为相机的检测范围和毫米波雷达的检测范围的示意图;FIG. 2 is a schematic diagram of the detection range of the camera and the detection range of the millimeter wave radar;
图3A为本申请实施例适用的一种可能的应用场景;FIG. 3A is a possible application scenario applicable to the embodiment of the present application;
图3B为本申请实施例适用的另一种可能的应用场景;FIG. 3B is another possible application scenario applicable to the embodiment of the present application;
图3C为本申请实施例适用的另一种可能的应用场景;FIG. 3C is another possible application scenario applicable to the embodiment of the present application;
图3D为本申请实施例适用的另一种可能的应用场景;FIG. 3D is another possible application scenario to which the embodiment of the present application is applicable;
图4为一种可能的车辆架构图;Fig. 4 is a possible vehicle architecture diagram;
图5为本申请实施例提供一种目标检测方法的流程图;FIG. 5 provides a flowchart of a target detection method according to an embodiment of the present application;
图6为第一车辆处于直道且第二车辆处于弯道的示意图;6 is a schematic diagram of a first vehicle in a straight road and a second vehicle in a curve;
图7为第一车辆处于弯道且第二车辆处于直道的示意图;7 is a schematic diagram of a first vehicle in a curve and a second vehicle in a straight road;
图8为第一车辆处于弯道且第二车辆处于弯道的示意图;8 is a schematic diagram of a first vehicle in a curve and a second vehicle in a curve;
图9为毫米波雷达的三维坐标系的示意图;9 is a schematic diagram of a three-dimensional coordinate system of a millimeter-wave radar;
图10为调整毫米波雷达姿态的原理图;Figure 10 is a schematic diagram of adjusting the attitude of the millimeter wave radar;
图11为毫米波雷达坐标系与相机坐标系的示意图;Figure 11 is a schematic diagram of a millimeter-wave radar coordinate system and a camera coordinate system;
图12为本申请实施例提供的一种目标跟踪方法的流程图;12 is a flowchart of a target tracking method provided by an embodiment of the present application;
图13为毫米波雷达检测结果对应的ROI和相机检测结果对应的ROI的示意图;13 is a schematic diagram of the ROI corresponding to the detection result of the millimeter wave radar and the ROI corresponding to the detection result of the camera;
图14为毫米波雷达检测结果和相机检测结果的交并集示意图;Figure 14 is a schematic diagram of the intersection and union of the millimeter-wave radar detection results and the camera detection results;
图15、图16为上坡场景的示意图;Figure 15 and Figure 16 are schematic diagrams of an uphill scene;
图17为本申请实施例提供的一种系统架构图;FIG. 17 is a system architecture diagram provided by an embodiment of the present application;
图18为本申请实施例提供的一种目标检测装置180的结构示意图;FIG. 18 is a schematic structural diagram of a target detection apparatus 180 provided by an embodiment of the present application;
图19为本申请实施例提供的另一种目标检测装置190的结构示意图。FIG. 19 is a schematic structural diagram of another target detection apparatus 190 provided by an embodiment of the present application.
具体实施方式Detailed Description of Embodiments
车辆在基于相机和毫米波雷达(millimeter wave radar)进行目标检测跟踪时,相机基本能覆盖近距离的目标障碍物,而远距离的目标障碍物则需要依靠毫米波雷达。When the vehicle detects and tracks the target based on the camera and millimeter wave radar, the camera can basically cover the short-range target obstacle, while the long-distance target obstacle needs to rely on the millimeter wave radar.
毫米波雷达的检测波束角和安装位置是固定的,所以检测范围(一般为扇形角度)是有限的。图1是毫米波雷达的检测波束角与检测距离的关系示意图。如图1所示,检测波束角随检测距离变化而变化,毫米波雷达在远视场中的检测角度小于在近视场中的检测角度。所以,在弯道路况中,后车检测跟踪的目标障碍物(如前车)很可能超出后车毫米波雷达的检测区域,如图2所示。The detection beam angle and installation position of the millimeter-wave radar are fixed, so the detection range (generally the sector angle) is limited. FIG. 1 is a schematic diagram of the relationship between the detection beam angle and the detection distance of the millimeter wave radar. As shown in Figure 1, the detection beam angle changes with the detection distance, and the detection angle of the millimeter-wave radar in the far field of view is smaller than that in the near field of view. Therefore, in curved road conditions, the target obstacle (such as the front vehicle) detected and tracked by the rear vehicle is likely to exceed the detection area of the rear vehicle millimeter-wave radar, as shown in Figure 2.
相机的检测范围一般是一个扇形,常用的相机的视场角(fov)有60°,90°,150°等。相机的检测距离与视场角成反比,例如一款fov为90°的相机,最远检测距离80米,而一款fov为150°的相机的检测距离则远远短于80米。如图2所示,相机的检测范围(即摄像头检测区域)比雷达的检测范围广,但是检测距离比雷达短。The detection range of the camera is generally a sector, and the field of view (fov) of the commonly used camera is 60°, 90°, 150° and so on. The detection distance of a camera is inversely proportional to the field of view. For example, a camera with a fov of 90° has a maximum detection distance of 80 meters, while a camera with a fov of 150° has a detection distance that is much shorter than 80 meters. As shown in Figure 2, the detection range of the camera (ie, the detection area of the camera) is wider than that of the radar, but the detection distance is shorter than that of the radar.
在配置了毫米波雷达和相机传感器的车辆上,目标障碍物的准确位置、速度等信息来自于毫米波雷达的检测数据。所以当目标超出毫米波雷达的检测范围时,由于后车无法获得目标障碍物准确的位置、速度等信息,所以目标障碍物会被后车认为丢失,从而造成无目标障碍物的错觉。On vehicles equipped with millimeter-wave radar and camera sensors, the accurate position and speed of target obstacles come from the detection data of millimeter-wave radar. Therefore, when the target exceeds the detection range of the millimeter-wave radar, since the rear vehicle cannot obtain the accurate position, speed and other information of the target obstacle, the target obstacle will be considered lost by the rear vehicle, resulting in the illusion of no target obstacle.
但实际上，目标障碍物真实存在于客观世界中。因为毫米波雷达是前侧向碰撞预警的最可靠的一重保障，所以在这种弯道路况下目标障碍物丢失的现象，会导致自动驾驶系统无法获取准确的安全距离，进而会对高级驾驶辅助系统(Advanced Driving Assistance System,ADAS)和自动驾驶跟踪产生极为严重的影响，产生重大的安全隐患。In reality, however, the target obstacle still exists in the objective world. Since the millimeter-wave radar is the most reliable safeguard for forward/side collision warning, the loss of the target obstacle under such curved road conditions will prevent the autonomous driving system from obtaining an accurate safe distance, which will have an extremely serious impact on the Advanced Driving Assistance System (ADAS) and on autonomous driving tracking, creating a major safety hazard.
鉴于此，本申请实施例提供一种目标检测方法及装置，旨在引入对道路探测盲区的考虑，根据车辆的定位信息和视觉信息，对特殊路况进行快速识别，并通过运动学物理量(如目标角速度/速度等)判断毫米波雷达等传感器是否处于目标丢失状态或目标即将丢失状态，避免车辆产生无目标障碍物的错觉。进一步的，当毫米波雷达等传感器处于目标丢失状态或目标即将丢失状态时，对毫米波雷达等传感器的姿态进行调整，使得毫米波雷达等传感器能够及时地重新检测到目标，保证车辆在驾驶过程中的安全性和稳定性。更进一步的，对毫米波雷达等传感器的姿态进行调整时，本申请实施例从两个方面的自由度(例如左右(α)和上下(β))分别对雷达的姿态进行补偿修正，由于两个自由度的调整是独立的，不相互依赖，所以可以提高调整的精度和效率。In view of this, embodiments of the present application provide a target detection method and apparatus, which introduce consideration of road detection blind zones, quickly identify special road conditions according to the positioning information and visual information of the vehicle, and judge, through kinematic physical quantities (such as the target's angular velocity/velocity), whether a sensor such as the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state, so as to prevent the vehicle from forming the illusion that there is no target obstacle. Further, when a sensor such as the millimeter-wave radar is in the target-lost or target-about-to-be-lost state, the attitude of the sensor is adjusted so that it can re-detect the target in time, ensuring the safety and stability of the vehicle during driving. Still further, when adjusting the attitude of a sensor such as the millimeter-wave radar, the embodiments of the present application compensate and correct the attitude of the radar separately in two degrees of freedom (for example, left/right (α) and up/down (β)); since the adjustments of the two degrees of freedom are independent of each other, the accuracy and efficiency of the adjustment can be improved.
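A minimal sketch, under assumed geometry, of the two-degree-of-freedom correction described above: the left/right angle α and the up/down angle β are computed independently, so each axis of the follower mechanism can be driven on its own. The boresight is assumed to point along +x of the radar coordinate system; the function name and coordinate convention are assumptions for illustration only.

```python
# Hypothetical two-axis attitude correction: compute the horizontal (alpha)
# and vertical (beta) angles needed to re-point the radar at a target.
import math

def attitude_correction(x, y, z):
    """Return (alpha, beta) in degrees to re-point the radar at (x, y, z)."""
    alpha = math.degrees(math.atan2(y, x))                 # left/right axis
    beta = math.degrees(math.atan2(z, math.hypot(x, y)))   # up/down axis
    return alpha, beta
```

Because α depends only on the horizontal components and β only on the elevation relative to the horizontal distance, correcting one axis does not disturb the other.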
下面结合本申请实施例中的附图,对本申请实施例中的技术方案进行更加详尽的描述。The technical solutions in the embodiments of the present application will be described in more detail below with reference to the accompanying drawings in the embodiments of the present application.
应理解,在本申请的描述中,多个,是指两个或两个以上。至少一个,是指一个或多个。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。“第一”、“第二”等词汇,仅用于区分描述的目的,而不能理解为指示或暗示相对重要性,也不能理解为指示或暗示顺序。It should be understood that in the description of this application, a plurality of means two or more. At least one means one or more. "And/or", which describes the association relationship of the associated objects, means that there can be three kinds of relationships, for example, A and/or B, which can mean that A exists alone, A and B exist at the same time, and B exists alone. The character "/" generally indicates that the associated objects are an "or" relationship. Words such as "first" and "second" are only used for the purpose of distinguishing and describing, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
图3A为本申请实施例适用的一种可能的应用场景，该场景可以是自动驾驶、驾驶辅助、人工驾驶等场景。该场景中包括至少两个车辆，图3A以两个车辆为例。其中后车(即自车)正前方设置有雷达传感器，例如毫米波雷达传感器，还设置有相机，后车可以通过毫米波雷达和相机对前车进行检测跟踪。在图3A所示的场景中，前车和后车均处于弯道。FIG. 3A shows a possible application scenario to which the embodiments of the present application are applicable, such as autonomous driving, driving assistance, or manual driving. The scenario includes at least two vehicles; FIG. 3A takes two vehicles as an example. A radar sensor, such as a millimeter-wave radar sensor, and a camera are arranged at the front of the rear vehicle (i.e., the ego vehicle), and the rear vehicle can detect and track the preceding vehicle through the millimeter-wave radar and the camera. In the scenario shown in FIG. 3A, both the preceding vehicle and the rear vehicle are in a curve.
图3B为本申请实施例适用的另一种可能的应用场景,该场景可以是自动驾驶、驾驶 辅助、人工驾驶等场景。与图3A不同的是,在图3B所示的场景中,前车处于弯道,而后车(即自车)处于直道。FIG. 3B is another possible application scenario to which the embodiment of the present application is applicable, and the scenario may be scenarios such as automatic driving, driving assistance, and manual driving. Different from FIG. 3A , in the scene shown in FIG. 3B , the preceding vehicle is on a curve, and the following vehicle (ie, the ego vehicle) is on a straight road.
图3C为本申请实施例适用的另一种可能的应用场景,该场景可以是自动驾驶、驾驶辅助、人工驾驶等场景。与图3A、图3B不同的是,在图3C所示的场景中,后车(即自车)处于弯道,而前车处于直道。FIG. 3C is another possible application scenario to which the embodiment of the present application is applicable, and the scenario may be scenarios such as automatic driving, driving assistance, and manual driving. Different from FIGS. 3A and 3B , in the scenario shown in FIG. 3C , the vehicle behind (ie, the ego vehicle) is on a curve, while the vehicle in front is on a straight road.
需要说明的是，上述图3A、图3B、图3C中均是以两车示出(即自车只有一辆前车)，应理解，这并不限定应用场景中车辆的数量，实际场景中还可以包括更多的车辆。例如，图3D所示，自车处于弯道，前一车处于弯道，前二车处于直道。当前车有多辆时，自车针对每辆前车的检测原理是相同的，所以本申请实施例中，主要以一辆前车为例。It should be noted that FIGS. 3A, 3B, and 3C each show two vehicles (i.e., the ego vehicle has only one preceding vehicle). It should be understood that this does not limit the number of vehicles in the application scenario; an actual scenario may include more vehicles. For example, as shown in FIG. 3D, the ego vehicle is in a curve, the first vehicle ahead is in a curve, and the second vehicle ahead is on a straight road. When there are multiple preceding vehicles, the detection principle of the ego vehicle for each preceding vehicle is the same, so the embodiments of the present application mainly take one preceding vehicle as an example.
另外，图3A、图3B、图3C、图3D是以弯道路况为例，本申请实施例还可以应用于其它可能导致车辆探测盲区的路况场景中，例如上/下坡、加/减速等，本申请不做限制。In addition, FIGS. 3A, 3B, 3C, and 3D take curved road conditions as an example; the embodiments of the present application can also be applied to other road-condition scenarios that may cause vehicle detection blind zones, such as going up/downhill or accelerating/decelerating, which is not limited in this application.
本申请实施例提供的目标检测方法具体可以应用于上述场景中的车辆中,具体的,该方法可以通过软件承载在车辆的计算设备中,计算设备例如是一个单独的车载设备(如车载电脑、驾驶控制装置、驾驶控制设备等),或者是一个或多个处理芯片,或者集成电路等,本申请对此不作限定。The target detection method provided by the embodiment of the present application can be applied to the vehicle in the above-mentioned scenario. Specifically, the method can be carried in the computing device of the vehicle through software, and the computing device is, for example, a separate vehicle-mounted device (such as a vehicle-mounted computer, Driving control device, driving control device, etc.), or one or more processing chips, or integrated circuits, etc., which are not limited in this application.
参见图4,为一种可能的车辆架构图,包含:计算设备、控制设备以及车载传感器等。下面对车辆中的各个部件进行介绍。Referring to FIG. 4 , it is a possible vehicle architecture diagram, including computing equipment, control equipment, and on-board sensors. The various components in the vehicle are described below.
车载传感器(简称传感器),用于实时采集所述车辆的各种传感器数据。本申请实施例中安装在车辆上的传感器例如包括:相机(或称之为摄像头)、定位系统、雷达传感器、姿态传感器等。另外还可以包括其他传感器,例如轴转速传感器、轮速传感器等,本申请不做限制。Vehicle-mounted sensors (sensors for short) are used to collect various sensor data of the vehicle in real time. The sensors installed on the vehicle in the embodiments of the present application include, for example, a camera (or referred to as a camera), a positioning system, a radar sensor, an attitude sensor, and the like. In addition, other sensors may also be included, such as a shaft rotational speed sensor, a wheel speed sensor, etc., which are not limited in this application.
其中,雷达传感器,可简称为雷达。雷达可以测量周围环境的雷达点迹数据,还可以测量车辆在设定方向上的障碍物的信息,例如前车的位置、速度(包括线速度、角速度)等。Among them, the radar sensor may be referred to as radar. The radar can measure the radar trace data of the surrounding environment, and can also measure the information of the obstacles in the set direction of the vehicle, such as the position and speed of the preceding vehicle (including linear velocity, angular velocity), etc.
本申请实施例中的雷达主要以毫米波雷达为例。毫米波雷达:是工作在毫米波波段(millimeter wave)探测的雷达。通常毫米波是指30~300GHz频段(波长为1~10mm)。毫米波的波长介于厘米波和光波之间,因此毫米波兼有微波制导和光电制导的优点。将毫米波雷达安装在汽车上,可以测量从毫米波雷达到被测物体之间的距离、角度和相对速度等。利用毫米波雷达可以实现自适应巡航控制,前向防撞报警(Forward Collision Warning),盲点检测(Blind Spot Detection),辅助停车(Parking aid),辅助变道(Lane change assistant),自主巡航控制等高级驾驶辅助系统(ADAS)功能。The radar in the embodiments of the present application mainly takes a millimeter-wave radar as an example. Millimeter wave radar: It is a radar that works in the millimeter wave band (millimeter wave). Generally, millimeter wave refers to the 30-300GHz frequency band (wavelength is 1-10mm). The wavelength of millimeter wave is between centimeter wave and light wave, so millimeter wave has the advantages of microwave guidance and photoelectric guidance. Installing the millimeter-wave radar on the car can measure the distance, angle and relative speed from the millimeter-wave radar to the object to be measured. Using millimeter wave radar can realize adaptive cruise control, forward collision warning (Forward Collision Warning), blind spot detection (Blind Spot Detection), assisted parking (Parking aid), auxiliary lane change (Lane change assistant), autonomous cruise control, etc. Advanced Driver Assistance Systems (ADAS) features.
在本申请实施例中,毫米波雷达可以与随动机构联动连接,当该随动机构转动时,毫米波雷达的姿态随之发生变化。其中,随动机构可以是毫米波雷达的组成部分,也可以是毫米波雷达配套设备,这里不做限制。In the embodiment of the present application, the millimeter-wave radar may be linked with the follower mechanism, and when the follower mechanism rotates, the attitude of the millimeter-wave radar changes accordingly. Among them, the follow-up mechanism can be a component of the millimeter-wave radar, or it can be a supporting device of the millimeter-wave radar, which is not limited here.
姿态传感器：姿态传感器是基于传感器即微机电系统(Microelectro Mechanical Systems,MEMS)技术的高性能三维运动姿态测量系统。本申请实施例中，姿态传感器安装在雷达上，用于检测雷达的姿态。Attitude sensor: the attitude sensor is a high-performance three-dimensional motion attitude measurement system based on Micro-Electro-Mechanical Systems (MEMS) technology. In the embodiments of the present application, the attitude sensor is installed on the radar and is used to detect the attitude of the radar.
相机可以部署在车辆的四周，并对车辆周围的环境参数进行采集。例如，车辆的前后保险杠、侧视镜、挡风玻璃、车顶上可以分别安装至少一个相机。在本申请实施例中，相机至少可以获取车辆前方的图像，使得计算设备可以根据该图像确定车辆行驶方向上的目标障碍物的类型(例如前车)，还可以确定车辆与目标障碍物的距离，目标障碍物的位置以及道路状况等信息。Cameras may be deployed around the vehicle to collect environmental parameters around the vehicle. For example, at least one camera may be installed on each of the front and rear bumpers, side mirrors, windshield, and roof of the vehicle. In the embodiments of the present application, the camera can at least acquire an image of the area in front of the vehicle, so that the computing device can determine, according to the image, the type of the target obstacle in the driving direction of the vehicle (for example, the preceding vehicle), as well as information such as the distance between the vehicle and the target obstacle, the position of the target obstacle, and the road conditions.
需要强调的是，本申请实施例中的车载传感器是以相机和毫米波雷达的组合为例。但本申请并不限定传感器的具体类型，例如车辆的传感器还可以是固态激光雷达和相机的组合，毫米波雷达和固态激光雷达的组合，相机与相机的组合，毫米波雷达、固态激光和相机的组合等。只要是检测范围大、检测距离短、检测结果(障碍物位置速度等)不太准确的传感器，与检测范围小、检测距离远、检测结果准确的传感器搭配使用，本申请实施例均适用。It should be emphasized that the vehicle-mounted sensors in the embodiments of the present application take the combination of a camera and a millimeter-wave radar as an example. However, the present application does not limit the specific types of sensors; for example, the sensors of the vehicle may also be a combination of a solid-state lidar and a camera, a combination of a millimeter-wave radar and a solid-state lidar, a combination of two cameras, or a combination of a millimeter-wave radar, a solid-state lidar, and a camera. As long as a sensor with a large detection range, a short detection distance, and relatively inaccurate detection results (obstacle position, speed, etc.) is used together with a sensor with a small detection range, a long detection distance, and accurate detection results, the embodiments of the present application are applicable.
定位系统用于对车辆当前的位置进行定位。示例性的,定位系统可以是全球定位系统(Global Positioning System,GPS)、北斗系统或者其他定位系统,用于接收卫星信号,并对车辆当前的位置进行定位。定位系统还可以是视觉定位、雷达定位、融合定位等,本申请不做限制,后文主要以GPS为例。The positioning system is used to locate the current position of the vehicle. Exemplarily, the positioning system may be a global positioning system (Global Positioning System, GPS), a Beidou system or other positioning systems, for receiving satellite signals and locating the current position of the vehicle. The positioning system may also be visual positioning, radar positioning, fusion positioning, etc., which are not limited in this application, and GPS is mainly used as an example in the following text.
计算设备负责计算功能，用于根据安装在车辆上的各式各样的传感器(如相机、雷达、定位系统等)，在车辆行驶的过程中实时采集该车辆周围环境的参数信息，对采集到的参数信息进行运算与分析，并可以根据分析结果确定控车指令。The computing device is responsible for the computing functions: it collects, in real time while the vehicle is driving, parameter information about the vehicle's surroundings through the various sensors installed on the vehicle (such as the camera, radar, and positioning system), performs computation and analysis on the collected parameter information, and may determine vehicle control commands according to the analysis results.
计算设备中可以提供各种计算功能。例如包括：根据相机和雷达采集到的参数信息感知车辆行驶过程中的周围环境信息；根据定位系统采集到的参数信息确定车辆的地理位置；根据相机、雷达、定位系统等传感器采集到的参数信息进行运算和分析，确定车辆的状态，例如确定雷达是否处于目标丢失状态或目标即将丢失状态；根据上述传感器采集到的参数信息产生控制指令并发送给控制设备，使得控制设备控制相应的传感器，例如可以在雷达处于目标丢失状态或即将处于目标丢失状态时产生控制雷达的随动机构转动的指令，然后将该指令发送给控制设备，使得控制设备控制随动机构转动，进而调整雷达的姿态，使得雷达能够重新获得目标。The computing device may provide various computing functions, for example: perceiving the surrounding environment information during driving according to the parameter information collected by the camera and the radar; determining the geographic location of the vehicle according to the parameter information collected by the positioning system; performing computation and analysis according to the parameter information collected by sensors such as the camera, the radar, and the positioning system to determine the state of the vehicle, for example, determining whether the radar is in a target-lost state or a target-about-to-be-lost state; and generating control instructions according to the parameter information collected by the above sensors and sending them to the control device, so that the control device controls the corresponding sensor. For example, when the radar is in, or is about to be in, the target-lost state, an instruction for controlling the rotation of the radar's follower mechanism may be generated and sent to the control device, so that the control device controls the follower mechanism to rotate, thereby adjusting the attitude of the radar so that the radar can reacquire the target.
本申请实施例对计算设备的类型不作具体限定。作为一个示例,计算设备可以是移动数据中心(Mobile Data Center,MDC)。MDC是自动驾驶车辆本地的计算平台,MDC上运行的自动驾驶软件,包括本方案的相机感知算法、毫米波雷达感知算法、目标融合跟踪算法(即基于雷达信息和相机信息融合的目标跟踪方法对应的算法)等。MDC还运行一些简单的操作,如电机的控制等。This embodiment of the present application does not specifically limit the type of the computing device. As an example, the computing device may be a Mobile Data Center (MDC). The MDC is the local computing platform of the autonomous vehicle. The autonomous driving software running on the MDC includes the camera perception algorithm, millimeter-wave radar perception algorithm, and target fusion tracking algorithm of this solution (that is, the target tracking method based on the fusion of radar information and camera information corresponds to algorithm) etc. The MDC also runs some simple operations, such as motor control.
本申请实施例对控制设备的类型不作具体限定。作为一个示例,控制设备可以是微控制单元(micro-control unit,MCU)。可选的,MCU也可以称为单片微型计算机(singlechip microcomputer,SCM)或者单片机。The embodiment of the present application does not specifically limit the type of the control device. As an example, the control device may be a micro-control unit (MCU). Optionally, the MCU may also be referred to as a single-chip microcomputer (singlechip microcomputer, SCM) or a single-chip microcomputer.
应理解，微控制单元MCU可以是对中央处理器(central processing unit,CPU)的频率与规格做适当缩减，并将内存(memory)、计数器(timer)、通用串口总线(universal serial bus,USB)、模数(analog to digital,AD)转换、通用异步收发传输器(universal asynchronous receiver transmitter,UART)、可编程逻辑控制器(programmable logic controller,PLC)、直接内存存取(direct memory access,DMA)等中的至少一个周边接口，甚至液晶显示器(liquid crystal display,LCD)驱动电路都整合在单一芯片上，形成芯片级的计算设备，为不同的应用场合做不同组合控制。It should be understood that the micro control unit (MCU) may be formed by appropriately reducing the frequency and specifications of a central processing unit (CPU) and integrating, on a single chip, at least one peripheral interface such as memory, a timer, a universal serial bus (USB), analog-to-digital (AD) conversion, a universal asynchronous receiver transmitter (UART), a programmable logic controller (PLC), or direct memory access (DMA), and even a liquid crystal display (LCD) driver circuit, thereby forming a chip-level computing device that provides different combined controls for different applications.
控制设备在接收到计算设备发送的控车指令之后,可以通过车控接口来控制车载传感器(如控制雷达的姿态),从而可以实现对车辆的辅助控制。After receiving the vehicle control command sent by the computing device, the control device can control the vehicle sensor (eg, control the attitude of the radar) through the vehicle control interface, so as to realize the auxiliary control of the vehicle.
本领域技术人员可以理解，图4中示出的车载设备的结构并不构成对车载设备的限定，本申请实施例提供的车载设备可以包括比图示更多或更少的模块，或者组合某些模块，或者不同的部件布置，本申请也不作限定。例如，所述车载设备内还可以包括制动机构(如刹车、油门、档位等)、人机交互输入输出部件(如显示屏等)、无线通信模块、通信接口等。Those skilled in the art can understand that the structure of the in-vehicle device shown in FIG. 4 does not constitute a limitation on the in-vehicle device. The in-vehicle device provided in the embodiments of the present application may include more or fewer modules than shown, combine certain modules, or have a different arrangement of components, which is not limited in this application. For example, the in-vehicle device may also include braking mechanisms (such as the brake, accelerator, and gears), human-computer interaction input/output components (such as a display screen), a wireless communication module, a communication interface, and the like.
如图5所示,为本申请实施例提供一种目标检测方法的流程图,该方法可以应用于图4所示的车辆中,方法包括:As shown in FIG. 5 , a flow chart of a method for object detection is provided in an embodiment of the present application. The method can be applied to the vehicle shown in FIG. 4 , and the method includes:
S501、第一车辆获取第一车辆(自车)和第二车辆(前车)所处的路况,以及获取第二车辆的速度。S501. The first vehicle acquires the road conditions where the first vehicle (own vehicle) and the second vehicle (the preceding vehicle) are located, and acquires the speed of the second vehicle.
具体的,第一车辆安装有相机,毫米波雷达以及定位系统等传感器,计算设备实时采集各传感器采集的传感器数据。其中,相机可以实时获取第一车辆的前方的图像信息以及对第二车辆进行检测跟踪(例如检测第二车辆的类型和位置),定位系统可以实时对第一车辆的位置进行定位,毫米波雷达可以对目标障碍物(本文以第二车辆为例)进行检测跟踪(例如检测第二车辆的位置和速度)。第一车辆的具体结构可以参考图4所示的结构,这里不再详细介绍。Specifically, the first vehicle is equipped with sensors such as a camera, a millimeter-wave radar, and a positioning system, and the computing device collects sensor data collected by each sensor in real time. Among them, the camera can obtain the image information in front of the first vehicle in real time and detect and track the second vehicle (for example, detect the type and position of the second vehicle), the positioning system can locate the position of the first vehicle in real time, and the millimeter wave radar The target obstacle (taking the second vehicle as an example herein) can be detected and tracked (eg, the position and speed of the second vehicle are detected). For the specific structure of the first vehicle, reference may be made to the structure shown in FIG. 4 , which will not be described in detail here.
第一车辆的计算设备可以基于定位系统确定第一车辆所处的路况。具体的,第一车辆的计算设备可以基于定位系统获取第一车辆在地图上的位置,从而可以基于地图判断自身所处的路况,例如弯道、或者直道、或者上坡、或者下坡等。The computing device of the first vehicle may determine the road condition in which the first vehicle is located based on the positioning system. Specifically, the computing device of the first vehicle can obtain the position of the first vehicle on the map based on the positioning system, so as to determine the road condition on which it is located, such as a curve, a straight road, an uphill, or a downhill, based on the map.
第一车辆的计算设备还可以基于定位系统确定第一车辆的行驶距离。例如，参见图6或图7或图8，t0时刻定位系统定位第一车辆所在位置为A点；t1时刻定位系统定位第一车辆所在位置为A′点，计算设备根据A点和A′点位置可以计算出A点到A′点的距离|AA′|(第一车辆从t0时刻到t1时刻之间的位移)。The computing device of the first vehicle may also determine the distance traveled by the first vehicle based on the positioning system. For example, referring to FIG. 6, FIG. 7, or FIG. 8, at time t0 the positioning system locates the first vehicle at point A, and at time t1 the positioning system locates the first vehicle at point A′. From the positions of points A and A′, the computing device can calculate the distance |AA′| from point A to point A′ (the displacement of the first vehicle from time t0 to time t1).
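As one illustrative way to compute the displacement |AA′| from two positioning fixes, the sketch below uses the haversine great-circle formula on latitude/longitude coordinates; the earth-radius constant and the function name are assumptions of this sketch, not values from the application:

```python
# Hypothetical displacement computation between GPS fixes at t0 and t1.
import math

def displacement_m(lat_a, lon_a, lat_b, lon_b, earth_radius_m=6371000.0):
    """Great-circle distance in metres between point A and point A'."""
    phi_a, phi_b = math.radians(lat_a), math.radians(lat_b)
    dphi = math.radians(lat_b - lat_a)
    dlam = math.radians(lon_b - lon_a)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi_a) * math.cos(phi_b) * math.sin(dlam / 2) ** 2)
    return 2.0 * earth_radius_m * math.asin(math.sqrt(h))
```

One degree of longitude at the equator yields roughly 111 km, which matches the expected scale of the formula.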
第一车辆的计算设备还可以通过相机获取第二车辆所处的路况。具体的,第一车辆的计算设备控制相机拍摄第一车辆的前方的图像,该图像中包含第二车辆以及第二车辆所处的道路信息,例如车道线的曲率半径或其他道路指示牌等,进而基于该图像判断第二车辆所处的路况,例如弯道或者直道、或者上坡、或者下坡等。The computing device of the first vehicle may also acquire the road conditions where the second vehicle is located through the camera. Specifically, the computing device of the first vehicle controls the camera to capture an image in front of the first vehicle, and the image includes information about the second vehicle and the road where the second vehicle is located, such as the curvature radius of the lane line or other road signs, etc., Then, based on the image, determine the road condition on which the second vehicle is located, such as a curve or a straight road, or an uphill, or a downhill, and the like.
第一车辆的计算设备还可以根据相机和毫米波雷达检测的数据综合确定第二车辆的位置和速度等信息。具体实现可以参考后文介绍的基于毫米波雷达信息和相机信息融合的目标跟踪方法,这里不详细介绍。The computing device of the first vehicle may also comprehensively determine information such as the position and speed of the second vehicle according to the data detected by the camera and the millimeter-wave radar. The specific implementation can refer to the target tracking method based on the fusion of millimeter-wave radar information and camera information, which is described later, and will not be introduced in detail here.
S502、若第二车辆的速度满足该路况对应的预设条件，则第一车辆确定第一车辆的毫米波雷达处于目标丢失状态或目标即将丢失状态。S502. If the speed of the second vehicle satisfies the preset condition corresponding to the road condition, the first vehicle determines that the millimeter-wave radar of the first vehicle is in a target-lost state or a target-about-to-be-lost state.
第二车辆的速度至少包括角速度或线速度两种，预设条件包括第二车辆的速度是否大于或者等于设定的第一阈值。The speed of the second vehicle includes at least an angular velocity or a linear velocity, and the preset condition includes whether the speed of the second vehicle is greater than or equal to a set first threshold.
其中，该设定的第一阈值应当小于或者等于第二车辆在毫米波雷达检测范围的边界上行驶时的速度(即第二车辆位于毫米波雷达检测范围内与位于毫米波雷达检测范围外之间的速度临界值)。The set first threshold should be less than or equal to the speed of the second vehicle when it travels on the boundary of the detection range of the millimeter-wave radar (that is, the critical speed separating the second vehicle being within the detection range of the millimeter-wave radar from being outside it).
例如，第一阈值等于第二车辆在毫米波雷达检测范围的边界上行驶时的速度，那么第二车辆的速度超过该第一阈值时，第二车辆会超出毫米波雷达检测范围，进入毫米波雷达检测盲区，此时毫米波雷达处于目标丢失状态，当第二车辆的速度等于该第一阈值(或者第二车辆的速度小于第一阈值且第二车辆的速度与第一阈值的差值小于预设差值)时，毫米波雷达处于目标即将丢失状态。For example, if the first threshold equals the speed of the second vehicle when it travels on the boundary of the detection range of the millimeter-wave radar, then when the speed of the second vehicle exceeds the first threshold, the second vehicle moves beyond the detection range of the millimeter-wave radar and enters its blind zone; at this time the millimeter-wave radar is in the target-lost state. When the speed of the second vehicle equals the first threshold (or the speed of the second vehicle is less than the first threshold and the difference between the speed of the second vehicle and the first threshold is less than a preset difference), the millimeter-wave radar is in the target-about-to-be-lost state.
例如，第一阈值小于第二车辆在毫米波雷达检测范围的边界上行驶时的速度，设第一阈值与临界值的差值为△X，那么第二车辆的速度超过该第一阈值且第二车辆的速度与该第一阈值的差值小于或等于△X时，毫米波雷达处于目标即将丢失状态，第二车辆的速度超过该第一阈值且第二车辆的速度与该第一阈值的差值大于△X时，毫米波雷达处于目标丢失状态。For another example, if the first threshold is smaller than the speed of the second vehicle when it travels on the boundary of the detection range of the millimeter-wave radar, and the difference between the first threshold and the critical value is △X, then when the speed of the second vehicle exceeds the first threshold and the difference between the speed of the second vehicle and the first threshold is less than or equal to △X, the millimeter-wave radar is in the target-about-to-be-lost state; when the speed of the second vehicle exceeds the first threshold and the difference between the speed of the second vehicle and the first threshold is greater than △X, the millimeter-wave radar is in the target-lost state.
另外，为了提高车辆的安全性，也可以在毫米波雷达实际处于目标即将丢失状态时，将毫米波雷达的状态设定为目标丢失状态(即目标丢失状态包含实际丢失和即将丢失两种情况)。In addition, in order to improve the safety of the vehicle, the state of the millimeter-wave radar may also be set to the target-lost state when the millimeter-wave radar is actually in the target-about-to-be-lost state (that is, the target-lost state covers both the actually-lost case and the about-to-be-lost case).
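The threshold logic of the two examples above can be summarised in a short sketch (hypothetical function and parameter names; N is the first threshold and △X the gap between N and the critical boundary speed, which is 0 when they coincide):

```python
def radar_target_state(speed, threshold_n, delta_x=0.0):
    """Classify the millimeter-wave radar state from the target's speed.

    threshold_n (N) is at most the critical boundary speed; delta_x
    is the gap between N and that critical value (0 when N equals it).
    Returns 'lost', 'about_to_be_lost', or 'tracked'.
    """
    if speed > threshold_n + delta_x:
        return 'lost'               # target beyond the detection boundary
    if speed >= threshold_n:
        return 'about_to_be_lost'   # within [N, N + delta_x]
    return 'tracked'
```

With `delta_x=0` this reproduces the first example (speed above N means lost, speed equal to N means about to be lost); with `delta_x > 0` it reproduces the second.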
应理解，在具体实施时，还可以使用其他词汇来描述毫米波雷达处于目标丢失状态或目标即将丢失状态，例如毫米波雷达处于丢帧状态(该状态可以理解为毫米波雷达没有检测到包含目标的数据帧)，本申请对此不做限制。It should be understood that, in specific implementations, other terms may also be used to describe that the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, for example, that the millimeter-wave radar is in a frame-loss state (which can be understood as the millimeter-wave radar failing to detect a data frame containing the target); this application does not limit this.
在本申请实施例中，针对不同的路况，预设条件可以不同，例如第一阈值不同。当第二车辆的速度满足当前路况对应的预设条件时，第一车辆的计算设备确定第一车辆的毫米波雷达处于目标丢失状态或目标即将丢失状态。In the embodiments of this application, the preset conditions may differ for different road conditions, for example, the first threshold may differ. When the speed of the second vehicle satisfies the preset condition corresponding to the current road condition, the computing device of the first vehicle determines that the millimeter-wave radar of the first vehicle is in the target-lost state or the target-about-to-be-lost state.
需要说明的是,在行驶过程中,如果毫米波雷达的姿态未被调整,则毫米波雷达的检测范围相对于第一车辆是固定不变的。但当第一车辆和第二车辆所处的位置,和/或,第一车辆和第二车辆的相对位置不同时,第一阈值不同。It should be noted that, during the driving process, if the attitude of the millimeter-wave radar is not adjusted, the detection range of the millimeter-wave radar is fixed relative to the first vehicle. However, when the positions of the first vehicle and the second vehicle, and/or the relative positions of the first vehicle and the second vehicle are different, the first threshold value is different.
以下例举几个具体的示例来说明。Several specific examples are given below to illustrate.
示例1、参见图6，第一车辆处于直道且第二车辆处于弯道，则预设条件包括:Example 1. Referring to FIG. 6, if the first vehicle is on a straight road and the second vehicle is on a curve, the preset conditions include:
Figure PCTCN2021124194-appb-000009
Figure PCTCN2021124194-appb-000009
各参数解释如下:Each parameter is explained as follows:
t0时刻为毫米波雷达采集第一帧数据的时刻，t1时刻为毫米波雷达采集第二帧数据的时刻，其中第一帧数据和第二帧数据可以是连续的两帧数据，当然在具体实施时也可以是间隔少量帧的两帧数据(例如第一帧数据和第二帧数据间隔1帧或2帧)。在本申请实施例中，主要以连续的两帧数据为例。Time t0 is the time when the millimeter-wave radar collects the first frame of data, and time t1 is the time when the millimeter-wave radar collects the second frame of data. The first frame of data and the second frame of data may be two consecutive frames; of course, in specific implementations they may also be two frames separated by a small number of frames (for example, the first frame and the second frame are separated by 1 or 2 frames). In the embodiments of this application, two consecutive frames of data are mainly used as an example.
t0时刻,第一车辆处于A位置,第二车辆处于B位置,B位置是第一车辆处于A位置时毫米波雷达检测范围的边界位置。t1时刻,第一车辆处于A′位置,第二车辆处于B′位置,B′位置是第一车辆处于A′位置时毫米波雷达检测范围的边界位置。在本申请实施例中,边界位置可以是一个区域或范围,例如在距离雷达检测范围边界线的±L区域内的位置均可以定义为毫米波雷达检测范围的边界位置。At time t0, the first vehicle is at position A, the second vehicle is at position B, and position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A. At time t1, the first vehicle is at the A' position, the second vehicle is at the B' position, and the B' position is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at the A' position. In this embodiment of the present application, the boundary position may be an area or a range. For example, the position within the ±L area from the boundary line of the radar detection range may be defined as the boundary position of the millimeter wave radar detection range.
ω r′为第二车辆在t2时刻绕弯道做圆周运动的瞬时角速度;N为第一阈值; ω r ' is the instantaneous angular velocity of the second vehicle making a circular motion around the curve at time t2; N is the first threshold;
ω r为第二车辆从t0时刻到t1时刻绕弯道做圆周运动的角速度(是平均角速度,当t0与t1间隔时间极短(例如采集一帧数据的时间)时,该角速度也可以认为是瞬时速度),t2≥t1>t0;
Figure PCTCN2021124194-appb-000010
为第一车辆从t0时刻到t1时刻之间的位移;
ω r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1 (this is the average angular velocity; when the interval between t0 and t1 is extremely short (for example, the time to collect one frame of data), it can also be considered the instantaneous angular velocity), t2≥t1>t0;
Figure PCTCN2021124194-appb-000010
is the displacement of the first vehicle from time t0 to time t1;
Figure PCTCN2021124194-appb-000011
为第二车辆从t0时刻到t1时刻之间的位移;需要说明的是，第二车辆从B到B′的实际路径是曲线，但是由于t0时刻到t1时刻非常短，例如30~50ms，所以该段路径可以近似为直线，即第二车辆从t0时刻到t1时刻的位移约等于B点到B′点的直线距离;
Figure PCTCN2021124194-appb-000011
is the displacement of the second vehicle from time t0 to time t1. It should be noted that the actual path of the second vehicle from B to B' is a curve, but since the interval from time t0 to time t1 is very short, for example 30-50 ms, this section of the path can be approximated as a straight line; that is, the displacement of the second vehicle from time t0 to time t1 is approximately equal to the straight-line distance from point B to point B';
Figure PCTCN2021124194-appb-000012
为t0时刻第一车辆和第二车辆之间的欧式距离;
Figure PCTCN2021124194-appb-000012
is the Euclidean distance between the first vehicle and the second vehicle at time t0;
ε为t0时刻A、B、O之间所形成的圆心角的角度值;ε is the angle value of the central angle formed between A, B, and O at time t0;
α是毫米波雷达检测波束角的一半,由毫米波雷达特性决定;α is half of the detection beam angle of the millimeter-wave radar, which is determined by the characteristics of the millimeter-wave radar;
其中,A与A′的坐标可以由定位系统得出,B、B′的位置坐标可以由基于毫米波雷达信息和相机信息融合的目标跟踪方法得出,该目标跟踪方法将在后文进一步详细介绍。根据A、A′、B、B′的位置坐标可以得到
Figure PCTCN2021124194-appb-000013
ε、ω r等可以由该目标跟踪方法输出的A、A′、B、B′的位置坐标进行几何分析计算得出。ω r’可以由毫米波雷达检测得到或者根据毫米波雷达检测的第二车辆的线速度v r’计算得到。
Among them, the coordinates of A and A' can be obtained by the positioning system, and the position coordinates of B and B' can be obtained by the target tracking method based on the fusion of millimeter-wave radar information and camera information; that target tracking method will be described in further detail later. According to the position coordinates of A, A', B, B', we can get
Figure PCTCN2021124194-appb-000013
ε, ω r, etc. can be obtained by geometric analysis of the position coordinates of A, A', B and B' output by the target tracking method. ω r' may be detected by the millimeter-wave radar or calculated from the linear velocity v r' of the second vehicle detected by the millimeter-wave radar.
K为第一预设系数，取值范围为(0,1]。可选的，K的值可以与第一车辆调整毫米波雷达的随动机构的执行时间和/或弯道的缓急等相关。执行时间越长，K越小，执行时间越短，K越大；弯道越急，K越小，弯道越缓，K越大。K is a first preset coefficient with a value range of (0, 1]. Optionally, the value of K may be related to the execution time of the follow-up mechanism with which the first vehicle adjusts the millimeter-wave radar and/or the sharpness of the curve. The longer the execution time, the smaller K is; the shorter the execution time, the larger K is. The sharper the curve, the smaller K is; the gentler the curve, the larger K is.
示例性的,K的取值满足以下公式:Exemplarily, the value of K satisfies the following formula:
K(R,t)=e^-(t/aR);K(R,t)=e^-(t/aR);
其中,t为执行时间,R为弯道半径,a是待定系数。Among them, t is the execution time, R is the radius of the curve, and a is the undetermined coefficient.
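The coefficient formula above can be evaluated directly. A small sketch (the coefficient a must be calibrated; the values used below are purely illustrative) showing that K stays in (0, 1] and decreases with a longer execution time or a tighter curve:

```python
import math

def preset_coefficient(t, r, a):
    """K(R, t) = e^(-t / (a * R)) from the formula above.

    t: execution time of the follow-up mechanism, r: curve radius,
    a: coefficient to be calibrated. A longer execution time or a
    tighter curve (smaller r) drives the value toward 0; the result
    always lies in (0, 1] for t >= 0.
    """
    return math.exp(-t / (a * r))
```

The same expression is reused for the coefficients P and Q in Examples 2 and 3.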
以下介绍ω r的推导过程: The following describes the derivation process of ω r :
第一车辆在自动驾驶车辆行驶过程中,第一车辆处于直道,目标车处于弯道。t0时刻,第一车辆处于A位置,目标车处于B位置,该B位置是第一车辆处于A位置时毫米波雷达检测范围的边界位置。当t1时刻,第一车辆处于A′,目标车处于B′,该B′位置是第一车辆处于A′位置时毫米波雷达检测范围的边界位置。O表示是该弯道的圆心(曲率中心),进行做图分析,连接
Figure PCTCN2021124194-appb-000014
分别垂直于B和B′的切线方向(因为t0到t1时间极短,所以
Figure PCTCN2021124194-appb-000015
均可以认为垂直于
Figure PCTCN2021124194-appb-000016
)。
During automated driving, the first vehicle is on a straight road and the target vehicle is on a curve. At time t0, the first vehicle is at position A and the target vehicle is at position B, where position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A. At time t1, the first vehicle is at A' and the target vehicle is at B', where position B' is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A'. O denotes the center of the curve (the center of curvature). For the geometric analysis, connect
Figure PCTCN2021124194-appb-000014
which are perpendicular to the tangent directions at B and B', respectively (because the interval from t0 to t1 is extremely short,
Figure PCTCN2021124194-appb-000015
can be considered to be perpendicular to
Figure PCTCN2021124194-appb-000016
).
参照图6,根据几何关系,有:Referring to Figure 6, according to the geometric relationship, there are:
Figure PCTCN2021124194-appb-000017
Figure PCTCN2021124194-appb-000017
Figure PCTCN2021124194-appb-000018
Figure PCTCN2021124194-appb-000018
Figure PCTCN2021124194-appb-000019
Figure PCTCN2021124194-appb-000019
通过上式可以求得:It can be obtained by the above formula:
Figure PCTCN2021124194-appb-000020
Figure PCTCN2021124194-appb-000020
由于路况,
Figure PCTCN2021124194-appb-000021
可以看做近似垂直于
Figure PCTCN2021124194-appb-000022
可得:ε≈γ;
Due to road conditions,
Figure PCTCN2021124194-appb-000021
can be seen as approximately perpendicular to
Figure PCTCN2021124194-appb-000022
we obtain: ε≈γ;
第二车辆在弯道的转弯半径为:The turning radius of the second vehicle in the curve is:
Figure PCTCN2021124194-appb-000023
Figure PCTCN2021124194-appb-000023
可以视作,第一车辆和第二车辆处于同一车道,则有:
Figure PCTCN2021124194-appb-000024
It can be considered that the first vehicle and the second vehicle are in the same lane, then:
Figure PCTCN2021124194-appb-000024
第二车辆的转弯半径:Turning radius of the second vehicle:
Figure PCTCN2021124194-appb-000025
Figure PCTCN2021124194-appb-000025
根据汽车圆周运动的计算公式,可得:According to the calculation formula of the circular motion of the car, we can get:
Figure PCTCN2021124194-appb-000026
Figure PCTCN2021124194-appb-000026
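The closed-form expression above is contained in the image placeholders, but the quantity ω r can also be estimated numerically from the tracked fixes. A minimal Python sketch (not part of the original disclosure; it assumes planar (x, y) coordinates and that the curve centre O is known, for example from the map):

```python
import math

def mean_angular_velocity(center, b, b_prime, t0, t1):
    """Average angular velocity of the target about curve centre O.

    center, b, b_prime are (x, y) fixes of O, B and B' (B and B' come
    from the radar-camera fusion tracker). Over the very short frame
    interval t1 - t0 this approximates the instantaneous omega_r used
    in the preset condition above.
    """
    def bearing(p):
        return math.atan2(p[1] - center[1], p[0] - center[0])
    sweep = bearing(b_prime) - bearing(b)
    # normalise to (-pi, pi] so the short arc is used
    sweep = (sweep + math.pi) % (2 * math.pi) - math.pi
    return abs(sweep) / (t1 - t0)
```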
示例2、参见图7，第一车辆处于弯道且第二车辆处于直道，则预设条件包括:Example 2. Referring to FIG. 7, if the first vehicle is on a curve and the second vehicle is on a straight road, the preset conditions include:
Figure PCTCN2021124194-appb-000027
Figure PCTCN2021124194-appb-000027
各参数解释如下:Each parameter is explained as follows:
t0时刻为毫米波雷达采集第一帧数据的时刻，t1时刻为毫米波雷达采集第二帧数据的时刻，其中第一帧数据和第二帧数据可以是连续的两帧数据，当然在具体实施时也可以是间隔少量帧的两帧数据(例如第一帧数据和第二帧数据间隔1帧或2帧)。在本申请实施例中，主要以连续的两帧数据为例。Time t0 is the time when the millimeter-wave radar collects the first frame of data, and time t1 is the time when the millimeter-wave radar collects the second frame of data. The first frame of data and the second frame of data may be two consecutive frames; of course, in specific implementations they may also be two frames separated by a small number of frames (for example, the first frame and the second frame are separated by 1 or 2 frames). In the embodiments of this application, two consecutive frames of data are mainly used as an example.
t0时刻,第一车辆处于A位置,第二车辆处于B位置,B位置是第一车辆处于A位置时毫米波雷达检测范围的边界位置。t1时刻,第一车辆处于A′位置,第二车辆处于B′位置,B′位置是第一车辆处于A′位置时毫米波雷达检测范围的边界位置。同示例1,该边界位置可以是一个区域或范围。At time t0, the first vehicle is at position A, the second vehicle is at position B, and position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A. At time t1, the first vehicle is at the A' position, the second vehicle is at the B' position, and the B' position is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at the A' position. As in Example 1, the boundary position may be an area or a range.
Figure PCTCN2021124194-appb-000028
是第二车辆从t0时刻到t1时刻之间的位移;
Figure PCTCN2021124194-appb-000028
is the displacement of the second vehicle from time t0 to time t1;
v r′为第二车辆在t2时刻在直道上的瞬时行驶速度(本文中的行驶速度是指线速度),t2≥t1>t0;N为第一阈值; v r ' is the instantaneous traveling speed of the second vehicle on the straight road at time t2 (the traveling speed in this paper refers to the linear speed), t2≥t1>t0; N is the first threshold;
v r为第二车辆从t0时刻到t1时刻在直道上的行驶速度(本文中的行驶速度是指线速度，这里的v r是平均线速度，当t0与t1间隔时间极短(例如采集一帧数据的时间)时，该线速度也可以认为是瞬时线速度); v r is the traveling speed of the second vehicle on the straight road from time t0 to time t1 (the traveling speed herein refers to the linear speed; v r here is the average linear speed, and when the interval between t0 and t1 is extremely short (for example, the time to collect one frame of data), this linear speed can also be considered the instantaneous linear speed);
其中,A与A′的坐标可以由定位系统得出,B、B′的位置坐标可以由基于毫米波雷达信息和相机信息融合的目标跟踪方法得出,目标跟踪方法将在后文进一步详细介绍。根据B、B′的位置坐标可以得到
Figure PCTCN2021124194-appb-000029
v r可以由该目标跟踪方法输出的A、A′、B、B′的位置坐标进行几何分析计算得出。v r’可以由毫米波雷达检测得到。
Among them, the coordinates of A and A' can be obtained by the positioning system, and the position coordinates of B and B' can be obtained by the target tracking method based on the fusion of millimeter-wave radar information and camera information; that target tracking method will be introduced in detail later. According to the position coordinates of B and B', we can get
Figure PCTCN2021124194-appb-000029
v r can be obtained by geometric analysis of the position coordinates of A, A', B and B' output by the target tracking method. v r' can be detected by the millimeter-wave radar.
P为第二预设系数，取值范围为(0,1]。P与示例1中的K相同或不同。可选的，P的值与第一车辆调整毫米波雷达的随动机构的执行时间和/或弯道的缓急等相关。执行时间越长，P越小，执行时间越短，P越大；弯道越急，P越小，弯道越缓，P越大。P is a second preset coefficient with a value range of (0, 1]. P may be the same as or different from K in Example 1. Optionally, the value of P is related to the execution time of the follow-up mechanism with which the first vehicle adjusts the millimeter-wave radar and/or the sharpness of the curve. The longer the execution time, the smaller P is; the shorter the execution time, the larger P is. The sharper the curve, the smaller P is; the gentler the curve, the larger P is.
示例性的,P的取值满足以下公式:Exemplarily, the value of P satisfies the following formula:
P(R,t)=e^-(t/aR);P(R,t)=e^-(t/aR);
其中,t为执行时间,R为弯道半径,a是待定系数。Among them, t is the execution time, R is the radius of the curve, and a is the undetermined coefficient.
以下介绍v r的推导过程: The following describes the derivation process of v r :
参照图7,因为第一车辆的转弯半径:Referring to Figure 7, because the turning radius of the first vehicle:
Figure PCTCN2021124194-appb-000030
Figure PCTCN2021124194-appb-000030
Figure PCTCN2021124194-appb-000031
Figure PCTCN2021124194-appb-000031
θ=ω r(t 1-t 0); θ=ω r (t 1 −t 0 );
又因为
Figure PCTCN2021124194-appb-000032
由三角余弦定理有:
also because
Figure PCTCN2021124194-appb-000032
From the law of cosines, we have:
Figure PCTCN2021124194-appb-000033
Figure PCTCN2021124194-appb-000033
Figure PCTCN2021124194-appb-000034
Figure PCTCN2021124194-appb-000034
Figure PCTCN2021124194-appb-000035
其中,β等于α,均为雷达的检测波束角的一半。
Figure PCTCN2021124194-appb-000035
where β is equal to α, both being half of the detection beam angle of the radar.
所以由三角正弦定理有:So from the law of sines, we have:
Figure PCTCN2021124194-appb-000036
Figure PCTCN2021124194-appb-000036
Figure PCTCN2021124194-appb-000037
Figure PCTCN2021124194-appb-000037
又因为ε=ξ+θ-γ,所以
Figure PCTCN2021124194-appb-000038
And because ε=ξ+θ-γ, so
Figure PCTCN2021124194-appb-000038
所以第二车辆直线行驶的速度为:
Figure PCTCN2021124194-appb-000039
So the speed of the second vehicle in a straight line is:
Figure PCTCN2021124194-appb-000039
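Since the chord from B to B' approximates the target's path over one short frame interval, v r can also be estimated numerically from the two tracked fixes. A minimal sketch under the same planar-coordinate assumption as before (hypothetical names; the closed-form expression in the image placeholder above is the patent's actual formula):

```python
import math

def mean_linear_speed(b, b_prime, t0, t1):
    """Average linear speed of the target between fixes B and B'.

    Over the very short frame interval (30-50 ms in the text) the
    chord B -> B' approximates the path, so |BB'| / (t1 - t0) serves
    as an estimate of the instantaneous v_r used in the condition.
    """
    chord = math.hypot(b_prime[0] - b[0], b_prime[1] - b[1])
    return chord / (t1 - t0)
```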
示例3、参见图8，第一车辆处于弯道且第二车辆处于弯道，则预设条件包括:Example 3. Referring to FIG. 8, if both the first vehicle and the second vehicle are on a curve, the preset conditions include:
Figure PCTCN2021124194-appb-000040
Figure PCTCN2021124194-appb-000040
各参数解释如下:Each parameter is explained as follows:
t0时刻为毫米波雷达采集第一帧数据的时刻，t1时刻为毫米波雷达采集第二帧数据的时刻，其中第一帧数据和第二帧数据可以是连续的两帧数据，当然在具体实施时也可以是间隔少量帧的两帧数据(例如第一帧数据和第二帧数据间隔1帧或2帧)。在本申请实施例中，主要以连续的两帧数据为例。Time t0 is the time when the millimeter-wave radar collects the first frame of data, and time t1 is the time when the millimeter-wave radar collects the second frame of data. The first frame of data and the second frame of data may be two consecutive frames; of course, in specific implementations they may also be two frames separated by a small number of frames (for example, the first frame and the second frame are separated by 1 or 2 frames). In the embodiments of this application, two consecutive frames of data are mainly used as an example.
t0时刻,第一车辆处于A位置,第二车辆处于B位置,B位置是第一车辆处于A位置时毫米波雷达检测范围的边界位置。t1时刻,第一车辆处于A′位置,第二车辆处于B′位置,B′位置是第一车辆处于A′位置时毫米波雷达检测范围的边界位置。At time t0, the first vehicle is at position A, the second vehicle is at position B, and position B is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at position A. At time t1, the first vehicle is at the A' position, the second vehicle is at the B' position, and the B' position is the boundary position of the detection range of the millimeter-wave radar when the first vehicle is at the A' position.
ω r′为第二车辆在t2时刻绕弯道做圆周运动的瞬时角速度; ω r ' is the instantaneous angular velocity of the second vehicle making a circular motion around the curve at time t2;
ω r为第二车辆从t0时刻到t1时刻绕弯道做圆周运动的角速度(平均角速度，当t0与t1间隔时间极短(例如采集一帧数据的时间)时，该角速度也可以认为是瞬时速度)，t2≥t1>t0;N为第一阈值; ω r is the angular velocity of the second vehicle making a circular motion around the curve from time t0 to time t1 (this is the average angular velocity; when the interval between t0 and t1 is extremely short (for example, the time to collect one frame of data), it can also be considered the instantaneous angular velocity), t2≥t1>t0; N is the first threshold;
v是第一车辆的行驶速度(瞬时线速度),可以由第一车辆的底盘信息或者第一车辆定位系统获取;v is the traveling speed (instantaneous linear speed) of the first vehicle, which can be obtained from the chassis information of the first vehicle or the first vehicle positioning system;
Figure PCTCN2021124194-appb-000041
为t0时刻第一车辆和第二车辆之间的欧式距离;
Figure PCTCN2021124194-appb-000041
is the Euclidean distance between the first vehicle and the second vehicle at time t0;
α是毫米波雷达检测波束角的一半,由毫米波雷达特性决定;α is half of the detection beam angle of the millimeter-wave radar, which is determined by the characteristics of the millimeter-wave radar;
其中,A与A′的坐标可以由定位系统得出,B、B′的位置坐标可以由基于毫米波雷达信息和相机信息融合的目标跟踪方法得出,目标跟踪方法将在后文进一步详细介绍。根据A、A′、B、B′的位置坐标可以得到
Figure PCTCN2021124194-appb-000042
ω r’可以由毫米波雷达检测得到或者根据毫米波雷达检测的第二车辆的线速度v r’计算得到。
Among them, the coordinates of A and A' can be obtained by the positioning system, and the position coordinates of B and B' can be obtained by the target tracking method based on the fusion of millimeter-wave radar information and camera information; that target tracking method will be introduced in detail later. According to the position coordinates of A, A', B, B', we can get
Figure PCTCN2021124194-appb-000042
ω r ' may be detected by the millimeter-wave radar or calculated according to the linear velocity v r ' of the second vehicle detected by the millimeter-wave radar.
Q为第三预设系数，取值范围为(0,1]，Q与P、K相同或不同。可选的，Q的值与第一车辆调整毫米波雷达的随动机构的执行时间和/或弯道的缓急等相关。执行时间越长，Q越小，执行时间越短，Q越大；弯道越急，Q越小，弯道越缓，Q越大。Q is a third preset coefficient with a value range of (0, 1]; Q may be the same as or different from P and K. Optionally, the value of Q is related to the execution time of the follow-up mechanism with which the first vehicle adjusts the millimeter-wave radar and/or the sharpness of the curve. The longer the execution time, the smaller Q is; the shorter the execution time, the larger Q is. The sharper the curve, the smaller Q is; the gentler the curve, the larger Q is.
示例性的,Q的取值满足以下公式:Exemplarily, the value of Q satisfies the following formula:
Q(R,t)=e^-(t/aR);Q(R,t)=e^-(t/aR);
其中,t为执行时间,R为弯道半径,a是待定系数。Among them, t is the execution time, R is the radius of the curve, and a is the undetermined coefficient.
ω r的推导过程如下: The derivation process of ωr is as follows:
当第一车辆和目标车都处于弯道行驶的过程中，如图8，过O点作OC垂直于AB交AE于D;When both the first vehicle and the target vehicle are traveling on a curve, as shown in FIG. 8, draw OC through point O perpendicular to AB, intersecting AE at D;
假设两车都处于匀速圆周运动,
Figure PCTCN2021124194-appb-000043
且∠COA=∠CAD=α;
Assuming that both cars are in uniform circular motion,
Figure PCTCN2021124194-appb-000043
And ∠COA=∠CAD=α;
所以第二车辆的转弯半径为:So the turning radius of the second vehicle is:
Figure PCTCN2021124194-appb-000044
Figure PCTCN2021124194-appb-000044
所以
Figure PCTCN2021124194-appb-000045
so
Figure PCTCN2021124194-appb-000045
需要说明的是,以上仅为示例而非限定,在实际应用中,针对其它道路场景也可以采用相同的思路来设计预设条件。It should be noted that the above is only an example and not a limitation. In practical applications, the same idea can also be used to design the preset conditions for other road scenarios.
应理解,以上是以目标为第二车辆为例,在实际应用中,针对其它目标,例如行人、动物等也可以采用本申请实施例技术方案进行目标检测,本申请不做限制。It should be understood that the above takes the target as the second vehicle as an example. In practical applications, other targets, such as pedestrians, animals, etc., can also be detected using the technical solutions of the embodiments of the present application, which are not limited in the present application.
通过上述可知，本申请实施例中的第一车辆通过自身采集的定位信息和视觉信息能够对特殊路况进行快速识别，且在特殊路况下通过监测第二车辆的运动学物理量(如角速度/线速度等)可以迅速判断自车的毫米波雷达等传感器是否处于目标丢失状态或目标即将丢失状态(最快只需要根据毫米波雷达采集的连续两帧数据就可以识别)，进而避免车辆产生无目标障碍物的错觉，提高目标检测的准确性。It can be seen from the above that the first vehicle in the embodiments of this application can quickly identify special road conditions from the positioning information and visual information it collects itself, and under such road conditions it can, by monitoring kinematic quantities of the second vehicle (such as angular velocity or linear velocity), rapidly determine whether sensors such as its own millimeter-wave radar are in the target-lost state or the target-about-to-be-lost state (at the fastest, this can be identified from just two consecutive frames of data collected by the millimeter-wave radar), thereby preventing the vehicle from forming the illusion that no target obstacle exists and improving the accuracy of target detection.
下面介绍毫米波雷达处于目标丢失状态或目标即将丢失状态(或者说毫米波雷达处于丢帧状态)后，第一车辆调整毫米波雷达的姿态的方案。The following describes the scheme by which the first vehicle adjusts the attitude of the millimeter-wave radar after the millimeter-wave radar enters the target-lost state or the target-about-to-be-lost state (in other words, the frame-loss state).
为了便于理解，这里先对姿态传感器进行简单的介绍：姿态传感器由加速度传感器(即加速计)、角速度传感器(即陀螺仪)、磁感应传感器(即磁力计)三类传感器中一类实现或多类组合实现。姿态传感器包括三轴姿态传感器(或三维姿态传感器)、六轴姿态传感器(或六维姿态传感器)、九轴姿态传感器(或九维姿态传感器)等。其中，三轴姿态传感器由一类传感器实现(如三轴加速计或三轴陀螺仪或三轴磁力计)；六轴姿态传感器一般由两类传感器实现(如三轴加速计+三轴陀螺仪)；九轴姿态传感器一般由三轴陀螺仪+三轴加速度计+三轴地磁计实现，也有六轴加速度传感器+三轴陀螺仪的，也有六轴陀螺仪+三轴加速度计的。For ease of understanding, attitude sensors are briefly introduced here. An attitude sensor is implemented by one of, or a combination of, three types of sensors: acceleration sensors (accelerometers), angular velocity sensors (gyroscopes), and magnetic induction sensors (magnetometers). Attitude sensors include three-axis (or three-dimensional) attitude sensors, six-axis (or six-dimensional) attitude sensors, nine-axis (or nine-dimensional) attitude sensors, and the like. A three-axis attitude sensor is implemented by one type of sensor (such as a three-axis accelerometer, a three-axis gyroscope, or a three-axis magnetometer); a six-axis attitude sensor is generally implemented by two types of sensors (such as a three-axis accelerometer plus a three-axis gyroscope); a nine-axis attitude sensor generally consists of a three-axis gyroscope plus a three-axis accelerometer plus a three-axis magnetometer, though combinations of a six-axis acceleration sensor plus a three-axis gyroscope, or a six-axis gyroscope plus a three-axis accelerometer, also exist.
本申请实施例中的姿态传感器可以是六维姿态传感器或九维姿态传感器等,本申请不做限制。该姿态传感器可以实时检测毫米波雷达的姿态。The attitude sensor in the embodiment of the present application may be a six-dimensional attitude sensor or a nine-dimensional attitude sensor, etc., which is not limited in this application. The attitude sensor can detect the attitude of the millimeter wave radar in real time.
参见图9，本申请实施例以车辆行驶在平直道路上时毫米波雷达的中心为圆心，建立毫米波雷达的三维坐标系，其中X R轴平行于地面方向(以毫米波雷达的右向为正，左向为负)，Z R轴垂直于地面(以向上为正，向下为负)，Y R轴垂直于X R轴和Z R轴所在的平面(以毫米波雷达的前向为正，后向为负)，则毫米波雷达的姿态可以通过以下三个参数来表示:Referring to FIG. 9, the embodiments of this application establish a three-dimensional coordinate system for the millimeter-wave radar with its origin at the center of the millimeter-wave radar when the vehicle travels on a straight, level road, where the X R axis is parallel to the ground (positive to the right of the millimeter-wave radar, negative to the left), the Z R axis is perpendicular to the ground (positive upward, negative downward), and the Y R axis is perpendicular to the plane of the X R and Z R axes (positive toward the front of the millimeter-wave radar, negative toward the rear). The attitude of the millimeter-wave radar can then be represented by the following three parameters:
1)偏航角(Yaw),表示毫米波雷达绕Z R轴的旋转,设Yaw=α; 1) Yaw angle (Yaw), representing the rotation of the millimeter-wave radar around the Z R axis, let Yaw=α;
2)俯仰角(Pitch),表示毫米波雷达绕X R轴的旋转,设Pitch=β; 2) Pitch angle (Pitch), indicating the rotation of the millimeter-wave radar around the X R axis, set Pitch=β;
3)横滚角(Roll)，表示毫米波雷达绕Y R轴的旋转，设Roll=γ。3) Roll angle (Roll), representing the rotation of the millimeter-wave radar around the Y R axis, let Roll=γ.
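Under the axis convention above, the radar boresight after a yaw α and a pitch β can be written as two successive rotations of the straight-road boresight +Y R. A minimal sketch (illustrative only; roll γ is omitted because the embodiments use only α and β for forward detection):

```python
import math

def yaw_pitch_direction(alpha, beta):
    """Boresight direction after yaw alpha (about Z_R) and pitch beta
    (about X_R), starting from the straight-road boresight (0, 1, 0).
    Angles are in radians; returns a unit (x, y, z) vector.
    """
    # pitch about X_R: (x, y, z) -> (x, y*cos b - z*sin b, y*sin b + z*cos b)
    x, y, z = 0.0, math.cos(beta), math.sin(beta)
    # yaw about Z_R: (x, y, z) -> (x*cos a - y*sin a, x*sin a + y*cos a, z)
    return (x * math.cos(alpha) - y * math.sin(alpha),
            x * math.sin(alpha) + y * math.cos(alpha),
            z)
```

With α = β = 0 (straight, level road) the boresight is the unchanged forward direction.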
由于本申请实施例主要考虑第一车辆对前车的目标检测，所以姿态传感器主要获取毫米波雷达的α、β。设车辆行驶在平直道路上时，姿态传感器检测到毫米波雷达的姿态参数α、β均为0，那么车辆行驶在弯道和/或坡道上时，姿态传感器检测到毫米波雷达的α和β则可能大于或小于0。Since the embodiments of this application mainly consider the first vehicle's detection of a target vehicle ahead, the attitude sensor mainly acquires α and β of the millimeter-wave radar. Suppose that when the vehicle travels on a straight, level road, the attitude parameters α and β of the millimeter-wave radar detected by the attitude sensor are both 0; then when the vehicle travels on a curve and/or a slope, the α and β detected by the attitude sensor may be greater than or less than 0.
在本申请实施例中，当毫米波雷达处于目标丢失状态或目标即将丢失状态时，综合考虑毫米波雷达与道路特征密切相关的自由度，对毫米波雷达的姿态进行补偿修正。In the embodiments of this application, when the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, the attitude of the millimeter-wave radar is compensated and corrected, comprehensively considering the degrees of freedom of the millimeter-wave radar that are closely related to the road characteristics.
以弯道路况为例,则可以从左右转动自由度对毫米波雷达的姿态进行调整:Taking curved road conditions as an example, the attitude of the millimeter-wave radar can be adjusted from the left and right rotational degrees of freedom:
1)左右转动自由度。如图9所示，左右转动自由度即为雷达绕Z R轴旋转的自由度，对应旋转角度为α。以毫米波雷达重新检测到目标为角度调整的闭环控制的标准。其中，输入大致角度与输出角度之间难以建立精准的函数关系式，所以优选用模糊-PID自适应控制方法。1) Left-right rotational degree of freedom. As shown in FIG. 9, the left-right rotational degree of freedom is the freedom of the radar to rotate around the Z R axis, with corresponding rotation angle α. Re-detection of the target by the millimeter-wave radar serves as the criterion for closed-loop control of the angle adjustment. Since it is difficult to establish an accurate functional relationship between the approximate input angle and the output angle, a fuzzy-PID adaptive control method is preferred.
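The patent prefers a fuzzy-PID adaptive controller for this closed loop. The sketch below uses a plain PID increment on a toy plant purely to illustrate the closed-loop structure (all gains and the plant model are hypothetical, and the fuzzy retuning of the gains is omitted), with re-detection of the target (a sufficiently small residual angle error) as the stopping criterion:

```python
def pid_step(error, state, kp, ki, kd, dt):
    """One increment of a plain PID loop.

    state carries (integral, previous_error). A fuzzy-PID variant
    would retune kp/ki/kd online from fuzzy rules; omitted here.
    """
    integral, prev = state
    integral += error * dt
    derivative = (error - prev) / dt
    out = kp * error + ki * integral + kd * derivative
    return out, (integral, error)

def servo_to_target(alpha_error, kp=0.6, ki=0.1, kd=0.05, dt=0.02, tol=1e-3):
    """Drive the residual yaw error toward zero on a toy plant.

    Re-detection of the target by the radar closes the loop: iteration
    stops once the residual angle error falls within tol.
    """
    state = (0.0, alpha_error)
    for _ in range(2000):
        if abs(alpha_error) < tol:
            break
        u, state = pid_step(alpha_error, state, kp, ki, kd, dt)
        alpha_error -= u * dt  # toy plant: the command reduces the error
    return alpha_error
```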
以坡道路况为例,则可以从上下摆动自由度对毫米波雷达的姿态进行调整:Taking the slope condition as an example, the attitude of the millimeter-wave radar can be adjusted from the up and down swing degrees of freedom:
2)上下摆动自由度。如图9所示，上下转动自由度即为雷达绕X R轴旋转的自由度，对应旋转角度为β，以毫米波雷达重新检测到目标为角度调整的闭环控制的标准。2) Up-down swing degree of freedom. As shown in FIG. 9, the up-down rotational degree of freedom is the freedom of the radar to rotate around the X R axis, with corresponding rotation angle β; re-detection of the target by the millimeter-wave radar serves as the criterion for closed-loop control of the angle adjustment.
应理解，上述两个方向(即上下和左右)的自由度在机械结构是相互独立的(或者说解耦的)，在控制逻辑上也是相互独立的(或者说解耦的)，所以计算设备可以只调整一个自由度，当然，根据实际路况，如果道路特征是弯坡组合路段，则计算设备也可以同时从左右转动自由度和上下摆动自由度对毫米波雷达的姿态进行调整。It should be understood that the degrees of freedom in the above two directions (up-down and left-right) are mutually independent (decoupled) in the mechanical structure and also mutually independent (decoupled) in the control logic, so the computing device may adjust only one degree of freedom. Of course, depending on the actual road conditions, if the road feature is a combined curve-and-slope section, the computing device may also adjust the attitude of the millimeter-wave radar in both the left-right rotational degree of freedom and the up-down swing degree of freedom simultaneously.
Referring to FIG. 10, which is a schematic diagram of the computing device of the first vehicle (FIG. 10 takes an MDC as an example) adjusting the attitude of the millimeter-wave radar. The computing device runs the target detection method described in the embodiment shown in FIG. 5, which can identify whether the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state. In addition, the computing device also runs a target tracking method based on the fusion of millimeter-wave radar information and camera information (this target tracking method may be a method in the related art, or may be the target tracking method introduced later in the embodiment related to FIG. 12, which is not limited here). The follow-up mechanism of the millimeter-wave radar includes a Z_R-axis rotating motor and an X_R-axis rotating motor. Under voltage control, the Z_R-axis rotating motor rotates to drive the millimeter-wave radar to rotate about the Z_R axis, and the X_R-axis rotating motor rotates to drive the millimeter-wave radar to rotate about the X_R axis.
The process by which the computing device controls and adjusts the attitude of the millimeter-wave radar includes:

1) Based on the target detection method, the computing device detects whether the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state (i.e., whether frames are lost);
2) When the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, the computing device determines a first angle adjustment value for the millimeter-wave radar according to the radar's current attitude (for example, the millimeter-wave radar needs to rotate by an angle Δα about the Z_R axis and by an angle Δβ about the X_R axis). The computing device sends this angle to the control device through an instruction (FIG. 10 takes an MCU as an example). The control device stores a mapping between angle adjustment values and voltage adjustment values; after receiving the instruction, it converts the first angle adjustment value, according to this mapping, into a first voltage V1 to be supplied to the Z_R-axis rotating motor and a second voltage V2 to be supplied to the X_R-axis rotating motor. The control device then sets the input voltage of the Z_R-axis rotating motor to V1 and the input voltage of the X_R-axis rotating motor to V2, so that the Z_R-axis rotating motor rotates and drives the millimeter-wave radar to rotate by Δα about the Z_R axis, and the X_R-axis rotating motor rotates and drives the millimeter-wave radar to rotate by Δβ about the X_R axis; or,

the computing device itself stores the mapping between angle adjustment values and voltage adjustment values. When the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, the computing device determines the first angle adjustment value for the millimeter-wave radar according to the radar's current attitude, and then converts it, according to the mapping, into the first voltage V1 to be supplied to the Z_R-axis rotating motor and the second voltage V2 to be supplied to the X_R-axis rotating motor. The computing device sends V1 and V2 to the control device through an instruction; after receiving the instruction, the control device sets the input voltage of the Z_R-axis rotating motor to V1 and the input voltage of the X_R-axis rotating motor to V2, so that the Z_R-axis rotating motor rotates and drives the millimeter-wave radar to rotate by Δα about the Z_R axis, and the X_R-axis rotating motor rotates and drives the millimeter-wave radar to rotate by Δβ about the X_R axis.
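The angle-to-voltage conversion used in both variants can be sketched as a simple per-axis mapping. The linear volts-per-degree gains below are assumed illustration values, not those of any real motor driver.

```python
# Assumed linear gains (volts per degree) standing in for the stored mapping
# between angle adjustment values and voltage adjustment values.
ANGLE_TO_VOLT = {"Z_R": 0.2, "X_R": 0.25}

def angles_to_voltages(d_alpha, d_beta):
    """Convert the first angle adjustment value (delta-alpha, delta-beta) into (V1, V2)."""
    v1 = ANGLE_TO_VOLT["Z_R"] * d_alpha  # V1 drives the Z_R-axis rotating motor
    v2 = ANGLE_TO_VOLT["X_R"] * d_beta   # V2 drives the X_R-axis rotating motor
    return v1, v2

v1, v2 = angles_to_voltages(5.0, -2.0)  # e.g. rotate +5 deg about Z_R, -2 deg about X_R
```

In practice the stored mapping could equally be a calibrated lookup table rather than a linear gain; the point is only that either the control device or the computing device can hold it.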
3) After each attitude adjustment of the millimeter-wave radar (e.g., the radar rotates by Δα about the Z_R axis and by Δβ about the X_R axis), the computing device re-detects the attitude of the millimeter-wave radar and judges whether the radar's left-right rotational degree of freedom and up-down swing degree of freedom satisfy the above-mentioned criterion for closed-loop angle control. If so, the computing device determines that the attitude adjustment is complete; if not, the computing device continues to adjust the attitude of the millimeter-wave radar, repeating this cycle until both the left-right rotational degree of freedom and the up-down swing degree of freedom satisfy the corresponding criterion for closed-loop angle control.

After the attitude adjustment of the millimeter-wave radar is completed, its left-right rotational degree of freedom and up-down swing degree of freedom satisfy the corresponding criterion for closed-loop angle control, and the millimeter-wave radar can again detect the target accurately.
Optionally, after each attitude adjustment of the millimeter-wave radar (e.g., each time the radar is controlled to rotate about the Z_R axis and/or the X_R axis), the computing device updates the calibration matrix of the millimeter-wave radar in real time according to the radar's angle adjustment value, thereby ensuring the reliability of the fusion of millimeter-wave radar and camera information during the attitude adjustment process.
As shown in FIG. 11, which is a schematic diagram of the millimeter-wave radar coordinate system and the camera coordinate system. The millimeter-wave radar coordinate system describes the position of an object relative to the millimeter-wave radar, expressed as (X_R, Y_R, Z_R); the camera coordinate system describes the position of an object relative to the camera, expressed as (X_C, Y_C, Z_C). Since the camera and the millimeter-wave radar are installed at different positions on the vehicle, the position data of each feature point collected by the camera are the coordinates of that feature point in the camera coordinate system, while the position data of each feature point collected by the millimeter-wave radar are the coordinates of that feature point in the millimeter-wave radar coordinate system. The same object therefore has different coordinate parameters in the camera coordinate system and the millimeter-wave radar coordinate system, so a calibration matrix is required to map the data collected by the millimeter-wave radar and the data collected by the camera into the same coordinate system, facilitating data computation by the computing device.
For example, the computing device may convert the data collected by the millimeter-wave radar into the camera coordinate system; in that case, the calibration matrix may include the rotation matrix R, translation matrix T, etc. required for conversion between the millimeter-wave radar coordinate system and the camera coordinate system.
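As a minimal sketch of applying such a calibration matrix, the radar-to-camera transform p_C = R·p_R + T can be written as follows. The R and T values below are assumed examples for illustration, not real calibration results.

```python
# Assumed extrinsics: identity rotation (the two frames share orientation) and a
# small radar-to-camera offset in metres. Real values come from calibration.
R = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
T = [0.0, 0.5, -0.2]

def radar_to_camera(p_radar):
    """Map a point (X_R, Y_R, Z_R) into camera coordinates: p_C = R * p_R + T."""
    return [sum(R[i][j] * p_radar[j] for j in range(3)) + T[i] for i in range(3)]

p_cam = radar_to_camera([10.0, 0.0, 1.0])
```

Updating the calibration matrix after each attitude adjustment, as described above, amounts to recomputing R (and, if the mount translates, T) from the new α and β before applying this transform.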
As another example, the computing device may convert both the data collected by the millimeter-wave radar and the data collected by the camera into the world coordinate system (or an image coordinate system or another coordinate system); in that case, the calibration matrix may include the rotation matrix R, translation matrix T, etc. required for conversion between the millimeter-wave radar coordinate system and the world coordinate system (or the image coordinate system or another coordinate system).
From the above, it can be seen that the embodiments of the present application combine the road-condition features of the road and, when the millimeter-wave radar loses, or is about to lose, the target, compensate and correct the attitude of the millimeter-wave radar in two degrees of freedom: left-right (α) and up-down (β). The two degrees of freedom are adjusted independently of each other, which can improve the accuracy and efficiency of the radar's attitude adjustment and thus further improve the accuracy and efficiency of target detection and tracking. Moreover, when adjusting the angle of the millimeter-wave radar, the embodiments of the present application can also compensate and correct the radar's calibration matrix in real time, achieving a fast and accurate response of the fusion of millimeter-wave radar and camera information and ensuring the reliability of that fusion during the attitude adjustment process.
It should be noted that the above takes a curved road as an example to describe the process by which the first vehicle adjusts the attitude of the millimeter-wave radar when the radar is in the target-lost state or the target-about-to-be-lost state (in other words, when the radar is in a frame-loss state). In practical applications, the same approach may also be used to adjust the attitude of the millimeter-wave radar for other road scenarios (such as uphill/downhill, acceleration/deceleration, etc.), which is not limited in this application.
As shown in FIG. 12, an embodiment of the present application further provides a target tracking method based on the fusion of millimeter-wave radar information and camera information. The target tracking method may be applied to the vehicle shown in FIG. 4, and includes:

S1201. The first vehicle acquires a red-green-blue (RGB) image captured by the camera, and performs coarse fitting on the lane lines in the far field of view included in the RGB image.
Specifically, the computing device of the first vehicle acquires the RGB image captured by the camera, analyzes the RGB image, and performs coarse fitting on the lane lines in the far field of view included in the RGB image. Alternatively, a processing chip in the camera analyzes the RGB image, performs coarse fitting on the lane lines in the far field of view included in the RGB image, and then transmits the fitting result to the computing device.

Coarse fitting here means projecting a few pixels of the lane line onto a bird's-eye view and extracting several equally spaced points, from which the inflection point and direction of the curve can be determined. There is no need to spend substantial computing power and time fitting complex equations such as a cubic curve of the bend, nor to perform extensive lane-line smoothing and other post-processing.
Since the radar's detection range in the near field of view is larger than in the far field of view, a target (such as the second vehicle) is generally not easily lost in the near field of view. Performing coarse fitting only on the lane lines in the far field of view therefore reduces the amount of computation and improves computational efficiency while maintaining accuracy.

Optionally, the far field of view is the region whose distance from the vehicle exceeds a preset distance (e.g., 50 m, 60 m, or 100 m). In specific implementations, the preset distance may be determined according to the characteristics of the millimeter-wave radar's detection beam angle and detection distance.
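The coarse fitting of S1201 can be sketched as follows. The `project_to_bev` function is a hypothetical stand-in for the real image-to-bird's-eye-view projection, and the sampling and direction test are illustrative assumptions only.

```python
def project_to_bev(pixel):
    """Hypothetical image-to-bird's-eye-view projection (assumed linear scale)."""
    u, v = pixel
    return (u * 0.05, v * 0.1)  # assumed metres per pixel

def coarse_fit(lane_pixels, n_samples=5):
    """Project a few lane-line pixels and sample equally spaced points; the
    lateral drift of the samples gives the bend direction with no curve fit."""
    pts = [project_to_bev(p) for p in lane_pixels]
    step = max(1, len(pts) // n_samples)
    samples = pts[::step][:n_samples]
    drift = samples[-1][0] - samples[0][0]  # lateral (x) drift over the samples
    direction = "left" if drift < 0 else "right" if drift > 0 else "straight"
    return samples, direction

# A synthetic far-field lane line drifting to the right.
samples, direction = coarse_fit([(100 + i * i, 20 * i) for i in range(10)])
```

The point of the sketch is the cost profile: a handful of projections and one subtraction, versus fitting and post-processing a cubic lane model.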
S1202. The first vehicle performs target recognition based on the millimeter-wave radar and the camera simultaneously.

On the one hand, the computing device of the first vehicle performs target recognition on the RGB image acquired by the camera to obtain a first recognition result; alternatively, the processing chip in the camera performs target recognition on the RGB image and, after obtaining the first recognition result, transmits the first recognition result to the computing device (i.e., S1202.1, camera target recognition).

On the other hand, the computing device of the first vehicle performs target recognition on the radar point-trace data acquired by the millimeter-wave radar to obtain a second recognition result; alternatively, the processing chip in the millimeter-wave radar performs target recognition on the radar point-trace data and, after obtaining the second recognition result, transmits the second recognition result to the computing device (i.e., S1202.2, millimeter-wave radar target recognition).
In the embodiments of the present application, for camera target recognition, the computing device stores a corresponding trained target recognition model. After obtaining the RGB image captured by the camera, the computing device inputs the RGB image into the target recognition model corresponding to camera target recognition to obtain the recognition result corresponding to the camera, i.e., the first recognition result. After obtaining the radar point-trace data collected by the millimeter-wave radar, the computing device may process the radar point-trace data using a neural network model or conventional clustering and tracking algorithms to obtain the recognition result corresponding to the millimeter-wave radar, i.e., the second recognition result.

As shown in FIG. 12, let the first recognition result be expressed as (X_C, Y_C, C)^T, where X_C and Y_C represent the position data of the target and C represents the class of the target. Let the second recognition result be expressed as (X_R, Y_R, v, w)^T, where X_R and Y_R represent the position data of the target, v represents the linear velocity of the target, and w represents the angular velocity of the target. It should be noted that the position data here for both the millimeter-wave radar's and the camera's target recognition take position data in a two-dimensional plane as an example; that is, the first recognition result includes data only in the X_C and Y_C directions, and the second recognition result includes data only in the X_R and Y_R directions. In specific implementations, the position data of the target recognition of the millimeter-wave radar and of the camera may also include more parameters, which is not limited here.
Optionally, for the camera recognition in S1202.1, different target recognition models may be designed for different recognition scenarios in the embodiments of the present application. For example, when the millimeter-wave radar loses frames, the computing device uses a lightweight recognition model to perform target recognition on the data captured by the camera; the lightweight recognition model emphasizes recognition speed (i.e., the processing latency of the algorithm should be small), and may, for example, be You Only Look Once (YOLO) v3 (the version number, denoting the third version). When the millimeter-wave radar does not lose frames, a heavyweight recognition model is used to perform target recognition on the data captured by the camera; the heavyweight recognition model emphasizes accuracy (i.e., the position, velocity, and class of the target output by the algorithm should be highly accurate), so that the computing device can, based on the class and position of the target, control the follow-up mechanism to adjust the attitude of the millimeter-wave radar as quickly as possible and re-detect the target. For example, the heavyweight recognition model may be a region-based convolutional neural network (R-CNN).
S1203. The first vehicle performs time alignment and target alignment on the first recognition result and the second recognition result.

So-called time alignment means synchronizing the first recognition result and the second recognition result in time. So-called target alignment means synchronizing the first recognition result and the second recognition result in space, for example by converting the position data into the same coordinate system as described in the embodiment related to FIG. 11 above.
S1204. The computing device performs target fusion according to the IoU fusion rule.

In the embodiments of the present application, after performing target recognition on the RGB image to obtain the first recognition result, the computing device of the first vehicle may also generate a region of interest (ROI) (i.e., a target box) in the RGB image; and after performing target recognition on the radar point-trace data to obtain the second recognition result, it may also generate an ROI in the radar point-trace data. In machine vision and image processing, an ROI is a region to be processed, outlined in the processed image as a box, circle, ellipse, irregular polygon, or the like; in the embodiments of the present application, the ROI is the region where the target (i.e., the second vehicle) is located.
The ROI in the embodiments of the present application takes a rectangle as an example. FIG. 13 is a schematic diagram of the ROI corresponding to the millimeter-wave radar detection result (the rectangular region denoted A) and the ROI corresponding to the camera detection result (the rectangular region denoted B).

IoU is the ratio of the intersection to the union of the ROI output by millimeter-wave radar detection and the ROI output by camera detection. FIG. 14 is a schematic diagram of the intersection of the millimeter-wave radar detection result A and the camera detection result B; IoU is the ratio of the intersection to the union of the areas of the two rectangular regions, i.e., IoU = (A∩B)/(A∪B).

The value of IoU lies in [0, 1]. A larger IoU value indicates a higher probability that the target detected by the camera and the target detected by the radar are the same target; conversely, a smaller IoU value indicates a lower probability that they are the same target.
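For axis-aligned rectangular ROIs, the IoU defined above can be computed as follows; the (x_min, y_min, x_max, y_max) box format is an assumption for illustration.

```python
def iou(a, b):
    """IoU of two axis-aligned rectangles given as (x_min, y_min, x_max, y_max)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # intersection width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # intersection height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 2x2 boxes overlapping in a 1x1 square: intersection 1, union 7.
val = iou((0, 0, 2, 2), (1, 1, 3, 3))
```

By construction the result lies in [0, 1]: disjoint boxes give 0, identical boxes give 1.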
In the embodiments of the present application, the IoU fusion rule may include: if IoU > M (or IoU ≥ M), the computing device fuses the first recognition result and the second recognition result and outputs a third recognition result; the fused data in FIG. 12, i.e., the third recognition result, is expressed as (X*, Y*, C*, v*, w*)^T. If IoU ≤ M (or IoU < M), the computing device does not fuse the first recognition result and the second recognition result, and outputs the first recognition result and the second recognition result separately. Here, M is a set second threshold whose value lies in (0, 1).

Optionally, the second threshold M may be related to factors such as the curvature ρ of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle (i.e., the distance to the second vehicle).
Exemplarily, the second threshold M satisfies the following formula with the curvature ρ of the curve, the driving speed V, and the driving distance L:

M = a²ρ + bV + L;

where a and b are coefficients that may be set by technicians based on experiments or experience.
It should be understood that the above formula is merely an example and not a limitation. In specific implementations, the second threshold M may also be related to other factors, such as the acceleration of the first vehicle.
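The example threshold formula and the fusion rule can be sketched together as follows. The coefficients a and b and the input values are assumptions chosen only so that M falls in (0, 1); as noted above, in practice they would be set by experiment or experience.

```python
def second_threshold(rho, speed, dist, a=0.5, b=0.002):
    """Example formula M = a^2 * rho + b * V + L (coefficients are assumed)."""
    return a * a * rho + b * speed + dist

def fuse(iou_value, m):
    """IoU fusion rule: fuse the two recognition results only when IoU > M."""
    return "fused" if iou_value > m else "separate"

# Assumed inputs: curvature 0.8 1/m, speed 100 km/h, normalised distance 0.1.
m = second_threshold(rho=0.8, speed=100.0, dist=0.1)  # 0.25*0.8 + 0.2 + 0.1
decision = fuse(0.6, m)
```

The sketch shows the intended behaviour: a tighter curve or higher speed raises M, so detections must overlap more strongly before the two sensors' results are merged.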
S1205.1. When the radar is not in the target-lost state or the target-about-to-be-lost state, the first vehicle executes the millimeter-wave radar target-not-lost tracking mechanism: the computing device of the first vehicle performs target tracking based on multiple consecutive frames of fused data.

For example, in FIG. 12, when the radar is not in the target-lost state or the target-about-to-be-lost state, the computing device performs target tracking based on multiple consecutive frames of (X*, Y*, C*, v*, w*)^T.
S1205.2. When the radar is in the target-lost state or the target-about-to-be-lost state, the first vehicle executes the millimeter-wave radar target-lost tracking mechanism: the computing device of the first vehicle performs target tracking based on the first recognition result obtained while the radar is in the target-lost state or the target-about-to-be-lost state, together with the last frame or frames of fused data from before the radar entered that state.

For example, in FIG. 12, when the radar is in the target-lost state or the target-about-to-be-lost state, the computing device performs target tracking based on the last frame of fused data before the target was lost (i.e., the original frame (X*, Y*, C*, v*, w*)^T) and the first recognition results after the target was lost (i.e., consecutive frames (X_C, Y_C, C)^T).
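The choice between the two tracking branches can be sketched as follows; the string frame labels are placeholders, not the embodiment's actual data structures.

```python
def select_tracking_input(radar_lost, fused_history, camera_results):
    """Return the frames the tracker consumes, per S1205.1 / S1205.2.

    radar_lost:     whether the radar is in the target-lost (or about-to-be-lost) state
    fused_history:  consecutive fused frames (X*, Y*, C*, v*, w*)^T
    camera_results: camera-only first recognition results (X_C, Y_C, C)^T
    """
    if not radar_lost:
        return fused_history  # S1205.1: track on consecutive fused frames
    # S1205.2: anchor on the last fused frame before loss, then camera-only frames.
    return [fused_history[-1]] + camera_results

inputs = select_tracking_input(
    radar_lost=True,
    fused_history=["F1", "F2", "F3"],
    camera_results=["C4", "C5"],
)
```

The anchor frame carries the velocity information the camera-only results lack, which is what lets tracking continue until the radar's attitude adjustment re-acquires the target.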
From the above, it can be seen that the embodiments of the present application perform coarse fitting only on the lane lines in the far field of view, which reduces the amount of computation and improves computational efficiency while maintaining accuracy. Second, the embodiments of the present application design at least two target recognition models for camera target recognition: when the millimeter-wave radar loses frames, a lightweight recognition model is used to perform target recognition on the data captured by the camera, improving recognition speed; when the millimeter-wave radar does not lose frames, a heavyweight recognition model is used, improving recognition accuracy, thereby balancing the speed and accuracy of fusion-based recognition. In addition, the embodiments of the present application design the second threshold M of the IoU in combination with the curve scenario, which can further improve recognition accuracy.
It should be noted that this document takes the curve blind zone as an example to describe how to detect whether the target is lost and how to re-acquire the detected target; in practical applications, the technical solutions of the embodiments of the present application are also applicable to other similar blind-zone scenarios. For example, in a ramp scenario, the occlusion caused by the slope will also create a blind zone for the vehicle, so the millimeter-wave radar may likewise be controlled through its up-down swing degree of freedom. As another example, uneven road sections or the moments when a car starts or stops also greatly affect the pitch angle of the millimeter-wave radar, so the radar may likewise be controlled through its up-down swing degree of freedom.
For example, referring to FIG. 15, which shows a scenario where the vehicle is on an uphill slope. At time t0, both the first vehicle and the second vehicle are on a flat road, and the second vehicle is within the detection range of the first vehicle's millimeter-wave radar. At time t1, the second vehicle has driven onto the uphill slope; since the radar's attitude at this time is the same as at t0, the presence of the slope causes the second vehicle to move beyond (or be about to move beyond) the detection range of the first vehicle's millimeter-wave radar. Referring to FIG. 16, after the second vehicle moves beyond (or is about to move beyond) that detection range, the computing device of the first vehicle adjusts the attitude of the millimeter-wave radar so that its pitch angle is adjusted upward, and the first vehicle's millimeter-wave radar then re-detects the second vehicle.
It should be further noted that the embodiments of the present application mainly take the millimeter-wave radar as an example to describe how to detect target loss and how to re-detect the target in special scenarios such as curves, ramps, and acceleration/deceleration. In practical applications, however, the millimeter-wave radar may also be replaced by other sensors, such as lidar. In particular, for mass-production autonomous driving solutions adopting solid-state lidar, the lidar likewise has a detection range defined by a field-of-view angle, so with a given number of sensors in the layout there will inevitably be blind zones in the detection range; the same technical solution can therefore also be applied to lidar and other similar sensors. In other words, as long as a sensor with a large detection range, a short detection distance, and less accurate detection results (obstacle position, speed, etc.) is used together with a sensor with a small detection range, a long detection distance, and accurate detection results, and a detection blind zone exists, the technical solutions of the embodiments of the present application are applicable.
The above implementations of the embodiments of the present application may be combined with one another to achieve different technical effects.
Based on the same technical concept, referring to FIG. 17, which is a system architecture diagram provided by an embodiment of the present application; the system can execute the solutions in the foregoing method embodiments. The system includes: a positioning apparatus, a camera, a millimeter-wave radar, a follow-up mechanism, and a computing device. The system is applicable to the vehicle shown in FIG. 4, and is used to implement the above-described target detection method of the embodiments of the present application.
The positioning apparatus is configured to obtain positioning information of the vehicle.
The camera is configured to obtain visual information of the area in front of the vehicle.
The millimeter-wave radar is configured to obtain millimeter-wave radar information of the area in front of the vehicle. It should be understood that target detection in this application takes the combination of a camera and a millimeter-wave radar as an example. This application does not limit the specific type of sensor: whenever a sensor with a large detection range, a short detection distance, and less accurate detection results (obstacle position, speed, and so on) is used together with a sensor with a small detection range, a long detection distance, and accurate detection results, the embodiments of this application are applicable.
The follow-up mechanism is mounted on the millimeter-wave radar and is configured to control the attitude of the millimeter-wave radar.
The computing device is a device with computing/processing capability. Specifically, it may be a vehicle-mounted device, one or more processing chips, an integrated circuit, or the like, which is not limited here.
Referring to FIG. 17, divided by logical function, the computing device may include the following modules:
A road-condition identification module, configured to identify the road conditions of the ego vehicle and the preceding vehicle according to the positioning information collected by the positioning apparatus and the visual information collected by the camera, and to activate the corresponding threshold comparator according to the road conditions (different road conditions correspond to different threshold comparators; FIG. 12 takes threshold comparators A, B, and C as an example, and the actual number may be larger or smaller).
A threshold comparator, configured to determine whether the speed of the target obstacle (including the linear velocity or the angular velocity) exceeds the threshold of that comparator; for the specific implementation of each threshold comparator, refer to the related description of the embodiments shown in FIG. 6 to FIG. 8 above.
The image mark in FIG. 12 is an RGB image, output by the camera, that carries a target mark (such as a target box) when the threshold comparator returns "yes".
A millimeter-wave radar and camera fusion target detection module, configured to fuse the detection results of the millimeter-wave radar and the detection results of the camera; for the specific implementation, refer to the related description of the embodiment shown in FIG. 12.
A target-lost tracking module, configured to execute the millimeter-wave radar target-lost tracking mechanism; for details, refer to the related description in S1205.1 above.
A target-not-lost tracking module, configured to execute the millimeter-wave radar target-not-lost tracking mechanism; for details, refer to the related description in S1205.2 above.
The final output may take various forms, for example data such as the specific position, type, and speed of the target (X*, Y*, C*, v*, w*, and so on), or an RGB image with a target box; no limitation is imposed here.
Based on the same technical concept, an embodiment of this application further provides a target detection apparatus 180. The apparatus 180 is located inside the first vehicle and may, for example, be a chip arranged inside the first vehicle. The apparatus 180 includes modules, units, or means for executing the steps of the methods shown in FIG. 5, FIG. 10, and FIG. 12; these functions, units, or means may be implemented by software, by hardware, or by hardware executing corresponding software.
As shown in FIG. 18, the apparatus 180 may include: a detection module 1801, configured to detect the speed of the second vehicle based on the millimeter-wave radar, where the second vehicle is located in front of the first vehicle and the speed includes a linear velocity and an angular velocity; and a processing module 1802, configured to obtain the road conditions of the first vehicle and the second vehicle, determine a first threshold according to the road conditions, and, if the speed of the second vehicle exceeds the first threshold, determine that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
In this embodiment of this application, the apparatus 180 determines the first threshold according to the road conditions and compares the speed of the second vehicle with the first threshold, thereby determining whether the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state. In this way, under special road conditions (such as curves and ramps), the apparatus 180 can promptly and effectively identify a loss of the millimeter-wave radar target caused by the road conditions, preventing the first vehicle from forming the false impression that no target obstacle exists, and thereby improving the safety and stability of the vehicle while driving.
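The decision flow above (obtain the road condition, pick the matching first threshold, compare the second vehicle's speed against it) can be sketched as follows. The threshold values and road-condition labels here are illustrative assumptions only; the embodiment computes the thresholds from the geometric formulas given later, not from a fixed table.

```python
# Hypothetical sketch of the loss-state decision of modules 1801/1802.
# Threshold values and road-condition labels are illustrative assumptions,
# not values taken from the embodiment.

def first_threshold(road_condition: str) -> float:
    """Return the first threshold N for the given road condition (illustrative values)."""
    thresholds = {
        "straight": 30.0,   # linear-velocity threshold, m/s
        "curve": 0.5,       # angular-velocity threshold, rad/s
        "slope": 25.0,      # linear-velocity threshold, m/s
    }
    return thresholds[road_condition]

def radar_target_state(second_vehicle_speed: float, road_condition: str) -> str:
    """Decide whether the radar is in (or approaching) a target-lost state."""
    n = first_threshold(road_condition)
    if second_vehicle_speed > n:
        return "target lost or about to be lost"
    return "target tracked"

assert radar_target_state(0.8, "curve") == "target lost or about to be lost"
assert radar_target_state(20.0, "straight") == "target tracked"
```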
Optionally, when obtaining the road conditions of the first vehicle and the second vehicle, the processing module 1802 is specifically configured to: obtain an RGB image of the area in front of the first vehicle using the camera, and then determine the road condition of the second vehicle according to the RGB image. In this way, the road condition ahead, that is, the road condition of the second vehicle, can be identified efficiently and quickly.
Optionally, when determining the road condition of the second vehicle according to the RGB image, the processing module 1802 is specifically configured to: extract feature points from the lane lines in the far field of view included in the RGB image, then determine the inflection point and direction of the road on which the second vehicle is located according to the extracted feature points, and thereby obtain the road condition of the second vehicle. Because the detection range of the millimeter-wave radar in the near field of view is larger than that in the far field of view, a target in the near field of view is generally not easily lost; by processing only the lane lines in the far field of view, the processing module 1802 can reduce the amount of computation and improve computational efficiency while maintaining accuracy.
Optionally, when obtaining the road conditions of the first vehicle and the second vehicle, the processing module 1802 may also obtain position information of the first vehicle based on the positioning apparatus and then determine the road condition of the first vehicle according to the position information. In this embodiment of this application, the positioning apparatus may be GPS, the BeiDou system, or another positioning system, configured to receive satellite signals and locate the current position of the first vehicle. In addition, the positioning system may also be visual positioning, millimeter-wave radar positioning, fusion positioning, or the like, which is not limited in this application. After obtaining the position information, the processing module 1802 determines the road condition at its own position based on a map, for example whether it is on a curve, a straight road, an uphill slope, or a downhill slope.
In this way, the apparatus 180 can obtain the road condition of the first vehicle quickly and accurately.
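The far-field lane-line step can be illustrated with a minimal sketch. The feature points are assumed to be already extracted from the RGB image as image-plane (x, y) coordinates, and a simple cross-product turning test stands in for the embodiment's inflection-point and direction logic; the sign convention (y pointing forward, x to the right) is an assumption for illustration.

```python
# Minimal sketch: classify the road ahead from far-field lane-line feature
# points. The cross-product turning test is a stand-in for the embodiment's
# inflection-point and direction determination.

def road_direction(points):
    """Return 'straight', 'left curve', or 'right curve' from the turning
    sign of consecutive lane-line segments (assumed y-forward, x-right)."""
    cross_sum = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        # z-component of the cross product of the two segment vectors
        cross_sum += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    if abs(cross_sum) < 1e-6:
        return "straight"
    return "left curve" if cross_sum > 0 else "right curve"

straight_points = [(0, 0), (0, 1), (0, 2), (0, 3)]
curving_points = [(0, 0), (0.2, 1), (0.8, 2), (2.0, 3)]
assert road_direction(straight_points) == "straight"
assert road_direction(curving_points) == "right curve"
```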
It should be understood that the first threshold may be determined according to the critical speed value separating the case in which the second vehicle is within the detection range of the millimeter-wave radar from the case in which it is outside that range; for example, the first threshold may be less than or equal to this critical value. When the speed of the second vehicle is much greater than the critical value, the second vehicle moves beyond the detection range of the millimeter-wave radar into its detection blind zone, so the millimeter-wave radar is in the target-lost state. When the speed of the second vehicle is near the critical value, the second vehicle may leave the detection range of the millimeter-wave radar at any moment and is about to enter the detection blind zone, so the millimeter-wave radar is in the target-about-to-be-lost state. In the embodiments of this application, the design of the first threshold may differ according to the road conditions.
Three specific examples are given below for illustration.
Example 1: The road conditions of the first vehicle and the second vehicle are that the first vehicle is on a straight road and the second vehicle is on a curve. When detecting the speed of the second vehicle based on the millimeter-wave radar, the detection module 1801 is specifically configured to: detect, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle moving in a circle along the curve. The first threshold N determined by the processing module 1802 according to the road conditions satisfies:
[Formula PCTCN2021124194-appb-000046]
where ω_r is the angular velocity of the second vehicle moving in a circle along the curve from time t0 to time t1; [PCTCN2021124194-appb-000047] is the Euclidean distance between the first vehicle and the second vehicle at time t0; [PCTCN2021124194-appb-000048] is the displacement of the second vehicle from time t0 to time t1; α is half of the detection beam angle of the millimeter-wave radar; time t0 is the time at which the millimeter-wave radar collects the first frame of data and time t1 is the time at which it collects the second frame, the first and second frames being two consecutive frames of data; K is a coefficient greater than 0 and less than or equal to 1; and ε is the angle of the central angle formed at time t0 by position A of the first vehicle, position B of the second vehicle, and the center O of the curve.
Example 2: The road conditions of the first vehicle and the second vehicle are that the first vehicle is on a curve and the second vehicle is on a straight road. When detecting the speed of the second vehicle based on the millimeter-wave radar, the detection module 1801 is specifically configured to: detect, based on the millimeter-wave radar, the instantaneous travel speed of the second vehicle on the straight road. The first threshold N determined by the processing module 1802 according to the road conditions satisfies:
[Formula PCTCN2021124194-appb-000049]
where [PCTCN2021124194-appb-000050] is the displacement of the second vehicle from time t0 to time t1; v_r is the travel speed of the second vehicle on the straight road from time t0 to time t1; time t0 is the time at which the millimeter-wave radar collects the first frame of data and time t1 is the time at which it collects the second frame, the first and second frames being two consecutive frames of data; and P is a coefficient greater than 0 and less than or equal to 1.
Example 3: The road conditions of the first vehicle and the second vehicle are that both the first vehicle and the second vehicle are on a curve. When detecting the speed of the second vehicle based on the millimeter-wave radar, the detection module 1801 is specifically configured to: detect, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle moving in a circle along the curve. The first threshold N determined by the processing module 1802 according to the road conditions satisfies:
[Formula PCTCN2021124194-appb-000051]
where ω_r is the angular velocity of the second vehicle moving in a circle along the curve from time t0 to time t1; v is the instantaneous travel speed of the first vehicle; α is half of the detection beam angle of the millimeter-wave radar; time t0 is the time at which the millimeter-wave radar collects the first frame of data and time t1 is the time at which it collects the second frame, the first and second frames being two consecutive frames of data; Q is a coefficient greater than 0 and less than or equal to 1; and [PCTCN2021124194-appb-000052] is the Euclidean distance between the first vehicle and the second vehicle at time t0.
It should be understood that the above three cases are merely examples and not limitations. In practical applications, the road conditions are not limited to curve scenarios; for other road scenarios (such as uphill/downhill and acceleration/deceleration), the same idea can be used to design the preset conditions.
Optionally, the processing module 1802 may further be configured to: after determining that the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, control the millimeter-wave radar, according to the road conditions, to rotate about the Z_R axis and/or about the X_R axis so that the millimeter-wave radar re-detects the second vehicle, where the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the travel direction of the first vehicle.
It should be understood that the degrees of freedom in the above two directions (rotation about the Z_R axis and rotation about the X_R axis) are mutually independent (that is, decoupled) both in the mechanical structure and in the control logic, so the processing module 1802 may control and adjust only one of the degrees of freedom, or control and adjust both at the same time.
For example, when the first vehicle and/or the second vehicle is on a curve, the processing module 1802 may control the millimeter-wave radar to rotate about the Z_R axis by a first angle. As another example, when the first vehicle and/or the second vehicle is on a ramp, the processing module 1802 may control the millimeter-wave radar to rotate about the X_R axis by a second angle. As yet another example, when the first vehicle and/or the second vehicle is on a combined curve-and-slope road section, the processing module 1802 may control the millimeter-wave radar to rotate about the Z_R axis by a third angle and about the X_R axis by a fourth angle. Because the rotations of the millimeter-wave radar about the Z_R axis and about the X_R axis are independent of each other rather than interdependent, the accuracy and efficiency of the radar's attitude adjustment can be improved, which further improves the precision and efficiency of target detection and tracking.
After each controlled rotation of the millimeter-wave radar about the Z_R axis and/or the X_R axis, the processing module 1802 may further update the calibration matrix of the millimeter-wave radar in real time according to the angle of that rotation. In this way, a fast and accurate response of the millimeter-wave radar and camera information fusion can be achieved, ensuring the reliability of the fusion during the attitude adjustment of the radar.
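The decoupled attitude update and calibration refresh described above can be sketched with standard 3×3 rotation matrices: a Z_R (yaw) rotation for curves and an X_R (pitch) rotation for slopes are composed independently and applied to the rotation part of the radar's extrinsic calibration. The matrix layout and composition order are assumptions for illustration, not the embodiment's actual calibration format.

```python
# Sketch of the decoupled Z_R/X_R attitude update. Assumed: the calibration
# is represented by a 3x3 rotation matrix and updated by left-multiplying
# the commanded rotation (illustrative convention only).

import math

def rot_z(angle):
    """Rotation about the Z_R axis (yaw)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(angle):
    """Rotation about the X_R axis (pitch)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def update_calibration(calib, yaw=0.0, pitch=0.0):
    """Apply independent Z_R (yaw) and X_R (pitch) rotations to the
    rotation part of the radar calibration matrix."""
    return matmul(matmul(rot_z(yaw), rot_x(pitch)), calib)

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
updated = update_calibration(identity, yaw=math.pi / 2)
# After a 90-degree yaw, the x axis maps to the y axis.
assert abs(updated[1][0] - 1.0) < 1e-9
```

Because the two rotations act on different axes, adjusting only one degree of freedom leaves the other term an identity, mirroring the decoupling stated in the text.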
Optionally, the processing module 1802 may further be configured to: detect and track the target based on the camera and the millimeter-wave radar respectively, and perform target fusion according to an IoU fusion rule.
The specific target fusion process includes the following.
After obtaining the RGB image of the area in front of the first vehicle based on the camera, the processing module 1802 performs target recognition on the RGB image based on a target recognition model to obtain a first recognition result, where the first recognition result includes the position and type of the second vehicle; the input of the target recognition model is an RGB image, and its output is the position and type of the target. After obtaining the radar point-trace data of the area in front of the first vehicle based on the millimeter-wave radar, the processing module 1802 processes the radar point-trace data to obtain a second recognition result, where the second recognition result includes the position and speed of the second vehicle. When the processing module 1802 determines that the IoU between the region of the second vehicle in the RGB image and the region of the second vehicle in the radar point-trace data is greater than a second threshold M, it fuses the first recognition result and the second recognition result to obtain fusion data; otherwise, it does not fuse the first recognition result and the second recognition result.
The second threshold M satisfies the following relationship with the curvature ρ of the curve on which the first vehicle and/or the second vehicle is located, the travel speed V of the first vehicle, and the travel distance L of the first vehicle:
M = a²ρ + bV + L;
where a and b are preset coefficients.
It should be understood that the above formula is merely an example and not a limitation. In a specific implementation, the second threshold M may also be related to other factors, such as the acceleration of the first vehicle, which is not limited here.
Because the second threshold M is related to the road conditions of the vehicles (the curvature ρ of the curve on which the first vehicle and/or the second vehicle is located) and to the travel state of the first vehicle (that is, the travel speed V and the travel distance L), the accuracy of fusion recognition can be improved, further improving the safety and stability of the vehicle while driving.
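The IoU gate above can be sketched directly: compute the IoU of the camera box and the radar box for the second vehicle, compute M = a²ρ + bV + L, and fuse only when the IoU exceeds M. The form of the threshold formula comes from the text; the coefficient values and box coordinates below are illustrative assumptions.

```python
# Sketch of the IoU-based fusion gate. Only the formula M = a^2*rho + b*V + L
# comes from the text; coefficients a, b and all box values are illustrative.

def iou(box_a, box_b):
    """Intersection over union of two axis-aligned (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def second_threshold(rho, v, l, a=0.1, b=0.001):
    """M = a^2 * rho + b * V + L, with preset coefficients a and b."""
    return a * a * rho + b * v + l

def should_fuse(camera_box, radar_box, rho, v, l):
    """Fuse the camera and radar results only when IoU exceeds M."""
    return iou(camera_box, radar_box) > second_threshold(rho, v, l)

cam = (0.0, 0.0, 2.0, 2.0)
rad = (1.0, 0.0, 3.0, 2.0)   # IoU = 2 / 6 ≈ 0.33
assert should_fuse(cam, rad, rho=0.01, v=20.0, l=0.4) is False
```

Note that a higher curvature, speed, or travel distance raises M, so fusion becomes stricter exactly in the situations where the two sensors' boxes are most likely to disagree.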
Optionally, the processing module 1802 may further be configured to: when the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, track the second vehicle based on the fusion data obtained before the radar entered that state together with the first recognition result obtained while the radar is in that state; or, when the millimeter-wave radar is not in the target-lost state or the target-about-to-be-lost state, track the second vehicle based on multiple consecutive frames of fusion data. Because the apparatus 180 adopts different tracking mechanisms for different millimeter-wave radar states, the accuracy of target tracking can be improved, further improving the safety and stability of the vehicle while driving.
Optionally, when performing target recognition on the RGB image based on the target recognition model, the processing module is specifically configured to: when the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, use a lightweight target recognition model to perform target recognition on the RGB images collected by the camera; and when the millimeter-wave radar is not in the target-lost state or the target-about-to-be-lost state, use a heavyweight target recognition model to perform target recognition on the RGB images collected by the camera. The recognition speed of the lightweight target recognition model is greater than that of the heavyweight model, and the recognition accuracy of the lightweight model is lower than that of the heavyweight model.
Two different target recognition models are designed for camera target recognition and are switched according to the scenario (when the millimeter-wave radar drops frames, the lightweight recognition model is used on the data collected by the camera to increase recognition speed; when the millimeter-wave radar does not drop frames, the heavyweight recognition model is used to increase recognition accuracy), so that both the speed and the accuracy of fusion recognition can be taken into account.
It should be understood that the above takes switching between two models merely as an example; in practical applications, more models can be designed for switching.
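The model switch described above can be sketched as follows. The two model classes are hypothetical stand-ins for real detectors; only the switching rule (lightweight while the radar target is lost or about to be lost, heavyweight otherwise) comes from the text.

```python
# Sketch of the recognition-model switch. LightweightModel and
# HeavyweightModel are hypothetical stand-ins, not real detectors.

class LightweightModel:
    name = "lightweight"          # faster, lower accuracy

    def detect(self, rgb_image):
        return {"model": self.name, "targets": []}

class HeavyweightModel:
    name = "heavyweight"          # slower, higher accuracy

    def detect(self, rgb_image):
        return {"model": self.name, "targets": []}

def pick_model(radar_target_lost: bool):
    """Select the recognition model according to the millimeter-wave radar state."""
    return LightweightModel() if radar_target_lost else HeavyweightModel()

assert pick_model(True).name == "lightweight"
assert pick_model(False).name == "heavyweight"
```

Extending the switch to more than two models, as the text allows, amounts to replacing the boolean with a richer radar-state value and mapping each state to a model.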
It should be noted that, for the specific implementation of the method steps executed by the modules of the apparatus 180, refer to the specific implementation of the corresponding method steps executed by the first vehicle in the foregoing method embodiments; details are not repeated here.
Based on the same technical concept, referring to FIG. 19, an embodiment of this application further provides a target detection apparatus 190, including a processor 1901 and a memory 1902, where the memory 1902 stores instructions executable by the processor 1901, and by executing the instructions stored in the memory 1902, the processor 1901 causes the apparatus 190 to execute the methods shown in FIG. 5, FIG. 10, and FIG. 12. The processor 1901 and the memory 1902 may be coupled through an interface circuit or may be integrated together, which is not limited here.
The specific connection medium between the processor 1901 and the memory 1902 is not limited in this embodiment of this application. In FIG. 19, the processor 1901 and the memory 1902 are connected by a bus, shown as a thick line; the connection manner between other components is merely schematic and is not limiting. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in FIG. 19, but this does not mean that there is only one bus or only one type of bus.
It should be understood that the processor mentioned in the embodiments of this application may be implemented by hardware or by software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented by software, the processor may be a general-purpose processor realized by reading software code stored in a memory.
By way of example, the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It should be understood that the memory mentioned in the embodiments of this application may be a volatile memory or a non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
It should be noted that when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (storage module) may be integrated into the processor.
It should be noted that the memory described herein is intended to include, but not be limited to, these and any other suitable types of memory.
Based on the same technical concept, an embodiment of the present application further provides a computer-readable storage medium including a program or instructions which, when run on a computer, cause the methods shown in FIG. 5, FIG. 10, and FIG. 12 to be performed.
Based on the same technical concept, an embodiment of the present application further provides a chip coupled to a memory and configured to read and execute program instructions stored in the memory, so as to implement the methods shown in FIG. 5, FIG. 10, and FIG. 12.
Based on the same technical concept, an embodiment of the present application further provides a computer program product storing instructions which, when run on a computer, cause the computer to perform the methods shown in FIG. 5, FIG. 10, and FIG. 12.
Based on the same technical concept, an embodiment of the present application further provides a vehicle including a target detection apparatus, a millimeter-wave radar, and a camera. The target detection apparatus is configured to implement the methods shown in FIG. 5, FIG. 10, and FIG. 12 by controlling the millimeter-wave radar and the camera. The structure of the vehicle may be as shown in FIG. 11.
As will be appreciated by those skilled in the art, the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the present application without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the present application and their equivalent technologies, the present application is also intended to include them.

Claims (16)

  1. A target detection method, characterized in that the method is applied to a first vehicle and comprises:
    detecting a speed of a second vehicle based on a millimeter-wave radar, wherein the second vehicle is located in front of the first vehicle, and the speed comprises a linear speed and an angular speed;
    acquiring road conditions of the first vehicle and the second vehicle, and determining a first threshold according to the road conditions; and
    if the speed of the second vehicle exceeds the first threshold, determining that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
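For illustration only, the decision logic of claim 1 can be sketched in Python. The function and variable names below are hypothetical, and the concrete thresholds are those defined in claims 4-6, not fixed values:

```python
def is_target_lost(second_vehicle_speed: float, first_threshold: float) -> bool:
    """Decide whether the millimeter-wave radar should be treated as being in
    the target-lost or target-about-to-be-lost state (claim 1).

    Depending on the road conditions, the speed is either a linear speed or
    an angular speed, and the matching first threshold must be used.
    """
    return second_vehicle_speed > first_threshold

# Example: an angular speed of 0.30 rad/s against a threshold of 0.25 rad/s.
print(is_target_lost(0.30, 0.25))  # True -> the target is (about to be) lost
```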
  2. The method according to claim 1, wherein acquiring the road conditions of the first vehicle and the second vehicle comprises:
    acquiring an RGB image of the area in front of the first vehicle based on a camera, and determining the road condition of the second vehicle according to the RGB image; and
    acquiring position information of the first vehicle based on a positioning apparatus, and determining the road condition of the first vehicle according to the position information.
  3. The method according to claim 2, wherein determining the road condition of the second vehicle according to the RGB image comprises:
    extracting feature points from a lane line in the far field of view included in the RGB image, and determining the inflection point and direction of the road on which the second vehicle is located according to the extracted feature points.
  4. The method according to any one of claims 1-3, wherein the road conditions of the first vehicle and the second vehicle are: the first vehicle is on a straight road and the second vehicle is on a curve;
    detecting the speed of the second vehicle based on the millimeter-wave radar comprises:
    detecting, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle moving in a circle around the curve; and
    the first threshold N determined according to the road conditions satisfies:
    Figure PCTCN2021124194-appb-100001
    wherein ω_r is the angular velocity of the second vehicle moving in a circle around the curve from time t0 to time t1;
    Figure PCTCN2021124194-appb-100002
    is the Euclidean distance between the first vehicle and the second vehicle at time t0;
    Figure PCTCN2021124194-appb-100003
    is the displacement of the second vehicle from time t0 to time t1; α is half of the detection beam angle of the millimeter-wave radar; time t0 is the time at which the millimeter-wave radar collects a first frame of data, time t1 is the time at which the millimeter-wave radar collects a second frame of data, and the first frame of data and the second frame of data are two consecutive frames of data; K is a coefficient greater than 0 and less than or equal to 1; and ε is the angle of the central angle formed at time t0 between the position A of the first vehicle, the position B of the second vehicle, and the center O of the curve.
  5. The method according to any one of claims 1-3, wherein the road conditions of the first vehicle and the second vehicle are: the first vehicle is on a curve and the second vehicle is on a straight road;
    detecting the speed of the second vehicle based on the millimeter-wave radar comprises:
    detecting, based on the millimeter-wave radar, the instantaneous traveling speed of the second vehicle on the straight road; and
    the first threshold N determined according to the road conditions satisfies:
    Figure PCTCN2021124194-appb-100004
    wherein
    Figure PCTCN2021124194-appb-100005
    is the displacement of the second vehicle from time t0 to time t1; v_r is the traveling speed of the second vehicle on the straight road from time t0 to time t1; time t0 is the time at which the millimeter-wave radar collects a first frame of data, time t1 is the time at which the millimeter-wave radar collects a second frame of data, and the first frame of data and the second frame of data are two consecutive frames of data; and P is a coefficient greater than 0 and less than or equal to 1.
  6. The method according to any one of claims 1-3, wherein the road conditions of the first vehicle and the second vehicle are: both the first vehicle and the second vehicle are on a curve;
    detecting the speed of the second vehicle based on the millimeter-wave radar comprises:
    detecting, based on the millimeter-wave radar, the instantaneous angular velocity of the second vehicle moving in a circle around the curve; and
    the first threshold N determined according to the road conditions satisfies:
    Figure PCTCN2021124194-appb-100006
    wherein ω_r is the angular velocity of the second vehicle moving in a circle around the curve from time t0 to time t1; v is the instantaneous traveling speed of the first vehicle; α is half of the detection beam angle of the millimeter-wave radar; time t0 is the time at which the millimeter-wave radar collects a first frame of data, time t1 is the time at which the millimeter-wave radar collects a second frame of data, and the first frame of data and the second frame of data are two consecutive frames of data; Q is a coefficient greater than 0 and less than or equal to 1; and
    Figure PCTCN2021124194-appb-100007
    is the Euclidean distance between the first vehicle and the second vehicle at time t0.
  7. The method according to any one of claims 1-6, further comprising, after determining that the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state:
    controlling, according to the road conditions, the millimeter-wave radar to rotate about a Z_R axis and/or about an X_R axis, so that the millimeter-wave radar detects the second vehicle again, wherein the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the traveling direction of the first vehicle.
  8. The method according to claim 7, wherein controlling, according to the road conditions, the millimeter-wave radar to rotate about the Z_R axis and/or about the X_R axis comprises:
    when the first vehicle and/or the second vehicle is on a curve, controlling the millimeter-wave radar to rotate about the Z_R axis by a first angle; or
    when the first vehicle and/or the second vehicle is on a slope, controlling the millimeter-wave radar to rotate about the X_R axis by a second angle; or
    when the first vehicle and/or the second vehicle is on a combined curve-and-slope road section, controlling the millimeter-wave radar to rotate about the Z_R axis by a third angle and about the X_R axis by a fourth angle.
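The case analysis of claim 8 amounts to a dispatch on the road condition. The sketch below is illustrative only; the angle values passed in are hypothetical placeholders for the first to fourth angles, which the claim leaves unspecified:

```python
def plan_radar_rotation(on_curve: bool, on_slope: bool, angles: dict):
    """Return (rotation about Z_R, rotation about X_R) per claim 8.

    Z_R is the vertical (yaw) axis; X_R is the horizontal (pitch) axis
    perpendicular to the vehicle's direction of travel.
    """
    if on_curve and on_slope:   # combined curve-and-slope section
        return (angles["third"], angles["fourth"])
    if on_curve:                # curve only: yaw correction
        return (angles["first"], 0.0)
    if on_slope:                # slope only: pitch correction
        return (0.0, angles["second"])
    return (0.0, 0.0)           # straight, level road: no rotation needed

example_angles = {"first": 5.0, "second": 3.0, "third": 4.0, "fourth": 2.0}
print(plan_radar_rotation(True, True, example_angles))  # (4.0, 2.0)
```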
  9. The method according to claim 7 or 8, further comprising:
    after each time the millimeter-wave radar is controlled to rotate about the Z_R axis and/or about the X_R axis, updating the calibration matrix of the millimeter-wave radar in real time according to the angle of that rotation.
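Claim 9 keeps the radar's extrinsic calibration consistent with the mechanical rotation just applied. Below is a minimal sketch, assuming the calibration matrix is a 3x3 rotation and that each step's yaw (about Z_R) and pitch (about X_R) are composed onto it from the left; the composition order is an assumption, not stated in the claim:

```python
import math

def mat_mul(a, b):
    """Multiply two 3x3 matrices represented as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rot_z(theta):
    """Rotation about the vertical Z_R axis (yaw), angle in radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(theta):
    """Rotation about the lateral X_R axis (pitch), angle in radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def update_calibration(r_old, yaw, pitch):
    """Compose this step's rotation onto the previous calibration matrix."""
    return mat_mul(mat_mul(rot_z(yaw), rot_x(pitch)), r_old)

IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
R = update_calibration(IDENTITY, math.pi / 2, 0.0)
# A 90-degree yaw maps the radar x-axis (1, 0, 0) onto the y-axis (0, 1, 0).
print(abs(R[1][0] - 1.0) < 1e-9 and abs(R[0][0]) < 1e-9)  # True
```

Updating the matrix after every incremental rotation keeps the radar-to-vehicle transform valid without a full re-calibration.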
  10. The method according to any one of claims 1-9, further comprising:
    acquiring an RGB image of the area in front of the first vehicle based on a camera, and performing target recognition on the RGB image based on a target recognition model to obtain a first recognition result, wherein the first recognition result includes the position and type of the second vehicle, the input of the target recognition model is an RGB image, and its output is the position and type of a target;
    acquiring radar trace data of the area in front of the first vehicle based on the millimeter-wave radar, and processing the radar trace data to obtain a second recognition result, wherein the second recognition result includes the position and speed of the second vehicle; and
    when it is determined that the intersection-over-union IoU of the region of the second vehicle in the RGB image and the region of the second vehicle in the radar trace data is greater than a second threshold M, fusing the first recognition result and the second recognition result to obtain fused data,
    wherein the second threshold M, the curvature ρ of the curve on which the first vehicle and/or the second vehicle is located, the traveling speed V of the first vehicle, and the traveling distance L of the first vehicle satisfy the following relationship:
    M = a²ρ + bV + L,
    where a and b are preset coefficients.
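The fusion gate of claim 10 can be sketched as follows. The IoU computation is the standard one for axis-aligned boxes; the threshold follows M = a²ρ + bV + L as stated in the claim; all numeric values are hypothetical examples, since the claim does not fix the units or magnitudes of a, b, ρ, V, or L:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def second_threshold(rho, speed_v, distance_l, a, b):
    """M = a^2 * rho + b * V + L, with a and b preset coefficients."""
    return a * a * rho + b * speed_v + distance_l

# Camera-detected and radar-detected regions of the second vehicle.
camera_box = (10.0, 10.0, 50.0, 50.0)
radar_box = (30.0, 30.0, 70.0, 70.0)
overlap = iou(camera_box, radar_box)
print(round(overlap, 4))  # 0.1429 -> fuse the two results only if IoU > M
```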
  11. The method according to claim 10, further comprising:
    when the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, tracking the second vehicle based on the fused data obtained before the millimeter-wave radar entered that state and the first recognition result obtained while the millimeter-wave radar is in that state; or
    when the millimeter-wave radar is not in the target-lost state or the target-about-to-be-lost state, tracking the second vehicle based on multiple consecutive frames of fused data.
  12. The method according to claim 10, wherein performing target recognition on the RGB image based on a target recognition model comprises:
    when the millimeter-wave radar is in the target-lost state or the target-about-to-be-lost state, performing target recognition on the RGB image collected by the camera using a lightweight target recognition model; and
    when the millimeter-wave radar is not in the target-lost state or the target-about-to-be-lost state, performing target recognition on the RGB image collected by the camera using a heavyweight target recognition model,
    wherein the recognition speed of the lightweight target recognition model is higher than that of the heavyweight target recognition model, and the recognition accuracy of the lightweight target recognition model is lower than that of the heavyweight target recognition model.
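Claim 12 trades recognition accuracy for speed while the radar is losing the target. A minimal illustrative sketch, in which the model objects are placeholders rather than real detectors:

```python
def select_recognition_model(radar_target_lost: bool, light_model, heavy_model):
    """Claim 12: while the radar is in the target-lost (or about-to-be-lost)
    state, use the faster but less accurate lightweight model so the camera
    pipeline can keep up on its own; otherwise use the slower, more
    accurate heavyweight model."""
    return light_model if radar_target_lost else heavy_model

print(select_recognition_model(True, "lightweight", "heavyweight"))   # lightweight
print(select_recognition_model(False, "lightweight", "heavyweight"))  # heavyweight
```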
  13. A target detection apparatus, characterized in that the apparatus is located in a first vehicle and comprises:
    a detection module, configured to detect a speed of a second vehicle based on a millimeter-wave radar, wherein the second vehicle is located in front of the first vehicle, and the speed comprises a linear speed and an angular speed; and
    a processing module, configured to acquire road conditions of the first vehicle and the second vehicle, determine a first threshold according to the road conditions, and, if the speed of the second vehicle exceeds the first threshold, determine that the millimeter-wave radar is in a target-lost state or a target-about-to-be-lost state.
  14. A target detection apparatus, characterized by comprising a processor and a memory,
    wherein the memory stores instructions executable by the processor, and the processor, by executing the instructions stored in the memory, causes the apparatus to perform the method according to any one of claims 1-12.
  15. A computer-readable storage medium, characterized by comprising a program or instructions which, when run on a computer, cause the method according to any one of claims 1-12 to be performed.
  16. A vehicle, characterized in that the vehicle comprises a target detection apparatus, a millimeter-wave radar, and a camera,
    wherein the target detection apparatus is configured to implement the method according to any one of claims 1-12 by controlling the millimeter-wave radar and the camera.
PCT/CN2021/124194 2021-01-22 2021-10-15 Target detection method and apparatus WO2022156276A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110089080.2A CN114779229A (en) 2021-01-22 2021-01-22 Target detection method and device
CN202110089080.2 2021-01-22

Publications (1)

Publication Number Publication Date
WO2022156276A1 true WO2022156276A1 (en) 2022-07-28

Family

ID=82407697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/124194 WO2022156276A1 (en) 2021-01-22 2021-10-15 Target detection method and apparatus

Country Status (2)

Country Link
CN (1) CN114779229A (en)
WO (1) WO2022156276A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115966084B (en) * 2023-03-17 2023-06-09 江西昂然信息技术有限公司 Holographic intersection millimeter wave radar data processing method and device and computer equipment
CN117238143B (en) * 2023-09-15 2024-03-22 北京卓视智通科技有限责任公司 Traffic data fusion method, system and device based on radar double-spectrum camera

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09207609A (en) * 1996-01-31 1997-08-12 Fujitsu Ten Ltd Vehicle recognizing device
CN105182342A (en) * 2015-09-29 2015-12-23 长安大学 Device and method for tracking radar target position of vehicle on bumpy road
CN106043277A (en) * 2016-06-30 2016-10-26 大连楼兰科技股份有限公司 Vehicle automatic car-following control system and method, vehicle automatic car-following system and method, and control radar steering method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIANG YI: "Research on Preceding Vehicle Detection Method Based on Millimeter Wave Radar and Deep Learning Visual Information Fusion", CHINESE MASTER’S THESES DATABASE, ENGINEERING SCIENCE AND TECHNOLOGY II-SOUTH CHINA UNIVERSITY OF TECHNOLOGY GUANGZHOU, CHINA, 11 April 2019 (2019-04-11), XP055954603 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115601308A (en) * 2022-09-22 2023-01-13 中国人民解放军军事科学院国防科技创新研究院(Cn) Target tracking method, device, equipment and storage medium based on photoelectric pod
CN115620337A (en) * 2022-10-11 2023-01-17 深圳市谷奇创新科技有限公司 Optical fiber sensor monitoring method and system for vital signs
CN116453346A (en) * 2023-06-20 2023-07-18 山东高速信息集团有限公司 Vehicle-road cooperation method, device and medium based on radar fusion layout
CN116453346B (en) * 2023-06-20 2023-09-19 山东高速信息集团有限公司 Vehicle-road cooperation method, device and medium based on radar fusion layout
CN116500621A (en) * 2023-06-27 2023-07-28 长沙莫之比智能科技有限公司 Radar blind area early warning method based on double-subframe obstacle recognition
CN116500621B (en) * 2023-06-27 2023-08-29 长沙莫之比智能科技有限公司 Radar blind area early warning method based on double-subframe obstacle recognition
CN116543032A (en) * 2023-07-06 2023-08-04 中国第一汽车股份有限公司 Impact object ranging method, device, ranging equipment and storage medium
CN116543032B (en) * 2023-07-06 2023-11-21 中国第一汽车股份有限公司 Impact object ranging method, device, ranging equipment and storage medium
CN117129982A (en) * 2023-08-28 2023-11-28 河北德冠隆电子科技有限公司 Linear scanning angle accurate adjustable data dynamic fusion perception radar
CN117292579A (en) * 2023-10-27 2023-12-26 浪潮智慧科技有限公司 Highway curve early warning method, equipment and medium based on big data
CN117369350A (en) * 2023-12-08 2024-01-09 北京市农林科学院智能装备技术研究中心 High-speed seeder control system, method, electronic equipment and storage medium
CN117369350B (en) * 2023-12-08 2024-04-16 北京市农林科学院智能装备技术研究中心 High-speed seeder control system, method, electronic equipment and storage medium
CN117672007A (en) * 2024-02-03 2024-03-08 福建省高速公路科技创新研究院有限公司 Road construction area safety precaution system based on thunder fuses
CN117672007B (en) * 2024-02-03 2024-04-26 福建省高速公路科技创新研究院有限公司 Road construction area safety precaution system based on thunder fuses
CN117975732A (en) * 2024-03-28 2024-05-03 中铁十六局集团有限公司 Intelligent traffic control system and method for tunnel
CN117975732B (en) * 2024-03-28 2024-05-28 中铁十六局集团有限公司 Intelligent traffic control system and method for tunnel

Also Published As

Publication number Publication date
CN114779229A (en) 2022-07-22

Similar Documents

Publication Publication Date Title
WO2022156276A1 (en) Target detection method and apparatus
CN109927719B (en) Auxiliary driving method and system based on obstacle trajectory prediction
US9669829B2 (en) Travel lane marking recognition system
RU2738491C1 (en) Method for correction of position error and device for correction of position error in vehicle with driving assistance
US9880554B2 (en) Misrecognition determination device
CN108688660B (en) Operating range determining device
US9235767B2 (en) Detection region modification for driving assistance apparatus and driving assistance method
JP3619628B2 (en) Driving environment recognition device
JP7346499B2 (en) Information processing device, information processing method, and program
EP3349143B1 (en) Nformation processing device, information processing method, and computer-readable medium
EP3674971B1 (en) Method and system for training machine learning algorithm to detect objects at distance
US20190071094A1 (en) Vehicle control system, vehicle control method, and storage medium
JP6171612B2 (en) Virtual lane generation apparatus and program
US11092442B2 (en) Host vehicle position estimation device
JP6450294B2 (en) Object detection apparatus, object detection method, and program
US20200307569A1 (en) Vehicle control device, vehicle control method, and storage medium
US11326889B2 (en) Driver assistance system and control method for the same
EP3819180A1 (en) Method and processor for controlling in-lane movement of autonomous vehicle
JP2005332192A (en) Steering support system
US20230384441A1 (en) Estimating three-dimensional target heading using a single snapshot
JP2004265432A (en) Travel environment recognition device
JP4843880B2 (en) Road environment detection device
JP6609292B2 (en) Outside environment recognition device
WO2021074660A1 (en) Object recognition method and object recognition device
CN116242375A (en) High-precision electronic map generation method and system based on multiple sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21920654

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21920654

Country of ref document: EP

Kind code of ref document: A1