CN114779229A - Target detection method and device - Google Patents



Publication number
CN114779229A
CN114779229A (application CN202110089080.2A)
Authority
CN
China
Prior art keywords
vehicle
millimeter wave radar
target
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110089080.2A
Other languages
Chinese (zh)
Inventor
黄梓亮
郑永豪
王灿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110089080.2A priority Critical patent/CN114779229A/en
Priority to PCT/CN2021/124194 priority patent/WO2022156276A1/en
Publication of CN114779229A publication Critical patent/CN114779229A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiments of the present application provide a target detection method and apparatus. In the method, a first vehicle detects the speed of a second vehicle based on a millimeter wave radar; acquires the road conditions where the first vehicle and the second vehicle are located; determines a first threshold according to the acquired road conditions; and then compares the speed of the second vehicle with the first threshold. If the speed of the second vehicle exceeds the first threshold, the first vehicle determines that the millimeter wave radar is in a target loss state or a state in which the target is about to be lost; otherwise, the first vehicle determines that the millimeter wave radar is not in such a state. In this way, under special road conditions (such as curves and ramps), the first vehicle can promptly and effectively identify target loss of the millimeter wave radar caused by the road condition, avoiding the illusion that no target obstacle exists in front of the first vehicle, and thereby improving the safety and stability of the vehicle during driving.

Description

Target detection method and device
Technical Field
The application relates to the technical field of automatic driving, in particular to a target detection method and device.
Background
With the rapid development of the economy and society, automatic driving technology has gradually matured and been put into practical use. However, some functions under special, dangerous road conditions, such as the detection and tracking of targets at curves, remain a significant problem in the field of automatic driving.
The prior art mainly targets straight-road conditions, where the millimeter wave radar and the forward-looking camera first track and then fuse, with only minor modifications to the fusion method; or it additionally arranges sensors to cover each blind-zone range, focusing on removing invalid targets in combination with curve lane lines and on continuous-frame tracking of targets within the drivable space. However, since the detection ranges of sensors such as the camera and the millimeter wave radar are fixed, blind zones still occur under curved road conditions and cannot be covered, so the target obstacle is considered lost, creating the illusion that no target obstacle exists in front of the vehicle.
Some technologies propose efficient identification of target vehicles tracked in curves based on Vehicle-to-Vehicle (V2V) technology. For example, when the tail of the preceding vehicle coincides with the boundary line of the blind zone, the ego vehicle acquires the speed of the vehicle inside the blind zone through the V2V system of the Internet of Vehicles, and fuses that speed with the blind-zone boundary to form a virtual blind-zone boundary following model with speed. This scheme relies on costly V2X equipment to obtain information about the preceding vehicle; moreover, since the deployment of current V2X networks is incomplete, the overall scheme is difficult to implement.
Therefore, how to effectively identify and track the target vehicle in the curve with low cost and high efficiency and improve the safety of the vehicle is a technical problem to be solved by the application.
Disclosure of Invention
The application provides a target detection method and a target detection device, which are used for effectively identifying and tracking a target vehicle in a curve at low cost and high efficiency, and further improving the safety and stability of the vehicle in the driving process.
In a first aspect, a target detection method is provided, which can be applied to a vehicle or to a chip inside a vehicle. Taking application to a first vehicle as an example, the method comprises: the first vehicle detects the speed of a preceding vehicle based on the millimeter wave radar, where the speed comprises a linear speed and an angular speed, and the preceding vehicle is, for example, a second vehicle; the first vehicle acquires the road conditions where the first vehicle and the second vehicle are located; the first vehicle determines a first threshold according to the acquired road conditions; and the first vehicle compares the speed of the second vehicle with the first threshold. If the speed of the second vehicle exceeds the first threshold, the first vehicle determines that the millimeter wave radar is in a target loss state or a state in which the target is about to be lost; otherwise, the first vehicle determines that the millimeter wave radar is not in such a state.
In the embodiment of the application, the first vehicle determines the first threshold according to the road condition and compares the real-time speed of the preceding vehicle (i.e., the second vehicle) with the first threshold, so as to judge whether the millimeter wave radar is in a target loss state or a state in which the target is about to be lost. In this way, under special road conditions (such as curves and ramps), the first vehicle can promptly and effectively identify target loss of the millimeter wave radar caused by the road condition, avoiding the illusion that no target obstacle exists in front of the first vehicle, and thereby improving the safety and stability of the vehicle during driving.
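The comparison logic described above can be sketched as follows. The threshold values and the road-condition lookup are illustrative stand-ins (the patent derives the first threshold N from the radar geometry per road condition, in formulas given later), not the patent's own code:

```python
# Hypothetical sketch of the first-aspect method. All names and numeric
# thresholds are illustrative assumptions, not values from the patent.
def radar_target_state(second_vehicle_speed: float, road_condition: str) -> str:
    """Return 'lost_or_about_to_lose' if the detected speed of the
    preceding vehicle exceeds the road-condition-dependent first threshold."""
    # Placeholder thresholds; the patent computes N from the radar's
    # detection beam angle, inter-vehicle distance, etc., per road condition.
    first_thresholds = {
        "straight": 2.0,   # e.g. an m/s bound on a straight road
        "curve": 0.15,     # e.g. a rad/s angular-velocity bound in a curve
        "ramp": 1.0,
    }
    n = first_thresholds.get(road_condition, 2.0)
    if second_vehicle_speed > n:
        return "lost_or_about_to_lose"
    return "tracking"
```

For example, `radar_target_state(0.2, "curve")` flags the target as lost or about to be lost because the angular velocity exceeds the curve threshold.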
In one possible design, the first vehicle may use a camera to acquire an RGB image of the front of the first vehicle, and then determine the road condition where the second vehicle is located based on the RGB image.
Therefore, the road condition in front of the first vehicle can be efficiently and quickly identified, and the road condition where the second vehicle is located is determined.
In one possible design, when the first vehicle determines the road condition of the second vehicle from the RGB image, feature points may specifically be extracted from the lane lines in the far field of view contained in the RGB image; the inflection point and direction of the road where the second vehicle is located are then determined from the extracted feature points, thereby obtaining the road condition of the second vehicle.
Because the detection range of the millimeter wave radar in the near field of view is larger than that in the far field of view, targets in the near field are generally not easily lost. The embodiment of the application therefore processes only the lane lines in the far field of view, which reduces the amount of computation while ensuring accuracy, and improves computational efficiency.
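The far-field lane-line processing can be illustrated with a minimal sketch: assuming lane-line feature points have already been extracted from the RGB image upstream, a cubic fit yields the road's inflection point and bend direction. The polynomial model, the function name, and the left/right sign convention are assumptions for illustration:

```python
import numpy as np

def lane_inflection(xs, ys):
    """xs, ys: lane-line feature points in the far field of view.
    Returns (x_inflect, direction); the sign convention for 'left'/'right'
    is an assumed image-coordinate convention."""
    coeffs = np.polyfit(xs, ys, 3)        # fit y = a*x^3 + b*x^2 + c*x + d
    a, b = coeffs[0], coeffs[1]
    x_inflect = -b / (3 * a)              # y'' = 6*a*x + 2*b = 0
    # Curvature sign at the far end of the fit indicates the bend direction.
    curv_far = 6 * a * max(xs) + 2 * b
    direction = "left" if curv_far > 0 else "right"
    return x_inflect, direction
```

On points sampled from y = x³ - 3x², for instance, the recovered inflection is at x = 1, matching the analytic second-derivative zero.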
In one possible design, the first vehicle may obtain the position information of the first vehicle based on the positioning device, and then determine the road condition where the first vehicle is located according to the position information.
For example, the positioning device may be GPS, the BeiDou system, or another positioning system, and is used to receive satellite signals and locate the current position of the first vehicle. In addition, visual positioning, millimeter wave radar positioning, fused positioning, and the like may also be used; the application is not limited in this respect.
After obtaining its position information, the first vehicle determines, based on a map, the road condition at its position, such as a curve, a straight road, an uphill slope, or a downhill slope.
Therefore, the first vehicle can quickly and accurately obtain the road condition of the first vehicle.
In one possible design, the first threshold may be determined based on the critical speed at which the second vehicle passes from within the detection range of the millimeter wave radar to outside it; for example, the first threshold is less than or equal to this critical speed. When the speed of the second vehicle is far greater than the critical speed, the second vehicle has exceeded the detection range of the millimeter wave radar and entered its detection blind zone, so the radar is in a target loss state; when the speed of the second vehicle is near the critical speed, the second vehicle may exceed the detection range at any moment, i.e., it is about to enter the detection blind zone, so the radar is in a state in which the target is about to be lost. In the embodiment of the present application, the design of the first threshold may differ according to the road condition; three specific examples are given below.
Example 1: the road conditions of the first vehicle and the second vehicle are that the first vehicle is on a straight road and the second vehicle is in a curve. Detecting the speed of the second vehicle based on the millimeter wave radar may include: detecting, based on the millimeter wave radar, the instantaneous angular velocity of the second vehicle moving circularly around the curve. The first threshold N determined according to the road condition satisfies a relation given as a formula image in the original publication, in which: ω_r is the angular velocity of the circular motion of the second vehicle around the curve from time t0 to time t1; the Euclidean distance between the first vehicle and the second vehicle at time t0 also enters the relation; α is half of the detection beam angle of the millimeter wave radar; time t0 is the moment at which the millimeter wave radar acquires the first frame of data, time t1 is the moment at which it acquires the second frame, and the two frames are consecutive; k is a coefficient greater than 0 and less than or equal to 1; ε is the central angle formed by the position A of the first vehicle, the position B of the second vehicle, and the center O of the curve at time t0.
Example 2: the road conditions of the first vehicle and the second vehicle are that the first vehicle is in a curve and the second vehicle is on a straight road. Detecting the speed of the second vehicle based on the millimeter wave radar may include: detecting the instantaneous travel speed of the second vehicle on the straight road. The first threshold N determined according to the road condition satisfies a relation given as a formula image in the original publication, in which: the displacement of the second vehicle from time t0 to time t1 enters the relation; v_r is the travel speed of the second vehicle on the straight road from time t0 to time t1; time t0 is the moment at which the millimeter wave radar acquires the first frame of data, time t1 is the moment at which it acquires the second frame, and the two frames are consecutive; p is a coefficient greater than 0 and less than or equal to 1.
Example 3: the road conditions of the first vehicle and the second vehicle are that both vehicles are in a curve. Detecting the speed of the second vehicle based on the millimeter wave radar may include: detecting the instantaneous angular velocity of the second vehicle moving circularly around the curve. The first threshold N determined according to the road condition satisfies a relation given as a formula image in the original publication, in which: ω_r is the angular velocity of the circular motion of the second vehicle around the curve from time t0 to time t1; V is the instantaneous travel speed of the first vehicle; α is half of the detection beam angle of the millimeter wave radar; time t0 is the moment at which the millimeter wave radar acquires the first frame of data, time t1 is the moment at which it acquires the second frame, and the two frames are consecutive; q is a coefficient greater than 0 and less than or equal to 1; the Euclidean distance between the first vehicle and the second vehicle at time t0 also enters the relation.
it should be noted that the above three are only examples and are not limiting. In practical application, the road condition is not limited to a curve scene, and the preset condition can be designed by adopting the same idea for other road scenes (such as ascending/descending, acceleration/deceleration and the like).
In one possible design, after the first vehicle determines that the millimeter wave radar is in the target loss state or the state in which the target is about to be lost, the first vehicle can control the millimeter wave radar, according to the road condition, to rotate about the Z_R axis and/or about the X_R axis so that the millimeter wave radar re-detects the second vehicle; the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the driving direction of the first vehicle.
It will be appreciated that the two rotational degrees of freedom (rotation about the Z_R axis and rotation about the X_R axis) are independent (or decoupled) from each other both in the mechanical structure and in the control logic, so the first vehicle can adjust either degree of freedom alone or both simultaneously. For example, when the first vehicle and/or the second vehicle is in a curve, the first vehicle may control the millimeter wave radar to rotate about the Z_R axis by a first angle. When the first vehicle and/or the second vehicle is on a slope, the first vehicle may control the millimeter wave radar to rotate about the X_R axis by a second angle. When the first vehicle and/or the second vehicle is on a combined section (e.g., a curve with a slope), the first vehicle may control the millimeter wave radar to rotate about the Z_R axis by a third angle and about the X_R axis by a fourth angle.
In the embodiment of the application, rotation of the millimeter wave radar about the Z_R axis and rotation about the X_R axis are independent of each other, so the accuracy and efficiency of the radar's attitude adjustment can be improved, and in turn the accuracy and efficiency of target detection and tracking.
In one possible design, each time the first vehicle controls the millimeter wave radar to rotate about the Z_R axis and/or the X_R axis, the calibration matrix of the millimeter wave radar can be updated in real time according to the rotation angle.
Therefore, the quick and accurate response of the information fusion of the millimeter wave radar and the camera can be achieved, and the reliability of the information fusion of the millimeter wave radar and the camera in the attitude adjustment process of the millimeter wave radar is ensured.
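The calibration update after each rotation might look like the following sketch. The 4x4 homogeneous matrix layout, the rotation order, the frame conventions, and the initial radar-to-camera extrinsic T0 are assumptions for illustration, since the patent does not specify them:

```python
import numpy as np

def rot_z(g):
    """Rotation matrix about the Z_R axis by angle g (radians)."""
    c, s = np.cos(g), np.sin(g)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_x(b):
    """Rotation matrix about the X_R axis by angle b (radians)."""
    c, s = np.cos(b), np.sin(b)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def update_extrinsic(T0: np.ndarray, gamma: float, beta: float) -> np.ndarray:
    """T0: 4x4 homogeneous radar-to-camera transform before the rotation.
    After the radar body (and its frame) rotates by R = Rz(gamma) @ Rx(beta),
    coordinates in the new radar frame map to the old frame by p_old = R @ p_new,
    so the updated extrinsic is T0 composed with R."""
    R = rot_z(gamma) @ rot_x(beta)
    T_rot = np.eye(4)
    T_rot[:3, :3] = R
    return T0 @ T_rot
```

Because the two rotations are decoupled, a pure-curve adjustment passes only `gamma` and a pure-slope adjustment passes only `beta`.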
In one possible design, the first vehicle may detect and track the target based on the camera and the millimeter wave radar, respectively, and perform target fusion according to an IoU fusion rule. Specifically, the first vehicle acquires an RGB image of the area in front of it based on the camera and performs target recognition on the RGB image with a target recognition model to obtain a first recognition result, which includes the position and type of the second vehicle; the input of the target recognition model is the RGB image and its output is the position and type of the target. The first vehicle acquires radar point-trace data of the area in front of it based on the millimeter wave radar and processes the data to obtain a second recognition result, which includes the position and speed of the second vehicle. When the first vehicle judges that the IoU between the region of the second vehicle in the RGB image and the region of the second vehicle in the radar point-trace data is larger than a second threshold M, it fuses the first recognition result and the second recognition result to obtain fused data; otherwise, the two results are not fused. The second threshold M, the curvature ρ of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle satisfy the following relation:

M = a²ρ + bV + L;

where a and b are preset coefficients.
It should be understood that the above formula is merely exemplary and not limiting; in a specific implementation, the second threshold M may also be related to other factors, such as the acceleration of the first vehicle, which is not limited here.
The second threshold M in the embodiment of the present application is related to the road condition of the vehicle (the curvature ρ of the curve where the first vehicle and/or the second vehicle is located) and to the driving state of the first vehicle (i.e., the driving speed V and the driving distance L), which can improve the accuracy of fused recognition and thereby the safety and stability of the vehicle during driving.
In one possible design, when the millimeter wave radar is in the target loss state or the state in which the target is about to be lost, the first vehicle may track the second vehicle based on the fused data obtained before that state together with the camera's first recognition result obtained during that state; alternatively, when the millimeter wave radar is not in such a state, the first vehicle may track the second vehicle based on consecutive multi-frame fused data.
In the embodiment, the first vehicle adopts different tracking mechanisms according to different millimeter wave radar states, so that the accuracy of target tracking can be improved, and the safety and the stability of the vehicle in the driving process are further improved.
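The two tracking mechanisms can be expressed as a simple input switch; the data types and the function name are stand-ins for illustration:

```python
# Illustrative tracker-input switch: while the radar target is lost or about
# to be lost, fall back to the last fused frame plus the camera-only result;
# otherwise consume the consecutive multi-frame fused data.
def tracking_inputs(radar_losing: bool, fused_history: list, camera_result):
    """Return (fused_frames, camera_fallback) for the tracker."""
    if radar_losing:
        # Last fused data before the loss + current camera recognition result.
        return fused_history[-1:], camera_result
    return fused_history, None
```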
In one possible design, the target recognition performed by the first vehicle on the RGB image based on the target recognition model may specifically include: when the millimeter wave radar is in the target loss state or the state in which the target is about to be lost, performing target recognition on the RGB image acquired by the camera using a lightweight target recognition model; when the millimeter wave radar is not in such a state, performing target recognition on the RGB image acquired by the camera using a heavyweight target recognition model. The recognition speed of the lightweight model is greater than that of the heavyweight model, while its recognition accuracy is lower.
For camera target recognition, the embodiment of the application designs two different target recognition models: when the millimeter wave radar loses frames, the lightweight recognition model is used on the data collected by the camera to increase recognition speed; when the millimeter wave radar does not lose frames, the heavyweight recognition model is used to increase recognition accuracy. Both the speed and the accuracy of fused recognition can thus be attained.
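The two-model design can be sketched as a simple switch between recognisers; the model objects here are placeholders, not actual network definitions:

```python
# Illustrative two-model switch: a fast lightweight recogniser while the radar
# is losing (or about to lose) the target, a slower, more accurate heavyweight
# recogniser otherwise. Any callable taking an image can stand in for a model.
class TwoSpeedRecognizer:
    def __init__(self, light_model, heavy_model):
        self.light = light_model    # faster, lower accuracy
        self.heavy = heavy_model    # slower, higher accuracy

    def recognize(self, rgb_image, radar_losing_target: bool):
        model = self.light if radar_losing_target else self.heavy
        return model(rgb_image)
```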
In a second aspect, a target detection apparatus is provided that is located in a first vehicle; it may be, for example, a chip disposed within the vehicle. The apparatus includes modules, units, or means for performing the steps performed by the first vehicle in the first aspect or any one of its possible designs; these functions may be implemented by software, or by hardware executing corresponding software.
Illustratively, the apparatus may include: a detection module for detecting a speed of the second vehicle based on the millimeter wave radar; wherein the second vehicle is located forward of the first vehicle, the speed comprising a linear speed and an angular speed; the processing module is used for acquiring road conditions of a first vehicle and a second vehicle; determining a first threshold value according to the road condition; and if the speed of the second vehicle exceeds the first threshold value, determining that the millimeter wave radar is in a target loss state or a target to be lost state.
In a third aspect, there is provided a computer readable storage medium comprising a program or instructions which, when run on a computer, causes a method as in the first aspect or any one of the possible designs of the first aspect to be performed.
In a fourth aspect, an object detection apparatus is provided that includes a processor and a memory; wherein the memory stores instructions executable by the processor to cause the apparatus to perform the method as described in the first aspect or any one of the possible designs of the first aspect by executing the instructions stored by the memory.
In a fifth aspect, there is provided a chip, coupled to a memory, for reading and executing program instructions stored in the memory to implement a method as in the first aspect or any one of the possible designs of the first aspect.
A sixth aspect provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method as set forth in the first aspect or any one of the possible designs of the first aspect.
In a seventh aspect, a vehicle is provided that includes an object detection device, a millimeter wave radar, and a camera; the object detection apparatus is adapted to implement the method as in the first aspect or any one of the possible designs of the first aspect described above by controlling the millimeter wave radar and the camera.
Specifically, the target detection device may detect the speed of another vehicle in front of the current vehicle through the millimeter wave radar, acquire the road conditions where the current vehicle and the other vehicle are located through the camera, and determine the first threshold according to the road conditions; if the speed of the other vehicle exceeds the first threshold, it determines that the millimeter wave radar is in a target loss state or a state in which the target is about to be lost.
For the beneficial effects of the second to seventh aspects, reference is made to the beneficial effects of the corresponding designs in the first aspect, and details are not repeated here.
Drawings
FIG. 1 is a diagram illustrating a relationship between a detection beam angle and a detection distance of a millimeter-wave radar;
FIG. 2 is a schematic diagram of the detection range of a camera and the detection range of a millimeter wave radar;
FIG. 3A is a possible application scenario to which the embodiments of the present application are applicable;
FIG. 3B is another possible application scenario in which the embodiments of the present application are applicable;
FIG. 3C is another possible application scenario in which the embodiments of the present application are applicable;
FIG. 3D is another possible application scenario in which the embodiments of the present application are applicable;
FIG. 4 is a diagram of one possible vehicle architecture;
FIG. 5 is a flowchart of a target detection method provided in an embodiment of the present application;
FIG. 6 is a schematic view of a first vehicle in a straight road and a second vehicle in a curve;
FIG. 7 is a schematic view of a first vehicle in a curve and a second vehicle in a straight lane;
FIG. 8 is a schematic view of a first vehicle in a curve and a second vehicle in a curve;
FIG. 9 is a schematic diagram of a three-dimensional coordinate system of a millimeter wave radar;
FIG. 10 is a schematic diagram of adjusting the attitude of a millimeter wave radar;
FIG. 11 is a schematic diagram of a millimeter wave radar coordinate system and a camera coordinate system;
fig. 12 is a flowchart of a target tracking method according to an embodiment of the present application;
FIG. 13 is a schematic diagram of an ROI corresponding to a millimeter wave radar detection result and an ROI corresponding to a camera detection result;
fig. 14 is a schematic diagram of an intersection set of millimeter wave radar detection results and camera detection results;
fig. 15 and 16 are schematic diagrams of an uphill scene;
FIG. 17 is a diagram of a system architecture provided in an embodiment of the present application;
fig. 18 is a schematic structural diagram of an object detection apparatus 180 according to an embodiment of the present disclosure;
fig. 19 is a schematic structural diagram of another object detection device 190 according to an embodiment of the present application.
Detailed Description
When a vehicle detects and tracks a target based on a camera and a millimeter wave radar, the camera can basically cover nearby target obstacles, while distant target obstacles must rely on the millimeter wave radar.
The detection beam angle and the installation position of the millimeter wave radar are fixed, so its detection range (generally a sector) is limited. Fig. 1 is a schematic diagram of the relationship between the detection beam angle and the detection distance of a millimeter wave radar. As shown in fig. 1, the detection beam angle varies with the detection distance, and the detection angle of the millimeter wave radar in the far field of view is smaller than in the near field of view. Therefore, under curved road conditions, a target obstacle (such as a leading vehicle) detected and tracked by a following vehicle is likely to move beyond the detection area of the following vehicle's millimeter wave radar, as shown in fig. 2.
The detection range of the camera is generally a sector, and the fields of view (FOV) of commonly used cameras are 60°, 90°, 150°, and so on. The detection distance of a camera is inversely related to its field of view: for example, a camera with a 90° FOV has a maximum detection distance of 80 meters, while a camera with a 150° FOV has a detection distance far shorter than 80 meters. As shown in fig. 2, the detection range of the camera (i.e., the camera detection area) is wider than that of the radar, but its detection distance is shorter.
In a vehicle equipped with a millimeter wave radar and a camera sensor, information on the exact position, speed, and the like of a target obstacle comes from detection data of the millimeter wave radar. Therefore, when the target exceeds the detection range of the millimeter wave radar, the rear vehicle cannot acquire the accurate position, speed and other information of the target obstacle, so that the target obstacle is considered to be lost by the rear vehicle, and the illusion that no target obstacle exists is caused.
In reality, however, the target obstacle still exists in the objective world. Because the millimeter wave radar is the most reliable guarantee for forward and lateral collision early warning, the loss of a target obstacle under curved road conditions means that the automatic driving system cannot obtain an accurate safe distance, which in turn seriously affects the Advanced Driver Assistance System (ADAS) and automatic driving tracking, creating great potential safety hazards.
In view of this, embodiments of the present application provide a target detection method and apparatus, which take road detection blind areas into account: special road conditions are quickly identified from the positioning information and visual information of the vehicle, and kinematic physical quantities (such as the target's angular velocity or linear velocity) are used to determine whether a sensor such as the millimeter wave radar is in a target-lost state or a target-about-to-be-lost state, so as to avoid the illusion that the vehicle has no target obstacle. Further, when a sensor such as the millimeter wave radar is in a target-lost or target-about-to-be-lost state, the attitude of the sensor is adjusted so that it can promptly re-detect the target, ensuring the safety and stability of the vehicle while driving. Further, when the attitude of a sensor such as the millimeter wave radar is adjusted, it is compensated and corrected in two degrees of freedom (for example, left-right (α) and up-down (β)) separately; because the two degrees of freedom are independent of each other, the accuracy and efficiency of the adjustment can be improved.
The technical solution in the embodiments of the present application will be described in more detail below with reference to the drawings in the embodiments of the present application.
It should be understood that in the description of the present application, "a plurality" means two or more, and "at least one" means one or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A exists alone, A and B both exist, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it. The terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or order.
Fig. 3A is a possible application scenario applicable to the embodiment of the present application, where the scenario may be automatic driving, driving assistance, manual driving, and the like. The scene includes at least two vehicles, and fig. 3A takes two vehicles as an example. Wherein, the front of the rear vehicle (i.e. the self vehicle) is provided with a radar sensor, such as a millimeter wave radar sensor, and is also provided with a camera, and the rear vehicle can detect and track the front vehicle through the millimeter wave radar and the camera. In the scenario shown in fig. 3A, both the front and rear vehicles are in a curve.
Fig. 3B is another possible application scenario to which the embodiment of the present application is applied, where the scenario may be automatic driving, driving assistance, manual driving, and the like. Unlike fig. 3A, in the scenario shown in fig. 3B, the leading vehicle is in a curve and the trailing vehicle (i.e., the own vehicle) is in a straight lane.
Fig. 3C is another possible application scenario to which the embodiment of the present application is applied, where the scenario may be automatic driving, driving assistance, manual driving, and the like. Unlike fig. 3A and 3B, in the scenario shown in fig. 3C, the rear vehicle (i.e., the own vehicle) is in a curve, and the front vehicle is in a straight lane.
It should be noted that fig. 3A, fig. 3B, and fig. 3C each show two vehicles (that is, the own vehicle has only one front vehicle). It should be understood that this does not limit the number of vehicles in the application scenario, and an actual scenario may include more vehicles. For example, as shown in fig. 3D, the own vehicle is in a curve, one front vehicle is in the curve, and another front vehicle is in a straight lane. When there are multiple front vehicles, the detection principle for each front vehicle is the same, so the embodiments of the present application mainly take one front vehicle as an example.
In addition, fig. 3A, 3B, 3C, and 3D are examples of road conditions of a curve, and the embodiment of the present application may also be applied to other road conditions that may cause a vehicle detection blind area, such as ascending/descending, acceleration/deceleration, and the like, which is not limited in the present application.
The object detection method provided in the embodiment of the present application may be specifically applied to a vehicle in the foregoing scenario, and specifically, the method may be carried in a computing device of the vehicle through software, where the computing device is, for example, a single vehicle-mounted device (such as a vehicle-mounted computer, a driving control device, and the like), or one or more processing chips, or an integrated circuit, and the present application is not limited thereto.
Referring to fig. 4, a possible vehicle architecture diagram includes: computing devices, control devices, and in-vehicle sensors, among others. Various components of the vehicle are described below.
And the vehicle-mounted sensor (sensor for short) is used for acquiring various sensor data of the vehicle in real time. The sensor mounted on the vehicle in the embodiment of the present application includes, for example: cameras (or cameras), positioning systems, radar sensors, attitude sensors, etc. Other sensors, such as a shaft speed sensor, a wheel speed sensor, etc., may be included, and the present application is not limited thereto.
Among them, the radar sensor may be simply referred to as a radar. The radar can measure radar trace data of the surrounding environment, and can also measure information of obstacles of the vehicle in a set direction, such as the position and the speed (including linear speed and angular speed) of a front vehicle.
The radar in the embodiments of the present application is mainly exemplified by a millimeter wave radar. A millimeter wave radar is a radar that performs detection in the millimeter wave band, usually the 30 to 300 GHz frequency band (wavelength 1 to 10 mm). The wavelength of millimeter waves lies between that of centimeter waves and light waves, so millimeter wave radar combines the advantages of microwave guidance and photoelectric guidance. Mounted on an automobile, a millimeter wave radar can measure the distance, angle, relative speed, and the like between itself and a measured object. Millimeter wave radar can be used to implement Advanced Driver Assistance System (ADAS) functions such as adaptive cruise control, Forward Collision Warning, Blind Spot Detection, Parking Aid, lane change assistance, and autonomous cruise control.
In the embodiment of the application, the millimeter wave radar can be in linkage connection with the follow-up mechanism, and when the follow-up mechanism rotates, the posture of the millimeter wave radar changes accordingly. The follow-up mechanism can be a component of a millimeter wave radar, and can also be a millimeter wave radar matching device, which is not limited here.
An attitude sensor: a high-performance three-dimensional motion attitude measurement system based on Micro-Electro-Mechanical Systems (MEMS) technology. In the embodiments of the present application, the attitude sensor is mounted on the radar and used to detect the attitude of the radar.
The cameras may be deployed around the vehicle and collect environmental parameters around the vehicle. For example, at least one camera may be mounted on a front bumper, a side view mirror, a windshield, and a roof of the vehicle, respectively. In the embodiment of the present application, the camera may at least acquire an image in front of the vehicle, so that the computing device may determine the type of the target obstacle (e.g., a leading vehicle) in the driving direction of the vehicle according to the image, and may also determine information such as the distance between the vehicle and the target obstacle, the position of the target obstacle, and the road condition.
It is emphasized that the vehicle-mounted sensors in the embodiments of the present application are exemplified by a combination of a camera and a millimeter wave radar. However, the present application does not limit the specific type of sensor; for example, the sensors of the vehicle may be a combination of a solid-state lidar and a camera, a combination of a millimeter wave radar and a solid-state lidar, a combination of two cameras, a combination of a millimeter wave radar, a solid-state lidar, and a camera, or the like. The embodiments of the present application are applicable to any sensor that has a large detection range, a short detection distance, and an inaccurate detection result (e.g., position and speed of an obstacle), used in combination with a sensor that has a small detection range, a long detection distance, and an accurate detection result.
The positioning system is used for positioning the current position of the vehicle. For example, the positioning system may be the Global Positioning System (GPS), the BeiDou system, or another positioning system that receives satellite signals and positions the current position of the vehicle. The positioning system may also be visual positioning, radar positioning, fusion positioning, etc.; the present application is not limited in this respect, and GPS is mainly taken as an example hereinafter.
The computing device is responsible for computing functions, and is used for collecting parameter information of the surrounding environment of the vehicle in real time in the running process of the vehicle according to various sensors (such as a camera, a radar, a positioning system and the like) installed on the vehicle, calculating and analyzing the collected parameter information, and determining a vehicle control instruction according to an analysis result.
Various computing functions may be provided in the computing device, for example: sensing the surrounding environment during driving according to the parameter information collected by the camera and the radar; determining the geographic position of the vehicle according to the parameter information collected by the positioning system; and calculating and analyzing the parameter information collected by sensors such as the camera, the radar, and the positioning system to determine the state of the vehicle, such as whether the radar is in a target-lost state or a target-about-to-be-lost state. The computing device also generates control instructions according to the parameter information collected by the sensors and sends them to the control device, so that the control device controls the corresponding sensor. For example, when the radar is in a target-lost or target-about-to-be-lost state, an instruction to rotate the follow-up mechanism of the radar may be generated and sent to the control device; the control device then rotates the follow-up mechanism, adjusting the attitude of the radar so that the radar can reacquire the target.
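The cycle just described — collect sensor data, determine the radar state, and issue a rotation instruction to the control device — can be sketched as follows. This is a minimal illustration only: `sensors`, `detector`, `controller` and all their method names are hypothetical interfaces invented here, not part of this application.

```python
def detection_cycle(sensors, detector, controller):
    """One cycle of the computing device's loop: gather sensor data,
    decide whether the radar is losing its target, and if so command
    the follow-up mechanism to adjust the radar attitude.

    All three collaborators are hypothetical interfaces for illustration.
    """
    frame = sensors.read()                         # camera + radar + GPS data
    state = detector.classify(frame)               # e.g. "tracking" / "target_lost"
    if state in ("target_lost", "target_about_to_be_lost"):
        alpha, beta = detector.correction(frame)   # left-right / up-down angles
        controller.rotate_follow_up(alpha, beta)   # control device drives mechanism
    return state
```

The two correction angles are passed separately, mirroring the two independent degrees of freedom (α, β) described earlier.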
The embodiment of the present application does not specifically limit the type of the computing device. As one example, the computing device may be a Mobile Data Center (MDC). The MDC is a local computing platform of the automatic driving vehicle, automatic driving software running on the MDC comprises a camera perception algorithm, a millimeter wave radar perception algorithm, a target fusion tracking algorithm (namely an algorithm corresponding to a target tracking method based on radar information and camera information fusion) and the like. The MDC also performs some simple operations such as motor control, etc.
The embodiment of the present application does not specifically limit the type of the control device. As an example, the control device may be a micro-control unit (MCU). Optionally, the MCU may also be called a Single Chip Microcomputer (SCM) or a single chip microcomputer.
It should be understood that the MCU may be a chip-level computing device that appropriately reduces the frequency and specification of a Central Processing Unit (CPU) and integrates, on a single chip, at least one peripheral such as a memory, a counter (timer), a Universal Serial Bus (USB) interface, an analog-to-digital (AD) converter, a Universal Asynchronous Receiver Transmitter (UART), a Programmable Logic Controller (PLC), a Direct Memory Access (DMA) controller, or even a Liquid Crystal Display (LCD) driving circuit, so as to perform different combinations of control for different application occasions.
After receiving the vehicle control instruction sent by the computing device, the control device can control the vehicle-mounted sensor (such as controlling the attitude of the radar) through the vehicle control interface, so that auxiliary control of the vehicle can be realized.
Those skilled in the art will appreciate that the structure of the vehicle-mounted device shown in fig. 4 does not constitute a limitation of the vehicle-mounted device, and the vehicle-mounted device provided in the embodiments of the present application may include more or less modules than those shown, or may combine some modules, or may be arranged in different components, and the present application is not limited thereto. For example, the vehicle-mounted device may further include a braking mechanism (such as a brake, a throttle, a gear, and the like), a human-computer interaction input/output component (such as a display screen, and the like), a wireless communication module, a communication interface, and the like.
As shown in fig. 5, a flowchart of an object detection method is provided for an embodiment of the present application, and the method may be applied to the vehicle shown in fig. 4, and includes:
s501, the first vehicle acquires road conditions of the first vehicle (the own vehicle) and the second vehicle (the front vehicle) and acquires the speed of the second vehicle.
Specifically, the first vehicle is provided with sensors such as a camera, a millimeter wave radar and a positioning system, and the computing equipment acquires sensor data acquired by the sensors in real time. The camera can acquire image information in front of the first vehicle in real time and perform detection and tracking on the second vehicle (for example, detect the type and the position of the second vehicle), the positioning system can perform positioning on the position of the first vehicle in real time, and the millimeter wave radar can perform detection and tracking on a target obstacle (here, the second vehicle is taken as an example) (for example, detect the position and the speed of the second vehicle). The specific structure of the first vehicle may refer to the structure shown in fig. 4, and will not be described in detail here.
The computing device of the first vehicle may determine a road condition on which the first vehicle is located based on the positioning system. Specifically, the computing device of the first vehicle may obtain the position of the first vehicle on the map based on the positioning system, so that the road condition of the first vehicle, such as a curve, a straight road, an uphill slope, a downhill slope, etc., may be determined based on the map.
The computing device of the first vehicle may also determine the distance traveled by the first vehicle based on the positioning system. For example, referring to fig. 6, fig. 7, or fig. 8: at time t0, the positioning system positions the first vehicle at point A; at time t1, it positions the first vehicle at point A'. The computing device can then calculate, from the coordinates of the two points, the distance |AA'| from A to A' (the displacement of the first vehicle from time t0 to time t1).
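As a minimal illustration of this step, the following sketch computes the displacement between two consecutive positioning fixes, assuming the positioning system reports latitude/longitude. It uses an equirectangular approximation, which is adequate over the very short intervals (tens of milliseconds) between radar frames; the function name and interface are illustrative.

```python
import math

def displacement_m(lat0, lon0, lat1, lon1):
    """Planar displacement |AA'| in meters between two GPS fixes
    (degrees), via an equirectangular approximation of the Earth."""
    R_EARTH = 6371000.0  # mean Earth radius, meters
    dlat = math.radians(lat1 - lat0)
    # scale longitude difference by cos(mean latitude)
    dlon = math.radians(lon1 - lon0) * math.cos(math.radians((lat0 + lat1) / 2))
    return R_EARTH * math.hypot(dlat, dlon)
```

For example, two fixes 0.001° of latitude apart yield roughly 111 meters, and identical fixes yield zero displacement.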
The computing device of the first vehicle may also obtain, via the camera, a road condition on which the second vehicle is located. Specifically, the computing device of the first vehicle controls the camera to capture an image in front of the first vehicle, where the image includes the second vehicle and information of a road where the second vehicle is located, such as a curvature radius of a lane line or other road signs, and then determines a road condition where the second vehicle is located, such as a curve or a straight road, or an ascending slope, or a descending slope, based on the image.
The computing device of the first vehicle may also determine information such as the position and speed of the second vehicle from the data detected by the camera and the millimeter wave radar. Specific implementation may refer to a target tracking method based on the fusion of millimeter wave radar information and camera information, which is described later and is not described in detail here.
S502, if the speed of the second vehicle meets the preset condition corresponding to the road condition, the first vehicle determines that the millimeter wave radar of the first vehicle is in a target loss state or a target is about to be lost state.
The speed of the second vehicle at least comprises an angular speed or a linear speed, and the preset condition comprises whether the speed of the second vehicle is greater than or equal to a set first threshold value.
Wherein the set first threshold value should be less than or equal to the speed at which the second vehicle travels on the boundary of the millimeter wave radar detection range (i.e., the speed threshold value between when the second vehicle is located within the millimeter wave radar detection range and outside the millimeter wave radar detection range).
For example, if the first threshold is equal to the speed of the second vehicle when it travels on the boundary of the millimeter wave radar detection range, then when the speed of the second vehicle exceeds the first threshold, the second vehicle may leave the detection range and enter the detection blind area of the millimeter wave radar, and the millimeter wave radar is in the target-lost state. When the speed of the second vehicle is equal to the first threshold (or is below the first threshold by less than a preset difference), the millimeter wave radar is in the target-about-to-be-lost state.
For another example, if the first threshold is smaller by ΔX than the speed of the second vehicle when it travels on the boundary of the millimeter wave radar detection range, then when the speed of the second vehicle exceeds the first threshold by no more than ΔX, the millimeter wave radar is in the target-about-to-be-lost state; when the speed of the second vehicle exceeds the first threshold by more than ΔX, the millimeter wave radar is in the target-lost state.
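The threshold logic of this second example can be sketched as a small helper (the function and the state names are illustrative, and the speed may be either the angular or the linear speed, per the text):

```python
def radar_target_state(speed, first_threshold, delta_x):
    """Classify the millimeter wave radar tracking state from the second
    vehicle's speed: exceeding the first threshold by more than delta_x
    means the target is lost; exceeding it by at most delta_x means the
    target is about to be lost; otherwise tracking continues normally."""
    if speed > first_threshold + delta_x:
        return "target_lost"
    if speed >= first_threshold:  # exceeds threshold by <= delta_x
        return "target_about_to_be_lost"
    return "tracking"
```

A hysteresis band could be added around the threshold in practice; the sketch keeps the bare two-level rule from the text.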
In addition, in order to improve the safety of the vehicle, the state of the millimeter wave radar may be set to the target lost state when the millimeter wave radar is actually in the target soon-to-be-lost state (that is, the target lost state includes both the actual loss and the target soon-to-be-lost).
It should be understood that, in specific implementation, other words may also be used to describe that the millimeter wave radar is in a target-losing state or a target-losing-imminent state, for example, the millimeter wave radar is in a frame-losing state (which may be understood as that the millimeter wave radar does not detect a data frame containing a target), and this application does not limit this.
In the embodiment of the present application, the preset conditions may be different for different road conditions, for example, the first threshold is different. When the speed of the second vehicle meets the preset condition corresponding to the current road condition, the computing device of the first vehicle determines that the millimeter wave radar of the first vehicle is in a target loss state or a state that the target is about to be lost.
It should be noted that, during traveling, if the attitude of the millimeter wave radar is not adjusted, the detection range of the millimeter wave radar is fixed with respect to the first vehicle. However, the first threshold is different when the first vehicle and the second vehicle are located at different positions and/or the relative positions of the first vehicle and the second vehicle are different.
Several specific examples are illustrated below.
Example 1: referring to fig. 6, when the first vehicle is in a straight road and the second vehicle is in a curve, the preset condition includes:

ωr' ≥ N = K · ωr
the parameters are explained as follows:
the time t0 is the time when the millimeter wave radar collects the first frame data, and the time t1 is the time when the millimeter wave radar collects the second frame data, where the first frame data and the second frame data may be two consecutive frames, and may also be two frames separated by a small number of frames in a specific implementation (for example, the first frame data and the second frame data are separated by 1 frame or 2 frames). In the embodiment of the present application, two consecutive frames of data are mainly taken as an example.
At time t0, the first vehicle is at the a position, the second vehicle is at the B position, and the B position is the boundary position of the millimeter wave radar detection range when the first vehicle is at the a position. At time t1, the first vehicle is at the a 'position, the second vehicle is at the B' position, and the B 'position is the boundary position of the millimeter wave radar detection range when the first vehicle is at the a' position. In the embodiment of the present application, the boundary position may be one region or range, and for example, positions within ± L regions from the boundary line of the radar detection range may each be defined as the boundary position of the millimeter wave radar detection range.
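Whether a detected target lies within the ±L region around the boundary line of the radar detection range can be checked from the line-of-sight angle and the half beam angle α. The sketch below is one possible implementation under stated assumptions: planar coordinates, and an angular offset close enough to α that the perpendicular distance to the nearer FOV edge is approximately dist · sin(|off| − α). All names are illustrative.

```python
import math

def near_fov_boundary(ego_xy, heading_rad, target_xy, alpha_rad, margin_l):
    """True if the target lies within +/- margin_l meters of the radar
    FOV boundary line (half beam angle alpha), in planar coordinates."""
    dx = target_xy[0] - ego_xy[0]
    dy = target_xy[1] - ego_xy[1]
    dist = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # signed angle between line of sight and heading, wrapped to [-pi, pi]
    off = (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
    # approximate perpendicular distance from the nearer FOV edge
    edge_offset = dist * math.sin(abs(off) - alpha_rad)
    return abs(edge_offset) <= margin_l
```

A target sitting exactly on the 30° edge of a 60° beam is flagged as on the boundary, while a target straight ahead at the same distance is not.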
ωr' is the instantaneous angular velocity of the second vehicle in circular motion around the curve at time t2; N is the first threshold;

ωr is the angular velocity of the second vehicle in circular motion around the curve from time t0 to time t1 (this is the average angular velocity; when the interval from t0 to t1 is very short, for example the time to collect one frame of data, it can also be regarded as the instantaneous angular velocity), and t2 ≥ t1 > t0;

|AA'| is the displacement of the first vehicle from time t0 to time t1;

|BB'| is the displacement of the second vehicle from time t0 to time t1; it should be noted that the actual path of the second vehicle from B to B' is a curve, but since the interval from time t0 to time t1 is very short, for example 30 to 50 ms, the path may be approximated as a straight line, that is, the distance traveled by the second vehicle from time t0 to time t1 is approximately equal to the straight-line distance from point B to point B';

|AB| is the Euclidean distance between the first vehicle and the second vehicle at time t0;

ε is the value of the central angle formed by A, B, and the curve center O at time t0;

α is half of the detection beam angle of the millimeter wave radar and is determined by the characteristics of the millimeter wave radar;

wherein the coordinates of A and A' may be obtained from the positioning system, and the position coordinates of B and B' may be obtained by the target tracking method based on the fusion of millimeter wave radar information and camera information, described in further detail later. From the position coordinates of A, A' and B, B' output by the target tracking method, |AA'|, |BB'|, |AB|, ε, ωr, and the like can be calculated. ωr' can be detected by the millimeter wave radar, or calculated from the linear velocity vr' of the second vehicle detected by the millimeter wave radar.
K is a first preset coefficient with a value range of (0,1). Optionally, the value of K may be related to the execution time of the follow-up mechanism of the first vehicle for adjusting the millimeter wave radar and/or the sharpness of the curve: the longer the execution time, the smaller K; the shorter the execution time, the larger K; the sharper the curve, the smaller K; the gentler the curve, the larger K.
Illustratively, the value of K satisfies the following formula:
K(R, t) = e^(−t / (a·R));
wherein t is execution time, R is curve radius, and a is undetermined coefficient.
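A direct transcription of this formula, with the undetermined coefficient a left as a parameter (the default value here is an arbitrary placeholder, not taken from the filing):

```python
import math

def k_coefficient(curve_radius_m, execution_time_s, a=1.0):
    """K(R, t) = exp(-t / (a * R)).

    K approaches 1 for short execution times on gentle curves and
    shrinks toward 0 as the execution time grows or the curve tightens,
    matching the trends described in the text."""
    return math.exp(-execution_time_s / (a * curve_radius_m))
```

The same expression is reused for the coefficient P in example 2, which the filing gives with an identical formula.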
The derivation process of ωr is introduced below:

During automatic driving, the first vehicle is in a straight road and the target vehicle is in a curve. At time t0, the first vehicle is at position A and the target vehicle is at position B, the boundary position of the millimeter wave radar detection range when the first vehicle is at A. At time t1, the first vehicle is at A' and the target vehicle is at B', the boundary position of the millimeter wave radar detection range when the first vehicle is at A'. O represents the center of the curve (the center of curvature); OB and OB' are perpendicular to the tangent directions at B and B' (and since the interval from t0 to t1 is extremely short, OB and OB' can both be considered perpendicular to BB').
Referring to fig. 6, according to the geometric relationship, the angle γ can be computed from the position coordinates of A, A', B, and B'. Due to the road conditions, BB' can be regarded as approximately perpendicular to OB, and the following can be obtained: ε ≈ γ.

If the first vehicle and the second vehicle are in the same lane, A can be regarded as lying on the same circle as B, so the chord relation |AB| = 2R·sin(ε/2) holds, and the turning radius of the second vehicle on the curve is:

R = |AB| / (2·sin(ε/2));

According to the calculation formula for the circular motion of an automobile, with the speed of the second vehicle approximated as vr = |BB'| / (t1 − t0), the following can be obtained:

ωr = vr / R = 2·|BB'|·sin(ε/2) / ((t1 − t0)·|AB|).
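Assuming the turning radius follows the chord relation R = |AB| / (2·sin(ε/2)) and the boundary speed is vr = |BB'| / (t1 − t0) — one plausible reading of the derivation above, not a verbatim transcription of the filing's equations — the example 1 check can be sketched as:

```python
import math

def omega_r_example1(d_ab, d_bb, eps_rad, t0, t1):
    """Average angular velocity of the lead vehicle on the curve
    (ego on a straight road): omega = v / R with v = |BB'|/(t1-t0)
    and R = |AB| / (2 sin(eps/2))."""
    radius = d_ab / (2.0 * math.sin(eps_rad / 2.0))
    v = d_bb / (t1 - t0)
    return v / radius

def losing_target_example1(omega_now, d_ab, d_bb, eps_rad, t0, t1, k):
    """Preset-condition sketch: flag loss when the instantaneous angular
    velocity reaches K times the boundary angular velocity."""
    return omega_now >= k * omega_r_example1(d_ab, d_bb, eps_rad, t0, t1)
```

With a 100 m curve radius (|AB| ≈ 19.97 m for ε = 0.2 rad) and a 2 m displacement over a 50 ms frame, the boundary angular velocity comes out near 0.4 rad/s.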
example 2, referring to fig. 7, when the first vehicle is in a curve and the second vehicle is in a straight road, the preset conditions include:
Figure BDA0002912066300000123
the parameters are explained as follows:
the time t0 is the time when the millimeter wave radar collects the first frame data, and the time t1 is the time when the millimeter wave radar collects the second frame data, where the first frame data and the second frame data may be two consecutive frames, and may also be two frames separated by a small number of frames (for example, the first frame data and the second frame data are separated by 1 frame or 2 frames) in specific implementation. In the embodiment of the present application, two consecutive frames of data are mainly taken as an example.
At time t0, the first vehicle is at the a position, the second vehicle is at the B position, and the B position is the boundary position of the detection range of the millimeter wave radar when the first vehicle is at the a position. At time t1, the first vehicle is at the a 'position, the second vehicle is at the B' position, and the B 'position is the boundary position of the detection range of the millimeter wave radar when the first vehicle is at the a' position. As with example 1, the boundary position may be a region or range.
|BB'| is the displacement of the second vehicle from time t0 to time t1;

vr' is the instantaneous running speed of the second vehicle on the straight road at time t2 (running speed here refers to linear speed), and t2 ≥ t1 > t0; N is the first threshold;

vr is the running speed of the second vehicle on the straight road from time t0 to time t1 (running speed here refers to linear speed; vr is the average linear speed, and when the interval from t0 to t1 is very short, for example the time to collect one frame of data, it can also be regarded as the instantaneous linear speed);

wherein the coordinates of A and A' may be obtained from the positioning system, and the position coordinates of B and B' may be obtained by the target tracking method based on the fusion of millimeter wave radar information and camera information, described in further detail later. From the position coordinates of B and B', |BB'| can be obtained; vr can be calculated geometrically from the position coordinates of A, A' and B, B' output by the target tracking method. vr' may be detected by the millimeter wave radar.
P is a second preset coefficient with a value range of (0,1). The value of P is related to the execution time of the follow-up mechanism of the first vehicle for adjusting the millimeter wave radar and/or the sharpness of the curve: the longer the execution time, the smaller P; the shorter the execution time, the larger P; the sharper the curve, the smaller P; the gentler the curve, the larger P.
Illustratively, the value of P satisfies the following formula:
P(R, t) = e^(−t / (a·R));
wherein t is execution time, R is curve radius, and a is undetermined coefficient.
The derivation process of vr is introduced below:

Referring to fig. 7, the turning radius R of the first vehicle and the angle θ = ωr(t1 − t0) it sweeps from t0 to t1 (where ωr here is the angular velocity of the first vehicle on its curve) can be obtained from the displacement |AA'| of the first vehicle. With β = α (half of the detection beam angle of the radar), the triangle formed by A, A', and B' can be solved using the trigonometric cosine theorem and sine theorem, yielding the intermediate angles ξ and γ. Since ε = ξ + θ − γ, the displacement |BB'| of the second vehicle along the straight road can then be expressed in terms of the measured positions. The speed at which the second vehicle travels straight is therefore:

vr = |BB'| / (t1 − t0).
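Once vr is available, the example 2 comparison itself is simple; the sketch below assumes the preset condition compares the instantaneous speed vr' against P times the boundary speed vr = |BB'| / (t1 − t0) (function names are illustrative):

```python
def losing_target_example2(v_now, d_bb, t0, t1, p):
    """Example 2 sketch (ego in curve, lead vehicle on a straight road):
    flag loss when the lead vehicle's instantaneous linear speed v_now
    reaches P times the boundary speed derived over the last frame."""
    v_r = d_bb / (t1 - t0)  # average speed, approximated as instantaneous
    return v_now >= p * v_r
```

For a 1 m displacement over a 50 ms frame (vr = 20 m/s) and P = 0.9, a lead vehicle at 25 m/s trips the condition while one at 10 m/s does not.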
example 3, referring to fig. 8, if the first vehicle is in a curve and the second vehicle is in a curve, the preset conditions include:
Figure BDA0002912066300000138
the parameters are explained as follows:
the time t0 is the time when the millimeter wave radar collects the first frame data, and the time t1 is the time when the millimeter wave radar collects the second frame data, where the first frame data and the second frame data may be two consecutive frames, and may also be two frames separated by a small number of frames in a specific implementation (for example, the first frame data and the second frame data are separated by 1 frame or 2 frames). In the embodiment of the present application, two consecutive frames of data are mainly taken as an example.
At time t0, the first vehicle is at position A, the second vehicle is at position B, and position B is the boundary of the detection range of the millimeter wave radar when the first vehicle is at position A. At time t1, the first vehicle is at position A', the second vehicle is at position B', and position B' is the boundary of the detection range of the millimeter wave radar when the first vehicle is at position A'.
ω_r' is the instantaneous angular velocity of the second vehicle making circular motion around the curve at time t2;
ω_r is the angular velocity of the second vehicle making circular motion around the curve from time t0 to time t1 (an average angular velocity, which can also be regarded as an instantaneous angular velocity when the interval between t0 and t1 is extremely short, for example, the time for acquiring one frame of data), where t2 ≥ t1 > t0; N is a first threshold;
v is the travel speed (instantaneous linear velocity) of the first vehicle, which may be obtained by the chassis information of the first vehicle or the first vehicle positioning system;
[symbol rendered as an image in the original] is the Euclidean distance between the first vehicle and the second vehicle at time t0;
alpha is half of the detection beam angle of the millimeter wave radar and is determined by the characteristics of the millimeter wave radar;
The coordinates of A and A' may be obtained by the positioning system, and the position coordinates of B and B' may be obtained by a target tracking method based on the fusion of millimeter wave radar information and camera information, which will be described in further detail later. From the position coordinates of A, A' and B, B', one can obtain:
[equation rendered as an image in the original]
ω_r' may be detected by the millimeter wave radar, or calculated from the linear velocity v_r' of the second vehicle.
Q is a third preset coefficient with a value range of (0, 1); Q may be the same as or different from P and K. Optionally, the value of Q is related to the execution time of the servo mechanism with which the first vehicle adjusts the millimeter wave radar and/or to the sharpness of the curve.
Illustratively, the value of Q satisfies the following equation:
Q(R, t) = e^(-t/(aR));
wherein t is execution time, R is curve radius, and a is undetermined coefficient.
The derivation process of ω_r is as follows:
When the first vehicle and the target vehicle are both driving on a curve, as shown in fig. 8, OC is perpendicular to AB and intersects AE at point O;
assuming that both vehicles are in uniform circular motion,
[equation rendered as an image in the original]
and ∠COA = ∠CAD = α;
the turning radius of the second vehicle is therefore:
[equation rendered as an image in the original]
Therefore:
[equation rendered as an image in the original]
It should be noted that the above is only an example and is not limiting; in practical applications, the preset condition may also be designed using the same idea for other road scenes.
It should be understood that the above takes the target being the second vehicle as an example; in practical applications, other targets, for example pedestrians, animals, etc., may also be detected by using the technical solution of the embodiment of the present application, and the present application is not limited thereto.
Through the above, the first vehicle in the embodiment of the present application can rapidly recognize a special road condition from the positioning information and visual information it acquires and, by monitoring kinematic physical quantities of the second vehicle (such as angular velocity/linear velocity) under the special road condition, can rapidly judge whether sensors such as the millimeter wave radar of the vehicle are in a target loss state or a target-about-to-be-lost state (this can be recognized from as few as two consecutive frames of data acquired by the millimeter wave radar), so that the vehicle avoids the illusion that no target obstacle exists, and the accuracy of target detection is improved.
The following describes a scheme for adjusting the posture of the millimeter wave radar by the first vehicle after the millimeter wave radar is in a target loss state or a target to be lost state (or the millimeter wave radar is in a frame loss state).
For ease of understanding, the attitude sensor is briefly described here: an attitude sensor is realized by one type, or a combination, of three types of sensors: acceleration sensors (accelerometers), angular velocity sensors (gyroscopes), and magnetic induction sensors (magnetometers). Attitude sensors include three-axis (or three-dimensional), six-axis (or six-dimensional), and nine-axis (or nine-dimensional) attitude sensors. A three-axis attitude sensor is implemented with one type of sensor (such as a three-axis accelerometer, a three-axis gyroscope, or a three-axis magnetometer); a six-axis attitude sensor is generally implemented with two types of sensors (e.g., a three-axis accelerometer plus a three-axis gyroscope); a nine-axis attitude sensor generally comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer, and may also be composed of a six-axis acceleration sensor and a three-axis gyroscope, or a six-axis gyroscope and a three-axis accelerometer.
The attitude sensor in the embodiment of the present application may be a six-dimensional attitude sensor or a nine-dimensional attitude sensor, and the present application is not limited thereto. The attitude sensor can detect the attitude of the millimeter wave radar in real time.
Referring to fig. 9, in the embodiment of the present application, a three-dimensional coordinate system of the millimeter wave radar is established with the center of the millimeter wave radar as the origin when the vehicle runs on a straight road, where the X_R axis is parallel to the ground (the right direction of the millimeter wave radar is positive, the left direction negative), the Z_R axis is perpendicular to the ground (the upward direction positive, the downward direction negative), and the Y_R axis is perpendicular to the plane of the X_R and Z_R axes (the forward direction of the millimeter wave radar positive, the backward direction negative). The attitude of the millimeter wave radar can then be represented by the following three parameters:
1) Yaw angle (Yaw), representing rotation of the millimeter wave radar around the Z_R axis; Yaw is denoted α;
2) Pitch angle (Pitch), representing rotation of the millimeter wave radar around the X_R axis; Pitch is denoted β;
3) Roll angle (Roll), representing rotation of the millimeter wave radar around the Y_R axis; Roll is denoted γ.
Since the embodiment of the present application mainly considers target detection by the first vehicle of a preceding vehicle, the attitude sensor mainly acquires α and β of the millimeter wave radar. When the vehicle runs on a straight road, the attitude sensor detects that both attitude parameters α and β of the millimeter wave radar are 0; when the vehicle runs on a curve and/or a ramp, the attitude sensor detects that α and/or β is greater than or less than 0.
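The effect of the yaw (α) and pitch (β) parameters on the radar boresight can be illustrated with a minimal sketch; the axis convention follows fig. 9 (X_R right, Y_R forward, Z_R up), while the rotation order and all values are assumptions for illustration only:

```python
import math

def yaw_pitch_direction(alpha_deg: float, beta_deg: float):
    """Radar boresight after yawing alpha around the Z_R axis, then pitching
    beta around the X_R axis. The unrotated boresight is the +Y_R unit vector
    (0, 1, 0); the rotation order is an assumption for illustration."""
    a, b = math.radians(alpha_deg), math.radians(beta_deg)
    # Rz(a) applied to (0, 1, 0) gives (-sin a, cos a, 0);
    # Rx(b) then maps (x, y, z) to (x, y*cos b - z*sin b, y*sin b + z*cos b).
    x = -math.sin(a)
    y = math.cos(a) * math.cos(b)
    z = math.cos(a) * math.sin(b)
    return (x, y, z)

# Straight road: alpha = beta = 0, so the boresight points straight ahead.
assert yaw_pitch_direction(0.0, 0.0) == (0.0, 1.0, 0.0)
# A positive pitch beta tilts the boresight upward (positive Z_R component).
assert yaw_pitch_direction(0.0, 10.0)[2] > 0
```

This matches the text's observation: on a straight road both angles are zero, while a curve or ramp makes one of them nonzero.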
In the embodiment of the application, when the millimeter wave radar is in a target loss state or a state that a target is about to be lost, the degree of freedom closely related to the characteristics of the road is comprehensively considered, and the attitude of the millimeter wave radar is compensated and corrected.
Taking a curve road condition as an example, the attitude of the millimeter wave radar can be adjusted through the left-right rotation degree of freedom:
1) Left-right rotation degree of freedom. As shown in fig. 9, the left-right rotation degree of freedom is the radar's freedom to rotate around the Z_R axis, corresponding to the rotation angle α. The millimeter wave radar detecting the target again serves as the standard of closed-loop control of the angle adjustment. Since it is difficult to establish a precise functional relation between the input approximate angle and the output angle, a fuzzy-PID adaptive control method is preferably used.
Taking a ramp road condition as an example, the attitude of the millimeter wave radar can be adjusted through the up-down swing degree of freedom:
2) Up-down swing degree of freedom. As shown in fig. 9, the up-down swing degree of freedom is the radar's freedom to rotate around the X_R axis, corresponding to the rotation angle β; the millimeter wave radar detecting the target again serves as the standard of closed-loop control of the angle adjustment.
It should be understood that the degrees of freedom in the two directions (up-down and left-right) are independent (or decoupled) from each other both in the mechanical structure and in the control logic, so the computing device can adjust only one degree of freedom; of course, according to the actual road condition, if the road feature is a combined road section, the computing device can also adjust the attitude of the millimeter wave radar through the left-right rotation degree of freedom and the up-down swing degree of freedom at the same time.
Referring to fig. 10, a schematic diagram of the computing device of the first vehicle adjusting the attitude of the millimeter wave radar (fig. 10 takes an MDC as an example). The target detection method described in the embodiment shown in fig. 5 runs on the computing device and may be used to identify whether the millimeter wave radar is in a target loss state or a target-about-to-be-lost state; the computing device also runs a target tracking method based on the fusion of millimeter wave radar information and camera information (this may be a prior-art method, or the target tracking method described in the related embodiment shown in fig. 12 below, which is not limited here). The follow-up mechanism of the millimeter wave radar comprises a Z_R-axis rotating motor and an X_R-axis rotating motor, where the Z_R-axis rotating motor rotates under voltage control to drive the millimeter wave radar to rotate around the Z_R axis, and the X_R-axis rotating motor rotates under voltage control to drive the millimeter wave radar to rotate around the X_R axis.
The process of controlling and adjusting the attitude of the millimeter wave radar by the computing device comprises the following steps:
1) The computing device detects, based on the target detection method, whether the millimeter wave radar is in a target loss state or a target-about-to-be-lost state (i.e., whether frames are lost);
2) When the millimeter wave radar is in a target loss state or a target-about-to-be-lost state, the computing device determines a first angle adjustment value of the millimeter wave radar according to the current attitude of the millimeter wave radar (for example, the millimeter wave radar needs to rotate around the Z_R axis by an angle Δα and around the X_R axis by an angle Δβ); the computing device sends this value to the control device through an instruction (fig. 10 takes an MCU as an example); the control device stores a mapping relation between angle adjustment values and voltage adjustment values and, after receiving the instruction, converts the first angle adjustment value into a first voltage V1 to be applied to the Z_R-axis rotating motor and a second voltage V2 to be applied to the X_R-axis rotating motor; the control device sets the input voltage of the Z_R-axis rotating motor to V1 and the input voltage of the X_R-axis rotating motor to V2, so that the Z_R-axis rotating motor rotates and drives the millimeter wave radar around the Z_R axis by the angle Δα, and the X_R-axis rotating motor rotates and drives the millimeter wave radar around the X_R axis by the angle Δβ. Alternatively:
The computing device stores the mapping relation between angle adjustment values and voltage adjustment values. After the millimeter wave radar is in a target loss state or a target-about-to-be-lost state, the computing device determines the first angle adjustment value of the millimeter wave radar according to its current attitude and, according to the mapping relation, converts the first angle adjustment value into a first voltage V1 to be applied to the Z_R-axis rotating motor and a second voltage V2 to be applied to the X_R-axis rotating motor; the computing device sends V1 and V2 to the control device through an instruction, and after receiving the instruction the control device sets the input voltage of the Z_R-axis rotating motor to V1 and the input voltage of the X_R-axis rotating motor to V2, so that the Z_R-axis rotating motor rotates and drives the millimeter wave radar around the Z_R axis by the angle Δα, and the X_R-axis rotating motor rotates and drives the millimeter wave radar around the X_R axis by the angle Δβ.
3) Each time the computing device performs one attitude adjustment on the millimeter wave radar (for example, rotating the millimeter wave radar around the Z_R axis by Δα and around the X_R axis by Δβ), the computing device detects the attitude of the millimeter wave radar again and judges whether the left-right rotation degree of freedom and the up-down swing degree of freedom of the millimeter wave radar meet the standard of angle closed-loop control. If so, the computing device determines that the attitude adjustment is complete; if not, the computing device continues to adjust the attitude of the millimeter wave radar, looping until both the left-right rotation degree of freedom and the up-down swing degree of freedom of the millimeter wave radar meet the corresponding angle closed-loop control standard.
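The adjustment procedure above can be sketched as follows. The linear angle-to-voltage mapping and the simple proportional loop (standing in for the fuzzy-PID controller the text prefers) are illustrative assumptions, not the stored mapping or controller of the actual device:

```python
def angle_to_voltage(delta_deg: float, volts_per_deg: float = 0.2) -> float:
    """Hypothetical linear stand-in for the stored mapping relation between
    an angle adjustment value and a motor voltage."""
    return volts_per_deg * delta_deg

def adjust_until_standard_met(current_alpha: float, target_alpha: float,
                              step_gain: float = 0.5, tolerance: float = 0.1,
                              max_iterations: int = 50):
    """Simple proportional closed loop: keep commanding the Z_R-axis motor
    until the remaining yaw error meets the closed-loop standard (tolerance).
    Returns the final yaw angle and the sequence of commanded voltages."""
    voltages = []
    for _ in range(max_iterations):
        error = target_alpha - current_alpha
        if abs(error) <= tolerance:          # angle closed-loop standard met
            break
        delta = step_gain * error            # first angle adjustment value
        voltages.append(angle_to_voltage(delta))  # V1 sent to the Z_R motor
        current_alpha += delta               # motor rotates by delta
    return current_alpha, voltages

final_alpha, v1_history = adjust_until_standard_met(0.0, 8.0)
assert abs(final_alpha - 8.0) <= 0.1
```

Each loop iteration corresponds to one attitude adjustment followed by re-detection of the radar attitude, as in step 3).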
After the posture adjustment of the millimeter wave radar is completed, the left-right rotation freedom degree and the up-down swing freedom degree of the millimeter wave radar meet the corresponding angle closed-loop control standard, and the millimeter wave radar can accurately detect the target again.
Optionally, each time the computing device performs an attitude adjustment on the millimeter wave radar (e.g., controls the radar to rotate around the Z_R axis and/or around the X_R axis), the computing device updates the calibration matrix of the millimeter wave radar in real time according to the adjusted angle value, so as to ensure the reliability of information fusion between the millimeter wave radar and the camera during the attitude adjustment of the millimeter wave radar.
Fig. 11 is a schematic diagram of the millimeter wave radar coordinate system and the camera coordinate system. The millimeter wave radar coordinate system describes the relative position of an object and the millimeter wave radar, denoted (X_R, Y_R, Z_R); the camera coordinate system describes the relative position of the object and the camera, denoted (X_C, Y_C, Z_C). The camera and the millimeter wave radar are installed at different positions on the vehicle: the position data of each feature point collected by the camera are its coordinates in the camera coordinate system, while the position data collected by the millimeter wave radar are coordinates in the millimeter wave radar coordinate system, so the same object has different coordinate parameters in the two systems. A calibration matrix is therefore needed to map the data collected by the millimeter wave radar and by the camera into the same coordinate system, to facilitate the data operations of the computing device.
For example, the computing device may convert data collected by the millimeter wave radar into a camera coordinate system, and the calibration matrix may include a rotation matrix R, a translation matrix T, and the like, as needed for conversion between the millimeter wave radar coordinate system and the camera coordinate system.
For example, the computing device may convert both millimeter wave radar acquired data and camera acquired data into the world coordinate system (or image coordinate system or other coordinate system), and the calibration matrix may include the rotation matrix R, translation matrix T, etc. required for conversion between the millimeter wave radar coordinate system and the world coordinate system (or image coordinate system or other coordinate system).
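Applying such a calibration matrix (rotation R and translation T) to map a radar-frame point into the camera frame can be sketched as p_c = R·p_r + T; the numeric rotation and translation below are hypothetical mounting values, not calibration data from the text:

```python
def radar_to_camera(point_radar, rotation, translation):
    """Map a 3-D point from the millimeter wave radar coordinate system into
    the camera coordinate system: p_c = R * p_r + T. Plain-list 3x3 matrix
    and length-3 vectors keep the sketch dependency-free."""
    return tuple(
        sum(rotation[i][j] * point_radar[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Identity rotation plus a hypothetical radar-to-camera mounting offset.
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
T = [0.5, -0.5, 0.0]
assert radar_to_camera((2.0, 10.0, 0.0), R, T) == (2.5, 9.5, 0.0)
```

When the radar attitude is adjusted, updating R (and, if the mounting shifts, T) in real time is exactly the calibration-matrix update the text describes.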
According to the above, combining the road condition characteristics, the attitude of the millimeter wave radar can be compensated and corrected through its left-right (α) and up-down (β) degrees of freedom when the target is lost or about to be lost. The two degrees of freedom can be adjusted independently without mutual dependence, which improves the accuracy and efficiency of the attitude adjustment of the millimeter wave radar and thus the accuracy and efficiency of target detection and tracking. In addition, when the angle of the millimeter wave radar is adjusted, the calibration matrix of the millimeter wave radar can be compensated and corrected in real time, achieving a rapid and accurate response of the information fusion of the millimeter wave radar and the camera and ensuring the reliability of that fusion during the attitude adjustment.
It should be noted that, the process of adjusting the posture of the millimeter wave radar by the first vehicle when the millimeter wave radar is in a target loss state or a target is about to be lost state (or the millimeter wave radar is in a frame loss state) is described above by taking a road condition of a curve as an example, in practical application, the same idea may be adopted to adjust the posture of the millimeter wave radar for other road scenes (such as ascending/descending, acceleration/deceleration, and the like), and the present application is not limited.
As shown in fig. 12, an embodiment of the present application further provides a target tracking method based on the fusion of millimeter wave radar information and camera information, where the target tracking method may be applied to the vehicle shown in fig. 4, and includes:
s1201, the first vehicle acquires a Red Green Blue (RGB) image acquired by the camera, and roughly fits the lane lines in the far field of view included in the RGB image.
Specifically, the computing device of the first vehicle acquires an RGB image captured by the camera, analyzes the RGB image, and performs a rough fit on a lane line in a far field of view included in the RGB image. Or the processing chip in the camera analyzes the RGB image, roughly fits the lane lines in the far field of view included in the RGB image, and then transmits the fitting result to the computing equipment.
Rough fitting means projecting certain pixel points of the lane line onto the bird's-eye view and extracting several equally spaced points to judge the inflection point and direction of the curve, without spending a large amount of computing power and time fitting complex equations such as a cubic curve, and without post-processing multiple lane lines for smoothness.
Because the detection range of the radar in the near field of view is larger than that in the far field of view, a target (such as the second vehicle) is not easily lost in the near field of view; roughly fitting only the lane line of the far field of view reduces the amount of calculation while ensuring accuracy, improving calculation efficiency.
Optionally, the far field of view is the region more than a preset distance (e.g., 50 m, 60 m, or 100 m) away from the vehicle. In a specific implementation, the preset distance may be determined according to the detection beam angle and detection distance characteristics of the millimeter wave radar.
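The rough fitting of the far-field lane line can be sketched as follows; the sampling scheme, the cross-product direction test, and all thresholds are illustrative assumptions standing in for the judgment of inflection point and direction described above:

```python
def rough_fit_far_lane(bev_points, far_threshold=50.0, num_samples=5):
    """Rough fit: keep only far-field lane points (beyond the preset
    distance), take a few equally spaced samples, and judge the bend
    direction from the turn sign instead of fitting a cubic curve.
    bev_points: (x, y) lane-line points on the bird's-eye view, with
    y = distance ahead in metres and x positive to the right."""
    far = sorted((p for p in bev_points if p[1] >= far_threshold),
                 key=lambda p: p[1])
    if len(far) < 3:
        return far, "unknown"
    step = max(1, len(far) // num_samples)
    samples = far[::step][:num_samples]
    # Cross product of successive segments: > 0 bends left, < 0 bends right.
    (x0, y0) = samples[0]
    (x1, y1) = samples[len(samples) // 2]
    (x2, y2) = samples[-1]
    cross = (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    direction = "left" if cross > 0 else "right" if cross < 0 else "straight"
    return samples, direction

# A lane drifting toward +x (right) beyond 50 m; the near-field point is ignored.
lane = [(0.0, 10.0), (0.0, 50.0), (0.1, 60.0), (0.4, 70.0),
        (0.9, 80.0), (1.6, 90.0)]
samples, direction = rough_fit_far_lane(lane)
assert direction == "right"
```

Only a handful of points are touched per frame, which is the computational saving the rough fit is meant to deliver.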
And S1202, the first vehicle simultaneously identifies the target based on the millimeter wave radar and the camera.
On one hand, the computing equipment of the first vehicle performs target recognition on the RGB image acquired by the camera to obtain a first recognition result; alternatively, the processing chip in the camera performs object recognition on the RGB image, and after obtaining the first recognition result, transmits the first recognition result to the computing device (i.e., S1202.1, camera object recognition).
On the other hand, the computing equipment of the first vehicle performs target recognition on radar trace data acquired by the millimeter wave radar to obtain a second recognition result; or, the processing chip in the millimeter wave radar performs target identification on the radar trace point data, and after a second identification result is obtained, the second identification result is transmitted to the computing device (i.e., S1202.2, millimeter wave radar target identification).
In the embodiment of the present application, for camera target recognition, the computing device stores a trained target recognition model corresponding to camera target recognition. After obtaining the RGB image acquired by the camera, the computing device inputs the RGB image into this model to obtain the target recognition result corresponding to the camera, that is, the first recognition result. After the computing device obtains the radar trace data collected by the millimeter wave radar, it can process the radar trace data with a neural network model or with traditional clustering and tracking algorithms to obtain the target recognition result corresponding to the millimeter wave radar, that is, the second recognition result.
As shown in fig. 12, let the first recognition result be (X_C, Y_C, C)^T, where X_C and Y_C denote position data of the target and C denotes the kind of the target. Let the second recognition result be (X_R, Y_R, v, w)^T, where X_R and Y_R represent position data of the target, v represents the linear velocity of the target, and w represents the angular velocity of the target. It should be noted that here the position data of the millimeter wave radar recognition and of the camera recognition are taken to be position data in a two-dimensional plane, that is, the first recognition result includes only data in the X_C and Y_C directions, and the second recognition result includes only data in the X_R and Y_R directions. In particular implementations, the position data may include more parameters, which is not limited here.
Optionally, in the embodiment of the present application, for the camera recognition in S1202.1, different target recognition models may be designed for different recognition scenarios. For example, in the case of millimeter wave radar frame loss, the computing device performs target recognition on the data collected by the camera using a lightweight recognition model, which focuses on recognition speed (i.e., small algorithm processing delay); for example, the lightweight recognition model may use You Only Look Once (YOLO) v3 (the version number, representing the third edition). In the case where the millimeter wave radar does not lose frames, a heavyweight recognition model is used for target recognition on the data collected by the camera; this model focuses on accuracy (i.e., high accuracy of the position, speed, and type of the target output by the algorithm), so that the computing device can control the follow-up mechanism to adjust the attitude of the millimeter wave radar as soon as possible based on the type and position of the target and detect the target again. For example, the heavyweight recognition model may use a Region-based Convolutional Neural Network (RCNN).
And S1203, the first vehicle carries out time alignment and target alignment on the first recognition result and the second recognition result.
The term "time alignment" refers to time synchronization of the first recognition result and the second recognition result. The term object alignment means that the first recognition result and the second recognition result are spatially synchronized, for example, by converting the position data into the same coordinate system as described in the above-mentioned embodiment related to fig. 11.
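The time-alignment step can be sketched as nearest-timestamp matching between the two sensor streams; the max_gap tolerance and all timestamps are illustrative assumptions:

```python
from bisect import bisect_left

def align_by_timestamp(radar_stamps, camera_stamps, max_gap=0.05):
    """Pair each radar frame with the camera frame closest in time,
    discarding pairs farther apart than max_gap seconds. A minimal sketch
    of time synchronization between the two recognition streams."""
    pairs = []
    cam = sorted(camera_stamps)
    for t in radar_stamps:
        i = bisect_left(cam, t)
        candidates = cam[max(0, i - 1):i + 1]  # neighbours around t
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(c - t))
        if abs(best - t) <= max_gap:
            pairs.append((t, best))
    return pairs

# The third radar frame has no camera frame within 50 ms, so it is dropped.
pairs = align_by_timestamp([0.00, 0.10, 0.20], [0.01, 0.12, 0.40])
assert pairs == [(0.00, 0.01), (0.10, 0.12)]
```

Target alignment (the spatial half of the step) is then the coordinate-system conversion discussed with fig. 11.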
And S1204, the computing equipment performs target fusion according to the IOU fusion rule.
In the embodiment of the present application, after the computing device of the first vehicle performs target recognition on the RGB image to obtain the first recognition result, a region of interest (ROI) (i.e., a target frame) may further be generated in the RGB image; and after target recognition is performed on the radar trace data to obtain the second recognition result, an ROI is generated in the radar trace data. In machine vision and image processing, an ROI is a region to be processed, outlined from the processed image with a box, circle, ellipse, irregular polygon, etc.
The ROI in the embodiment of the present application is, for example, a rectangle, and as shown in fig. 13, is a schematic diagram of an ROI (a rectangular region indicated by a) corresponding to a millimeter wave radar detection result and an ROI (a rectangular region indicated by B) corresponding to a camera detection result.
IoU is the ratio of the intersection to the union of the ROI output by the millimeter wave radar detection and the ROI output by the camera detection. As shown in fig. 14, a schematic diagram of the intersection of the millimeter wave radar detection result A and the camera detection result B, IoU is the ratio of the intersection to the union of the two rectangular areas, that is: IoU = (A ∩ B)/(A ∪ B).
The value of IoU lies in [0, 1]. The larger the IoU value, the higher the probability that the camera-detected target and the radar-detected target are the same target; conversely, the smaller the IoU value, the lower that probability.
In the embodiment of the present application, the IoU fusion rule may include: if IoU > M (or IoU ≥ M), the computing device fuses the first recognition result and the second recognition result and outputs a third recognition result; the fused data in fig. 12, i.e., the third recognition result, is represented as (X, Y, C, v, w)^T. If IoU ≤ M (or IoU < M), the computing device does not fuse the first recognition result and the second recognition result, and outputs them separately. Here, M is a set second threshold with a value range of (0, 1).
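For axis-aligned rectangular ROIs, the IoU computation and the fusion decision can be sketched as follows; the fixed threshold value used here is illustrative (the text derives M from curve curvature, speed, and distance):

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2):
    IoU = area(A intersect B) / area(A union B)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse_decision(radar_roi, camera_roi, m=0.5):
    """Fusion rule sketch: fuse the two recognition results only when
    IoU exceeds the second threshold M."""
    return "fused" if iou(radar_roi, camera_roi) > m else "separate"

a = (0.0, 0.0, 2.0, 2.0)
b = (1.0, 0.0, 3.0, 2.0)          # overlaps half of a; IoU = 2/6 = 1/3
assert abs(iou(a, b) - 1.0 / 3.0) < 1e-9
assert fuse_decision(a, b, m=0.3) == "fused"
assert fuse_decision(a, b, m=0.5) == "separate"
```

The same boxes thus fuse or stay separate depending on M, which is why the text makes M scene-dependent.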
Optionally, the second threshold M may be related to the curvature ρ of the curve on which the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, the distance L of the first vehicle (i.e., its distance to the second vehicle), and the like.
For example, the second threshold M and the curvature ρ, the driving speed V, and the driving distance L of the curve satisfy the following formula:
M = a²ρ + bV + L;
where a and b are coefficients, which can be set by a skilled person based on experiments or experience.
It should be understood that the above formula is merely exemplary and not limiting; in a specific implementation, the second threshold M may also be related to other factors, such as the acceleration of the first vehicle.
S1205.1: when the radar is not in a target loss state or a target-about-to-be-lost state, the first vehicle executes the millimeter wave radar no-target-loss tracking mechanism: the computing device of the first vehicle tracks the target based on consecutive multi-frame fused data;
for example, in fig. 12, when the radar is not in a target loss state or a target-about-to-be-lost state, the computing device performs target tracking based on consecutive frames of (X, Y, C, v, w)^T.
S1205.2: when the radar is in a target loss state or a target-about-to-be-lost state, the first vehicle executes the millimeter wave radar target-loss tracking mechanism: the computing device of the first vehicle performs target tracking based on the first recognition result obtained while the radar is in the target loss state or target-about-to-be-lost state, together with the last frame or frames of fused data from before the radar entered that state.
For example, in fig. 12, when the radar is in a target loss state or a target-about-to-be-lost state, the computing device performs target tracking based on the last frame of fused data before the target was lost (i.e., the original frame (X, Y, C, v, w)^T) and the first recognition results after the target was lost (i.e., consecutive frames of (X_C, Y_C, C)^T).
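A minimal sketch of this target-loss tracking mechanism: the last fused frame supplies position and velocity, and the subsequent camera-only recognition results correct the prediction. The constant-velocity model, the blending weight, and all numbers are illustrative assumptions, not the tracker the text specifies:

```python
def track_through_loss(last_fused, camera_frames, dt=0.05, blend=0.5):
    """Propagate the last fused state (x, y, vx, vy) forward with a
    constant-velocity model and blend each prediction with the camera-only
    position (X_C, Y_C) observed while the radar is losing the target."""
    x, y, vx, vy = last_fused
    track = []
    for cx, cy in camera_frames:
        # Predict with the last known radar velocity, then correct
        # toward the camera measurement.
        px, py = x + vx * dt, y + vy * dt
        x = blend * px + (1.0 - blend) * cx
        y = blend * py + (1.0 - blend) * cy
        track.append((x, y))
    return track

# Target 10 m ahead, receding at 20 m/s; two camera-only frames follow.
track = track_through_loss((0.0, 10.0, 0.0, 20.0), [(0.1, 11.0), (0.2, 12.1)])
assert len(track) == 2
assert track[-1][1] > 10.0   # the track keeps moving away as predicted
```

Once the radar attitude adjustment succeeds and fused frames resume, tracking falls back to the no-target-loss mechanism of S1205.1.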
In this way, first, only roughly fitting the lane line in the far field of view reduces the amount of calculation while ensuring accuracy, improving calculation efficiency. Second, the embodiment of the present application designs at least two target recognition models for camera target recognition: in the case of millimeter wave radar frame loss, a lightweight recognition model is used on the data collected by the camera to improve recognition speed, and in the case of no frame loss, a heavyweight recognition model is used to improve recognition precision, thereby balancing the speed and accuracy of fusion recognition. In addition, designing the second threshold M of IoU in combination with the curve scene can further improve recognition accuracy.
It should be noted that the solution of how to detect whether a target is lost and how to detect the target again has been described here taking a curve blind zone as an example; in practical applications, other similar blind-zone scenes are also applicable to the technical solution of the embodiment of the present application. For example, in a slope scene, the vehicle has a blind area due to the blocking of the slope, so the millimeter wave radar can be controlled through its up-down swing degree of freedom. Or, for example, an uneven road section or the start/stop moments of the automobile have a large influence on the pitch angle of the millimeter wave radar, so the millimeter wave radar can likewise be controlled through its up-down swing degree of freedom.
For example, see fig. 15 for a scenario where the vehicle is on an uphill slope. At the moment t0, the first vehicle and the second vehicle are both located on a flat road, and the second vehicle is located within the millimeter wave radar detection range of the first vehicle; at time t1, the second vehicle travels into the uphill road, and since the attitude of the millimeter wave radar at this time coincides with the attitude at time t0, the second vehicle exceeds (or is about to exceed) the detection range of the millimeter wave radar of the first vehicle due to the presence of the hill. Referring to fig. 16, after the second vehicle exceeds (or is about to exceed) the detection range of the millimeter wave radar of the first vehicle, the computing device of the first vehicle adjusts the attitude of the millimeter wave radar, so that the pitch angle of the millimeter wave radar is adjusted upward, and the millimeter wave radar of the first vehicle detects the second vehicle again.
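The uphill geometry above can be sketched numerically. The following is an illustrative estimate, not the patent's method: all function names and parameters are hypothetical, and the geometry assumes the slope begins at a known distance with a constant grade.

```python
import math

def required_pitch_up_deg(dist_to_slope_m, target_dist_m,
                          slope_grade, beam_half_angle_deg):
    """Estimate how far the radar must pitch up so a target on an uphill
    section re-enters the beam (illustrative geometry only)."""
    # Height the target has climbed above the radar's horizontal plane:
    # it has travelled (target_dist_m - dist_to_slope_m) along the grade.
    climb = max(0.0, target_dist_m - dist_to_slope_m) * slope_grade
    elevation_deg = math.degrees(math.atan2(climb, target_dist_m))
    # Pitch up only by the amount the target sits above the beam's upper edge.
    return max(0.0, elevation_deg - beam_half_angle_deg)
```

With a 10% grade starting 50 m ahead and a target at 100 m, a radar with a 2° half beam angle would need to pitch up by just under a degree; a target still on the flat section needs no adjustment.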
It should be further noted that the embodiment of the present application mainly takes the millimeter wave radar as an example to describe how target loss is detected and how the target is re-detected in special scenes such as curves, ramps, acceleration, and deceleration. In practical applications, however, the millimeter wave radar may be replaced by other sensors, for example a lidar. In particular, in mass-production autonomous driving solutions that adopt solid-state lidar, the lidar also has a detection range defined by a field of view, so with a limited number of sensors a certain detection blind area inevitably exists in the sensor layout; the same technical scheme can therefore also be applied to other similar sensors such as lidar. In other words, the technical solution of the embodiment of the present application is applicable whenever a sensor with a large detection range, a short detection distance, and a less accurate detection result (such as the position and speed of an obstacle) is used in combination with a sensor that has a small detection range, a long detection distance, an accurate detection result, and a detection blind area.
The above embodiments of the present application can be combined with each other to achieve different technical effects.
Based on the same technical concept, referring to fig. 17, a system architecture diagram provided in the embodiment of the present application is shown; the system can perform the scheme in the above method embodiment. The system comprises: a positioning device, a camera, a millimeter wave radar, a follow-up mechanism, and a computing device. The system is suitable for the vehicle shown in fig. 4 and is used for implementing the target detection method described in the embodiment of the present application.
The positioning device is used for acquiring positioning information of the vehicle.
A camera for acquiring visual information in front of the vehicle.
And the millimeter wave radar is used for acquiring the millimeter wave radar information in front of the vehicle. It should be understood that target detection in the present application is exemplified by the combination of a camera and a millimeter wave radar. However, the present application is not limited to specific sensor types: the embodiments of the present application are applicable to any combination of a sensor with a large detection range, a short detection distance, and a less accurate detection result with a sensor having a small detection range, a long detection distance, and an accurate detection result (e.g., the position and speed of an obstacle).
And the follow-up mechanism is arranged on the millimeter wave radar and is used for controlling the posture of the millimeter wave radar.
A computing device is a device with computing/processing capabilities. It may be, without limitation, a vehicle-mounted device, one or more processing chips, or an integrated circuit.
Referring to fig. 17, a computing device may include the following modules, divided by logical functions:
the road condition identification module is used for identifying the road conditions of the ego vehicle and the preceding vehicle according to the positioning information acquired by the positioning device and the visual information acquired by the camera, and for activating the corresponding threshold comparator according to the road condition (different road conditions correspond to different threshold comparators; fig. 12 illustrates threshold comparators A, B, and C, and the actual number may be more or fewer).
A threshold comparator for determining whether the velocity (including linear velocity or angular velocity) of the target obstacle exceeds a threshold of the threshold comparator; the specific implementation of each threshold comparator can be referred to the related description in the embodiments shown in fig. 6 to 8 above.
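The comparator activation described above can be sketched as a small dispatch table. This is a hypothetical illustration: the comparator names A/B/C come from fig. 12, but the road-condition labels and dictionary layout are invented here, and the thresholds N themselves come from the per-scene formulas in the embodiments.

```python
# Hypothetical sketch of the road-condition -> threshold-comparator dispatch.
COMPARATORS = {
    "ego_straight_target_curve": "A",  # compares angular velocity (Example 1)
    "ego_curve_target_straight": "B",  # compares linear velocity (Example 2)
    "both_in_curve":             "C",  # compares angular velocity (Example 3)
}

def target_lost_or_about_to_lose(road_condition, speed, thresholds):
    """True when the measured speed exceeds the first threshold N of the
    comparator activated for this road condition."""
    comparator = COMPARATORS[road_condition]
    return speed > thresholds[comparator]
```

Each comparator thus only answers a single yes/no question; which comparator runs is decided upstream by the road condition identification module.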
The image tag in fig. 12 is an RGB image that carries a target tag (e.g., a target bounding box) and is output by the camera when the threshold comparator determines "yes".
The millimeter wave radar and camera fused target detection module is configured to fuse a detection result of the millimeter wave radar and a detection result of the camera, and specific implementation may refer to related descriptions in the embodiment shown in fig. 12.
A target-lost tracking module, configured to execute the millimeter wave radar target-lost tracking mechanism, as specifically described in S1205.1 above.
A target-not-lost tracking module, configured to execute the millimeter wave radar target-not-lost tracking mechanism, as specifically described in S1205.2 above.
Wherein the final output result can have various forms. For example, data such as specific position, type, and velocity of the target (e.g., X, Y, C, v, w, etc.), or RGB images with a frame of the target, etc., may be output, without limitation.
Based on the same technical concept, the embodiment of the present application further provides an object detection apparatus 180, where the apparatus 180 is located inside the first vehicle, and may for example be a chip disposed inside the first vehicle. The apparatus 180 includes modules, units, or means for executing the steps in the methods shown in fig. 5, fig. 10, and fig. 12, and these functions, units, or means may be implemented by software, or by hardware executing corresponding software.
As shown in fig. 18, the apparatus 180 may include: a detection module 1801, configured to detect a speed of the second vehicle based on the millimeter wave radar; wherein the second vehicle is located forward of the first vehicle, and the speed comprises a linear speed and an angular speed; the processing module 1802 is configured to acquire road conditions where a first vehicle and a second vehicle are located; determining a first threshold value according to the road condition; and if the speed of the second vehicle exceeds the first threshold value, determining that the millimeter wave radar is in a target loss state or a target to be lost state.
In this embodiment, the apparatus 180 determines the first threshold according to the road condition and compares the speed of the second vehicle with the first threshold, so as to determine whether the millimeter wave radar is in the target-lost state or the target-about-to-be-lost state. Thus, under special road conditions (such as curves and ramps), the apparatus 180 can effectively and promptly identify target loss of the millimeter wave radar caused by the road condition, avoiding the false impression that there is no target obstacle in front of the first vehicle, and thereby improving the safety and stability of the vehicle during driving.
Optionally, when the processing module 1802 acquires the road conditions of the first vehicle and the second vehicle, the processing module is specifically configured to: the method comprises the steps of acquiring an RGB image in front of a first vehicle by using a camera, and determining the road condition of a second vehicle according to the RGB image. Therefore, the road condition in front can be efficiently and quickly identified, and the road condition where the second vehicle is located is determined.
Optionally, when determining the road condition of the second vehicle according to the RGB image, the processing module 1802 is specifically configured to: extract feature points from the lane lines in the far field of view included in the RGB image, and then determine the inflection point and direction of the road where the second vehicle is located according to the extracted feature points, so as to obtain the road condition of the second vehicle. Because the detection range of the millimeter wave radar in the near field of view is larger than that in the far field of view, a target in the near field of view is generally not easily lost; the processing module 1802 therefore only processes the lane lines in the far field of view, which reduces the amount of calculation while ensuring accuracy and improves calculation efficiency.
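A rough far-field fit of the kind described can be sketched with a simple polynomial. This is an illustrative stand-in, not the patent's algorithm: the quadratic model, the sign convention (y positive to the vehicle's left), and the function name are all assumptions.

```python
import numpy as np

def far_field_road_direction(xs, ys):
    """Roughly fit far-field lane-line feature points with a quadratic
    y = c2*x^2 + c1*x + c0; the sign of c2 indicates the bend direction
    and its magnitude hints at how sharp the upcoming curve is."""
    c2, c1, c0 = np.polyfit(xs, ys, deg=2)
    if abs(c2) < 1e-6:
        return "straight", c2
    return ("left" if c2 > 0 else "right"), c2
```

A deliberately coarse (degree-2) fit is enough to recover the bend direction and an inflection estimate, which is consistent with the point made above that the far field only needs a rough fit.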
Optionally, when acquiring the road conditions of the first vehicle and the second vehicle, the processing module 1802 may further acquire the position information of the first vehicle based on the positioning device, and then determine the road condition of the first vehicle according to the position information. In this embodiment, the positioning device may be GPS, the BeiDou system, or another positioning system, configured to receive satellite signals and locate the current position of the first vehicle. Positioning may also be visual positioning, millimeter wave radar positioning, fusion positioning, and the like, which is not limited in this application. After obtaining the position information, the processing module 1802 determines the road condition of the position where the first vehicle is located based on a map, such as a curve, a straight road, an uphill slope, or a downhill slope.
Thus, the device 180 can quickly and accurately obtain the road condition of the first vehicle.
It should be appreciated that the first threshold may be determined based on the critical speed at which the second vehicle transitions from being within the millimeter wave radar detection range to being outside it; for example, the first threshold is less than or equal to this critical speed. When the speed of the second vehicle is much greater than the critical speed, the second vehicle has exceeded the detection range of the millimeter wave radar and entered its detection blind area, so the millimeter wave radar is in the target-lost state; when the speed of the second vehicle is near the critical speed, the second vehicle may exceed the detection range of the millimeter wave radar at any moment, i.e., it is about to enter the detection blind area, so the millimeter wave radar is in the target-about-to-be-lost state. The design of the first threshold may differ for different road conditions in the embodiment of the present application.
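The two-level classification above can be sketched as a small dispatch. The state names and the use of strict inequalities are illustrative choices; the document only requires that the first threshold not exceed the critical speed.

```python
def radar_target_state(speed, first_threshold, critical_speed):
    """Classify the radar tracking state from the target's speed (sketch).
    Per the description above, first_threshold <= critical_speed."""
    if speed > critical_speed:
        return "target_lost"              # beyond the detection range
    if speed > first_threshold:
        return "target_about_to_be_lost"  # near the critical value
    return "target_tracked"
```

The gap between `first_threshold` and `critical_speed` is what gives the system early warning before the target actually leaves the beam.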
Three specific examples are given below for illustration.
Example 1: the road conditions of the first vehicle and the second vehicle are: the first vehicle is in a straight road and the second vehicle is in a curve. When detecting the speed of the second vehicle based on the millimeter wave radar, the detecting module 1801 is specifically configured to:

detect, based on the millimeter wave radar, the instantaneous angular speed of the second vehicle making circular motion around the curve; the processing module 1802 determines, according to the road condition, that the first threshold N satisfies:

[equation image in the original publication: expression for the first threshold N]

where ω_r is the angular velocity of the second vehicle making circular motion around the curve from time t0 to time t1; d_AB(t0) (rendered as an equation image in the original) is the Euclidean distance between the first vehicle and the second vehicle at time t0; Δs (rendered as an equation image in the original) is the displacement of the second vehicle from time t0 to time t1; α is half of the detection beam angle of the millimeter wave radar; time t0 is the time when the millimeter wave radar collects the first frame data, time t1 is the time when the millimeter wave radar collects the second frame data, and the first frame data and the second frame data are two consecutive frames of data; k is a coefficient greater than 0 and less than or equal to 1; ε is the angle of the central angle formed by the position A of the first vehicle, the position B of the second vehicle, and the center O of the curve at time t0.
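The central angle ε used in Example 1 is plain 2-D geometry and can be computed directly from the two vehicle positions and the curve center. The coordinates and function name below are illustrative.

```python
import math

def central_angle_deg(A, B, O):
    """Angle AOB at the curve center O formed by the first vehicle's
    position A and the second vehicle's position B (the epsilon of
    Example 1), computed from the dot product of the two radius vectors."""
    ax, ay = A[0] - O[0], A[1] - O[1]
    bx, by = B[0] - O[0], B[1] - O[1]
    cos_eps = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_eps))))
```

For instance, two vehicles at (1, 0) and (0, 1) around a curve centered at the origin subtend a 90° central angle.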
Example 2: the road conditions of the first vehicle and the second vehicle are: the first vehicle is in a curve and the second vehicle is in a straight lane. When detecting the speed of the second vehicle based on the millimeter wave radar, the detecting module 1801 is specifically configured to:

detect, based on the millimeter wave radar, the instantaneous running speed of the second vehicle on the straight road;

the processing module 1802 determines, according to the road condition, that the first threshold N satisfies:

[equation image in the original publication: expression for the first threshold N]

where Δs (rendered as an equation image in the original) is the displacement of the second vehicle from time t0 to time t1; v_r is the running speed of the second vehicle on the straight road from time t0 to time t1; time t0 is the time when the millimeter wave radar collects the first frame data, time t1 is the time when the millimeter wave radar collects the second frame data, and the first frame data and the second frame data are two consecutive frames of data; p is a coefficient greater than 0 and less than or equal to 1.
Example 3: the road conditions of the first vehicle and the second vehicle are: the first vehicle and the second vehicle are both in a curve. When detecting the speed of the second vehicle based on the millimeter wave radar, the detecting module 1801 is specifically configured to:

detect, based on the millimeter wave radar, the instantaneous angular speed of the second vehicle making circular motion around the curve;

the processing module 1802 determines, according to the road condition, that the first threshold N satisfies:

[equation image in the original publication: expression for the first threshold N]

where ω_r is the angular velocity of the second vehicle making circular motion around the curve from time t0 to time t1; v is the instantaneous travel speed of the first vehicle; α is half of the detection beam angle of the millimeter wave radar; time t0 is the time when the millimeter wave radar collects the first frame data, time t1 is the time when the millimeter wave radar collects the second frame data, and the first frame data and the second frame data are two consecutive frames of data; q is a coefficient greater than 0 and less than or equal to 1; d_AB(t0) (rendered as an equation image in the original) is the Euclidean distance between the first vehicle and the second vehicle at time t0.
It is to be understood that the above three examples are merely illustrative and not limiting. In practical applications, the road condition is not limited to curve scenes; the preset condition can be designed with the same idea for other road scenes (such as uphill/downhill, acceleration/deceleration, and the like).
Optionally, the processing module 1802 may be further configured to: after determining that the millimeter wave radar is in the target-lost state or the target-about-to-be-lost state, control the millimeter wave radar, according to the road condition, to rotate about the Z_R axis and/or about the X_R axis, so that the millimeter wave radar re-detects the second vehicle; where the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the direction of travel of the first vehicle.
It will be appreciated that the two directions described above (i.e., the direction of rotation about the Z_R axis and the direction of rotation about the X_R axis) are independent (or decoupled) from each other both in the mechanical structure and in the control logic, so that the processing module 1802 can control and adjust only one of the degrees of freedom, or both.
For example, when the first vehicle and/or the second vehicle are in a curve, the processing module 1802 may control the millimeter wave radar to rotate a first angle about the Z_R axis. For another example, when the first vehicle and/or the second vehicle is on a hill, the processing module 1802 may control the millimeter wave radar to rotate a second angle about the X_R axis. For another example, when the first vehicle and/or the second vehicle are on a combined curve-and-slope road segment, the processing module 1802 may control the millimeter wave radar to rotate a third angle about the Z_R axis and a fourth angle about the X_R axis. Because the rotations about the Z_R axis and about the X_R axis are independent of each other, the accuracy and efficiency of the attitude adjustment of the millimeter wave radar can be improved, which in turn improves the accuracy and efficiency of target detection and tracking.
The processing module 1802 may also, each time after controlling the millimeter wave radar to rotate about the Z_R axis and/or the X_R axis, update the calibration matrix of the millimeter wave radar in real time according to the rotated angle value. In this way, fast and accurate response of the information fusion of the millimeter wave radar and the camera can be achieved, and the reliability of that fusion is ensured during the attitude adjustment of the millimeter wave radar.
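One way the calibration update could look in practice is a rotation-matrix composition. This is a sketch under stated assumptions: the patent only says the calibration matrix is updated from the rotated angle; the composition order (yaw then pitch), the sign conventions, and the function name are choices made here for illustration.

```python
import numpy as np

def update_extrinsic(R_old, yaw_rad, pitch_rad):
    """Update a radar-to-camera rotation matrix after the radar rotates
    yaw_rad about its Z_R axis and pitch_rad about its X_R axis."""
    cz, sz = np.cos(yaw_rad), np.sin(yaw_rad)
    cx, sx = np.cos(pitch_rad), np.sin(pitch_rad)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    # The two rotations are mechanically decoupled, so they compose as
    # independent matrices applied to the old calibration.
    return R_old @ Rz @ Rx
```

Because the two axes are decoupled, each servo command maps to exactly one factor in the product, which keeps the real-time update cheap.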
Optionally, the processing module 1802 may be further configured to: detect and track the target based on the camera and the millimeter wave radar respectively, and fuse the targets according to the IoU fusion rule.
The specific target fusion process comprises the following steps:
After acquiring the RGB image in front of the first vehicle from the camera, the processing module 1802 performs target recognition on the RGB image based on a target recognition model to obtain a first recognition result, where the first recognition result includes the position and type of the second vehicle; the input of the target recognition model is an RGB image and its output is the position and type of the target. After acquiring the radar trace data in front of the first vehicle based on the millimeter wave radar, the processing module 1802 processes the radar trace data to obtain a second recognition result, where the second recognition result includes the position and speed of the second vehicle. If the processing module 1802 determines that the IoU of the area where the second vehicle is located in the RGB image and the area where the second vehicle is located in the radar trace data is greater than the second threshold M, it fuses the first recognition result and the second recognition result to obtain fused data; otherwise, it does not fuse them.
The second threshold M, the curvature ρ of the curve where the first vehicle and/or the second vehicle is located, the driving speed V of the first vehicle, and the driving distance L of the first vehicle satisfy the following relationship:

M = a²ρ + bV + L;

where a and b are preset coefficients.
It should be understood that the above formula is merely exemplary and not limiting; in specific implementations, the second threshold M may also be related to other factors, such as the acceleration of the first vehicle, without limitation.
Since the second threshold M is related to the road condition of the vehicle (the curvature ρ of the curve where the first vehicle and/or the second vehicle is located) and to the driving state of the first vehicle (i.e., the driving speed V and the driving distance L), the accuracy of fusion recognition can be improved, and thus the safety and stability of the vehicle during driving can be further improved.
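The IoU gating described above is straightforward to sketch. The IoU computation is standard; the box format (x1, y1, x2, y2) and the function names are illustrative, and the threshold M is taken as an input rather than recomputed, since its exact formula is road-dependent.

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def should_fuse(camera_box, radar_box, second_threshold_m):
    """Fuse the two recognition results only when their IoU exceeds the
    road-dependent second threshold M."""
    return iou(camera_box, radar_box) > second_threshold_m
```

Raising M on sharp curves makes fusion stricter exactly when camera and radar boxes are most likely to be misaligned, which matches the rationale given above.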
Optionally, the processing module 1802 may be further configured to: when the millimeter wave radar is in the target-lost or target-about-to-be-lost state, track the second vehicle based on the fused data from before the target was lost (or about to be lost) and the first recognition result obtained while the radar is in that state; or, when the millimeter wave radar is not in the target-lost or target-about-to-be-lost state, track the second vehicle based on consecutive multi-frame fused data. Because the apparatus 180 adopts different tracking mechanisms for different millimeter wave radar states, the accuracy of target tracking can be improved, further improving the safety and stability of the vehicle during driving.
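The state-dependent switch between the two tracking mechanisms can be sketched as follows. The constant-velocity prediction and the 0.5 blend weight are illustrative assumptions, not the patent's tracker; the patent only specifies which inputs each mechanism uses.

```python
def track_position(radar_lost, last_fused, camera_det, dt):
    """If the radar target is lost, predict from the last fused frame
    (x, y, vx, vy) before the loss and blend with the camera-only
    detection; otherwise just use the fused position."""
    if not radar_lost:
        return (last_fused[0], last_fused[1])
    px = last_fused[0] + last_fused[2] * dt   # constant-velocity prediction
    py = last_fused[1] + last_fused[3] * dt
    cx, cy = camera_det
    return (0.5 * (px + cx), 0.5 * (py + cy))
```

The key point is that the radar's last reliable velocity keeps contributing after the loss, while the camera-only result anchors the position until the radar re-acquires the target.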
Optionally, when performing target recognition on the RGB image based on the target recognition model, the processing module is specifically configured to: when the millimeter wave radar is in the target-lost or target-about-to-be-lost state, perform target recognition on the RGB image collected by the camera using a lightweight target recognition model; when the millimeter wave radar is not in the target-lost or target-about-to-be-lost state, perform target recognition on the RGB image collected by the camera using a heavyweight target recognition model. The recognition speed of the lightweight target recognition model is higher than that of the heavyweight target recognition model, while its recognition precision is lower.
Two different target recognition models are designed for camera target recognition and are switched according to the scene: when the millimeter wave radar loses frames, the lightweight recognition model is used for target recognition on the data collected by the camera, improving recognition speed; when the millimeter wave radar does not lose frames, the heavyweight recognition model is used, improving recognition precision. Both the speed and the accuracy of fusion recognition can thereby be achieved.
It should be understood that the above is only an example of switching between two models; more models can be designed and switched among in practical applications.
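The model-switching scheme reduces to selecting a callable by radar state. In this sketch the two model objects are hypothetical stand-ins for the lightweight and heavyweight recognizers; the class and method names are invented here.

```python
class DualModelRecognizer:
    """Switch between a lightweight and a heavyweight recognition model
    depending on whether the millimeter wave radar has lost frames."""
    def __init__(self, light_model, heavy_model):
        self.light = light_model   # fast, less precise
        self.heavy = heavy_model   # slow, more precise

    def recognize(self, rgb_image, radar_frame_lost):
        model = self.light if radar_frame_lost else self.heavy
        return model(rgb_image)
```

Extending this to more than two models would only change the selection expression, as the preceding paragraph notes.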
It should be noted that, for specific implementation manners of the method steps executed by each module in the apparatus 180, reference may be made to specific implementation manners when the first vehicle executes corresponding method steps in the foregoing method embodiments, and details are not described herein again.
Based on the same technical concept, referring to fig. 19, an embodiment of the present application further provides an object detection apparatus 190, which includes a processor 1901 and a memory 1902; the memory 1902 stores instructions executable by the processor 1901, and the processor 1901 causes the apparatus 190 to execute the methods shown in fig. 5, 10, and 12 by executing the instructions stored in the memory 1902. The processor 1901 and the memory 1902 may be coupled via an interface circuit, or may be integrated together, which is not limited herein.
The embodiment of the present application does not limit the specific connection medium between the processor 1901 and the memory 1902. In fig. 19, the processor 1901 and the memory 1902 are connected by a bus, represented by a thick line; the connection manner between other components is merely illustrative and not limiting. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 19, but this does not mean there is only one bus or one type of bus.
It should be understood that the processors mentioned in the embodiments of the present application may be implemented by hardware or may be implemented by software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
The Processor may be, for example, a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an off-the-shelf Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will be appreciated that the memory referred to in the embodiments of the application may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The non-volatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory. Volatile Memory can be Random Access Memory (RAM), which acts as external cache Memory. By way of example, and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data rate Synchronous Dynamic random access memory (DDR SDRAM), Enhanced Synchronous SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
It should be noted that when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (memory module) may be integrated into the processor.
It should be noted that the memory described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Based on the same technical concept, the embodiment of the present application also provides a computer-readable storage medium, which includes a program or instructions, and when the program or instructions are executed on a computer, the method as shown in fig. 5, 10, and 12 is executed.
Based on the same technical concept, the embodiment of the present application further provides a chip, which is coupled to the memory and configured to read and execute the program instructions stored in the memory to implement the methods shown in fig. 5, fig. 10, and fig. 12.
Based on the same technical concept, the embodiment of the present application further provides a computer program product containing instructions, where the instructions are stored in the computer program product, and when the computer program product runs on a computer, the computer program product causes the computer to execute the method shown in fig. 5, 10, and 12.
Based on the same technical concept, the embodiment of the application also provides a vehicle, which comprises a target detection device, a millimeter wave radar and a camera; the object detection apparatus is used to implement the methods shown in fig. 5, 10, 12 by controlling the millimeter wave radar and the camera. The structure of the vehicle may be as shown in fig. 11.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (16)

1. An object detection method, applied to a first vehicle, the method comprising:
detecting a speed of the second vehicle based on the millimeter wave radar; wherein the second vehicle is located forward of the first vehicle, the speeds including a linear speed and an angular speed;
acquiring road conditions of the first vehicle and the second vehicle, and determining a first threshold value according to the road conditions;
and if the speed of the second vehicle exceeds the first threshold value, determining that the millimeter wave radar is in a target loss state or a state that the target is about to be lost.
2. The method of claim 1, wherein obtaining the road condition on which the first vehicle and the second vehicle are located comprises:
acquiring an RGB image in front of the first vehicle based on a camera, and determining a road condition where the second vehicle is located according to the RGB image;
and acquiring the position information of the first vehicle based on a positioning device, and determining the road condition of the first vehicle according to the position information.
3. The method of claim 2, wherein determining the road condition of the second vehicle from the RGB image comprises:
extracting feature points from lane lines in the far field of view included in the RGB image, and determining an inflection point and a direction of the road where the second vehicle is located according to the extracted feature points.
4. The method according to any of claims 1-3, wherein the road condition of the first vehicle and the second vehicle is: the first vehicle is on a straight road and the second vehicle is on a curve;
detecting a speed of the second vehicle based on the millimeter wave radar, including:
detecting an instantaneous angular velocity of the second vehicle making a circular motion around the curve based on the millimeter wave radar;
the first threshold N determined according to the road condition satisfies the relation shown in the original publication as formula image FDA0002912066290000011, wherein:
ω_r is the angular velocity at which the second vehicle makes a circular motion around the curve from time t0 to time t1;
the Euclidean distance between the first vehicle and the second vehicle at time t0 is denoted by the symbol shown as formula image FDA0002912066290000012;
a is the displacement of the second vehicle from time t0 to time t1, denoted by the symbol shown as formula image FDA0002912066290000013;
α is half of the detection beam angle of the millimeter wave radar; time t0 is the time at which the millimeter wave radar collects first frame data, time t1 is the time at which it collects second frame data, and the first frame data and the second frame data are two consecutive frames of data; k is a coefficient greater than 0 and less than or equal to 1; ε is the central angle formed at time t0 between position A of the first vehicle, position B of the second vehicle, and the center O of the curve.
5. The method according to any of claims 1-3, wherein the road condition of the first vehicle and the second vehicle is: the first vehicle is on a curve and the second vehicle is on a straight road;
detecting a speed of the second vehicle based on the millimeter wave radar, including:
detecting the instantaneous travel speed of the second vehicle on the straight road based on the millimeter wave radar;
the first threshold N determined according to the road condition satisfies the relation shown in the original publication as formula image FDA0002912066290000014, wherein:
the displacement of the second vehicle from time t0 to time t1 is denoted by the symbol shown as formula image FDA0002912066290000015;
v_r is the travel speed of the second vehicle on the straight road from time t0 to time t1; time t0 is the time at which the millimeter wave radar collects first frame data, time t1 is the time at which it collects second frame data, and the first frame data and the second frame data are two consecutive frames of data; P is a coefficient greater than 0 and less than or equal to 1.
6. The method according to any of claims 1-3, wherein the road condition of the first vehicle and the second vehicle is: the first vehicle and the second vehicle are both on a curve;
detecting a speed of the second vehicle based on the millimeter wave radar, including:
detecting the instantaneous angular speed of the second vehicle making circular motion around the curve based on a millimeter wave radar;
the first threshold N determined according to the road condition satisfies the relation shown in the original publication as formula image FDA0002912066290000021, wherein:
ω_r is the angular velocity at which the second vehicle makes a circular motion around the curve from time t0 to time t1; v is the instantaneous travel speed of the first vehicle; α is half of the detection beam angle of the millimeter wave radar; time t0 is the time at which the millimeter wave radar collects first frame data, time t1 is the time at which it collects second frame data, and the first frame data and the second frame data are two consecutive frames of data; q is a coefficient greater than 0 and less than or equal to 1;
the Euclidean distance between the first vehicle and the second vehicle at time t0 is denoted by the symbol shown as formula image FDA0002912066290000022.
7. The method of any one of claims 1-6, further comprising, after determining that the millimeter wave radar is in a target loss state or a state in which the target is about to be lost:
controlling, according to the road condition, the millimeter wave radar to rotate about a Z_R axis and/or about an X_R axis so that the millimeter wave radar re-detects the second vehicle; wherein the Z_R axis is perpendicular to the horizontal plane, and the X_R axis is parallel to the horizontal plane and perpendicular to the direction of travel of the first vehicle.
8. The method of claim 7, wherein controlling the millimeter wave radar to rotate about the Z_R axis and/or the X_R axis according to the road condition comprises:
when the first vehicle and/or the second vehicle is on a curve, controlling the millimeter wave radar to rotate about the Z_R axis by a first angle; or,
when the first vehicle and/or the second vehicle is on a slope, controlling the millimeter wave radar to rotate about the X_R axis by a second angle; or,
when the first vehicle and/or the second vehicle is on a combined road section with both a curve and a slope, controlling the millimeter wave radar to rotate about the Z_R axis by a third angle and about the X_R axis by a fourth angle.
9. The method of claim 7 or 8, further comprising:
after each rotation of the millimeter wave radar about the Z_R axis and/or the X_R axis, updating the calibration matrix of the millimeter wave radar in real time according to the angle of rotation.
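Claim 9's calibration update (compose the current extrinsic calibration with the commanded Z_R and X_R rotations after every actuation) can be sketched with plain rotation matrices; the matrix layout and function names are assumptions, not the patent's implementation:

```python
import math

def rot_z(theta):
    """3x3 rotation by theta radians about the Z_R axis (vertical)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(theta):
    """3x3 rotation by theta radians about the X_R axis (lateral, horizontal)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(a, b):
    """Multiply two 3x3 matrices represented as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def update_calibration(calib, dz=0.0, dx=0.0):
    """Compose the current radar-to-vehicle calibration with the commanded
    rotations, as claim 9 requires after each actuation of the radar mount."""
    return matmul(matmul(rot_z(dz), rot_x(dx)), calib)

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
new_calib = update_calibration(identity, dz=math.radians(3.0))  # small Z_R correction
```

A production system would keep `calib` in the sensor-fusion stack so that radar point clouds acquired after the rotation are still expressed in the vehicle frame.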
10. The method of any one of claims 1-9, further comprising:
acquiring an RGB image in front of the first vehicle based on a camera, and performing target recognition on the RGB image based on a target recognition model to obtain a first recognition result, wherein the first recognition result comprises the position and the type of the second vehicle; the input of the target recognition model is an RGB image, and the output is the position and the type of a target;
acquiring radar trace data in front of the first vehicle based on the millimeter wave radar, and processing the radar trace data to obtain a second identification result; the second recognition result includes a position and a speed of the second vehicle;
when the intersection-over-union (IoU) of the area occupied by the second vehicle in the RGB image and the area occupied by the second vehicle in the radar trace data is greater than a second threshold M, fusing the first recognition result and the second recognition result to obtain fused data;
wherein the second threshold M, the curvature ρ of the curve on which the first vehicle and/or the second vehicle is located, the travel speed V of the first vehicle, and the travel distance L of the first vehicle satisfy the relation:
M = a²ρ + bV + L;
wherein a and b are preset coefficients.
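The gating test of claim 10, fusing camera and radar detections only when their IoU exceeds M = a²ρ + bV + L, can be sketched as below; the box format and the coefficient values in the usage example are assumptions, not values given by the patent:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def second_threshold(a, b, rho, v, l):
    """Claim 10's relation M = a^2 * rho + b * V + L, with preset a and b."""
    return a * a * rho + b * v + l

def should_fuse(camera_box, radar_box, a, b, rho, v, l):
    """Fuse the two recognition results only when IoU exceeds M."""
    return iou(camera_box, radar_box) > second_threshold(a, b, rho, v, l)

# Illustrative coefficients: a=0.1, b=0.01, curvature 1.0 1/m, V=10 m/s, L=0.2.
fuse = should_fuse((0, 0, 2, 2), (0.1, 0.1, 2.1, 2.1), 0.1, 0.01, 1.0, 10.0, 0.2)
```

Note that for the threshold to be comparable with an IoU in [0, 1], the preset coefficients and the distance term must be scaled accordingly; the claim leaves that scaling open.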
11. The method of claim 10, wherein the method further comprises:
when the millimeter wave radar is in a target loss state or a state in which the target is about to be lost, tracking the second vehicle based on the fusion data obtained before the millimeter wave radar entered that state and on the first recognition result obtained while it is in that state; or,
when the millimeter wave radar is not in a target loss state or a state in which the target is about to be lost, tracking the second vehicle based on consecutive multi-frame fusion data.
12. The method of claim 10, wherein performing target recognition on the RGB image based on a target recognition model comprises:
when the millimeter wave radar is in a target loss state or a state in which the target is about to be lost, performing target recognition on the RGB image acquired by the camera using a lightweight target recognition model;
when the millimeter wave radar is not in a target loss state or a state in which the target is about to be lost, performing target recognition on the RGB image acquired by the camera using a heavyweight target recognition model;
wherein the recognition speed of the lightweight target recognition model is greater than that of the heavyweight target recognition model, and the recognition accuracy of the lightweight target recognition model is less than that of the heavyweight target recognition model.
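Claim 12's model switching (trade accuracy for speed while the radar branch is degraded) reduces to a simple dispatch; the placeholder models below are hypothetical stand-ins, not real networks:

```python
def pick_recognition_model(target_lost_or_losing, lightweight_model, heavyweight_model):
    """Per claim 12: while the millimeter wave radar has lost (or is about to
    lose) the target, use the faster, less accurate lightweight model so the
    camera branch can run at a higher frame rate; otherwise use the slower,
    more accurate heavyweight model."""
    return lightweight_model if target_lost_or_losing else heavyweight_model

# Hypothetical stand-ins: each maps an RGB image to a (class, confidence) pair.
lightweight = lambda rgb: ("vehicle", 0.70)
heavyweight = lambda rgb: ("vehicle", 0.95)

model = pick_recognition_model(True, lightweight, heavyweight)  # radar degraded
```

In practice the two models would typically be two variants of the same detector (for example, a pruned or quantized network and its full-precision counterpart) sharing one label set.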
13. An object detection device, wherein the device is located in a first vehicle, the device comprising:
a detection module, configured to detect a speed of a second vehicle based on a millimeter wave radar; wherein the second vehicle is located forward of the first vehicle, and the speed includes a linear speed and an angular speed;
a processing module, configured to acquire the road condition of the first vehicle and the second vehicle, determine a first threshold according to the road condition, and, if the speed of the second vehicle exceeds the first threshold, determine that the millimeter wave radar is in a target loss state or a state in which the target is about to be lost.
14. An object detection device comprising a processor and a memory;
wherein the memory stores instructions executable by the processor, and the processor causes the device to perform the method of any one of claims 1-12 by executing the instructions stored in the memory.
15. A computer-readable storage medium comprising a program or instructions which, when run on a computer, causes the method of any one of claims 1-12 to be performed.
16. A vehicle characterized by comprising an object detection device, a millimeter wave radar, and a camera;
the object detection apparatus is configured to implement the method of any one of claims 1-12 by controlling the millimeter wave radar and the camera.
CN202110089080.2A 2021-01-22 2021-01-22 Target detection method and device Pending CN114779229A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110089080.2A CN114779229A (en) 2021-01-22 2021-01-22 Target detection method and device
PCT/CN2021/124194 WO2022156276A1 (en) 2021-01-22 2021-10-15 Target detection method and apparatus


Publications (1)

Publication Number Publication Date
CN114779229A true CN114779229A (en) 2022-07-22

Family

ID=82407697


Country Status (2)

Country Link
CN (1) CN114779229A (en)
WO (1) WO2022156276A1 (en)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115620337A (en) * 2022-10-11 2023-01-17 深圳市谷奇创新科技有限公司 Optical fiber sensor monitoring method and system for vital signs
CN116453346B (en) * 2023-06-20 2023-09-19 山东高速信息集团有限公司 Vehicle-road cooperation method, device and medium based on radar fusion layout
CN116500621B (en) * 2023-06-27 2023-08-29 长沙莫之比智能科技有限公司 Radar blind area early warning method based on double-subframe obstacle recognition
CN116543032B (en) * 2023-07-06 2023-11-21 中国第一汽车股份有限公司 Impact object ranging method, device, ranging equipment and storage medium
CN117129982A (en) * 2023-08-28 2023-11-28 河北德冠隆电子科技有限公司 Linear scanning angle accurate adjustable data dynamic fusion perception radar
CN117292579A (en) * 2023-10-27 2023-12-26 浪潮智慧科技有限公司 Highway curve early warning method, equipment and medium based on big data
CN117369350B (en) * 2023-12-08 2024-04-16 北京市农林科学院智能装备技术研究中心 High-speed seeder control system, method, electronic equipment and storage medium
CN117672007B (en) * 2024-02-03 2024-04-26 福建省高速公路科技创新研究院有限公司 Road construction area safety precaution system based on thunder fuses

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP3719691B2 (en) * 1996-01-31 2005-11-24 富士通テン株式会社 Vehicle recognition device
CN105182342B (en) * 2015-09-29 2018-11-09 长安大学 The follow-up mechanism and method for tracing of a kind of bumpy road Radar for vehicle target location
CN106043277B (en) * 2016-06-30 2019-06-28 大连楼兰科技股份有限公司 Automobile is automatically with vehicle control and the automatic follow the bus system and method for method, automobile, control radar forward method

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN115966084A (en) * 2023-03-17 2023-04-14 江西昂然信息技术有限公司 Holographic intersection millimeter wave radar data processing method and device and computer equipment
CN117238143A (en) * 2023-09-15 2023-12-15 北京卓视智通科技有限责任公司 Traffic data fusion method, system and device based on radar double-spectrum camera
CN117238143B (en) * 2023-09-15 2024-03-22 北京卓视智通科技有限责任公司 Traffic data fusion method, system and device based on radar double-spectrum camera

Also Published As

Publication number Publication date
WO2022156276A1 (en) 2022-07-28

Similar Documents

Publication Publication Date Title
CN114779229A (en) Target detection method and device
US11042157B2 (en) Lane/object detection and tracking perception system for autonomous vehicles
Dickmann et al. Automotive radar the key technology for autonomous driving: From detection and ranging to environmental understanding
US9235767B2 (en) Detection region modification for driving assistance apparatus and driving assistance method
US10055650B2 (en) Vehicle driving assistance device and vehicle having the same
US9669829B2 (en) Travel lane marking recognition system
US20210188356A1 (en) Vehicle control device
CN108688660B (en) Operating range determining device
US10782405B2 (en) Radar for vehicle and vehicle provided therewith
US20210188262A1 (en) Vehicle control device
Lee et al. Development of a self-driving car that can handle the adverse weather
EP3715204A1 (en) Vehicle control device
US11200432B2 (en) Method and apparatus for determining driving information
CN102700548A (en) Robust vehicular lateral control with front and rear cameras
JP2007300181A (en) Periphery monitoring apparatus and periphery monitoring method and program thereof
US20200384999A1 (en) Vehicle control device
CN114442101B (en) Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
WO2023092451A1 (en) Method and apparatus for predicting drivable lane
Lim et al. Real-time forward collision warning system using nested Kalman filter for monocular camera
KR20210126365A (en) Method, apparatus, electronic device, computer program and computer readable recording medium for detecting lane marking based on vehicle image
US11326889B2 (en) Driver assistance system and control method for the same
JP2024503671A (en) System and method for processing by combining visible light camera information and thermal camera information
TWI680898B (en) Light reaching detection device and method for close obstacles
US20210370832A1 (en) Data processing methods, devices, and apparatuses, and movable platforms
JP2003058997A (en) Traveling road environment detecting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination