CN111898582A - Obstacle information fusion method and system for binocular camera and millimeter wave radar - Google Patents

Obstacle information fusion method and system for binocular camera and millimeter wave radar

Info

Publication number
CN111898582A
Authority
CN
China
Prior art keywords
millimeter wave
information
list
camera
lane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010809921.8A
Other languages
Chinese (zh)
Other versions
CN111898582B (en)
Inventor
周坤
曾小韬
孙辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Suzhou Automotive Research Institute of Tsinghua University
Original Assignee
Tsinghua University
Suzhou Automotive Research Institute of Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University and Suzhou Automotive Research Institute of Tsinghua University
Priority to CN202010809921.8A
Publication of CN111898582A
Application granted
Publication of CN111898582B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G06F 18/25 Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses an obstacle information fusion method for a binocular camera and a millimeter wave radar, which comprises the following steps: obtain a camera object list and a millimeter wave object list, each object list comprising object IDs and object information; according to a lane line equation, compute the objects in the host lane to obtain a camera primary object list and a millimeter wave primary object list, and compute the objects in the lanes on both sides to obtain a camera secondary object list and a millimeter wave secondary object list; fuse the primary object lists and the secondary object lists of the camera and the millimeter wave radar respectively: search the previous frame's fused obstacle list for each object ID and update the object information if the ID is found; otherwise compute the distance from the position information in the object information and, if the distance is smaller than a threshold, judge the detections to be the same object and update the object information; otherwise mark the object as a suspect object. The method solves the missed-detection problem and improves the reliability of sensor information.

Description

Obstacle information fusion method and system for binocular camera and millimeter wave radar
Technical Field
The invention relates to the technical field of sensor information fusion processing, and in particular to an obstacle information fusion method and system for a binocular camera and a millimeter wave radar.
Background
Cameras, millimeter wave radar and lidar are the common object-recognition sensors at the driver-assistance stage. For early-warning functions such as Mobileye's, a pure vision scheme is basically sufficient, but in moving from warning to execution, that accuracy is no longer enough. A vehicle's tolerance for erroneous control is low, so information from at least two kinds of sensors must be checked redundantly against each other; this improves accuracy and also provides a fallback when one sensor's function is impaired in extreme weather.
The currently mainstream "camera + millimeter wave radar" scheme is the optimal trade-off among cost, precision and speed. On the camera side, a monocular camera must build and continuously maintain a huge database of sample features; if feature data for a target to be identified is missing, the system can neither recognize it nor measure its distance, and the resulting missed detections easily cause accidents. A binocular camera measures the distance of the scene ahead directly from a depth map, overcoming this shortcoming of the monocular camera, at a cost only 20% to 30% higher than the monocular scheme; yet its cost remains lower by nearly one hundred percent than a combined camera-and-lidar solution, with very good performance. The binocular camera plus radar is therefore the most promising combination for future sensor fusion, and the present invention is proposed on this basis.
Disclosure of Invention
To solve the above technical problems, the invention provides an obstacle information fusion method and system for a binocular camera and a millimeter wave radar, which fuse, match and track the obstacle information detected by the binocular camera and by the millimeter wave radar, improving the reliability of obstacle detection, reducing the missed-detection and false-detection rates, improving the detected distance accuracy, and providing information for subsequent longitudinal vehicle control and path planning.
The technical scheme of the invention is as follows:
An obstacle information fusion method for a binocular camera and a millimeter wave radar comprises the following steps:
S01: processing the binocular camera data to obtain a lane line equation;
S02: obtaining a camera object list from the binocular camera data and a millimeter wave object list from the millimeter wave radar data, each object list comprising object IDs and object information;
S03: according to the lane line equation, computing the objects in the host lane to obtain a camera primary object list and a millimeter wave primary object list, and computing the objects in the lanes on both sides of the host lane to obtain a camera secondary object list and a millimeter wave secondary object list;
S04: fusing the primary object lists and the secondary object lists of the camera and the millimeter wave radar respectively: searching the previous frame's fused obstacle list for each object ID and updating the object information if the ID is found; otherwise computing the distance from the position information in the object information and, if the distance is smaller than a threshold, judging the detections to be the same object and updating the object information; otherwise marking the object as a suspect object.
In a preferred embodiment, computing the distance from the position information in step S04 further comprises: computing the distance between the position information in the object information of the camera primary/secondary object list and that of the millimeter wave primary/secondary object list; if the distance is smaller than a threshold, the detections are taken as the same object, yielding new fusion data.
In a preferred embodiment, in step S01 a two-degree-of-freedom bicycle kinematics model is used, with the vehicle center as the coordinate origin, to establish the vehicle coordinate system OXY and obtain the lane line equation:

Y = C0 + C1·X + C2·X² + C3·X³

where C0 is the distance of the vehicle from the lane line center, negative to the right and positive to the left; C1 is the angle between the vehicle body's longitudinal axis and the tangent direction of the lane line center, i.e. the angle between the lane center line tangent and the OX direction; C2 carries the curvature information of the whole lane line; and C3 the curvature change rate of the whole lane line;
and Kalman tracking is performed on the lane line: a predicted value for the next frame is generated from the estimate tracked in the previous frame, and the next frame's estimate is obtained by weighting that prediction with the next frame's measurement.
In a preferred embodiment, the data in step S04 is updated at a fixed period that is greater than the lane line update period and less than the camera update period.
In a preferred embodiment, step S02 further comprises querying by object ID: if an object ID from the previous frame does not exist in the new frame's object list, the object's information is computed by Kalman filtering and added to the new frame's camera object list or millimeter wave object list; if the object ID exists in the new frame's object list, the object information is computed from the previous frame's data and the current frame's data.
In a preferred embodiment, in step S03 the area range of the host lane is calculated from the lane center line equation and the lane width; whether an object is in the host lane area is judged from the object's lateral position, and if so the object is taken as a host lane object. Likewise, the area ranges of the left and right lanes are calculated from the lane center line equation and the lane width; whether an object is in the left or right lane area is judged from the object's lateral position, and if so the object is taken as a left lane object or a right lane object.
In a preferred embodiment, if the number of objects in a lane is greater than a threshold, the distances between the objects and the host vehicle are calculated and the set number of objects closest to the host vehicle is selected.
The invention also discloses an obstacle information fusion system for a binocular camera and a millimeter wave radar, comprising:
a lane line tracking module, which processes the binocular camera data to obtain a lane line equation and performs Kalman tracking on the lane line: a predicted value for the next frame is generated from the estimate tracked in the previous frame, and the next frame's estimate is obtained by weighting that prediction with the next frame's measurement;
an obstacle information fusion module, which obtains a camera object list from the binocular camera data and a millimeter wave object list from the millimeter wave radar data, each object list comprising object IDs and object information; according to the lane line equation, computes the objects in the host lane to obtain a camera primary object list and a millimeter wave primary object list, and computes the objects in the lanes on both sides to obtain a camera secondary object list and a millimeter wave secondary object list; and fuses the primary object lists and the secondary object lists of the camera and the millimeter wave radar respectively: the previous frame's fused obstacle list is searched for each object ID, and the object information is updated if the ID is found; otherwise the distance is computed from the position information in the object information and, if it is smaller than a threshold, the detections are judged to be the same object and the object information is updated; otherwise the object is marked as a suspect object.
In a preferred embodiment, computing the distance from the position information in the object information further comprises: computing the distance between the position information in the object information of the camera primary/secondary object list and that of the millimeter wave primary/secondary object list; if the distance is smaller than a threshold, the detections are taken as the same object, yielding new fusion data.
In a preferred embodiment, the obstacle information fusion module further queries by object ID: if an object ID from the previous frame does not exist in the new frame's object list, the object's information is computed by Kalman filtering and added to the new frame's camera object list or millimeter wave object list; if the object ID exists in the new frame's object list, the object information is computed from the previous frame's data and the current frame's data.
Compared with the prior art, the invention has the advantages that:
1. The method fuses, matches and tracks the obstacle information detected by the binocular camera and by the millimeter wave radar, improving the reliability of obstacle detection, reducing the missed-detection and false-detection rates, improving the detected distance accuracy, and providing information for subsequent longitudinal vehicle control and path planning. By tracking obstacles in a 2-2-2 pattern, the invention ensures tracking stability and avoids tracking jumps caused by obstacles changing lanes.
2. A suspect-object mechanism in the tracking algorithm eliminates missed and false detections to the greatest possible extent.
3. Kalman tracking of the binocular camera data and the millimeter wave radar data yields more accurate position and velocity tracking.
4. The camera and millimeter wave detections are matched by distance and replaced by distance against the previous frame's fusion target list, updating the fused obstacles from near to far, which improves the reliability of obstacle detection.
Drawings
The invention is further described with reference to the following figures and examples:
FIG. 1 is a schematic diagram of binocular camera lane line tracking with the two-degree-of-freedom bicycle kinematics model of the present invention;
FIG. 2 is a schematic view of Kalman tracking of a lane line of the present invention;
FIG. 3 is a schematic view of obstacle tracking according to the present invention;
FIG. 4 is a flow chart of the camera and millimeter wave data fusion process of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings in conjunction with the following detailed description. It should be understood that the description is intended to be exemplary only, and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
Embodiment:
the preferred embodiments of the present invention will be further described with reference to the accompanying drawings.
The invention provides an obstacle information fusion scheme for a binocular camera and a millimeter wave radar that achieves more reliable environment perception. The premise of the invention is that the binocular camera and the millimeter wave radar have been transformed into the same coordinate system by pose transformation; in actual use this can be the vehicle body coordinate system or a global coordinate system. The invention mainly provides a scheme for processing the camera and millimeter wave data and fusing their information, realizing more reliable and robust environment perception.
The obstacle information fusion system for a binocular camera and a millimeter wave radar contains two modules: the lane line tracking module and the obstacle information fusion module.
The lane line tracking module processes the binocular camera's information, providing more accurate input for subsequent lateral vehicle control and lane information for the obstacle information fusion module.
The obstacle fusion module matches and tracks the object (obstacle) information detected by the binocular camera against that detected by the millimeter wave radar, improving the reliability of obstacle detection, reducing the missed-detection and false-detection rates, improving distance accuracy, and providing information for subsequent longitudinal vehicle control and path planning.
Lane line tracking module:
Lane line recognition relies mainly on the binocular camera; the millimeter wave radar cannot recognize lane lines. The binocular camera's recognition, however, carries a certain amount of noise, such as occasional jumps, so Kalman filtering of the lane lines is required. The lane line equation is fitted with a polynomial; for convenience of explanation a cubic polynomial is used here, but other fitting orders are handled in the same way. Using the two-degree-of-freedom bicycle kinematics model with the vehicle center as the coordinate origin, the vehicle coordinate system OXY is established as shown in FIG. 1, and the output equation of the lane line is:

Y = C0 + C1·X + C2·X² + C3·X³

where C0 is the distance of the vehicle from the lane line center, negative to the right and positive to the left; C1 is the angle between the vehicle's heading and the lane line, i.e. the angle between the tangent to the lane center line and the OX direction; C2 carries the curvature information of the lane line; and C3 its curvature change rate.

Kalman tracking of the lane line means filtering and correcting these four parameters. In theory all four parameters should be tracked and corrected jointly, but that introduces 4x4 covariance matrix operations and greatly increases the computational load, so in practice each parameter is tracked separately.
For the two-degree-of-freedom bicycle model shown in FIG. 1, lf and lr denote the front and rear wheelbase respectively. The x direction is the forward direction of the vehicle's principal axis, and the y direction is the leftward direction perpendicular to the body's principal axis. vx and vy denote the velocities in the x and y directions; the velocity in the y direction is small and generally negligible. ω denotes the yaw rate of the vehicle. Writing ey for the vehicle's distance deviation from the lane line center (corresponding to C0) and eψ for its yaw angle deviation (corresponding to C1), the vehicle dynamics equations are:

dey/dt = vy + vx·eψ

deψ/dt = ω - ωdes

where dey/dt is the rate of change of the vehicle's distance deviation from the lane line center, deψ/dt is the rate of change of the vehicle's yaw angle deviation, and ωdes is the rate of change of the road direction, i.e. the desired yaw rate:

ωdes = vx/R

where vx is the vehicle speed and R is the radius of the lane line.
The logic of the lane line tracking process is shown in FIG. 2. The lane line parameters are tracked separately. Writing dx = vx·T for the distance travelled in one period, the prediction formulas are:

C0,T = C0,T-1 + C1,T-1·dx + C2,T-1·dx² + C3,T-1·dx³   (1)
C1,T = C1,T-1 + 2·C2,T-1·dx + 3·C3,T-1·dx²            (2)
C2,T = C2,T-1 + 3·C3,T-1·dx                           (3)
C3,T = C3,T-1                                         (4)

where T is the period.
The estimates tracked in the previous frame, (C0,T-1, C1,T-1, C2,T-1, C3,T-1), generate the predicted values for the next frame, (C0,T, C1,T, C2,T, C3,T). The prediction for the next frame is then weighted with the next frame's measurement to obtain the next frame's estimate, and repeating this process yields continuous tracking. Note that formula (4) rests on the assumption that the curvature change rate of the lane line is constant, which practical scenarios satisfy, especially within a short time window.
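For illustration, one step of this per-parameter tracking loop can be sketched as follows (Python; a minimal sketch under the constant curvature-change-rate assumption, with a fixed scalar gain standing in for a properly tuned Kalman gain, and all names and values hypothetical):

    import numpy as np

    def predict_lane_params(c_prev, vx, T):
        # Propagate (C0, C1, C2, C3) over one period T, assuming C3 is
        # constant and the vehicle advances dx = vx * T along the lane
        # (formulas (1)-(4) above).
        c0, c1, c2, c3 = c_prev
        dx = vx * T
        return np.array([
            c0 + c1 * dx + c2 * dx ** 2 + c3 * dx ** 3,
            c1 + 2 * c2 * dx + 3 * c3 * dx ** 2,
            c2 + 3 * c3 * dx,
            c3,
        ])

    def update_lane_params(pred, meas, gain=0.4):
        # Weight the prediction against the new measurement; each parameter
        # is filtered separately, so no 4x4 covariance matrix is needed.
        return pred + gain * (meas - pred)

    est_prev = np.array([0.1, 0.02, 1e-4, 1e-6])      # estimate at frame T-1
    meas_new = np.array([0.12, 0.018, 1.2e-4, 1e-6])  # measurement at frame T
    est_new = update_lane_params(
        predict_lane_params(est_prev, vx=15.0, T=0.02), meas_new)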
The obstacle information fusion module:
the detection of obstacles is mainly directed to the longitudinal control of the vehicle. And performing Kalman tracking processing by combining the previous frame data according to the new frame data of the camera and the millimeter wave radar data, and then performing matching fusion on the millimeter wave data and the camera data to output reliable obstacle information.
Obstacles are tracked in a 2-2-2 pattern: 2 obstacles in the host lane and 2 in each of the lanes on both sides. Other patterns are of course possible; this embodiment is described only for the 2-2-2 pattern. The obstacle information of the host lane is used mainly for distance keeping, guaranteeing safety and function; the obstacle information on the left and right is used mainly to detect vehicle cut-ins in time, ensuring comfortable and stable vehicle control.
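The tracked state in this pattern can be pictured as three lanes holding up to two fused obstacles each, plus the one suspect slot per lane described below; a minimal sketch in Python with hypothetical names:

    # 2-2-2 fused obstacles per lane, kept nearest-first; a slot simply
    # stays empty when fewer obstacles are present in that lane.
    fused_state = {"left": [], "ego": [], "right": []}
    # 1-1-1 suspect objects, one per lane (see the fusion logic below).
    suspects = {"left": None, "ego": None, "right": None}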
As shown in FIG. 3, because the frame rates of the binocular camera and the millimeter wave radar differ, the fusion scheme is updated at a uniform fixed period in actual use; this embodiment selects 50 ms. If a new measurement has not yet arrived at update time, the old measurement is used directly. As shown in FIG. 3, the camera data period is 60-100 ms, the millimeter wave radar period is 80 ms, and lane line tracking updates every 20 ms. The whole processing logic is as follows:
(1) Message parsing is performed on the raw binocular camera and millimeter wave radar data to obtain the lane line equation, the camera object list and the millimeter wave object list, where object (obstacle) information comprises the obstacle's ID (identification number, a positive integer), its position relative to the host vehicle and its velocity relative to the host vehicle.
(2) A reliable lane line equation is obtained by lane line Kalman tracking (the method shown in FIG. 2).
(3) The camera object list and the millimeter wave object list are queried by object ID against the primary object and the secondary objects on both sides that were successfully tracked in the previous frame (the previous frame's 6 main targets). If a successfully tracked object ID has been lost, the binocular camera missed a detection; a new position for that primary/secondary object of the previous frame is computed by Kalman filtering and added to the new frame's camera object list. If the corresponding object ID still exists in the new frame's object list, the object's position is computed from the previous frame's data and the current frame's data (see the sketch after step (6)).
(4) The binocular camera/millimeter wave data and the lane line data corroborate each other: using the lane line information, the new 6 primary and secondary objects are formed for the binocular camera and for the millimeter wave radar, namely 2 in the host lane and 2 in each of the two adjacent lanes. Specifically, the host lane's area range is calculated from the lane center line equation and the lane width; an obstacle whose lateral position falls within this range is taken as a host lane obstacle, and the two closest obstacles are then selected by distance. Similarly, the left lane's area range is calculated from the lane center line and the lane width, whether an obstacle falls in the left lane area is judged from its lateral position, and the two closest obstacles are selected by distance; the right lane is handled the same way (see the lane-assignment functions in the sketch after step (6)).
(5) Step (4) yields the new camera object list (6 obstacles) and, likewise, the millimeter wave object list. Of course, if there are not that many obstacles, the corresponding slots may remain vacant; for example, the left lane may have only one obstacle. The millimeter wave radar is processed in the same way as the camera.
(6) The primary and secondary objects of the object lists formed by the camera and by the millimeter wave radar are matched; the matching logic is shown in FIG. 4. Here primary and secondary mean: objects in the host lane are primary objects, and objects in the left and right lanes are secondary objects. The camera and millimeter wave detections are matched per lane: host lane camera obstacles are matched with host lane millimeter wave obstacles, left lane with left lane, and likewise for the right lane.
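Steps (3) and (4) can be sketched as follows (Python; an illustrative sketch only, in which a constant-velocity blend stands in for the Kalman prediction and estimation, and the object layout, lane width and all names are hypothetical):

    def refresh_object_list(new_objs, tracked_prev, dt):
        # new_objs / tracked_prev: {id: {"x", "y", "vx", "vy"}}; tracked_prev
        # holds the 6 primary/secondary objects tracked in the previous frame.
        out = dict(new_objs)
        for oid, p in tracked_prev.items():
            pred_x = p["x"] + p["vx"] * dt   # coast the old object forward
            pred_y = p["y"] + p["vy"] * dt
            if oid not in out:
                # ID lost, i.e. a missed detection: keep the predicted object
                out[oid] = {**p, "x": pred_x, "y": pred_y}
            else:
                # ID still present: blend prediction with the new measurement
                n = out[oid]
                out[oid] = {**n, "x": 0.5 * (n["x"] + pred_x),
                            "y": 0.5 * (n["y"] + pred_y)}
        return out

    def lane_of(obj, c, lane_width=3.5):
        # Lateral offset from the lane center line Y = C0+C1X+C2X^2+C3X^3;
        # y is positive to the left in the OXY vehicle frame.
        c0, c1, c2, c3 = c
        x = obj["x"]
        offset = obj["y"] - (c0 + c1 * x + c2 * x ** 2 + c3 * x ** 3)
        half = lane_width / 2.0
        if abs(offset) <= half:
            return "ego"
        if half < offset <= 3 * half:
            return "left"
        if -3 * half <= offset < -half:
            return "right"
        return None  # outside the three tracked lanes

    def two_nearest(objs):
        # Keep at most the two obstacles closest to the host vehicle.
        return sorted(objs, key=lambda o: o["x"] ** 2 + o["y"] ** 2)[:2]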
The data fusion logic of the binocular camera and the millimeter wave radar is shown in FIG. 4. The previous frame has stored the 2-2-2 fusion result, i.e. two objects (possibly vacant) for each of the left, host and right lanes. A 1-1-1 record of suspect objects, one per lane, is kept at the same time. A suspect object is one that may be an obstacle but has not been fully confirmed; if it is detected again in the current frame, the obstacle's existence is confirmed.
The detection logic for suspect objects is: if an object appearing in this frame does not exist in the previous frame's tracked primary/secondary object list (6 obstacles), it is set as a suspect object. By this logic there can be several suspect objects, so the obstacle closest to the host vehicle in each lane (left, center, right) is selected as that lane's suspect object.
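A sketch of this per-lane selection (Python; hypothetical names, with x the longitudinal distance ahead of the host vehicle):

    def pick_suspects(unmatched_by_lane):
        # One suspect per lane: the unmatched obstacle nearest the host.
        return {lane: min(objs, key=lambda o: o["x"])
                for lane, objs in unmatched_by_lane.items() if objs}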
As can be seen from FIG. 4, the processing of the binocular camera and of the millimeter wave radar is symmetrical, so the binocular camera is taken as the example for the obstacle fusion logic below.
1) First, check whether an obstacle newly added to the binocular camera's target list can be found in the previous frame's successfully tracked fused obstacle list, i.e. whether an entry with the same ID exists. If it is found, the obstacle has been tracked successfully this time, and the camera's data directly replaces the corresponding data in the previous frame's fused obstacle list (the previous frame's 6 main targets in FIG. 3); continue with the next obstacle. Otherwise go to 2).
2) Check whether the binocular camera data is close in distance to the previous frame's fused data. If so, it is the same object, and the binocular camera's data directly replaces the previous frame's fused camera data; continue with the next obstacle. Otherwise go to 3).
3) Check whether the camera data can be matched in distance with a millimeter wave obstacle. If so, the two are the same object, and the two new frames of data are fused into new fusion data; continue with the next object. Otherwise go to 4).
4) Check whether the camera obstacle's ID is the same as the ID of the previous frame's suspect object. If so, a new reliable detection object is generated; if not, the obstacle is set as the camera's suspect object for the current frame. Continue with the next cycle.
The criterion for distance proximity is: the sum of the lateral position difference and the longitudinal position difference of the two obstacles is smaller than a set threshold.
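Putting steps 1) to 4) and the proximity criterion together, the camera-side pass can be sketched as follows (Python; illustrative only: the 2.0 m threshold and the object layout are assumptions, and step 3) is simplified to keeping the camera data rather than performing the full camera-radar fusion):

    def close_enough(a, b, thresh=2.0):
        # Proximity criterion from the text: the sum of the lateral and
        # longitudinal position differences is below a set threshold.
        return abs(a["x"] - b["x"]) + abs(a["y"] - b["y"]) < thresh

    def match_camera_objects(cam_objs, radar_objs, fused_prev, suspects_prev):
        # One pass of the camera side of FIG. 4; the radar side is symmetrical.
        fused, suspects = {}, {}
        for obj in cam_objs:
            oid = obj["id"]
            if oid in fused_prev:                       # 1) same ID tracked
                fused[oid] = obj
            elif any(close_enough(obj, f) for f in fused_prev.values()):
                fused[oid] = obj                        # 2) near last fusion
            elif any(close_enough(obj, r) for r in radar_objs):
                fused[oid] = obj                        # 3) camera-radar match
            elif oid in suspects_prev:                  # 4) suspect seen again:
                fused[oid] = obj                        #    reliable detection
            else:
                suspects[oid] = obj                     # first sighting: suspect
        return fused, suspects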
After all camera and millimeter wave obstacles have been processed, distance-based replacement is performed between the fusion data and the reliable obstacle data newly generated in steps 3) and 4) and the previous frame's fusion target list, and the fused obstacles are updated from near to far.
The concrete implementation: the previous frame's fusion target list holds 2 objects for the left lane, 2 for the host lane and 2 for the right lane. Taking the left lane as an example, steps 3) and 4) may generate new fusion objects for the left lane; if, say, 2 new objects are generated, the left lane now holds 4 fusion objects, and the two closest are kept according to the nearest-distance principle. The other lanes are processed the same way.
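A sketch of this per-lane replacement (Python; hypothetical names, with obstacles sorted by longitudinal distance x so the fused list is updated from near to far):

    def update_lane_slots(prev_two, new_fused):
        # Merge last frame's fused objects for one lane with the newly
        # generated ones, then keep only the two nearest the host vehicle.
        pool = list(prev_two) + list(new_fused)
        pool.sort(key=lambda o: o["x"])
        return pool[:2]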
It is to be understood that the above-described embodiments of the present invention are merely illustrative of or explaining the principles of the invention and are not to be construed as limiting the invention. Therefore, any modification, equivalent replacement, improvement and the like made without departing from the spirit and scope of the present invention should be included in the protection scope of the present invention. Further, it is intended that the appended claims cover all such variations and modifications as fall within the scope and boundaries of the appended claims or the equivalents of such scope and boundaries.

Claims (10)

1. An obstacle information fusion method for a binocular camera and a millimeter wave radar, characterized by comprising the following steps:
S01: processing the binocular camera data to obtain a lane line equation;
S02: obtaining a camera object list from the binocular camera data and a millimeter wave object list from the millimeter wave radar data, each object list comprising object IDs and object information;
S03: according to the lane line equation, computing the objects in the host lane to obtain a camera primary object list and a millimeter wave primary object list, and computing the objects in the lanes on both sides to obtain a camera secondary object list and a millimeter wave secondary object list;
S04: fusing the primary object lists and the secondary object lists of the camera and the millimeter wave radar respectively: searching the previous frame's fused obstacle list for each object ID and updating the object information if the ID is found; otherwise computing the distance from the position information in the object information and, if the distance is smaller than a threshold, judging the detections to be the same object and updating the object information; otherwise marking the object as a suspect object.
2. The obstacle information fusion method for a binocular camera and a millimeter wave radar according to claim 1, wherein computing the distance from the position information in the object information in step S04 further comprises: computing the distance between the position information in the object information of the camera primary/secondary object list and that of the millimeter wave primary/secondary object list; if the distance is smaller than a threshold, the detections are taken as the same object, yielding new fusion data.
3. The obstacle information fusion method for a binocular camera and a millimeter wave radar according to claim 1, wherein in step S01 a vehicle coordinate system OXY is established using a two-degree-of-freedom bicycle kinematics model with the vehicle center as the coordinate origin, obtaining the lane line equation:

Y = C0 + C1·X + C2·X² + C3·X³

where C0 is the distance of the vehicle from the lane line, negative to the right and positive to the left; C1 is the angle between the vehicle body's longitudinal axis and the tangent direction of the lane line center, i.e. the angle between the lane center line tangent and the OX direction; C2 carries the curvature information of the whole lane line; and C3 the curvature change rate of the whole lane line;

and wherein Kalman tracking is performed on the lane line: a predicted value for the next frame is generated from the estimate tracked in the previous frame, and the next frame's estimate is obtained by weighting that prediction with the next frame's measurement.
4. The obstacle information fusion method for a binocular camera and a millimeter wave radar according to claim 1, wherein the data in step S04 is updated at a fixed period that is greater than the lane line update period and less than the camera update period.
5. The obstacle information fusion method for a binocular camera and a millimeter wave radar according to claim 1, wherein step S02 further comprises querying by object ID: if an object ID from the previous frame does not exist in the new frame's object list, the object's information is computed by Kalman filtering and added to the new frame's camera object list or millimeter wave object list; if the object ID exists in the new frame's object list, the object information is computed from the previous frame's data and the current frame's data.
6. The obstacle information fusion method for a binocular camera and a millimeter wave radar according to claim 1, wherein in step S03 the area range of the host lane is calculated from the lane center line equation and the lane width, whether an object is in the host lane area is judged from the object's lateral position, and if so the object is taken as a host lane object; and the area ranges of the left and right lanes are calculated from the lane center line equation and the lane width, whether an object is in the left or right lane area is judged from the object's lateral position, and if so the object is taken as a left lane object or a right lane object.
7. The obstacle information fusion method for a binocular camera and a millimeter wave radar according to claim 6, wherein, if the number of objects in a lane is greater than a threshold, the distances between the objects and the host vehicle are calculated and the set number of objects closest to the host vehicle is selected.
8. An obstacle information fusion system for a binocular camera and a millimeter wave radar, characterized by comprising:
a lane line tracking module, which processes the binocular camera data to obtain a lane line equation and performs Kalman tracking on the lane line: a predicted value for the next frame is generated from the estimate tracked in the previous frame, and the next frame's estimate is obtained by weighting that prediction with the next frame's measurement;
an obstacle information fusion module, which obtains a camera object list from the binocular camera data and a millimeter wave object list from the millimeter wave radar data, each object list comprising object IDs and object information; according to the lane line equation, computes the objects in the host lane to obtain a camera primary object list and a millimeter wave primary object list, and computes the objects in the lanes on both sides to obtain a camera secondary object list and a millimeter wave secondary object list; and fuses the primary object lists and the secondary object lists of the camera and the millimeter wave radar respectively: the previous frame's fused obstacle list is searched for each object ID, and the object information is updated if the ID is found; otherwise the distance is computed from the position information in the object information and, if it is smaller than a threshold, the detections are judged to be the same object and the object information is updated; otherwise the object is marked as a suspect object.
9. The obstacle information fusion system for a binocular camera and a millimeter wave radar according to claim 8, wherein computing the distance from the position information in the object information further comprises: computing the distance between the position information in the object information of the camera primary/secondary object list and that of the millimeter wave primary/secondary object list; if the distance is smaller than a threshold, the detections are taken as the same object, yielding new fusion data.
10. The obstacle information fusion system for a binocular camera and a millimeter wave radar according to claim 8, wherein the obstacle information fusion module further queries by object ID: if an object ID from the previous frame does not exist in the new frame's object list, the object's information is computed by Kalman filtering and added to the new frame's camera object list or millimeter wave object list; if the object ID exists in the new frame's object list, the object information is computed from the previous frame's data and the current frame's data.
CN202010809921.8A 2020-08-13 2020-08-13 Obstacle information fusion method and system for binocular camera and millimeter wave radar Active CN111898582B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010809921.8A CN111898582B (en) 2020-08-13 2020-08-13 Obstacle information fusion method and system for binocular camera and millimeter wave radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010809921.8A CN111898582B (en) 2020-08-13 2020-08-13 Obstacle information fusion method and system for binocular camera and millimeter wave radar

Publications (2)

Publication Number Publication Date
CN111898582A 2020-11-06
CN111898582B 2023-09-12

Family

ID=73230182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010809921.8A Active CN111898582B (en) 2020-08-13 2020-08-13 Obstacle information fusion method and system for binocular camera and millimeter wave radar

Country Status (1)

Country Link
CN (1) CN111898582B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010249613A (en) * 2009-04-14 2010-11-04 Toyota Motor Corp Obstacle recognition device and vehicle control unit
CN109360228A (en) * 2018-10-18 2019-02-19 清华大学苏州汽车研究院(吴江) Pose method for registering between monocular cam and millimetre-wave radar
CN109829386A (en) * 2019-01-04 2019-05-31 清华大学 Intelligent vehicle based on Multi-source Information Fusion can traffic areas detection method
CN110126824A (en) * 2019-05-22 2019-08-16 河南工业大学 A kind of commercial vehicle AEBS system of integrated binocular camera and millimetre-wave radar
CN110517303A (en) * 2019-08-30 2019-11-29 的卢技术有限公司 A kind of fusion SLAM method and system based on binocular camera and millimetre-wave radar

Also Published As

Publication number Publication date
CN111898582B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
CN107646114B (en) Method for estimating lane
Ferryman et al. Visual surveillance for moving vehicles
CN107085938B (en) The fault-tolerant planing method of intelligent driving local path followed based on lane line and GPS
CN112285714B (en) Obstacle speed fusion method and device based on multiple sensors
CN112154455A (en) Data processing method, equipment and movable platform
KR102569900B1 (en) Apparatus and method for performing omnidirectional sensor-fusion and vehicle including the same
CN111738207A (en) Lane line detection method and device, electronic device and readable storage medium
Binelli et al. A modular tracking system for far infrared pedestrian recognition
CN115993597A (en) Visual radar perception fusion method and terminal equipment
CN115923839A (en) Vehicle path planning method
CN113029185A (en) Road marking change detection method and system in crowdsourcing type high-precision map updating
CN113485381A (en) Robot moving system and method based on multiple sensors
CN116859413A (en) Perception model building method for open-air mine car
CN114084129A (en) Fusion-based vehicle automatic driving control method and system
CN115195773A (en) Apparatus and method for controlling vehicle driving and recording medium
CN117173666A (en) Automatic driving target identification method and system for unstructured road
CN111898582B (en) Obstacle information fusion method and system for binocular camera and millimeter wave radar
US20230154199A1 (en) Driving control system and method of controlling the same using sensor fusion between vehicles
CN116052469A (en) Vehicle collision early warning method based on vehicle-road collaborative track prediction
CN116363611A (en) Multi-sensor decision-level fusion vehicle track tracking method
KR102618680B1 (en) Real-time 3D object detection and tracking system using visual and LiDAR
CN115388880A (en) Low-cost memory parking map building and positioning method and device and electronic equipment
JP5682302B2 (en) Traveling road estimation device, method and program
KR20220064407A (en) Onboard cluster tracking system
CN116630765B (en) Bicycle fusion sensing system based on multiple information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant