CN114120288A - Vehicle detection method based on millimeter wave radar and video fusion - Google Patents
- Publication number
- CN114120288A (application CN202111473173.1A)
- Authority
- CN
- China
- Prior art keywords
- millimeter wave
- wave radar
- data
- video
- radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/91—Radar or analogous systems specially adapted for specific applications for traffic control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/91—Radar or analogous systems specially adapted for specific applications for traffic control
- G01S13/92—Radar or analogous systems specially adapted for specific applications for traffic control for velocity measurement
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses a vehicle detection method based on millimeter wave radar and video fusion. A millimeter wave radar measures the speed, distance and angle of a target vehicle, a camera detects the vehicle's position and type, and the two sensors are then aligned in time and space to produce the final detection result. The method can be widely applied in intelligent traffic systems, vehicle-road cooperative systems and related fields; it improves traffic conditions at unsignalized intersections and provides important technical support for cooperative optimization control of intersections.
Description
Technical Field
The invention relates to the technical field of intelligent traffic systems and vehicle-road cooperative systems, in particular to a vehicle detection method based on millimeter wave radar and video fusion.
Background
In recent years, with the development of intelligent traffic systems, roadside services have been applied ever more widely in urban road traffic systems. Many urban intersections currently lack signal control; traffic flow there is disordered, accidents occur easily, and traffic management and control means are urgently needed. Vehicle detection based on video or millimeter wave radar alone has known problems: the radar's target-recognition capability is limited by its hardware, so false alarms and missed detections occur; the camera's detection range is limited and it is easily affected by weather; and detection algorithms that compensate for these hardware limitations are generally complex and slow. Traditional vehicle-trajectory prediction relies on large amounts of data; although its accuracy is high, its complexity is high and its real-time performance poor, and when hardware is limited such data volumes are hard to provide. Therefore, to obtain more accurate vehicle trajectory data, a vehicle detection method based on millimeter wave radar and video fusion is urgently needed, so that accurate position information of target vehicles passing through an unsignalized intersection can be obtained and collision judgment performed. This improves traffic conditions at unsignalized intersections and provides important technical support for cooperative optimization control of intersections.
Disclosure of Invention
The invention aims to remedy the defects of the prior art by providing a vehicle detection method based on millimeter wave radar and video fusion, solving the prior-art problem that vehicle trajectory data cannot be obtained accurately at intersections.
The invention is realized by the following technical scheme:
a vehicle detection method based on millimeter wave radar and video fusion specifically comprises the following steps:
(1) spatial coordinate alignment: video data and millimeter wave radar data are read in, the video and radar coordinate systems are unified, and the radar coordinates are projected onto the image data;
(2) time coordinate alignment: the time coordinates are aligned in a multithreaded manner;
(3) data fusion: the millimeter wave radar data and the video data are analyzed with a data fusion algorithm based on detection-box overlap, and the more accurate data is selected from the overlapping part of the two to form a new detection box.
The spatial coordinate alignment in the step (1) comprises the following specific steps:
firstly, the camera's intrinsic parameter matrix M_in and extrinsic parameter matrix M_out are introduced;
f_x and f_y are the normalized focal lengths along the u-axis and v-axis respectively, and (u_0, v_0) is the image center;
the extrinsic parameter matrix M_out contains the three parameters pitch, yaw and roll, with a = pitch·π/180, b = yaw·π/180, c = roll·π/180; pitch, yaw and roll are the camera's pitch angle, yaw angle and roll angle respectively;
the two matrices are then multiplied:
M_multi = M_in * M_out, where M_multi is the product of the intrinsic and extrinsic parameter matrices;
secondly, four correction quantities of the millimeter wave radar, X_r, Y_r, Z_r and Z_c, are introduced, together with the radar's pitch angle pitch_r and mounting height h_r; the expressions are:
X_r = DistLat;
Z_r = DistLong;
Z_c = (-cos a·sin b·cos c + sin a·sin c)·X_r + (cos a·sin b·sin c + sin a·cos c)·Y_r + cos a·cos b·Z_r + 0.12;
DistLat and DistLong are the lateral and longitudinal distances respectively;
these are stored as a vector: D = (X_r, Y_r, Z_r, 1);
finally, the video and radar coordinates are unified and the radar coordinates are located on the image data:
u_r = (M_multi·D)(0,0)/Z_c;
v_r = (M_multi·D)(1,0)/Z_c, where u_r and v_r are the abscissa and ordinate of the center of the radar box.
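As an illustration, the projection above can be sketched in plain Python. This is a minimal sketch, not the patented implementation: the Euler-angle convention R = Rx(a)·Ry(b)·Rz(c) is assumed because its third row matches the Z_c expression above, and the Y_r argument and all numeric inputs are placeholders for demonstration.

```python
import math

def radar_to_pixel(dist_lat, dist_long, fx, fy, u0, v0,
                   pitch_deg, yaw_deg, roll_deg, y_r=0.0):
    """Project a radar target (lateral/longitudinal distance, meters)
    onto image pixel coordinates (u_r, v_r)."""
    a = pitch_deg * math.pi / 180.0
    b = yaw_deg * math.pi / 180.0
    c = roll_deg * math.pi / 180.0
    x_r, z_r = dist_lat, dist_long          # X_r = DistLat, Z_r = DistLong

    ca, sa = math.cos(a), math.sin(a)
    cb, sb = math.cos(b), math.sin(b)
    cc, sc = math.cos(c), math.sin(c)

    # Rows of R = Rx(a) @ Ry(b) @ Rz(c); the third row reproduces the
    # Z_c expression in the text, plus the fixed 0.12 m offset.
    x_c = cb * cc * x_r - cb * sc * y_r + sb * z_r
    y_c = (ca * sc + sa * sb * cc) * x_r + (ca * cc - sa * sb * sc) * y_r - sa * cb * z_r
    z_c = (-ca * sb * cc + sa * sc) * x_r + (ca * sb * sc + sa * cc) * y_r + ca * cb * z_r + 0.12

    # Pinhole intrinsics: normalized focal lengths f_x, f_y and image center (u_0, v_0)
    u_r = fx * x_c / z_c + u0
    v_r = fy * y_c / z_c + v0
    return u_r, v_r
```

With all angles zero the rotation is the identity and the formula reduces to u_r = f_x·DistLat/(DistLong + 0.12) + u_0, which makes the role of each quantity easy to check.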
In step (2) the time coordinates are aligned in a multithreaded manner; the specific steps are as follows:
(a) since the video frame rate is higher than that of the millimeter wave radar, video frames must wait for radar data; the video thread therefore stores each processed video frame in a queue;
(b) when the millimeter wave radar acquires a frame, the radar thread processes it and passes it to the fusion thread, while a command is sent to the video thread so that the most recently stored video frame is also passed to the fusion thread;
(c) the fusion thread fuses the newest video data with the millimeter wave radar data, completing the time alignment.
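The buffering logic of steps (a)-(c) can be sketched as follows. This is a minimal single-process sketch; the names `VideoBuffer` and `on_radar_frame` are illustrative, not from the patent. The faster video thread keeps pushing frames into a bounded queue, and each arriving radar frame is paired with the newest buffered video frame.

```python
from collections import deque

class VideoBuffer:
    """Bounded queue written by the (faster) video thread.
    Only the newest frames are kept; older ones are silently discarded."""
    def __init__(self, maxlen=30):
        self._frames = deque(maxlen=maxlen)

    def push(self, timestamp, frame):
        self._frames.append((timestamp, frame))

    def latest(self):
        # The most recently stored video frame, or None if the buffer is empty.
        return self._frames[-1] if self._frames else None

def on_radar_frame(radar_ts, radar_targets, video_buffer):
    """Fusion-thread callback: when a radar frame arrives, pair it with the
    newest video frame, so the residual offset stays below one video period."""
    video = video_buffer.latest()
    return {"radar_ts": radar_ts, "targets": radar_targets, "video": video}
```

In a real deployment the video, radar and fusion roles would run as separate threads (e.g. with `threading` and a lock around the buffer); the pairing rule itself is unchanged.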
After coordinate alignment, the two data sources are fused. The millimeter wave radar yields a target's center point, length and width, but because of calibration inaccuracy and measurement error the radar's center point deviates from the target's true center, so a detection box built from it and the target's length and width is noticeably offset. The video data yields a comparatively accurate detection box, and the final detection result is obtained from the intersection of the two.
The advantages of the invention are: the method accurately acquires the position information of target vehicles at unsignalized intersections. The millimeter wave radar measures the speed, distance and angle of the target vehicle, the camera detects its position and type, and the radar and camera are then aligned in time and space to obtain the final detection result. The method can therefore be widely applied in intelligent traffic systems, vehicle-road cooperative systems and related fields, and provides important technical support for cooperative optimization control of intersections.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a diagram illustrating the fused results in the example of the present invention.
Detailed Description
As shown in fig. 1, the vehicle detection method based on millimeter wave radar and video fusion according to this embodiment includes the following steps:
(1) Spatial coordinate alignment. First, the camera's intrinsic parameter matrix M_in and extrinsic parameter matrix M_out are introduced.
The extrinsic parameter matrix contains the three parameters pitch, yaw and roll, with a = pitch·π/180, b = yaw·π/180, c = roll·π/180.
The two matrices are then multiplied:
M_multi = M_in * M_out.
Second, four correction quantities of the millimeter wave radar, X_r, Y_r, Z_r and Z_c, are introduced, together with the radar's pitch angle pitch_r and mounting height h_r. The expressions are:
X_r = DistLat;
Z_r = DistLong;
Z_c = (-cos a·sin b·cos c + sin a·sin c)·X_r + (cos a·sin b·sin c + sin a·cos c)·Y_r + cos a·cos b·Z_r + 0.12;
these are stored as a vector: D = (X_r, Y_r, Z_r, 1).
Finally, the video and radar coordinates are unified and the radar coordinates are located on the image data:
u_r = (M_multi·D)(0,0)/Z_c;
v_r = (M_multi·D)(1,0)/Z_c.
(2) Time coordinate alignment. The time alignment is completed in a multithreaded manner; the operation proceeds as follows:
(a) Since the video frame rate is higher than the radar frame rate, video frames must wait for radar data; the video thread therefore stores each processed frame in a queue.
(b) When the radar acquires a frame, the radar thread processes it and passes it to the fusion thread, while a command is sent to the video thread so that the most recently stored video frame is also passed to the fusion thread.
(c) The fusion thread fuses the newest video data with the radar data, completing the time alignment. Because the millimeter wave radar and the camera do not start working at exactly the same moment, some time difference is unavoidable in practice; with the processing above, however, the difference never exceeds one frame period and can be neglected.
This completes the alignment of the millimeter wave radar and the camera on the time coordinate.
(3) A data fusion method; and analyzing the millimeter wave radar data and the video data by using a data fusion algorithm based on the detection frame overlap ratio, and selecting more accurate data according to the overlapped part of the millimeter wave radar data and the video data to form a new detection frame.
After coordinate alignment, the two data sources are fused. The millimeter wave radar yields a target's center point, length and width, but because of calibration inaccuracy and measurement error the radar's center point deviates from the target's true center, so a detection box built from it and the target's length and width is noticeably offset. The video data yields a comparatively accurate detection box, and the final detection result is obtained from the intersection of the two. The flow of fusing radar data with video data is shown in fig. 1.
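The overlap-based selection can be sketched as below. Boxes are (x1, y1, x2, y2) pixel rectangles; the 0.3 threshold and the specific policy of keeping the video box's geometry while attaching the radar's speed and distance are illustrative assumptions, not values stated in the patent.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    iy = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    inter = ix * iy
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse(radar_box, radar_info, video_detections, thresh=0.3):
    """Match a projected radar box against video detection boxes.
    On sufficient overlap, keep the (more accurate) video geometry and
    attach the radar's speed/distance; otherwise fall back to the radar box."""
    best = max(video_detections,
               key=lambda d: iou(radar_box, d["box"]), default=None)
    if best is not None and iou(radar_box, best["box"]) >= thresh:
        return {"box": best["box"], "type": best["type"], **radar_info}
    return {"box": radar_box, "type": "unknown", **radar_info}
```

Each fused detection thus carries the video box's position and class together with the radar's kinematic measurements, which is the complementary split the method relies on.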
Fig. 2 shows a detection result obtained with the vehicle detection method of the invention; the target vehicle is framed in the image, which shows that the vehicle information is detected accurately.
Claims (3)
1. A vehicle detection method based on millimeter wave radar and video fusion is characterized in that: the method specifically comprises the following steps:
(1) spatial coordinate alignment: video data and millimeter wave radar data are read in, the video and radar coordinate systems are unified, and the radar coordinates are projected onto the image data;
(2) time coordinate alignment: the time coordinates are aligned in a multithreaded manner;
(3) data fusion: the millimeter wave radar data and the video data are analyzed with a data fusion algorithm based on detection-box overlap, and the more accurate data is selected from the overlapping part of the two to form a new detection box.
2. The vehicle detection method based on millimeter wave radar and video fusion as claimed in claim 1, wherein: the spatial coordinate alignment in the step (1) comprises the following specific steps:
firstly, the camera's intrinsic parameter matrix M_in and extrinsic parameter matrix M_out are introduced;
f_x and f_y are the normalized focal lengths along the u-axis and v-axis respectively, and (u_0, v_0) is the image center;
the extrinsic parameter matrix M_out contains the three parameters pitch, yaw and roll, with a = pitch·π/180, b = yaw·π/180, c = roll·π/180;
pitch, yaw and roll are the camera's pitch angle, yaw angle and roll angle respectively;
the two matrices are then multiplied:
M_multi = M_in * M_out, where M_multi is the product of the intrinsic and extrinsic parameter matrices;
secondly, four correction quantities of the millimeter wave radar, X_r, Y_r, Z_r and Z_c, are introduced, together with the radar's pitch angle pitch_r and mounting height h_r; the expressions are:
X_r = DistLat;
Z_r = DistLong;
Z_c = (-cos a·sin b·cos c + sin a·sin c)·X_r + (cos a·sin b·sin c + sin a·cos c)·Y_r + cos a·cos b·Z_r + 0.12;
DistLat and DistLong are the lateral and longitudinal distances respectively;
these are stored as a vector: D = (X_r, Y_r, Z_r, 1);
finally, the video and radar coordinates are unified and the radar coordinates are located on the image data:
u_r = (M_multi·D)(0,0)/Z_c;
v_r = (M_multi·D)(1,0)/Z_c, where u_r and v_r are the abscissa and ordinate of the center of the radar box.
3. The vehicle detection method based on millimeter wave radar and video fusion as claimed in claim 1, wherein: the alignment of the time coordinate is completed in a multi-thread mode in the step (2), and the specific steps are as follows:
(a) since the video frame rate is higher than that of the millimeter wave radar, video frames must wait for radar data; the video thread therefore stores each processed video frame in a queue;
(b) when the millimeter wave radar acquires a frame, the radar thread processes it and passes it to the fusion thread, while a command is sent to the video thread so that the most recently stored video frame is also passed to the fusion thread;
(c) the fusion thread fuses the newest video data with the millimeter wave radar data, completing the time alignment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111473173.1A CN114120288A (en) | 2021-12-02 | 2021-12-02 | Vehicle detection method based on millimeter wave radar and video fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114120288A true CN114120288A (en) | 2022-03-01 |
Family
ID=80366610
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111473173.1A Pending CN114120288A (en) | 2021-12-02 | 2021-12-02 | Vehicle detection method based on millimeter wave radar and video fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114120288A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115412844A (en) * | 2022-08-25 | 2022-11-29 | 北京大学 | Real-time alignment method for vehicle networking beams based on multi-mode information synaesthesia |
CN115412844B (en) * | 2022-08-25 | 2024-05-24 | 北京大学 | Multi-mode information alliance-based real-time alignment method for vehicle networking wave beams |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||