CN117434531B - Method and equipment for fusing detection target characteristics of millimeter wave radar and camera - Google Patents


Info

Publication number
CN117434531B
Authority
CN
China
Prior art keywords
target
millimeter wave radar
result set
camera
Prior art date
Legal status
Active
Application number
CN202311768836.1A
Other languages
Chinese (zh)
Other versions
CN117434531A (en)
Inventor
刘建蓓
马小龙
赵斌
陈天益
马媛媛
张维
孙铸
赵翔
贺桂锡
宋浩杰
郭静
Current Assignee
CCCC First Highway Consultants Co Ltd
Original Assignee
CCCC First Highway Consultants Co Ltd
Priority date
Filing date
Publication date
Application filed by CCCC First Highway Consultants Co Ltd filed Critical CCCC First Highway Consultants Co Ltd
Priority to CN202311768836.1A priority Critical patent/CN117434531B/en
Publication of CN117434531A publication Critical patent/CN117434531A/en
Application granted granted Critical
Publication of CN117434531B publication Critical patent/CN117434531B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 Combination of radar systems with cameras
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 13/00
    • G01S 7/40 Means for monitoring or calibrating
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/806 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/60 Type of objects
    • G06V 20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/08 Detecting or categorising vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention relates to the technical field of radar, and in particular to a method and equipment for fusing the detection target features of a millimeter wave radar and a camera. The invention provides an automatic calibration method for the spatio-temporal range of a millimeter wave radar and a camera, a target uniqueness matching method, and a data feature fusion method. Automatic calibration of the overlapping detection ranges of the millimeter wave radar and the camera is realized; the problem of uniquely matching targets detected by the millimeter wave radar and the camera during data fusion is solved; and the matching and combination of millimeter wave radar and camera perception information is completed. The method enriches vehicle feature information, helps improve traffic managers' perception of holographic road data, and effectively meets current requirements for the accuracy and richness of traffic flow data in wide-range, long-distance perception.

Description

Method and equipment for fusing detection target characteristics of millimeter wave radar and camera
Technical Field
The invention relates to the technical field of radars, in particular to a method and equipment for fusing detection target characteristics of a millimeter wave radar and a camera.
Background
As information sensing equipment in the traffic field, millimeter wave radar can identify multiple targets simultaneously and penetrates environmental interference such as smoke, dust and fog, but the visual feedback it provides on detected targets is limited and abstract, making it poorly suited to giving users an intuitive display.
A camera has a strong ability to capture visual information about a target, and its detected target feature data closely matches human visual feedback in data type and dimension. However, a camera is easily disturbed by meteorological factors such as sunlight, rain and snow, which misaligns its detection results, and it performs poorly on distant targets, so it is unsuitable for standalone deployment as information sensing equipment in traffic scenes with long sight distances.
Complementing the advantages of the two types of equipment and organically fusing them at the data layer would improve traffic managers' perception of holographic road data and meet current requirements for accuracy and data richness in target feature detection over longer detection time windows; however, fusing the two types of equipment remains very difficult.
Disclosure of Invention
The invention aims to overcome the problems that, in the prior art, the millimeter wave radar and the camera each have significant defects and their fused use suffers from matching conflicts, and provides a method and equipment for fusing the detection target features of a millimeter wave radar and a camera.
In order to achieve the above object, the present invention provides the following technical solutions:
A method for fusing the detection target features of a millimeter wave radar and a camera comprises the following steps:
S1: respectively acquiring position information and visual information of vehicles on a road through a millimeter wave radar and a camera; the position information comprises the distance information of the vehicle relative to the millimeter wave radar, the vehicle type and the lane number; the visual information comprises the vehicle type and the lane number of the vehicle;
S2: performing automatic spatio-temporal range calibration according to the position information and the visual information, and determining the overlapping region of the detection ranges of the millimeter wave radar and the camera;
S3: generating a radar target detection result set and a camera target detection result set according to the detection results of the millimeter wave radar and the camera;
S4: performing detection target uniqueness matching on the radar target detection result set and the camera target detection result set according to the detection range overlapping region;
if the matching succeeds, proceeding to S5; otherwise, performing target matching processing;
S5: performing feature fusion on the radar target detection result set and the camera target detection result set to generate and output a fused detection result set.
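Taken together, steps S1 to S5 amount to a match-then-merge loop over each camera detection. The following is a minimal, self-contained Python sketch of that loop; all field names (`dist`, `lane`, `type`) and the function name are illustrative assumptions, not identifiers from the patent:

```python
def run_fusion_cycle(radar_results, camera_results, overlap):
    """Minimal end-to-end sketch of steps S1-S5 (all structures assumed).

    radar_results / camera_results: the current result sets r_j and s_j,
    given here as lists of dicts; overlap: the calibrated (lo, hi) distance
    interval from S2, in metres.  Returns the fused detection result set
    of S5.
    """
    lo, hi = overlap
    fused = []
    for cam in camera_results:                       # one camera target (S3)
        # S4: uniqueness matching by lane number and vehicle type,
        # restricted to radar targets inside the overlap region
        for rad in radar_results:
            if (lo <= rad["dist"] <= hi
                    and rad["lane"] == cam["lane"]
                    and rad["type"] == cam["type"]):
                fused.append({**rad, **cam})         # S5: feature fusion
                break
    return fused
```

A matched pair merges the radar's position fields with the camera's visual fields into one record; unmatched camera targets fall through to the separate target matching processing described below.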
As a preferable mode of the present invention, the position information further includes a running speed of the vehicle; the visual information also includes a vehicle color, a vehicle license plate color, and a vehicle license plate type of the vehicle.
As a preferred embodiment of the present invention, the step S2 includes the steps of:
S21: when visual information of a vehicle is acquired, acquiring the distance information of all vehicles within the millimeter wave radar range relative to the millimeter wave radar at the current moment;
S22: after one round of target acquisition is completed, extracting from all detection targets the minimum value l_min^i and maximum value l_max^i of each vehicle's distance relative to the millimeter wave radar, and outputting the target matching range [l_min^i, l_max^i]; i is the round number of target acquisition, and the number of targets acquired per round is a preset value;
S23: judging the size relationship between the optimized target matching range [l_min, l_max] and the target matching range [l_min^i, l_max^i]:
if l_min^i is greater than l_min, updating l_min = l_min^i; otherwise l_min is not updated;
if l_max^i is less than l_max, updating l_max = l_max^i; otherwise l_max is not updated;
where the initial value of [l_min, l_max] is [l_min^1, l_max^1];
S24: calculating the range float value of the optimized target matching range [l_min, l_max];
judging whether the range float value is smaller than a preset range float value threshold; if so, outputting the current [l_min, l_max] as the detection range overlapping region; if not, returning to S21.
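A sketch of this iterative narrowing in Python follows. The patent does not define the "range float value" precisely; here it is read as the total movement of the interval endpoints between consecutive rounds, which is one plausible interpretation. Function and variable names are illustrative:

```python
def calibrate_overlap(rounds, float_threshold):
    """Sketch of S21-S24: iteratively narrow [l_min, l_max] until it stabilises.

    `rounds` yields, per acquisition round, the list of vehicle distances (m)
    reported by the radar at the moments the camera saw a vehicle.
    """
    l_min = l_max = None
    for distances in rounds:
        li_min, li_max = min(distances), max(distances)   # S22: per-round range
        if l_min is None:
            l_min, l_max = li_min, li_max                 # initial value (round 1)
        else:
            prev = (l_min, l_max)
            if li_min > l_min:                            # S23: only shrink
                l_min = li_min
            if li_max < l_max:
                l_max = li_max
            # S24: 'range float value' taken as endpoint movement this round
            float_value = abs(l_min - prev[0]) + abs(l_max - prev[1])
            if float_value < float_threshold:
                return l_min, l_max                       # converged overlap region
    return l_min, l_max
```

Because the interval can only shrink, the loop converges once successive rounds stop tightening it.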
As a preferred embodiment of the present invention, the step S2 includes the steps of:
s21: when visual information of a vehicle is acquired, acquiring distance information of all vehicles in the millimeter wave radar range relative to the millimeter wave radar at the current moment;
S22: evenly dividing the maximum detection distance dis_max of the millimeter wave radar into a plurality of partition intervals according to a preset minimum division scale sep_min;
S23: after one round of target acquisition is completed, counting the total number of vehicles acquired by the millimeter wave radar in each partition interval according to the distance information of each vehicle relative to the millimeter wave radar;
S24: calculating, for each partition interval, the ratio of the total number of vehicles acquired by the millimeter wave radar to the total number of vehicles acquired by the camera;
sorting the ratios of the partition intervals from large to small, the first and second being the largest and second-largest partition intervals respectively; when the ratio of the largest partition interval exceeds 95% and is more than 20 percentage points higher than that of the second-largest partition interval, ending the iteration and outputting the partition interval with the largest ratio as the detection range overlapping region;
if not, returning to S21.
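This partition-interval variant can be sketched as a single histogram-and-threshold check per round; names such as `dis_max` and `sep_min` follow the text, everything else is an illustrative assumption:

```python
def calibrate_by_partition(dis_max, sep_min, radar_distances, camera_total):
    """Sketch of the partition-interval variant of S2.

    Splits [0, dis_max) into intervals of width sep_min (S22), counts radar
    detections per interval (S23), and accepts the busiest interval as the
    overlap region when its ratio to the camera total exceeds 95% and leads
    the runner-up by more than 20 percentage points (S24).
    """
    n = int(dis_max // sep_min)
    counts = [0] * n
    for d in radar_distances:                        # S23: bin each detection
        idx = min(int(d // sep_min), n - 1)
        counts[idx] += 1
    ratios = [c / camera_total for c in counts]      # S24: per-interval ratio
    order = sorted(range(n), key=lambda i: ratios[i], reverse=True)
    best, second = order[0], order[1]
    if ratios[best] > 0.95 and ratios[best] - ratios[second] > 0.20:
        return best * sep_min, (best + 1) * sep_min  # overlap interval bounds
    return None                                      # not converged: acquire more
```

Returning `None` corresponds to looping back to S21 for another acquisition round.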
As a preferred embodiment of the present invention, the step S3 includes the steps of:
S31: when the camera detects a vehicle, respectively storing the detection results output by the millimeter wave radar and the camera at the current moment as the radar target detection result set r_j and the camera target detection result set s_j; the target detection mode of the camera is single-target detection, and j is the current moment; when the camera does not detect a vehicle, returning to S31;
S32: if the detection result set r_j is not empty and the detection result set s_j is not empty, outputting the radar target detection result set r_j and the camera target detection result set s_j;
if the detection result set r_j is empty and the detection result set s_j is not empty, performing target matching logic patching processing.
As a preferred embodiment of the present invention, the target matching logic patching process includes the following steps:
S321: temporarily storing the detection result set s_j and waiting for the radar target detection result set r_j;
if the waiting time exceeds the target detection time interval of the millimeter wave radar without obtaining the radar target detection result set r_j, abandoning the current target matching operation, deleting the camera target detection result set s_j, issuing a millimeter wave radar data acquisition error alarm, and returning to S31;
if the radar target detection result set r_j is obtained before the waiting time exceeds the target detection time interval of the millimeter wave radar, proceeding to S322;
S322: judging whether the camera has obtained a new detection result set s_{j+1} during the waiting time;
if the camera obtained a detection result set s_{j+1} during the waiting time, returning to S31;
if the camera did not obtain a detection result set s_{j+1} during the waiting time, outputting the radar target detection result set r_j and the camera target detection result set s_j, and proceeding to S4.
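The wait-with-timeout behaviour of S321/S322 can be sketched as a small polling loop. The callables `poll_radar` and `poll_camera`, the return conventions, and all names are assumptions made for illustration:

```python
import time

def patch_radar_gap(camera_set, poll_radar, radar_interval_s, poll_camera):
    """Sketch of the logic-patching steps S321-S322.

    When the camera result set s_j arrives without a radar result set r_j,
    hold s_j and wait up to one radar detection interval for r_j; abandon
    the match (and raise an alarm) if the radar stays silent, or restart
    if the camera produces a newer s_{j+1} in the meantime.
    """
    deadline = time.monotonic() + radar_interval_s
    while time.monotonic() < deadline:
        radar_set = poll_radar()
        if radar_set:                     # r_j arrived within the interval
            if poll_camera():             # a newer s_{j+1} appeared: restart (S31)
                return "restart"
            return ("match", radar_set, camera_set)   # proceed to S4
        time.sleep(0.01)
    # timeout: discard s_j and flag a radar data acquisition error
    return "radar_error"
```

The "radar_error" branch is where the method discards the pending camera set and raises the acquisition alarm described above.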
As a preferred embodiment of the present invention, the step S4 includes the steps of:
S41: filtering out of the radar target detection result set r_j the vehicles for which target uniqueness matching has been completed, and outputting the radar target detection result set r_j2; j is the moment at which the radar target detection result set was generated;
S42: filtering out of the radar target detection result set r_j2 the millimeter wave radar detection data whose position information is not within the detection range overlapping region, and outputting the radar target detection result set r_j3;
S43: acquiring the lane number Lr3 and vehicle type Lt of each vehicle in the radar target detection result set r_j3, and the lane number Ls3 and vehicle type Ct of the vehicle in the camera target detection result set s_j;
S44: when Lr3 = Ls3 and Lt = Ct, outputting the current vehicle as the millimeter wave radar detection matching target and proceeding to S5; otherwise, performing target matching processing.
As a preferred embodiment of the present invention, the target matching process includes the steps of:
S441: if there is only 1 detected vehicle in the radar target detection result set r_j3, regarding that vehicle as the millimeter wave radar detection matching target and proceeding to S5;
if there is more than 1 detected vehicle in the radar target detection result set r_j3, proceeding to S442;
S442: extracting from the radar target detection result set r_j3 the position information detection result set r_pos of each vehicle, and sorting from large to small by each vehicle's distance relative to the millimeter wave radar to obtain the sorted position information detection result set r_pos_q;
S443: according to the position information detection result set r_pos_q, obtaining the vehicle in the radar target detection result set r_j3 farthest from the millimeter wave radar, and comparing whether that vehicle's type information Lt is the same as the vehicle type Ct;
if they are the same, proceeding to S5;
if they are different, abandoning the current target matching operation and issuing a millimeter wave radar data acquisition error alarm.
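Steps S41–S44 and the fallback S441–S443 can be combined into one matching function. The record layout (`id`, `dist`, `lane`, `type`) is an assumption; the control flow follows the steps above:

```python
def match_target(radar_set, camera_vehicle, overlap, matched_ids):
    """Sketch of S4 uniqueness matching with the S441-S443 fallback.

    radar_set: list of dicts with 'id', 'dist' (m), 'lane', 'type';
    camera_vehicle: dict with 'lane', 'type'; overlap: (lo, hi) in metres;
    matched_ids: ids of vehicles whose uniqueness matching is already done.
    """
    lo, hi = overlap
    # S41/S42: drop already-matched vehicles and those outside the overlap
    candidates = [v for v in radar_set
                  if v["id"] not in matched_ids and lo <= v["dist"] <= hi]
    # S43/S44: lane number and vehicle type must both agree
    for v in candidates:
        if v["lane"] == camera_vehicle["lane"] and v["type"] == camera_vehicle["type"]:
            return v                                  # unique match -> S5
    # S441: a single remaining candidate is accepted as the match
    if len(candidates) == 1:
        return candidates[0]
    # S442/S443: otherwise take the farthest candidate if its type agrees
    if candidates:
        farthest = max(candidates, key=lambda v: v["dist"])
        if farthest["type"] == camera_vehicle["type"]:
            return farthest
    return None   # abandon the match and raise a radar data error alarm
```

Using the farthest candidate reflects S443's choice of the vehicle most distant from the radar as the tiebreaker.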
As a preferred embodiment of the present invention, the step S5 includes the steps of:
and carrying out feature fusion on the position information and visual information corresponding to the millimeter wave radar detection matching target in the radar target detection result set and the camera target detection result set, outputting the position information and the visual information as a fusion detection result set, and marking the millimeter wave radar detection matching target as a vehicle with the completed target uniqueness matching.
A millimeter wave radar and camera detection target feature fusion device comprising at least one processor and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the preceding claims.
Compared with the prior art, the invention has the beneficial effects that:
1. The automatic spatio-temporal range calibration method for the millimeter wave radar and the camera achieves automatic calibration and matching within the radar detection range; installation and commissioning personnel need not consider multi-device matching and fusion requirements and can install and commission each device on its own, which greatly reduces on-site commissioning pressure, enlarges the range of possible camera installation positions, and lowers construction difficulty.
2. The target uniqueness matching method solves the problem of uniquely matching targets detected by the millimeter wave radar and the camera during data fusion under normal conditions, and realizes the matching and fusion of camera visual information with millimeter wave radar position information.
3. The target matching logic patching processing solves the problem of the millimeter wave radar and the camera detecting targets out of sync, and reasonably handles the various cases in which a camera target exists but a millimeter wave radar target does not, raising the matching success rate by more than 5%.
4. The target matching processing solves the problems of inconsistent numbers and inconsistent features of detected targets in the matching region of the millimeter wave radar and the camera; it fully exploits fusion conditions such as detection timing, vehicle type matching and lane matching to repair detected targets to the greatest extent, raising the fusion rate by more than 3%.
5. The data feature fusion method completes the matching and combination of millimeter wave radar and camera perception information, enriches vehicle feature information, improves traffic managers' perception of holographic road data, and effectively meets current requirements for the accuracy and richness of traffic flow data in wide-range, long-distance perception.
Drawings
Fig. 1 is a flow chart of a method for fusing detection target features of a millimeter wave radar and a camera according to embodiment 1 of the present invention;
fig. 2 is a schematic diagram of a process for obtaining a detection range overlapping region in a method for fusing detection target features of a millimeter wave radar and a camera according to embodiment 3 of the present invention;
fig. 3 is a system architecture diagram of a data monitoring network in a method for fusing detected target features of a millimeter wave radar and a camera according to embodiment 4 of the present invention;
fig. 4 is a graph of determining overlapping areas of detection ranges of a millimeter wave radar and a camera in the method for fusing detection target features of the millimeter wave radar and the camera according to embodiment 4 of the present invention;
fig. 5 is a flow chart of unique matching between a millimeter wave radar and a detection target of a camera in a method for fusing characteristics of detection targets of a millimeter wave radar and a camera according to embodiment 4 of the present invention;
fig. 6 is a workflow diagram of a target matching process in a method for fusing detected target features of a millimeter wave radar and a camera according to embodiment 4 of the present invention;
fig. 7 is a schematic diagram of partial millimeter wave radar detection data in a test example of a method for fusing detection target features of a millimeter wave radar and a camera according to embodiment 5 of the present invention;
fig. 8 is a schematic structural diagram of a device for fusing detection target features of a millimeter wave radar and a camera according to embodiment 6 of the present invention, which uses the method for fusing detection target features of a millimeter wave radar and a camera according to embodiment 1.
Detailed Description
The present invention will be described in further detail with reference to test examples and specific embodiments. It should not be construed that the scope of the above subject matter of the present invention is limited to the following embodiments, and all techniques realized based on the present invention are within the scope of the present invention.
Example 1
As shown in fig. 1, a method for fusing detection target features of a millimeter wave radar and a camera includes the following steps:
s1: respectively acquiring position information and visual information of vehicles on a road through a millimeter wave radar and a camera; the position information comprises distance information of the vehicle relative to the millimeter wave radar, vehicle type and lane number; the visual information includes a vehicle type of the vehicle and a lane number in which the vehicle is located.
S2: and carrying out automatic space-time range calibration according to the position information and the visual information, and determining the overlapping area of the detection ranges of the millimeter wave radar and the camera.
S3: and generating a radar target detection result set and a camera target detection result set according to the detection results of the millimeter wave radar and the camera.
S4: and carrying out detection target unique matching on the radar target detection result set and the camera target detection result set according to the detection range overlapping region.
And (5) if the matching is successful, entering S5, otherwise, performing target matching processing.
S5: and carrying out feature fusion on the radar target detection result set and the camera target detection result set to generate and output a fusion detection result set.
Example 2
This example is a specific implementation of the method described in example 1, and includes the following steps:
s1: and respectively acquiring the position information and the visual information of the vehicle on the road through the millimeter wave radar and the camera.
The position information comprises distance information, running speed, vehicle type and lane number of the vehicle relative to the millimeter wave radar; the visual information includes a vehicle color, a vehicle type, a lane number in which the vehicle is located, a vehicle license plate color, and a vehicle license plate type of the vehicle.
S2: and carrying out automatic space-time range calibration according to the position information and the visual information, and determining the overlapping area of the detection ranges of the millimeter wave radar and the camera.
S21: when the visual information of the vehicle is acquired, the distance information of all vehicles in the millimeter wave radar range at the current moment relative to the millimeter wave radar is acquired.
S22: after one round of target acquisition is completed, extracting the minimum value in the distance information of each vehicle relative to the millimeter wave radar from all detection targetsl i min And maximum valuel i max Output is target matching Rangel i min ,l i max ]The method comprises the steps of carrying out a first treatment on the surface of the Wherein i is the number of turns of target acquisition, and the target acquisition number of each turn is a preset value.
S23: judging the optimization target matching rangel min ,l max ]Matching range with target [l i min ,l i max ]Is a size relationship of (2);
if it isl i min Greater thanl min Updatingl min =l i min The method comprises the steps of carrying out a first treatment on the surface of the Otherwisel min Not updating;
if it isl i max Less thanl max Updatingl max =l i max The method comprises the steps of carrying out a first treatment on the surface of the Otherwisel max Not updating;
wherein [ thel min ,l max ]The initial value is [l 1 min ,l 1 max ]。
S24: calculating the matching range of the optimization targetl min ,l max ]Range float values of (2).
Judging whether the range floating value is smaller than the preset range floating value threshold value, if so, outputting the current [l min ,l max ]Is a detection range overlapping region; if not, the process proceeds to S21.
S3: and generating a radar target detection result set and a camera target detection result set according to the detection results of the millimeter wave radar and the camera.
S31: when the camera detects a vehicle, respectively storing the detection results output by the millimeter wave radar and the camera at the current moment as the radar target detection result set r_j and the camera target detection result set s_j; the target detection mode of the camera is single-target detection (i.e. one target corresponds to a unique detection result set), and j is the current moment.
When the camera does not detect a vehicle, returning to S31.
S32: if the detection result set r_j is not empty and the detection result set s_j is not empty, outputting the radar target detection result set r_j and the camera target detection result set s_j.
If the detection result set r_j is empty and the detection result set s_j is not empty, performing target matching logic patching processing.
The target matching logic patching process comprises the following steps:
S321: if the detection result set r_j is empty and the detection result set s_j is not empty, temporarily storing the detection result set s_j and waiting for the radar target detection result set r_j.
If the waiting time exceeds the target detection time interval of the millimeter wave radar without obtaining the radar target detection result set r_j, abandoning the current target matching operation, deleting the camera target detection result set s_j, issuing a millimeter wave radar data acquisition error alarm, and returning to S31.
If the radar target detection result set r_j is obtained before the waiting time exceeds the target detection time interval of the millimeter wave radar, proceeding to S322.
S322: judging whether the camera has obtained a new detection result set s_{j+1} during the waiting time.
If the camera obtained a detection result set s_{j+1} during the waiting time, returning to S31.
If the camera did not obtain a detection result set s_{j+1} during the waiting time, outputting the radar target detection result set r_j and the camera target detection result set s_j, and proceeding to S4.
S4: and carrying out detection target unique matching on the radar target detection result set and the camera target detection result set according to the detection range overlapping region. And (5) if the matching is successful, entering S5, otherwise, performing target matching processing.
S41: filtering out of the radar target detection result set r_j the vehicles for which target uniqueness matching has been completed, and outputting the radar target detection result set r_j2; j is the moment at which the radar target detection result set was generated.
S42: filtering out of the radar target detection result set r_j2 the millimeter wave radar detection data whose position information is not within the detection range overlapping region, and outputting the radar target detection result set r_j3.
S43: acquiring the lane number Lr3 and vehicle type Lt of each vehicle in the radar target detection result set r_j3, and the lane number Ls3 and vehicle type Ct of the vehicle in the camera target detection result set s_j.
S44: when Lr3 = Ls3 and Lt = Ct, outputting the current vehicle as the millimeter wave radar detection matching target and proceeding to S5; otherwise, performing target matching processing.
The target matching processing comprises the following steps:
S441: If the radar target detection result set r_j3 contains only 1 detected vehicle, that vehicle is regarded as the millimeter wave radar detection matching target, and S5 is entered.
If the radar target detection result set r_j3 contains more than 1 detected vehicle, S442 is entered.
S442: Extract from the radar target detection result set r_j3 the position information detection result set r_pos of each vehicle, and sort from large to small by each vehicle's distance from the millimeter wave radar to obtain the sorted position information detection result set r_pos_q.
S443: According to the position information detection result set r_pos_q, obtain the vehicle in the radar target detection result set r_j3 farthest from the millimeter wave radar, and compare whether that vehicle's type information Lt is the same as the vehicle type Ct.
If they are the same, S5 is entered.
If they differ, the current target matching operation is abandoned and a millimeter wave radar data acquisition error alarm is issued.
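The uniqueness matching of S41–S44 and the fallback of S441–S443 can be sketched as follows. This is an illustrative sketch only; the dictionary field names and the function signature are assumptions, not part of the patent.

```python
# Sketch of uniqueness matching (S41-S44) with the fallback of S441-S443.
# Field names ('id', 'dist', 'lane', 'vtype') are illustrative assumptions.
def match_targets(radar_set, camera_target, overlap, matched_ids):
    """radar_set: list of radar detections; camera_target: one camera detection;
    overlap: (lo, hi) distance bounds of the detection range overlapping region;
    matched_ids: ids of radar targets already uniquely matched."""
    # S41: drop radar targets whose uniqueness matching is already complete
    r2 = [t for t in radar_set if t['id'] not in matched_ids]
    # S42: keep only targets whose distance lies inside the overlap region
    lo, hi = overlap
    r3 = [t for t in r2 if lo <= t['dist'] <= hi]
    if not r3:
        return None  # abandon matching; raise radar data-acquisition alarm
    # S43/S44: lane number and vehicle type must both agree
    for t in r3:
        if t['lane'] == camera_target['lane'] and t['vtype'] == camera_target['vtype']:
            return t
    # S441: a single remaining radar target is accepted as the match
    if len(r3) == 1:
        return r3[0]
    # S442/S443: otherwise take the target farthest from the radar and
    # require only the vehicle type to agree
    far = max(r3, key=lambda t: t['dist'])
    if far['vtype'] == camera_target['vtype']:
        return far
    return None  # abandon matching; raise radar data-acquisition alarm
```

A design note: the farthest-target heuristic reflects the patent's assumption that the camera reads the plate of the vehicle deepest in the shared field of view.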
S5: Perform feature fusion on the radar target detection result set and the camera target detection result set to generate and output a fusion detection result set.
Perform feature fusion on the position information and visual information corresponding to the millimeter wave radar detection matching target in the radar target detection result set and the camera target detection result set, output the result as the fusion detection result set, and mark the millimeter wave radar detection matching target as a vehicle whose target uniqueness matching is complete.
Example 3
As shown in fig. 2, this embodiment is another way of obtaining the detection range overlapping region described in embodiment 2; specifically, the step S2 comprises the following steps:
S21: When the visual information of a vehicle is acquired, acquire the distance information of all vehicles within the millimeter wave radar range at the current moment relative to the millimeter wave radar.
S22: Evenly divide the maximum detection distance dis_max of the millimeter wave radar into a number of division intervals according to a preset minimum division graduation sep_min.
S23: After one round of target acquisition is complete, count the total number of vehicles acquired by the millimeter wave radar in each division interval according to each vehicle's distance from the millimeter wave radar.
S24: For each division interval, calculate the ratio of the total number of vehicles acquired by the millimeter wave radar to the total number of vehicles acquired by the camera.
Arrange the ratios of the division intervals from large to small; the first and second entries are the maximum and second-maximum division intervals. When the ratio of the maximum division interval exceeds 95% and is more than 20 percentage points higher than the ratio of the second-maximum division interval, end the iteration and output the division interval with the maximum ratio as the detection range overlapping region.
If not, the process proceeds to S21.
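A minimal sketch of the division-interval calibration above. The 95% and 20-point thresholds come from the text; the data layout and function signature are assumptions.

```python
# Illustrative sketch of S21-S24 of this embodiment: bin radar detections
# by distance and compare each bin's count against the camera's count.
def calibrate_overlap(dis_max, sep_min, radar_dists, camera_count):
    """radar_dists: distances of radar-detected vehicles in one round;
    camera_count: total vehicles the camera acquired in the same round.
    Returns the overlap interval (lo, hi) or None if not yet converged."""
    n_bins = int(dis_max // sep_min)
    counts = [0] * n_bins
    for d in radar_dists:                        # S23: count vehicles per bin
        k = min(int(d // sep_min), n_bins - 1)
        counts[k] += 1
    ratios = [c / camera_count for c in counts]  # S24: radar/camera ratio
    order = sorted(range(n_bins), key=lambda i: ratios[i], reverse=True)
    best, second = order[0], order[1]
    # Stop when the best bin's ratio exceeds 95% and leads the runner-up
    # by more than 20 percentage points.
    if ratios[best] > 0.95 and ratios[best] - ratios[second] > 0.20:
        return (best * sep_min, (best + 1) * sep_min)
    return None  # otherwise iterate with the next round of data (S21)
```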
Example 4
The embodiment is a specific implementation manner of the detection target feature fusion method based on millimeter wave radar and camera described in embodiment 1, including the following steps:
step 1, a data monitoring network is established, and as shown in fig. 3, the system architecture communication composition comprises millimeter wave radar, a camera, a road side switch, a monitoring center switch, a road side computing unit and a radar data server. The millimeter wave radar, the road side computing unit and the camera are connected with a road side switch through a network cable, the road side switch is connected with a monitoring center switch through a network cable, and the radar data server is communicated with the monitoring center switch through a network cable; the client is connected with the monitoring center switch through a network cable and is directly communicated with the radar data server to obtain detection target data subjected to target feature fusion, and the detection target data are displayed on the client.
And 2, the millimeter wave radar judges and detects the lane driving line of the road section.
And step 3, a detection data receiving module, a region matching module, a feature fusion module and a target matching processing module are established in the road side computing unit.
Step 4, the millimeter wave radar is used for acquiring the position information of the vehicle on the road, and the position information of the vehicle comprises the distance information of the vehicle relative to the radar, the running speed of the vehicle and the lane number (obtained in step 2) of the vehicle.
And 5, the camera is used for acquiring visual information of vehicles on the road, wherein the visual information of the vehicles comprises vehicle colors, vehicle types, lanes on which the vehicles run, vehicle license plates, vehicle license plate colors and vehicle license plate types.
And 6, actively acquiring acquired data of the millimeter wave radar and the camera by the detection data receiving module established in the step 3 and transmitting the acquired data to the region matching module established in the step 3.
Step 7: In the region matching module established in step 3, determine the overlapping region of the millimeter wave radar and camera detection ranges (the overlapping region is shown in fig. 4). The process comprises the following steps:
Step B1: Roughly determine the detection coverage range U1 of the millimeter wave radar from its installation height H and installation angle β.
Step B2: Roughly determine the detection coverage range U2 of the camera from its installation height H and installation angle α.
Step B3: Obtain the target detection structured data of the millimeter wave radar, which comprises position information of each target such as vehicle type information, vehicle speed information and position relative to the radar.
Step B4: Obtain the target detection structured data of the camera, which comprises visual information of each target such as vehicle type, vehicle color, license plate information and license plate color.
Step B5: When the camera target detection structured data is obtained in step B4, return the position relative to the radar of every target contained in the millimeter wave radar data detected in step B3, recording the position of the c-th target relative to the radar.
Step B6: Obtain the interval of distances of all detection targets relative to the millimeter wave radar, and find the distance of the target closest to the millimeter wave radar and the distance of the target farthest from it, thereby obtaining the matching range between this round's camera detection targets and millimeter wave radar detection targets.
Step B7: Repeat steps B3–B6; the matching range obtained in each round is smaller than the range obtained in the previous round. End when the range variation floats by less than 5 m, which yields the accurate spatio-temporal position matching range of the millimeter wave radar and the camera.
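The shrinking-range iteration of steps B3–B7 can be sketched as follows. This is illustrative: the per-round input format is an assumption, while the 5 m stop criterion comes from the text.

```python
# Sketch of the iterative matching-range refinement in steps B3-B7:
# each round's [min, max] of radar target distances tightens the bounds,
# and iteration stops once the range width changes by less than 5 metres.
def refine_matching_range(rounds, float_threshold=5.0):
    """rounds: iterable of per-round lists of target distances (metres).
    Returns the converged (lo, hi) matching range."""
    lo, hi = float('-inf'), float('inf')
    prev_width = None
    for dists in rounds:
        r_lo, r_hi = min(dists), max(dists)    # B6: nearest and farthest target
        lo, hi = max(lo, r_lo), min(hi, r_hi)  # B7: the range only ever shrinks
        width = hi - lo
        if prev_width is not None and prev_width - width < float_threshold:
            return lo, hi                      # variation floats below 5 m
        prev_width = width
    return lo, hi
```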
Step 8: Determine the target detection result data set r1 obtained by the millimeter wave radar and the target detection result data set s1 obtained by the camera, comprising the following steps:
Step C1: The detection data receiving module established in step 3 obtains the target detection result set r1 of the millimeter wave radar at time t1.
Step C2: The detection data receiving module established in step 3 obtains the target detection result set s1 of the camera at time t1; the camera's target detection mode is single-target detection, i.e. each target returns one unique detection result set.
Step C3: If, at time t1, the camera target detection result set s1 is not empty and the millimeter wave radar target detection result set r1 is not empty, i.e. the target within the detection range has been successfully detected, retain the millimeter wave radar target detection result set r1 of the corresponding moment; if at time t1 the camera target detection result set s1 is empty, repeat steps C1–C3.
If the camera target detection result set s1 is not empty but the millimeter wave radar target detection result set r1 is empty, enter the target matching processing module established in step 3 for target matching logic patching. This comprises the following steps:
Step D1: Temporarily store the camera target detection result set s1 in the program's memory space and wait for the millimeter wave radar target detection result set r1; if the target detection time interval of the millimeter wave radar is t2, the waiting time must not exceed t2.
Step D2: Following step D1, if the millimeter wave radar target detection result set r1 is obtained within time t2, continue with steps 9–12 of this embodiment. If it is not obtained within time t2, abandon the current target matching process and alert the terminal to a millimeter wave radar data acquisition error.
Step D3: If the camera obtains a new target detection result set s2 within the waiting time t2 of step D1, obtain the millimeter wave radar target detection result set r2 at that moment. If r2 is still empty, continue to temporarily store the new target detection result set s2 in the program memory space and discard the camera target detection result set s1. If r2 is not empty, execute steps 9–12, matching the camera target detection result set s2 first and then the camera target detection result set s1.
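The patching logic of steps D1–D3 can be sketched as a polling loop. This is illustrative only; poll_radar and poll_camera are hypothetical callbacks standing in for the detection data receiving module.

```python
import time

# Sketch of target matching logic patching (D1-D3): buffer the camera
# result set and wait at most t2 seconds for a radar result set.
def patch_match(camera_set, poll_radar, poll_camera, t2):
    """Returns (radar_set, camera_sets_to_match) or None on a radar error."""
    pending = [camera_set]                   # D1: temporarily store s1
    deadline = time.monotonic() + t2         # wait no longer than t2
    while time.monotonic() < deadline:
        new_cam = poll_camera()              # D3: a newer camera set may arrive
        radar = poll_radar()
        if new_cam is not None:
            if radar:                        # match the new set first, then s1
                return radar, [new_cam] + pending
            pending = [new_cam]              # radar still empty: discard stale s1
        elif radar:                          # D2: radar set obtained in time
            return radar, pending
        time.sleep(0.01)
    return None                              # D2: abandon and raise the alarm
```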
Step 9: As shown in fig. 5, take the millimeter wave radar target detection result set r1 obtained in step 8 and perform detection target uniqueness matching with the camera target detection result set s1, comprising the following steps:
Step E1: Filter out the targets in the millimeter wave radar target detection result set r1 whose target feature matching is already complete, obtaining the filtered millimeter wave radar target detection result set r2.
Step E2: According to the position information of the overlapping region U3 of the millimeter wave radar and camera detection ranges obtained in step 7, filter out the millimeter wave radar detection data in r2 whose position information is not within the U3 region, obtaining the filtered millimeter wave radar target detection result set r3.
Step E3: Obtain the vehicle driving lane information Lr3 and vehicle type information Lt in the filtered millimeter wave radar target detection result set r3, and the vehicle driving lane information Ls3 and vehicle type information Ct in the camera detection result set s1.
Step E4: Judge whether the driving lane information Lr3 and vehicle type information Lt of the millimeter wave radar detection target target_rad are the same as the vehicle driving lane information Ls3 and vehicle type information Ct in the camera detection result set s1; if so, execute step 10.
If not, enter the target matching processing module established in step 3 for target matching. As shown in fig. 6, the target matching comprises the following steps:
Step F1: If there is only 1 detected target in the filtered millimeter wave radar detection result set r3, that target may be regarded as the millimeter wave radar detection matching target target_rad; continue with steps 10–12.
Step F2: If there is more than 1 detected target in the filtered millimeter wave radar detection result set r3, acquire the vehicle position information data set r_pos from r3 and sort it by each detected target's distance from the millimeter wave radar, obtaining the sorted position information data set r_pos_q.
Step F3: According to the sorted position information data set r_pos_q, obtain the target in r3 farthest from the millimeter wave radar; this target may be identified as the millimeter wave radar detection target target_rad. If the vehicle type information Lt of target_rad is the same as the vehicle type information Ct in the camera detection result set s1, continue with steps 10–12; if they differ, abandon the current target matching process and alert the terminal to a millimeter wave radar data acquisition error.
In particular, if the filtered millimeter wave radar detection result set r3 is empty, the targets detected by the millimeter wave radar are not within the overlapping region U3 of the millimeter wave radar and camera detection ranges; in that case abandon the current target matching process and alert the terminal to a millimeter wave radar data acquisition error.
Step 10: Transmit the data set r3 corresponding to the millimeter wave radar detection matching target target_rad obtained in step 9, together with the corresponding camera target detection data set s1, to the feature fusion module of step 3 for data fusion.
Step 11: In the feature fusion module, fuse the different-dimension detection data of the millimeter wave radar detection matching target target_rad and the camera detection matching target data set s1, construct the result as the new fused data set target of the detection target, and mark the millimeter wave radar detection target as a target whose target feature matching is complete.
Step 12: Through the data monitoring network established in step 1, the road side computing unit transmits the new fused data set target constructed in step 11 via the road side switch to the monitoring center switch, which forwards it to the radar data server for use by the client.
Example 5
In this embodiment, an in-situ experiment is performed, taking an actual traffic environment test in a highway tunnel as an example. The space inside a tunnel is narrow and the millimeter wave radar reflections are strong, so the probability of the millimeter wave radar generating false targets is higher and the target detection accuracy is lower than on an open road; after adopting this method for fusing millimeter wave radar and camera detection targets, field testing and manual verification show a higher target fusion rate and fusion accuracy. The specific flow is as follows:
Step 1: Acquire 1 h of target data with the millimeter wave radar, draw a heat map from the detection scatter points, and calibrate the lane lines in the tunnel.
Step 2: Deploy software containing this method on the road side computing unit and start the fusion algorithm.
Step 3: When a vehicle appears on the road, the millimeter wave radar obtains the corresponding detection data; part of the detection data is shown in fig. 7.
Specifically, frame_id is the frame number of the millimeter wave radar, and each frame of data contains one or more detection targets; target_id is the number of the millimeter wave radar detection target; pos_x is the lateral distance of the detection target, in meters, in a coordinate system with the radar at the origin and the driving direction as the positive direction; pos_y is the longitudinal distance in the same coordinate system, in meters; size_x is the lateral dimension of the detection target, in meters; size_y is the longitudinal dimension, in meters; spd_x is the lateral movement speed, in meters per second; spd_y is the longitudinal movement speed, in meters per second; lane_num is the lane of the detection target determined in step 2; type is the vehicle type detected by the millimeter wave radar.
Step 4: Obtain the license plate detection structured data of the camera, which has the following structure:
[car_color, car_type, car_lane, car_plate, car_plate_color, car_plate_type]。
wherein car_color is the vehicle color detected by the camera; car_type is the type of vehicle detected by the camera; car_lane is the lane in which the vehicle detected by the camera travels; the car_plate is a vehicle license plate character string detected by a camera; car_plate_color is the license plate color of the vehicle detected by the camera; car_plate_type is the type of license plate of the vehicle detected by the camera.
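The radar record of step 3 and the camera record above can be modeled jointly, for example as below. The field names follow the text; the FusedTarget container is an assumption illustrating what feature fusion combines.

```python
# Illustrative data model for the radar frame fields (step 3) and the
# camera plate record (step 4). FusedTarget is an assumed container.
from dataclasses import dataclass

@dataclass
class RadarTarget:
    frame_id: int      # radar frame number
    target_id: int     # number of the detection target
    pos_x: float       # lateral distance, meters (radar at origin)
    pos_y: float       # longitudinal distance, meters
    size_x: float      # lateral dimension, meters
    size_y: float      # longitudinal dimension, meters
    spd_x: float       # lateral speed, meters per second
    spd_y: float       # longitudinal speed, meters per second
    lane_num: int      # lane from the calibrated lane lines
    type: str          # vehicle type classified by the radar

@dataclass
class CameraTarget:
    car_color: str         # vehicle color
    car_type: str          # vehicle type
    car_lane: int          # driving lane
    car_plate: str         # license plate string
    car_plate_color: str   # license plate color
    car_plate_type: str    # license plate type

@dataclass
class FusedTarget:
    radar: RadarTarget     # position features from the radar
    camera: CameraTarget   # visual features from the camera
```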
Step 5: Execute the automatic calibration module for the millimeter wave radar and camera detection ranges, calibrating the detection overlapping region of the radar and camera from the millimeter wave radar data and camera license plate data acquired in real time. The automatically calibrated overlapping region U3 is [35, 76] m; in actual use the range is enlarged to U3 = [30, 80] m.
Step 6: Execute the millimeter wave radar and camera detection target data fusion module. Taking the millimeter wave radar data of fig. 7 as an example: at millimeter wave radar detection frame number 2680164, a target numbered 344772145835934000 is obtained; at this moment the camera obtains a detection target with license plate number "Shan A XXXXX", detects that the vehicle's driving lane is lane 3, and classifies the vehicle type as a large vehicle. Traversing all millimeter wave radar detection targets of the current frame, the target numbered 344772145835934000 lies within U3, is in the same lane as the camera detection target, has the largest distance from the millimeter wave radar in the current frame, and is a large vehicle; the matching conditions are therefore satisfied, and the perception data of the camera and of the millimeter wave radar are fused.
As shown in fig. 7, at millimeter wave radar frame number 2680165 the camera obtains a detection target with license plate number "Shan C XXXXX" and detects that the vehicle's driving lane is lane 3. The millimeter wave radar detection frame contains 2 targets: one is an already matched target numbered 344772145835934000, and the other target's detected lane is lane 1. The millimeter wave radar detection data and the camera detection data conflict, so the target matching processing module is executed: the remaining target lies within U3 and within the millimeter wave radar detection range and is the only candidate, so it is considered to satisfy the matching conditions, and the perception data of the camera and of the millimeter wave radar are fused.
As shown in fig. 7, at millimeter wave radar frame number 2680166 the camera obtains a detection target with license plate number "Shan F XXXXX", detects that the vehicle's driving lane is lane 3, and classifies the vehicle type as a large vehicle. The millimeter wave radar detection frame contains 4 targets, of which 2 are already matched targets numbered 344772145835934000 and 344771539276661000. The camera detects only one target while the millimeter wave radar detects several, so the target matching processing module is executed: the targets within U3 are screened out, sorted by their distances from the millimeter wave radar, and the millimeter wave radar detection target with the largest distance is taken. That target is considered to satisfy the data matching conditions, and the perception data of the camera and of the millimeter wave radar are fused.
As shown in fig. 7, at millimeter wave radar frame number 2680167 the camera obtains a detection target with license plate number "Shan E XXXXX", detects that the vehicle's driving lane is lane 3, and classifies the vehicle type as a large vehicle. The millimeter wave radar detection frame contains 3 targets, all of which are already matched; the millimeter wave radar has no unmatched target. According to the target matching logic patching process, the camera detection target is temporarily stored and judgment is deferred until 1 further frame of millimeter wave radar data arrives. At millimeter wave radar frame number 2680168 the millimeter wave radar detection targets still contain no unmatched data; the millimeter wave radar perception is therefore in error, this round of matching is abandoned, and the error is reported.
During the test period, manual verification found that a total of 100 targets drove through. The camera detected 100 targets and the millimeter wave radar detected 126 targets. Without the target matching logic patching and target matching processing proposed by this method, the matching accuracy was 85%; with them, the matching accuracy was 95%. The method for fusing millimeter wave radar and camera detection target features can therefore effectively resolve the detection conflicts with the camera, caused by millimeter wave radar false targets, large vehicles splitting into several targets, and lane detection errors, that would otherwise prevent target matching.
Example 6
As shown in fig. 8, a device for fusing millimeter wave radar with detection target features of a camera includes at least one processor, a memory communicatively connected to the at least one processor, and at least one input/output interface communicatively connected to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method for fusing detection target features of a millimeter wave radar and a camera as described in the foregoing embodiments. The input/output interface may include a display, a keyboard, a mouse, and a USB interface for inputting and outputting data.
Those skilled in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware related to program instructions, and the foregoing program may be stored in a computer readable storage medium, where the program, when executed, performs steps including the above method embodiments; and the aforementioned storage medium includes: a mobile storage device, a Read Only Memory (ROM), a magnetic disk or an optical disk, or the like, which can store program codes.
The above-described integrated units of the invention, when implemented in the form of software functional units and sold or used as stand-alone products, may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in essence or a part contributing to the prior art in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (9)

1. The method for fusing the characteristics of the detection targets of the millimeter wave radar and the camera is characterized by comprising the following steps of:
S1: respectively acquiring position information and visual information of vehicles on a road through a millimeter wave radar and a camera; the position information comprises distance information of the vehicle relative to the millimeter wave radar, vehicle type and lane number; the visual information comprises a vehicle type and a lane number of the vehicle;
s2: performing space-time range automatic calibration according to the position information and the visual information, and determining a detection range overlapping area of the millimeter wave radar and the camera;
s3: generating a radar target detection result set and a camera target detection result set according to the detection results of the millimeter wave radar and the camera;
s4: performing detection target uniqueness matching on the radar target detection result set and the camera target detection result set according to the detection range overlapping region;
if the matching is successful, S5 is entered, otherwise, target matching processing is carried out;
s5: performing feature fusion on the radar target detection result set and the camera target detection result set to generate and output a fusion detection result set;
wherein, the step S3 comprises the following steps:
S31: when the camera detects a vehicle, respectively storing the detection results output by the millimeter wave radar and the camera at the current moment as a radar target detection result set r_j and a camera target detection result set s_j; the target detection mode of the camera being single-target detection, and j being the current moment; when the camera does not detect a vehicle, returning to S31;
S32: if the detection result set r_j is not empty and the detection result set s_j is not empty, outputting the radar target detection result set r_j and the camera target detection result set s_j;
if the detection result set r_j is empty and the detection result set s_j is not empty, performing target matching logic patching processing.
2. The method for fusing detected target features of a millimeter wave radar and a camera according to claim 1, wherein the position information further includes a traveling speed of the vehicle; the visual information also includes a vehicle color, a vehicle license plate color, and a vehicle license plate type of the vehicle.
3. The method for fusing the features of the detection targets of the millimeter wave radar and the camera according to claim 1, wherein the step S2 comprises the following steps:
S21: when visual information of a vehicle is acquired, acquiring distance information of all vehicles within the millimeter wave radar range at the current moment relative to the millimeter wave radar;
S22: after one round of target acquisition is completed, extracting from all detection targets the minimum value d_min(k) and the maximum value d_max(k) of the distance information of each vehicle relative to the millimeter wave radar, and outputting the target matching range U(k) = [d_min(k), d_max(k)]; wherein k is the round sequence number of target acquisition, and the number of targets acquired in each round is a preset value;
S23: judging the size relationship between the optimized target matching range U_opt = [D_min, D_max] and the target matching range U(k):
if d_min(k) is greater than D_min, updating D_min = d_min(k); otherwise, not updating D_min;
if d_max(k) is less than D_max, updating D_max = d_max(k); otherwise, not updating D_max;
wherein the optimized target matching range U_opt is given a preset initial value;
S24: calculating the range float value of the optimized target matching range U_opt;
judging whether the range float value is smaller than a preset range float threshold; if so, outputting the current U_opt as the detection range overlapping region; if not, proceeding to S21.
4. The method for fusing the features of the detection targets of the millimeter wave radar and the camera according to claim 1, wherein the step S2 further comprises the following steps:
S21: when visual information of a vehicle is acquired, acquiring distance information of all vehicles within the millimeter wave radar range at the current moment relative to the millimeter wave radar;
S22: evenly dividing the maximum detection distance dis_max of the millimeter wave radar into a plurality of division intervals according to a preset minimum division graduation sep_min;
S23: after one round of target acquisition is completed, counting the total number of vehicles acquired by the millimeter wave radar in each division interval according to the distance information of each vehicle relative to the millimeter wave radar;
S24: calculating, for each division interval, the ratio of the total number of vehicles acquired by the millimeter wave radar to the total number of vehicles acquired by the camera;
arranging the ratios of the division intervals from large to small, the first and second entries being the maximum division interval and the second-maximum division interval; when the ratio of the maximum division interval exceeds 95% and is more than 20 percentage points higher than the ratio of the second-maximum division interval, ending the iteration and outputting the division interval with the maximum ratio as the detection range overlapping region;
if not, proceeding to S21.
5. The method for fusing the detected target features of the millimeter wave radar and the camera according to claim 1, wherein the target matching logic patching processing comprises the following steps:
S321: temporarily storing the detection result set s_j and waiting for the radar target detection result set r_j;
if the waiting time exceeds the target detection time interval of the millimeter wave radar without the radar target detection result set r_j being obtained, abandoning the current target matching operation, deleting the camera target detection result set s_j, issuing a millimeter wave radar data acquisition error alarm, and entering S31;
if the radar target detection result set r_j is obtained before the waiting time exceeds the target detection time interval of the millimeter wave radar, entering S322;
S322: judging whether the camera obtains a new detection result set s_(j+1) within the waiting time;
if the camera obtains the detection result set s_(j+1) within the waiting time, entering S31;
if the camera does not obtain the detection result set s_(j+1) within the waiting time, outputting the radar target detection result set r_j and the camera target detection result set s_j, and entering S4.
6. The method for fusing the features of the detection targets of the millimeter wave radar and the camera according to claim 1, wherein the step S4 comprises the following steps:
S41: filter out of the radar target detection result set r_j the vehicles whose target uniqueness matching has been completed, and output the radar target detection result set r_j2; j is the moment at which the radar target detection result set is generated;
S42: filter out of the radar target detection result set r_j2 the millimeter wave radar detection data whose position information is not in the detection-range overlapping area, and output the radar target detection result set r_j3;
S43: acquire the lane number Lr3 of the lane where each vehicle in the radar target detection result set r_j3 is located and its vehicle type Lt, together with the lane number Ls3 of the lane where the camera target detection result set s_j is located and its vehicle type Ct;
S44: when Lr3 = Ls3 and Lt = Ct, output that the current vehicle is the millimeter wave radar detection matching target and proceed to S5; otherwise, perform the target matching process.
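Steps S41–S44 can be sketched as a filter-then-compare pass; the field names (`id`, `pos`, `lane`, `vtype`) and the predicate for the overlapping area are assumptions made for illustration:

```python
def radar_matching_step(r_j, s_j, in_overlap, matched_ids):
    """Sketch of claim-6 steps S41-S44.
    r_j: list of radar detections; s_j: the camera detection being matched;
    in_overlap: predicate testing whether a position lies in the
    detection-range overlapping area; matched_ids: vehicles already
    uniquely matched."""
    # S41: drop vehicles whose target uniqueness matching is already done.
    r_j2 = [v for v in r_j if v["id"] not in matched_ids]
    # S42: keep only detections located inside the overlapping area.
    r_j3 = [v for v in r_j2 if in_overlap(v["pos"])]
    # S43/S44: compare lane number and vehicle type against the camera target.
    for v in r_j3:
        if v["lane"] == s_j["lane"] and v["vtype"] == s_j["vtype"]:
            return v  # millimeter wave radar detection matching target -> S5
    return None  # fall through to the claim-7 target matching process
```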
7. The method for fusing the features of the detection targets of the millimeter wave radar and the camera according to claim 6, wherein the target matching process comprises the following steps:
S441: if the radar target detection result set r_j3 contains only 1 detected vehicle, take that vehicle as the millimeter wave radar detection matching target and proceed to S5;
if the radar target detection result set r_j3 contains more than 1 detected vehicle, proceed to S442;
S442: extract the position information of each vehicle from the radar target detection result set r_j3 into a detection result set r_pos, and sort it from large to small by each vehicle's distance from the millimeter wave radar to obtain the sorted position information detection result set r_pos_q;
S443: according to the position information detection result set r_pos_q, obtain the vehicle in the radar target detection result set r_j3 farthest from the millimeter wave radar, and compare whether that vehicle's type information Lt is the same as the vehicle type Ct;
if they are the same, proceed to S5;
if they are different, discard the current target matching operation and issue a millimeter wave radar data acquisition error alarm.
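The fallback matching of steps S441–S443 can be sketched as follows; the `dist` and `vtype` field names and the `None`-on-error convention are assumptions for illustration:

```python
def fallback_match(r_j3, s_j):
    """Sketch of the claim-7 target matching process (S441-S443).
    Returns the matching radar vehicle, or None when the operation is
    discarded with a data acquisition error."""
    if len(r_j3) == 1:
        # S441: a single candidate is taken directly as the matching target.
        return r_j3[0]
    # S442: sort candidates by distance from the radar, from large to small.
    r_pos_q = sorted(r_j3, key=lambda v: v["dist"], reverse=True)
    # S443: the farthest vehicle matches only if its type Lt agrees with
    # the camera's vehicle type Ct; otherwise raise the radar alarm.
    farthest = r_pos_q[0]
    if farthest["vtype"] == s_j["vtype"]:
        return farthest
    print("millimeter wave radar data acquisition error")
    return None
```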
8. The method for fusing the features of the detection targets of the millimeter wave radar and the camera according to claim 7, wherein the step S5 comprises:
performing feature fusion on the position information and the visual information corresponding to the millimeter wave radar detection matching target in the radar target detection result set and the camera target detection result set, outputting the result as a fused detection result set, and marking the millimeter wave radar detection matching target as a vehicle whose target uniqueness matching has been completed.
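The fusion and marking step of claim 8 can be sketched as a record merge; the dictionary representation and the `matched_ids` bookkeeping set are assumptions, since the patent leaves the fused record's layout open:

```python
def fuse(radar_target, camera_target, matched_ids):
    """Sketch of the claim-8 step S5: merge the radar position information
    with the camera visual information into one fused record, and mark the
    radar target as a vehicle whose uniqueness matching is complete."""
    fused = {
        "pos": radar_target["pos"],          # position from the radar set
        "visual": camera_target["visual"],   # visual features from the camera set
    }
    # Mark so that S41 filters this vehicle out of later radar result sets.
    matched_ids.add(radar_target["id"])
    return fused
```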
9. A millimeter wave radar and camera detection target feature fusion device, comprising at least one processor and a memory communicatively connected to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 8.
CN202311768836.1A 2023-12-21 2023-12-21 Method and equipment for fusing detection target characteristics of millimeter wave radar and camera Active CN117434531B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311768836.1A CN117434531B (en) 2023-12-21 2023-12-21 Method and equipment for fusing detection target characteristics of millimeter wave radar and camera


Publications (2)

Publication Number Publication Date
CN117434531A CN117434531A (en) 2024-01-23
CN117434531B true CN117434531B (en) 2024-03-12

Family

ID=89556933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311768836.1A Active CN117434531B (en) 2023-12-21 2023-12-21 Method and equipment for fusing detection target characteristics of millimeter wave radar and camera

Country Status (1)

Country Link
CN (1) CN117434531B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114708585A (en) * 2022-04-15 2022-07-05 电子科技大学 Three-dimensional target detection method based on attention mechanism and integrating millimeter wave radar with vision
CN114898296A (en) * 2022-05-26 2022-08-12 武汉大学 Bus lane occupation detection method based on millimeter wave radar and vision fusion
WO2023066156A1 (en) * 2021-10-18 2023-04-27 长沙中车智驭新能源科技有限公司 Visual and radar perception fusion method and terminal device
CN116894855A (en) * 2023-07-14 2023-10-17 重庆邮电大学 Intersection multi-target cross-domain tracking method based on overlapping view
CN117237692A (en) * 2023-07-13 2023-12-15 中交第四航务工程局有限公司 Multi-feature fusion system and method for automatically identifying working state of special working vehicle

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023066156A1 (en) * 2021-10-18 2023-04-27 长沙中车智驭新能源科技有限公司 Visual and radar perception fusion method and terminal device
CN114708585A (en) * 2022-04-15 2022-07-05 电子科技大学 Three-dimensional target detection method based on attention mechanism and integrating millimeter wave radar with vision
CN114898296A (en) * 2022-05-26 2022-08-12 武汉大学 Bus lane occupation detection method based on millimeter wave radar and vision fusion
CN117237692A (en) * 2023-07-13 2023-12-15 中交第四航务工程局有限公司 Multi-feature fusion system and method for automatically identifying working state of special working vehicle
CN116894855A (en) * 2023-07-14 2023-10-17 重庆邮电大学 Intersection multi-target cross-domain tracking method based on overlapping view

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on traffic target tracking algorithms based on radar and video fusion; Wei Chenyi; China Master's Theses Full-text Database; 2022-04-15; thesis abstract, chapters 3, 4, 5 *

Also Published As

Publication number Publication date
CN117434531A (en) 2024-01-23

Similar Documents

Publication Publication Date Title
CN107066953B (en) A kind of vehicle cab recognition towards monitor video, tracking and antidote and device
CN109738910A (en) A kind of curb detection method based on three-dimensional laser radar
CN110400478A (en) A kind of road condition notification method and device
CN109829351A (en) Detection method, device and the computer readable storage medium of lane information
US20220035378A1 (en) Image segmentation
CN107192994A (en) Multi-line laser radar mass cloud data is quickly effectively extracted and vehicle, lane line characteristic recognition method
CN113345237A (en) Lane-changing identification and prediction method, system, equipment and storage medium for extracting vehicle track by using roadside laser radar data
CN112686923A (en) Target tracking method and system based on double-stage convolutional neural network
CN112541475B (en) Sensing data detection method and device
CN105608417A (en) Traffic signal lamp detection method and device
CN105374208A (en) Method for reminding user of road condition and detecting state of camera, and device thereof
CN107590834A (en) A kind of road traffic accident video detecting method and system
Bock et al. On-street parking statistics using lidar mobile mapping
CN113869196B (en) Vehicle type classification method and device based on laser point cloud data multi-feature analysis
Wang et al. Road edge detection in all weather and illumination via driving video mining
CN113281782A (en) Laser radar snow point filtering method based on unmanned vehicle
CN112883936A (en) Method and system for detecting vehicle violation
CN106570487A (en) Method and device for predicting collision between objects
Revilloud et al. A lane marker estimation method for improving lane detection
CN112149471B (en) Loop detection method and device based on semantic point cloud
CN114972911A (en) Method and equipment for collecting and processing output data of automatic driving perception algorithm model
CN114926984A (en) Real-time traffic conflict collection and road safety evaluation method
CN103559502A (en) Pedestrian detection system and method based on adaptive clustering analysis
Minnikhanov et al. Detection of traffic anomalies for a safety system of smart city
Hussain et al. Multiple objects tracking using radar for autonomous driving

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant