CN110515073B - Multi-radar cross-regional networking multi-target tracking identification method and device - Google Patents


Info

Publication number
CN110515073B
CN110515073B (application CN201910763489.0A)
Authority
CN
China
Prior art keywords: radar; target; target vehicle; radars; video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910763489.0A
Other languages
Chinese (zh)
Other versions
CN110515073A (en)
Inventor
许古午
章庆
史灵清
Current Assignee
Nanjing Hurys Intelligent Technology Co Ltd
Original Assignee
Nanjing Hurys Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Hurys Intelligent Technology Co Ltd filed Critical Nanjing Hurys Intelligent Technology Co Ltd
Priority to CN201910763489.0A priority Critical patent/CN110515073B/en
Priority to PCT/CN2019/113974 priority patent/WO2021031338A1/en
Publication of CN110515073A publication Critical patent/CN110515073A/en
Application granted granted Critical
Publication of CN110515073B publication Critical patent/CN110515073B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66 Radar-tracking systems; Analogous systems
    • G01S 13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/91 Radar or analogous systems specially adapted for specific applications for traffic control
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G 1/0175 Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a multi-radar cross-regional networking multi-target tracking and identification method and device, relating to the field of vehicle and road detection. A plurality of radar-and-video detection systems are arranged along a highway. When a target reaches a trigger position set by a radar, video detection is triggered and the radar data are associated with the video data, so that feature information of the target vehicle such as license plate and vehicle type can be synthesized with motion information such as speed on the expressway; the target is then fused across multiple radars to form the running track of a tracked target travelling cross-region on the expressway. By combining the advantages of radar and video detection, the method breaks through the detection-range limit of a single radar, extends the effective radar detection distance without bound, and realizes hand-over of vehicle feature and motion information; finally, the complete motion track, traffic behaviour and other information of a vehicle within the radar network can be acquired, realizing a multi-radar cross-regional networking multi-target tracking and identification process.

Description

Multi-radar cross-regional networking multi-target tracking identification method and device
Technical Field
The invention relates to the field of vehicle and road detection, and in particular to a multi-radar cross-regional networking multi-target tracking and identification method and device.
Background
An expressway is a closed road. Ensuring the life and property safety of the drivers and passengers who travel on it is a matter of national importance, and it is also a key management and improvement task for road traffic departments in the new period. To further strengthen highway administrations' control over the whole operation of the road network and improve their capacity for handling highway emergencies, the demand for whole-course monitoring of expressways has become increasingly urgent.
In recent years, high-definition video technology has matured, and most highway administrations implement whole-course monitoring management by means of digital high-definition surveillance video. However, as the number of video channels grows, finding and confirming traffic accidents and road emergencies manually in time becomes extremely labour-intensive. Deploying a high-definition video event-detection system that automatically detects events and accidents such as stopped vehicles and congestion can improve the effectiveness of the surveillance video system. The disadvantages are, first, that high-definition video is expensive, is strongly affected by harsh environments, and suffers extremely low detection accuracy in rain, fog, snow and similar weather, so its cost-benefit ratio is poor; second, using brand-new equipment requires replacing the ordinary cameras installed previously, which wastes resources.
In addition, although a high-definition camera can accurately perceive feature attributes of a vehicle such as vehicle type and license plate, its accuracy is poor when detecting motion attributes such as distance and speed. Owing to its optical characteristics, the camera is easily disturbed by the surrounding environment, and severe weather conditions such as strong light, rain, snow and fog affect its normal operation, so the acquired information is not accurate enough.
On the other hand, a modern area-monitoring system based on radar detection can well overcome the problems of video surveillance (high-definition camera) equipment: radar works around the clock and is unaffected by rain, snow, fog or darkness, so it can be widely applied in intelligent traffic management. However, radar cannot provide a visual display, and the amount of information it collects is limited; for example, it can collect traffic information such as vehicle speed or traffic flow, but not information such as license plates. Moreover, the detection distance of a single radar is limited: a cross-section microwave detector can only detect one cross-section, and a single wide-area radar can only cover one area, for example 200 metres.
Disclosure of Invention
The invention aims to provide a multi-radar cross-regional networking multi-target tracking and identification method and device. By associating radar and video detection data, feature information of a target vehicle such as license plate and vehicle type is synthesized with motion information such as speed; data fusion across multiple radars then breaks through the detection-range limit of a single radar, extends the effective radar detection distance without bound, enables hand-over of vehicle feature and motion information, and finally allows the complete motion track, traffic behaviour and other information of a vehicle in the radar network to be obtained, realizing identification and tracking of targets.
In order to achieve the above purpose, the invention provides the following technical solution: a multi-radar cross-regional networking multi-target tracking identification method, comprising the following steps:
1) uniformly installing a plurality of radar-and-video detection systems on a highway, each radar-and-video detection system comprising a video detector and a plurality of radars with equipment numbers; the detection areas of any two adjacent radars among all the radar-and-video detection systems installed on the highway partially overlap;
2) respectively establishing a radar coordinate system for each radar with the centre of its receiving antenna as the origin, the X-axis direction of the radar coordinate system being opposite to the running direction of the target vehicle; when a target vehicle enters the detection area of the first radar of the first radar-and-video detection system on the highway section, the first radar detects in real time the speed information of the target vehicle in the area and its position information in the first radar coordinate system, generates a target ID representing the target vehicle, and returns the target ID, speed information and position information to the system; a trigger position is set in the detection area of the first radar: when the target vehicle runs to the trigger position, the first radar triggers the video detector, and when the video detector judges that the number of the lane where the target vehicle is located matches the lane number uniquely corresponding to the video detector, it captures an image of the target vehicle, identifies the license plate information, and returns a capture result containing the license plate information of the target vehicle to the system;
3) matching and fusing the radar and video data of the same target vehicle by the radar-and-video detection system to generate a complete information set of the target vehicle, comprising a license-plate fusion data record of the target ID, running speed information, number of the lane where the target vehicle is located, position information of the target vehicle, and license plate number;
4) when the target vehicle runs towards the downstream radar with the adjacent equipment number, judging whether the target vehicle has entered the overlapping detection area of the two adjacent radars;
5) performing same-target matching on target vehicles entering the overlapping detection area of the two adjacent radars;
6) when a target vehicle is matched as the same target in the overlapping detection area of two adjacent radars, fusing the target ID and license plate number from the upstream radar and outputting them to the adjacent downstream radar to generate a new target information set in the downstream radar's detection area; that is, the target vehicle continues to be tracked in the adjacent downstream radar's detection area using this target information set. When the target vehicle cannot be matched as the same target in the overlapping detection area of two adjacent radars, the adjacent downstream radar generates a new radar target ID containing no license plate information, until the target vehicle runs into the detection area of the first radar of the next radar-and-video detection system, where the target ID is again matched with a license plate.
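A minimal sketch of the step-6 hand-over decision, assuming dictionary records and a simple ID counter (both illustrative conventions, not taken from the patent):

```python
from itertools import count

def hand_off(upstream_record, matched, new_ids):
    """Step 6 sketch: build the target information set that the adjacent
    downstream radar starts tracking with.

    upstream_record: dict with 'target_id' and 'plate' from the upstream radar
    matched: result of the same-target matching in step 5
    new_ids: iterator yielding fresh downstream target IDs (an assumption)
    """
    if matched:
        # Same target: the upstream target ID and license plate are fused
        # and carried into the downstream radar's detection area.
        return {"target_id": upstream_record["target_id"],
                "plate": upstream_record["plate"]}
    # No match: the downstream radar opens a new track without a plate,
    # until the next radar-and-video checkpoint re-associates ID and plate.
    return {"target_id": next(new_ids), "plate": None}

ids = count(1000)
kept = hand_off({"target_id": 17, "plate": "苏A12345"}, matched=True, new_ids=ids)
fresh = hand_off({"target_id": 17, "plate": "苏A12345"}, matched=False, new_ids=ids)
```

In the matched branch the record keeps its identity across radars, which is what makes cross-regional whole-course tracking possible.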
Further, in step 4), whether the target vehicle has entered the overlapping detection area of the two adjacent radars is judged from the 'equipment number' field in the target vehicle's real-time data and its X-axis coordinates in the coordinate systems of the two adjacent upstream and downstream radars.
Define the length of a radar's detection area as L1, the length of the overlapping detection area of two adjacent upstream and downstream radars as L2, the radar blind-zone distance as ΔL, the X-axis coordinate of the target vehicle as detected by the adjacent upstream radar as x1, and as detected by the adjacent downstream radar as x2.
When ΔL ≤ x1 ≤ L2 + ΔL and L1 - L2 ≤ x2 ≤ L1, the target vehicle is judged to be a target in the overlapping detection area.
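The membership test above can be sketched as a pure function; the variable names follow the text, while the numeric values in the usage example are invented for illustration:

```python
def in_overlap_area(x1, x2, L1, L2, dL):
    """Step 4 criterion: the target is in the overlapping detection area when
    dL <= x1 <= L2 + dL (upstream radar) and L1 - L2 <= x2 <= L1 (downstream)."""
    return (dL <= x1 <= L2 + dL) and (L1 - L2 <= x2 <= L1)

# Example: a 200 m detection area, 30 m overlap, 5 m blind zone (illustrative).
inside = in_overlap_area(x1=20.0, x2=180.0, L1=200.0, L2=30.0, dL=5.0)
outside = in_overlap_area(x1=120.0, x2=180.0, L1=200.0, L2=30.0, dL=5.0)
```

Requiring both coordinates to satisfy their bounds guards against one radar reporting a stale position while the other has already picked up the vehicle.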
Further, the overlapping detection area in step 5) is the fusion area for target matching between two adjacent radars; after the target vehicle enters the overlapping detection area, the speed difference and distance difference of the target as seen by the two adjacent radars are calculated from the longitude, latitude and speed of the target vehicle in the fusion area.
When the absolute value of the distance difference between the two adjacent radars' detections of the target vehicle is less than 2 m and the absolute value of the speed difference is less than 2 km/h, the target vehicles detected by the two adjacent radars are judged to be the same target, and that target keeps the same fusion number and license plate number in the fusion area, realizing vehicle tracking fusion between the two adjacent radars.
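A sketch of the same-target decision, assuming the per-radar readings have already been reduced to a scalar distance and a speed; the 2 m and 2 km/h thresholds come from the text:

```python
def is_same_target(dist_up_m, dist_down_m, speed_up_kmh, speed_down_kmh):
    """Step 5 criterion: the two adjacent radars are seeing the same vehicle
    when |distance difference| < 2 m and |speed difference| < 2 km/h."""
    return (abs(dist_up_m - dist_down_m) < 2.0
            and abs(speed_up_kmh - speed_down_kmh) < 2.0)

same = is_same_target(101.3, 100.2, 88.0, 89.4)       # 1.1 m and 1.4 km/h apart
different = is_same_target(101.3, 96.0, 88.0, 89.4)   # 5.3 m apart: not matched
```

Combining a position gate with a speed gate reduces false merges when two vehicles travel close together at different speeds in the overlap area.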
Furthermore, the first radar-and-video detection system is arranged at a checkpoint at the entrance end of the highway.
Furthermore, the video detector is a bullet camera, and the bullet camera is mounted together with the first radar on a checkpoint gantry pole.
In addition, the invention also discloses a multi-radar cross-regional networking multi-target tracking and identification device, comprising a radar-and-video detection system, a first judgment unit, a second judgment unit, a third judgment unit and an information fusion unit;
the radar-and-video detection system comprises a video detector and a plurality of radars with equipment numbers uniformly spaced along the highway; the detection areas of any two adjacent radars among all the radar-and-video detection systems installed on the highway partially overlap;
the radar is used for acquiring the speed information of the target vehicle in the detection area and the position information of the target vehicle in a corresponding radar coordinate system in real time;
defining a first numbered radar in a radar and video detection system as a first radar, wherein a trigger position is arranged in a detection area of the first radar;
the first judgment unit is used for judging whether the target vehicle enters the trigger position in the detection area of the first radar, and triggering and starting the video detector to record the video image of the first radar detection area section when the judgment result is that the target vehicle enters the trigger position;
the second judging unit is used for judging whether the lane number of the target vehicle in the video image matches the lane number uniquely corresponding to the video detector; when they match, the video detector captures an image of the target vehicle, identifies the license plate information, and transmits the capture result containing the license plate information of the target vehicle back to the system;
the third judging unit is used for judging whether the target vehicle enters an overlapping detection area of two adjacent radars; when the judgment result is that the target vehicle enters the overlapping detection area, performing same target matching on the target vehicle entering the overlapping detection area;
and the information fusion unit is used for fusing and outputting a target information set of the same target vehicle entering the overlapping detection area to the adjacent downstream radar, and the target vehicle continues to track in the adjacent downstream radar detection area by using the target information set.
According to the above technical solution, the multi-radar cross-regional networking multi-target tracking identification method of the invention has the following beneficial effects:
the invention discloses a multi-radar cross-regional networking multi-target tracking identification method and a multi-radar cross-regional networking multi-target tracking identification device, which are combined with the advantages of radar and video detection, and comprise a plurality of radars and a video detection system which are arranged on an expressway, when a target enters a trigger position set by the radar, video detection data is triggered, the radar data and the video data are associated, characteristic information such as license plates and vehicle types of target vehicles and motion information such as vehicle speeds can be synthesized on the expressway, and then data fusion is carried out when the target passes through the plurality of radars.
The technical solution of the invention builds on the ordinary cameras already installed, exploits the all-weather, high-reliability operation of radar, and introduces whole-course intelligent analysis by multiple radars: the moving target is detected over the whole course, the radar is used to grasp the running conditions along the entire road, and when a traffic accident is detected the camera is immediately controlled to capture an image and upload it to the background. After receiving the radar alarm information, background personnel call up the corresponding surveillance feed to confirm the authenticity of the accident and then dispatch the corresponding police force, saving a large amount of labour and material cost.
It should be understood that all combinations of the foregoing concepts and additional concepts described in greater detail below can be considered as part of the inventive subject matter of this disclosure unless such concepts are mutually inconsistent.
The foregoing and other aspects, embodiments and features of the present teachings can be more fully understood from the following description taken in conjunction with the accompanying drawings. Additional aspects of the present invention, such as features and/or advantages of exemplary embodiments, will be apparent from the description which follows, or may be learned by practice of specific embodiments in accordance with the teachings of the present invention.
Drawings
The drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures may be represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. Embodiments of various aspects of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of an embodiment of the present invention;
FIG. 2 is a state diagram of a target vehicle driving into the radar-and-video detection system of the present invention;
FIG. 3 is a flowchart of the fusion of radar single data frames in the present invention;
fig. 4 is a diagram of a radar coordinate system employed in the embodiment.
Detailed Description
In order to better understand the technical content of the present invention, specific embodiments are described below with reference to the accompanying drawings.
In this disclosure, aspects of the present invention are described with reference to the accompanying drawings, in which a number of illustrative embodiments are shown. Embodiments of the present disclosure are not intended to include all aspects of the present invention. It should be appreciated that the various concepts and embodiments described above, as well as those described in greater detail below, may be implemented in any of numerous ways, as the disclosed concepts and embodiments are not limited to any one implementation. In addition, some aspects of the present disclosure may be used alone, or in any suitable combination with other aspects of the present disclosure.
In the prior art, on the one hand, a high-definition camera installed on the expressway can accurately perceive feature attributes such as vehicle type and license plate, but its accuracy is poor when detecting motion attributes such as distance and speed, and because it is easily disturbed by the surrounding environment the acquired information is inaccurate; on the other hand, an area-monitoring system based on radar detection can solve the problems of video surveillance equipment, but cannot provide a visual display and collects only a limited amount of information. The invention therefore provides a multi-radar cross-regional networking multi-target tracking identification method which, by associating and fusing radar and video detection data, breaks through the detection-range limit of a single radar, realizes hand-over of vehicle feature and motion information, and identifies, acquires and tracks all motion tracks and traffic behaviours of a vehicle in the radar network.
The multi-radar cross-regional networking multi-target tracking identification method and device of the invention are further specifically described below with reference to the embodiments shown in the drawings.
Referring to fig. 1, a multi-radar cross-regional networking multi-target tracking identification method includes the following steps: 1) uniformly installing a plurality of radar-and-video detection systems on the expressway, each radar-and-video detection system comprising video detectors and a plurality of radars with equipment numbers; the detection areas of any two adjacent radars among all the radar-and-video detection systems installed on the highway partially overlap;
2) respectively establishing a radar coordinate system for each radar with the centre of its receiving antenna as the origin, the X-axis direction of the radar coordinate system being opposite to the running direction of the target vehicle; when a target vehicle enters the detection area of the first radar of the first radar-and-video detection system on the highway section, the first radar detects in real time the speed information of the target vehicle in the area and its position information in the first radar coordinate system, generates a target ID representing the target vehicle, and returns the target ID, speed information and position information to the system; a trigger position is set in the detection area of the first radar: when the target vehicle runs to the trigger position, the first radar triggers the video detector, and when the video detector judges that the number of the lane where the target vehicle is located matches the lane number uniquely corresponding to the video detector, it captures an image of the target vehicle, identifies the license plate information, and returns a capture result containing the license plate information of the target vehicle to the system;
3) matching and fusing the radar and video data of the same target vehicle by the radar-and-video detection system to generate a complete information set of the target vehicle, comprising a license-plate fusion data record of the target ID, running speed information, number of the lane where the target vehicle is located, position information of the target vehicle, and license plate number;
4) when the target vehicle runs towards the downstream radar with the adjacent equipment number, judging whether the target vehicle has entered the overlapping detection area of the two adjacent radars;
5) performing same-target matching on target vehicles entering the overlapping detection area of the two adjacent radars;
6) when a target vehicle is matched as the same target in the overlapping detection area of two adjacent radars, fusing the target ID and license plate number from the upstream radar and outputting them to the adjacent downstream radar to generate a new target information set in the downstream radar's detection area; that is, the target vehicle continues to be tracked in the adjacent downstream radar's detection area using this target information set. When the target vehicle cannot be matched as the same target in the overlapping detection area of two adjacent radars, the adjacent downstream radar generates a new radar target ID containing no license plate information, until the target vehicle runs into the detection area of the first radar of the next radar-and-video detection system, where the target ID is again matched with a license plate.
Step 6) is repeated in this way, achieving the goal of multi-target tracking and identification in a multi-radar cross-regional network shown in fig. 2. In a specific implementation, the first radar-and-video detection system is arranged at a checkpoint at the entrance end of the highway; the video detector is a bullet camera, which is mounted together with the first radar on the checkpoint gantry pole.
In step 2), the position information of the target vehicle detected in real time by the first radar in the first radar coordinate system consists of the X-axis and Y-axis coordinates of the target vehicle in that coordinate system. When the first radar detects that the target vehicle has reached the trigger position, it sends a trigger signal to the video detector over an RS485 serial link, and at the same time the radar-and-video detection system records the trigger signal together with the target ID sent by the first radar and the position information of the target vehicle. The trigger signal contains the information the bullet camera needs for the capture, such as the lane number and the target speed. When the video detector receives the trigger signal sent by the first radar, it makes a judgment: if the lane number of the target vehicle matches the lane number uniquely corresponding to the video detector, it takes the snapshot.
After the radar-and-video detection system has acquired all the data of the first radar and the video detector, data matching is performed as in step 2) according to the trigger position of the target vehicle, the number of the lane where it is located, and a reasonable time interval (for example 0-2 seconds). If the matching succeeds, the first-radar trigger record of the target vehicle is merged with the snapshot result, and license plate data fusion is carried out using parameters such as time, lane number and position information; this license-plate fusion data is the license plate data source for multi-radar target fusion. Finally a license-plate fusion data record is generated containing the target ID, running speed information, number of the lane where the target vehicle is located, position of the target vehicle and license plate number. After the license plate information is fused, a vehicle detected by the first radar carries a unique vehicle identifier (such as lane number and license plate number) and basic target-vehicle information (such as longitude and latitude, position relative to the radar, running speed, vehicle type and so on).
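The lane-and-time matching just described can be sketched as follows; the dictionary field names are illustrative assumptions, while the 0-2 s window is the "reasonable time interval" from the text:

```python
def fuse_trigger_and_snapshot(trigger, snapshot, max_delay_s=2.0):
    """Merge a first-radar trigger record with a camera snapshot when the lane
    numbers agree and the snapshot follows the trigger within 0-2 seconds.
    Returns the license-plate fusion record, or None if matching fails."""
    dt = snapshot["time_s"] - trigger["time_s"]
    if trigger["lane_no"] == snapshot["lane_no"] and 0.0 <= dt <= max_delay_s:
        fused = dict(trigger)               # keep target ID, speed, position
        fused["plate"] = snapshot["plate"]  # add the recognised license plate
        return fused
    return None

trigger = {"target_id": 17, "lane_no": 2, "speed_kmh": 96.5,
           "x": 85.0, "y": 3.5, "time_s": 100.0}
snapshot = {"lane_no": 2, "plate": "苏A12345", "time_s": 101.2}
record = fuse_trigger_and_snapshot(trigger, snapshot)
```

The fused record is exactly the license-plate fusion data record of step 3, which later serves as the plate source for multi-radar target fusion.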
Further, in step 4), whether the target vehicle has entered the overlapping detection area of two adjacent radars is judged from the equipment-number field of the radar in the target vehicle's real-time data and the X-axis coordinate of the target vehicle in the radar detection area. Define the length of a radar's detection area as L1, the length of the overlapping detection area of two adjacent upstream and downstream radars as L2, the radar blind-zone distance as ΔL, the X-axis coordinate of the target vehicle as detected by the adjacent upstream radar as x1, and as detected by the adjacent downstream radar as x2.
When ΔL ≤ x1 ≤ L2 + ΔL and L1 - L2 ≤ x2 ≤ L1, the target vehicle is judged to be a target in the overlapping detection area.
The overlapping detection area in step 5) is the fusion area for target matching between two adjacent radars. After the target vehicle enters the overlapping detection area, the speed difference and distance difference of the target as seen by the two adjacent radars are calculated from the longitude, latitude and speed of the target vehicle in the fusion area. When the absolute value of the distance difference between the two adjacent radars' detections of the target vehicle is less than 2 m and the absolute value of the speed difference is less than 2 km/h, the targets detected by the two adjacent radars are judged to be the same target, and that target keeps consistent fusion-number and license-plate information in the fusion area, realizing vehicle tracking fusion between the two adjacent radars; each subsequent pair of adjacent downstream radars is matched and fused in the same way, realizing multi-radar cross-regional whole-course tracking of the target vehicle.
With reference to the fusion flow chart of a single radar data frame shown in fig. 3 and the radar coordinate system diagram shown in fig. 4, the specific calculation process by which the absolute value of the distance difference between two adjacent radars' detections of the target vehicle is judged to be less than 2 m is as follows:
firstly, the system can calculate the longitude and latitude of the target vehicle according to the longitude and latitude of each radar, the true north deflection angle and the coordinates of the target vehicle in the corresponding radar coordinate system.
Setting the longitude of any radar in the radar and video detection system as lon1, the latitude as lat1 and the true north deflection angle as beta, and setting the coordinates of the target vehicle in a radar coordinate system as (x, y);
the equatorial radius of the earth is EARTH_RADIUS = 6378137 m; the arc length per degree on the earth's surface is EARTH_ARC = 6378.137 km × π/180, taken as 111199 m;
the distance between the radar and a target vehicle located in a radar monitoring area is as follows:
dist = √(x² + y²) (1)
the azimuth angle difference (radian) between the radar and a target vehicle located in a radar monitoring area is alpha:
α = arctan(x / y) (2)
the true north yaw angle (radian) of the target vehicle is a:
a = β × π / 180 + α (3)
the angle subtended at the earth's centre between the target vehicle and the radar (radians) is c:
c = dist / EARTH_RADIUS (4)
the latitude complementary angle of the target vehicle is b:
b = arccos[cos(90 - lat1) × cos c + sin(90 - lat1) × sin c × cos a] (5)
the longitude difference angle between the target vehicle and the radar is d:
d = arcsin(sin c × sin a / sin b) (6)
then:
the longitude of the target vehicle is lonM:
lonM=lon1+d (7)
the target vehicle latitude is latM:
latM=90-b (8)
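The derivation in equations (1) through (8) can be collected into a single conversion routine. Because the equation images are not preserved on this page, the spherical-triangle forms below are standard reconstructions consistent with the surrounding text (bearing a, angular distance c, colatitude b), not the patent's verbatim formulas.

```python
import math

EARTH_RADIUS = 6378137.0  # equatorial radius, metres, as stated in the text

def radar_to_latlon(x, y, lon1, lat1, beta):
    """Convert a target's radar-frame coordinates (x, y), in metres, into
    longitude/latitude (degrees), given the radar's position (lon1, lat1) in
    degrees and its true-north deflection angle beta in degrees."""
    dist = math.hypot(x, y)                    # (1) range to the target
    alpha = math.atan2(x, y)                   # (2) azimuth offset in the radar frame
    a = math.radians(beta) + alpha             # (3) true-north bearing of the target
    c = dist / EARTH_RADIUS                    # (4) angle at the earth's centre
    colat1 = math.radians(90.0 - lat1)         # radar colatitude
    b = math.acos(math.cos(colat1) * math.cos(c)
                  + math.sin(colat1) * math.sin(c) * math.cos(a))  # (5) target colatitude
    d = math.asin(math.sin(c) * math.sin(a) / math.sin(b))         # (6) longitude offset
    lonM = lon1 + math.degrees(d)              # (7)
    latM = 90.0 - math.degrees(b)              # (8)
    return lonM, latM
```

For example, a target 1000 m due north of a radar at (0°, 0°) with β = 0 comes out roughly 0.009° north of the radar.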
All targets in the fusion area of the two adjacent upstream and downstream radars are then traversed, and the distance difference between the two radars' detections of each target vehicle is calculated from each target's longitude and latitude.
For example, to calculate the distance between a first target vehicle and a second target vehicle: assume the longitude and latitude of the first target vehicle are lonA and latA, and the longitude and latitude of the second target vehicle are lonB and latB; the equatorial radius of the earth is known: EARTH_RADIUS = 6378137 m;
then, the longitude and latitude differences between the first target vehicle and the second target vehicle are respectively:
Δlon=lonA-lonB (9)
Δlat=latA-latB (10)
from the above equation, the distance S between the first target vehicle and the second target vehicle is:
S = 2 × EARTH_RADIUS × arcsin(√(sin²(Δlat/2) + cos(latA) × cos(latB) × sin²(Δlon/2))) (11)
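A sketch of the separation computation in equations (9) through (11): the patent's exact formula image is not preserved on this page, so the haversine form below is an assumed but standard reconstruction using the stated EARTH_RADIUS.

```python
import math

EARTH_RADIUS = 6378137.0  # equatorial radius, metres, as stated in the text

def target_separation(lonA, latA, lonB, latB):
    """Great-circle distance S in metres between two detected positions,
    given longitudes/latitudes in degrees (haversine reconstruction)."""
    dlon = math.radians(lonA - lonB)           # (9)
    dlat = math.radians(latA - latB)           # (10)
    h = (math.sin(dlat / 2.0) ** 2
         + math.cos(math.radians(latA)) * math.cos(math.radians(latB))
         * math.sin(dlon / 2.0) ** 2)
    return 2.0 * EARTH_RADIUS * math.asin(math.sqrt(h))   # (11)
```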
The absolute value of S is then compared with 2 m, and at the same time the speed difference V between the first target vehicle and the second target vehicle detected by the radars is compared with 2 km/h. When |S| < 2 m and |V| < 2 km/h, the first target vehicle and the second target vehicle detected respectively by the two adjacent upstream and downstream radars are judged to be the same target vehicle; vehicle tracking and data fusion between the two adjacent radars is then carried out, and the corresponding judgment processing is applied in turn to each subsequent downstream radar, realizing multi-radar cross-regional networking and thus multi-target tracking identification.
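The same-target decision and the pairwise traversal of the fusion area described above can be sketched as follows. The field names and list structure are illustrative assumptions, and `separation` stands for any function, such as one built from equations (9) to (11), that returns the distance in metres between two position fixes.

```python
def is_same_target(dist_diff_m, speed_diff_kmh):
    # thresholds from the text: |dS| < 2 m and |dV| < 2 km/h
    return abs(dist_diff_m) < 2.0 and abs(speed_diff_kmh) < 2.0

def match_targets(upstream, downstream, separation):
    """upstream/downstream: lists of fixes, each a dict with illustrative keys
    'lon', 'lat' (degrees) and 'speed' (km/h); separation(lonA, latA, lonB, latB)
    returns metres. Returns index pairs judged to be the same vehicle."""
    pairs = []
    for i, u in enumerate(upstream):
        for j, v in enumerate(downstream):
            s = separation(u["lon"], u["lat"], v["lon"], v["lat"])
            if is_same_target(s, u["speed"] - v["speed"]):
                pairs.append((i, j))
    return pairs
```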
The invention also discloses a multi-radar cross-regional networking multi-target tracking and identification device, which aims to solve the technical problems that monitoring vehicle operation on a highway with high-definition cameras alone is easily disturbed by the environment and yields inaccurate information, while a radar-only detection system cannot provide a visual display and can acquire only a limited amount of information.
The multi-radar cross-regional networking multi-target tracking and identification device comprises a plurality of radar and video detection systems uniformly arranged along the highway. Each radar and video detection system comprises a video detector and a plurality of radars with equipment numbers installed at intervals along the highway, and the detection areas of any two adjacent radars among all the radar and video detection systems installed on the highway partially overlap. The radars are used to acquire, in real time, the speed information of a target vehicle in the detection area and the position information of the target vehicle in the corresponding radar coordinate system.
defining a first numbered radar in a radar and video detection system as a first radar, wherein a trigger position is arranged in a detection area of the first radar;
the multi-radar cross-regional networking multi-target tracking and identifying device further comprises:
The first judgment unit is used to judge whether the target vehicle has entered the trigger position in the detection area of the first radar; when the judgment result is that it has, the video detector is triggered to record a video image of the first radar's detection area segment.
The second judgment unit is used to judge whether the lane number of the target vehicle in the video image matches the lane number uniquely corresponding to the video detector; when they match, the video detector captures an image of the target vehicle, identifies the license plate information, and returns the capture result containing the license plate information of the target vehicle to the system.
The third judgment unit is used to judge whether the target vehicle has entered the overlapping detection area of two adjacent radars; when it has, same-target matching is performed on the target vehicle in the overlapping detection area.
The information fusion unit is used to fuse the target information set of a vehicle matched as the same target in the overlapping detection area and output it to the adjacent downstream radar, where the target vehicle continues track tracking using that target information set.
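A minimal sketch of the information fusion unit's handoff logic, combining the matched and unmatched cases described in step 6) of the method; all names here are illustrative, not taken from the patent.

```python
_next_id = [1000]  # simple counter for new downstream track IDs (illustrative)

def new_target_id():
    _next_id[0] += 1
    return _next_id[0]

def hand_over(upstream_track, matched):
    """If the overlap match succeeded, carry the fused target ID and license
    plate into the downstream radar's track; otherwise open a new track with
    no plate until the next first radar re-associates ID and plate."""
    if matched:
        return {"target_id": upstream_track["target_id"],
                "plate": upstream_track["plate"]}
    return {"target_id": new_target_id(), "plate": None}
```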
Although the present invention has been described with reference to the preferred embodiments, it is not intended to be limited thereto. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention should be determined by the appended claims.

Claims (6)

1. A multi-radar cross-regional networking multi-target tracking identification method is characterized by comprising the following steps:
1) uniformly installing a plurality of radars and video detection systems on the expressway, wherein each radar and video detection system comprises a video detector and a plurality of radars with equipment numbers; detection areas of any two adjacent radars in all the radars and video detection systems installed on the highway are partially overlapped;
2) respectively establishing a radar coordinate system of each radar by taking the center of a receiving antenna of each radar as an origin, wherein the X-axis direction of the radar coordinate system is opposite to the running direction of a target vehicle; when a target vehicle enters a detection area of a first radar in a video detection system and a first radar on a highway section, the first radar detects speed information of the target vehicle in the area and position information of the target vehicle in a first radar coordinate system in real time, generates a target ID representing the target vehicle, and returns the target ID, the speed information and the position information representing the target vehicle to the system; the first radar is provided with a trigger position in a detection area, when a target vehicle runs to the trigger position, the first radar triggers the video detector, and when the video detector judges that the number of a lane where the target vehicle is located is matched with the number of the lane uniquely corresponding to the video detector, the video detector captures an image of the target vehicle and identifies license plate information, and returns a capture result containing the license plate information of the target vehicle to the system;
3) matching and fusing radar and video data of the same target vehicle by a radar and video detection system to generate a complete information set of the target vehicle, wherein the complete information set comprises a target ID, running speed information, a number of a lane where the target vehicle is located, position information of the target vehicle and license plate fusion data records of license plate numbers;
4) the target vehicle runs to the downstream radar numbered by the adjacent equipment, and the target vehicle is judged to enter an overlapping detection area of the two adjacent radars;
5) performing same target matching on target vehicles entering two adjacent radar overlapping detection areas;
6) when a target vehicle is matched as the same target vehicle in the overlapping detection areas of two adjacent radars, fusing the target ID and the license plate number of the upstream radar and outputting the fused target ID and the license plate number to the adjacent downstream radar to generate a new target information set in the detection area of the downstream radar, namely the target vehicle uses the target information set to continuously track in the detection area of the adjacent downstream radar; and when the target vehicle cannot be matched with the same target in the overlapping detection areas of two adjacent radars, generating a new radar target ID by the adjacent downstream radars, wherein the target ID does not contain license plate information until the target vehicle runs to the next radar and the first radar detection area of the video detection system to match the target ID with the license plate.
2. The multi-radar cross-regional networking multi-target tracking identification method according to claim 1, wherein in the step 4), the target vehicle is judged to enter the overlapping detection region of two adjacent radars through an equipment number field in real-time data of the target vehicle and X-axis coordinates of the target vehicle in coordinate systems of the two adjacent radars at the upstream and the downstream;
defining the length of a detection area of a radar as L1, the length of an overlapping detection area of two adjacent radars at the upstream and the downstream as L2, the distance of a radar blind area as delta L, the coordinate of an X axis of a target vehicle detected by the adjacent upstream radar as X1, and the coordinate of the X axis of the target vehicle detected by the adjacent downstream radar as X2;
when ΔL ≤ x1 ≤ L2 + ΔL and L1 - L2 ≤ x2 ≤ L1, the target vehicle is judged to be a target in the overlapping detection area.
3. The multi-radar cross-regional networking multi-target tracking identification method according to claim 1, wherein the overlapping detection region in the step 5) is a fusion region matched with targets between two adjacent radars, and after a target vehicle enters the radar overlapping detection region, the speed difference and the distance difference of the targets in the overlapping region of the two adjacent radars are calculated according to the longitude and latitude and the vehicle speed of the target vehicle in the fusion region;
when the absolute value of the distance difference detected by two adjacent radars to the target vehicle is less than 2m and the absolute value of the speed difference detected by the target vehicle is less than 2km/h, the target vehicles detected by the two adjacent radars are judged to be the same target, and the same target detected by the two adjacent radars keeps the same fusion number and license plate number information in the fusion area, so that vehicle tracking fusion between the two adjacent radars is realized.
4. The multi-radar cross-regional networking multi-target tracking identification method according to claim 1, wherein the first radar and video detection system is arranged at a gate of an entrance end of a highway.
5. The multi-radar cross-regional networking multi-target tracking identification method according to claim 4, wherein the video detector is a gun camera, and the gun camera and the first radar are mounted on a gun camera bayonet rod together.
6. A multi-radar cross-regional networking multi-target tracking identification device is characterized by comprising a radar and video detection system, a first judgment unit, a second judgment unit, a third judgment unit and an information fusion unit;
the radar and video detection systems are uniformly arranged along the highway; each comprises a video detector and a plurality of radars with equipment numbers installed at uniform intervals along the highway; the detection areas of any two adjacent radars among all the radar and video detection systems installed on the highway are partially overlapped;
the radar is used for acquiring the speed information of the target vehicle in the detection area and the position information of the target vehicle in a corresponding radar coordinate system in real time;
defining a first numbered radar in a radar and video detection system as a first radar, wherein a trigger position is arranged in a detection area of the first radar;
the first judgment unit is used for judging whether the target vehicle enters the trigger position in the detection area of the first radar, and triggering and starting the video detector to record the video image of the first radar detection area section when the judgment result is that the target vehicle enters the trigger position;
the second judging unit is used for judging whether the lane number of the target vehicle in the video image matches the lane number uniquely corresponding to the video detector; when the judgment result is that they match, the video detector captures an image of the target vehicle, identifies the license plate information, and returns the capture result containing the license plate information of the target vehicle to the system;
The third judging unit is used for judging whether the target vehicle enters an overlapping detection area of two adjacent radars; when the judgment result is that the target vehicle enters the overlapping detection area, performing same target matching on the target vehicle entering the overlapping detection area;
and the information fusion unit is used for fusing and outputting a target information set of the same target vehicle entering the overlapping detection area to the adjacent downstream radar, and the target vehicle continues to track in the adjacent downstream radar detection area by using the target information set.
CN201910763489.0A 2019-08-19 2019-08-19 Multi-radar cross-regional networking multi-target tracking identification method and device Active CN110515073B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910763489.0A CN110515073B (en) 2019-08-19 2019-08-19 Multi-radar cross-regional networking multi-target tracking identification method and device
PCT/CN2019/113974 WO2021031338A1 (en) 2019-08-19 2019-10-29 Multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910763489.0A CN110515073B (en) 2019-08-19 2019-08-19 Multi-radar cross-regional networking multi-target tracking identification method and device

Publications (2)

Publication Number Publication Date
CN110515073A CN110515073A (en) 2019-11-29
CN110515073B true CN110515073B (en) 2021-09-07

Family

ID=68625724

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910763489.0A Active CN110515073B (en) 2019-08-19 2019-08-19 Multi-radar cross-regional networking multi-target tracking identification method and device

Country Status (2)

Country Link
CN (1) CN110515073B (en)
WO (1) WO2021031338A1 (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110930721A (en) * 2019-12-03 2020-03-27 上海熹翼科技有限公司 Road vehicle monitoring system and method based on soft handover radar
CN111083444B (en) * 2019-12-26 2021-10-15 浙江大华技术股份有限公司 Snapshot method and device, electronic equipment and storage medium
CN113095345A (en) * 2020-01-08 2021-07-09 富士通株式会社 Data matching method and device and data processing equipment
CN111260808B (en) * 2020-01-17 2021-12-10 河北德冠隆电子科技有限公司 Free flow vehicle charging device, system and method based on multi-data fusion
CN111679271B (en) * 2020-06-15 2023-03-14 杭州海康威视数字技术股份有限公司 Target tracking method, target tracking device, monitoring system and storage medium
CN112162283A (en) * 2020-08-18 2021-01-01 重庆睿行电子科技有限公司 All-section networking traffic radar multi-target detection system
CN112071083B (en) * 2020-09-15 2022-03-01 深圳市领航城市科技有限公司 Motor vehicle license plate relay identification system and license plate relay identification method
CN112099040A (en) * 2020-09-15 2020-12-18 浙江省机电设计研究院有限公司 Whole-course continuous track vehicle tracking system and method based on laser radar network
CN112309123B (en) * 2020-10-15 2022-04-15 武汉万集信息技术有限公司 Vehicle detection method and system
CN112433203B (en) * 2020-10-29 2023-06-20 同济大学 Lane linearity detection method based on millimeter wave radar data
CN114445307A (en) * 2020-10-30 2022-05-06 高新兴科技集团股份有限公司 Method, device, MEC and medium for acquiring target information based on radar and visible light image
CN114639262B (en) * 2020-12-15 2024-02-06 北京万集科技股份有限公司 Method and device for detecting state of sensing device, computer device and storage medium
CN112731324A (en) * 2020-12-16 2021-04-30 中交第一公路勘察设计研究院有限公司 Multi-radar cross-regional networking multi-target tracking method for expressway
CN112837546A (en) * 2020-12-25 2021-05-25 山东交通学院 Expressway agglomerate fog guide laying method
CN112700647B (en) * 2020-12-29 2022-09-16 杭州海康威视数字技术股份有限公司 Method and device for monitoring vehicle driving information
CN115331480A (en) * 2021-05-10 2022-11-11 北京万集科技股份有限公司 Vehicle early warning method and device and computing equipment
CN113393675B (en) * 2021-05-24 2023-03-21 青岛海信网络科技股份有限公司 Vehicle ID determination method, device, equipment and medium
CN113419244A (en) * 2021-05-28 2021-09-21 同济大学 Vehicle track splicing method based on millimeter wave radar data
CN115410379B (en) * 2021-05-28 2024-02-13 深圳成谷科技有限公司 Matching relation determining method, device and processing equipment applied to vehicle-road cooperation
CN113792634B (en) * 2021-09-07 2022-04-15 北京易航远智科技有限公司 Target similarity score calculation method and system based on vehicle-mounted camera
CN115880791A (en) * 2021-09-26 2023-03-31 山西西电信息技术研究院有限公司 Parking management method
CN113888865B (en) * 2021-09-29 2022-11-11 青岛海信网络科技股份有限公司 Electronic device and vehicle information acquisition method
CN113900070B (en) * 2021-10-08 2022-09-27 河北德冠隆电子科技有限公司 Method, device and system for automatically drawing target data and accurately outputting radar lane
CN115206091A (en) * 2022-06-07 2022-10-18 西安电子科技大学广州研究院 Road condition and event monitoring system and method based on multiple cameras and millimeter wave radar
CN115331469A (en) * 2022-08-15 2022-11-11 北京图盟科技有限公司 Vehicle track online restoration method, device and equipment
CN116721552B (en) * 2023-06-12 2024-05-14 北京博宏科元信息科技有限公司 Non-motor vehicle overspeed identification recording method, device, equipment and storage medium
CN117197182B (en) * 2023-11-07 2024-02-27 华诺星空技术股份有限公司 Lei Shibiao method, apparatus and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021673A (en) * 2014-06-17 2014-09-03 北京易华录信息技术股份有限公司 System and method for rapidly finding congestion causing reasons by utilizing radar tracking technology
CN106448189A (en) * 2016-11-02 2017-02-22 南京慧尔视智能科技有限公司 Multilane speed measuring and block port triggering method and device based on microwaves
CN106710240A (en) * 2017-03-02 2017-05-24 公安部交通管理科学研究所 Passing vehicle tracking and speed measuring method integrating multiple-target radar and video information
CN109212513A (en) * 2018-09-29 2019-01-15 河北德冠隆电子科技有限公司 Multiple target between radar data transmitting, data fusion and localization method is continuously tracked
CN109671278A (en) * 2019-03-02 2019-04-23 安徽超远信息技术有限公司 A kind of bayonet precise positioning grasp shoot method and device based on multiple target radar

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2660624A1 (en) * 2012-04-30 2013-11-06 Traficon International N.V. A traffic monitoring device and a method for monitoring a traffic stream.
CN102768803B (en) * 2012-07-31 2014-10-22 株洲南车时代电气股份有限公司 Vehicle intelligent monitoring and recording system and method based on radar and video detection
JP5929870B2 (en) * 2013-10-17 2016-06-08 株式会社デンソー Target detection device
BR102013033041B1 (en) * 2013-12-20 2022-02-01 Perkons S/A System and method for monitoring and enforcing traffic and recording traffic violations and corresponding unmanned aerial vehicle
WO2016060384A1 (en) * 2014-10-17 2016-04-21 전자부품연구원 Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information
CN104537834A (en) * 2014-12-21 2015-04-22 北京工业大学 Intersection identification and intersection trajectory planning method for intelligent vehicle in urban road running process
CN104966400A (en) * 2015-06-11 2015-10-07 山东鼎讯智能交通股份有限公司 Integrated multi-object radar speed measurement snapshot system and method
CN106125076A (en) * 2016-07-13 2016-11-16 南京慧尔视智能科技有限公司 A kind of Anticollision Radar method for early warning being applied to urban transportation and device
CN106373394B (en) * 2016-09-12 2019-01-04 深圳尚桥交通技术有限公司 Vehicle detection method and system based on video and radar
CN107527506A (en) * 2017-09-20 2017-12-29 上海安道雷光波系统工程有限公司 Embedded radar monitors recombination optics and radar monitoring capturing system and method
CN107767668A (en) * 2017-10-19 2018-03-06 深圳市置辰海信科技有限公司 A kind of method based on the continuous real-time tracking of radar active probe vehicle
CN108957478B (en) * 2018-07-23 2021-03-26 上海禾赛科技股份有限公司 Multi-sensor synchronous sampling system, control method thereof and vehicle
CN109061600B (en) * 2018-09-28 2023-02-14 上海市刑事科学技术研究院 Target identification method based on millimeter wave radar data
CN109615870A (en) * 2018-12-29 2019-04-12 南京慧尔视智能科技有限公司 A kind of traffic detection system based on millimetre-wave radar and video


Also Published As

Publication number Publication date
WO2021031338A1 (en) 2021-02-25
CN110515073A (en) 2019-11-29

Similar Documents

Publication Publication Date Title
CN110515073B (en) Multi-radar cross-regional networking multi-target tracking identification method and device
CN108919256B (en) Four-dimensional real-scene traffic simulation vehicle overspeed whole-course tracking detection alarm system and method
CN110738846B (en) Vehicle behavior monitoring system based on radar and video group and implementation method thereof
CN108877234B (en) Four-dimensional real-scene traffic simulation vehicle illegal lane occupation tracking detection system and method
WO2021077766A1 (en) Large-area multi-target traffic event detection system and method
CN112099040A (en) Whole-course continuous track vehicle tracking system and method based on laser radar network
CN108550262B (en) Urban traffic sensing system based on millimeter wave radar
CN108961767B (en) Highway inspection chases fee alarm system based on four-dimensional outdoor traffic simulation
CN105679043A (en) 3D radar intelligent bayonet system and processing method thereof
CN104021673B (en) Radar tracking technology is utilized to find fast to block up and cause the system and method for reason
CN208335464U (en) A kind of traffic events monitoring device based on video and radar
CN110796862B (en) Highway traffic condition detection system and method based on artificial intelligence
CN110264742B (en) Vehicle-mounted information monitoring device suitable for expressway and warning method thereof
CN102592457A (en) Compound type interval velocity-measuring system based on internet of things technology and method thereof
CN113012445A (en) Intelligent traffic control system and control method thereof
CN111477010A (en) Device for intersection holographic sensing and control method thereof
RU2587662C1 (en) Automated system for detecting road traffic violation at crossroad, railway crossing or pedestrian crossing
CN216873319U (en) Thunder look fuses road monitored control system
CN104008649B (en) Radar tracking is utilized quickly to find the system and method for runway exception parking reason
CN103985255A (en) System and method for precisely judging abnormal parking on running lane by utilizing radar tracking technology
CN111477011A (en) Detection device and detection method for road intersection early warning
CN116229728A (en) Road traffic monitoring system of 4D millimeter wave radar and video sensor array
CN112034449A (en) System and method for realizing vehicle running track correction based on physical space attribute
CN107564336B (en) Signalized intersection left turn conflict early warning system and early warning method
CN202013659U (en) Intelligent safe traffic information platform based on embedded platform imaging processing and wireless communication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Multi radar cross regional network multi target tracking and recognition method and device

Granted publication date: 20210907

Pledgee: Bank of China Limited by Share Ltd. Nanjing Jiangning branch

Pledgor: NANJING HURYS INTELLIGENT TECHNOLOGY Co.,Ltd.

Registration number: Y2024980010482