WO2021031338A1 - Multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking - Google Patents

Multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking

Info

Publication number
WO2021031338A1
WO2021031338A1 (PCT/CN2019/113974)
Authority
WO
WIPO (PCT)
Prior art keywords
radar
target vehicle
target
video
detection area
Prior art date
Application number
PCT/CN2019/113974
Other languages
French (fr)
Chinese (zh)
Inventor
许古午
章庆
史灵清
Original Assignee
南京慧尔视智能科技有限公司
Priority date
Filing date
Publication date
Application filed by 南京慧尔视智能科技有限公司 filed Critical 南京慧尔视智能科技有限公司
Publication of WO2021031338A1 publication Critical patent/WO2021031338A1/en

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87Combinations of radar systems, e.g. primary radar and secondary radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/91Radar or analogous systems specially adapted for specific applications for traffic control
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • the invention relates to the field of vehicle road detection, in particular to a method and device for tracking and recognizing multiple targets in a multi-radar cross-regional network.
  • the expressway is a closed road. Protecting the lives and property of the drivers and passengers travelling on the expressway is a major issue of the national economy and the people's livelihood, and it is also a key management and improvement task of the road traffic department in the new era.
  • the need for expressway monitoring throughout the entire process is becoming more and more urgent.
  • although high-definition cameras can accurately perceive characteristic attributes such as vehicle model and license plate, they are less accurate when detecting motion attributes such as distance and speed.
  • due to the optical characteristics of the camera, it is extremely susceptible to interference from the surrounding environment: adverse conditions such as strong light exposure, rain, snow, and fog will affect the normal operation of the camera, making the information it collects inaccurate.
  • the purpose of the present invention is to provide a method and device for multi-radar cross-regional networking and multi-target tracking and recognition.
  • through the association of radar and video detection data, characteristic information such as the license plate and model of the target vehicle and motion information such as vehicle speed are synthesized; the subsequent multi-radar data fusion not only breaks through the limitation of the detection range of a single radar, extending the effective detection distance indefinitely, but also realizes the transfer of vehicle characteristic and motion information, finally obtaining the vehicle's complete movement trajectory, traffic behaviour and other information within the radar network to achieve target identification and tracking.
  • a multi-radar cross-regional network multi-target tracking and recognition method including the following steps:
  • the radar and video detection systems each include a video detector and several radars with equipment numbers; among all the radar and video detection systems installed on the highway, the detection areas of any two adjacent radars partially overlap;
  • the X-axis direction of each radar coordinate system is opposite to the driving direction of the target vehicle. When the target vehicle enters the detection area of the first radar of the first radar and video detection system on the highway section, the first radar detects in real time the speed of the target vehicle in the area and its position in the first radar coordinate system, generates a target ID representing the target vehicle, and sends the target ID, speed information and position information back to the system. The first radar has a trigger position in its detection area; when the target vehicle travels to the trigger position, the first radar triggers the video detector. When the video detector judges that the lane number where the target vehicle is located matches the lane number uniquely corresponding to the video detector, it captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system;
  • the radar and video detection system matches and fuses the radar and video data of the same target vehicle to generate a complete information record of the target vehicle, including the target ID, driving speed, lane number, target vehicle position and license plate number;
  • as the target vehicle drives toward the downstream radar with the adjacent equipment number, the system judges whether the target vehicle has entered the overlapping detection area of the two adjacent radars;
  • when the same target is matched, the target ID and license plate number from the upstream radar are fused and output to the adjacent downstream radar, generating a new target information set in the detection area of the downstream radar; that is, the target vehicle continues trajectory tracking using this target information set in the adjacent downstream radar's detection area. When the target vehicle cannot be matched to the same target in the overlapping detection area of the two adjacent radars, the adjacent downstream radar generates a new radar target ID.
  • in that case the new target ID does not contain license plate information until the target vehicle travels into the first radar detection area of the next radar and video detection system, where the target ID is matched with a license plate again.
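The handover described in the two items above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the record field names (`target_id`, `plate`) and the ID scheme are assumptions.

```python
import itertools

_new_ids = itertools.count(1)  # fresh radar target IDs (illustrative)

def hand_over(upstream: dict, matched: bool) -> dict:
    """Build the downstream radar's target information set.

    If the same target was matched in the overlap area, the downstream set
    inherits the upstream target ID and license plate; otherwise a new
    radar target ID is issued and the plate stays unknown until the next
    radar-and-video checkpoint re-associates it.
    """
    if matched:
        return {"target_id": upstream["target_id"], "plate": upstream["plate"]}
    return {"target_id": ("new", next(_new_ids)), "plate": None}
```

Either way the downstream radar ends up with a usable tracking record; only the matched case carries the plate forward.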
  • in step 4, it is judged that the target vehicle has entered the overlapping detection area of the two adjacent radars through the "device number" field in the target vehicle's real-time data and the X-axis coordinates of the target vehicle in the coordinate systems of the two adjacent upstream and downstream radars.
  • denote the length of the detection area of a single radar as L1;
  • denote the length of the overlapping detection area of two adjacent upstream and downstream radars as L2;
  • denote the distance of the radar blind zone as ΔL;
  • denote the X-axis coordinate of the target vehicle detected by the adjacent upstream radar as x1;
  • denote the X-axis coordinate of the target vehicle detected by the adjacent downstream radar as x2;
  • the overlap detection area in step 5 is the fusion area of target matching between two adjacent radars.
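Using the quantities just defined, the overlap-area judgment can be sketched as below. The exact geometry is not spelled out in the text, so this assumes each radar covers the range [ΔL, L1] along its own X axis, that a vehicle leaves the upstream radar's coverage at small x while sitting near the far edge (close to L1) of the downstream radar's coverage, and that device numbers increase downstream.

```python
def in_overlap(dev_up: int, dev_down: int, x1: float, x2: float,
               L1: float, L2: float, dL: float) -> bool:
    """Test whether one vehicle is inside the overlap (fusion) area.

    Combines the "device number" field check with the X-axis coordinates
    x1 (upstream radar) and x2 (downstream radar), per the assumed layout.
    """
    adjacent = dev_down == dev_up + 1          # adjacent equipment numbers
    leaving_upstream = dL <= x1 <= dL + L2     # last L2 metres of upstream coverage
    entering_downstream = L1 - L2 <= x2 <= L1  # first L2 metres of downstream coverage
    return adjacent and leaving_upstream and entering_downstream
```

With, say, L1 = 200 m, L2 = 20 m and ΔL = 5 m, a vehicle at x1 = 12 m upstream and x2 = 195 m downstream would test as inside the fusion area.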
  • the speed difference and distance difference of the target in the overlapping area of the two adjacent radars are calculated according to the latitude and longitude and the speed of the target vehicle in the fusion area;
  • when the differences are within threshold, the target vehicles detected by the two adjacent radars are judged to be the same target, and the same target detected by the two adjacent radars keeps the same fusion number and license plate number information in the fusion area, achieving vehicle tracking and fusion between the two adjacent radars.
  • the first radar and video detection system is installed at the checkpoint at the entry end of the expressway.
  • the video detector is a bullet (box) camera, and the camera and the first radar are jointly installed on the checkpoint gantry pole.
  • the present invention also discloses a multi-radar cross-regional networking and multi-target tracking recognition device, which includes a radar and video detection system, a first judgment unit, a second judgment unit, a third judgment unit and an information fusion unit;
  • there are several radar and video detection systems, arranged evenly along the expressway; each includes a video detector and radars with equipment numbers arranged evenly and at intervals along the expressway. Among all the radar and video detection systems installed on the expressway, the detection areas of any two adjacent radars partially overlap;
  • the radar is used to obtain the speed information of the target vehicle in the detection area and the position information of the target vehicle in the corresponding radar coordinate system in real time;
  • the radar with the first number in each radar and video detection system serves as the first radar, and a trigger position is set in the detection area of the first radar;
  • the first judgment unit is used to judge whether the target vehicle has entered the trigger position within the detection area of the first radar, and when the judgment result is that the target vehicle has entered the trigger position, to trigger the video detector to record a segment of video images of the first radar detection area;
  • the second judgment unit is used to judge whether the lane number where the target vehicle is located matches the lane number uniquely corresponding to the video detector in the video image; when the judgment result is that they match, the video detector captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system;
  • the third judgment unit is used to judge whether the target vehicle enters the overlapping detection area of two adjacent radars; when the judgment result is that the target vehicle enters the overlapping detection area, perform the same target matching for the target vehicle that enters the overlapping detection area;
  • the information fusion unit is used to fuse the target information set of the same target vehicle from the upstream radar when it enters the overlapping detection area and output it to the adjacent downstream radar, and the target vehicle continues trajectory tracking using this target information set in the adjacent downstream radar's detection area.
  • the multi-radar cross-regional networking multi-target tracking and recognition method and device disclosed in the present invention combine the advantages of radar and video detection. Several radar and video detection systems are set on the highway; when a target enters the trigger position set by a radar, video detection is triggered, and the radar data is associated with the video data.
  • characteristic information such as the license plate and model of the target vehicle and motion information such as vehicle speed can thus be synthesized. The data fusion performed as the target passes through multiple radars not only breaks through the limitation of a single radar's detection range, extending the effective detection distance indefinitely, but also realizes the transfer of vehicle characteristic and motion information. Finally, the vehicle's complete movement trajectory and traffic behaviour information within the radar network can be obtained, realizing the multi-radar cross-regional networking target tracking and identification process.
  • on the basis of the original ordinary cameras, the technical scheme of the present invention uses the advantage of radar's all-weather, highly reliable operation and introduces whole-course multi-radar intelligent analysis to detect moving targets along the entire route, using the radar to grasp the overall road operating conditions. When a traffic incident is detected, the camera immediately captures it and uploads it to the back office; back-office personnel receive the radar alarm information, call up the corresponding monitoring feed to confirm the authenticity of the incident, and then dispatch the corresponding police force, saving considerable manpower and material costs.
  • FIG. 1 is a flowchart of the implementation of the present invention
  • Figure 2 is a state diagram of the target vehicle entering the radar and video detection system of the present invention
  • Figure 3 is a flow chart of the fusion of a single radar data frame in the present invention.
  • Figure 4 is a diagram of the radar coordinate system used in the embodiment.
  • the present invention aims to propose a multi-radar cross-regional networking multi-target tracking and recognition method. Through the association and fusion of radar and video detection data, it breaks the limitation of the detection range of a single radar, realizes the transfer of vehicle characteristic and motion information, and recognizes, acquires and tracks all trajectories and traffic behaviours of vehicles within the radar network.
  • a multi-radar cross-regional networking multi-target tracking and recognition method includes the following steps: 1) evenly install a number of radar and video detection systems on the highway, each including a video detector and several radars with equipment numbers; among all the radar and video detection systems installed on the highway, the detection areas of any two adjacent radars partially overlap;
  • the X-axis direction of each radar coordinate system is opposite to the driving direction of the target vehicle. When the target vehicle enters the detection area of the first radar of the first radar and video detection system on the highway section, the first radar detects in real time the speed of the target vehicle in the area and its position in the first radar coordinate system, generates a target ID representing the target vehicle, and sends the target ID, speed information and position information back to the system. The first radar has a trigger position in its detection area; when the target vehicle travels to the trigger position, the first radar triggers the video detector. When the video detector judges that the lane number where the target vehicle is located matches the lane number uniquely corresponding to the video detector, it captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system;
  • the radar and video detection system matches and fuses the radar and video data of the same target vehicle to generate a complete information record of the target vehicle, including the target ID, driving speed, lane number, target vehicle position and license plate number;
  • as the target vehicle drives toward the downstream radar with the adjacent equipment number, the system judges whether the target vehicle has entered the overlapping detection area of the two adjacent radars;
  • when the same target is matched, the target ID and license plate number from the upstream radar are fused and output to the adjacent downstream radar, generating a new target information set in the detection area of the downstream radar; that is, the target vehicle continues trajectory tracking using this target information set in the adjacent downstream radar's detection area. When the target vehicle cannot be matched to the same target in the overlapping detection area of the two adjacent radars, the adjacent downstream radar generates a new radar target ID.
  • in that case the new target ID does not contain license plate information until the target vehicle travels into the first radar detection area of the next radar and video detection system, where the target ID is matched with a license plate again.
  • through steps 1) to 6), the goal of multi-target tracking and recognition in the multi-radar cross-regional network shown in Figure 2 is achieved.
  • the first radar and video detection system is installed at the checkpoint at the entry end of the expressway; the video detector is a bullet (box) camera, and the camera and the first radar are both installed on the checkpoint gantry pole.
  • the position information of the target vehicle detected in real time by the first radar in the first radar coordinate system consists of the X-axis and Y-axis coordinates of the target vehicle in that coordinate system. When the first radar detects that the target vehicle has reached the trigger position, it sends a trigger signal to the video detector via RS485 serial transmission; the radar and video detection system records the trigger signal together with the target ID and the position information of the target vehicle sent by the first radar.
  • the trigger signal contains information such as the lane number and target vehicle speed that the camera needs to capture.
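The text states only that the trigger signal carries the lane number to capture and the target vehicle speed; the wire format is not disclosed. A hypothetical fixed-layout frame for such an RS485 link might look like this (the header byte, field order and checksum are all illustrative guesses, not the patent's protocol):

```python
import struct

def encode_trigger(lane: int, target_id: int, speed_kmh: float) -> bytes:
    """Pack a hypothetical trigger frame: header, lane, target ID, speed,
    followed by a one-byte modular checksum."""
    body = struct.pack("<BBIf", 0xA5, lane, target_id, speed_kmh)
    return body + bytes([sum(body) & 0xFF])

def decode_trigger(frame: bytes):
    """Verify the checksum and unpack (lane, target_id, speed_kmh)."""
    assert (sum(frame[:-1]) & 0xFF) == frame[-1], "checksum mismatch"
    _, lane, target_id, speed = struct.unpack("<BBIf", frame[:-1])
    return lane, target_id, speed
```

Any framing that delivers the lane number and speed intact would serve the same role; the checksum simply guards against serial-line corruption.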
  • in step 2), the data are matched according to the trigger position of the target vehicle, the lane number, and a reasonable time interval (such as 0-2 seconds). When matching succeeds, the first-radar trigger signal of the target vehicle is combined with the capture result, and the license plate data of the target vehicle is then fused using parameters such as time, lane number, and position information.
  • the license plate fusion data is the license plate data source for multi-radar target fusion. Finally, a license plate fusion data record containing the target ID, driving speed, lane number, target vehicle position and license plate number can be generated.
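The radar-to-camera matching in step 2) can be sketched as below: a trigger record is associated with a capture by lane number and a bounded time gap (the 0-2 s window mentioned above). Field names are illustrative, not from the patent.

```python
def match_capture(trigger: dict, captures: list, max_gap_s: float = 2.0):
    """Fuse a first-radar trigger record with a camera capture result.

    A capture matches when it is on the same lane and arrives within
    max_gap_s seconds after the trigger; on success the license plate is
    merged into the radar record, forming the license plate fusion record.
    """
    for cap in captures:
        same_lane = cap["lane"] == trigger["lane"]
        in_window = 0.0 <= cap["t"] - trigger["t"] <= max_gap_s
        if same_lane and in_window:
            return {**trigger, "plate": cap["plate"]}
    return None  # no capture matched; the radar record keeps no plate
```

A capture on the wrong lane, or one outside the time window, leaves the radar record unfused.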
  • the record of the vehicle detected by the first radar contains the unique vehicle identification (such as lane number and license plate number) and basic information of the target vehicle (such as latitude and longitude position, position of the target vehicle relative to the radar, driving speed, model, etc.).
  • it is judged that the target vehicle has entered the overlapping detection area of the two adjacent radars through the "device number" field in the target vehicle's real-time data and the X-axis coordinates of the target vehicle in the radar detection areas;
  • denote the length of the detection area of a single radar as L1;
  • denote the length of the overlapping detection area of two adjacent upstream and downstream radars as L2;
  • denote the distance of the radar blind zone as ΔL;
  • denote the X-axis coordinate of the target vehicle detected by the adjacent upstream radar as x1;
  • denote the X-axis coordinate of the target vehicle detected by the adjacent downstream radar as x2;
  • the overlap detection area is the fusion area of target matching between two adjacent radars.
  • the speed difference and distance difference of the target in the overlapping area of the two adjacent radars are calculated based on the latitude and longitude and the speed of the target vehicle in the fusion area. When the absolute value of the distance difference for the target vehicle detected by the two adjacent radars is less than 2 m and the absolute value of the speed difference is less than 2 km/h, the target vehicles detected by the two adjacent radars are determined to be the same target; the same target then keeps the same fusion number and license plate number information in the fusion area, achieving vehicle tracking and fusion between the two adjacent radars.
  • each subsequent pair of adjacent downstream radars is matched and fused in the same way, achieving multi-radar cross-regional tracking of the target vehicle.
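The same-target decision above reduces to two threshold tests. A minimal sketch, assuming the separation between the two radars' position estimates has already been computed in metres from their latitude/longitude:

```python
def is_same_target(sep_m: float, v1_kmh: float, v2_kmh: float) -> bool:
    """Same-target decision in the fusion area.

    sep_m is the distance between the two radars' position estimates for
    the candidate target; the 2 m and 2 km/h thresholds are the ones
    stated in the description.
    """
    return abs(sep_m) < 2.0 and abs(v1_kmh - v2_kmh) < 2.0
```

Both conditions must hold; a match on position alone (or speed alone) is not enough to merge the two tracks.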
  • the system will calculate the latitude and longitude of the target vehicle based on the latitude and longitude of each radar, the deflection angle of true north, and the coordinates of the target vehicle in the corresponding radar coordinate system.
  • EARTH_RADIUS = 6378137 m
  • the distance between the radar and the target vehicle located in the radar monitoring area is:
  • the azimuth angle difference (radian) between the radar and the target vehicle in the radar monitoring area is ⁇ :
  • the true north deflection angle (radian) of the target vehicle is a:
  • the degree (radian) difference between the target vehicle and the radar relative to the center of the sphere on the earth's surface is c:
  • the complementary latitude of the target vehicle is b:
  • the angle between the target vehicle and the radar in the longitude direction is d:
  • the longitude of the target vehicle is lonM:
  • the latitude of the target vehicle is latM:
  • the latitude and longitude difference between the first target vehicle and the second target vehicle are:
  • the distance S between the first target vehicle and the second target vehicle is:
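The items above name the intermediate quantities (radar-to-target range, azimuth difference θ, true-north bearing a, angular distance c, and so on), but the formulas themselves are images in the source and do not survive extraction. A sketch consistent with those quantities, using the standard spherical destination-point formula and the haversine distance, might look like this (the mapping of radar-frame x, y to bearing is an assumption):

```python
import math

EARTH_RADIUS = 6378137.0  # metres, as given in the description

def target_lat_lon(radar_lat, radar_lon, north_deflection_deg, x, y):
    """Project a target's radar-frame coordinates (x, y) to lat/lon."""
    rng = math.hypot(x, y)                           # radar-to-target distance
    theta = math.atan2(y, x)                         # azimuth in the radar frame
    a = math.radians(north_deflection_deg) + theta   # true-north bearing
    c = rng / EARTH_RADIUS                           # angular distance (radians)
    lat1, lon1 = math.radians(radar_lat), math.radians(radar_lon)
    lat2 = math.asin(math.sin(lat1) * math.cos(c) +
                     math.cos(lat1) * math.sin(c) * math.cos(a))
    lon2 = lon1 + math.atan2(math.sin(a) * math.sin(c) * math.cos(lat1),
                             math.cos(c) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance S between two targets, in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    h = (math.sin((p2 - p1) / 2) ** 2 +
         math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS * math.asin(math.sqrt(h))
```

For example, projecting a target 1000 m due north of a radar and then measuring the distance back with `distance_m` recovers roughly 1000 m, which is the consistency the fusion step relies on.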
  • the multi-radar cross-regional networking multi-target tracking and recognition device includes several radar and video detection systems arranged evenly along the highway. Each radar and video detection system includes a video detector and a number of radars with device numbers arranged evenly and at intervals along the highway; the detection areas of any two adjacent radars among all the radar and video detection systems installed on the highway partially overlap. The radar is used to obtain in real time the speed information of the target vehicle in the detection area and the position information of the target vehicle in the corresponding radar coordinate system;
  • the radar with the first number in each radar and video detection system serves as the first radar, and a trigger position is set in the detection area of the first radar;
  • the multi-radar cross-regional networking multi-target tracking and recognizing device further includes:
  • the first judgment unit is used to judge whether the target vehicle has entered the trigger position within the detection area of the first radar, and when the judgment result is that the target vehicle has entered the trigger position, to trigger the video detector to record a segment of video images of the first radar detection area.
  • the second judgment unit is used to judge whether the lane number of the target vehicle is consistent with the lane number uniquely corresponding to the video detector in the video image; when the judgment result is that they match, the video detector captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system.
  • the third judgment unit is used to judge whether the target vehicle enters the overlapping detection area of two adjacent radars; when the judgment result is that the target vehicle enters the overlapping detection area, perform the same target matching for the target vehicle entering the overlapping detection area.
  • the information fusion unit is used to fuse the target information set of the same target vehicle in the upstream radar that enters the overlapping detection area and output it to the adjacent downstream radar.
  • the target vehicle continues trajectory tracking using the target information set in the adjacent downstream radar's detection area.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

A multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking. The apparatus comprises several radars and a video detection system, which are arranged on an expressway. When an object enters trigger positions set by the radars, video detection data is triggered; by associating radar data with the video data, characteristic information, such as a license plate and the model of a target vehicle, and motion information, such as a vehicle speed, can be synthesized on the expressway; and a running trajectory of a tracked object running on the expressway across regions is formed by means of multi-radar data fusion. In combination with the advantages of the radars and video detection, a multiple object tracking and identification process based on multi-radar cross-regional networking is realized.

Description

Multi-radar cross-regional networking multi-target tracking and identification method and device

Technical Field
The invention relates to the field of vehicle road detection, in particular to a method and device for multi-target tracking and identification in a multi-radar cross-regional network.
Background Art
The expressway is a closed road. Protecting the lives and property of the drivers and passengers travelling on the expressway is a major issue of the national economy and the people's livelihood, and it is also a key management and improvement task of the road traffic department in the new era. In order to further enhance expressway management units' control over the entire operation of the expressway network and strengthen the emergency response capability for expressway incidents, the need for whole-course expressway monitoring is becoming more and more urgent.
In recent years, high-definition video technology has matured, and most highway management units have adopted digital high-definition surveillance video to monitor and manage the whole process. However, as the number of video channels increases, relying on manual work to find and confirm traffic accidents and road emergencies in time becomes extremely labour-intensive; setting up a high-definition video event detection system to automatically detect events and accidents such as stopped vehicles and traffic congestion can also improve the usefulness of the surveillance video system. The drawbacks are that, first, high-definition video equipment is relatively expensive and greatly affected by harsh environments, with extremely low detection accuracy in rain, fog, snow and similar weather, so the cost-benefit ratio is poor; second, using brand-new equipment requires replacing the previously deployed ordinary cameras, wasting resources.
In addition, although high-definition cameras can accurately perceive characteristic attributes such as vehicle model and license plate, they are less accurate when detecting motion attributes such as distance and speed. Moreover, due to the optical characteristics of the camera, they are extremely susceptible to interference from the surrounding environment: adverse conditions such as strong light exposure, rain, snow and fog will affect the normal operation of the camera, making the information it collects inaccurate.
On the other hand, modern area monitoring systems based on radar detection can solve the problems of video surveillance (high-definition camera) equipment: radar can work around the clock and is not affected by rain, snow, fog or darkness, and has therefore been widely applied in the field of intelligent traffic management. However, radar cannot provide a visual display and collects a limited amount of information; for example, it can only collect traffic information such as vehicle speed or traffic volume, not information such as license plates. In addition, the detection range of a single radar is limited: a cross-section microwave detector can only detect one cross-section, and a single wide-area radar can only detect one area, for example 200 metres.
Summary of the Invention
The purpose of the present invention is to provide a multi-radar cross-regional networking multi-target tracking and identification method and device. Through the association of radar and video detection data, characteristic information such as the license plate and model of the target vehicle and motion information such as vehicle speed are synthesized; then, through multi-radar data fusion, the limitation of a single radar's detection range is broken and the effective detection distance is extended indefinitely, while vehicle characteristic and motion information is transferred between radars. Finally, the vehicle's complete movement trajectory, traffic behaviour and other information within the radar network can be obtained, achieving target identification and tracking.
为达成上述目的,本发明提出如下技术方案:一种多雷达跨区域组网多目标跟踪识别方法,包括如下步骤:In order to achieve the above objective, the present invention proposes the following technical solution: a multi-radar cross-regional network multi-target tracking and recognition method, including the following steps:
1)在高速公路上均匀安装若干雷达及视频检测系统,所述雷达及视频检测系统包括一视频检测器和若干设置有设备编号的雷达,所述高速公路上安装的所有雷达及视频检测系统中任意相邻两台雷达的检测区域部分重叠;1) Install a number of radar and video detection systems uniformly along the expressway, each radar and video detection system comprising a video detector and several radars assigned device numbers; the detection areas of any two adjacent radars among all the radar and video detection systems installed on the expressway partially overlap;
2)分别以各雷达的接收天线中心为原点建立各雷达的雷达坐标系,雷达坐标系的X轴方向与目标车辆行驶方向相反;当目标车辆进入高速公路路段上首个雷达及视频检测系统中第一雷达的检测区域时,第一雷达实时检测区域内目标车辆的速度信息和其在第一雷达坐标系中的位置信息,生成表示目标车辆的目标ID,并将表示目标车辆的目标ID、速度信息和位置信息回传给系统;所述第一雷达在其检测区域内设置有触发位置,当目标车辆行驶至触发位置时第一雷达触发视频检测器,视频检测器判断目标车辆所在的车道编号与视频检测器唯一对应的车道编号匹配一致时,抓拍目标车辆图像并识别车牌信息,并将包含目标车辆车牌信息的抓拍结果回传给系统;2) Establish a radar coordinate system for each radar with the center of its receiving antenna as the origin, the X-axis of the radar coordinate system pointing opposite to the target vehicle's direction of travel. When the target vehicle enters the detection area of the first radar of the first radar and video detection system on the expressway section, the first radar detects in real time the speed of the target vehicle and its position in the first radar's coordinate system, generates a target ID representing the target vehicle, and returns the target ID, speed information and position information to the system. The first radar has a trigger position set within its detection area; when the target vehicle reaches the trigger position, the first radar triggers the video detector. When the video detector determines that the lane number of the target vehicle matches the lane number uniquely associated with the video detector, it captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system;
3)雷达及视频检测系统对同一目标车辆的雷达和视频数据匹配融合,生成目标车辆的完整信息集合,包含目标ID、行驶速度信息、所在车道编号、目标车辆位置信息和车牌号的车牌融合数据记录;3) The radar and video detection system matches and fuses the radar and video data of the same target vehicle to generate a complete information set for the target vehicle: a plate-fused data record containing the target ID, driving speed, lane number, target vehicle position and license plate number;
4)目标车辆向相邻设备编号的下游雷达行驶,判断目标车辆进入相邻两台雷达的重叠检测区域;4) As the target vehicle travels toward the downstream radar with the adjacent device number, judge whether the target vehicle has entered the overlapping detection area of the two adjacent radars;
5)对进入相邻两台雷达重叠检测区域的目标车辆做同一目标匹配;5) Perform the same target matching on the target vehicle entering the overlapping detection area of two adjacent radars;
6)当目标车辆在相邻两台雷达重叠检测区域内匹配为同一目标车辆时,将上游雷达的目标ID和车牌号融合并输出给相邻下游雷达,生成在下游雷达检测区域内新的目标信息集合,即目标车辆在相邻下游雷达检测区域内使用该目标信息集合继续做轨迹跟踪;当目标车辆在相邻两台雷达重叠检测区域无法匹配为同一目标,则相邻下游雷达生成新的雷达目标ID,该目标ID不包含车牌信息,直至目标车辆行驶至下一雷达及视频检测系统的第一雷达检测区域做目标ID与车牌的匹配。6) When the target vehicle is matched as the same vehicle within the overlapping detection area of the two adjacent radars, the target ID and license plate number from the upstream radar are fused and output to the adjacent downstream radar, generating a new target information set within the downstream radar's detection area; that is, the target vehicle continues to be trajectory-tracked in the adjacent downstream radar's detection area using this target information set. When the target vehicle cannot be matched as the same target in the overlapping detection area, the adjacent downstream radar generates a new radar target ID containing no license plate information, until the target vehicle reaches the first-radar detection area of the next radar and video detection system, where the target ID is matched with a license plate again.
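The hand-over rule in step 6) can be sketched as follows. This is an illustrative reading of the text, not code from the patent; all names (Track, hand_over, the ID counter) are hypothetical.

```python
# Sketch of the step-6 target-ID hand-over between adjacent radars:
# on a successful same-target match, the downstream radar inherits the
# upstream ID and plate; otherwise it opens a fresh, plate-less track.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Track:
    target_id: int
    plate: Optional[str]  # None until matched with a video capture

_next_id = 1000  # hypothetical ID counter for newly opened tracks

def new_track() -> Track:
    """Downstream radar opens a new track with no plate information."""
    global _next_id
    _next_id += 1
    return Track(target_id=_next_id, plate=None)

def hand_over(upstream: Track, matched: bool) -> Track:
    """Return the track the downstream radar should continue with."""
    if matched:
        # Same vehicle in the overlap zone: keep upstream ID and plate.
        return Track(target_id=upstream.target_id, plate=upstream.plate)
    # No match: fresh ID; the plate stays unknown until the vehicle
    # reaches the next radar-plus-video checkpoint.
    return new_track()
```

The plate-less branch mirrors the text: such a track only regains a plate at the next first-radar detection area.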
进一步的,所述步骤4)中通过目标车辆实时数据中“设备编号”字段和目标车辆在上、下游相邻两台雷达坐标系中的X轴坐标,判断目标车辆进入相邻两台雷达的重叠检测区域;Further, in step 4), whether the target vehicle has entered the overlapping detection area of two adjacent radars is judged from the "device number" field in the target vehicle's real-time data and the target vehicle's X-axis coordinates in the coordinate systems of the two adjacent upstream and downstream radars;
定义雷达的检测区域长度为L1,上、下游相邻两台雷达的重叠检测区域长度为L2,雷达盲区距离为△L,相邻上游雷达检测到目标车辆的X轴坐标为x1,相邻下游雷达检测到目标车辆的X轴坐标为x2;Define the length of a radar's detection area as L1, the length of the overlapping detection area of two adjacent upstream/downstream radars as L2, and the radar blind-zone distance as △L; let x1 be the X-axis coordinate of the target vehicle detected by the adjacent upstream radar and x2 the X-axis coordinate detected by the adjacent downstream radar;
当△L≤x1≤L2+△L,L1-L2≤x2≤L1时,则判定该目标车辆为重叠检测区域内目标。When △L≤x1≤L2+△L and L1-L2≤x2≤L1, the target vehicle is determined to be a target within the overlapping detection area.
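The overlap-zone membership test above can be sketched as a small predicate. L1, L2 and △L are site parameters; the default values used here are placeholders for illustration, not values given by the patent.

```python
# Overlap-zone membership test: the target is in the shared coverage when
#   dL <= x1 <= L2 + dL   (upstream radar's coordinate system)
#   L1 - L2 <= x2 <= L1   (downstream radar's coordinate system)
def in_overlap_zone(x1: float, x2: float, L1: float = 200.0,
                    L2: float = 30.0, dL: float = 5.0) -> bool:
    """x1/x2: X coordinates reported by the upstream/downstream radars (m);
    L1: detection-area length, L2: overlap length, dL: blind-zone distance."""
    return dL <= x1 <= L2 + dL and L1 - L2 <= x2 <= L1
```

With the placeholder geometry, a target at x1=10 m upstream and x2=180 m downstream falls inside both intervals and is flagged as an overlap-zone target.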
进一步的,所述步骤5)中重叠检测区域为相邻两台雷达之间目标匹配的融合区域,当目标车辆进入雷达重叠检测区域后,据融合区域目标车辆的经纬度、车辆速度计算相邻两台雷达重叠区域内目标的速度差和距离差;Further, in step 5) the overlapping detection area serves as the fusion zone for target matching between two adjacent radars. When the target vehicle enters the radar overlap area, the speed difference and distance difference of targets in the overlap area of the two adjacent radars are computed from the latitude/longitude and speed of the target vehicles in the fusion zone;
当相邻两台雷达对目标车辆检测到的距离差的绝对值小于2m,同时对目标车辆检测到的速度差的绝对值小于2km/h时,判定相邻两台雷达检测的目标车辆为同一个目标,并且相邻两台雷达检测到的同一个目标会在融合区域保持融合编号和车牌号信息一致,实现相邻两台雷达间的车辆跟踪融合。When the absolute value of the distance difference between the two adjacent radars' detections of the target vehicle is less than 2 m and the absolute value of the speed difference is less than 2 km/h, the targets detected by the two adjacent radars are determined to be the same vehicle; the same target detected by the two radars then keeps a consistent fusion number and license plate number within the fusion zone, achieving vehicle tracking fusion between the two adjacent radars.
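The same-target decision above can be sketched as a predicate over the computed distance and speed differences; the function and parameter names are illustrative.

```python
# Same-target test in the fusion zone: two detections are fused when the
# position estimates differ by less than 2 m AND the speeds by less than 2 km/h.
def is_same_target(dist_m: float, v1_kmh: float, v2_kmh: float) -> bool:
    """dist_m: separation (m) between the two radars' position estimates
    for the candidate pair; v1_kmh/v2_kmh: speeds reported by each radar."""
    return abs(dist_m) < 2.0 and abs(v1_kmh - v2_kmh) < 2.0
```

Both thresholds must hold simultaneously; a pair that passes only one of the two tests is not fused.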
进一步的,所述首个雷达及视频检测系统设置在高速公路的进入端的卡口处。Further, the first radar and video detection system is installed at the checkpoint at the entry end of the expressway.
进一步的,所述视频检测器为枪机,所述枪机与第一雷达共同安装于枪机卡口杆件上。Further, the video detector is a bullet camera, and the bullet camera and the first radar are mounted together on the checkpoint gantry pole for the bullet camera.
此外,本发明还公开了一种多雷达跨区域组网多目标跟踪识别装置,包括雷达及视频检测系统、第一判断单元、第二判断单元、第三判断单元和信息融合单元;In addition, the present invention also discloses a multi-radar cross-regional networking and multi-target tracking recognition device, which includes a radar and video detection system, a first judgment unit, a second judgment unit, a third judgment unit and an information fusion unit;
所述雷达及视频检测系统包括若干,分别沿高速公路均匀布置,包括一视频检测器和沿高速公路均匀且间隔布置的设置有设备编号的雷达;所述高速公路上安装的所有雷达及视频检测系统中任意相邻两台雷达的检测区域部分重叠;Several radar and video detection systems are provided, arranged uniformly along the expressway; each comprises a video detector and radars, assigned device numbers, arranged uniformly and at intervals along the expressway. The detection areas of any two adjacent radars among all the radar and video detection systems installed on the expressway partially overlap;
所述雷达,用于实时获取检测区域内目标车辆的速度信息和目标车辆在对应雷达坐标系中的位置信息;The radar is used to obtain the speed information of the target vehicle in the detection area and the position information of the target vehicle in the corresponding radar coordinate system in real time;
定义雷达及视频检测系统中首个编号的雷达为第一雷达,所述第一雷达的检测区域内设置有触发位置;Define the radar with the first number in the radar and video detection system as the first radar, and a trigger position is set in the detection area of the first radar;
所述第一判断单元,用于在第一雷达的检测区域内判断目标车辆是否驶入触发位置,并当判断结果为目标车辆驶入触发位置时,触发启动视频检测器录制第一雷达检测区域段的视频图像;The first judgment unit is used to judge, within the detection area of the first radar, whether the target vehicle has entered the trigger position, and, when the judgment result is that it has, to trigger the video detector to record video images of the first radar's detection area segment;
所述第二判断单元,用于在视频图像中,判断目标车辆所在的车道编号与视频检测器唯一对应的车道编号匹配是否一致;当判断结果为目标车辆所在的车道编号与视频检测器唯一对应的车道编号匹配一致时,视频检测器抓拍目标车辆图像并识别车牌信息,并将包含目标车辆车牌信息的抓拍结果回传给系统;The second judgment unit is used to judge, in the video images, whether the lane number of the target vehicle matches the lane number uniquely associated with the video detector; when they match, the video detector captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system;
所述第三判断单元,用于判断目标车辆是否进入相邻两台雷达的重叠检测区域;当判断结果为目标车辆进入重叠检测区域时,对进入重叠检测区域的目标车辆做同一目标匹配;The third judgment unit is used to judge whether the target vehicle enters the overlapping detection area of two adjacent radars; when the judgment result is that the target vehicle enters the overlapping detection area, perform the same target matching for the target vehicle that enters the overlapping detection area;
所述信息融合单元,用于将进入重叠检测区域的同一目标车辆在上游雷达的目标信息集合融合并输出给相邻下游雷达,目标车辆在相邻下游雷达检测区域内使用该目标信息集合继续做轨迹跟踪。The information fusion unit is used to fuse the target information set of the same target vehicle from the upstream radar when it enters the overlapping detection area and output it to the adjacent downstream radar; the target vehicle continues to be trajectory-tracked in the adjacent downstream radar's detection area using this target information set.
由以上技术方案可知,本发明的技术方案提供的多雷达跨区域组网多目标跟踪识别方法,获得了如下有益效果:It can be seen from the above technical solutions that the multi-radar cross-regional networking and multi-target tracking and recognition method provided by the technical solution of the present invention achieves the following beneficial effects:
本发明公开的多雷达跨区域组网多目标跟踪识别方法及装置,结合雷达和视频检测的优点,包括设置在高速公路上的若干雷达及视频检测系统,当目标进入到雷达设定的触发位置时触发视频检测数据,将雷达数据与视频数据关联,在高速公路可以合成目标车辆的车牌、车型等特征信息和车速等运动信息,再对目标通过多雷达时的数据融合,不仅突破了单台雷达检测范围的限制,将雷达的检测距离无限延伸,实现车辆特征信息和运动信息的传递,最终可以获取车辆在雷达网中的全部运动轨迹、交通行为等信息,实现多雷达跨区域组网多目标跟踪识别过程。The multi-radar cross-regional networking multi-target tracking and identification method and apparatus disclosed in the present invention combine the advantages of radar and video detection, and include several radar and video detection systems arranged on the expressway. When a target reaches the trigger position set by the radar, video detection is triggered and the radar data are associated with the video data, so that on the expressway feature information of the target vehicle such as license plate and model can be combined with motion information such as speed. By then fusing the data as the target passes through multiple radars, the method not only breaks through the detection-range limit of a single radar, extending the radar detection distance indefinitely, but also enables the hand-over of vehicle feature and motion information, so that the vehicle's complete trajectory, traffic behavior and other information within the radar network can finally be obtained, realizing the multi-radar cross-regional networking multi-target tracking and identification process.
本发明的技术方案基于原有普通摄像机的基础上,利用雷达全天候高可靠性工作的优势,引入多雷达全程智能分析,检测运动目标的全过程,用雷达总体掌握全线道路运行状况,检测到有交通事故发生则立即控制摄像机抓拍并上传至后台;后台人员接到雷达报警信息则调取相应监控确认事故真实性,再出动相应警力,节省大量人力、物力成本。Building on the existing ordinary cameras, the technical scheme of the present invention exploits the advantage of radar's all-weather, highly reliable operation and introduces whole-route multi-radar intelligent analysis to detect moving targets throughout their journey, using the radars to grasp the operating conditions of the entire road. When a traffic accident is detected, a camera is immediately triggered to capture images and upload them to the back end; on receiving the radar alarm, back-end personnel call up the corresponding surveillance to confirm the authenticity of the accident and then dispatch the appropriate police force, saving considerable manpower and material cost.
应当理解,前述构思以及在下面更加详细地描述的额外构思的所有组合只要在这样的构思不相互矛盾的情况下都可以被视为本公开的发明主题的一部分。It should be understood that all combinations of the foregoing concepts and the additional concepts described in more detail below can be regarded as part of the inventive subject matter of the present disclosure as long as such concepts are not mutually contradictory.
结合附图从下面的描述中可以更加全面地理解本发明教导的前述和其他方面、实施例和特征。本发明的其他附加方面例如示例性实施方式的特征和/或有益效果将在下面的描述中显见,或通过根据本发明教导的具体实施方式的实践中得知。The foregoing and other aspects, embodiments and features of the teachings of the present invention can be more fully understood from the following description with reference to the accompanying drawings. Other additional aspects of the present invention, such as the features and/or beneficial effects of the exemplary embodiments, will be apparent in the following description, or learned from the practice of the specific embodiments taught by the present invention.
附图说明Description of the drawings
附图不意在按比例绘制。在附图中,在各个图中示出的每个相同或近似相同的组成部分可以用相同的标号表示。为了清晰起见,在每个图中,并非每个组成部分均被标记。现在,将通过例子并参考附图来描述本发明的各个方面的实施例,其中:The drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component shown in each figure may be represented by the same reference numeral. For clarity, not every component is labeled in every figure. Now, embodiments of various aspects of the present invention will be described by way of examples and with reference to the accompanying drawings, in which:
图1为本发明实施流程图;Figure 1 is a flowchart of the implementation of the present invention;
图2为本发明目标车辆驶入雷达及视频检测系统状态图;Figure 2 is a state diagram of the target vehicle entering the radar and video detection system of the present invention;
图3为本发明中雷达单数据帧的融合流程图;Figure 3 is a flow chart of the fusion of a single radar data frame in the present invention;
图4为实施例中采用的雷达坐标系图。Figure 4 is a diagram of the radar coordinate system used in the embodiment.
具体实施方式Detailed description
为了更了解本发明的技术内容,特举具体实施例并配合所附图式说明如下。In order to better understand the technical content of the present invention, specific embodiments are described below in conjunction with the accompanying drawings.
在本公开中参照附图来描述本发明的各方面,附图中示出了许多说明的实施例。本公开的实施例不定义包括本发明的所有方面。应当理解,上面介绍的多种构思和实施例,以及下面更加详细地描述的那些构思和实施方式可以以很多方式中任意一种来实施,这是因为本发明所公开的构思和实施例并不限于任何实施方式。另外,本发明公开的一些方面可以单独使用,或者与本发明公开的其他方面的任何适当组合来使用。In this disclosure, aspects of the present invention are described with reference to the accompanying drawings, in which a number of illustrative embodiments are shown. The embodiments of the present disclosure are not intended to encompass all aspects of the present invention. It should be understood that the various concepts and embodiments introduced above, as well as those described in more detail below, can be implemented in any of numerous ways, because the concepts and embodiments disclosed herein are not limited to any particular implementation. In addition, some aspects disclosed in the present invention may be used alone or in any appropriate combination with other disclosed aspects.
基于现有技术中,一方面通过在高速公路上安装高清摄像机虽然能够准确感知车辆的车型、车牌等特征属性,但是在探测距离、速度等运动属性的时候精度差,并且容易受到周围环境的干扰,导致其采集的信息结果不准确;另一方面基于雷达探测的区域监控系统虽然可以完美解决视频监控设备的问题,但却无法做到可视化展示,对于信息量的采集有限;本发明旨在提出一种多雷达跨区域组网多目标跟踪识别方法,通过雷达与视频检测数据的关联和数据融合,突破了单台雷达检测范围的限制,实现车辆特征信息和运动信息的传递,识别获取并追踪车辆在雷达网中的全部运动轨迹和交通行为。In the prior art, on the one hand, although installing high-definition cameras on an expressway can accurately perceive feature attributes such as vehicle model and license plate, the accuracy when measuring motion attributes such as distance and speed is poor and easily disturbed by the surrounding environment, so the collected information is inaccurate; on the other hand, although an area-monitoring system based on radar detection can perfectly solve the problems of video surveillance equipment, it cannot provide a visual display and collects only limited information. The present invention therefore proposes a multi-radar cross-regional networking multi-target tracking and identification method which, through the association and fusion of radar and video detection data, breaks through the detection-range limit of a single radar, enables the hand-over of vehicle feature and motion information, and identifies, acquires and tracks the vehicle's complete trajectory and traffic behavior within the radar network.
下面结合附图所示的实施例,对本发明的多雷达跨区域组网多目标跟踪识别方法及装置作进一步具体介绍。In the following, in conjunction with the embodiments shown in the drawings, the method and device for multi-radar cross-regional network multi-target tracking and recognition of the present invention will be further introduced in detail.
结合图1所示,一种多雷达跨区域组网多目标跟踪识别方法,包括如下步骤:1)在高速公路上均匀安装若干雷达及视频检测系统,所述雷达及视频检测系统包括一视频检测器和若干设置有设备编号的雷达;所述高速公路上安装的所有雷达及视频检测系统中任意相邻两台雷达的检测区域部分重叠;As shown in Figure 1, a multi-radar cross-regional networking multi-target tracking and identification method includes the following steps: 1) Install a number of radar and video detection systems uniformly along the expressway, each radar and video detection system comprising a video detector and several radars assigned device numbers; the detection areas of any two adjacent radars among all the radar and video detection systems installed on the expressway partially overlap;
2)分别以各雷达的接收天线中心为原点建立各雷达的雷达坐标系,雷达坐标系的X轴方向与目标车辆行驶方向相反;当目标车辆进入高速公路路段上首个雷达及视频检测系统中第一雷达的检测区域时,第一雷达实时检测区域内目标车辆的速度信息和其在第一雷达坐标系中的位置信息,生成表示目标车辆的目标ID,并将表示目标车辆的目标ID、速度信息和位置信息回传给系统;所述第一雷达在其检测区域内设置有触发位置,当目标车辆行驶至触发位置时第一雷达触发视频检测器,视频检测器判断目标车辆所在的车道编号与视频检测器唯一对应的车道编号匹配一致时,抓拍目标车辆图像并识别车牌信息,并将包含目标车辆车牌信息的抓拍结果回传给系统;2) Establish a radar coordinate system for each radar with the center of its receiving antenna as the origin, the X-axis of the radar coordinate system pointing opposite to the target vehicle's direction of travel. When the target vehicle enters the detection area of the first radar of the first radar and video detection system on the expressway section, the first radar detects in real time the speed of the target vehicle and its position in the first radar's coordinate system, generates a target ID representing the target vehicle, and returns the target ID, speed information and position information to the system. The first radar has a trigger position set within its detection area; when the target vehicle reaches the trigger position, the first radar triggers the video detector. When the video detector determines that the lane number of the target vehicle matches the lane number uniquely associated with the video detector, it captures an image of the target vehicle, recognizes the license plate information, and returns the capture result containing the target vehicle's license plate information to the system;
3)雷达及视频检测系统对同一目标车辆的雷达和视频数据匹配融合,生成目标车辆的完整信息集合,包含目标ID、行驶速度信息、所在车道编号、目标车辆位置信息和车牌号的车牌融合数据记录;3) The radar and video detection system matches and fuses the radar and video data of the same target vehicle to generate a complete information set for the target vehicle: a plate-fused data record containing the target ID, driving speed, lane number, target vehicle position and license plate number;
4)目标车辆向相邻设备编号的下游雷达行驶,判断目标车辆进入相邻两台雷达的重叠检测区域;4) As the target vehicle travels toward the downstream radar with the adjacent device number, judge whether the target vehicle has entered the overlapping detection area of the two adjacent radars;
5)对进入相邻两台雷达重叠检测区域的目标车辆做同一目标匹配;5) Perform the same target matching on the target vehicle entering the overlapping detection area of two adjacent radars;
6)当目标车辆在相邻两台雷达重叠检测区域内匹配为同一目标车辆时,将上游雷达的目标ID和车牌号融合并输出给相邻下游雷达,生成在下游雷达检测区域内新的目标信息集合,即目标车辆在相邻下游雷达检测区域内使用该目标信息集合继续做轨迹跟踪;当目标车辆在相邻两台雷达重叠检测区域无法匹配为同一目标,则相邻下游雷达生成新的雷达目标ID,该目标ID不包含车牌信息,直至目标车辆行驶至下一雷达及视频检测系统的第一雷达检测区域做目标ID与车牌的匹配。6) When the target vehicle is matched as the same vehicle within the overlapping detection area of the two adjacent radars, the target ID and license plate number from the upstream radar are fused and output to the adjacent downstream radar, generating a new target information set within the downstream radar's detection area; that is, the target vehicle continues to be trajectory-tracked in the adjacent downstream radar's detection area using this target information set. When the target vehicle cannot be matched as the same target in the overlapping detection area, the adjacent downstream radar generates a new radar target ID containing no license plate information, until the target vehicle reaches the first-radar detection area of the next radar and video detection system, where the target ID is matched with a license plate again.
因此重复步骤6),达到实现图2所示的在多雷达跨区域组网多目标跟踪识别的目标。在具体实施过程中,所述首个雷达及视频检测系统设置在高速公路的进入端的卡口处;所述视频检测器采用枪机,枪机与第一雷达共同安装于枪机卡口杆件上。Step 6) is therefore repeated, achieving the goal of multi-radar cross-regional networking multi-target tracking and identification shown in Figure 2. In a specific implementation, the first radar and video detection system is installed at the checkpoint at the entry end of the expressway; the video detector is a bullet camera, and the bullet camera and the first radar are mounted together on the checkpoint gantry pole.
在步骤2)中第一雷达实时探测的目标车辆在第一雷达坐标系中位置信息为目标车辆在对应雷达坐标系中的X轴、Y轴坐标;当第一雷达检测到目标车辆到达触发位置时,第一雷达通过RS485串口的传输方式发送触发信号到视频检测器,同时雷达及视频检测系统将该触发信号记录下来,同时记录第一雷达发送的目标ID和目标车辆的位置信息。触发信号中包含枪机需要抓拍的车道编号、目标车速等信息。当视频检测器收到第一雷达发出的触发信号后开始判断,即如果目标车辆所在的车道编号与视频检测器唯一对应的车道编号匹配一致,就进行抓拍。In step 2), the position information of the target vehicle detected in real time by the first radar in the first radar's coordinate system consists of the target vehicle's X-axis and Y-axis coordinates in that coordinate system. When the first radar detects that the target vehicle has reached the trigger position, it sends a trigger signal to the video detector over an RS485 serial link; the radar and video detection system records this trigger signal together with the target ID and position information sent by the first radar. The trigger signal contains information such as the lane number the camera needs to capture and the target speed. On receiving the trigger signal from the first radar, the video detector makes its judgment: if the lane number of the target vehicle matches the lane number uniquely associated with the video detector, it takes a snapshot.
雷达及视频检测系统获取第一雷达和视频检测器的全部数据后,在步骤2)中根据目标车辆触发位置、所在车道编号和合理时间间隔(如0-2秒)进行数据匹配,匹配成功,则将该目标车辆的第一雷达触发信号与抓拍结果合并,再通过时间、车道编号、位置信息等参数进行目标车辆的车牌数据融合,该车牌融合数据即为多雷达目标融合的车牌数据源,最终可以生成一条包含目标ID、行驶速度信息、所在车道编号、目标车辆位置和车牌号的车牌融合数据记录。车牌信息融合后,第一雷达检测到的车辆包含了车辆唯一标识(如车道编号和车牌号)和目标车辆基础信息(如经纬度位置、目标车辆相对雷达的位置、行驶速度信息、车型等)。After the radar and video detection system has obtained all the data from the first radar and the video detector, the data are matched in step 2) according to the target vehicle's trigger position, lane number and a reasonable time interval (e.g. 0-2 seconds). On a successful match, the first radar's trigger record for the target vehicle is merged with the capture result, and the target vehicle's license plate data are then fused using parameters such as time, lane number and position information; this plate-fused record is the license plate data source for the subsequent multi-radar target fusion. Ultimately, a plate-fused data record containing the target ID, driving speed, lane number, target vehicle position and license plate number is generated. After the license plate information is fused, the vehicle detected by the first radar carries both a unique vehicle identification (such as lane number and license plate number) and basic target-vehicle information (such as latitude/longitude, position relative to the radar, driving speed and vehicle model).
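The lane/time matching described above can be sketched as follows; the record field names are assumptions chosen for illustration, not the patent's data format.

```python
# Sketch of the radar/video record fusion: a radar trigger record and a video
# capture are merged when their lane numbers agree and the capture follows the
# trigger within the allowed time window (0-2 s in the text).
from typing import Optional

def fuse_records(radar_rec: dict, capture: dict,
                 max_gap_s: float = 2.0) -> Optional[dict]:
    """Return the merged plate-fused record, or None if lane or time mismatch."""
    same_lane = radar_rec["lane"] == capture["lane"]
    gap = capture["time"] - radar_rec["time"]  # seconds after the trigger
    if same_lane and 0.0 <= gap <= max_gap_s:
        return {
            "target_id": radar_rec["target_id"],   # from the radar
            "lane": radar_rec["lane"],
            "speed_kmh": radar_rec["speed_kmh"],
            "position": radar_rec["position"],
            "plate": capture["plate"],             # from the video capture
        }
    return None
```

The merged record is exactly the information set named in step 3): target ID, speed, lane number, position and license plate number.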
进一步的,在所述步骤4)中通过目标车辆实时数据中雷达的“设备编号”字段和目标车辆在雷达检测区域中的X轴坐标,判断目标车辆进入相邻两台雷达的重叠检测区域;定义雷达的检测区域长度为L1,上、下游相邻两台雷达的重叠检测区域长度为L2,雷达盲区距离为△L,相邻上游雷达检测到目标车辆的X轴坐标为x1,相邻下游雷达检测到目标车辆的X轴坐标为x2;Further, in step 4), whether the target vehicle has entered the overlapping detection area of two adjacent radars is judged from the radar "device number" field in the target vehicle's real-time data and the target vehicle's X-axis coordinate in the radar detection area. Define the length of a radar's detection area as L1, the length of the overlapping detection area of two adjacent upstream/downstream radars as L2, and the radar blind-zone distance as △L; let x1 be the X-axis coordinate of the target vehicle detected by the adjacent upstream radar and x2 the X-axis coordinate detected by the adjacent downstream radar;
当△L≤x1≤L2+△L,L1-L2≤x2≤L1时,则判定该目标车辆为重叠检测区域内目标。When △L≤x1≤L2+△L and L1-L2≤x2≤L1, the target vehicle is determined to be a target within the overlapping detection area.
所述步骤5)中重叠检测区域为相邻两台雷达之间目标匹配的融合区域,当目标车辆进入雷达重叠检测区域后,据融合区域目标车辆的经纬度、车辆速度计算相邻两台雷达重叠区域内目标的速度差和距离差;当相邻两台雷达对目标车辆检测到的距离差的绝对值小于2m,同时对目标车辆检测到的速度差的绝对值小于2km/h时,则判定相邻两台雷达检测的目标车辆为同一个目标,并且相邻两台雷达检测到的同一个目标会在融合区域保持融合编号和车牌号信息一致,实现相邻两台雷达间的车辆跟踪融合,后续依次相邻的下游雷达也是以同样的方式匹配融合,实现目标车辆的多雷达跨区全程跟踪。In step 5), the overlapping detection area serves as the fusion zone for target matching between two adjacent radars. When the target vehicle enters the radar overlap area, the speed difference and distance difference of targets in the overlap area are computed from the latitude/longitude and speed of the target vehicles in the fusion zone. When the absolute value of the distance difference between the two adjacent radars' detections of the target vehicle is less than 2 m and the absolute value of the speed difference is less than 2 km/h, the targets detected by the two adjacent radars are determined to be the same vehicle, and this target keeps a consistent fusion number and license plate number within the fusion zone, achieving vehicle tracking fusion between the two adjacent radars. Each subsequent pair of adjacent downstream radars is matched and fused in the same way, achieving whole-route multi-radar cross-area tracking of the target vehicle.
结合图3所示的雷达单数据帧的融合流程图和图4所示的雷达坐标系图,相邻两台雷达对目标车辆检测到的距离差的绝对值小于2m的具体计算过程为:With reference to the single-radar-data-frame fusion flowchart in Figure 3 and the radar coordinate system diagram in Figure 4, the specific procedure for determining that the absolute value of the distance difference between the two adjacent radars' detections of the target vehicle is less than 2 m is as follows:
首先系统会根据每台雷达的经纬度、正北偏转角、目标车辆在对应雷达坐标系中的坐标计算出目标车辆的经纬度。First, the system will calculate the latitude and longitude of the target vehicle based on the latitude and longitude of each radar, the deflection angle of true north, and the coordinates of the target vehicle in the corresponding radar coordinate system.
设定雷达及视频检测系统中任一雷达经度为lon1、纬度为lat1、正北偏转角为β,目标车辆在雷达坐标系中的坐标为(x,y);Set the longitude of any radar in the radar and video detection system to lon1, latitude to lat1, and north deflection angle to β, and the coordinates of the target vehicle in the radar coordinate system to (x, y);
地球赤道半径:EARTH_RADIUS=6378137m,地球每一度(弧度)对应的弧长:EARTH_ARC=6378.137km*π/180=111319m;Earth's equatorial radius: EARTH_RADIUS=6378137 m; the arc length corresponding to each degree of the Earth's circumference: EARTH_ARC=6378.137 km×π/180=111319 m;
所述雷达与位于雷达监测区域内的目标车辆的距离:The distance between the radar and the target vehicle located in the radar monitoring area:
dist=√(x²+y²)   (1)
雷达与位于雷达监测区域内的目标车辆的方位角差(弧度)为α:The azimuth angle difference (radian) between the radar and the target vehicle in the radar monitoring area is α:
α=arctan(y/x)   (2)
目标车辆的正北偏转角(弧度)为a:The true north deflection angle (radian) of the target vehicle is a:
a=β+α   (3)
目标车辆和雷达在地球表面相对球心相差的度数(弧度)为c:The degree (radian) difference between the target vehicle and the radar relative to the center of the sphere on the earth's surface is c:
c=dist/EARTH_ARC   (4)
目标车辆纬度余角为b:The complementary latitude of the target vehicle is b:
b=arccos【cos(90-lat1)×cos c+sin(90-lat1)×sin c×cos a】   (5)b=arccos[cos(90-lat1)×cos c+sin(90-lat1)×sin c×cos a] (5)
目标车辆和雷达在经度方向上相差的角度为d:The angle between the target vehicle and the radar in the longitude direction is d:
d=arcsin(sin c×sin a/sin b)   (6)
则:then:
所述目标车辆经度为lonM:The longitude of the target vehicle is lonM:
lonM=lon1+d   (7)lonM=lon1+d (7)
目标车辆纬度为latM:The latitude of the target vehicle is latM:
latM=90-b   (8)latM=90-b (8)
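Equations (1) through (8) can be collected into one routine. Because formulas (1)-(4) and (6) are illegible in this copy of the document, the azimuth convention arctan(y/x), the bearing sum β+α, and the use of lat1 in (5) are the editor's reading of the garbled source and should be treated as assumptions.

```python
# Sketch of the radar-frame -> latitude/longitude projection, equations (1)-(8):
# spherical destination-point formulas from the radar position, the target
# bearing a, and the angular distance c.
import math

EARTH_RADIUS = 6378137.0                    # m, equatorial radius
EARTH_ARC = EARTH_RADIUS * math.pi / 180.0  # m per degree of arc

def radar_to_latlon(lon1, lat1, beta_deg, x, y):
    """lon1/lat1: radar position (deg); beta_deg: radar true-north deflection;
    (x, y): target position in the radar coordinate system (m)."""
    dist = math.hypot(x, y)                         # (1) range to target
    alpha = math.degrees(math.atan2(y, x))          # (2) azimuth in radar frame
    a = math.radians(beta_deg + alpha)              # (3) bearing from north
    c = math.radians(dist / EARTH_ARC)              # (4) angular distance
    colat1 = math.radians(90.0 - lat1)              # radar colatitude
    b = math.acos(math.cos(colat1) * math.cos(c)
                  + math.sin(colat1) * math.sin(c) * math.cos(a))  # (5)
    d = math.asin(math.sin(c) * math.sin(a) / math.sin(b))         # (6)
    lonM = lon1 + math.degrees(d)                   # (7) target longitude
    latM = 90.0 - math.degrees(b)                   # (8) target latitude
    return lonM, latM
```

As a sanity check, a target one arc-degree of range due "north" of the radar (a = 0) comes out exactly one degree higher in latitude at the same longitude.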
其次,遍历上、下游相邻两台雷达融合区域的所有目标,根据各个目标车辆的经纬度计算两台雷达对各目标车辆的距离差。Secondly, traverse all the targets in the fusion area of two adjacent radars in the upstream and downstream, and calculate the distance difference between the two radars to each target vehicle according to the latitude and longitude of each target vehicle.
以计算第一目标车辆和第二目标车辆之间的距离为例:假定第一目标车辆的经纬度为lonA、latA,第二目标车辆的经纬度为lonB、latB;已知地球赤道半径:EARTH_RADIUS=6378137m;Take the calculation of the distance between the first target vehicle and the second target vehicle as an example: assume the longitude and latitude of the first target vehicle are lonA, latA, and those of the second target vehicle are lonB, latB; the Earth's equatorial radius is known: EARTH_RADIUS=6378137 m;
则,第一目标车辆和第二目标车辆之间的经纬度差分别为:Then, the latitude and longitude difference between the first target vehicle and the second target vehicle are:
Δlon=lonA-lonB   (9)Δlon=lonA-lonB (9)
Δlat=latA-latB   (10)Δlat=latA-latB (10)
由上式可得,第一目标车辆和第二目标车辆之间的距离S为:From the above formula, the distance S between the first target vehicle and the second target vehicle is:
S=EARTH_ARC×√(Δlat²+(Δlon×cos latA)²)   (11)
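Equations (9) through (11) can be sketched as follows. Formula (11) is illegible in this copy of the document, so the flat-earth form used here (the longitude difference scaled by cos latA) is an assumption chosen for the few-metre separations being compared against the 2 m threshold.

```python
# Sketch of the target-to-target distance from latitude/longitude differences,
# equations (9)-(11), using a local flat-earth approximation.
import math

EARTH_ARC = 6378137.0 * math.pi / 180.0  # m per degree of arc

def target_distance_m(lonA, latA, lonB, latB):
    """Separation (m) between two targets given their longitudes/latitudes (deg)."""
    dlon = lonA - lonB                   # (9) longitude difference
    dlat = latA - latB                   # (10) latitude difference
    # (11): planar approximation; dlon is shortened by cos(latitude) because
    # meridians converge toward the poles.
    return EARTH_ARC * math.sqrt(dlat ** 2
                                 + (dlon * math.cos(math.radians(latA))) ** 2)
```

At mid-latitudes a latitude difference of 1e-5 degrees corresponds to roughly 1.1 m, comfortably inside the 2 m same-target threshold.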
判断S值的绝对值与2m的大小,同时判断第一目标车辆和第二目标车辆经雷达检测到的速度差V与2km/h的大小;当‖S‖<2m、‖V‖<2km/h时,直接判断上、下游相邻两台雷达分别检测到的第一目标车辆和第二目标车辆为同一目标车辆,再进一步将上、下游相邻两台雷达间的车辆跟踪和数据融合,依次对下游后续雷达作对应的判断处理,实现多雷达跨区域组网,进而实现多目标跟踪识别。The absolute value of S is compared with 2 m, and at the same time the radar-detected speed difference V between the first and second target vehicles is compared with 2 km/h. When ‖S‖<2 m and ‖V‖<2 km/h, the first and second target vehicles detected by the two adjacent upstream/downstream radars are directly judged to be the same target vehicle; vehicle tracking and data fusion between the two adjacent radars then proceeds, and the corresponding judgment is applied in turn to each subsequent downstream radar, realizing multi-radar cross-regional networking and hence multi-target tracking and identification.
本发明为解决在高速公路上采用高清摄像机监控车辆行驶状况容易受环境的干扰、信息结果不准确和采用雷达检测系统无法做到可视化展示、信息量的采集有限的技术问题,还公开了一种多雷达跨区域组网多目标跟踪识别装置。To solve the technical problems that monitoring vehicle travel on an expressway with high-definition cameras is easily disturbed by the environment and yields inaccurate information, while a radar detection system cannot provide a visual display and collects only limited information, the present invention also discloses a multi-radar cross-regional networking multi-target tracking and identification apparatus.
所述多雷达跨区域组网多目标跟踪识别装置,包括沿高速公路均匀布置的若干雷达及视频检测系统;所述雷达及视频检测系统包括一视频检测器和若干沿高速公路均匀且间隔布置的设置有设备编号的雷达,并且高速公路上安装的所有雷达及视频检测系统中任意相邻两台雷达的检测区域部分重叠;其中,雷达用于实时获取检测区域内目标车辆的速度信息和目标车辆在对应雷达坐标系中的位置信息;The multi-radar cross-regional networking multi-target tracking and identification apparatus includes several radar and video detection systems arranged uniformly along the expressway. Each radar and video detection system comprises a video detector and several radars, assigned device numbers, arranged uniformly and at intervals along the expressway; the detection areas of any two adjacent radars among all radar and video detection systems installed on the expressway partially overlap. The radar is used to obtain in real time the speed of the target vehicle in the detection area and its position in the corresponding radar coordinate system;
The first-numbered radar in a radar-and-video detection system is defined as the first radar, and a trigger position is set within the first radar's detection area.
The multi-radar cross-regional networking multi-target tracking and identification apparatus further comprises:
a first judgment unit, configured to judge, within the detection area of the first radar, whether a target vehicle has entered the trigger position and, when it has, to trigger the video detector to start recording video of the first radar's detection segment;
a second judgment unit, configured to judge, in the video image, whether the lane number of the target vehicle matches the lane number uniquely associated with the video detector; when they match, the video detector captures an image of the target vehicle, recognizes the license plate, and returns the capture result, including the license plate information, to the system;
a third judgment unit, configured to judge whether the target vehicle has entered the overlapping detection area of two adjacent radars and, when it has, to perform same-target matching on the vehicles in the overlapping area; and
an information fusion unit, configured to fuse the target information set held by the upstream radar for a vehicle entering the overlapping detection area and output it to the adjacent downstream radar, so that the vehicle's trajectory continues to be tracked with that information set in the downstream radar's detection area.
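The hand-off performed by the information fusion unit can be sketched as follows; `TargetInfo` and its field names are hypothetical, chosen only to mirror the information set described above (target ID, speed, lane, position, plate):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetInfo:
    """Illustrative information set for one tracked vehicle."""
    target_id: int            # track ID assigned by a radar
    speed_kmh: float          # radar-measured speed
    lane: int                 # lane number
    x_m: float                # position along the radar's X axis
    plate: Optional[str] = None   # license plate, if already associated

def hand_off(upstream: TargetInfo, new_track_id: int) -> TargetInfo:
    """Information fusion: carry the upstream radar's data, including the
    plate, into the track that the adjacent downstream radar continues."""
    return TargetInfo(target_id=new_track_id,
                      speed_kmh=upstream.speed_kmh,
                      lane=upstream.lane,
                      x_m=upstream.x_m,
                      plate=upstream.plate)
```

The downstream radar then keeps tracking under its own track ID while the plate association survives the radar boundary.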
Although the present invention has been disclosed above by way of preferred embodiments, these are not intended to limit it. Those of ordinary skill in the art may make various changes and modifications without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the claims.

Claims (6)

  1. A multi-radar cross-regional networking multi-target tracking and identification method, characterized by comprising the following steps:
    1) installing several radar-and-video detection systems evenly along an expressway, each radar-and-video detection system comprising one video detector and several radars assigned device numbers, wherein the detection areas of any two adjacent radars among all the radar-and-video detection systems installed on the expressway partially overlap;
    2) establishing a radar coordinate system for each radar with the center of its receiving antenna as the origin, the X-axis direction being opposite to the target vehicle's direction of travel; when a target vehicle enters the detection area of the first radar of the first radar-and-video detection system on the expressway section, the first radar detects in real time the vehicle's speed and its position in the first radar's coordinate system, generates a target ID representing the vehicle, and returns the target ID, speed and position to the system; a trigger position is set within the first radar's detection area, and when the target vehicle reaches it the first radar triggers the video detector; when the video detector judges that the vehicle's lane number matches the lane number uniquely associated with the video detector, it captures an image of the vehicle, recognizes the license plate, and returns the capture result, including the license plate information, to the system;
    3) the radar-and-video detection system matching and fusing the radar and video data of the same target vehicle to generate the vehicle's complete information set: a plate-fused data record containing the target ID, speed, lane number, position and license plate number;
    4) as the target vehicle travels toward the downstream radar with the adjacent device number, judging that it has entered the overlapping detection area of the two adjacent radars;
    5) performing same-target matching on target vehicles entering the overlapping detection area of the two adjacent radars;
    6) when the target vehicle is matched as the same vehicle in the overlapping detection area of the two adjacent radars, fusing the upstream radar's target ID and license plate number and outputting them to the adjacent downstream radar to generate a new target information set in the downstream radar's detection area, with which the vehicle's trajectory continues to be tracked there; when the target vehicle cannot be matched as the same vehicle in the overlapping detection area, the adjacent downstream radar generates a new radar target ID containing no license plate information, until the vehicle reaches the first radar's detection area of the next radar-and-video detection system, where the target ID is matched with the license plate again.
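The branch in step 6, inheriting the fused record on a successful match and otherwise opening a plate-less track, can be sketched as follows; the dictionary keys and the ID counter are illustrative, not specified by the patent:

```python
import itertools

_new_ids = itertools.count(1000)  # hypothetical downstream track-ID generator

def downstream_track(upstream_record, matched):
    """Step 6 of claim 1: if same-target matching in the overlap region
    succeeded, the downstream radar inherits the upstream target ID and
    plate; otherwise it opens a new track with no plate, which stays
    plate-less until the next radar-and-video system re-associates it."""
    if matched:
        return dict(upstream_record)      # keep ID, plate, lane, speed, ...
    return {"target_id": next(_new_ids), "plate": None}
```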
  2. The multi-radar cross-regional networking multi-target tracking and identification method according to claim 1, characterized in that in step 4) the target vehicle is judged to have entered the overlapping detection area of two adjacent radars from the "device number" field of its real-time data and its X-axis coordinates in the coordinate systems of the two adjacent upstream and downstream radars;
    the detection-area length of a radar being defined as L1, the length of the overlapping detection area of two adjacent radars as L2, the radar blind-zone distance as ΔL, the X-axis coordinate of the target vehicle detected by the upstream radar as x1, and that detected by the adjacent downstream radar as x2;
    when ΔL ≤ x1 ≤ L2 + ΔL and L1 - L2 ≤ x2 ≤ L1, the target vehicle is judged to be a target within the overlapping detection area.
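This membership test translates directly into code (variable names follow the claim; all lengths are in metres):

```python
def in_overlap_region(x1, x2, L1, L2, dL):
    """Claim 2 condition: the target lies in the overlap region when
    dL <= x1 <= L2 + dL in the upstream radar's frame and
    L1 - L2 <= x2 <= L1 in the downstream radar's frame."""
    return dL <= x1 <= L2 + dL and L1 - L2 <= x2 <= L1
```

For instance, with a 200 m detection area (L1), a 30 m overlap (L2) and a 5 m blind zone (dL), a target at x1 = 20 m upstream and x2 = 180 m downstream is inside the overlap region.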
  3. The method according to claim 1, characterized in that in step 5) the overlapping detection area is the fusion area for target matching between two adjacent radars; after the target vehicle enters it, the speed difference and distance difference of the targets in the overlapping area are calculated from the latitude, longitude and speed of the target vehicles in the fusion area;
    when the absolute value of the distance difference detected by the two adjacent radars for the target vehicle is less than 2 m and the absolute value of the speed difference is simultaneously less than 2 km/h, the targets detected by the two radars are judged to be the same vehicle; the same target then keeps its fusion number and license plate number consistent throughout the fusion area, achieving vehicle tracking and fusion between the two adjacent radars.
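A sketch of the claim-3 computation, deriving the distance difference from the latitude/longitude fixes with a spherical-Earth (haversine) approximation; the patent does not specify the distance formula, so this choice is an assumption:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon fixes,
    using a spherical-Earth approximation with R = 6371 km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def same_vehicle(fix_a, fix_b):
    """fix = (lat, lon, speed_kmh). Claim-3 thresholds: distance
    difference < 2 m and speed difference < 2 km/h."""
    d = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    return d < 2.0 and abs(fix_a[2] - fix_b[2]) < 2.0
```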
  4. The method according to claim 1, characterized in that the first radar-and-video detection system is installed at the checkpoint at the entrance of the expressway.
  5. The method according to claim 4, characterized in that the video detector is a bullet camera, and the bullet camera and the first radar are mounted together on the checkpoint pole.
  6. A multi-radar cross-regional networking multi-target tracking and identification apparatus, characterized by comprising radar-and-video detection systems, a first judgment unit, a second judgment unit, a third judgment unit and an information fusion unit;
    several radar-and-video detection systems being arranged evenly along the expressway, each comprising one video detector and radars assigned device numbers, arranged evenly and at intervals along the expressway, wherein the detection areas of any two adjacent radars among all the radar-and-video detection systems installed on the expressway partially overlap;
    the radars being configured to acquire, in real time, the speed of target vehicles in their detection areas and their positions in the corresponding radar coordinate systems;
    the first-numbered radar in a radar-and-video detection system being defined as the first radar, with a trigger position set within the first radar's detection area;
    the first judgment unit being configured to judge, within the detection area of the first radar, whether a target vehicle has entered the trigger position and, when it has, to trigger the video detector to start recording video of the first radar's detection segment;
    the second judgment unit being configured to judge, in the video image, whether the lane number of the target vehicle matches the lane number uniquely associated with the video detector; when they match, the video detector captures an image of the target vehicle, recognizes the license plate, and returns the capture result, including the license plate information, to the system;
    the third judgment unit being configured to judge whether the target vehicle has entered the overlapping detection area of two adjacent radars and, when it has, to perform same-target matching on the vehicles in the overlapping area; and
    the information fusion unit being configured to fuse the target information set held by the upstream radar for a vehicle entering the overlapping detection area and output it to the adjacent downstream radar, so that the vehicle's trajectory continues to be tracked with that information set in the downstream radar's detection area.
PCT/CN2019/113974 2019-08-19 2019-10-29 Multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking WO2021031338A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910763489.0A CN110515073B (en) 2019-08-19 2019-08-19 Multi-radar cross-regional networking multi-target tracking identification method and device
CN201910763489.0 2019-08-19

Publications (1)

Publication Number Publication Date
WO2021031338A1 true WO2021031338A1 (en) 2021-02-25

Family

ID=68625724

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/113974 WO2021031338A1 (en) 2019-08-19 2019-10-29 Multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking

Country Status (2)

Country Link
CN (1) CN110515073B (en)
WO (1) WO2021031338A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116721552A (en) * 2023-06-12 2023-09-08 北京博宏科元信息科技有限公司 Non-motor vehicle overspeed identification recording method, device, equipment and storage medium
CN117197182A (en) * 2023-11-07 2023-12-08 华诺星空技术股份有限公司 Lei Shibiao method, apparatus and storage medium

Families Citing this family (26)

Publication number Priority date Publication date Assignee Title
CN110930721A (en) * 2019-12-03 2020-03-27 上海熹翼科技有限公司 Road vehicle monitoring system and method based on soft handover radar
CN111083444B (en) * 2019-12-26 2021-10-15 浙江大华技术股份有限公司 Snapshot method and device, electronic equipment and storage medium
CN113095345A (en) * 2020-01-08 2021-07-09 富士通株式会社 Data matching method and device and data processing equipment
CN111260808B (en) * 2020-01-17 2021-12-10 河北德冠隆电子科技有限公司 Free flow vehicle charging device, system and method based on multi-data fusion
CN111679271B (en) * 2020-06-15 2023-03-14 杭州海康威视数字技术股份有限公司 Target tracking method, target tracking device, monitoring system and storage medium
CN112162283B (en) * 2020-08-18 2024-07-02 重庆睿行电子科技有限公司 Multi-target detection system of full-road-section networking traffic radar
CN112099040A (en) * 2020-09-15 2020-12-18 浙江省机电设计研究院有限公司 Whole-course continuous track vehicle tracking system and method based on laser radar network
CN112071083B (en) * 2020-09-15 2022-03-01 深圳市领航城市科技有限公司 Motor vehicle license plate relay identification system and license plate relay identification method
CN112309123B (en) * 2020-10-15 2022-04-15 武汉万集信息技术有限公司 Vehicle detection method and system
CN112433203B (en) * 2020-10-29 2023-06-20 同济大学 Lane linearity detection method based on millimeter wave radar data
CN114445307A (en) * 2020-10-30 2022-05-06 高新兴科技集团股份有限公司 Method, device, MEC and medium for acquiring target information based on radar and visible light image
CN114639262B (en) * 2020-12-15 2024-02-06 北京万集科技股份有限公司 Method and device for detecting state of sensing device, computer device and storage medium
CN112837546A (en) * 2020-12-25 2021-05-25 山东交通学院 Expressway agglomerate fog guide laying method
CN112700647B (en) * 2020-12-29 2022-09-16 杭州海康威视数字技术股份有限公司 Method and device for monitoring vehicle driving information
CN115331480A (en) * 2021-05-10 2022-11-11 北京万集科技股份有限公司 Vehicle early warning method and device and computing equipment
CN113393675B (en) * 2021-05-24 2023-03-21 青岛海信网络科技股份有限公司 Vehicle ID determination method, device, equipment and medium
CN113419244A (en) * 2021-05-28 2021-09-21 同济大学 Vehicle track splicing method based on millimeter wave radar data
CN115410379B (en) * 2021-05-28 2024-02-13 深圳成谷科技有限公司 Matching relation determining method, device and processing equipment applied to vehicle-road cooperation
CN113625236B (en) * 2021-06-30 2024-05-24 嘉兴聚速电子技术有限公司 Multi-radar data fusion method, device, storage medium and equipment
CN113792634B (en) * 2021-09-07 2022-04-15 北京易航远智科技有限公司 Target similarity score calculation method and system based on vehicle-mounted camera
CN115880791A (en) * 2021-09-26 2023-03-31 山西西电信息技术研究院有限公司 Parking management method
CN113888865B (en) * 2021-09-29 2022-11-11 青岛海信网络科技股份有限公司 Electronic device and vehicle information acquisition method
CN113900070B (en) * 2021-10-08 2022-09-27 河北德冠隆电子科技有限公司 Method, device and system for automatically drawing target data and accurately outputting radar lane
CN115206091B (en) * 2022-06-07 2024-06-07 西安电子科技大学广州研究院 Road condition and event monitoring system and method based on multiple cameras and millimeter wave radar
CN115331469A (en) * 2022-08-15 2022-11-11 北京图盟科技有限公司 Vehicle track online restoration method, device and equipment
CN116168546B (en) * 2023-02-20 2024-05-31 烽火通信科技股份有限公司 Method, device, equipment and readable storage medium for judging attribution of vehicle identification information

Citations (6)

Publication number Priority date Publication date Assignee Title
CN102768803A (en) * 2012-07-31 2012-11-07 株洲南车时代电气股份有限公司 Vehicle intelligent monitoring and recording system and method based on radar and video detection
EP2660624A1 (en) * 2012-04-30 2013-11-06 Traficon International N.V. A traffic monitoring device and a method for monitoring a traffic stream.
CN104966400A (en) * 2015-06-11 2015-10-07 山东鼎讯智能交通股份有限公司 Integrated multi-object radar speed measurement snapshot system and method
CN107527506A (en) * 2017-09-20 2017-12-29 上海安道雷光波系统工程有限公司 Embedded radar monitors recombination optics and radar monitoring capturing system and method
CN107767668A (en) * 2017-10-19 2018-03-06 深圳市置辰海信科技有限公司 A kind of method based on the continuous real-time tracking of radar active probe vehicle
CN109212513A (en) * 2018-09-29 2019-01-15 河北德冠隆电子科技有限公司 Multiple target between radar data transmitting, data fusion and localization method is continuously tracked

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
JP5929870B2 (en) * 2013-10-17 2016-06-08 株式会社デンソー Target detection device
BR102013033041B1 (en) * 2013-12-20 2022-02-01 Perkons S/A System and method for monitoring and enforcing traffic and recording traffic violations and corresponding unmanned aerial vehicle
CN104021673B (en) * 2014-06-17 2016-04-20 北京易华录信息技术股份有限公司 Radar tracking technology is utilized to find fast to block up and cause the system and method for reason
WO2016060384A1 (en) * 2014-10-17 2016-04-21 전자부품연구원 Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information
CN104537834A (en) * 2014-12-21 2015-04-22 北京工业大学 Intersection identification and intersection trajectory planning method for intelligent vehicle in urban road running process
CN106125076A (en) * 2016-07-13 2016-11-16 南京慧尔视智能科技有限公司 A kind of Anticollision Radar method for early warning being applied to urban transportation and device
CN106373394B (en) * 2016-09-12 2019-01-04 深圳尚桥交通技术有限公司 Vehicle detection method and system based on video and radar
CN106448189A (en) * 2016-11-02 2017-02-22 南京慧尔视智能科技有限公司 Multilane speed measuring and block port triggering method and device based on microwaves
CN106710240B (en) * 2017-03-02 2019-09-27 公安部交通管理科学研究所 The passing vehicle for merging multiple target radar and video information tracks speed-measuring method
CN108957478B (en) * 2018-07-23 2021-03-26 上海禾赛科技股份有限公司 Multi-sensor synchronous sampling system, control method thereof and vehicle
CN109061600B (en) * 2018-09-28 2023-02-14 上海市刑事科学技术研究院 Target identification method based on millimeter wave radar data
CN109615870A (en) * 2018-12-29 2019-04-12 南京慧尔视智能科技有限公司 A kind of traffic detection system based on millimetre-wave radar and video
CN109671278B (en) * 2019-03-02 2020-07-10 安徽超远信息技术有限公司 Bayonet accurate positioning snapshot method and device based on multi-target radar


Cited By (4)

Publication number Priority date Publication date Assignee Title
CN116721552A (en) * 2023-06-12 2023-09-08 北京博宏科元信息科技有限公司 Non-motor vehicle overspeed identification recording method, device, equipment and storage medium
CN116721552B (en) * 2023-06-12 2024-05-14 北京博宏科元信息科技有限公司 Non-motor vehicle overspeed identification recording method, device, equipment and storage medium
CN117197182A (en) * 2023-11-07 2023-12-08 华诺星空技术股份有限公司 Lei Shibiao method, apparatus and storage medium
CN117197182B (en) * 2023-11-07 2024-02-27 华诺星空技术股份有限公司 Lei Shibiao method, apparatus and storage medium

Also Published As

Publication number Publication date
CN110515073B (en) 2021-09-07
CN110515073A (en) 2019-11-29

Similar Documents

Publication Publication Date Title
WO2021031338A1 (en) Multiple object tracking and identification method and apparatus based on multi-radar cross-regional networking
CN108919256B (en) Four-dimensional real-scene traffic simulation vehicle overspeed whole-course tracking detection alarm system and method
CN103473926B (en) The interlock road traffic parameter collection of rifle ball and capturing system violating the regulations
CN112700470B (en) Target detection and track extraction method based on traffic video stream
CN108550262B (en) Urban traffic sensing system based on millimeter wave radar
CN102521979B (en) High-definition camera-based method and system for pavement event detection
CN112099040A (en) Whole-course continuous track vehicle tracking system and method based on laser radar network
CN105679043A (en) 3D radar intelligent bayonet system and processing method thereof
CN102768801A (en) Method for detecting motor vehicle green light follow-up traffic violation based on video
CN102231231A (en) Area road network traffic safety situation early warning system and method thereof
CN102332209A (en) Automobile violation video monitoring method
CN114333330B (en) Intersection event detection system based on road side edge holographic sensing
CN109102695B (en) Intelligent traffic service station, intelligent traffic service method and system
CN110796862B (en) Highway traffic condition detection system and method based on artificial intelligence
CN205609012U (en) Highway intelligent transportation road conditions monitored control system
CN114419874B (en) Target driving safety risk early warning method based on road side sensing equipment data fusion
RU2587662C1 (en) Automated system for detecting road traffic violation at crossroad, railway crossing or pedestrian crossing
CN111754786A (en) System for identifying traffic vehicle passing events on highway
CN112140995A (en) Intelligent automobile safe driving system based on network cloud
CN111477011A (en) Detection device and detection method for road intersection early warning
CN114387785A (en) Safety management and control method and system based on intelligent highway and storable medium
CN202013659U (en) Intelligent safe traffic information platform based on embedded platform imaging processing and wireless communication
CN104008649A (en) System and method for quickly finding abnormal parking reason on carriageway through radar tracking
CN104408942A (en) Intelligent vehicle speed measuring device and method
CN116165655A (en) Global vehicle track tracking system based on millimeter wave radar group

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19942262

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19942262

Country of ref document: EP

Kind code of ref document: A1