CN114373297B - Data processing device and method and electronic equipment - Google Patents


Info

Publication number
CN114373297B
Authority
CN
China
Prior art keywords: target; data; intersection; intersection data; position information
Prior art date
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number
CN202111529903.5A
Other languages
Chinese (zh)
Other versions
CN114373297A (en)
Inventor
陈晓明
王玉波
王华伟
Current Assignee (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Hisense TransTech Co Ltd
Original Assignee
Hisense TransTech Co Ltd
Priority date (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Filing date
Publication date
Application filed by Hisense TransTech Co Ltd filed Critical Hisense TransTech Co Ltd
Priority to CN202111529903.5A
Publication of CN114373297A
Application granted
Publication of CN114373297B
Legal status: Active
Anticipated expiration

Classifications

    • G08G 1/0104 — Traffic control systems for road vehicles; detecting movement of traffic to be counted or controlled; measuring and analysing of parameters relative to traffic conditions
    • G06F 9/5072 — Arrangements for program control; allocation of resources; grid computing
    • G06Q 10/04 — Administration; management; forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q 50/26 — ICT specially adapted for specific business sectors; government or public services
    • G08G 1/08 — Controlling traffic signals according to detected number or speed of vehicles
    • G08G 1/095 — Arrangements for giving variable traffic instructions; traffic lights


Abstract

The embodiments of the present application provide a data processing device, a data processing method, and an electronic device, relate to the technical field of intelligent transportation, and aim to improve the accuracy of tracking targets appearing at an intersection. The device comprises a processing unit and a receiving unit. The receiving unit is configured to acquire intersection data, where the intersection data comprises first intersection data at a first moment and second intersection data at a second moment, the second moment being after the first moment. The processing unit is configured to: process the first intersection data to determine a first target set appearing at the intersection at the first moment, the first target set including a first target; process the second intersection data to determine a second target set appearing at the intersection at the second moment; and, when the second target set does not include the first target, determine the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data.

Description

Data processing device and method and electronic equipment
Technical Field
The present disclosure relates to the field of intelligent transportation technologies, and in particular, to a data processing apparatus and method, and an electronic device.
Background
Edge computing refers to providing services at the near end, on an open platform that integrates network, computing, storage, and core application capabilities and is deployed close to the object or data source. Because applications are launched at the edge, network service responses are faster, satisfying the industry's basic requirements for real-time business, application intelligence, security, and privacy protection. Edge computing sits between the physical entities and the industrial connection, or on top of the physical entities.
Based on edge calculation, traffic data of an intersection can be acquired by processing the information collected by devices already installed at the intersection, such as bayonet cameras, electronic police cameras, and radar. However, because the detection areas of these devices are limited, a detected target may be lost, and with lost targets it is difficult to obtain accurate traffic data.
Disclosure of Invention
The embodiments of the present application provide a data processing device, a data processing method, and an electronic device, which are used to improve the accuracy of tracking targets appearing at an intersection.
In a first aspect, an embodiment of the present application provides a data processing apparatus, including a processing unit and a receiving unit. The receiving unit is configured to acquire intersection data, where the intersection data comprises first intersection data at a first moment and second intersection data at a second moment, the second moment being after the first moment. The processing unit is configured to: process the first intersection data to determine a first target set appearing at the intersection at the first moment, the first target set comprising a first target; process the second intersection data to determine a second target set appearing at the intersection at the second moment; and, when the second target set does not comprise the first target, determine the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data.
Because the detection areas of the devices installed at the intersection are limited, targets may be lost. With this scheme, the edge calculation unit can calculate the real position information of a lost target, so that it can track every target appearing at the intersection, improving tracking accuracy and yielding more accurate traffic data.
In a possible implementation, when determining the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data, the processing unit is specifically configured to: determine, according to the position information of the first target in the first intersection data, the predicted position information of the first target in the second intersection data, and take the predicted position information as the real position information, where the predicted position information satisfies the following formulas:
p_{i,x} = p_{i-1,x} + v_{i-1,x}*Δt + 0.5*a_{i-1,x}*Δt²
p_{i,y} = p_{i-1,y} + v_{i-1,y}*Δt + 0.5*a_{i-1,y}*Δt²

where p_{i,x} and p_{i,y} are the predicted x-axis and y-axis coordinates of the first target in the second intersection data; p_{i-1,x} and p_{i-1,y} are the x-axis and y-axis coordinates of the first target in the first intersection data; v_{i-1,x} and v_{i-1,y} are its x-axis and y-axis velocities in the first intersection data; a_{i-1,x} and a_{i-1,y} are its x-axis and y-axis accelerations in the first intersection data; and Δt is the time difference between the second moment and the first moment.
With this scheme, the position information of a lost target at the second moment can be calculated from its position information at the first moment, so that the lost target is tracked.
In a possible implementation, before taking the predicted position information as the real position information, the processing unit is further configured to: determine that the predicted position information of the first target in the second intersection data is the real position information of the first target in the second intersection data when the position information of the first target in the first intersection data and the predicted position information of the first target in the second intersection data satisfy the following formulas:
|x1 − x2| < α*L_min
|y1 − y2| < β*W_sum

where x1 and y1 are the x-axis and y-axis coordinates in the position information of the first target in the first intersection data; x2 and y2 are the x-axis and y-axis coordinates in the predicted position information of the first target in the second intersection data; L_min is the length of whichever of the two positions has the smaller x-axis coordinate; W_sum is the sum of the width of the first target in the position information of the first intersection data and its width in the predicted position information of the second intersection data; α is the length coincidence degree threshold; and β is the width coincidence degree threshold.
With this scheme, when the predicted position information of a lost target at the second moment and its position information at the first moment satisfy the above formulas, the predicted position information at the second moment is determined to be the real position information at the second moment, so that the lost target can be tracked conveniently and quickly.
In a possible implementation, the intersection data includes data detected by a first detection device in a first detection area and data detected by a second detection device in a second detection area, and the processing unit is further configured to: process the data of the first detection device to determine a second target in an overlapping area, where the overlapping area is the overlap of the first detection area and the second detection area; process the data of the second detection device to determine a third target appearing in the overlapping area; and determine that the second target and the third target are the same target when they satisfy the following formula:
s < min(L0, L1, W0, W1)

where s is the distance between the second target and the third target, L0 and W0 are the length and width of the second target, and L1 and W1 are the length and width of the third target.
With this scheme, targets in the overlapping area can be correctly tracked, avoiding inaccurate traffic detection data caused by the same target having multiple pieces of position information in the edge calculation unit.
In a possible implementation, before acquiring the intersection data, the receiving unit is further configured to: obtain a calibration result, where the calibration result comprises the coordinate system adopted by the first detection device and the coordinate system adopted by the second detection device; and obtain the information of the first detection area and the information of the second detection area.
With the calibration result, the edge calculation unit can align the coordinate systems adopted by the first and second detection devices, so that it can stably track targets appearing at the intersection.
In a possible implementation, before determining that the second target and the third target are the same target, the processing unit is further configured to: correct the longitude and latitude coordinates of a fourth target in the overlapping area according to the longitude deviation and the latitude deviation of the fourth target, where the fourth target is any target in the overlapping area. The latitude deviation of the fourth target satisfies:

Δlat = (ΔL*cos(Ang*π/180))/110540

and the longitude deviation of the fourth target satisfies:

Δlng = (ΔL*sin(Ang*π/180))/(111320*cos(lat_A*π/180))

where ΔL is the coordinate deviation displacement of the fourth target, Ang is the angle between the fourth target and due north, and lat_A is the original latitude of the fourth target.
By correcting the longitude and latitude coordinates of targets in the overlapping area, targets that appear repeatedly in multiple devices can be fused, avoiding inaccurate traffic detection data caused by the same target having multiple pieces of position information in the edge calculation unit.
In a possible implementation, the processing unit is further configured to: determine traffic detection data of the intersection according to the first target set and the second target set, where the traffic detection data includes one or more of: flow information of lanes at the intersection, target average speed, target queuing length, time occupancy of lanes at the intersection, and space occupancy of lanes at the intersection.
With this scheme, the traffic facilities at the intersection can be interconnected horizontally, enabling fine-grained management of the intersection. In addition, because the scheme analyses radar data while reusing the video-analysis capability of the electronic police and bayonet equipment, it can be implemented on top of the intersection's existing traffic facilities, avoiding resource waste and repeated construction, reducing construction cost, and allowing large-scale deployment and popularization.
In a second aspect, an embodiment of the present application provides a data processing method, including: acquiring intersection data, wherein the intersection data comprises first intersection data at a first moment and second intersection data at a second moment, and the second moment is behind the first moment; processing the first intersection data to determine a first target set appearing in the intersection at a first time; the first set of targets comprises a first target; processing the second intersection data, and determining a second target set appearing in the intersection at a second moment; and when the second target set does not comprise the first target, determining the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data.
In a possible implementation, determining the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data specifically includes: determining, according to the position information of the first target in the first intersection data, the predicted position information of the first target in the second intersection data, and taking the predicted position information as the real position information, where the predicted position information satisfies the following formulas:
p_{i,x} = p_{i-1,x} + v_{i-1,x}*Δt + 0.5*a_{i-1,x}*Δt²
p_{i,y} = p_{i-1,y} + v_{i-1,y}*Δt + 0.5*a_{i-1,y}*Δt²

where p_{i,x} and p_{i,y} are the predicted x-axis and y-axis coordinates of the first target in the second intersection data; p_{i-1,x} and p_{i-1,y} are the x-axis and y-axis coordinates of the first target in the first intersection data; v_{i-1,x} and v_{i-1,y} are its x-axis and y-axis velocities in the first intersection data; a_{i-1,x} and a_{i-1,y} are its x-axis and y-axis accelerations in the first intersection data; and Δt is the time difference between the second moment and the first moment.
In a possible implementation manner, before the taking the predicted location information as the actual location information, the method further includes: when the position information of the first target in the first intersection data and the predicted position information of the first target in the second intersection data satisfy the following formula, determining that the predicted position information of the first target in the second intersection data is the real position information of the first target in the second intersection data:
|x1 − x2| < α*L_min
|y1 − y2| < β*W_sum

where x1 and y1 are the x-axis and y-axis coordinates in the position information of the first target in the first intersection data; x2 and y2 are the x-axis and y-axis coordinates in the predicted position information of the first target in the second intersection data; L_min is the length of whichever of the two positions has the smaller x-axis coordinate; W_sum is the sum of the width of the first target in the position information of the first intersection data and its width in the predicted position information of the second intersection data; α is the length coincidence degree threshold; and β is the width coincidence degree threshold.
In a possible implementation, the intersection data includes data detected by a first detection device in a first detection area and data detected by a second detection device in a second detection area, and the method further includes: processing the data of the first detection device to determine a second target in an overlapping area, where the overlapping area is the overlap of the first detection area and the second detection area; processing the data of the second detection device to determine a third target appearing in the overlapping area; and determining that the second target and the third target are the same target when they satisfy the following formula:

s < min(L0, L1, W0, W1)

where s is the distance between the second target and the third target, L0 and W0 are the length and width of the second target, and L1 and W1 are the length and width of the third target.
In a possible implementation manner, before acquiring the intersection data, the method further includes: obtaining a calibration result; the calibration result comprises a coordinate system adopted by the first detection device and a coordinate system adopted by the second detection device; and acquiring the information of the first detection area and the information of the second detection area.
In a possible implementation, before determining that the second target and the third target are the same target, the method further includes: correcting the longitude and latitude coordinates of a fourth target in the overlapping area according to the longitude deviation and the latitude deviation of the fourth target, where the fourth target is any target in the overlapping area. The latitude deviation of the fourth target satisfies:

Δlat = (ΔL*cos(Ang*π/180))/110540

and the longitude deviation of the fourth target satisfies:

Δlng = (ΔL*sin(Ang*π/180))/(111320*cos(lat_A*π/180))

where ΔL is the coordinate deviation displacement of the fourth target, Ang is the angle between the fourth target and due north, and lat_A is the original latitude of the fourth target.
In a possible implementation manner, traffic detection data of the intersection is determined according to the first target set and the second target set; the traffic detection data includes one or more of flow information of lanes in the intersection, a target average speed, a target queuing length, a time occupancy of lanes in the intersection, and a space occupancy of lanes in the intersection.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing computer instructions;
a processor coupled to the memory and configured to execute the computer instructions in the memory, where executing the computer instructions implements the method performed by the apparatus in the first aspect or the method in the second aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions that, when executed on a computer, cause the computer to perform the method according to any one of the first or second aspects.
For the technical effects achievable by each of the second to fourth aspects, refer to the description above of the technical effects achievable by the first aspect and its various possible implementations; details are not repeated here.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments are briefly introduced below; the drawings described below show only some embodiments of the present application.
FIG. 1 is a schematic diagram of a data processing system according to an embodiment of the present application;
fig. 2 is an exemplary flowchart of a data processing method provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a detection coil provided in an embodiment of the present application;
fig. 4 is a schematic diagram of a signal scheme optimization process provided in an embodiment of the present application;
fig. 5 is a schematic view of a traffic problem diagnosis process provided in an embodiment of the present application;
fig. 6 is a schematic flow chart of a traffic event recognition process provided in an embodiment of the present application;
FIG. 7 is a block diagram of a data processing system according to an embodiment of the present application;
FIG. 8 is a diagram illustrating a data processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to facilitate understanding of technical solutions provided by the embodiments of the present application, terms of art related to the embodiments of the present application are described below.
1) A bayonet is an on-site road traffic monitoring device installed at a specific position on a road to photograph, process, and record all motor vehicles, and the persons in them, that pass through it.
2) A signal machine (traffic signal controller) is one of the important components of a modern urban traffic system and is mainly used to control and manage urban road traffic signals.
3) Electronic police equipment combines technologies such as motor vehicle detection, photoelectric imaging, automatic control, network communication, and computing to monitor, around the clock, motor vehicle violations such as running red lights, driving against traffic, speeding, crossing lane lines, and illegal parking, and to capture the violation information.
4) A Road Side Unit (RSU) is a device installed at the roadside in an Electronic Toll Collection (ETC) system; it communicates with an On-Board Unit (OBU) using Dedicated Short Range Communication (DSRC) technology to identify motor vehicles and perform electronic deduction.
5) An On-Board Unit (OBU) is a microwave device that communicates with the RSU using DSRC technology.
6) Intersection channelization guides and dredges road traffic flows by arranging traffic islands, traffic markings, and signs, keeping traffic flowing smoothly, thereby improving road capacity and driving safety and ensuring the safety of pedestrians.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments, but not all embodiments, of the technical solutions of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments described in the present application without any creative effort belong to the protection scope of the technical solution of the present application.
The terms "first" and "second" in the embodiments of the present application are used to distinguish different objects, not to describe a specific order. Furthermore, the term "comprises" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to the steps or elements listed, but may include other steps or elements not listed or inherent to such a process, method, article, or apparatus. "A plurality of" in the present application means at least two, for example two, three, or more; the embodiments of the present application are not limited in this respect.
In addition, the term "and/or" herein is only one kind of association relationship describing the association object, and means that there may be three kinds of relationships, for example, a and/or B, and may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" in this document generally indicates that the preceding and following related objects are in an "or" relationship unless otherwise specified.
At present, handling traffic problems or traffic events at an intersection requires sending the information collected by devices such as radar, video equipment, and electronic police at the intersection to a back-end centre for analysis and processing. As a result, the traffic devices at the intersection cannot be interconnected horizontally, which restricts and hampers fine-grained management of the intersection.
In view of this, the present application provides a data processing method, which can be applied to a data processing system. In this method, an edge calculation unit acquires intersection data at multiple moments and processes it to determine the real position information of lost targets, thereby improving the accuracy of tracking targets appearing at the intersection.
Fig. 1 is a schematic diagram of a data processing system according to an embodiment of the present disclosure. The system includes an edge calculation unit, radar equipment, video equipment, signal equipment, a configuration tool, and a central system. The edge calculation unit can be installed in a signal cabinet, and the video equipment includes bayonet and electronic police cameras. The data processing system can be applied to fine-grained intersection management such as intersection signal scheme optimization, intersection traffic problem diagnosis, and traffic event identification.
Referring to fig. 2, an exemplary flowchart of a data processing method provided in an embodiment of the present application may include the following processes:
s201, the edge computing unit acquires intersection data.
For example, the edge calculation unit may acquire intersection data through a network port from the detection devices already installed at the intersection. The detection devices may include radar devices and video devices, such as bayonet and electronic police cameras. In one possible case, whenever a detection device detects a target at the intersection, it may send the collected data of that target to the edge calculation unit. In another possible case, the detection devices may periodically send the collected intersection data to the edge calculation unit. The intersection data in S201 may include first intersection data at a first time and second intersection data at a second time, where the second time may be after the first time. Alternatively, the intersection data may include data of the intersection at a plurality of times. The embodiments of the present application are described using the example of the edge calculation unit obtaining first intersection data at a first time and second intersection data at a second time.
Because the detection device can include a radar device and a video device, the intersection data can include radar data collected by the radar device and video data collected by the video device. For example, the intersection data in S201 may include radar data and video data at a first time, and further include radar data and video data at a second time.
The radar data may include one or more of: the device number of the radar equipment, the sending time of the radar data, the identification (ID) of the target, the abscissa of the target, the ordinate of the target, the abscissa velocity of the target, the ordinate velocity of the target, the length of the target, the width of the target, the speed of the target, the longitude of the target's position, and the latitude of the target's position.
Several radar devices may exist at one intersection, and each has its own device number, which the radar device sends to the edge calculation unit together with the radar data. The sending time of the radar data is the time at which the radar device sends the data to the edge calculation unit. The target identification is assigned to the target by the radar device after recognizing it and may be a serial number or a random number. The abscissa and ordinate of the target are its coordinates in the radar device's coordinate system.
The video data may include one or more of: the device number of the video equipment, the sending time of the video data, the target identification, the passing time of the target, the lane number where the target is located, the license plate colour of the target, the license plate number of the target, the vehicle speed of the target, and the sequence number of the captured intersection picture.
The video equipment sends its device number to the edge calculation unit together with the video data. The sending time of the video data is the time at which the video device sends the data to the edge calculation unit. The target identification is assigned to the target by the video device after recognizing it and may be a serial number or a random number. The captured intersection picture is the picture taken by a bayonet in the video equipment.
Optionally, the intersection data may also include signal machine data from the intersection's signal equipment, such as intersection channelization information, the intersection signal lamp scheme, the signal lamp state, and the signal lamp countdown. The channelization information includes the intersection's lane information; the signal lamp scheme includes the green split and cycle of the intersection's signal lamps; the lamp state indicates whether the signal is red, green, or yellow when the intersection data is obtained; and the countdown indicates the signal lamp countdown at that time.
In a possible implementation, the intersection data may further include Road Side Unit (RSU) data, which may include: the On-Board Unit (OBU) identification, the license plate number of the target, the position of the target, the lane where the target is located, and the speed of the target. A sketch of these records is given below.
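To make the record structure concrete, the fields listed above can be modelled as follows. This is a minimal illustrative sketch only; the patent does not define a data schema, and every field name here is an assumption:

```python
from dataclasses import dataclass

@dataclass
class RadarRecord:
    """One radar detection, mirroring the radar-data fields listed above."""
    device_id: str        # device number of the radar equipment
    sent_at: float        # sending time of the radar data (epoch seconds)
    target_id: int        # identification (ID) assigned by the radar
    x: float              # abscissa in the radar coordinate system (m)
    y: float              # ordinate in the radar coordinate system (m)
    vx: float             # abscissa (x-axis) velocity (m/s)
    vy: float             # ordinate (y-axis) velocity (m/s)
    length: float         # target length (m)
    width: float          # target width (m)
    speed: float          # scalar speed (m/s)
    lng: float            # longitude of the target position
    lat: float            # latitude of the target position

@dataclass
class VideoRecord:
    """One video (bayonet / electronic police) detection."""
    device_id: str        # device number of the video equipment
    sent_at: float        # sending time of the video data
    target_id: int        # identification assigned by the video device
    passed_at: float      # passing time of the target
    lane_no: int          # lane number where the target is located
    plate_color: str      # license plate colour
    plate_no: str         # license plate number
    speed: float          # vehicle speed
    picture_seq: int      # sequence number of the captured intersection picture
```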
S202, the edge calculation unit processes the first intersection data and determines a first target set appearing in the intersection.
The first target set may include a first target.
Optionally, the intersection data acquired in S201 may include data such as a target identifier of the target, a speed of the target, an acceleration of the target, and position information of the target. The edge calculation unit may determine information such as velocity, acceleration and position of each object in the first set of objects.
It should be understood that the edge calculation unit may determine the speed of the target based on any one of radar data, video data, or RSU data in the intersection data. Alternatively, the edge calculation unit may also obtain the speeds of the targets in the radar data, the video data, and the RSU data, and then take the average value as the speed of the target, which is not limited in this application.
Optionally, the edge calculation unit may further identify a type of the object and a size of the object. For example, the edge calculation unit may identify the type of each object in the first set of objects and the size of each object. For another example, the edge calculation unit may identify a type of each object in the second set of objects and a size of each object.
It should be noted that the size of the target may be determined from the length and width of the target in the radar data. The types of targets may include: motor vehicles, non-motor vehicles, and pedestrians.
S203, the edge calculation unit processes the second intersection data and determines a second target set appearing in the intersection.
According to the data processing method of S202, the edge calculation unit may also determine information such as the velocity, acceleration, and position of each object in the second set of objects.
And S204, when the second target set does not comprise the first target, determining the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data.
For example, the edge calculation unit may recognize the first target at the intersection at the first time but fail to recognize it at the second time, in which case the edge calculation unit may consider the first target lost. In this case, the edge calculation unit tracks the first target; that is, it determines the real position information of the first target at the second time from the position, velocity, and acceleration information of the first target at the first time. The details are as follows.
the edge calculation unit may determine predicted location information of the first object at the second time instant from the location information of the first object at the first time instant. For convenience of explanation, the first time will be referred to as time i-1, and the second time will be referred to as time i. That is, the edge calculation unit may calculate the predicted coordinates of the first object at the time i from the coordinates, velocity, and acceleration information of the first object at the time i-1, satisfying equation (1).
p_{i,x} = p_{i-1,x} + v_{i-1,x}*Δt + 0.5*a_{i-1,x}*Δt²   formula (1)
p_{i,y} = p_{i-1,y} + v_{i-1,y}*Δt + 0.5*a_{i-1,y}*Δt²

where p_{i,x} and p_{i,y} are the predicted x-axis and y-axis coordinates of the first target at time i; p_{i-1,x} and p_{i-1,y} are its x-axis and y-axis coordinates at time i-1; v_{i-1,x} and v_{i-1,y} are its x-axis and y-axis velocities at time i-1; a_{i-1,x} and a_{i-1,y} are its x-axis and y-axis accelerations at time i-1; and Δt is the time difference between time i and time i-1.
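A minimal sketch of formula (1) in Python (illustrative only; the function name and unit conventions are assumptions, not from the patent):

```python
def predict_position(p_x, p_y, v_x, v_y, a_x, a_y, dt):
    """Constant-acceleration prediction of a lost target's position, per
    formula (1): p_i = p_{i-1} + v_{i-1}*dt + 0.5*a_{i-1}*dt**2."""
    pred_x = p_x + v_x * dt + 0.5 * a_x * dt ** 2
    pred_y = p_y + v_y * dt + 0.5 * a_y * dt ** 2
    return pred_x, pred_y
```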
Target matching is then performed between the coordinates of the first target at time i-1 (target b) and its predicted coordinates at time i (target c). Let the coordinates of target b be (x1, y1) and the coordinates of target c be (x2, y2); if formula (2) and formula (3) are satisfied simultaneously, target b and target c are considered the same target:
|x1 − x2| < α*L_min   formula (2)
|y1 − y2| < β*W_sum   formula (3)

where L_min is the length of whichever of target b and target c has the smaller x-axis coordinate, and W_sum is the sum of the widths of target b and target c. α is the length coincidence degree threshold and β is the width coincidence degree threshold; both thresholds are preset empirically and may be equal or different, which the present application does not limit.
When the target b and the target c are determined to be the same target, the predicted coordinate of the first target at the time i can be determined to be the real coordinate of the first target at the time i, that is, the real position information of the first target at the second time is determined.
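A sketch of the matching test of formulas (2) and (3), following the reconstruction above (the dict-based representation and parameter names are assumptions):

```python
def is_same_target(b, c, alpha, beta):
    """Match target b (position at time i-1) against predicted target c
    (time i). b and c are dicts with keys x, y, length, width; alpha and
    beta are the length/width coincidence degree thresholds."""
    # length of whichever target has the smaller x-axis coordinate
    l_min = b["length"] if b["x"] <= c["x"] else c["length"]
    w_sum = b["width"] + c["width"]
    return (abs(b["x"] - c["x"]) < alpha * l_min      # formula (2)
            and abs(b["y"] - c["y"]) < beta * w_sum)  # formula (3)
```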
In one possible scenario, each target in the intersection data should have a unique position, velocity, and acceleration. However, an intersection may contain multiple different devices, such as several radar devices and several video devices, whose detection areas may overlap, so the same target may end up with multiple different positions, velocities, and accelerations within the overlapping area. To ensure that the same target keeps a unique position, velocity, and acceleration across different devices, the edge calculation unit may fuse the instances of the same target reported by different devices in the overlapping area.
The overlapping area is determined from the detection area that the configuration tool configures for each device. Specifically, before the intersection data is acquired, the configuration tool configures the basic information of the edge calculation unit through a network interface. The basic information here may include information of the detection areas. A detection area can be understood as the area in which a radar or video device detects and recognizes targets. It should be understood that the detection area may be preset according to actual conditions or experience, which the present application does not limit; for example, the area within 100 m of a radar device may be selected as its detection area.
First, the edge calculation unit acquires the information of the first detection area and the information of the second detection area, where the first detection area is the detection area of the first detection device and the second detection area is that of the second detection device. It should be appreciated that each of the first and second detection devices may be a radar device or a video device. The embodiments of the present application take target fusion between two different radar devices as the example.
The edge calculation unit then processes the data of the first detection device to determine a second target within the overlapping area of the first and second detection areas, and processes the data of the second detection device to determine a third target in the overlapping area. Target fusion is performed when the second target and the third target are determined to be the same target. Specifically, longitude and latitude correction is first performed on all targets of the result target set that lie in the overlapping area, satisfying formulas (4) and (5). The result target set is the set of all targets determined in the previous round of target fusion; if target fusion is being performed for the first time, longitude and latitude correction is performed on all targets in the overlapping area.
lat_B = lat_A + Δlat   formula (4)
lng_B = lng_A + Δlng   formula (5)

where lng_A and lat_A are the original longitude and latitude of the target, and lng_B and lat_B are its corrected longitude and latitude. Δlng is the longitude deviation and satisfies formula (6); Δlat is the latitude deviation and satisfies formula (7):

Δlng = (ΔL*sin(Ang*π/180))/(111320*cos(lat_A*π/180))   formula (6)
Δlat = (ΔL*cos(Ang*π/180))/110540   formula (7)
where ΔL is the coordinate deviation displacement and satisfies formula (8), and Ang is the angle between the original target and due north and satisfies formula (9). 111320 is the length in metres of one degree of longitude at the earth's equator, and 110540 is the length in metres of one degree of latitude.
ΔL = sqrt(Δx² + Δy²)   formula (8)

where Δx is the target's x-axis displacement, satisfying formula (10), and Δy is the target's y-axis displacement, satisfying formula (11).

Ang = γ + δ   formula (9)

where γ is the angle of the corrected target in the radar coordinate system, satisfying formula (12), and δ is the angle between the normal of the radar device and due north.

Δx = v_x*Δt + 0.5*a_x*Δt²   formula (10)
Δy = v_y*Δt + 0.5*a_y*Δt²   formula (11)
γ = arctan(Δx/Δy)   formula (12)

where v_x is the target's x-axis velocity, v_y its y-axis velocity, a_x its x-axis acceleration, and a_y its y-axis acceleration; Δt is the time difference between the moment the latest radar data is received and the moment the latest target fusion result was obtained, satisfying formula (13). It will be appreciated that Δt is 0 if target fusion is performed for the first time.

Δt = t_i − t_{i-1}   formula (13)

where t_i is the time at which the latest radar data is received and t_{i-1} is the time at which the fused result target set was last obtained.
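The correction chain of formulas (4)-(13) can be sketched as follows. The arctangent form of γ in formula (12) is a reconstruction from context, and the helper name and argument conventions are assumptions:

```python
import math

def correct_lat_lng(lat_a, lng_a, vx, vy, ax, ay, dt, delta):
    """Dead-reckon a target's latitude/longitude by dt seconds, per
    formulas (4)-(13). delta is the angle between the radar normal and
    due north, in degrees."""
    dx = vx * dt + 0.5 * ax * dt ** 2          # formula (10)
    dy = vy * dt + 0.5 * ay * dt ** 2          # formula (11)
    dl = math.hypot(dx, dy)                    # formula (8)
    gamma = math.degrees(math.atan2(dx, dy))   # formula (12), reconstructed
    ang = gamma + delta                        # formula (9)
    dlat = (dl * math.cos(math.radians(ang))) / 110540                 # (7)
    dlng = (dl * math.sin(math.radians(ang))) / (
        111320 * math.cos(math.radians(lat_a)))                        # (6)
    return lat_a + dlat, lng_a + dlng          # formulas (4) and (5)
```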
And then, calculating the distance between the second target and the third target according to the corrected longitude and latitude coordinates of the second target and the third target. And when the distance is smaller than the threshold value, fusing the two targets, namely determining that the second target and the third target are the same target. Wherein the distance between the targets satisfies equation (14).
s = 2*R*arcsin(sqrt(sin²(a/2) + cos(lat0*π/180)*cos(lat1*π/180)*sin²(b/2)))   formula (14)

where s is the distance between the two targets, R is the earth's radius, a is the latitude radian difference, satisfying formula (15), and b is the longitude radian difference, satisfying formula (16):

a = (lat0 − lat1)*π/180   formula (15)
b = (lng0 − lng1)*π/180   formula (16)

where lat0 and lat1 are the latitudes of the two targets and lng0 and lng1 are their longitudes. When the distance s satisfies formula (17), the second target and the third target are determined to be the same target:

s < min(L0, L1, W0, W1)   formula (17)

where L0 and W0 are the length and width of the second target, and L1 and W1 are the length and width of the third target.
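A sketch of the distance test of formulas (14)-(17), assuming the haversine reading of formula (14) reconstructed above; the earth-radius constant is an assumed value, as the patent does not state one:

```python
import math

R_EARTH = 6371000.0  # mean earth radius in metres (assumption)

def target_distance(lat0, lng0, lat1, lng1):
    """Great-circle distance between two targets, per formulas (14)-(16)."""
    a = math.radians(lat0 - lat1)   # formula (15)
    b = math.radians(lng0 - lng1)   # formula (16)
    h = (math.sin(a / 2) ** 2
         + math.cos(math.radians(lat0)) * math.cos(math.radians(lat1))
         * math.sin(b / 2) ** 2)
    return 2 * R_EARTH * math.asin(math.sqrt(h))   # formula (14)

def should_fuse(s, l0, l1, w0, w1):
    """Formula (17): fuse two detections when their distance is smaller
    than every one of the two lengths and two widths."""
    return s < min(l0, l1, w0, w1)
```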
After the second target and the third target are determined to be the same target, information such as coordinates, speed and acceleration of the second target and the third target can be fused, and data such as the unique coordinates, speed and acceleration of the target can be determined. The edge calculation unit may generate a unique target identification based on the target identification of the second target in the first detection device and the target identification of the third target in the second detection device. Alternatively, the edge calculation unit may store the target identifier of the second target in the first detection device and the target identifier of the third target in the second detection device in a corresponding manner, so as to indicate that the two target identifiers are identifiers of the same target.
The coordinate system used in target fusion may be configured into the edge calculation unit by the configuration tool. For example, the basic information configured by the configuration tool may further include the calibration results of the first and second detection devices. A calibration result is obtained by measuring and calibrating the intersection's detection devices and helps the edge calculation unit accurately register, in time and space, the targets detected by different devices. The calibration result of a radar device may include the coordinate system adopted by its radar data, specifically the longitude and latitude of the radar device at the intersection, its deflection angle from due north, the relative positions of each pair of devices, and the like.
It should be understood that, the above target fusion operation is described by taking two radar devices as the first detection device and the second detection device as examples, the operation of performing target fusion on two video devices by the first detection device and the second detection device may be performed by referring to the two radar devices, and the operation of performing target fusion on one radar device and one video device by the first detection device and the second detection device may also be performed by referring to the two radar devices.
It should be noted that the coordinate system adopted by the video device may also be configured by the configuration tool into the edge calculation unit. For example, the basic information configured by the configuration tool for the edge calculation unit may further include a calibration result of the video device, and may be implemented by referring to the calibration result of the radar device, which is not described herein again.
Optionally, the basic information may further include information of masked areas. A masked area is an area that the radar and video devices do not detect; in other words, the intersection data does not include information on targets in masked areas.
It should be understood that masked areas are preset according to actual conditions or experience, which the present application does not limit. For example, the area outside the real lanes can be set as a masked area, so that billboards or trees outside the lanes are masked out.
It is to be understood that, in the intersection data obtained by the edge calculation unit in S201, radar data may be obtained by one or more radar devices in the detection area, and video data may be obtained by one or more video devices in the detection area.
In a possible implementation, after target fusion, characteristic attributes of each target, such as its length, width, coordinates, license plate, speed, angular velocity, and yaw angle, can be determined. With each target having unique characteristic attributes, the edge calculation unit can track each target and generate its trajectory, which is then used to calculate traffic detection data and identify traffic events.
After the edge calculation unit determines the speed, acceleration, position and other related information of each target appearing in the intersection according to the flow, the traffic detection data of the intersection can be calculated according to the information of each target in the intersection data.
The traffic detection data may include one or more of traffic information of lanes at the intersection, time occupancy of the lanes, headway, target average speed, space occupancy of the lanes, number of zone targets, headway-to-stop line distance, queuing length of the lanes, number of stops, and delay time.
The above-described traffic detection data will be described below.
1) The time occupancy is a time ratio of one lane at the intersection where the target occupies the lane.
2) The space occupancy is the ratio of the total length of the target in the detection area to the length of the detection area.
3) The headway refers to the time difference between the front ends of two front and rear targets passing through the same place in one lane of the intersection.
4) The section of the target number of sections may be preset according to experience or actual conditions, for example, the section may be a lane solid line section, which is not limited in the present application.
5) Delay time, which is the time the target waits for a red light.
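As a small illustration of definitions 1) and 2), both occupancies reduce to simple ratios (the helper names and inputs are assumptions; the patent defines only the quantities themselves):

```python
def time_occupancy(occupied_seconds, interval_seconds):
    """1) Time occupancy: fraction of the interval during which targets
    occupy the lane."""
    return occupied_seconds / interval_seconds

def space_occupancy(target_lengths, detection_area_length):
    """2) Space occupancy: total target length over detection-area length."""
    return sum(target_lengths) / detection_area_length
```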
Specifically, the flow information of a lane at the intersection is calculated using detection coils configured into the edge calculation unit by the configuration tool. For example, the basic information may also include information of the detection coils. The parameters of a detection coil include the lane L_N to which the coil belongs, the x-axis coordinate C_x_pos of the coil, the y-axis coordinate C_y_pos of the coil, the coil length C_L, and the coil width C_W.
Fig. 3 is a schematic view of a detection coil provided in an embodiment of the present application. Taking the detection coil of the middle lane in fig. 3 as an example, the lane to which the coil belongs is L_2, and the four vertex coordinates of the detection coil are A(x1, y1), B(x1, y2), C(x2, y1), and D(x2, y2). The vertex coordinates are determined from the coil's configuration parameters: x1 and x2 satisfy formula (18), and y1 and y2 satisfy formula (19).
x1 = C_x_pos − C_L/2,  x2 = C_x_pos + C_L/2   formula (18)
y1 = C_y_pos − C_W/2,  y2 = C_y_pos + C_W/2   formula (19)
When the target's x-axis coordinate shows that the target has passed through both the AB edge and the CD edge of the coil, the traffic flow count is incremented by 1; the traffic flow at the intersection over a given period can then be counted, as sketched below.
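A sketch of the coil geometry of formulas (18)/(19) and the edge-crossing count described above (the per-target state machine and the assumption that traffic moves in the +x direction are illustrative choices, not from the patent):

```python
def coil_vertices(c_x, c_y, c_len, c_wid):
    """Vertices A, B, C, D of a detection coil from its configured centre
    (c_x, c_y), length and width, per formulas (18) and (19)."""
    x1, x2 = c_x - c_len / 2, c_x + c_len / 2   # formula (18)
    y1, y2 = c_y - c_wid / 2, c_y + c_wid / 2   # formula (19)
    return (x1, y1), (x1, y2), (x2, y1), (x2, y2)

def update_flow(states, flow_count, target_id, x, x1, x2):
    """Count a target once its x coordinate has passed first the AB edge
    (x = x1) and then the CD edge (x = x2). `states` maps target_id to
    'before' or 'inside'."""
    stage = states.get(target_id, "before")
    if stage == "before" and x >= x1:
        states[target_id] = "inside"            # crossed the AB edge
    elif stage == "inside" and x >= x2:
        states.pop(target_id)                   # crossed the CD edge
        flow_count += 1                         # one vehicle counted
    return flow_count
```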
In a possible implementation, when calculating the queuing length information of a lane at the intersection, all stopped targets in the detection area can be collected to obtain a stopped-target set. A stopped target is a target whose speed is below a speed threshold; the threshold is preset empirically or according to actual conditions and may be, for example, 5 km/h, 8 km/h, or 10 km/h, which the present application does not limit.
The x-axis coordinates of all targets in the stopped-target set are then sorted in ascending order to obtain the sorted x-axis coordinate set {…, O_x_pos,i−1, O_x_pos,i, …}, where O_x_pos,i is the x-axis coordinate of the i-th stopped target and i is an integer greater than or equal to 1. The difference Δ_x_pos,i between the x-axis coordinate of each stopped target and that of the adjacent preceding stopped target is then calculated, satisfying formula (20):

Δ_x_pos,i = O_x_pos,i − O_x_pos,i−1   formula (20)
Since the leading stopped target behind the stop line has no adjacent preceding stopped target, the difference between its x-axis coordinate and the x-axis coordinate of the stop line is calculated instead, satisfying formula (21):

Δ_x_pos,1 = O_x_pos,1 − S_x_pos,stop   formula (21)

where Δ_x_pos,1 is the difference between the x-axis coordinate of the 1st stopped target and the x-axis coordinate of the stop line, and S_x_pos,stop is the x-axis coordinate of the stop line.
Each difference Δ_x_pos,i between a stopped target and the adjacent preceding stopped target is compared with the maximum queuing gap threshold. If Δ_x_pos,i is greater than the threshold, the x-axis coordinate of the (i−1)-th stopped target is taken as a queue-end position; if several Δ_x_pos,i exceed the threshold, there may be several queue-end positions, and the x-axis coordinate of the stopped target with the smallest x-axis coordinate among them is taken as the queue end. If no Δ_x_pos,i exceeds the threshold, the x-axis coordinate of the stopped target with the largest x-axis coordinate in the stopped-target set is taken as the queue end.
The queue length is then calculated from the determined queue end position. If the queue end position is the x-axis coordinate of the j-th stationary target, the queue length satisfies formula (22).

L_q = O_x_pos,j + O_L,j - S_x_pos,stop Formula (22)

where L_q denotes the queue length, O_x_pos,j the x-axis coordinate of the j-th stationary target, O_L,j the length of the j-th stationary target, and S_x_pos,stop the x-axis coordinate of the stop line.
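As a rough illustration, formulas (20)-(22) could be combined as follows (a minimal Python sketch; the thresholds and data layout are assumptions, not values from the application):

```python
def queue_length(targets, stop_line_x, speed_th=8 / 3.6, max_gap=10.0):
    """Queue length per formulas (20)-(22). speed_th (m/s) and max_gap (m)
    are illustrative values, not taken from the application."""
    # keep only stationary targets, sorted by x increasing away from the stop line
    stopped = sorted((t for t in targets if t["v"] < speed_th),
                     key=lambda t: t["x"])
    if not stopped:
        return 0.0
    end = None
    prev_x = stop_line_x              # formula (21): the first gap uses the stop line
    for i, t in enumerate(stopped):
        if t["x"] - prev_x > max_gap:  # gap too large: the queue ends before t
            end = stopped[i - 1] if i > 0 else None
            break
        prev_x = t["x"]
    else:
        end = stopped[-1]             # no break in the queue: the last target ends it
    if end is None:
        return 0.0                    # the first target is already beyond the gap
    return end["x"] + end["L"] - stop_line_x   # formula (22)

# Usage: queue_length([{"x": 2.0, "L": 4.5, "v": 0.3},
#                      {"x": 9.0, "L": 4.5, "v": 0.0}], stop_line_x=0.0)
```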
Based on this scheme, the queuing length information of the lanes in the traffic detection data can be calculated from the lane number of a target and the position information of the target; the queuing length information is used by the edge computing unit to optimize the intersection signal scheme and diagnose intersection traffic problems.
After the traffic detection data are collected, whether the green-split setting of the intersection signal lights is reasonable can be judged from data such as the lane flow information, the lane queuing length information, and the delay time in the traffic detection data; if it is not reasonable, the traffic at the intersection can be optimized by adjusting the signal scheme.

In a possible implementation, the edge computing unit obtains the traffic signal data of the traffic signal equipment at the intersection through its network port and then optimizes the intersection signal scheme in combination with the traffic detection data. Fig. 4 is a schematic diagram of a signal scheme optimization process provided in an embodiment of this application; the process can be repeated to achieve adaptive optimization of the intersection signal scheme. In the embodiment shown in fig. 4, the target type is exemplified as a motor vehicle.
S401, the edge calculation unit acquires traffic detection data when the first signal scheme or the fourth signal scheme operates.
If the fourth signal scheme exists, the edge calculation unit acquires traffic detection data when the fourth signal scheme is operated. If the fourth signal scheme does not exist, the edge calculation unit acquires traffic detection data when the first signal scheme is operated. Wherein the first signal scheme refers to a signal scheme in the traffic signal data, and the fourth signal scheme is the signal scheme generated in S407.
S402, the edge calculation unit adjusts the first signal scheme in real time to generate a second signal scheme.
For example, the green light may be extended or terminated according to vehicle arrivals: when the detection coil detects a vehicle on the lane, the green time can be extended accordingly; when the detection coil detects no vehicle on the lane, the green can be terminated.
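For illustration, such an actuated extend/terminate rule could be sketched as follows (a minimal Python sketch; the minimum/maximum green and unit extension values are assumptions, not taken from the application):

```python
def hold_green(detector_occupied, green_elapsed, gap_since_last_call,
               min_green=10.0, max_green=60.0, unit_extension=3.0):
    """Return True to keep the green, False to terminate it (the extend/
    terminate rule sketched above; all timing values are illustrative)."""
    if green_elapsed < min_green:
        return True                    # always serve the minimum green time
    if green_elapsed >= max_green:
        return False                   # never exceed the maximum green time
    if detector_occupied or gap_since_last_call < unit_extension:
        return True                    # an arriving vehicle extends the green
    return False                       # gap-out: no demand, terminate the green

# Usage: hold_green(detector_occupied=False, green_elapsed=24.0,
#                   gap_since_last_call=4.2)  # -> False (gap-out)
```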
S403, the edge calculation unit acquires traffic detection data when the second signal scheme is operated.
S404, the edge computing unit establishes a signal-light cycle and green-split model.

The edge computing unit establishes the cycle and green-split model from the traffic detection data acquired in S403.
S405, the edge calculation unit generates a third signal scheme.
The edge computing unit further optimizes the second signal scheme according to the traffic detection data and the cycle and green-split model to generate a third signal scheme. For example, the Webster model may be used to calculate an optimal signal cycle from the delay time for generating the third signal scheme.
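The application names the Webster model without giving its form; as a hedged illustration, the standard Webster optimal-cycle formula could be applied as follows (function name and inputs are assumptions):

```python
def webster_cycle(lost_time, flow_ratios):
    """Webster's optimal cycle C0 = (1.5L + 5) / (1 - Y); this standard form
    is an assumption, since the application only names the model.

    lost_time:   total lost time L per cycle (s)
    flow_ratios: critical flow ratio y_i = q_i / s_i for each phase
    """
    Y = sum(flow_ratios)
    if Y >= 1.0:
        raise ValueError("oversaturated intersection: Webster does not apply")
    c0 = (1.5 * lost_time + 5.0) / (1.0 - Y)
    greens = [(c0 - lost_time) * y / Y for y in flow_ratios]  # green splits
    return c0, greens

# Usage: webster_cycle(lost_time=12.0, flow_ratios=[0.30, 0.25])  # ~51 s cycle
```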
S406, the edge computing unit evaluates the third signal scheme according to the pre-stored traffic evaluation model.
The third signal scheme is evaluated through a pre-stored traffic evaluation model. The traffic evaluation model is preset according to experience or actual conditions and is not limited in this application; for example, the optimized signal scheme can be evaluated through the number of vehicle stops and the delay time at the intersection.
S407, the edge calculation unit performs signal scheme adjustment to generate a fourth signal scheme.
The preliminarily optimized signal scheme is evaluated through the pre-stored traffic evaluation model, and the third signal scheme is further adjusted to generate a fourth signal scheme. For example, if the pre-stored traffic evaluation model judges that the third signal scheme causes excessive delay at the intersection, the third signal scheme is adjusted by increasing the green split to obtain the fourth signal scheme. S401 is then performed again with the obtained fourth signal scheme.
In a possible implementation, the edge computing unit can also identify traffic problems at the intersection according to the real-time trajectory data of motor vehicles and a pre-stored traffic problem model. The traffic problems may include unreasonable phase-sequence setting at the intersection, excessive green time, unbalanced green-split allocation, unreasonable lane-function division, low exit throughput, and the like.
Fig. 5 is a schematic view of a traffic problem diagnosis process provided in the embodiment of the present application.
S501, the edge computing unit generates the trajectory of the motor vehicle.
The edge computing unit generates, from each motor vehicle identifier and the vehicle's position information, the trajectory of the vehicle represented by that identifier. The position information of a motor vehicle may be determined from radar data, video data, or RSU data in the intersection data, which is not limited in this application.
S502, the traffic problem is modeled.
The traffic problem model is stored in the edge computing unit before traffic problem diagnosis. It is established as follows: traffic problems are first defined according to the laws, standards, and experience of traffic management, and the traffic problem model is then built from those definitions and from traffic operation evaluation data. The traffic operation evaluation data may include one or more items of traffic detection data, such as lane queuing length, number of stops, and delay time, selected according to actual conditions or experience, which is not limited in this application.
S503, the edge calculation unit judges the traffic problem.
The edge computing unit judges whether a traffic problem exists at the intersection according to the pre-stored traffic problem model and the real-time trajectory data of the motor vehicles.
S504, the edge computing unit stores the trajectory of the motor vehicle.
The edge computing unit can store the time period in which a traffic problem occurs and the trajectories of motor vehicles at the intersection. For example, it may store the trajectory data of intersection vehicles for the 15 minutes before and the 15 minutes after the traffic problem occurred.

Based on this scheme, the stored trajectory data can be used to reconstruct the scene during the period in which the traffic problem occurred and to find out how the problem arose and how it propagated (or dissipated), so that a targeted solution can be applied.
In a possible implementation, the edge computing unit can also generate real-time trajectory data of motor vehicles at the intersection from the intersection data of the event area, in order to identify traffic events at the intersection. Traffic events may include roadside illegal parking, wrong-way driving, running a red light, not following the guided lane, intersection overflow, speeding, illegal lane changes, and other violation events.

Optionally, the basic information configured by the configuration tool may further include event-area, intersection-channelization, and radar equipment information. The event area is the area used to detect whether a traffic event occurs, and traffic events are used to characterize violation events. It should be understood that the event area is preset according to actual conditions or experience and is not limited herein; for example, the first 30 m of the solid-line segment of a lane may be taken as the event area.
Fig. 6 is a schematic view of a traffic event identification process provided in the embodiment of the present application.
S601, the edge computing unit generates the trajectory of the motor vehicle.
The edge computing unit generates, from the vehicle identifiers in the event area and the vehicles' position information, the trajectory of each vehicle represented by its identifier. The position information of a motor vehicle may be determined from radar data, video data, or RSU data in the intersection data, which is not limited in this application.
S602, traffic events are defined.
Traffic events are defined according to the relevant specifications and configured into the edge computing unit. For example, traffic events may be defined to include roadside illegal parking, wrong-way driving, running a red light, not following the guided lane, and other violations.
S603, the edge computing unit identifies the traffic events.
The edge computing unit identifies in real time whether a traffic event exists at the intersection according to the definitions of the traffic events. The rules, and the prescribed vehicle operation corresponding to each rule, may be stored in the edge computing unit. The edge computing unit compares the actual operation of a motor vehicle in the intersection data with the prescribed operation: if they are the same, no traffic event has occurred for that vehicle; if they differ, a traffic event can be considered to have occurred. For example, when the traffic light turns red, the prescribed operation is for the vehicle to stop before the stop line; if a vehicle does not stop before the stop line, a traffic event is identified.
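For illustration only, the red-light example could be checked as follows (a minimal Python sketch with assumed names; the application does not give an implementation):

```python
def ran_red_light(track_x, stop_line_x, red_phase_active):
    """Compare actual operation with the prescribed one for a red light: the
    vehicle should stop before the stop line, so a trajectory that starts
    behind the line and ends past it while the red is active is flagged.
    track_x is a time-ordered list of x positions; names are illustrative."""
    if not red_phase_active or len(track_x) < 2:
        return False
    return track_x[0] < stop_line_x <= track_x[-1]

# Usage: ran_red_light([-6.0, -2.5, 1.8], stop_line_x=0.0, red_phase_active=True)
```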
S604, the edge computing unit confirms the traffic event.
The edge computing unit further confirms the identified traffic events from the video data and radar data of the intersection. For example, an identified event may be confirmed through the behaviour, such as bypassing or stopping, of vehicles behind the vehicle identified in S603 as involved in the event.
S605, the edge computing unit stores the trajectory of the motor vehicle and the video data of the intersection.
The edge computing unit stores, for a period of time, the trajectory of a motor vehicle involved in a traffic event together with the video data of the intersection. For example, it may store the vehicle trajectory data and the intersection video for the 15 minutes before and the 15 minutes after the traffic event occurred.

In a possible implementation, after the edge computing unit has diagnosed traffic events and traffic problems, information such as the real-time trajectory of the target involved in a traffic event or traffic problem at the intersection, the traffic detection data, the monitoring of the signal optimization process, the traffic problems, and the traffic events can be displayed in the central system.
Fig. 7 is a block diagram of a data processing system according to an embodiment of this application. In the data processing system, a calibration tool first calibrates the positions of the equipment at the intersection, and a configuration tool configures the basic information of the intersection into the edge computing unit. The radar equipment, electronic-police (enforcement camera) equipment, checkpoint (gate) equipment, signal equipment, and RSU at the intersection then send the radar data, electronic-police data, checkpoint data, signal equipment data, and RSU data of the intersection to the edge computing unit. Next, the edge computing unit performs target identification, target tracking, target fusion, feature extraction, and data definition on the received intersection data to determine the position, speed, acceleration, and other relevant information of each target; the specific steps are detailed in the description of fig. 2 and are not repeated here. The traffic detection data of the intersection can be calculated from the relevant information of each target, so that the traffic signals of the intersection can be optimized, and the trajectory of each target can be generated from its relevant information, so that the traffic state of the intersection can be displayed and reconstructed. From the traffic detection data and the target trajectories, traffic problems can be diagnosed and traffic events recognized, enabling traffic control and, ultimately, refined management of the intersection.
Based on the same concept as the above method, fig. 8 is a schematic diagram of a data processing apparatus provided in an embodiment of this application. The data processing apparatus 800 includes a receiving unit 801 and a processing unit 802.
In one scenario:
the receiving unit 801 is configured to: acquiring intersection data, wherein the intersection data comprises first intersection data at a first moment and second intersection data at a second moment, and the second moment is behind the first moment;
the processing unit 802 is configured to: processing the first intersection data, and determining a first target set appearing in the intersection at a first moment; the first set of targets comprises a first target; processing the second intersection data, and determining a second target set appearing in the intersection at a second moment; and when the second target set does not comprise the first target, determining the real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data.
In a possible implementation manner, when the processing unit 802 determines, according to the location information of the first target in the first intersection data, the actual location information of the first target in the second intersection data, the processing unit is specifically configured to: according to the position information of the first target in the first intersection data, determining the predicted position information of the first target in the second intersection data, and taking the predicted position information as the real position information; the predicted position information satisfies the following formula:
p_i,x = p_i-1,x + v_i-1,x·Δt + (1/2)·a_i-1,x·Δt^2

p_i,y = p_i-1,y + v_i-1,y·Δt + (1/2)·a_i-1,y·Δt^2
where p_i,x and p_i,y are the predicted x-axis and y-axis coordinates of the first target in the second intersection data, p_i-1,x and p_i-1,y are the x-axis and y-axis coordinates of the first target in the first intersection data, v_i-1,x and v_i-1,y are the x-axis and y-axis speeds of the first target in the first intersection data, a_i-1,x and a_i-1,y are the x-axis and y-axis accelerations of the first target in the first intersection data, and Δt is the time difference between the second moment and the first moment.
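This is the standard constant-acceleration extrapolation; a minimal sketch (illustrative names) follows:

```python
def predict_position(p, v, a, dt):
    """Constant-acceleration extrapolation of the formula above.
    p, v, a are (x, y) tuples from the first intersection data; dt is the
    time difference between the two moments."""
    px = p[0] + v[0] * dt + 0.5 * a[0] * dt ** 2
    py = p[1] + v[1] * dt + 0.5 * a[1] * dt ** 2
    return px, py

# Usage: predict_position(p=(12.0, 3.5), v=(8.0, 0.0), a=(-1.0, 0.0), dt=0.1)
```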
In a possible implementation, before taking the predicted position information as the real position information, the processing unit 802 is further configured to: when the position information of the first target in the first intersection data and the predicted position information of the first target in the second intersection data satisfy the following formulas, determine that the predicted position information of the first target in the second intersection data is the real position information of the first target in the second intersection data:
|x_1 - x_2| ≤ α·L_min

|y_1 - y_2| ≤ β·W_sum

where x_1 and y_1 are the x-axis and y-axis coordinates in the position information of the first target in the first intersection data; x_2 and y_2 are the x-axis and y-axis coordinates in the predicted position information of the first target in the second intersection data; L_min is the smaller of the length of the first target in the position information in the first intersection data and its length in the predicted position information in the second intersection data; W_sum is the sum of the width of the first target in the position information in the first intersection data and its width in the predicted position information in the second intersection data; α is the length coincidence-degree threshold and β is the width coincidence-degree threshold.
In a possible implementation manner, the intersection data includes data detected by a first detection device in a first detection area and data detected by a second detection device in a second detection area; the processing unit 802 is further configured to: processing the data of the first detection device to determine a second target in an overlapping area; the overlap region is an overlap region of the first detection region and the second detection region; processing data of the second detection device to determine a third target appearing in the overlapping area; determining the second target and the third target to be the same target when the second target and the third target satisfy the following formula:
s < min(L_0, L_1, W_0, W_1)

where s denotes the distance between the second target and the third target, L_0 the length of the second target, L_1 the length of the third target, W_0 the width of the second target, and W_1 the width of the third target.
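A minimal sketch of this same-target test (the dictionary keys are assumptions):

```python
import math

def same_target(t0, t1):
    """The s < min(L0, L1, W0, W1) test: two detections from different devices
    in the overlap area are merged when their centre distance is smaller than
    every box dimension (dictionary keys are illustrative)."""
    s = math.hypot(t0["x"] - t1["x"], t0["y"] - t1["y"])
    return s < min(t0["L"], t1["L"], t0["W"], t1["W"])

# Usage: same_target({"x": 1.0, "y": 0.5, "L": 4.6, "W": 1.8},
#                    {"x": 1.4, "y": 0.6, "L": 4.4, "W": 1.8})  # -> True
```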
In a possible implementation manner, before the receiving unit 801 acquires the intersection data, the receiving unit is further configured to: obtaining a calibration result; the calibration result comprises a coordinate system adopted by the first detection device and a coordinate system adopted by the second detection device; and acquiring the information of the first detection area and the information of the second detection area.
In a possible implementation, before determining that the second target and the third target are the same target, the processing unit 802 is further configured to: correct the longitude and latitude coordinates of a fourth target in the overlap area according to the longitude deviation and the latitude deviation of the fourth target, the fourth target being any target in the overlap area. The latitude deviation of the fourth target satisfies the following formula:

Δlat = (ΔL·cos(Ang·π/180))/110540

where ΔL is the coordinate deviation displacement of the fourth target and Ang is the angle between the fourth target and true north. The longitude deviation of the fourth target satisfies the following formula:

Δlng = (ΔL·sin(Ang·π/180))/(111320·cos(lat_A·π/180))

where ΔL is the coordinate deviation displacement of the fourth target, Ang is the angle between the fourth target and true north, and lat_A is the local latitude at the target position.
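For illustration, the two offsets could be computed as follows (a minimal Python sketch; treating lat_A as the local latitude is an assumption, since the original leaves it undefined):

```python
import math

def latlng_correction(dL, ang_deg, lat_deg):
    """Latitude/longitude offsets of a displacement dL (metres) at bearing
    ang_deg measured from true north, using the metres-per-degree constants
    in the formulas above; lat_deg is the assumed local latitude."""
    dlat = (dL * math.cos(math.radians(ang_deg))) / 110540.0
    dlng = (dL * math.sin(math.radians(ang_deg))) / (
        111320.0 * math.cos(math.radians(lat_deg)))
    return dlat, dlng

# Usage: latlng_correction(dL=2.5, ang_deg=30.0, lat_deg=36.1)
```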
In one possible implementation, the processing unit 802 is further configured to: determining traffic detection data of the intersection according to the first target set and the second target set; the traffic detection data includes one or more of flow information of lanes in the intersection, a target average speed, a target queuing length, a time occupancy of lanes in the intersection, and a space occupancy of lanes in the intersection.
Based on the same concept as the above method, fig. 9 shows an electronic device provided in an embodiment of this application, including a processor 901 and a memory 902. The memory 902 is used for storing computer-executable instructions, and the processor 901 executes the computer-executable instructions in the memory to perform, using its hardware resources, the operation steps of the method in any of the possible implementations described above.
Embodiments of the present application further provide a computer-readable medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of any one of the methods as described above.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
While specific embodiments of the present application have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and that the scope of the present application is defined by the appended claims. Various changes or modifications to these embodiments can be made by those skilled in the art without departing from the principle and spirit of this application, and these changes and modifications all fall into the scope of this application. While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (9)

1. A data processing apparatus, characterized by comprising: a processing unit and a receiving unit;
the receiving unit is configured to: acquiring intersection data, wherein the intersection data comprises first intersection data at a first moment and second intersection data at a second moment, and the second moment is behind the first moment; the intersection data also comprises data obtained by the detection of the first detection equipment in the first detection area and data obtained by the detection of the second detection equipment in the second detection area;
the processing unit is configured to: processing the first intersection data, and determining a first target set appearing in the intersection at a first moment; the first set of targets comprises a first target;
processing the second intersection data, and determining a second target set appearing in the intersection at a second moment;
when the second target set does not comprise the first target, determining real position information of the first target in the second intersection data according to the position information of the first target in the first intersection data;
processing the data of the first detection device to determine a second target in an overlapping area; the overlap region is an overlap region of the first detection region and the second detection region;
processing data of the second detection device to determine a third target appearing in the overlapping area;
determining the second target and the third target to be the same target when the second target and the third target satisfy the following formula:
s < min(L_0, L_1, W_0, W_1)

s represents the distance between the second target and the third target, L_0 the length of the second target, L_1 the length of the third target, W_0 the width of the second target, and W_1 the width of the third target.
2. The apparatus according to claim 1, wherein the processing unit, when determining, according to the location information of the first target in the first intersection data, the real location information of the first target in the second intersection data, is specifically configured to:
according to the position information of the first target in the first intersection data, determining the predicted position information of the first target in the second intersection data, and taking the predicted position information as the real position information;
the predicted position information satisfies the following formula:
p_i,x = p_i-1,x + v_i-1,x·Δt + (1/2)·a_i-1,x·Δt^2

p_i,y = p_i-1,y + v_i-1,y·Δt + (1/2)·a_i-1,y·Δt^2

p_i,x and p_i,y are the predicted x-axis and y-axis coordinates of the first target in the second intersection data, p_i-1,x and p_i-1,y are the x-axis and y-axis coordinates of the first target in the first intersection data, v_i-1,x and v_i-1,y are the x-axis and y-axis speeds of the first target in the first intersection data, a_i-1,x and a_i-1,y are the x-axis and y-axis accelerations of the first target in the first intersection data, and Δt is the time difference between the second moment and the first moment.
3. The apparatus of claim 2, wherein the processing unit is further configured to, prior to the predicted location information being the true location information:
when the position information of the first target in the first intersection data and the predicted position information of the first target in the second intersection data satisfy the following formula, determining that the predicted position information is the real position information of the first target:
|x_1 - x_2| ≤ α·L_min

|y_1 - y_2| ≤ β·W_sum

x_1 and y_1 are the x-axis and y-axis coordinates in the position information of the first target in the first intersection data; x_2 and y_2 are the x-axis and y-axis coordinates in the predicted position information of the first target in the second intersection data; L_min is the smaller of the length of the first target in the position information in the first intersection data and its length in the predicted position information in the second intersection data; W_sum is the sum of the width of the first target in the position information in the first intersection data and its width in the predicted position information in the second intersection data; α is the length coincidence-degree threshold and β is the width coincidence-degree threshold.
4. The apparatus of claim 1, wherein before the receiving unit obtains the intersection data, the receiving unit is further configured to:
obtaining a calibration result; the calibration result comprises a coordinate system adopted by the first detection device and a coordinate system adopted by the second detection device;
and acquiring the information of the first detection area and the information of the second detection area.
5. The apparatus of claim 1, wherein before the processing unit determines that the second target and the third target are the same target, the processing unit is further configured to:
carrying out longitude and latitude coordinate correction on a fourth target in the overlapping area according to the longitude deviation of the fourth target and the latitude deviation of the fourth target, wherein the fourth target is any one target in the overlapping area;
the latitude deviation of the fourth target satisfies the following formula:

Δlat = (ΔL·cos(Ang·π/180))/110540

ΔL is the coordinate deviation displacement of the fourth target, and Ang is the angle between the fourth target and true north;

the longitude deviation of the fourth target satisfies the following formula:

Δlng = (ΔL·sin(Ang·π/180))/(111320·cos(lat_A·π/180))

ΔL is the coordinate deviation displacement of the fourth target, and Ang is the angle between the fourth target and true north.
6. The apparatus of claim 1, wherein the processing unit is further configured to:
determining traffic detection data of the intersection according to the first target set and the second target set; the traffic detection data includes one or more of flow information of lanes in the intersection, a target average speed, a target queuing length, a time occupancy of lanes in the intersection, and a space occupancy of lanes in the intersection.
7. A method of data processing, comprising:
acquiring intersection data, wherein the intersection data comprises first intersection data at a first moment and second intersection data at a second moment, and the second moment is behind the first moment; the intersection data also comprises data detected by the first detection equipment in a first detection area and data detected by the second detection equipment in a second detection area;
processing the first intersection data, and determining a first target set appearing in the intersection at a first moment; the first set of targets comprises a first target;
processing the second intersection data, and determining a second target set appearing in the intersection at a second moment;
when the second target set does not comprise the first target, determining the predicted position information of the first target in the second intersection data according to the position information of the first target in the first intersection data;
processing the data of the first detection device to determine a second target in the overlapping area; the overlap region is an overlap region of the first detection region and the second detection region;
processing data of the second detection device to determine a third target appearing in the overlapping area;
determining the second target and the third target to be the same target when the second target and the third target satisfy the following formula:
s < min(L_0, L_1, W_0, W_1)

s represents the distance between the second target and the third target, L_0 the length of the second target, L_1 the length of the third target, W_0 the width of the second target, and W_1 the width of the third target.
8. An electronic device, comprising:
a memory for storing computer instructions;
a processor coupled to the memory for executing the computer instructions in the memory and when executing the computer instructions implementing the method as claimed in claim 7.
9. A computer-readable storage medium, comprising:
the computer readable storage medium stores computer instructions which, when executed on a computer, cause the computer to perform the method as claimed in claim 7.
CN202111529903.5A 2021-12-14 2021-12-14 Data processing device and method and electronic equipment Active CN114373297B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111529903.5A CN114373297B (en) 2021-12-14 2021-12-14 Data processing device and method and electronic equipment


Publications (2)

Publication Number Publication Date
CN114373297A CN114373297A (en) 2022-04-19
CN114373297B true CN114373297B (en) 2023-04-07






Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant