CN109559528B - Self-perception interactive traffic signal control device based on 3D laser radar - Google Patents

Self-perception interactive traffic signal control device based on 3D laser radar

Info

Publication number
CN109559528B
CN109559528B · CN201910046578.3A · CN201910046578A
Authority
CN
China
Prior art keywords
dist
cuboid
traffic
omega
traffic signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910046578.3A
Other languages
Chinese (zh)
Other versions
CN109559528A (en)
Inventor
林赐云
龚勃文
周翔宇
赵玉
王康
喻永力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN201910046578.3A priority Critical patent/CN109559528B/en
Publication of CN109559528A publication Critical patent/CN109559528A/en
Application granted granted Critical
Publication of CN109559528B publication Critical patent/CN109559528B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/07 Controlling traffic signals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention belongs to the technical field of traffic signal control, and in particular relates to a self-perception interactive traffic signal control device based on a 3D laser radar. The device mainly comprises a 3D laser radar detection unit, a traffic signal interaction coordination unit and a traffic signal drive control unit. Through the 3D laser radar sensor, the device autonomously perceives the pedestrian flow and vehicle flow in the entrance directions of an intersection, automatically extracts traffic flow parameters in real time with high precision and multiple resolutions, and performs refined traffic signal optimization, coordination and control, so as to improve the traffic efficiency of the intersection and the utilization rate of road traffic resources.

Description

Self-perception interactive traffic signal control device based on 3D laser radar
Technical Field
The invention belongs to the technical field of traffic signal control, and particularly relates to a self-perception interactive traffic signal control device based on a 3D laser radar.
Background
With the continuous development of China's economy and urbanization, cities are characterized by many people and little land, many vehicles and few roads, concentrated facilities, tight land use and frequent activities, so the land resources available for urban traffic are extremely limited. As the economy keeps growing rapidly, urban traffic demand continues to expand, and the contradiction between urban traffic supply and traffic demand becomes increasingly prominent. From the perspective of expanding traffic supply there are two approaches. The first is to build new urban roads; however, given the limited land resources of a city, large-scale new road construction is impractical, and the pace of road construction can never keep up with the growth of traffic demand, so it cannot fundamentally solve traffic congestion. The second is to improve the efficiency of existing urban roads, raising the utilization rate of urban road infrastructure and the road traffic capacity through intelligent transportation technology; this is currently the main technical means adopted at home and abroad to address urban traffic problems. Urban road intersections are the bottleneck of road traffic efficiency, and that efficiency can only be effectively improved through fine-grained traffic signal control based on real-time, microscopic and accurate traffic flow data.
In current traffic signal control systems, traffic information is mainly acquired through sensing equipment such as coil detectors, radar detectors, video detectors and GPS floating cars. A coil detector can acquire the flow, speed and occupancy of a given road cross-section, but it only reflects the traffic state near that cross-section and cannot reflect the traffic state over an area. Doppler radar detectors and video detectors are likewise roadside detection equipment: although the information they obtain differs slightly, it still only reflects the traffic flow state at a certain cross-section of the road. For refined traffic signal control, the traffic flow information provided by such roadside detectors is not detailed enough, its resolution and dimensionality cannot meet the requirements of refined traffic optimization control, and additional detection equipment has to be installed to supplement it. GPS floating car data can reflect traffic flow information over a certain spatial range, but it is limited by sample size, sampling period, network transmission and similar conditions, so the traffic flow information it provides shows obvious lag and accuracy fluctuation and is not suitable for refined traffic signal control.
As an active vision sensor, the 3D laser radar is insensitive to changes in external illumination and offers strong adaptability to complex environments, strong anti-interference capability, high sensitivity, high resolution, high precision, wide coverage and a large amount of information. It can provide real-time, microscopic, high-precision and high-resolution traffic flow information, and thus offers a new technical solution for refined traffic information perception and dynamic traffic flow identification and tracking in urban traffic signal control.
Disclosure of Invention
The invention provides a self-perception interactive traffic signal control device based on a 3D laser radar, which mainly comprises a 3D laser radar detection unit, a traffic signal interaction coordination unit and a traffic signal drive control unit, as shown in figure 1. Through the 3D laser radar sensor, the device autonomously perceives the pedestrian flow and the traffic flow in the entrance directions of the intersection, automatically extracts traffic flow parameters in real time with high precision and multiple resolutions, and performs refined traffic signal optimization, coordination and control, so as to improve the traffic efficiency of the intersection and the utilization rate of road traffic resources.
The technical scheme is that the self-sensing interactive traffic signal control device based on the 3D laser radar is installed on a cantilever-type signal lamp post at an urban intersection, as shown in figure 2. The 3D laser radar detection unit in the device is responsible for real-time, dynamic detection, identification, tracking and information extraction of the traffic flow in the entrance direction of the intersection and the pedestrian flow on the pedestrian crossing. The traffic signal interaction coordination unit optimizes the phase, phase sequence and green light duration of the traffic signals in the entrance direction, performs information interaction and signal coordination according to the real-time, high-precision and multi-resolution traffic flow information extracted by the 3D laser radar detection unit, and dynamically monitors the pedestrian flow of the pedestrian crossing and issues safety early warnings. The traffic signal drive control unit drives the traffic signal display module, the traffic signal prompt module and the traffic safety early warning module according to the traffic signal scheme formed by the traffic signal interaction coordination unit, and controls the turn-on time, turn-on duration, turn-on state, display pattern, display information and safety early warning of the traffic signal lamp group and the information display screen.
The invention provides a self-perception interactive traffic signal control device based on a 3D laser radar, which is characterized by mainly comprising the following components:
1) 3D laser radar detecting unit
The 3D laser radar detection unit consists of a 3D laser radar sensor and a multi-core laser point cloud micro-processing module, wherein the 3D laser radar sensor and the multi-core laser point cloud micro-processing module are connected by adopting an EMIF (External Memory Interface) bus and are subjected to data communication transmission; the 3D laser radar sensor is used for scanning and detecting traffic flow in the inlet direction of an intersection and pedestrian flow in a pedestrian crossing to form a laser point cloud data frame, and transmitting the laser point cloud data frame to the multi-core laser point cloud micro-processing module through an EMIF bus; the multi-core laser point cloud microprocessing module is responsible for carrying out data filtering and information extraction on a laser point cloud data frame transmitted by the 3D laser radar sensor and extracting traffic flow and people flow parameter information from the laser point cloud data frame.
2) Traffic signal interaction coordination unit
The traffic signal interaction coordination unit consists of a communication module, an FPGA main control module, a DSP safety monitoring module and a PCB backboard; the communication module, the FPGA main control module and the DSP safety monitoring module are connected by an EMIF bus for data communication transmission and are fixedly laid out on the PCB backboard. Meanwhile, the traffic signal interaction coordination unit is connected with the 3D laser radar detection unit and the traffic signal drive control unit through the EMIF bus for data communication transmission;
the communication module in the traffic signal interaction coordination unit is responsible for carrying out information interaction and sharing with self-perception interactive traffic signal control devices which are arranged on other cantilever type lamp poles in the same intersection; the FPGA main control module optimizes the phase of the traffic signal and the duration of the green light in the corresponding entrance direction according to the traffic flow parameter information provided by the laser radar detection unit, and coordinates the phase, the phase sequence and the duration of the green light with a self-sensing interactive traffic signal control device on other cantilever lamp poles at the intersection through the communication module; the DSP safety monitoring module carries out dynamic monitoring and safety early warning on the safety states of traffic flow and people flow in the inlet direction of the intersection, and carries out automatic intervention and quick adjustment on traffic signals under the emergency traffic incident.
3) Traffic signal drive control unit
The traffic signal driving control unit consists of a traffic signal display module, a traffic signal prompt module, a traffic safety early warning module, a signal lamp group, an LED micro display screen and a 6U VPX switch signal interface board, wherein the traffic signal display module, the traffic signal prompt module, the traffic safety early warning module, the signal lamp group and the LED micro display screen are connected with the 6U VPX switch signal interface board through serial ports and are in data communication transmission; the traffic signal display module is responsible for driving and controlling the turn-on time, the turn-on state and the turn-on patterns of the signal lamp group according to the traffic signal control scheme transmitted by the traffic signal interaction coordination unit; the traffic signal prompt module controls the information prompt content of the LED micro display screen according to the traffic signal control scheme transmitted by the traffic signal interaction coordination unit, and performs information prompt and dynamic guidance on the traffic state and the traffic time of people flow and traffic flow; and the traffic safety early warning module carries out safety early warning on the pedestrian flow and the traffic flow in the inlet direction of the intersection according to the traffic safety early warning instruction transmitted by the traffic signal interaction and coordination unit.
Drawings
FIG. 1: a self-perception interactive traffic signal control device functional structure diagram based on 3D laser radar;
FIG. 2: a self-perception interactive traffic signal control device based on a 3D laser radar is installed and laid out;
FIG. 3: the vehicle detection space schematic diagram is drawn by the multi-core laser point cloud microprocessing module;
FIG. 4: and the pedestrian detection space schematic diagram is drawn by the multi-core laser point cloud microprocessing module.
Detailed Description
The invention relates to a self-perception interactive traffic signal control device based on a 3D laser radar, which mainly comprises a 3D laser radar detection unit, a traffic signal interaction coordination unit and a traffic signal drive control unit, as shown in figures 1 and 2. Through the 3D laser radar sensor, the device autonomously perceives the traffic flow and pedestrian flow in the entrance directions of the intersection, automatically extracts traffic flow parameters in real time with high precision and multiple resolutions, and performs refined traffic signal optimization, coordination and control, so as to improve the traffic efficiency of the intersection and the utilization rate of road traffic resources.
The invention provides a self-perception interactive traffic signal control device based on a 3D laser radar, which comprises the following specific working processes:
1) 3D laser radar detecting unit
The 3D laser radar detection unit consists of a 3D laser radar sensor and a multi-core laser point cloud micro-processing module, which are connected by an External Memory Interface (EMIF) bus for data communication transmission. The 3D laser radar sensor scans and detects the traffic flow in the entrance direction of the intersection and the pedestrian flow on the pedestrian crossing to form laser point cloud data frames, and transmits them to the multi-core laser point cloud micro-processing module through the EMIF bus. The multi-core laser point cloud micro-processing module is responsible for data filtering and information extraction on the laser point cloud data frames transmitted by the 3D laser radar sensor, extracting traffic flow and pedestrian flow parameter information from them. The specific working steps are as follows:
Step1: the 3D laser radar sensor emits laser beams at a certain frequency while rotating the laser refraction mirror, and realizes 3D scanning of the road traffic environment in the detection direction by receiving the reflected laser beams, forming a 3D laser point cloud image. Each time the 3D laser radar sensor completes a 3D scan of the road traffic environment within the detection space, it forms a 3D laser point cloud data frame containing, for each laser point, the three-dimensional coordinate information (X, Y and Z coordinates), laser intensity, laser ID, laser horizontal rotation angle, laser vertical angle, laser distance and timestamp;
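For illustration, the per-point content of such a data frame can be sketched as a simple record; the field names and types below are assumptions made for readability, since the actual frame layout depends on the specific 3D laser radar sensor used.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LaserPoint:
    """One return in a 3D laser point cloud data frame (field names are illustrative)."""
    x: float            # X coordinate (m), crosswalk direction
    y: float            # Y coordinate (m), intersection approach direction
    z: float            # Z coordinate (m), vertical direction
    intensity: float    # laser reflection intensity
    laser_id: int       # ID of the emitting laser
    azimuth: float      # horizontal rotation angle (deg)
    elevation: float    # vertical angle (deg)
    distance: float     # measured range (m)
    timestamp: float    # acquisition time (s)

# A 3D laser point cloud data frame is then the list of points
# collected during one full scan of the detection space.
Frame = List[LaserPoint]
```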
Step2: the multi-core laser point cloud micro-processing module divides the scanning space of the 3D laser radar sensor into two subspaces: a vehicle detection space VEH_Ω in the entrance direction and a pedestrian detection space PED_Ω covering the pedestrian crossing in the detection direction, as shown in figures 3 and 4; wherein: VEH_Ω is a VL × VW × H cuboid and PED_Ω is a PL × PW × H cuboid; VL is the distance from the farthest position detected by the 3D laser radar sensor in the entrance direction to the stop line of the entrance; VW is the width of the entrance roadway in the detection direction of the 3D laser radar sensor; H is the height of the 3D laser radar sensor above the ground; PL is the length of the pedestrian crossing in the detection direction of the 3D laser radar sensor; PW is the width of the pedestrian crossing in the detection direction of the 3D laser radar sensor;
Step3: the multi-core laser point cloud micro-processing module extracts the laser point clouds within the VEH_Ω and PED_Ω space ranges from the laser point cloud data frame, and constructs an MV × NV × KV three-dimensional space matrix V(VEH_Ω), v_ijk ∈ V(VEH_Ω), and an MP × NP × KP three-dimensional space matrix P(PED_Ω), p_ijk ∈ P(PED_Ω), respectively;
In VEH_Ω: v_ijk is the laser intensity of the laser point in row i of the X coordinate axis, column j of the Y coordinate axis and layer k of the Z coordinate axis of VEH_Ω; i = 1, 2, …, MV; j = 1, 2, …, NV; k = 1, 2, …, KV; MV is the total number of rows of the laser point cloud along the X coordinate axis in VEH_Ω; NV is the total number of columns along the Y coordinate axis in VEH_Ω; KV is the total number of layers along the Z coordinate axis in VEH_Ω;
In PED_Ω: p_ijk is the laser intensity of the laser point in row i of the X coordinate axis, column j of the Y coordinate axis and layer k of the Z coordinate axis of PED_Ω; i = 1, 2, …, MP; j = 1, 2, …, NP; k = 1, 2, …, KP; MP is the total number of rows of the laser point cloud along the X coordinate axis in PED_Ω; NP is the total number of columns along the Y coordinate axis in PED_Ω; KP is the total number of layers along the Z coordinate axis in PED_Ω;
In VEH_Ω and PED_Ω, the intersection entrance direction scanned by the 3D laser radar sensor is the Y coordinate axis direction; the crosswalk direction at the intersection entrance scanned by the 3D laser radar sensor is the X coordinate axis direction; the direction perpendicular to the ground at the intersection is the Z coordinate axis direction;
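As an illustration of Step3, the following sketch shows one plausible way to map the laser points of a frame onto the MV × NV × KV intensity matrix V(VEH_Ω); the uniform grid spacing and the cell-aggregation rule (keeping the strongest return per cell) are assumptions, not a prescribed implementation.

```python
import numpy as np

def build_space_matrix(points, x_max, y_max, z_max, mv, nv, kv):
    """Map laser points falling inside a detection cuboid of size x_max x y_max x z_max
    onto an mv x nv x kv matrix of laser intensities.  Cell size and the aggregation
    rule are illustrative assumptions; points outside the cuboid are ignored."""
    v = np.zeros((mv, nv, kv))
    for p in points:
        if 0.0 <= p.x < x_max and 0.0 <= p.y < y_max and 0.0 <= p.z < z_max:
            i = int(p.x / x_max * mv)   # row along the X (crosswalk) axis
            j = int(p.y / y_max * nv)   # column along the Y (approach) axis
            k = int(p.z / z_max * kv)   # layer along the Z (vertical) axis
            v[i, j, k] = max(v[i, j, k], p.intensity)  # keep strongest return per cell
    return v
```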
step4: background filtering is carried out on the laser point cloud in the VEH _ omega space range:
step4.1: defining A as a cuboid containing UV VV WV laser spots, A ∈ VEH _ Ω,
Figure BDA0001949392220000051
are respectively rectangular parallelepiped VEH _ omegaThe upper layer and the lower layer are cuboids A in the ith row and the jth column;
Figure BDA0001949392220000052
the cuboids A are respectively the jth layer of the left wall and the jth layer of the right wall in the cuboid VEH _ omega; i =1,2, \8230;, MV'; j =1,2, \8230;, NV'; k =1,2, \ 8230;, KV';
Figure BDA0001949392220000053
let δ (a) be the laser point cloud density of the cuboid a, calculate the laser point cloud density fluctuation rate of the cuboid a:
Figure BDA0001949392220000054
Figure BDA0001949392220000055
Figure BDA0001949392220000056
Figure BDA0001949392220000057
wherein:
Figure BDA0001949392220000058
the laser point cloud average densities of the middle upper layer, the lower layer, the left wall and the right wall of the cuboid VEH _ omega with the A as the section are respectively obtained;
Figure BDA0001949392220000061
laser point cloud density fluctuation rates of the cuboid A on the ith row and the jth column of the middle upper layer and the lower layer of the cuboid VEH _ omega are respectively set;
Figure BDA0001949392220000062
excitation of cuboid A of jth layer on left wall and right wall of cuboid VEH _ omega respectivelyThe fluctuation rate of the density of the light point cloud;
step4.2: judging whether the cuboid A in the upper layer, the lower layer, the left wall and the right wall of the cuboid VEH _ omega is a background laser point cloud image or not, and filtering the background:
if it is
Figure BDA0001949392220000063
Judging that the laser point cloud contained in the cuboid A in the ith row and the jth column of the middle upper layer of the cuboid VEH _ omega is background laser point cloud;
if it is
Figure BDA0001949392220000064
Judging that the laser point cloud contained in the cuboid A in the ith row and the jth column of the lower layer in the cuboid VEH _ omega is background laser point cloud;
if it is
Figure BDA0001949392220000065
Then the laser point cloud contained in the jth row and kth layer of the cuboid A on the left wall of the cuboid VEH _ omega is background laser point cloud;
if it is
Figure BDA0001949392220000066
Then the laser point cloud contained in the jth row and kth layer of the cuboid A on the right wall of the cuboid VEH _ omega is background laser point cloud;
wherein: tau is UP (T)、τ DW (T)、τ LF (T)、τ RT (T) respectively judging the critical threshold values of the point cloud density fluctuation of the cuboid A background laser in the middle upper layer, the lower layer, the left wall and the right wall of the cuboid VEH _ omega in the Tth judging period;
step4.3: repeating Step4.1 to Step4.2, carrying out background judgment on all cuboids A in the upper layer, the lower layer, the left wall and the right wall of the cuboid VEH _ omega, and eliminating all background laser point clouds to form a new three-dimensional space matrix V '(VEH _ omega), V' ijk ∈V′(VEH_Ω);
Step5: background filtering the laser point cloud in the PED _ omega space range:
step5.1: definition B is one can contain UP × VPWP rectangular parallelepiped of laser spot, B e PED omega,
Figure BDA0001949392220000067
the cuboids B are respectively an upper layer cuboid PED _ omega and a lower layer cuboid B in the ith row and the jth column;
Figure BDA0001949392220000068
the cuboids B are respectively the jth layer of the left wall and the jth layer of the right wall in the cuboid PED _ omega; i =1,2, \ 8230;, MP'; j =1,2, \8230;, NP'; k =1,2, \ 8230;, KP';
Figure BDA0001949392220000069
let δ (B) be the laser point cloud density of the cuboid B, calculate the laser point cloud density fluctuation rate of the cuboid B:
Figure BDA0001949392220000071
Figure BDA0001949392220000072
Figure BDA0001949392220000073
Figure BDA0001949392220000074
wherein:
Figure BDA0001949392220000075
the laser point cloud average densities of the middle upper layer, the lower layer, the left wall and the right wall of the cuboid PED _ omega with the section B as the section are respectively obtained;
Figure BDA0001949392220000076
laser point cloud density fluctuation rates of cuboids B of the ith row and the jth column of the middle upper layer and the lower layer of the cuboid PED _ omega are respectively set;
Figure BDA0001949392220000077
laser point cloud density fluctuation rates of the cuboid B on the jth column k layer of the left wall and the right wall in the cuboid PED _ omega are respectively set;
step5.2: judging whether the cuboid B in the upper layer, the lower layer, the left wall and the right wall in the cuboid PED _ omega is a background image or not, and filtering the background:
if it is
Figure BDA0001949392220000078
Judging that the laser point cloud contained in the cuboid B in the ith row and the jth column of the upper layer in the cuboid PED _ omega is background laser point cloud;
if it is
Figure BDA0001949392220000079
Judging that the laser point cloud contained in the cuboid B in the ith row and the jth column of the lower layer in the cuboid PED _ omega is background laser point cloud;
if it is
Figure BDA00019493922200000710
Then the laser point cloud contained in the cuboid B of the jth column and kth layer on the left wall of the cuboid PED _ omega is background laser point cloud;
if it is
Figure BDA00019493922200000711
Then the laser point cloud contained in the cuboid B of the jth column and kth layer on the right wall of the cuboid PED _ omega is background laser point cloud;
wherein: lambda [ alpha ] UP (T)、λ DW (T)、λ LF (T)、λ RT (T) respectively determining the critical threshold values of the point cloud density fluctuation of the cuboid B background laser in the upper layer, the lower layer, the left wall and the right wall of the cuboid PED _ omega in the Tth determination period;
step5.3: repeating Step5.1 to Step5.2, performing background judgment on all cuboids B in the upper layer, the lower layer, the left wall and the right wall of the cuboid PED _ omega, and eliminating all background laser point clouds to form a new three-dimensional space matrix P '(PEH _ omega), P' ijk ∈P′(PED_Ω);
Step6: dividing an X-axis plane and a Y-axis plane of VEH _ omega into NC multiplied by NL grids, wherein each grid is internally provided with a cuboid E of WL multiplied by LV multiplied by H; wherein: WL is the lane width of the 3D laser radar sensor in the direction of the inlet; LV is the average length of the vehicle; h is the height of the 3D laser radar sensor from the ground; e ∈ VEH _ Ω, E ij Is a cuboid with VEH _ omega as the ith row and the jth lane; wherein: i =1,2, \ 8230;, NC; j =1,2, \8230;, NL;
Figure BDA0001949392220000081
step6.1: calculating the laser point cloud density delta (E) of E of the cuboid in VEH _ omega
Figure BDA0001949392220000082
Wherein:
Figure BDA00019493922200000816
the sum of the point cloud densities of all the laser points in the E;
Figure BDA0001949392220000083
is the volume of cuboid E;
step6.2: judging the existence of the vehicle in the cuboid E:
(1) If delta (E) is equal to or greater than epsilon V Judging that the vehicle exists in the cuboid E;
(2) If e is bV ≤δ(E)<∈ V The cuboid E is divided into a front part and a rear part which have the same volume
Figure BDA0001949392220000084
And
Figure BDA0001949392220000085
if it is
Figure BDA0001949392220000086
And is
Figure BDA0001949392220000087
The vehicle is present in the front of the cuboid E and will
Figure BDA0001949392220000088
And
Figure BDA0001949392220000089
merging and marking as a new cuboid E'; if it is
Figure BDA00019493922200000810
And is
Figure BDA00019493922200000811
The vehicle is present at the rear of the rectangular parallelepiped E, will
Figure BDA00019493922200000812
And
Figure BDA00019493922200000813
merging and marking as a new cuboid E';
wherein:
Figure BDA00019493922200000814
the rear half part of the front cuboid E adjacent to the cuboid E;
Figure BDA00019493922200000815
the front half part of the next cuboid E adjacent to the cuboid E; e is the same as V Critical density for the presence of cuboid E vehicles; e is the same as bV Is the critical density occupied by the cuboid E vehicle;
step6.3: the minimum number of neighbors of a given vehicle is Min _ VehPTs, and the radius of the neighbors is R _ Veh; traversing all laser points Li _ Pt in the cuboid E or E' to find out the number NumPT (Li _ Pt) of the laser points in the neighborhood radius R _ Veh of each laser point Li _ Pt; if NumPt (Li _ Pt) is greater than or equal to Min _ VehPTs, marking the laser point Li _ Pt as the vehicle mass center VehCore _ Pt in the cuboid E or E';
step6.4: repeating Step6.1 to Step6.3, and acquiring the number VehNum of vehicles and the queuing length VehQue of each entrance way in the current laser point cloud data frame VEH _ omega according to the position information and the number of the vehicle center of mass points;
step7: dividing an X-axis plane and a Y-axis plane of PED _ omega into NP multiplied by NT grids, wherein each grid is internally provided with a cuboid S of WP multiplied by PT multiplied by H; wherein: WP is the average longitudinal space distance required by the pedestrian to walk; PT is the transverse space distance of the pedestrian; h is the height of the laser radar sensor from the inside; s belongs to PED omega, S ij PED _ omega is a cuboid S in the ith row and the jth column in the pedestrian crossing; wherein: i =1,2, \8230;, NP; j =1,2, \ 8230;, NT;
Figure BDA0001949392220000091
step7.1: calculating the laser point cloud density delta (S) of the rectangular S in PED _ omega
Figure BDA0001949392220000092
Wherein:
Figure BDA00019493922200000926
the sum of the point cloud densities of all the laser points in the S;
Figure BDA0001949392220000093
is the volume of the cuboid S;
step7.2: judging the existence of the pedestrian in the cuboid S:
(1) If delta (S) ≧ epsilon p Judging that the pedestrian exists in the cuboid S;
(2) If e is bp ≤δ(S)<∈ p The cuboid S is split into a front part and a rear part with equal volume
Figure BDA0001949392220000094
And
Figure BDA0001949392220000095
or left and right parts
Figure BDA0001949392220000096
And
Figure BDA0001949392220000097
if it is
Figure BDA0001949392220000098
And is
Figure BDA0001949392220000099
The pedestrian is present in the front of the rectangular parallelepiped S and will
Figure BDA00019493922200000910
And
Figure BDA00019493922200000911
merging and marking as a new cuboid S'; if it is
Figure BDA00019493922200000912
And is
Figure BDA00019493922200000913
The vehicle is present at the rear of the rectangular parallelepiped S, will
Figure BDA00019493922200000914
And
Figure BDA00019493922200000915
merging and marking as a new cuboid S'; if it is
Figure BDA00019493922200000916
And is
Figure BDA00019493922200000917
The pedestrian is present on the right side of the rectangular parallelepiped S and will
Figure BDA00019493922200000918
And
Figure BDA00019493922200000919
merging and marking as a new cuboid S'; if it is
Figure BDA00019493922200000920
And is
Figure BDA00019493922200000921
The pedestrian is present on the left side of the rectangular parallelepiped S and will be
Figure BDA00019493922200000922
And
Figure BDA00019493922200000923
combined and marked as new cuboid S'
Wherein:
Figure BDA00019493922200000924
the cuboid S is the rear half part of the adjacent front cuboid S;
Figure BDA00019493922200000925
the front half part of the next cuboid S adjacent to the cuboid S; e is the same as p Is the critical density of a pedestrian existing in a cuboid S; e is b p Is the critical density occupied by the rectangular pedestrian;
step7.3: the minimum number of neighbors of a given pedestrian is Min _ PedPts, and the neighborhood radius is R _ Ped; traversing all laser points Li _ Pt in the cuboid S or S' to find out the number NumPt (Li _ Pt) of laser points in the neighborhood radius R _ Ped of each laser point Li _ Pt; if NumPt (Li _ Pt) is more than or equal to Min _ PedPts, marking the laser point Li _ Pt as a pedestrian centroid PedCore _ Pt in the cuboid S or S';
step7.4: repeating Step7.1 to Step7.3, and acquiring the pedestrian number VPedNum and the pedestrian position information PedLoInfo of the pedestrian crossing crosswalk and the pedestrian crossing waiting area in the current laser point cloud data frame PED _ omega according to the position information and the number of the pedestrian centroid points;
step8: repeatedly carrying out the processing from Step2 to Step7 on data of each frame transmitted by the 3D laser radar sensor, tracking the positions of the vehicles and the pedestrians in each frame, acquiring the running tracks VedTrace and PedTrace of the vehicles and the pedestrians, and acquiring running speed information VehSpeedInfo and PedSpeedInfo of the vehicles and the pedestrians through the position change of the vehicles and the pedestrians;
step9: the multi-core laser point cloud microprocessing module transmits the number, position, speed and running track information of vehicles and pedestrians in the detection area to the traffic signal interaction coordination unit through an EMIF bus.
2) Traffic signal interaction coordination unit
The traffic signal interaction coordination unit consists of a communication module, an FPGA main control module, a DSP safety monitoring module and a PCB backboard; the communication module, the FPGA main control module and the DSP safety monitoring module communicate data over an EMIF bus and are laid out and fixed on the PCB backboard. The communication module is responsible for information interaction and sharing with the self-perception interactive traffic signal control devices installed on the other cantilever lamp posts of the same intersection. The FPGA main control module optimizes the traffic signal phase and green light duration of the corresponding entrance direction according to the traffic flow parameter information provided by the 3D laser radar detection unit, and coordinates the phase, phase sequence and green light duration with the self-perception interactive traffic signal control devices on the other cantilever lamp posts of the intersection through the communication module. The DSP safety monitoring module dynamically monitors the safety states of the traffic flow and the pedestrian flow on the pedestrian crossing, issues safety early warnings, and automatically intervenes in and quickly adjusts the traffic signals under emergency traffic incidents. The specific working steps are as follows:
step1: the FPGA main control module predicts the queuing traffic flow of each entrance road in the detection direction at a certain time interval period T _ GAP:
[prediction formula shown as an equation image in the original]
wherein: Vol(T+T_GAP), Vol(T) and Vol(T-T_GAP) are the flows of the intersection entrance lane in the next prediction interval T_GAP, the current prediction period and the previous prediction interval T_GAP at time T, respectively; Vel(T) and Vel(T-T_GAP) are the average vehicle speeds on the intersection entrance lane in the current prediction period and the previous prediction interval T_GAP at time T, respectively; α and β are correction parameters;
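The predictor itself is shown only as an image in the original; the sketch below assumes one simple form consistent with the variables listed above, a flow trend extrapolation corrected by the change in average speed, and is not the patented expression.

```python
def predict_volume(vol_t, vol_prev, vel_t, vel_prev, alpha, beta):
    """Predict the approach-lane volume for the next interval T_GAP.

    Assumed form (the source gives the predictor only as an image):
        Vol(T+T_GAP) = Vol(T) + alpha*(Vol(T) - Vol(T-T_GAP)) - beta*(Vel(T) - Vel(T-T_GAP))
    alpha and beta are the correction parameters named in the text.
    """
    return vol_t + alpha * (vol_t - vol_prev) - beta * (vel_t - vel_prev)
```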
step2: the FPGA main control module estimates the time required for dissipation of queued vehicles of each entrance lane:
[dissipation-time formula shown as an equation image in the original]
wherein: disT is the dissipation time of the queued vehicles of the lane; μ is the start-up delay of the queued vehicles; γ is the queue dissipation time correction coefficient; v̄_dis is the average dissipation speed of the queued vehicles;
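The dissipation-time expression is likewise shown only as an image; a natural reading consistent with the listed variables is a start-up delay plus the corrected travel time of the queue, as sketched below (an assumption, not the patented formula).

```python
def queue_dissipation_time(veh_que_m, mu, gamma, avg_dissipation_speed):
    """Estimate the time (s) needed for the queued vehicles of one approach lane to
    dissipate, assuming:  disT = mu + gamma * VehQue / v_dissipation
    where veh_que_m is the queue length in metres."""
    return mu + gamma * veh_que_m / avg_dissipation_speed
```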
step3: the FPGA main control module determines the dissipation time of queued vehicles required by each passing direction of an entrance lane:
disT_ST = max(disT_ST_1, disT_ST_2, …, disT_ST_n)
disT_LT = max(disT_LT_1, disT_LT_2, …, disT_LT_m)
wherein: disT_ST and disT_LT are the queued-vehicle dissipation times required by the straight-through and left-turn traffic directions of the entrance, respectively; disT_ST_1, disT_ST_2, …, disT_ST_n are the queued-vehicle dissipation times required by the 1st, 2nd, …, n-th straight-through lanes; disT_LT_1, disT_LT_2, …, disT_LT_m are the queued-vehicle dissipation times required by the 1st, 2nd, …, m-th left-turn lanes;
step4: the FPGA main control module calculates the time difference between the dissipation time disT _ ST of the vehicles queued in the straight passing direction of the entrance lane and the dissipation time disT _ LT of the vehicles queued in the left-turning passing direction:
ΔdisT=|disT_ST-disT_LT|
(1) If ΔdisT ≤ ξ, the FPGA main control module merges the straight-through traffic signal phase and the left-turn traffic signal phase of the entrance into the same signal phase stage, whose green time is max(disT_ST, disT_LT);
(2) If ΔdisT > ξ, the FPGA main control module obtains, through the communication module, the queued-vehicle dissipation times disT_ST' and disT_LT' of the straight-through and left-turn traffic directions of the opposing entrance:
(a) if |disT_ST - disT_ST'| ≤ ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module and merges the straight-through traffic signal phase of the entrance and that of the opposing entrance into the same signal phase stage, whose green time is max(disT_ST, disT_ST');
(b) if |disT_LT - disT_LT'| ≤ ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module and merges the left-turn traffic signal phase of the entrance and that of the opposing entrance into the same signal phase stage, whose green time is max(disT_LT, disT_LT');
(c) if |disT_ST - disT_ST'| ≤ ξ and |disT_LT - disT_LT'| > ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module:
the straight-through traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage, whose green time is max(disT_ST, disT_ST');
if disT_LT > disT_LT' and maxVOL_ST(T+T_GAP) > maxVOL_LT'(T+T_GAP), the left-turn traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time min(disT_LT, disT_LT'); an overlap phase stage is added for the entrance, allowing the straight-through and left-turn traffic directions of this entrance to be released simultaneously, with green time |disT_LT - disT_LT'|;
if disT_LT > disT_LT' and maxVOL_ST(T+T_GAP) < maxVOL_LT'(T+T_GAP), the left-turn traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time disT_LT;
if disT_LT < disT_LT' and maxVOL_LT(T+T_GAP) > maxVOL_ST'(T+T_GAP), the left-turn traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time disT_LT';
if disT_LT < disT_LT' and maxVOL_LT(T+T_GAP) < maxVOL_ST'(T+T_GAP), the left-turn traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time min(disT_LT, disT_LT'); an overlap phase stage is added for the opposing entrance, allowing the straight-through and left-turn traffic directions of the opposing entrance to be released simultaneously, with green time |disT_LT - disT_LT'|;
(d) if |disT_ST - disT_ST'| > ξ and |disT_LT - disT_LT'| ≤ ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module:
the left-turn traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage, whose green time is max(disT_LT, disT_LT');
if disT_ST > disT_ST' and maxVOL_LT(T+T_GAP) > maxVOL_ST'(T+T_GAP), the straight-through traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time min(disT_ST, disT_ST'); an overlap phase stage is added for the entrance, allowing the straight-through and left-turn traffic directions of this entrance to be released simultaneously, with green time |disT_ST - disT_ST'|;
if disT_ST > disT_ST' and maxVOL_LT(T+T_GAP) < maxVOL_ST'(T+T_GAP), the straight-through traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time disT_ST;
if disT_ST < disT_ST' and maxVOL_ST(T+T_GAP) > maxVOL_LT'(T+T_GAP), the straight-through traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time disT_ST';
if disT_ST < disT_ST' and maxVOL_ST(T+T_GAP) < maxVOL_LT'(T+T_GAP), the straight-through traffic signal phase of the entrance and that of the opposing entrance are merged into the same signal phase stage with green time min(disT_ST, disT_ST'); an overlap phase stage is added for the opposing entrance, allowing the straight-through and left-turn traffic directions of the opposing entrance to be released simultaneously, with green time |disT_ST - disT_ST'|;
wherein: ξ is the critical threshold for splitting phase stages; maxVOL_ST(T+T_GAP) and maxVOL_ST'(T+T_GAP) are, for the next prediction interval T_GAP at time T, the maximum straight-through lane flows of the straight-through traffic direction of the entrance and of the opposing entrance, respectively; maxVOL_LT(T+T_GAP) and maxVOL_LT'(T+T_GAP) are, for the next prediction interval T_GAP at time T, the maximum left-turn lane flows of the left-turn traffic direction of the entrance and of the opposing entrance, respectively;
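The simplest branch of Step4, merging the straight-through and left-turn phases of one entrance when their queue dissipation times are close, can be sketched as follows; the opposing-approach coordination cases above are handled by the FPGA main control modules and are not reproduced here.

```python
def merge_phases(dis_t_st, dis_t_lt, xi):
    """If the straight-through and left-turn queues of one approach dissipate in
    nearly the same time (|disT_ST - disT_LT| <= xi), both movements are combined
    into one signal phase stage whose green time covers the slower movement;
    otherwise the coordination with the opposing approach is required."""
    if abs(dis_t_st - dis_t_lt) <= xi:
        return {"stage": "ST+LT combined", "green": max(dis_t_st, dis_t_lt)}
    return None   # fall through to the opposing-approach coordination cases
```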
Step5: through the communication module, the FPGA main control module coordinates with the other FPGA main control modules of the intersection and, based on the green light duration of each traffic phase in each entrance direction calculated by each FPGA main control module and the Webster intersection delay formula, determines the execution order of the signal phases, i.e. the phase sequence, according to the principle of minimum intersection delay;
step6: the DSP safety monitoring module monitors the running conditions of vehicles and pedestrians in real time:
Step6.1: when the vehicle-signal green light is about to end, the DSP safety monitoring module monitors in real time the vehicles in the entrance direction that are about to enter the intersection, and calculates whether the current vehicle can stop safely or pass through the intersection safely [the expressions for X_0 and X_C are shown as equation images in the original];
wherein: X_0 is the minimum safe stopping distance within which the vehicle can decelerate and stop; X_C is the maximum safe distance within which the vehicle can accelerate and pass through; V_0 is the running speed of the vehicle about to enter the intersection; σ is the driver's average brake reaction time; α⁻ is the average braking deceleration of the vehicle; α⁺ is the average acceleration of the vehicle; GreenT is the remaining passage time; W is the width of the intersection; L is the average vehicle length;
Step6.2: if X_0 > X_C, the vehicle can neither pass safely nor stop safely; the DSP safety monitoring module coordinates with the FPGA main control module to extend the green time of the vehicle's running direction so that the vehicle can pass safely, and sends a DilemmaType1 instruction data packet to the traffic signal drive control unit;
Step6.3: if X_0 < X_C, the DSP safety monitoring module coordinates with the traffic signal drive control unit to flash the signal lamp of the vehicle's running direction, prompting the driver to decelerate and stop or to accelerate and pass through, and sends a DilemmaType2 instruction data packet to the traffic signal drive control unit;
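The expressions for X_0 and X_C are given only as images in the original; the sketch below uses the classical dilemma-zone formulas that match the variables defined in Step6.1, and should be read as an assumption rather than the patented expressions.

```python
def dilemma_zone(v0, sigma, a_brake, a_acc, green_t, width, avg_veh_len):
    """Assumed dilemma-zone test consistent with the variables of Step6.1:
        X_0 = v0*sigma + v0**2 / (2*a_brake)                    # minimum safe stopping distance
        X_C = v0*green_t + 0.5*a_acc*green_t**2 - (width + avg_veh_len)  # maximum safe clearing distance
    A vehicle closer than X_0 cannot stop safely; one farther than X_C cannot clear
    the intersection before the green ends, so X_0 > X_C marks a dilemma zone."""
    x0 = v0 * sigma + v0 ** 2 / (2.0 * a_brake)
    xc = v0 * green_t + 0.5 * a_acc * green_t ** 2 - (width + avg_veh_len)
    return x0, xc, x0 > xc
```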
Step6.4: the DSP safety monitoring module monitors the running trajectories of the pedestrians and vehicles in the entrance direction in real time; once a conflict between a pedestrian trajectory and a vehicle trajectory is detected, it sends a DangerInfo safety warning instruction data packet ("Danger, please pay attention to safety!") to the traffic signal drive control unit;
Step6.5: when the vehicle-signal green light is about to end, the DSP safety monitoring module monitors in real time whether there are pedestrians on the pedestrian crossing; once a pedestrian is detected at the roadside boundary of the pedestrian crossing, it sends a PedWarningType1 safety warning instruction data packet ("Please wait patiently and return to the waiting area!");
Step6.6: during the vehicle-signal red light and pedestrian-signal green light, the DSP safety monitoring module monitors in real time whether pedestrians are walking on the pedestrian crossing; once a pedestrian trajectory is detected outside the pedestrian crossing, it sends a PedWarningType2 safety warning instruction data packet ("Please cross in a civilized manner and walk on the zebra crossing!");
step7: the FPGA main control module and the DSP safety detection module transmit a traffic signal control scheme comprising green light time of each signal phase, an execution sequence of the signal phases and a traffic safety early warning instruction to a traffic signal driving control unit through an EMIF bus;
3) Traffic signal drive control unit
The traffic signal driving control unit consists of a traffic signal display module, a traffic signal prompt module, a traffic safety early warning module, a signal lamp group, an LED micro display screen and a 6U VPX switch signal interface board, wherein the traffic signal display module, the traffic signal prompt module, the traffic safety early warning module, the signal lamp group and the LED micro display screen are connected with the 6U VPX switch signal interface board through serial ports and are in data communication transmission; the traffic signal display module is responsible for driving and controlling the turn-on time, the turn-on state and the turn-on patterns of the signal lamp group according to the traffic signal control scheme transmitted by the traffic signal coordination unit; the traffic signal prompt module controls the information prompt content of the LED micro display screen according to the traffic signal control scheme transmitted by the traffic signal interaction coordination unit, and carries out information prompt and dynamic guidance on the traffic state and the traffic time of people and traffic; and the traffic safety early warning module carries out safety early warning on the pedestrian flow and the traffic flow at the intersection according to the traffic safety early warning instruction transmitted by the traffic signal interaction and coordination unit.
Step1: the traffic signal display module receives the traffic signal control scheme transmitted by the traffic signal interaction coordination unit, converts the digital information of the traffic signal control scheme into analog signals, and drives and controls the lighting time, lighting color, lighting sequence, transition sequence and transition form of different lamp colors and the graph displayed by the lamp colors;
step2: the traffic signal prompt module receives a traffic signal control scheme transmitted by the traffic signal interaction coordination unit, drives and controls the LED micro-display screen, and prompts and guides the traffic flow, the traffic state, the traffic time, the remaining time and the advancing direction;
step3: the traffic safety early warning module receives a traffic safety early warning instruction transmitted by the traffic signal interaction coordination unit, and carries out safety early warning according to different signal instructions:
step3.1: when the traffic safety early warning module receives a DilemmaType1 instruction data packet, taking over the control of the traffic signal display module on the signal lamp group, prolonging the green lamp duration in the appointed passing direction according to the DilemmaType1 instruction content, releasing the control on the signal lamp group when the prolonging time is over, and returning the control right of the signal lamp group to the traffic signal display module;
step3.2: when the traffic safety early warning module receives a DilemmaType2 instruction data packet, the traffic safety early warning module takes over the control of the traffic signal display module on the signal lamp group, carries out flicker control on the signal lamp in the appointed passing direction according to the DilemmaType2 instruction content, releases the control on the signal lamp group after the flicker control is finished, and returns the control right of the signal lamp group to the traffic signal display module;
Step3.3: when the traffic safety early warning module receives a DangerInfo instruction data packet, it takes over control of the signal lamp group from the traffic signal display module, starts a red-light flashing warning in the specified traffic direction according to the DangerInfo instruction content, and announces "Danger, please pay attention to safety!" through the voice output and the LED micro display screen;
Step3.4: when the traffic safety early warning module receives a PedWarningType1 instruction data packet, it warns the pedestrian "Please wait patiently and return to the waiting area!" through the voice output and the LED micro display screen;
Step3.5: when the traffic safety early warning module receives a PedWarningType2 instruction data packet, it warns the pedestrian "Please cross in a civilized manner and walk on the zebra crossing!" through the voice output and the LED micro display screen.
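The handling of the instruction data packets in Step3.1 to Step3.5 can be summarized as a simple dispatch table; the action names below are illustrative, and the display texts follow the description.

```python
def warning_action(packet_type):
    """Map an instruction packet issued by the traffic signal interaction coordination
    unit onto the safety-warning action of the drive control unit (action names are
    illustrative; display texts paraphrase the description)."""
    actions = {
        "DilemmaType1":    ("extend_green", None),
        "DilemmaType2":    ("flash_signal", None),
        "DangerInfo":      ("flash_red_and_announce", "Danger, please pay attention to safety!"),
        "PedWarningType1": ("announce", "Please wait patiently and return to the waiting area!"),
        "PedWarningType2": ("announce", "Please cross in a civilized manner and walk on the zebra crossing!"),
    }
    return actions.get(packet_type)
```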

Claims (3)

1. A self-perception interactive traffic signal control device based on a 3D laser radar, characterized in that:
the 3D laser radar detection unit consists of a laser radar sensor and a multi-core laser point cloud micro-processing module, and the 3D laser radar sensor and the multi-core laser point cloud micro-processing module are connected and transmitted in data communication by adopting an EMIF bus; the laser radar sensor is used for scanning and detecting traffic flow in the inlet direction of the intersection and pedestrian flow of a pedestrian crossing to form a laser point cloud data frame, and transmitting the laser point cloud data frame to the multi-core laser point cloud micro-processing module through an EMIF bus; the multi-core laser point cloud microprocessing module is responsible for carrying out data filtering and information extraction on a laser point cloud data frame transmitted by the 3D laser radar sensor and extracting traffic flow and people flow parameter information from the laser point cloud data frame; the specific working steps are as follows:
step1: the method comprises the following steps that a 3D laser radar sensor emits laser beams and rotates a laser refraction mirror surface at a certain frequency, 3D scanning of the road traffic environment in the detection direction is achieved by receiving laser reflection beams, a 3D laser point cloud image is formed in a laser point cloud mode, when the 3D laser radar sensor completes 3D scanning of the road traffic environment in the detection space range every time, a 3D laser point cloud data frame is formed, and the data frame comprises three-dimensional coordinate information, laser intensity, laser ID, a laser horizontal rotation direction angle, a laser vertical direction included angle, a laser distance and a timestamp of the laser point cloud;
step2: the multi-core laser point cloud microprocessing module divides a scanning space of the 3D laser radar sensor into two subspaces, wherein one subspace is a vehicle detection space VEH _ omega in the direction of an entrance, and the other subspace is a pedestrian detection space PED _ omega in a pedestrian crosswalk in the direction of detection; wherein: VEH _ omega is a cuboid with VL multiplied by VW multiplied by H, PED _ omega is a cuboid with PL multiplied by PW multiplied by H, and VL is the distance between the farthest position of the 3D laser radar sensor in the inlet direction and the stop line in the inlet direction; VW is the width of an entrance way in the detection direction of the 3D laser radar sensor; h is the height of the 3D laser radar sensor from the ground; PL is the length of a crosswalk in the direction detected by the 3D laser radar sensor; PW is the width of a pedestrian crossing in the direction detected by the 3D laser radar sensor;
Step3: the multi-core laser point cloud micro-processing module extracts the laser point clouds within the VEH_Ω and PED_Ω space ranges from the laser point cloud data frame, and constructs an MV × NV × KV three-dimensional space matrix V(VEH_Ω), v_ijk ∈ V(VEH_Ω), and an MP × NP × KP three-dimensional space matrix P(PED_Ω), p_ijk ∈ P(PED_Ω), respectively;
In VEH_Ω: v_ijk is the laser intensity of the laser point in row i of the X coordinate axis, column j of the Y coordinate axis and layer k of the Z coordinate axis of VEH_Ω; i = 1, 2, …, MV; j = 1, 2, …, NV; k = 1, 2, …, KV; MV is the total number of rows of the laser point cloud along the X coordinate axis in VEH_Ω; NV is the total number of columns along the Y coordinate axis in VEH_Ω; KV is the total number of layers along the Z coordinate axis in VEH_Ω;
In PED_Ω: p_ijk is the laser intensity of the laser point in row i of the X coordinate axis, column j of the Y coordinate axis and layer k of the Z coordinate axis of PED_Ω; i = 1, 2, …, MP; j = 1, 2, …, NP; k = 1, 2, …, KP; MP is the total number of rows of the laser point cloud along the X coordinate axis in PED_Ω; NP is the total number of columns along the Y coordinate axis in PED_Ω; KP is the total number of layers along the Z coordinate axis in PED_Ω;
the intersection inlet direction scanned by the 3D laser radar sensor in VEH _ omega and PED _ omega is the Y coordinate axis direction; the pedestrian crossing direction in the intersection inlet direction scanned by the 3D laser radar sensor is the X coordinate axis direction; the vertical direction of the ground at the intersection is the direction of a Z coordinate axis;
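The voxelisation described in Step 3 can be sketched as follows. This is not the patented implementation: the 0.5 m voxel edge, the `voxelize` helper and the toy frame are assumptions chosen only to show how a point cloud frame maps onto an MV × NV × KV intensity matrix such as V(VEH_Ω).

```python
import numpy as np

def voxelize(points, intensities, origin, size, resolution=0.5):
    """Bin laser points into a 3D intensity matrix (one cell per voxel).

    points      : (N, 3) array of x, y, z coordinates of the laser points
    intensities : (N,)   array of laser intensities
    origin      : (3,)   lower corner of the detection cuboid (e.g. VEH_Omega)
    size        : (3,)   cuboid extent, e.g. (VL, VW, H)
    resolution  : voxel edge length in metres (assumed value)
    """
    dims = np.ceil(np.asarray(size, float) / resolution).astype(int)   # MV, NV, KV
    grid = np.zeros(dims, dtype=np.float32)
    idx = np.floor((points - np.asarray(origin, float)) / resolution).astype(int)
    keep = np.all((idx >= 0) & (idx < dims), axis=1)   # drop points outside the cuboid
    idx, vals = idx[keep], intensities[keep]
    np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), vals)  # accumulate intensity per voxel
    return grid

# toy frame: 1000 random points inside a 40 m x 7 m x 5 m entrance cuboid
rng = np.random.default_rng(0)
pts = rng.uniform([0.0, 0.0, 0.0], [40.0, 7.0, 5.0], size=(1000, 3))
V = voxelize(pts, rng.uniform(0, 255, 1000), origin=(0, 0, 0), size=(40.0, 7.0, 5.0))
print(V.shape)  # (80, 14, 10) -> MV x NV x KV
```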
Step 4: background-filter the laser point cloud within the VEH_Ω space:
Step 4.1: define A as a cuboid containing UV × VV × WV laser points, A ∈ VEH_Ω; [notation published as images] denotes, respectively, the cuboids A in row i, column j of the upper layer and of the lower layer of VEH_Ω, and the cuboids A in column j, layer k of the left wall and of the right wall of VEH_Ω, with i = 1,2,…,MV′, j = 1,2,…,NV′, k = 1,2,…,KV′; [a further formula, published as an image]
Let δ(A) be the laser point cloud density of cuboid A, and compute the laser point cloud density fluctuation rate of A [four formulas, published as images];
wherein: [symbols published as images] are the average laser point cloud densities of the upper layer, lower layer, left wall and right wall of VEH_Ω taken through the cross-section of A; [symbols published as images] are the density fluctuation rates of cuboid A at row i, column j of the upper and lower layers of VEH_Ω; [symbols published as images] are the density fluctuation rates of cuboid A at column j, layer k of the left and right walls of VEH_Ω;
Step 4.2: judge whether each cuboid A in the upper layer, lower layer, left wall and right wall of VEH_Ω is a background laser point cloud, and filter the background out:
if [the upper-layer threshold condition, published as an image] holds, the laser point cloud contained in the cuboid A at row i, column j of the upper layer of VEH_Ω is judged to be background laser point cloud;
if [the lower-layer threshold condition, published as an image] holds, the laser point cloud contained in the cuboid A at row i, column j of the lower layer of VEH_Ω is judged to be background laser point cloud;
if [the left-wall threshold condition, published as an image] holds, the laser point cloud contained in the cuboid A at column j, layer k of the left wall of VEH_Ω is judged to be background laser point cloud;
if [the right-wall threshold condition, published as an image] holds, the laser point cloud contained in the cuboid A at column j, layer k of the right wall of VEH_Ω is judged to be background laser point cloud;
wherein: τ_UP(T), τ_DW(T), τ_LF(T), τ_RT(T) are the critical thresholds for background point-cloud density fluctuation of cuboid A in the upper layer, lower layer, left wall and right wall of VEH_Ω in the T-th judgment period;
Step 4.3: repeat Step 4.1 to Step 4.2 to judge all cuboids A in the upper layer, lower layer, left wall and right wall of VEH_Ω, and remove all background laser point clouds to form a new three-dimensional matrix V′(VEH_Ω), v′_ijk ∈ V′(VEH_Ω) (a sketch of this background test follows below);
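The exact fluctuation-rate formulas and thresholds of Step 4 are published only as images, so the sketch below assumes the fluctuation rate is the relative deviation of a boundary cuboid's density δ(A) from the average density of its layer or wall, compared against a threshold τ; the names and values are illustrative only.

```python
import numpy as np

def fluctuation_rates(layer_densities):
    """Assumed form of the density fluctuation rate: relative deviation of each
    boundary cuboid's density delta(A) from the mean density of its layer/wall."""
    mean = float(np.mean(layer_densities))
    return np.abs(layer_densities - mean) / max(mean, 1e-9)

def background_mask(layer_densities, tau):
    """Cuboids whose fluctuation rate stays at or below the threshold tau are
    treated as static background; the returned mask is True for background cells."""
    return fluctuation_rates(layer_densities) <= tau

# toy upper layer of VEH_Omega: uniform background except one cell disturbed by a vehicle roof
upper = np.full((6, 4), 5.0)
upper[2, 1] = 9.0
mask = background_mask(upper, tau=0.3)
print(mask.astype(int))          # 0 marks the disturbed (foreground) cell
print(int(mask.size - mask.sum()), "foreground cuboid(s)")
```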
Step 5: background-filter the laser point cloud within the PED_Ω space:
Step 5.1: define B as a cuboid containing UP × VP × WP laser points, B ∈ PED_Ω; [notation published as images] denotes, respectively, the cuboids B in row i, column j of the upper layer and of the lower layer of PED_Ω, and the cuboids B in column j, layer k of the left wall and of the right wall of PED_Ω, with i = 1,2,…,MP′, j = 1,2,…,NP′, k = 1,2,…,KP′; [a further formula, published as an image]
Let δ(B) be the laser point cloud density of cuboid B, and compute the laser point cloud density fluctuation rate of B [four formulas, published as images];
wherein: [symbols published as images] are the average laser point cloud densities of the upper layer, lower layer, left wall and right wall of PED_Ω taken through the cross-section of B; [symbols published as images] are the density fluctuation rates of cuboid B at row i, column j of the upper and lower layers of PED_Ω; [symbols published as images] are the density fluctuation rates of cuboid B at column j, layer k of the left and right walls of PED_Ω;
Step 5.2: judge whether each cuboid B in the upper layer, lower layer, left wall and right wall of PED_Ω is a background laser point cloud, and filter the background out:
if [the upper-layer threshold condition, published as an image] holds, the laser point cloud contained in the cuboid B at row i, column j of the upper layer of PED_Ω is judged to be background laser point cloud;
if [the lower-layer threshold condition, published as an image] holds, the laser point cloud contained in the cuboid B at row i, column j of the lower layer of PED_Ω is judged to be background laser point cloud;
if [the left-wall threshold condition, published as an image] holds, the laser point cloud contained in the cuboid B at column j, layer k of the left wall of PED_Ω is judged to be background laser point cloud;
if [the right-wall threshold condition, published as an image] holds, the laser point cloud contained in the cuboid B at column j, layer k of the right wall of PED_Ω is judged to be background laser point cloud;
wherein: λ_UP(T), λ_DW(T), λ_LF(T), λ_RT(T) are the critical thresholds for background point-cloud density fluctuation of cuboid B in the upper layer, lower layer, left wall and right wall of PED_Ω in the T-th judgment period;
Step 5.3: repeat Step 5.1 to Step 5.2 to judge all cuboids B in the upper layer, lower layer, left wall and right wall of PED_Ω, and remove all background laser point clouds to form a new three-dimensional matrix P′(PED_Ω), p′_ijk ∈ P′(PED_Ω);
Step 6: divide the X–Y plane of VEH_Ω into NC × NL grids, each grid holding a cuboid E of WL × LV × H, where WL is the lane width in the detection direction of the 3D laser radar sensor, LV is the average vehicle length and H is the mounting height of the 3D laser radar sensor above the ground; E ∈ VEH_Ω, and E_ij is the cuboid of VEH_Ω in row i of lane j, with i = 1,2,…,NC and j = 1,2,…,NL; [a further formula, published as an image]
Step 6.1: calculate the laser point cloud density δ(E) of each cuboid E in VEH_Ω [formula published as an image];
wherein: [symbol published as an image] is the sum of the point cloud densities of all laser points in E; [symbol published as an image] is the volume of cuboid E;
Step 6.2: judge whether a vehicle is present in cuboid E:
(1) if δ(E) ≥ ε_V, a vehicle is judged to be present in cuboid E;
(2) if ε_bV ≤ δ(E) < ε_V, split cuboid E into a front part and a rear part of equal volume [denoted by symbols published as images]; if [the densities of the front part of E and of the rear half of the preceding adjacent cuboid satisfy the condition published as an image], the vehicle is present at the front of cuboid E, and the front part of E is merged with the rear half of the preceding adjacent cuboid and marked as a new cuboid E′; if [the densities of the rear part of E and of the front half of the following adjacent cuboid satisfy the condition published as an image], the vehicle is present at the rear of cuboid E, and the rear part of E is merged with the front half of the following adjacent cuboid and marked as a new cuboid E′;
wherein: [symbol published as an image] is the rear half of the preceding cuboid adjacent to E; [symbol published as an image] is the front half of the following cuboid adjacent to E; ε_V is the critical density for a vehicle being present in cuboid E; ε_bV is the critical density for cuboid E being partially occupied by a vehicle;
Step 6.3: given the minimum number of vehicle neighbouring points Min_VehPts and the neighbourhood radius R_Veh, traverse all laser points Li_Pt in cuboid E or E′ and count the number NumPt(Li_Pt) of laser points within the neighbourhood radius R_Veh of each laser point Li_Pt; if NumPt(Li_Pt) ≥ Min_VehPts, mark the laser point Li_Pt as a vehicle centre-of-mass point VehCore_Pt of cuboid E or E′;
Step 6.4: repeat Step 6.1 to Step 6.3 and, from the positions and number of the vehicle centre-of-mass points, obtain the number of vehicles VehNum and the queue length VehQue of each entrance lane in the current laser point cloud data frame VEH_Ω (a sketch of Steps 6.1–6.3 follows below);
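Steps 6.1–6.3 amount to a per-cell occupancy test followed by a density-based core-point search, similar in spirit to DBSCAN core points. The sketch below is an illustration under assumed thresholds (eps_v, R_Veh, Min_VehPts); it is not the patented parameterisation.

```python
import numpy as np

def cell_has_vehicle(cell_density, eps_v=0.5):
    """Step 6.2(1) analogue: cuboid E is occupied if delta(E) >= eps_v (assumed threshold)."""
    return cell_density >= eps_v

def vehicle_core_points(points, r_veh=1.0, min_veh_pts=8):
    """Step 6.3 analogue: a laser point is a VehCore_Pt candidate if at least
    min_veh_pts other points lie within radius r_veh of it."""
    cores = []
    for i, p in enumerate(points):
        dists = np.linalg.norm(points - p, axis=1)
        if np.count_nonzero(dists <= r_veh) - 1 >= min_veh_pts:   # exclude the point itself
            cores.append(i)
    return cores

# toy lane cell: one dense cluster (a vehicle) plus scattered clutter points
rng = np.random.default_rng(1)
cluster = rng.normal([10.0, 2.0, 0.8], 0.3, size=(40, 3))
clutter = rng.uniform([0.0, 0.0, 0.0], [40.0, 7.0, 3.0], size=(20, 3))
pts = np.vstack([cluster, clutter])
density = len(pts) / (3.5 * 6.0 * 3.0)          # points per cubic metre in a toy WL x LV x H cell
print(cell_has_vehicle(density), len(vehicle_core_points(pts)), "core points")
```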
Step 7: divide the X–Y plane of PED_Ω into NP × NT grids, each grid holding a cuboid S of WP × PT × H, where WP is the average longitudinal space a walking pedestrian requires, PT is the average transverse space a pedestrian occupies, and H is the mounting height of the laser radar sensor above the ground; S ∈ PED_Ω, and S_ij is the cuboid of PED_Ω in row i, column j of the crosswalk, with i = 1,2,…,NP and j = 1,2,…,NT; [a further formula, published as an image]
Step 7.1: calculate the laser point cloud density δ(S) of each cuboid S in PED_Ω [formula published as an image];
wherein: [symbol published as an image] is the sum of the point cloud densities of all laser points in S; [symbol published as an image] is the volume of cuboid S;
Step 7.2: judge whether a pedestrian is present in cuboid S:
(1) if δ(S) ≥ ε_p, a pedestrian is judged to be present in cuboid S;
(2) if ε_bp ≤ δ(S) < ε_p, split cuboid S into a front part and a rear part of equal volume, or a left part and a right part of equal volume [denoted by symbols published as images]; if [the densities of the front part of S and of the rear half of the preceding adjacent cuboid satisfy the condition published as an image], the pedestrian is present at the front of cuboid S, and the front part of S is merged with the rear half of the preceding adjacent cuboid and marked as a new cuboid S′; if [the densities of the rear part of S and of the front half of the following adjacent cuboid satisfy the condition published as an image], the pedestrian is present at the rear of cuboid S, and the rear part of S is merged with the front half of the following adjacent cuboid and marked as a new cuboid S′; if [the right-side condition published as an image] holds, the pedestrian is present at the right side of cuboid S, and the right part of S is merged with the indicated adjacent half and marked as a new cuboid S′; if [the left-side condition published as an image] holds, the pedestrian is present at the left side of cuboid S, and the left part of S is merged with the indicated adjacent half and marked as a new cuboid S′;
wherein: [symbol published as an image] is the rear half of the preceding cuboid adjacent to S; [symbol published as an image] is the front half of the following cuboid adjacent to S; ε_p is the critical density for a pedestrian being present in cuboid S; ε_bp is the critical density for cuboid S being partially occupied by a pedestrian;
Step 7.3: given the minimum number of pedestrian neighbouring points Min_PedPts and the neighbourhood radius R_Ped, traverse all laser points Li_Pt in cuboid S or S′ and count the number NumPt(Li_Pt) of laser points within the neighbourhood radius R_Ped of each laser point Li_Pt; if NumPt(Li_Pt) ≥ Min_PedPts, mark the laser point Li_Pt as a pedestrian centre-of-mass point PedCore_Pt of cuboid S or S′;
Step 7.4: repeat Step 7.1 to Step 7.3 and, from the positions and number of the pedestrian centre-of-mass points, obtain the number of pedestrians VPedNum and the pedestrian position information PedLoInfo on the crosswalk and in the crossing waiting area in the current laser point cloud data frame PED_Ω;
Step 8: repeat Step 2 to Step 7 for every data frame transmitted by the 3D laser radar sensor, track the positions of the vehicles and pedestrians in each frame to obtain the vehicle and pedestrian running tracks VehTrace and PedTrace, and derive the vehicle and pedestrian speed information VehSpeedInfo and PedSpeedInfo from the frame-to-frame position changes (a tracking sketch follows below);
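Step 8 does not state which tracker is used; a minimal frame-to-frame nearest-neighbour association, sketched below, is one common choice. The frame period, gating distance and function names are assumptions.

```python
import numpy as np

FRAME_DT = 0.1   # assumed lidar frame period in seconds

def associate(prev_centroids, curr_centroids, max_jump=3.0):
    """Greedily pair each previous centroid with its nearest current centroid,
    rejecting jumps larger than max_jump metres (assumed gate)."""
    pairs, used = [], set()
    for i, p in enumerate(prev_centroids):
        d = np.linalg.norm(curr_centroids - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_jump and j not in used:
            pairs.append((i, j))
            used.add(j)
    return pairs

def speeds(prev_centroids, curr_centroids, pairs, dt=FRAME_DT):
    """Speed estimate (m/s) from centroid displacement between two frames."""
    return {j: float(np.linalg.norm(curr_centroids[j] - prev_centroids[i]) / dt)
            for i, j in pairs}

prev = np.array([[5.0, 2.0], [15.0, 5.5]])   # vehicle and pedestrian centroids, frame t-1
curr = np.array([[6.2, 2.0], [15.1, 5.5]])   # same objects, frame t
pairs = associate(prev, curr)
print(speeds(prev, curr, pairs))             # approx. {0: 12.0, 1: 1.0}
```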
Step 9: the multi-core laser point cloud micro-processing module transmits the number, position, speed and running-track information of the vehicles and pedestrians in the detection area to the traffic signal interaction coordination unit over the EMIF bus.
2. The 3D lidar-based self-sensing interactive traffic signal control apparatus of claim 1, wherein:
the traffic signal interaction coordination unit consists of a communication module, an FPGA main control module, a DSP safety monitoring module and a PCB backplane; the communication module, the FPGA main control module and the DSP safety monitoring module exchange data over an EMIF bus and are laid out and fixed on the PCB backplane; the communication module is responsible for exchanging and sharing information with the self-sensing interactive traffic signal control devices mounted on the other cantilever light poles of the same intersection; the FPGA main control module optimises the traffic signal phases and green durations of the corresponding entrance direction according to the traffic flow parameter information provided by the laser radar detection unit, and coordinates phases, phase sequences and green durations with the self-sensing interactive traffic signal control devices on the other cantilever light poles of the intersection through the communication module; the DSP safety monitoring module dynamically monitors the safety state of the traffic flow and of the pedestrian flow on the crosswalk, issues safety warnings, and automatically intervenes in and quickly adjusts the traffic signals under emergency traffic events; the specific working steps are as follows:
Step 1: the FPGA main control module predicts, at a fixed interval T_GAP, the queued traffic volume of each entrance lane in the detection direction [prediction formula published as an image] (an assumed form is sketched below);
wherein: Vol(T + T_GAP), Vol(T) and Vol(T − T_GAP) are the entrance-lane volumes of the next prediction interval, the current prediction interval and the previous prediction interval at time T, respectively; Vel(T) and Vel(T − T_GAP) are the average vehicle speeds on the entrance lane in the current and previous prediction intervals at time T; α and β are correction parameters;
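The prediction formula itself is published only as an image. A minimal sketch, assuming a corrected linear extrapolation over the last two intervals with α and β as the correction parameters named in the claim, is given below; the exact functional form is an assumption.

```python
def predict_volume(vol_t, vol_prev, vel_t, vel_prev, alpha=0.5, beta=0.5):
    """Hypothetical Vol(T + T_GAP): extrapolate the volume trend and correct it
    with the change of average approach speed (slowing traffic -> growing queue)."""
    trend = vol_t + alpha * (vol_t - vol_prev)
    speed_correction = beta * (vel_prev - vel_t)
    return max(trend + speed_correction, 0.0)

# 42 and 36 vehicles in the last two intervals, average speed dropping from 10 to 8 m/s
print(predict_volume(vol_t=42, vol_prev=36, vel_t=8.0, vel_prev=10.0))   # 46.0
```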
Step 2: the FPGA main control module estimates the time required for the queued vehicles of each entrance lane to dissipate [dissipation-time formula published as an image] (an assumed form is sketched below);
wherein: disT is the dissipation time of the queued vehicles in the lane; μ is the queued-vehicle start-up delay; γ is the queue-dispersion time correction coefficient; [symbol published as an image] is the average dissipation speed of the queued vehicles;
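The disT formula is likewise published only as an image. A commonly used form, assumed here, is the start-up delay μ plus the queue length divided by the average dissipation speed, scaled by the dispersion correction γ.

```python
def dissipation_time(queue_len_m, avg_dissipation_speed, mu=2.0, gamma=1.1):
    """Assumed form of disT: start-up delay plus queue discharge time, with the
    dispersion correction gamma; the mu and gamma values are placeholders."""
    if avg_dissipation_speed <= 0:
        raise ValueError("average dissipation speed must be positive")
    return mu + gamma * queue_len_m / avg_dissipation_speed

print(round(dissipation_time(queue_len_m=60.0, avg_dissipation_speed=5.0), 1))  # 15.2 s
```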
Step 3: the FPGA main control module determines the queued-vehicle dissipation time required by each traffic direction of the entrance lane:
disT_ST = max(disT_ST_1, disT_ST_2, …, disT_ST_n)
disT_LT = max(disT_LT_1, disT_LT_2, …, disT_LT_m)
wherein: disT_ST and disT_LT are the queued-vehicle dissipation times required by the straight-through and left-turn traffic directions of the entrance lane, respectively; disT_ST_1, disT_ST_2, …, disT_ST_n are the queued-vehicle dissipation times required by the 1st, 2nd, …, n-th straight-through lanes; disT_LT_1, disT_LT_2, …, disT_LT_m are the queued-vehicle dissipation times required by the 1st, 2nd, …, m-th left-turn lanes;
Step 4: the FPGA main control module calculates the difference between the dissipation time disT_ST of the vehicles queued in the straight-through direction of the entrance lane and the dissipation time disT_LT of the vehicles queued in the left-turn direction:
ΔdisT = |disT_ST − disT_LT|
(1) if ΔdisT ≤ ξ, the FPGA main control module merges the straight-through traffic signal phase and the left-turn traffic signal phase of the entrance lane into the same signal phase stage, with a green time of max(disT_ST, disT_LT);
(2) if ΔdisT > ξ, the FPGA main control module obtains, through the communication module, the queued-vehicle dissipation times disT_ST′ and disT_LT′ of the straight-through and left-turn traffic directions of the opposing entrance lane;
(2.1) if |disT_ST − disT_ST′| ≤ ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module to merge the straight-through traffic signal phase of the entrance lane and that of the opposing entrance lane into the same signal phase stage, with a green time of max(disT_ST, disT_ST′);
(2.2) if |disT_LT − disT_LT′| ≤ ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module to merge the left-turn traffic signal phase of the entrance lane and that of the opposing entrance lane into the same signal phase stage, with a green time of max(disT_LT, disT_LT′);
(2.3) if |disT_ST − disT_ST′| ≤ ξ and |disT_LT − disT_LT′| > ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module:
the straight-through traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of max(disT_ST, disT_ST′);
if disT_LT > disT_LT′ and maxVOL_ST(T + T_GAP) > maxVOL_LT′(T + T_GAP), the left-turn traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of min(disT_LT, disT_LT′); an overlap phase stage is added for the entrance lane, allowing the straight-through and left-turn traffic directions of this entrance to be released simultaneously, with an overlap green time of |disT_LT − disT_LT′|;
if disT_LT > disT_LT′ and maxVOL_ST(T + T_GAP) < maxVOL_LT′(T + T_GAP), the left-turn traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of disT_LT;
if disT_LT < disT_LT′ and maxVOL_LT(T + T_GAP) > maxVOL_ST′(T + T_GAP), the left-turn traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of disT_LT′;
if disT_LT < disT_LT′ and maxVOL_LT(T + T_GAP) < maxVOL_ST′(T + T_GAP), the left-turn traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of min(disT_LT, disT_LT′); an overlap phase stage is added for the opposing entrance lane, allowing the straight-through and left-turn traffic directions of the opposing entrance to be released simultaneously, with an overlap green time of |disT_LT − disT_LT′|;
(2.4) if |disT_ST − disT_ST′| > ξ and |disT_LT − disT_LT′| ≤ ξ, the FPGA main control module coordinates with the opposing FPGA main control module through the communication module:
the left-turn traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of max(disT_LT, disT_LT′);
if disT_ST > disT_ST′ and maxVOL_LT(T + T_GAP) > maxVOL_ST′(T + T_GAP), the straight-through traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of min(disT_ST, disT_ST′); an overlap phase stage is added for the entrance lane, allowing the straight-through and left-turn traffic directions of this entrance to be released simultaneously, with an overlap green time of |disT_ST − disT_ST′|;
if disT_ST > disT_ST′ and maxVOL_LT(T + T_GAP) < maxVOL_ST′(T + T_GAP), the straight-through traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of disT_ST;
if disT_ST < disT_ST′ and maxVOL_ST(T + T_GAP) > maxVOL_LT′(T + T_GAP), the straight-through traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of disT_ST′;
if disT_ST < disT_ST′ and maxVOL_ST(T + T_GAP) < maxVOL_LT′(T + T_GAP), the straight-through traffic signal phase of the entrance lane and that of the opposing entrance lane are merged into the same signal phase stage, with a green time of min(disT_ST, disT_ST′); an overlap phase stage is added for the opposing entrance lane, allowing the straight-through and left-turn traffic directions of the opposing entrance to be released simultaneously, with an overlap green time of |disT_ST − disT_ST′|;
wherein: ξ is the phase-stage splitting critical threshold; maxVOL_ST(T + T_GAP) and maxVOL_ST′(T + T_GAP) are, for the next prediction interval T_GAP at time T, the maximum straight-through lane volumes of the straight-through direction of the entrance lane and of the opposing entrance lane, respectively; maxVOL_LT(T + T_GAP) and maxVOL_LT′(T + T_GAP) are, for the next prediction interval T_GAP at time T, the maximum left-turn lane volumes of the left-turn direction of the entrance lane and of the opposing entrance lane, respectively (a decision sketch for this step follows below);
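The branching of Step 4 can be condensed into a small decision routine. The sketch below covers only the top-level split (merge the straight-through and left-turn phases of one approach when their dissipation times differ by at most ξ, otherwise pair each movement with the opposing approach); the overlap-phase sub-cases are omitted and the value of ξ is assumed.

```python
def merge_phases(dis_st, dis_lt, dis_st_opp, dis_lt_opp, xi=5.0):
    """Top-level phase-merging decision of Step 4 (overlap-phase sub-cases omitted);
    xi is the assumed phase-stage splitting threshold in seconds."""
    if abs(dis_st - dis_lt) <= xi:
        # split-phase release: the whole approach goes in one stage
        return [("approach: straight + left turn", max(dis_st, dis_lt))]
    stages = []
    if abs(dis_st - dis_st_opp) <= xi:
        stages.append(("straight + opposing straight", max(dis_st, dis_st_opp)))
    if abs(dis_lt - dis_lt_opp) <= xi:
        stages.append(("left turn + opposing left turn", max(dis_lt, dis_lt_opp)))
    return stages or [("no balanced pairing: fall back to overlap phases", None)]

print(merge_phases(dis_st=28.0, dis_lt=12.0, dis_st_opp=26.0, dis_lt_opp=30.0))
```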
Step 5: through the communication module, the FPGA main control module coordinates with the other FPGA main control modules of the intersection and, based on the green durations of the traffic phases in each entrance direction calculated by the individual FPGA main control modules and the Webster intersection-delay formula, determines the execution order of the signal phases, i.e. the phase sequence, under the principle of minimum intersection delay (a Webster-delay sketch follows below);
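Step 5 refers to the Webster intersection-delay formula. The two-term Webster approximation below is the textbook form (the claim does not reproduce it), used here only to show how candidate green allocations could be compared by their average delay.

```python
def webster_delay(cycle, green, flow, sat_flow):
    """Two-term Webster average delay per vehicle (s/veh).

    cycle    : cycle length C in seconds
    green    : effective green g in seconds for the movement
    flow     : arrival flow q in veh/s
    sat_flow : saturation flow s in veh/s
    """
    lam = green / cycle                    # green ratio
    x = flow / (lam * sat_flow)            # degree of saturation
    if x >= 1.0:
        return float("inf")                # oversaturated: the formula no longer applies
    uniform_term = cycle * (1 - lam) ** 2 / (2 * (1 - lam * x))
    random_term = x ** 2 / (2 * flow * (1 - x))
    return uniform_term + random_term

# compare two candidate green times for one movement at a 90 s cycle
for g in (25.0, 35.0):
    print(g, "s green ->", round(webster_delay(cycle=90.0, green=g, flow=0.10, sat_flow=0.50), 1), "s/veh")
```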
Step 6: the DSP safety monitoring module monitors the running state of vehicles and pedestrians in real time:
Step 6.1: while the vehicle green signal is about to end, the DSP safety monitoring module monitors in real time the vehicles about to enter the intersection from the entrance direction and calculates whether the current vehicle can stop safely or pass through the intersection safely [the X_0 and X_C formulas are published as images; an assumed form is sketched below];
wherein: X_0 is the minimum safe stopping distance for the vehicle to decelerate and stop; X_C is the maximum safe distance from which the vehicle can still accelerate and clear the intersection; V_0 is the running speed of the vehicle about to enter the intersection; σ is the driver's average brake reaction time; α_− is the vehicle's average braking deceleration; α_+ is the vehicle's average acceleration; GreenT is the remaining passing (green) time; W is the width of the intersection; L is the average vehicle length;
Step 6.2: if X_0 > X_C, the vehicle can neither pass safely nor stop safely; the DSP safety monitoring module coordinates with the FPGA main control module to extend the green time of the vehicle's travel direction so that the vehicle can pass safely, and sends a DilemmaType1 instruction data packet to the traffic signal driving control unit;
Step 6.3: if X_0 < X_C, the DSP safety monitoring module coordinates with the traffic signal driving control unit to flash the signal lamp in the vehicle's travel direction, prompting the driver either to decelerate and stop or to accelerate and pass, and sends a DilemmaType2 instruction data packet to the traffic signal driving control unit (a dilemma-zone sketch follows below);
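The X_0 and X_C expressions appear only as images in the claim. The sketch below uses the standard dilemma-zone forms consistent with the parameters listed above (reaction-plus-braking stopping distance, and the distance clearable within the remaining green while accelerating); treating them as the patented formulas is an assumption.

```python
def min_stop_distance(v0, sigma, a_brake):
    """Assumed X0: reaction-time travel plus braking distance."""
    return v0 * sigma + v0 ** 2 / (2.0 * a_brake)

def max_clear_distance(v0, a_acc, green_t, width, veh_len):
    """Assumed XC: distance from which the vehicle can still clear the
    intersection (width plus vehicle length) within the remaining green."""
    return v0 * green_t + 0.5 * a_acc * green_t ** 2 - (width + veh_len)

def dilemma_check(v0, sigma, a_brake, a_acc, green_t, width, veh_len):
    x0 = min_stop_distance(v0, sigma, a_brake)
    xc = max_clear_distance(v0, a_acc, green_t, width, veh_len)
    # X0 > XC: the vehicle can neither stop nor clear -> extend the green (DilemmaType1)
    # X0 < XC: prompt the driver to stop or to accelerate through    (DilemmaType2)
    return "DilemmaType1" if x0 > xc else "DilemmaType2"

print(dilemma_check(v0=14.0, sigma=1.0, a_brake=3.0, a_acc=1.5,
                    green_t=3.0, width=30.0, veh_len=5.0))   # DilemmaType1
```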
Step 6.4: the DSP safety monitoring module monitors the running tracks of pedestrians and vehicles in the entrance direction in real time; once a conflict between pedestrian and vehicle tracks is detected, it sends a DangerInfo safety-warning instruction data packet carrying the message "Danger, please pay attention to safety!" to the traffic signal driving control unit;
Step 6.5: when the vehicle green signal is about to end, the DSP safety monitoring module monitors in real time whether pedestrians are on the crosswalk; once a pedestrian is detected at the kerb-side boundary of the crosswalk, it sends a PedWarningType1 safety-warning instruction data packet carrying the message "Please wait patiently and return to the waiting area!";
Step 6.6: during the vehicle-red / pedestrian-green period, the DSP safety monitoring module monitors in real time whether pedestrians are walking on the crosswalk; once a pedestrian's walking track is detected to leave the crosswalk, it sends a PedWarningType2 safety-warning instruction data packet carrying the message "Please cross in a civilised manner and walk on the zebra crossing!";
Step 7: the FPGA main control module and the DSP safety monitoring module transmit the traffic signal control scheme, including the green time of each signal phase and the execution order of the signal phases, together with the traffic safety warning instructions, to the traffic signal driving control unit over the EMIF bus.
3. The 3D lidar-based self-sensing interactive traffic signal control apparatus of claim 1, wherein:
the traffic signal driving control unit consists of a traffic signal display module, a traffic signal prompt module, a traffic safety early-warning module, a signal lamp group, an LED micro display screen and a 6U VPX switching signal interface board; the traffic signal display module, the traffic signal prompt module, the traffic safety early-warning module, the signal lamp group and the LED micro display screen are connected to the 6U VPX switching signal interface board through serial ports for data communication; the traffic signal display module drives and controls the switching time, lighting state and displayed patterns of the signal lamp group according to the traffic signal control scheme transmitted by the traffic signal interaction coordination unit; the traffic signal prompt module controls the information shown on the LED micro display screen according to the traffic signal control scheme transmitted by the traffic signal interaction coordination unit, providing information prompts and dynamic guidance on the passing state and passing time of pedestrians and vehicles; the traffic safety early-warning module issues safety warnings to the pedestrian flow and traffic flow at the intersection according to the traffic safety warning instructions transmitted by the traffic signal interaction coordination unit;
Step 1: the traffic signal display module receives the traffic signal control scheme transmitted by the traffic signal interaction coordination unit, converts its digital information into analogue signals, and drives and controls the lighting time, lighting colour, lighting order, transition order and transition form of the different lamp colours and the patterns they display;
Step 2: the traffic signal prompt module receives the traffic signal control scheme transmitted by the traffic signal interaction coordination unit, drives and controls the LED micro display screen, and prompts and guides pedestrians and vehicles on the passing state, passing time, remaining time and direction of travel;
Step 3: the traffic safety early-warning module receives the traffic safety warning instructions transmitted by the traffic signal interaction coordination unit and issues safety warnings according to the different instructions:
Step 3.1: when the traffic safety early-warning module receives a DilemmaType1 instruction data packet, it takes over control of the signal lamp group from the traffic signal display module, extends the green duration of the specified traffic direction according to the DilemmaType1 instruction content, and, when the extension ends, releases the signal lamp group and returns control to the traffic signal display module;
Step 3.2: when the traffic safety early-warning module receives a DilemmaType2 instruction data packet, it takes over control of the signal lamp group from the traffic signal display module, flashes the signal lamp of the specified traffic direction according to the DilemmaType2 instruction content, and, when the flashing ends, releases the signal lamp group and returns control to the traffic signal display module;
Step 3.3: when the traffic safety early-warning module receives a DangerInfo instruction data packet, it takes over control of the signal lamp group from the traffic signal display module, starts a flashing-red warning in the specified traffic direction according to the DangerInfo instruction content, and announces "Danger, please pay attention to safety!" through voice and the LED micro display screen;
Step 3.4: when the traffic safety early-warning module receives a PedWarningType1 instruction data packet, it warns pedestrians "Please wait patiently and return to the waiting area!" through voice and the LED micro display screen;
Step 3.5: when the traffic safety early-warning module receives a PedWarningType2 instruction data packet, it warns pedestrians "Please cross in a civilised manner and walk on the zebra crossing!" through voice and the LED micro display screen (a dispatcher sketch follows below).
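The handling of the five warning packets in Steps 3.1–3.5 amounts to a small dispatcher. The table-driven sketch below paraphrases the claim's actions; the data structure and message wording are assumptions.

```python
WARNING_ACTIONS = {
    # packet type -> (takes over the signal lamp group?, action description)
    "DilemmaType1":    (True,  "extend the green of the specified direction, then return control"),
    "DilemmaType2":    (True,  "flash the signal of the specified direction, then return control"),
    "DangerInfo":      (True,  "start flashing red and announce: 'Danger, please pay attention to safety!'"),
    "PedWarningType1": (False, "announce: 'Please wait patiently and return to the waiting area!'"),
    "PedWarningType2": (False, "announce: 'Please cross in a civilised manner and walk on the zebra crossing!'"),
}

def handle_warning(packet_type: str) -> str:
    """Dispatch one safety-warning instruction packet to its action (sketch only)."""
    takes_over, action = WARNING_ACTIONS[packet_type]
    prefix = "take over the signal lamp group; " if takes_over else ""
    return prefix + action

for pkt in ("DilemmaType1", "PedWarningType2"):
    print(pkt, "->", handle_warning(pkt))
```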