WO2020139283A2 - Bubble eye system - Google Patents

Bubble eye system

Info

Publication number
WO2020139283A2
WO2020139283A2 (PCT/TR2019/051182)
Authority
WO
WIPO (PCT)
Prior art keywords
traffic
data
vehicles
detection
intersection
Prior art date
Application number
PCT/TR2019/051182
Other languages
French (fr)
Other versions
WO2020139283A3 (en)
Inventor
Emre TUNALI
Sinan ÖZ
Osman BAYTAROGLU
Original Assignee
İnnomoti̇ve Elektroni̇k Yazilim Araştirma Geli̇şti̇rme Sanayi̇ Ve Ti̇caret Li̇mi̇ted Şi̇rketi̇
Priority date
Filing date
Publication date
Application filed by İnnomoti̇ve Elektroni̇k Yazilim Araştirma Geli̇şti̇rme Sanayi̇ Ve Ti̇caret Li̇mi̇ted Şi̇rketi̇ filed Critical İnnomoti̇ve Elektroni̇k Yazilim Araştirma Geli̇şti̇rme Sanayi̇ Ve Ti̇caret Li̇mi̇ted Şi̇rketi̇
Publication of WO2020139283A2 publication Critical patent/WO2020139283A2/en
Publication of WO2020139283A3 publication Critical patent/WO2020139283A3/en

Links

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/54: Surveillance or monitoring of activities of traffic, e.g. cars on the road, trains or boats
    • G08G 1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108: Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G 1/0116: Measuring and analyzing of parameters relative to traffic conditions based on data from roadside infrastructure, e.g. beacons
    • G08G 1/0125: Traffic data processing
    • G08G 1/0129: Traffic data processing for creating historical data or processing based on historical data
    • G08G 1/0133: Traffic data processing for classifying traffic situation
    • G08G 1/0137: Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G 1/0141: Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination

Definitions

  • the system is designed for analyzing the traffic flow and disclosing traffic parameters which can be used for many different applications including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing dynamics of traffic.
  • vision based sensors have two major advantages over many other detection sensors or localization technologies.
  • vision based systems can capture tremendous amount of visual information over wide areas, often beyond the range of other sensors such as active radar, magnetic loop detector etc.
  • they are relatively inexpensive and can be easily installed without the need for any complementary equipment, disrupting road operation or altering infrastructure.
  • vision based sensors can be configured for additional tasks through software changes alone. In this manner, by analyzing visual data, a visual sensor removes the need to fuse data from many different sensor types to achieve multiple tasks at the same time. For instance, illegal turn, wrong way violation, stopped vehicle detection, parking violation, debris detection and abnormality detections can be achieved at the same time.
  • Our system can use any type of camera(s) as a sensor and analyzes traffic by using computer vision techniques.
  • The key feature of the system lies in its capability of processing visual data from omnidirectional (typically fisheye) cameras, any type of panoramic camera and single directional network cameras by a variety of novel computer vision methods developed in our solution, BubbleEye.
  • the Bubble Eye consists of five fundamental building blocks: object detection, target tracking, occlusion elimination, queue length estimation, and vehicle classification.
  • the system can produce vehicle presence, moving object detection, vehicle tracking, queue length estimation, waiting pedestrian detection, congestion detection, speed estimation, incident detection, entry-exit pair counting, and vehicle classification. Since these outputs reveal key information about the current traffic status, they can be used by traffic controllers for optimizing traffic flow in any infrastructure, including our primary target, intersections.
  • some of these estimated traffic parameters, especially entry-exit pair counting, provide valuable statistical data for traffic engineering to plan fixed time signalizations, to shape future infrastructure planning decisions and to pave the way for next generation traffic control approaches.
  • beyond intersection solutions, exactly the same technology can be employed for enhancing safety, observability and control of other traffic infrastructures, including highways, tunnels and bridges.
  • the system detects moving objects in the field of view, which is usually an intersection, and tracks these objects from their entry points to their exit points.
  • the information gathered from every moving object is transformed into traffic state information and is delivered to provide analysis such as:
  • these regions define entry legs of the field of view.
  • the region is drawn by the user via an installation application as a polygon.
  • the system automatically divides the polygon into grids.
  • the aim of this region is to provide information about traffic volume in the region and respective roads such as intersection leg. Traffic volume in the region and distribution of the volume is an important indicator to understand the traffic demand and anomalies inside the region such as stopped vehicle or traffic accident. In other words, benefitting from analysis made in these regions, saturation at an intersection leg can be successfully detected.
  • settings of the tracking operation such as at which grid tracking should start or at which grid traffic demand is counted and processed can be defined by the user.
  • the setting to process traffic demand can have multiple criteria. If no grid is selected by the user to start tracking, tracking of a vehicle is opened automatically at the moment the object exits the yellow arrival region.
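As an illustrative sketch of the grid logic above (not taken from the patent: the rectangular region, 4x4 grid and function names are assumptions), an arrival region can be divided into cells and tracking opened either at user-selected cells or at the moment an object exits the region:

```python
def make_grid(x0, y0, x1, y1, rows, cols):
    """Split the bounding box of an arrival region into grid cells."""
    cell_w = (x1 - x0) / cols
    cell_h = (y1 - y0) / rows
    return [(x0 + c * cell_w, y0 + r * cell_h,
             x0 + (c + 1) * cell_w, y0 + (r + 1) * cell_h)
            for r in range(rows) for c in range(cols)]

def cell_index(grid, px, py):
    """Return the index of the cell containing the point, or None if the
    point lies outside the arrival region."""
    for i, (gx0, gy0, gx1, gy1) in enumerate(grid):
        if gx0 <= px < gx1 and gy0 <= py < gy1:
            return i
    return None

def should_start_tracking(grid, start_cells, prev_pt, cur_pt):
    """Open tracking at a user-selected cell or, if none was selected,
    at the moment the object's centroid exits the region."""
    cur_cell = cell_index(grid, *cur_pt)
    if start_cells:                       # user picked specific start cells
        return cur_cell in start_cells
    prev_cell = cell_index(grid, *prev_pt)
    return prev_cell is not None and cur_cell is None
```

For example, with a 100x100 region split 4x4 and no start cell configured, a centroid moving from (95, 50) to (105, 50) has just left the region, so tracking opens.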
  • Entry Nodes Illustrated with white dots with black solid boundaries, these dots indicate entry points. Starting from these dots, the user draws possible turning movement paths inside the field of concern. If these dots are inside the arrival regions, tracking starts at the respective grid of the arrival region, else, tracking starts when the objects exit the arrival region. These entry nodes define entry points of entry exit pairs and the matrix formed by these pairs.
  • Exit Nodes Illustrated with big solid black dots, these dots indicate exit points. Objects moving close to these dots are considered as exiting through the respective node. The entry-exit pair of an object reaching the proximity of one of these nodes is decided via its route (trajectory) information and its entry node. All vehicle and traffic parameters related to the object are saved at this point. Then, tracking and analysis of the object terminate.
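The entry and exit nodes above define an entry-exit pair matrix. A hedged sketch of how a pair might be counted when a track ends near an exit node (node names, coordinates and the proximity radius are illustrative assumptions, not values from the patent):

```python
import math
from collections import Counter

def register_exit(od_counts, entry_node, exit_nodes, last_pos, radius=15.0):
    """When a tracked object ends near an exit node, count the
    (entry, exit) pair and return the exit node's name."""
    for name, (nx, ny) in exit_nodes.items():
        if math.hypot(nx - last_pos[0], ny - last_pos[1]) <= radius:
            od_counts[(entry_node, name)] += 1
            return name
    return None

# One cell of the entry-exit matrix per (entry, exit) pair.
od_counts = Counter()
```

Summing the counter over time yields the entry-exit matrix used for origin-destination counting.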
  • Routes are possible trajectories that vehicles can pass through between entry and exit nodes. With this extra information provided to the system, the effect of losses in tracking due to effects such as frame drops or occlusion is mitigated. In such an event of track loss, the data gathered by the system is analyzed to restart tracking of the vehicle in the proximity of the track loss location and is used to maximize the accuracy of the traffic volumes. Routes are illustrated as dashed arrows from entry nodes to exit nodes. Irrelevant Zone: Illustrated with a dotted mask, this zone defines the areas outside the field of concern, i.e. regions where traffic is not flowing or is not relevant in the camera's field of view. These regions remain visible to users for monitoring.
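The route-based recovery described above can be sketched as follows; the route polylines and the distance threshold are illustrative assumptions. A lost track is re-associated with the nearest route vertex so the vehicle can still be attributed to the correct entry-exit pair:

```python
import math

# Hypothetical routes as polylines from an entry node to an exit node.
routes = {"N_to_E": [(50, 0), (50, 40), (80, 50)],
          "N_to_S": [(50, 0), (50, 50), (50, 100)]}

def nearest_route(route_map, point, max_dist=30.0):
    """Return (route_id, vertex_index) of the route vertex closest to the
    last known position of a lost track, or None if nothing is near."""
    best, best_d = None, max_dist
    for rid, polyline in route_map.items():
        for i, (rx, ry) in enumerate(polyline):
            d = math.hypot(rx - point[0], ry - point[1])
            if d < best_d:
                best_d, best = d, (rid, i)
    return best
```

A track lost at (52, 38) re-attaches to route "N_to_E" at vertex 1, so its entry-exit attribution is preserved.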
  • Figure 4 Loop based systems (upper left) collect data from the regions with lighted overlay, indicating the location of the loop detectors; single directional network camera based systems (upper right) can only collect data from the lighted areas; while omnidirectional sensors (bottom) allow the system to collect data from the whole intersection.
  • the system is designed to use multiple cameras simultaneously, either to increase the regions where data can be collected or to increase the overall accuracy and performance of the system.
  • Placement of the cameras determines whether the cameras have overlapping regions in their fields of view or not. In both cases, the system is designed to benefit from data provided by multiple cameras simultaneously for the current analysis.
  • Each camera provides data for its respective field of concern.
  • the system is based on five main components:
  • the system has both image and video processing modules and a traffic data analysis module that uses the data obtained from these modules. Moreover, these data and the results of the analysis are presented to other systems as input in real time and shown to users.
  • the system has the following modules that generate data for the traffic analyzer module: moving object detection module, partial occlusion elimination module, track initialization module, omni-directional visual tracking module, origin-destination counting module, feature based classification module, and occupancy detection module.
  • Flowchart of the Bubble Eye system is given in Figure 1.
  • Moving vehicles are detected in the Bubble Eye system; vehicles that remain parked or inactive for a long time are excluded. Detection of moving vehicles is provided by the moving object detection module, which uses background subtraction techniques. In this way, vehicles that have not moved for a certain period of time are not detected, so vehicles parked at the roadside have no impact, which is desirable for the bubble eye system. If a parked vehicle starts to move at any time, the system will detect the movement and detect the vehicle.
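A minimal sketch in the spirit of the background subtraction described above. A running-average background model is one common technique; the learning rate and threshold values here are illustrative assumptions, not values from the patent:

```python
def update_background(bg, frame, alpha=0.05):
    """Blend the current frame into the background model. A vehicle that
    stays parked long enough is absorbed into the background, so it stops
    being detected, matching the parked-vehicle behaviour described."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def moving_mask(bg, frame, thresh=25.0):
    """Mark pixels that differ enough from the background as 'moving'."""
    return [[abs(f - b) > thresh for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]
```

A vehicle appearing in the frame shows up in the mask at once; if it stays still, repeated background updates absorb it and the mask goes quiet until it moves again.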
  • The partial occlusion elimination module distinguishes individual vehicles when more than one vehicle appears as a single vehicle.
  • In the track initialization module, the vehicles identified by the Moving Object Detection and Partial Occlusion Elimination modules are monitored and motion models are extracted to decide whether to track the vehicles.
  • Origin-Destination Counting module
  • the vehicles are continuously classified, according to their location, into one of four categories: bicycle/motorcycle, car, minibus/van and bus/truck.
  • classification is made by using deep learning. In this situation, it is possible that some vehicles do not fit any class; such vehicles are labeled as undefined.
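The undefined-class fallback described above can be sketched with a confidence threshold on the network's per-class scores; the threshold value and function names are illustrative assumptions:

```python
# Categories from the description; scores stand in for a deep network's
# per-class output.
CLASSES = ["bicycle/motorcycle", "car", "minibus/van", "bus/truck"]

def classify(scores, min_confidence=0.5):
    """Map per-class scores to a label, or 'undefined' when the best
    score is too low to trust."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return CLASSES[best] if scores[best] >= min_confidence else "undefined"
```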
  • Occupancy Detection Module The occupancy detection module extracts the density of the vehicles in the red light standby zone. For example, the queue information generated by vehicles waiting at the red light is produced at a 4-arm intersection. This information is used by the data analyzer. Furthermore, this information can be used instead of the occupancy information generated by a loop detector, thus allowing traffic lights to be directed adaptively. Traffic Data Analyzer Module:
  • The traffic data analyzer module can determine whether or not a traffic jam exists at intersections or roads. Basically, this information can be generated easily by following vehicles at intersections.
  • the number of vehicles using the intersection can be measured by the vehicle detection and tracking modules.
  • the average speed of passage of vehicles through the intersection can be measured by camera registration.
  • the number of vehicles using the entry and exit arms can be measured by means of the vehicle detection and tracking capability. Data such as the traffic density per lane and per arm and the number of U-turns are measured by the traffic data analyzer module.
  • the queue length of the vehicles waiting at the red light on the intersection arms is also estimated by this module.
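A hedged sketch of a loop-detector-like occupancy signal for the red light standby zone, as the occupancy detection module describes; the grid-cell and bounding-box representations are illustrative assumptions:

```python
def occupancy(zone_cells, vehicle_boxes):
    """Fraction of standby-zone cells whose centre lies inside any
    detected vehicle box: a loop-detector-like occupancy signal."""
    def covered(cx, cy):
        return any(x0 <= cx <= x1 and y0 <= cy <= y1
                   for (x0, y0, x1, y1) in vehicle_boxes)
    hits = sum(covered((c[0] + c[2]) / 2, (c[1] + c[3]) / 2)
               for c in zone_cells)
    return hits / len(zone_cells) if zone_cells else 0.0
```

An adaptive controller could feed this fraction to signal timing logic in place of a loop detector's occupancy output.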
  • Data generated by the traffic analyzer is overlaid in real time as symbology on the image taken from the camera, and the resulting image is streamed over the internet and transmitted to the user and observers.
  • the above-mentioned data taken from the intersection are instantaneously written to the database and transmitted to a specified server.
  • the measurements taken at the intersection can be transmitted to the simulation program in real time, and the real data can later be used for optimization of intersection light times.
  • When the bubble eye system is compared with other current systems, it has some unique features. These are:
  • Origin-Destination Counting By tracking all vehicles across the field of view, the bubble eye system has the unique capability of providing real time origin-destination counting data. This data can then be used in real time for coordinated traffic management purposes or in transportation infrastructure evaluation.
  • Incidents of various types, including crashes, fallen debris, stopped vehicles etc., are detected along with the traffic flow data. This feature enables rapid emergency response at intersections, tunnels or bridges that large numbers of pedestrians use frequently.
  • Bubble eye is the first all in one solution that brings real time bicycle and pedestrian detection, supported with additional features. Waiting pedestrian detection data is beneficial for traffic management purposes, especially in areas that pedestrians use frequently.
  • Bubble eye has the feature of congestion detection, whereby an oversaturated field of view can easily be detected and appropriate traffic management measures can be taken immediately.
  • a bubble eye system for analyzing the traffic flow and disclosing traffic parameters which can be used for many different applications, including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing the dynamics of traffic, comprising:
  • Moving object detection module, which detects moving vehicles,
  • Partial occlusion elimination module, which distinguishes individual vehicles when more than one vehicle appears as a single vehicle,
  • Track initialization module, in which the vehicles identified by the Moving Object Detection and Partial Occlusion Elimination modules are monitored and motion models are extracted to decide whether to track the vehicles,
  • Omni-directional visual tracking module, which tracks vehicles without being degraded by the distortion of the omnidirectional optics,
  • Origin-destination counting module, which counts vehicles with high accuracy by tracking them,
  • Feature based classification module, which classifies vehicles into categories by using deep learning,
  • Occupancy detection module, which extracts the density of the vehicles in the red light standby zone,
  • Traffic analyzer module, which generates information on whether or not a traffic jam exists at intersections or roads,
  • Statistical Data Generation and Simulation Integration section, which writes the data taken from the intersection to the database instantaneously and transmits them to a specified server; the measurements taken at the intersection can also be transmitted to the simulation program in real time, and the real data can later be used for optimization of intersection light times,
  • Adaptive intersection controller, the unit that determines green light duration considering the traffic parameters extracted by the system as a whole,
  • Blocks for object detection, target tracking, occlusion elimination, queue length estimation, and vehicle classification.
  • a method for operating a Bubble Eye system, comprising the steps of:

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a bubble eye system for real-time traffic parameter estimation and statistical data generation using omnidirectional visual sensors and computer vision methodologies.

Description

BUBBLE EYE SYSTEM
The Technical Field Of The Invention
The invention relates to a bubble eye system for real-time traffic parameter estimation and statistical data generation using omnidirectional visual sensors and computer vision methodologies.
Prior Art About The Invention (Previous Technique)
As personal vehicles become widespread and vehicle per capita increases rapidly, current infrastructures have been falling behind, causing massive traffic problems. Traffic jams, congestion and inefficient timing in signalization are causing excessive fuel consumption, high carbon emission, traffic accidents and time lost in traffic. It is estimated that these problems are wasting tens of billions of Euros each year.
Loop based solutions have lower error rates and competitive prices; however, they can only be a compelling option in mature infrastructures, as is the case in Western and Northern Europe, due to their destructive installation (i.e. they are installed inside the road asphalt). Moreover, their competitiveness is presumed to decrease even in these regions, since loop based solutions become insufficient in providing the data required by state of the art intersection, arterial and network control algorithms and widespread adaptive traffic management systems; see Table 2.
State of the art detection technologies include radar, thermal imagery and LIDAR, all of which have their own limitations in application. Thermal cameras are subject to export restrictions on product quality, hence performance issues limit their extensive usage.
Although LIDAR technology is promising, it is still far from being economically feasible. Radar technology is also limited due to its nature of detecting through cross sections, which makes omnidirectional imaging solutions viable, especially for complete intersection management; see Table 1.
In addition, current sensors for vehicle, bicycle and pedestrian detection, and the data extracted from these sensors, cause the implemented systems to be complex and expensive.
Aims Of The Invention and a Brief Explanation
This invention relates to a bubble eye system. To achieve sustainable, efficient, clean and safe transportation, and to acquire relevant information for future planning, traffic is modeled based on a variety of critical parameters. Benefiting from visual sensors, the main goal of the system is to generate a rich flow of data, not only for human understanding but also for revealing traffic parameters that can be used for optimizing traffic flow capacity. For this task, intersections are selected as the primary target, since they are one of the bottlenecks of traffic flow and in need of better management. More precisely, our objective is to aid the control of signalization times at an intersection via our traffic sensor solution, which can disclose traffic parameters by using novel computer vision techniques. Although intersections are selected as primary targets, the system can also be generalized for safety, observability and control of other traffic spots, including highways, tunnels and bridges. Intelligent transport systems (ITS) for traffic monitoring, management and enforcement benefit from various sensor types to make urban passenger transport faster, more efficient and more reliable.
Using our invention, vehicles approaching an intersection can be notified via the previous intersection, since vehicle exit points are available instantaneously. This enables better coordination solutions by adaptive traffic management system providers.
Using our solution, it will be possible to perform traffic counts without the need to employ many people to count vehicles at intersections by classification and turning movement. Images recorded by any type of camera located at the intersection, on a drone or on an air balloon can easily be turned into richer analysis that is cost effective and accurate.
Our invention has a holistic approach, with analysis regarding traffic management, transportation planning, traffic safety, and traffic enforcement. This ensures better understanding of the traffic by all types of users and cuts many overlapping costs, such as traffic counts and unnecessary transportation design changes.
Our system is designed to be an all in one solution, enabling plug and play installation and providing convenience in operation and maintenance. Price erosion caused by competitiveness in the market emphasizes the requirement for low cost ITS systems. Due to the nature of the design of the system, it offers an all in one solution with minimal, low cost hardware and requires inexpensive installation, which gives us a distinct advantage over our competitors.
The unique capability of benefitting from visual data over wide areas introduces a major advantage over other sensing technologies. The list of other advantages of the system over competing technologies is given in Table 1, which summarizes a comparison among various prominent sensor technologies applied to intelligent transportation systems.
Table 1
The Bubble Eye system has a considerable impact potential. It provides much anticipated data types (entry-exit pair counting, queue estimation) for next generation traffic control and traffic monitoring applications, which is provided by no other solutions in the market. Table 2 gives a summary of prominent advanced traffic management systems and fitness of our system to this market.
Table 2
Table 2 Fitness of Bubble Eye to Requirements of Widespread Advanced Traffic Management Systems.
Usage of a single camera removes the need for collecting, synchronizing and merging data from multiple units, which introduces another major advantage of our system: network simplicity. Moreover, having the whole intersection observed by a single sensor enables accurate tracking, which is a valuable metric for real time traffic control and advanced traffic control coordination.
Another aspect of the invention, wherein the system can produce vehicle presence, moving object detection, vehicle tracking, queue length estimation, waiting pedestrian detection, congestion detection, speed estimation, incident detection, entry-exit pair counting and vehicle classification by using the blocks.
Another aspect of the invention, wherein the cameras comprise omnidirectional (typically fisheye) cameras, panoramic cameras and single directional network cameras.
Another aspect of the invention, wherein the system optimizes traffic flow by adaptive signal time control at intersections, together with planning fixed time signalizations for any intersection or network of intersections, by using the output information.
Another aspect of the invention, wherein the output information comprises vehicle presence, moving object detection, vehicle tracking, queue length estimation, waiting pedestrian detection, congestion detection, speed estimation, incident detection, entry-exit pair counting and vehicle classification. Another aspect of the invention, wherein the system can use any type of camera(s) as a sensor and analyzes traffic by using computer vision techniques.
Another aspect of the invention, wherein the origin-destination counting module detects, by following the vehicles, from which direction each vehicle arrives and in which direction it departs, and counts with high accuracy how many vehicles travel from each direction to each other direction.
Another aspect of the invention, wherein the input comprises, at minimum, the definition of arrival regions, entry nodes, exit nodes, and routes between these entry and exit regions.
Another aspect of the invention, wherein the real data can be used for optimization of traffic light times (signal durations) at an intersection or network of intersections.
Another aspect of the invention, wherein the multiple tasks are illegal turn, wrong way violation, stopped vehicle detection, parking violation, debris detection or abnormality detections.
Another aspect of the invention, wherein the system benefits from data provided by multiple cameras simultaneously for any analysis given by the system.
Another aspect of the invention, wherein the system brings real time bicycle and pedestrian detection, supported with additional features.
Another aspect of the invention, wherein the Cameras that have overlapping regions in their field of views or not situations;
If cameras have overlapping regions in their field of views:
• ensuring of Track continuity between cameras,
• A moving object exiting from the field of concern of a camera is continued to be tracked in the other camera’s field of concern,
If cameras do not have overlapping regions in their field of views:
• Providing of data for Each cameras respective field of concern.
The Descriptions Of The Figures Explaining The Invention
The figures used to better explain the Bubble Eye system, developed with this invention for real-time traffic parameter estimation and statistical data generation by using omnidirectional visual sensors and computer vision methodologies, and their descriptions are as follows:
Figure 1: Flowchart of the system.
Figure 2: An Example of Intersection Installation with the User Inputs.
Figure 3: Sample Usage of Multiple Cameras at a Roundabout.
Figure 4: Loop-based Systems (Upper Left) Collect Data from the Yellow Region; Single Directional Network Camera based Systems (Upper Right) can only Collect Data from the Lighted Areas; while Omnidirectional Sensors (bottom) allow the system to collect data from the whole intersection.
The Detailed Explanation Of The Invention
To better explain the Bubble Eye System, which is for real-time traffic parameter estimation and statistical data generation by using omnidirectional visual sensors and computer vision methodologies developed with this invention, the details are presented below.
The system is designed for analyzing the traffic flow and disclosing traffic parameters which can be used for many different applications including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing dynamics of traffic.
The system benefits from vision based sensors since they have two major advantages over many other detection sensors or localization technologies. First, vision based systems can capture a tremendous amount of visual information over wide areas, often beyond the range of other sensors such as active radar, magnetic loop detectors etc. Second, they are relatively inexpensive and can be easily installed without the need for any complementary equipment, disrupting road operation or altering infrastructure. Moreover, vision based sensors can be configured for additional tasks by software changes only. In this manner, by analyzing visual data, a visual sensor removes the necessity of fusing data from many different sensor types to achieve multiple tasks at the same time. For instance, illegal turn, wrong-way violation, stopped vehicle detection, parking violation, debris detection and abnormality detection can be achieved at the same time. Our system can use any type of camera(s) as a sensor and analyzes traffic by using computer vision techniques. The key feature of the system lies in the capability of processing visual data from omnidirectional (typically fisheye) cameras, any type of panoramic camera and single directional network cameras by a variety of novel computer vision methods developed in our solution BubbleEye.
The Bubble Eye consists of five fundamental building blocks: object detection, target tracking, occlusion elimination, queue length estimation, and vehicle classification. By using these blocks, the system can produce vehicle presence, moving object detection, vehicle tracking, queue length estimation, waiting pedestrian detection, congestion detection, speed estimation, incident detection, entry-exit pair counting, and vehicle classification. Since these outputs reveal key information about current traffic status, they can be used by traffic controllers for optimizing traffic flow in any infrastructure, including our primary target, intersections. Moreover, some of these estimated traffic parameters, especially entry-exit pair counting, provide valuable statistical data for traffic engineering to plan fixed time signalization, to shape future infrastructure planning decisions and to pave the way for next generation traffic control approaches. As well as intersection solutions, exactly the same technology can be employed for enhancing safety, observability and control of other traffic infrastructures including highways, tunnels and bridges.
In short, the system detects moving objects in the field of view, which is usually an intersection, and tracks these objects from their entry points to their exit points. During tracking, the information gathered from every moving object is transformed into traffic state information and is delivered to provide analyses such as:
• Congestion inside the intersection
• Number of vehicles inside the intersection
• Average turn speeds
• Counting by entry-exit pairs
• Traffic volumes by entry-exit pairs
• Vehicle classification
• Traffic demand by lanes or turning movements at the intersection legs
• Queue length at the intersection legs
The system needs a definition of the application area and sets of rules before being implemented. An example of an intersection with exemplary inputs defined is given in Figure 2. As seen in Figure 2, the system requires, at minimum, a definition of arrival regions, entry nodes, exit nodes, and routes between these entry-exit regions.
Arrival Regions:
Illustrated as regions with dashed boundaries and dashed grid lines, these regions define the entry legs of the field of view. The region is drawn by the user via an installation application as a polygon. The system automatically divides the polygon into grids. The aim of this region is to provide information about traffic volume in the region and the respective roads, such as an intersection leg. Traffic volume in the region and the distribution of that volume are important indicators for understanding traffic demand and anomalies inside the region, such as a stopped vehicle or a traffic accident. In other words, benefitting from the analysis made in these regions, saturation at an intersection leg can be successfully detected. Due to the grid structure, settings of the tracking operation, such as at which grid tracking should start or at which grid traffic demand is counted and processed, can be defined by the user. The setting to process traffic demand can have multiple criteria. If no grid is selected specifically to start object tracking, tracking starts automatically when the object exits the arrival region.
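As an illustration only (the patent does not disclose the actual subdivision algorithm), the automatic division of a user-drawn polygon into grid cells could be sketched as follows; the fixed square cell size, the axis-aligned scan and all function names are assumptions:

```python
def bounding_box(polygon):
    """Axis-aligned bounding box of a polygon given as (x, y) vertices."""
    xs = [p[0] for p in polygon]
    ys = [p[1] for p in polygon]
    return min(xs), min(ys), max(xs), max(ys)

def point_in_polygon(pt, polygon):
    """Ray-casting test: is pt inside the polygon?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def divide_into_grid(polygon, cell):
    """Return centers of grid cells whose center lies inside the polygon."""
    x0, y0, x1, y1 = bounding_box(polygon)
    cells = []
    y = y0
    while y < y1:
        x = x0
        while x < x1:
            center = (x + cell / 2, y + cell / 2)
            if point_in_polygon(center, polygon):
                cells.append(center)
            x += cell
        y += cell
    return cells

# A 10x10 square arrival region divided into 5x5 cells -> 4 cell centers
region = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(len(divide_into_grid(region, 5)))  # -> 4
```

Cells whose center falls outside the polygon are discarded, so an arbitrarily shaped arrival region still yields a regular grid to which per-cell tracking settings can be attached.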
Entry Nodes: Illustrated as white dots with black solid boundaries, these dots indicate entry points. Starting from these dots, the user draws possible turning movement paths inside the field of concern. If these dots are inside the arrival regions, tracking starts at the respective grid of the arrival region; otherwise, tracking starts when the objects exit the arrival region. These entry nodes define the entry points of entry-exit pairs and the matrix formed by these pairs.
Exit Nodes: Illustrated as big solid black dots, these dots indicate exit points. Objects moving close to these dots are considered as exiting through the respective node. Entry-exit pairs of the objects reaching the proximity of these nodes are decided via the route (trajectory) information and the entry node. All vehicle and traffic parameters related to the object are saved at this point. Then, tracking and analysis of the object terminate.
Routes: Routes are possible trajectories that vehicles can pass through between entry and exit nodes. With this extra information provided to the system, the effect of losses in tracking due to effects such as frame drop or occlusion is mitigated. In such an event of track loss, the data gathered by the system is analyzed to restart tracking of the vehicle in proximity of the track-loss location and is used to maximize the accuracy of the traffic volumes. Routes are illustrated as dashed arrows from entry nodes to exit nodes.
Irrelevant Zone: Illustrated with a dotted mask, this zone defines the zones outside the field of concern. These zones include regions where traffic is not flowing or is not relevant in the field of view of the camera. However, these regions are not excluded from monitoring by the users. Inside these regions, no data is collected and no traffic information is gathered.
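The route-based recovery of lost tracks described above can be sketched as follows; the polyline representation of a route, the distance threshold and the function names are illustrative assumptions, not the patented method itself:

```python
import math

def nearest_point_on_segment(p, a, b):
    """Closest point to p on the segment from a to b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return (ax + t * dx, ay + t * dy)

def distance_to_route(p, route):
    """Minimum distance from point p to a route given as a polyline."""
    best = float("inf")
    for a, b in zip(route, route[1:]):
        q = nearest_point_on_segment(p, a, b)
        best = min(best, math.hypot(p[0] - q[0], p[1] - q[1]))
    return best

def match_lost_track(last_position, routes, max_dist=3.0):
    """Pick the defined route closest to the last known position of a
    lost track, or None if no route is within max_dist; tracking can
    then be restarted along that route."""
    best_route, best_d = None, max_dist
    for name, polyline in routes.items():
        d = distance_to_route(last_position, polyline)
        if d < best_d:
            best_route, best_d = name, d
    return best_route

routes = {
    "north_to_east": [(0, 10), (5, 5), (10, 5)],
    "north_to_south": [(0, 10), (0, 0)],
}
print(match_lost_track((1.0, 4.0), routes))  # -> north_to_south
```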
Use of Multiple Cameras:
Use of multiple cameras can be necessary:
• when there is a structure preventing visual contact with several regions in the field of concern,
• when the camera pole height is not sufficient,
• where the occlusion rate is high,
• when the field of concern is too large, such as large roundabouts.
The system is designed to use multiple cameras simultaneously, to increase the regions where data can be collected or to increase the accuracy and the performance of the system overall.
Placement of the cameras determines whether the cameras have overlapping regions in their fields of view or not. In both cases, the system is designed to benefit from data provided by multiple cameras simultaneously for the current analysis.
If cameras have overlapping regions in their field of views:
► Track continuity is ensured between cameras.
► A moving object exiting from the field of concern of one camera continues to be tracked in the other camera's field of concern.
If cameras do not have overlapping regions in their fields of view:
► Each camera provides data for its respective field of concern.
The data shared from multiple cameras is analyzed together to provide complete control over the total field of concern.
The system is based on five main components:
• Moving object detection,
• Object tracking,
• Occupancy detection,
• Occlusion elimination,
• Vehicle classification
All components are suitable for both single-camera and multi-camera use.
The system has both image and video processing modules and a traffic data analysis module that uses the data obtained from these modules. Moreover, these data and the results of the analysis are presented to other systems as input in real time and shown to users. The system has a moving object detection module, a partial occlusion elimination module, a track initialization module, an omni-directional visual tracking module, an origin-destination counting module, a feature based classification module and an occupancy detection module, all of which generate data for the traffic analyzer module. The flowchart of the Bubble Eye system is given in Figure 1.
Moving object detection module:
Moving vehicles are detected in the Bubble Eye system; vehicles that remain parked or inactive for a long time are excluded. Detection of moving vehicles is provided by the moving object detection module, which uses background subtraction techniques. In this way, vehicles that have not moved for a certain period of time are not detected, so vehicles parked at the roadside have no impact. This behavior is desirable for the Bubble Eye system. If a parked vehicle starts to move at any time, the system will detect the movement and detect the vehicle.
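A minimal background-subtraction sketch in the spirit of this module is given below; real deployments would use a more robust estimator (e.g. a mixture-of-Gaussians model), and the learning rate, threshold and flat-list frame format here are simplifying assumptions:

```python
class RunningAverageBackground:
    """Minimal background-subtraction sketch: keep an exponentially
    weighted average of past frames; pixels far from it are foreground.
    Frames are flat lists of grayscale values here for simplicity."""

    def __init__(self, learning_rate=0.1, threshold=30):
        self.alpha = learning_rate
        self.thresh = threshold
        self.background = None

    def apply(self, frame):
        if self.background is None:
            self.background = list(frame)
            return [0] * len(frame)          # first frame: all background
        mask = [1 if abs(p - b) > self.thresh else 0
                for p, b in zip(frame, self.background)]
        # Update the model so slowly changing pixels are absorbed:
        # a stationary (parked) vehicle eventually fades into the background.
        self.background = [(1 - self.alpha) * b + self.alpha * p
                           for p, b in zip(frame, self.background)]
        return mask

bg = RunningAverageBackground()
bg.apply([50, 50, 50, 50])           # learn an empty road
mask = bg.apply([50, 200, 200, 50])  # a bright vehicle covers two pixels
print(mask)  # -> [0, 1, 1, 0]
```

Because the model keeps adapting, a vehicle that parks is gradually absorbed into the background, which matches the behavior described above.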
Partial Occlusion Elimination module:
In some cases, more than one vehicle moves in the same direction, side by side or back to back. This causes multiple vehicles to be perceived as a single vehicle. The partial occlusion elimination module allows the system to distinguish the vehicles when more than one vehicle appears as a single vehicle.
Track Initialization module:
In the Track Initialization Module, the vehicles identified by the Moving Object Detection and Partial Occlusion Elimination modules are monitored, and their movement models are extracted to decide whether to track the vehicles.
Omni-Directional Visual Tracking module:
Due to the wide viewing angles of the cameras used for imaging, deformations occur in the vehicles seen in the image, such as growth toward the image center and shrinkage toward the edges. The Omni-Directional Visual Tracking module makes it possible to track vehicles despite this deterioration caused by the optical means.
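One common way to reason about such radial distortion is the equidistant fisheye model; the sketch below, including the model choice, the focal-length value and the function name, is an assumption for illustration and not the method disclosed here:

```python
import math

def undistort_equidistant(u, v, cx, cy, f):
    """Map a fisheye pixel (u, v) to an undistorted (pinhole) pixel,
    assuming the common equidistant model r = f * theta.
    cx, cy: image center; f: focal length in pixels."""
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    if r == 0:
        return (cx, cy)
    theta = r / f                    # viewing angle from the optical axis
    r_undist = f * math.tan(theta)   # radius the pinhole model would give
    scale = r_undist / r
    return (cx + dx * scale, cy + dy * scale)

# A point near the image edge is pushed outward far more than one near
# the center, compensating the shrinkage toward the edges that fisheye
# optics cause (illustrative center/focal values).
center_pt = undistort_equidistant(660, 600, 640, 600, 400)
edge_pt = undistort_equidistant(1140, 600, 640, 600, 400)
```

Normalizing positions (and hence apparent sizes and speeds) this way is one plausible route to tracking that is stable across the whole fisheye image.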
Origin-Destination Counting module:
The direction from which each vehicle arrives and the direction in which it leaves are detected by following the vehicles. Thus, the number of vehicles travelling from each direction to each other direction is counted with high accuracy. In other systems, these numbers are generally estimated by a statistical method according to the status of the lights. Since the Bubble Eye system follows vehicles from origin to destination, the value can be measured directly.
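The bookkeeping behind origin-destination counting can be sketched as a simple matrix of (entry, exit) pairs; the class and method names are hypothetical:

```python
from collections import Counter

class ODCounter:
    """Sketch of the origin-destination matrix: each completed track
    contributes one count to its (entry node, exit node) pair."""

    def __init__(self):
        self.matrix = Counter()

    def track_finished(self, entry_node, exit_node):
        # Called when a tracked vehicle reaches the proximity of an exit node.
        self.matrix[(entry_node, exit_node)] += 1

    def count(self, entry_node, exit_node):
        return self.matrix[(entry_node, exit_node)]

od = ODCounter()
for entry, exit_ in [("north", "east"), ("north", "east"), ("south", "west")]:
    od.track_finished(entry, exit_)
print(od.count("north", "east"))  # -> 2
```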
Feature Based Classification Module:
The vehicles are continuously classified, according to their location, into one of four categories: bicycle/motorcycle, car, minibus/van and bus/truck. The classification is made by using deep learning. It is possible that some vehicles cannot be assigned to any class; in such situations, those vehicles are labeled as undefined.
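The fallback to an undefined label can be sketched as a confidence threshold applied to a network's per-class scores; the threshold value and the scores below are illustrative assumptions:

```python
def classify_vehicle(class_scores, min_confidence=0.5):
    """Sketch of the post-processing described above: a deep network's
    per-class scores are reduced to one of four labels, or 'undefined'
    when no class is confident enough."""
    categories = ["bicycle/motorcycle", "car", "minibus/van", "bus/truck"]
    best = max(range(len(categories)), key=lambda i: class_scores[i])
    if class_scores[best] < min_confidence:
        return "undefined"
    return categories[best]

print(classify_vehicle([0.05, 0.85, 0.05, 0.05]))  # -> car
print(classify_vehicle([0.3, 0.3, 0.2, 0.2]))      # -> undefined
```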
Occupancy Detection Module: The density of the vehicles in the red-light standby zone is extracted by the occupancy detection module. For example, the queue information generated by vehicles waiting at the red light is produced at a 4-arm intersection. This information is used by the data analyzer. Furthermore, this information can be used instead of the occupancy information generated by a loop detector, thus allowing traffic lights to be directed adaptively.
Traffic Data Analyzer Module:
The traffic data analyzer module can generate information on whether traffic congestion exists at intersections or roads. Basically, this information can be generated easily by following vehicles at intersections. The number of vehicles using the intersection can be measured by the vehicle detection and tracking modules. The average speed of passage of vehicles through the intersection can be measured via camera registration. The number of units using the entry and exit arms can be measured by means of the vehicle detection and tracking capability. Data such as the density of traffic per lane and per arm and the number of U-turns are measured by the traffic data analyzer module. In addition, the queue length of the vehicles waiting at the red light at the intersection arms is also reported by this module.
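Queue-length estimation over the gridded standby zone can be sketched as counting the consecutive occupied cells outward from the stop line; the boolean-grid representation and function name are assumptions:

```python
def queue_length(grid_occupied, stop_line_index=0):
    """Sketch of queue-length estimation on one intersection leg:
    the arrival region's grid cells are ordered from the stop line
    outward, and the queue is the run of consecutively occupied cells.
    grid_occupied: list of booleans, index 0 nearest the stop line."""
    length = 0
    for occupied in grid_occupied[stop_line_index:]:
        if not occupied:
            break
        length += 1
    return length

# Four vehicles queued at the light, then a gap, then one free-flowing car
print(queue_length([True, True, True, True, False, True]))  # -> 4
```

A per-leg value like this is exactly the kind of occupancy figure that could stand in for a loop detector's output when timing lights adaptively.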
Instantaneous Visual Output Generation and Real-time Visual Display Section:
Data generated by the traffic analyzer is printed in real time as symbology on the image taken from the camera, and the produced image is streamed over the internet and transmitted to the user and observers.
Statistical Data Generation and Simulation Integration Section:
The above-mentioned data taken from the intersection are instantaneously written to the database and transmitted to a specified server. In this way, the measurements taken at the intersection can be transmitted to the simulation program in real time, and the real data can later be used for optimization of intersection light times.
When we compare the Bubble Eye system with other current systems, the Bubble Eye system has some unique features. These are:
• Network Simplicity
• Origin-Destination Counting
• Incident Detection
• Waiting Pedestrian Detection
• Congestion Detection
• Competitive Prices
Network Simplicity: A single detector means easy installation without lots of network equipment and cabling, while easing maintenance and operation.
Origin-Destination Counting: By tracking all vehicles across the field of view, the Bubble Eye system has the unique capability of providing real-time origin-destination counting data. This data can then be used in real time for coordinated traffic management purposes or in transportation infrastructure evaluation.
Incident Detection: Incidents of various types, including crashes, fallen debris, stopped vehicles etc., are detected along with the traffic flow data. This feature enables rapid emergency response at frequently used intersections, tunnels or bridges.
Waiting Pedestrian Detection: Bubble Eye is the first all-in-one solution that brings real-time bicycle and pedestrian detection. Waiting pedestrian detection data is beneficial for traffic management purposes, especially in areas that pedestrians use frequently.
Congestion Detection: Bubble Eye has the feature of congestion detection, whereby an oversaturated field of view can easily be detected and appropriate traffic management measures can be taken immediately.
Competitive Prices: Using an inexpensive fisheye camera, a single dedicated hardware unit that runs computer vision algorithms and a single pole to mount the camera on, Bubble Eye is competitive in pricing, which enables a strong position in the market where competition over pricing is possible.
A Bubble Eye system for analyzing traffic flow and disclosing traffic parameters which can be used for many different applications, including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing the dynamics of traffic, comprising:
► an input frame defining the application area and setting the rules before implementation,
► modules generating data for the traffic analyzer module, as presented below:
• a moving object detection module which detects moving vehicles,
• a partial occlusion elimination module which distinguishes vehicles when more than one vehicle appears as a single vehicle,
• a track initialization module in which the vehicles identified by the Moving Object Detection and Partial Occlusion Elimination modules are monitored and their movement models are extracted to decide whether to track the vehicles,
• an omni-directional visual tracking module which tracks vehicles despite the deterioration caused by the optical means,
• an origin-destination counting module which counts vehicles by following them with high accuracy,
• a feature based classification module which classifies vehicles, according to their location, into categories by using deep learning,
• an occupancy detection module which extracts the density of the vehicles in the red-light standby zone,
► a traffic analyzer module which generates information on whether traffic congestion exists at intersections or roads,
► an Instantaneous Visual Output Generation and Real-time Visual Display section in which data generated by the traffic analyzer is printed in real time as symbology on the image taken from the camera, and the produced image is streamed over the internet and transmitted to the user and observers,
► a Statistical Data Generation and Simulation Integration section in which data taken from the intersection are instantaneously written to the database and transmitted to a specified server, the measurements taken at the intersection can be transmitted to the simulation program in real time, and the real data can later be used for optimization of intersection light times,
► a virtual pan-tilt-zoom section which allows use of pan-tilt-zoom functions,
► an adaptive intersection controller, which is the unit that determines green light duration considering the traffic parameters extracted by the system as a whole,
► cameras that are used for processing visual data in the field of view,
► blocks comprising object detection, target tracking, occlusion elimination, queue length estimation, and vehicle classification,
► vision-based sensors providing the achievement of multiple tasks at the same time by analyzing visual data,
► a database where the data taken from the intersection is written instantaneously,
► a server to which the data is transmitted,
► a simulation integration to which the measurements taken at the intersection are transmitted in real time.
A method for operating a Bubble Eye system, comprising the steps of:
• Defining the application area and setting the rules before implementation,
• Generating data for the traffic analyzer module by the modules,
• Analyzing the data collected from the modules at the traffic analyzer module,
• Printing the data generated by the traffic analyzer on the image taken from the camera in real time as symbology,
• Streaming the produced image over the internet and transmitting it to the user and observers,
• Using the system for many different applications, including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing the dynamics of traffic.

Claims

1. A Bubble Eye system for analyzing traffic flow and disclosing traffic parameters which can be used for many different applications, including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing the dynamics of traffic, comprising:
► an input frame defining the application area and setting the rules before implementation,
► modules generating data for the traffic analyzer module, as presented below:
• a moving object detection module which detects moving vehicles,
• a partial occlusion elimination module which distinguishes vehicles when more than one vehicle appears as a single vehicle,
• a track initialization module in which the vehicles identified by the Moving Object Detection and Partial Occlusion Elimination modules are monitored and their movement models are extracted to decide whether to track the vehicles,
• an omni-directional visual tracking module which tracks vehicles despite the deterioration caused by the optical means,
• an origin-destination counting module which counts vehicles by following them with high accuracy,
• a feature based classification module which classifies vehicles, according to their location, into categories by using deep learning,
• an occupancy detection module which extracts the density of the vehicles in the red-light standby zone,
► a traffic analyzer module which generates information on whether traffic congestion exists at intersections or roads,
► an Instantaneous Visual Output Generation and Real-time Visual Display section in which data generated by the traffic analyzer is printed in real time as symbology on the image taken from the camera, and the produced image is streamed over the internet and transmitted to the user and observers,
► a Statistical Data Generation and Simulation Integration section in which data taken from the intersection are instantaneously written to the database and transmitted to a specified server, the measurements taken at the intersection can be transmitted to the simulation program in real time, and the real data can later be used for optimization of intersection light times,
► a virtual pan-tilt-zoom section which allows use of pan-tilt-zoom functions,
► an adaptive intersection controller, which is the unit that determines green light duration considering the traffic parameters extracted by the system as a whole,
► cameras that are used for processing visual data in the field of view,
► blocks comprising object detection, target tracking, occlusion elimination, queue length estimation, and vehicle classification,
► vision-based sensors providing the achievement of multiple tasks at the same time by analyzing visual data,
► a database where the data taken from the intersection is written instantaneously,
► a server to which the data is transmitted,
► a simulation integration to which the measurements taken at the intersection are transmitted in real time.
2. Bubble Eye system according to claim 1, wherein said system can produce vehicle presence, moving object detection, vehicle tracking, queue length estimation, waiting pedestrian detection, congestion detection, speed estimation, incident detection, entry-exit pair counting and vehicle classification by using said blocks.
3. Bubble Eye system according to claim 1, wherein said cameras comprise omnidirectional (typically fisheye) cameras, panoramic cameras and single directional network cameras.
4. Bubble Eye system according to claim 1 or claim 2, wherein said system optimizes traffic flow by adaptive signal time control at intersections and also supports planning of fixed time signalization for any intersection or network of intersections by using the output information.
5. Bubble Eye system according to claim 4, wherein said output information comprises vehicle presence, moving object detection, vehicle tracking, queue length estimation, waiting pedestrian detection, congestion detection, speed estimation, incident detection, entry-exit pair counting and vehicle classification.
6. Bubble Eye system according to claim 1; wherein said system can use any type of camera(s) as a sensor and analyzes traffic by using computer vision techniques.
7. Bubble Eye system according to claim 1, wherein said origin-destination counting module detects, by following the vehicles, from which direction each vehicle arrives and in which direction it leaves, so that the number of vehicles travelling from each direction to each other direction is counted with high accuracy.
8. Bubble Eye system according to claim 1, wherein said input comprises, at minimum, definitions of arrival regions, entry nodes, exit nodes, and the routes between these entry-exit regions.
9. Bubble Eye system according to claim 1, wherein said real data can be used for optimization of traffic light times (signal durations) at an intersection or a network of intersections.
10. Bubble Eye system according to claim 1, wherein said multiple tasks comprise illegal turn, wrong-way violation, stopped vehicle detection, parking violation, debris detection and abnormality detection.
11. Bubble Eye system according to claim 1, wherein said system benefits from data provided by multiple cameras simultaneously for any analysis performed by the system.
12. Bubble Eye system according to claim 1, wherein said system brings real-time bicycle and pedestrian detection.
13. Bubble Eye system according to claim 1, wherein said cameras may or may not have overlapping regions in their fields of view:
If cameras have overlapping regions in their fields of view:
• track continuity between cameras is ensured,
• a moving object exiting from the field of concern of one camera continues to be tracked in the other camera's field of concern,
If cameras do not have overlapping regions in their fields of view:
• each camera provides data for its respective field of concern.
14. A method for operating a Bubble Eye system, comprising the steps of:
• Defining the application area and setting the rules before implementation,
• Generating data for the traffic analyzer module by the modules,
• Analyzing the data collected from the modules at the traffic analyzer module,
• Printing the data generated by the traffic analyzer on the image taken from the camera in real time as symbology,
• Streaming the produced image over the internet and transmitting it to the user and observers,
• Using the system for many different applications, including advanced intersection management, optimization of traffic light durations, transportation planning, and statistical data generation for analyzing the dynamics of traffic.
PCT/TR2019/051182 2018-12-25 2019-12-23 Bubble eye system WO2020139283A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2018050885 2018-12-25
TRPCT/TR2018/050885 2018-12-25

Publications (2)

Publication Number Publication Date
WO2020139283A2 true WO2020139283A2 (en) 2020-07-02
WO2020139283A3 WO2020139283A3 (en) 2020-08-13


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114783181A (en) * 2022-04-13 2022-07-22 江苏集萃清联智控科技有限公司 Traffic flow statistical method and device based on roadside perception





Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19903045

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase

Ref document number: 19903045

Country of ref document: EP

Kind code of ref document: A2