CN115512308A - Method, apparatus, device, storage medium and program product for intersection traffic analysis - Google Patents

Info

Publication number
CN115512308A
Authority
CN
China
Prior art keywords
image, intersection, target vehicle, determining, reference line
Prior art date
2021-06-22
Legal status
Pending
Application number
CN202110693646.2A
Other languages
Chinese (zh)
Inventor
车正平
汪浩文
李文蔚
石玥
东科
姜波
唐剑
Current Assignee
Shanghai Didi Woya Technology Co ltd
Original Assignee
Shanghai Didi Woya Technology Co ltd
Priority date
2021-06-22
Filing date
2021-06-22
Publication date
2022-12-23
Application filed by Shanghai Didi Woya Technology Co ltd filed Critical Shanghai Didi Woya Technology Co ltd
Priority to CN202110693646.2A
Publication of CN115512308A

Classifications

    • G06Q50/40
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/065 - Traffic control systems for road vehicles by counting the vehicles in a section of the road or in a parking area, i.e. comparing incoming count with outgoing count

Abstract

According to embodiments of the present disclosure, a method, an apparatus, a device, a storage medium, and a program product for intersection traffic analysis are provided. The method proposed herein comprises: identifying a target vehicle from a first image of the intersection, the target vehicle being associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection; determining whether a second image is included in subsequent images of the intersection, in which second image the target vehicle is associated with a second reference line, the second reference line indicating an exit position corresponding to the entry position; and in response to detecting the second image, determining intersection traffic information for the target vehicle based on the first image and the second image. According to embodiments of the disclosure, the traffic condition of a vehicle at an intersection can be effectively tracked.

Description

Method, apparatus, device, storage medium and program product for intersection traffic analysis
Technical Field
Implementations of the present disclosure relate to the field of intelligent transportation, and more particularly, to methods, apparatuses, devices, storage media, and program products for intersection traffic analysis.
Background
With the growing adoption of 5G technology and new infrastructure construction, transportation has become one of the industries that benefit most. As costs fall and the technology matures, sensors, signal systems, edge computing devices, and the like are being deployed at large scale; with appropriate analysis, the large amounts of data they provide can make traffic systems more intelligent. In an urban road traffic environment, the speed and number of vehicles passing through intersections affect the efficiency and experience of entire trips. It is therefore desirable to obtain more accurate and detailed intersection traffic information.
Disclosure of Invention
Embodiments of the present disclosure provide a solution for intersection traffic analysis.
In a first aspect of the disclosure, a method for intersection traffic analysis is provided. The method comprises the following steps: identifying a target vehicle from a first image of the intersection, the target vehicle being associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection; determining whether a second image is included in subsequent images of the intersection, in which second image the target vehicle is associated with a second reference line, the second reference line indicating an exit position corresponding to the entrance position; and in response to detecting the second image, determining intersection traffic information for the target vehicle based on the first image and the second image.
In a second aspect of the present disclosure, an apparatus for intersection traffic analysis is provided. The device includes: an identification module configured to identify a target vehicle from a first image of an intersection, the target vehicle being associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection; a detection module configured to determine whether a second image is included in subsequent images of the intersection in which the target vehicle is associated with a second reference line indicating an exit position corresponding to the entrance position; and an analysis module configured to determine intersection traffic information of the target vehicle based on the first image and the second image in response to detecting the second image.
In a third aspect of the present disclosure, there is provided an electronic device comprising one or more processors and memory, wherein the memory is for storing computer-executable instructions that are executed by the one or more processors to implement a method according to the first aspect of the present disclosure.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided having computer-executable instructions stored thereon, wherein the computer-executable instructions, when executed by a processor, implement a method according to the first aspect of the present disclosure.
In a fifth aspect of the present disclosure, a computer program product is provided comprising computer executable instructions, wherein the computer executable instructions, when executed by a processor, implement the method according to the first aspect of the present disclosure.
According to various embodiments of the disclosure, the traffic condition of the vehicle at the intersection can be effectively tracked.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements, of which:
FIG. 1 illustrates a schematic diagram of an example environment in which various embodiments of the present disclosure can be implemented;
FIG. 2 illustrates a flow diagram of an example method of intersection traffic analysis, in accordance with some embodiments of the present disclosure;
FIG. 3 illustrates an example first image, according to some embodiments of the present disclosure;
FIG. 4 illustrates an example second image, according to some embodiments of the present disclosure;
FIG. 5 illustrates a flow diagram of an example process of determining intersection traffic information in accordance with some embodiments of the present disclosure;
fig. 6 illustrates an example third image, in accordance with some embodiments of the present disclosure;
FIG. 7 shows a schematic block diagram of an apparatus for intersection traffic analysis, in accordance with some embodiments of the present disclosure; and
fig. 8 illustrates a block diagram of a computing device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more complete and thorough understanding of the present disclosure. It should be understood that the drawings and the embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
In describing embodiments of the present disclosure, the terms "include" and "comprise," and similar language, are to be construed as open-ended, i.e., "including but not limited to." The term "based on" should be understood as "based at least in part on." The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment." The terms "first," "second," and the like may refer to different or the same object. Other explicit and implicit definitions may also be included below.
As discussed above, traffic conditions at intersections are one of the major factors affecting urban traffic. It is therefore desirable to obtain accurate intersection traffic information, for example for planning navigation paths or scheduling traffic flow. Some schemes count vehicles passing through the intersection; existing intersection vehicle counting falls mainly into three categories: counting based on ground induction coils, counting based on reported driving GPS data, and counting based on video image processing.
In the ground induction coil approach, an induction coil is embedded in the road surface at the intersection, and passing vehicles are counted from the inductance changes they cause. This approach is essentially unaffected by changes in weather, environment, lighting, and other conditions.
However, the ground induction coil method is costly and inconvenient to install and maintain; for example, traffic must be temporarily interrupted to place coils in, or remove them from, the ground. Second, this type of method cannot distinguish vehicle classes; for example, it cannot tell whether a passing vehicle is a truck or a bus. In addition, when there are many vehicles on the road, it is difficult to recover a specific vehicle's movement trajectory. The information this type of approach can provide is therefore limited.
In the intersection vehicle counting approach based on reported GPS data, a GPS device mounted on each vehicle records GPS signal data, from which the vehicle's spatio-temporal trajectory, that is, when and where the vehicle passed, can be obtained. By deploying a server or computing node that collects and jointly analyzes the GPS information of all vehicles passing through the intersection, a corresponding intersection traffic count can be obtained. This approach is likewise unaffected by weather, lighting, and the like.
However, the GPS reporting method requires data communication between every vehicle on the road and the counting system at the intersection, and it is difficult to cover all vehicles completely. In addition, GPS signals are easily disturbed by tall buildings, trees, congested vehicles, and other factors, so position and speed accuracy is poor.
The video image processing approach requires overhead cameras and image processing computing equipment to be deployed at the intersection; vehicle and traffic flow information is extracted from the collected video images by methods such as computer vision, and live video can be provided as a reference. Visual detection here mainly relies on methods such as optical flow and frame differencing.
General video-image-processing methods place high demands on the accuracy and performance of the video processing algorithms and are easily limited by severe rain or snow, changing illumination conditions, and the like. The optical flow method is computationally complex and slow; the differencing method performs poorly on fast-moving vehicles and in scenes with complex image noise; and target detection methods may miss detections or produce false detections, performing poorly under occlusion by vehicles or the environment, large scale changes, and similar conditions.
In view of this, the embodiments of the present disclosure provide a solution for intersection traffic analysis. In this approach, a target vehicle is identified from a first image of the intersection, where the target vehicle is associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection. Further, it is determined whether a second image is included in the subsequent images of the intersection, wherein in the second image the target vehicle is associated with a second reference line indicating an exit position corresponding to the entry position. In response to detecting the second image, intersection traffic information of the target vehicle is determined based on the first image and the second image.
According to such a scheme, the embodiment of the disclosure can acquire more accurate and fine intersection traffic information of the vehicle based on the reference line.
Some example embodiments of the disclosure will now be described with continued reference to the accompanying drawings.
Example Environment
FIG. 1 illustrates a block diagram of an example environment 100 in which embodiments of the present disclosure can be implemented. As shown in FIG. 1, an environment 100 includes an intersection 110, and a vehicle 120 in motion. It should be understood that the description of the structure and function of environment 100 is for exemplary purposes only and does not imply any limitation as to the scope of the disclosure. For example, embodiments of the present disclosure may also be applied to environments other than environment 100. It should be understood that in the present disclosure, "intersection" refers to an intersection of roads, which may also be referred to as a "road intersection".
In fig. 1, the intersection 110 is shown as a crossroads. It should be understood that this type is merely illustrative, and that the intersection 110 may be of any other suitable type, examples of which include, but are not limited to: crossroads, T-shaped intersections, roundabouts, Y-shaped intersections, highway ramp entrances, elevated road entrances, and the like.
Additionally, in the example of FIG. 1, the vehicle 120 may be any type of vehicle that may carry people and/or things and be moved by a powered system such as an engine, including but not limited to a car, truck, bus, caravan, motorcycle, bicycle, and the like.
According to an embodiment of the present disclosure, the image capture device 130 may acquire a plurality of images 140 of the intersection 110 and send them to the analysis device 150. The analysis device 150 can determine intersection traffic information 160 for the vehicle according to the process discussed in detail below.
In some embodiments, the image capture device 130 may be installed near the intersection 110 to capture images of the intersection 110 from a fixed angle. Illustratively, the image capture device 130 may include a camera mounted overhead or on a tower, which may capture video of the intersection and send the video frames to the analysis device 150 for processing.
In some embodiments, the analysis device 150 may be, for example, any type of computing device, examples of which include, but are not limited to, a roadside device, an edge computing device, or a cloud computing device. The analysis device 150 may be configured to receive the captured images 140 of the intersection 110 from the image capture device 130, for example via a wired connection or a wireless network, and perform the corresponding intersection traffic analysis. The process for generating the intersection traffic information 160 will be described in detail below.
Example procedure
The process of intersection traffic analysis according to an embodiment of the present disclosure will be described in detail below with reference to fig. 2. Fig. 2 shows a schematic diagram of a process 200 of intersection traffic analysis, according to some embodiments of the present disclosure. For ease of discussion, the process of intersection traffic analysis is discussed with reference to FIG. 1. The process 200 may be performed, for example, at the analysis device 150 shown in fig. 1. It should be understood that process 200 may also include blocks not shown and/or may omit blocks shown. The scope of the present disclosure is not limited in this respect.
As shown in fig. 2, at block 210, the analysis device 150 identifies a target vehicle from a first image of the intersection 110, wherein the target vehicle is associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection 110.
In some embodiments, the image capture device 130 may capture a first image of the intersection 110 and send the first image to the analysis apparatus 150.
As discussed above, the image capture device 130 may be fixedly installed so as to capture images of the intersection 110 from a specific angle. Therefore, the positions in the image of various static traffic elements of the intersection 110 (e.g., traffic sign lines) are always fixed.
In some embodiments, the analysis device 150 can identify the location of a first reference line in the first image, where the first reference line can indicate, for example, an entry location of the intersection 110. In some embodiments, such an entry location indicates, for example, a stop line before entering the intersection 110; different lanes may each have their own stop line before the intersection 110.
Fig. 3 illustrates an example first image 300 according to an embodiment of this disclosure. As shown in fig. 3, the analysis device 150 may identify a first reference line 310 in the first image 300, which may for example correspond to a stop-line of a straight lane in a bottom-up direction.
In some embodiments, the analysis device 150 can identify the first reference line 310 from the image of the intersection based on image recognition. For example, the analysis device 150 may automatically identify the first reference line 310 based on the identification of the stop line and the traffic guide line.
In some embodiments, such a first reference line 310 may also be determined, for example, based on the received location information. For example, the user may input start and end coordinates of the stop line in world coordinates, so that the analysis device 150 may obtain a pixel position corresponding to the start and end coordinates of the stop line based on the conversion relationship of the image coordinates and the world coordinates, and thereby identify the first reference line 310. Alternatively, such start and end point coordinates may also be determined automatically, for example, based on information from a high-precision map.
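To make the conversion from world coordinates to pixel positions concrete, the following is a minimal sketch assuming a ground-plane homography estimated from four point correspondences recorded at camera installation; the correspondence values, coordinates, and function names are illustrative assumptions rather than values fixed by this disclosure.

```python
import numpy as np
import cv2

# Four hypothetical ground-plane <-> pixel correspondences, e.g. surveyed
# lane-marking corners recorded when the camera was installed.
world_pts = np.float32([[0, 0], [10, 0], [10, 20], [0, 20]])      # meters
pixel_pts = np.float32([[310, 620], [930, 615], [880, 180], [360, 185]])
H = cv2.getPerspectiveTransform(world_pts, pixel_pts)             # 3x3 homography

def world_to_pixel(points_m):
    """Project Nx2 ground-plane points (meters) into the image (pixels)."""
    pts = np.float32(points_m).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

# The start and end coordinates of a stop line entered in world coordinates
# become the endpoints of the first reference line in the image.
first_reference_line = world_to_pixel([[2.0, 4.5], [8.5, 4.5]])
```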
In some embodiments, the analysis device 150 may detect at least one vehicle from the first image 300 and determine a target vehicle from the at least one vehicle based on a comparison of a detection frame of the at least one vehicle to the first reference line 310.
In particular, the analysis device 150 may, for example, detect the vehicle in the first image 300 using a suitable object detection model. Examples of such object detection models may include, but are not limited to: YOLO v3 or EfficientDet-D0, and the like.
Further, the analysis device 150 may determine whether any vehicles arrive at the entrance position corresponding to the first reference line 310, for example, by comparing the detection frame of the detected vehicles with the first reference line 310.
In the example of fig. 3, the analysis device 150 may determine that the target vehicle 120 corresponding to the detection frame 320 has arrived at the entry location by determining that the detection frame 320 intersects the first reference line 310. It should be understood that in the present disclosure, associating a vehicle with a reference line may mean that the detection frame of the vehicle intersects the reference line in the image, or that the distance from the detection frame of the vehicle to the reference line in the image is less than a predetermined threshold.
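As a concrete illustration of this association test, the sketch below checks whether a detection frame intersects a reference line segment and otherwise falls back to a pixel-distance threshold; the (x1, y1, x2, y2) box format and the 10-pixel default threshold are assumptions.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to segment a-b, all given as (x, y) pixels."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 strictly crosses segment q1-q2."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def box_meets_line(box, line, threshold_px=10.0):
    """Associate a detection frame (x1, y1, x2, y2) with a reference line
    ((ax, ay), (bx, by)): edge intersection, or center within threshold_px."""
    x1, y1, x2, y2 = box
    corners = [(x1, y1), (x2, y1), (x2, y2), (x1, y2)]
    a, b = line
    if any(segments_intersect(c0, c1, a, b)
           for c0, c1 in zip(corners, corners[1:] + corners[:1])):
        return True
    center = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    return point_segment_distance(center, a, b) <= threshold_px
```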
With continued reference to fig. 2, at block 220, the analysis device 150 determines whether a second image is included in the subsequent images of the intersection 110, wherein the target vehicle is associated with a second reference line in the second image, the second reference line indicating an exit location corresponding to the entry location.
In some embodiments, the analysis device 150 may identify a second reference line in the acquired subsequent images of the intersection 110; such a reference line corresponds to the first reference line.
Fig. 4 illustrates an example second image 400 according to an embodiment of this disclosure. As shown in fig. 4, the analysis device 150 can identify a second reference line 410 in the second image 400, which indicates the exit location from the intersection 110 for a vehicle that entered from the entry location corresponding to the first reference line 310. Taking fig. 4 as an example, such an exit position may be, for example, the entrance of the lane reached after traveling straight through the intersection 110 from bottom to top.
In some embodiments, the analysis device 150 can identify the second reference line 410 from the image of the intersection based on image recognition. For example, the analysis device 150 may automatically identify the second reference line 410 based on the identification of the stop line and the traffic guide line.
In some embodiments, such a second reference line 410 may also be determined, for example, based on received location information. For example, the user may input the start and end coordinates of the exit position in world coordinates, so that the analysis device 150 may obtain the pixel positions corresponding to those coordinates based on the conversion relationship between image coordinates and world coordinates, and thereby identify the second reference line 410. Alternatively, such start and end point coordinates may also be determined automatically, for example based on information from a high-precision map.
It should be understood that although in the examples of fig. 3 and 4, the first reference line 310 and the second reference line 410 are shown as a single line segment, the first reference line 310 and/or the second reference line 410 may be a combination of multiple line segments as needed by the actual scenario. For example, if the first reference line 310 corresponds to a stop line for a straight or left-turn lane, the second reference line 410 may correspond to, for example, two separate lines corresponding to the location of a left-turn exit intersection and the location of a straight exit intersection, respectively.
After identifying the second reference line 410, the analysis device 150 may continuously track, in subsequent images, the target vehicle 120 determined from the first image 300 and may compare its detection frame to the second reference line 410, for example, to determine whether the currently received image is the second image 400.
If a second image is detected at block 220, process 200 proceeds to block 230. At block 230, the analysis device 150 determines intersection traffic information 160 for the target vehicle based on the first image and the second image.
In some embodiments, the analysis device 150 may continuously track the target vehicle 120 in subsequent images after acquiring the first image 300 and detect whether it has reached the exit location. In some embodiments, these subsequent images may be subsequent video frames of the intersection 110 captured by the image capture device 130. It should be appreciated that the target vehicle may be tracked in any suitable manner (e.g., using a DeepSORT model), and the disclosure is not intended to be limited in this respect.
Taking fig. 4 as an example, if the analysis device 150 detects the target vehicle 120 from a subsequent image, and the detection frame of the target vehicle 120 intersects the second reference line 410, the image may be determined as the second image 400.
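Putting blocks 210 through 230 together, the per-frame flow might be organized as in the following sketch, where detect and track are hypothetical stand-ins for an object detection model (e.g., YOLO v3) and a multi-object tracker (e.g., a DeepSORT-style tracker), and box_meets_line is the association test sketched earlier.

```python
def analyze_stream(frames, first_line, second_line, detect, track):
    """frames: iterable of (timestamp_s, image) pairs.
    detect(image) -> list of detection boxes (hypothetical interface);
    track(image, boxes) -> {track_id: box} (hypothetical interface)."""
    entry_time = {}   # track_id -> timestamp of the vehicle's "first image"
    results = []      # (track_id, transit_time_s) for completed passages
    for ts, image in frames:
        for tid, box in track(image, detect(image)).items():
            if tid not in entry_time and box_meets_line(box, first_line):
                entry_time[tid] = ts                  # reached the entry line
            elif tid in entry_time and box_meets_line(box, second_line):
                # the current frame is the "second image" for this vehicle
                results.append((tid, ts - entry_time.pop(tid)))
    return results
```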
Further, the analysis device 150 can determine the intersection traffic information 160 of the target vehicle 120 based on the first image 300 and the second image 400.
In some embodiments, the intersection traffic information 160 can include, for example, the transit time of the target vehicle 120 through the intersection 110. Illustratively, the analysis device 150 can determine the transit time of the target vehicle 120 through the intersection 110 based on the capture times of the first image 300 and the second image 400.
In some embodiments, the intersection traffic information 160 can also include the speed of passage of the target vehicle 120 through the intersection 110. For example, the analysis device 150 may determine the transit speed of the target vehicle 120 based on the distance between the exit location and the entrance location and the transit time. Alternatively, the analysis device 150 may also determine its travel trajectory based on the tracking of the target vehicle 120 and determine the transit speed based on the length of the travel trajectory and the transit time.
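Both speed estimates described above reduce to a distance divided by the transit time; a minimal sketch, assuming positions are available in meters on the ground plane (e.g., via the homography sketch above):

```python
import math

def transit_speed_direct(entry_pos_m, exit_pos_m, transit_time_s):
    """Average speed from the straight-line entry-to-exit distance."""
    return math.dist(entry_pos_m, exit_pos_m) / transit_time_s

def transit_speed_from_trajectory(trajectory_m, transit_time_s):
    """Average speed from the accumulated length of the tracked trajectory,
    given as a list of (x, y) ground-plane positions."""
    length = sum(math.dist(a, b) for a, b in zip(trajectory_m, trajectory_m[1:]))
    return length / transit_time_s
```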
In some embodiments, the intersection traffic information 160 can also include a passage trajectory of the target vehicle 120 through the intersection 110. In some embodiments, such a passage trajectory may indicate a direction of travel of the target vehicle 120, e.g., straight from the south side to the north side, or a left turn from the south side to the west side, etc. In some embodiments, the analysis device 150 may determine the passage trajectory based on, for example, the traffic markings corresponding to the entry location.
In some embodiments, considering that some traffic markings may not uniquely correspond to a direction of passage, the analysis device 150 can also determine the route by which the target vehicle passes through the intersection based on the positions of the target vehicle in the first and second images.
Taking as an example an entry location corresponding to a shared straight/left-turn lane on the south side, if the analysis device 150 determines that the location where the target vehicle 120 leaves the intersection 110 is the lane entrance on the west side, the analysis device 150 may determine that the target vehicle 120 turned left through the intersection 110.
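One simple realization of this direction determination is a lookup keyed by the (entry approach, exit approach) pair; the table below is an illustrative assumption for a four-way intersection with right-hand traffic.

```python
# Hypothetical movement table for a standard four-way intersection
# (right-hand traffic); only the south-side entries are spelled out.
TURN_TABLE = {
    ("south", "north"): "straight",
    ("south", "west"): "left",
    ("south", "east"): "right",
    ("south", "south"): "u-turn",
    # analogous entries for the east, north, and west approaches
}

def movement(entry_approach, exit_approach):
    return TURN_TABLE.get((entry_approach, exit_approach), "unknown")
```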
In this manner, embodiments of the disclosure can set a pair of reference lines to track how a vehicle passes through the intersection, so that more accurate and fine-grained intersection traffic information can be obtained. Such intersection traffic information can further assist navigation planning, traffic flow control, and the like.
In some cases, the analysis device 150 may be unable to continuously track the target vehicle 120 until it exits the intersection 110. For example, the analysis device 150 may fail to track the target vehicle in subsequent frames, possibly due to occlusion or visibility issues.
An example process 500 for determining traffic information in the event of a failure to track a target vehicle will be described below with reference to fig. 5.
Before detecting the second image, the analysis device 150 may determine whether a received subsequent image is a third image. As shown in fig. 5, at block 502, when the third image is detected, the analysis device 150 may determine that the second image is not included in the subsequent images. Specifically, the third image is an image in which the target vehicle is detected but in whose following image the target vehicle can no longer be detected.
In some embodiments, the following image may refer, for example, to the video frame immediately after the video frame corresponding to the third image. Alternatively, if the analysis device 150 performs the analysis once every predetermined number of frames (for example, every three frames), the following image may be a video frame a predetermined interval after the third image. For example, when processing once every three frames, the following image may be the third frame after the third image.
Fig. 6 illustrates an example third image 600 according to some embodiments of the present disclosure. As shown in fig. 6, the analysis device 150 can, for example, continuously detect that the target vehicle 120 has moved to the position corresponding to the detection frame 610, and that the detection frame 610 does not intersect the second reference line 410, i.e., the target vehicle 120 has not yet exited the intersection 110. The analysis device 150 then fails to detect the target vehicle 120 in the image following the third image 600, for example because of occlusion.
In the event that it is determined that the subsequent images include the third image, i.e., do not include the second image, the analysis device 150 may further determine the distance the target vehicle 120 has traveled. Specifically, as shown in fig. 5, the analysis device 150 may determine a tracking distance of the target vehicle 120 based on the first image 300 and the third image 600 at block 504.
Illustratively, the analysis device 150 may determine the tracking distance of the target vehicle 120 based on the image distance between the detection frame 320 and the detection frame 610.
At block 506, the analysis device 150 may determine whether the tracking distance is greater than a predetermined threshold. In some implementations, such a predetermined threshold may be, for example, a predetermined proportion of the distance from the entrance location to the exit location. For example, the analysis device 150 may determine whether the tracking distance is greater than half the distance from the entrance location to the exit location.
Alternatively, such a predetermined threshold may also be an absolute distance (e.g., 5 meters), in which case the analysis device 150 may compare the tracking distance directly against it.
As shown in fig. 5, if it is determined at block 506 that the tracking distance is greater than the threshold, the process 500 may proceed to block 508, i.e., the analysis device 150 may determine intersection passage information for the target vehicle 120 based on the first image 300 and the third image 600.
In some embodiments, the analysis device 150 can determine the tracking duration of the target vehicle 120 based on the capture times of the first image 300 and the third image 600. For example, if the capture time of the first image 300 is 15:00:00 and the capture time of the third image 600 is 15:00:25, the tracking duration may be determined to be 25 seconds.
Further, the analysis device 150 may determine the travel speed of the target vehicle based on the tracking duration and the tracking distance. For example, such a travel speed may represent an average travel speed of the target vehicle from the entry location to the last tracked location.
Additionally, the analysis device 150 can determine intersection passage information of the target vehicle based on the travel speed. In some embodiments, the travel speed may be directly used as the expected intersection transit speed of the target vehicle 120, for example, as the intersection transit information.
In some embodiments, the analysis device 150 can also determine an expected intersection transit time of the target vehicle 120 as the intersection transit information based on the travel speed. Specifically, the analysis device 150 may determine the remaining distance of the target vehicle to the exit location based on the third image and the second reference line. For example, the analysis device 150 may determine the remaining distance of the target vehicle 120 to the exit location based on the image distance from the detection frame 610 to the second reference line 410.
Further, the analysis device 150 can determine an expected intersection transit time of the target vehicle based on the remaining distance and the travel speed. For example, the analysis device 150 can estimate the time remaining to travel to the exit location based on the remaining distance and the average speed of the previous travel, and can thereby determine the expected transit time of the target vehicle 120 through the intersection 110.
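A minimal sketch combining blocks 504 through 508: extrapolate from the average speed over the tracked portion, but only when that portion exceeds the threshold. Distances are in meters, times in seconds, and the half-distance default reflects the example proportion mentioned above.

```python
def expected_transit_time(tracked_dist_m, tracked_duration_s,
                          remaining_dist_m, entry_exit_dist_m=None,
                          min_fraction=0.5):
    """Estimated total transit time, or None when the tracked portion is too
    short to extrapolate from (the caller then falls back to history)."""
    if entry_exit_dist_m is not None and \
            tracked_dist_m <= min_fraction * entry_exit_dist_m:
        return None
    avg_speed = tracked_dist_m / tracked_duration_s    # travel speed so far
    return tracked_duration_s + remaining_dist_m / avg_speed
```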
With continued reference to fig. 5, if it is determined at block 506 that the tracking distance is less than the predetermined threshold, the process 500 may proceed to block 510, i.e., the analysis device 150 may obtain historical traffic information associated with the first reference line and the second reference line and determine intersection traffic information for the target vehicle 120 based on the historical traffic information.
In some embodiments, if the tracking distance is too short for the passage of the target vehicle 120 to be predicted reliably, the analysis device 150 can estimate the intersection passage information of the target vehicle 120 based on the historical passage information of other vehicles.
In some embodiments, such historical traffic information is associated with the same entry location and exit location. For example, it may include the average transit speed or average transit time of vehicles that likewise traveled straight from the south side to the north side. Further, the analysis device 150 may use the average transit speed or average transit time determined from the historical traffic information as the intersection traffic information of the target vehicle.
In some embodiments, the analysis device 150 may also determine the vehicle type of the target vehicle 120, and the obtained historical traffic information is associated with other vehicles having the same vehicle type. For example, if the target vehicle 120 is a car, the historical traffic information may include intersection traffic information for cars that have also traveled straight through the intersection from the south side to the north side over a predetermined period of time.
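As a sketch of this fallback, the historical statistics could be keyed by the entry/exit pair plus the vehicle type, dropping back to all vehicle types for the same movement when no type-specific data exists; the storage layout and names are assumptions.

```python
from collections import defaultdict

# (entry, exit, vehicle_type) -> list of observed transit times (seconds)
history = defaultdict(list)

def record_passage(entry, exit_, vtype, transit_time_s):
    history[(entry, exit_, vtype)].append(transit_time_s)

def estimate_from_history(entry, exit_, vtype):
    times = history.get((entry, exit_, vtype), [])
    if not times:  # fall back to the same movement across all vehicle types
        times = [t for (e, x, _), ts in history.items()
                 if (e, x) == (entry, exit_) for t in ts]
    return sum(times) / len(times) if times else None
```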
In some embodiments, the analysis device 150 may determine the vehicle type of the target vehicle at any suitable stage, for example when the target vehicle 120 is identified from the first image 300. The analysis device 150 may utilize any suitable classification model to determine the vehicle type of the target vehicle.
It should be appreciated that such vehicle types may follow any suitable classification, e.g., by function (e.g., truck, bus, taxi, etc.), by size (e.g., large, medium, small, etc.), or by power (e.g., motor vehicle, non-motor vehicle, etc.). This disclosure is not intended to be limiting as to how the classification is made.
In some embodiments, the analysis device 150 may also generate traffic information for the intersection 110 further based on the determined vehicle type and the intersection traffic information, wherein the traffic information indicates traffic conditions of different types of vehicles at the intersection 110.
For example, based on the target vehicle's type and intersection traffic information, the analysis device 150 may aggregate the traffic situation over a past predetermined time period to obtain traffic information for the intersection 110, for example, the number of vehicles of each type, average travel speed, average transit time, and travel direction distribution at the intersection 110 over the past 10 minutes.
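A sketch of such an aggregation over a sliding window, assuming each completed passage has been stored as a small record; the field names and the 10-minute window are illustrative.

```python
from collections import Counter

def aggregate(records, now_s, window_s=600):
    """records: dicts with keys 'time', 'vtype', 'speed', 'movement'.
    Returns per-type counts, average speed, and movement distribution
    for the past window_s seconds."""
    recent = [r for r in records if now_s - r["time"] <= window_s]
    if not recent:
        return None
    return {
        "count_by_type": Counter(r["vtype"] for r in recent),
        "avg_speed": sum(r["speed"] for r in recent) / len(recent),
        "movements": Counter(r["movement"] for r in recent),
    }
```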
Such finer-grained intersection traffic information may be used for more detailed analysis; for example, a congestion bottleneck at the intersection may be identified for dynamically scheduling traffic lights. In some embodiments, such traffic information may also be provided to a navigation application, for example for planning a navigation path. For example, the planned navigation path may be made to bypass intersections frequently traversed by large vehicles.
Example apparatus and devices
Fig. 7 shows a schematic block diagram of an apparatus 700 for intersection traffic analysis, according to certain embodiments of the present disclosure. The apparatus 700 may be embodied as or included in the analysis device 150 or other device that implements the process for intersection traffic analysis of the present disclosure.
As shown in fig. 7, the apparatus 700 includes an identification module 710 configured to identify a target vehicle from a first image of an intersection in which the target vehicle is associated with a first reference line indicating an entry location of the intersection. The apparatus 700 further includes a detection module 720 configured to determine whether a second image is included in the subsequent images of the intersection in which the target vehicle is associated with a second reference line indicating an exit location corresponding to the entry location. Further, the apparatus 700 further includes an analysis module 730 configured to determine intersection traffic information of the target vehicle based on the first image and the second image in response to detecting the second image.
In some embodiments, the identification module 710 is further configured to: detecting at least one vehicle from the first image; and determining a target vehicle from the at least one vehicle based on the comparison of the detection frame of the at least one vehicle to the first reference line.
In some embodiments, the analysis module 730 is further configured to: determining the transit time of the target vehicle through the intersection based on the capture times of the first image and the second image; and determining intersection traffic information based on the transit time.
In some embodiments, the analysis module 730 is further configured to: determining a passing trajectory of the target vehicle through the intersection based on the positions of the target vehicle in the first image and the second image; and determining intersection traffic information based on the passing trajectory.
In some embodiments, the detection module 720 is further configured to: in response to detecting a third image from the subsequent images before detecting the second image, determining that the second image is not included in the subsequent images, wherein the target vehicle is detected in the third image and not detected in an image subsequent to the third image.
In some embodiments, the analysis module 730 is further configured to: determining a tracking distance of the target vehicle based on the first image and the third image; and determining intersection traffic information of the target vehicle based on the first image and the third image in response to the tracking distance being greater than a predetermined threshold.
In some embodiments, the analysis module 730 is further configured to: determining a tracking duration of the target vehicle based on the capture times of the first image and the third image; determining the running speed of the target vehicle based on the tracking time length and the tracking distance; and determining intersection traffic information of the target vehicle based on the driving speed.
In some embodiments, the analysis module 730 is further configured to: determining a remaining distance of the target vehicle to the exit location based on the third image and the second reference line; and determining intersection passing time of the target vehicle as intersection passing information based on the remaining distance and the traveling speed.
In some embodiments, the analysis module 730 is further configured to: acquiring historical traffic information associated with the first reference line and the second reference line in response to the tracking distance being less than or equal to a predetermined threshold; and determining intersection traffic information of the target vehicle based on the historical traffic information.
In some embodiments, the historical transit information is associated with a reference vehicle having the same vehicle type as the target vehicle.
In some embodiments, the first image and the second image are video frames captured by an image capture device installed at the intersection.
In some embodiments, the apparatus 700 further comprises: a type determination module configured to determine a vehicle type of a target vehicle; and a traffic information generation module configured to generate traffic information for the intersection based on the type and the intersection traffic information, the traffic information indicating traffic conditions of different types of vehicles at the intersection.
In some embodiments, the apparatus 700 further comprises a providing module configured to provide traffic information to a navigation application for planning of a navigation path.
Fig. 8 illustrates a block diagram that shows an electronic device 800 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device 800 illustrated in fig. 8 is merely exemplary and should not be construed as limiting the functionality or scope of the embodiments described herein in any way. The electronic device 800 shown in fig. 8 may be included in or implemented as the analysis device 150 of fig. 1 or other device for intersection traffic analysis that implements the present disclosure.
As shown in fig. 8, electronic device 800 is in the form of a general purpose computing device. The electronic device 800 may also be any type of computing device or server. The components of electronic device 800 may include, but are not limited to, one or more processors or processing units 810, memory 820, storage device 830, one or more communication units 840, one or more input devices 850, and one or more output devices 860. The processing unit 810 may be a real or virtual processor and can perform various processes according to programs stored in the memory 820. In a multi-processor system, multiple processing units execute computer-executable instructions in parallel to improve the parallel processing capabilities of the electronic device 800.
Electronic device 800 typically includes a number of computer storage media. Such media may be any available media that is accessible by electronic device 800 and includes, but is not limited to, volatile and non-volatile media, removable and non-removable media. The memory 820 may be volatile memory (e.g., registers, cache, Random Access Memory (RAM)), non-volatile memory (e.g., Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory), or some combination thereof. The storage device 830 may be a removable or non-removable medium and may include a machine-readable medium, such as a flash drive, a magnetic disk, or any other medium that may be capable of being used to store information and/or data (e.g., map data) and that may be accessed within the electronic device 800.
The electronic device 800 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in FIG. 8, a magnetic disk drive for reading from or writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data media interfaces. Memory 820 may include a computer program product 825 having one or more program modules configured to perform the various methods or acts of the various embodiments of the disclosure.
Communication unit 840 enables communication with other computing devices over a communication medium. Additionally, the functionality of the components of the electronic device 800 may be implemented in a single computing cluster or multiple computing machines, which are capable of communicating over a communications connection. Thus, the electronic device 800 may operate in a networked environment using logical connections to one or more other servers, network personal computers (PCs), or another network node.
The input device 850 may be one or more input devices such as a mouse, keyboard, trackball, or the like. The output device(s) 860 may be one or more output devices such as a display, speakers, printer, or the like. Electronic device 800 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., communication with one or more devices that enable a user to interact with electronic device 800, or communication with any devices (e.g., network cards, modems, etc.) that enable electronic device 800 to communicate with one or more other computing devices via communication unit 840, as desired. Such communication may be performed via input/output (I/O) interfaces (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium is provided, on which computer-executable instructions or a program are stored, wherein the computer-executable instructions or the program are executed by a processor to implement the above-described method or function. The computer-readable storage medium may include a non-transitory computer-readable medium. According to an exemplary implementation of the present disclosure, there is also provided a computer program product comprising computer executable instructions or a program which are executed by a processor to implement the above described method or function. The computer program product may be tangibly embodied on a non-transitory computer-readable medium.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices and computer program products implemented in accordance with the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-executable instructions or programs.
These computer-executable instructions or programs may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-executable instructions or programs may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer-executable instructions or programs may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer-implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing has described implementations of the present disclosure, and the above description is illustrative, not exhaustive, and not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described implementations. The terminology used herein was chosen in order to best explain the principles of the implementations, the practical application, or improvements to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the various implementations disclosed herein.
Example implementation
TS 1. A method for intersection traffic analysis, comprising:
identifying a target vehicle from a first image of an intersection, the target vehicle being associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection;
determining whether a second image is included in subsequent images of the intersection in which the target vehicle is associated with a second reference line indicating an exit location corresponding to the entry location; and
in response to detecting the second image, determining intersection traffic information for the target vehicle based on the first image and the second image.
TS 2. The method of TS 1, wherein identifying the target vehicle comprises:
detecting at least one vehicle from the first image; and
determining the target vehicle from the at least one vehicle based on a comparison of the detection frame of the at least one vehicle to the first reference line.
TS 3. The method of TS 1, wherein determining the intersection traffic information comprises:
determining a transit time for the target vehicle to transit the intersection based on the capture times of the first and second images; and
determining the intersection traffic information based on the transit time.
TS 4. The method of TS 1, wherein determining the intersection traffic information comprises:
determining a passing trajectory of the target vehicle through the intersection based on the positions of the target vehicle in the first image and the second image; and
determining the intersection traffic information based on the passing trajectory.
TS 5. The method of TS 1, wherein determining whether a second image is included in subsequent images of the intersection comprises:
determining that the second image is not included in the subsequent images in response to detecting a third image from the subsequent images before detecting the second image,
wherein the target vehicle is detected in the third image and not detected in a subsequent image of the third image.
TS 6. The method of TS 5, further comprising:
determining a tracking distance of the target vehicle based on the first image and the third image; and
determining intersection passage information for the target vehicle based on the first and third images in response to the tracking distance being greater than a predetermined threshold.
TS 7. The method of TS 6, wherein determining intersection traffic information of the target vehicle based on the first and third images comprises:
determining a tracking duration of the target vehicle based on the capture times of the first and third images;
determining a travel speed of the target vehicle based on the tracking duration and the tracking distance; and
determining the intersection traffic information of the target vehicle based on the travel speed.
TS 8. The method of TS 7, wherein determining the intersection passage information of the target vehicle based on the travel speed comprises:
determining a remaining distance of the target vehicle to the exit location based on the third image and the second reference line; and
determining the intersection passing time of the target vehicle as the intersection passing information based on the remaining distance and the travel speed.
TS 9. The method of TS 6, further comprising:
in response to the tracking distance being less than or equal to the predetermined threshold, obtaining historical traffic information associated with the entry location and the exit location; and
determining intersection traffic information of the target vehicle based on the historical traffic information.
TS 10. The method according to TS 9, wherein the historical traffic information is associated with a reference vehicle, the reference vehicle having the same vehicle type as the target vehicle.
TS 11. The method of TS 1, wherein the first image and the second image are video frames captured by an image capture device installed at the intersection.
TS 12. The method of any one of TS 1 to 11, further comprising:
determining a vehicle type of the target vehicle; and
generating traffic information for the intersection based on the vehicle type and the intersection traffic information, the traffic information indicating traffic conditions of different types of vehicles at the intersection.
TS 13. The method according to TS 12, further comprising:
providing the traffic information to a navigation application for planning of a navigation path.
TS 14. An apparatus for intersection traffic analysis, comprising:
an identification module configured to identify a target vehicle from a first image of an intersection in which the target vehicle is associated with a first reference line indicating an entry location of the intersection;
a detection module configured to determine whether a second image is included in subsequent images of the intersection in which the target vehicle is associated with a second reference line indicating an exit location corresponding to the entry location; and
an analysis module configured to determine intersection traffic information of the target vehicle based on the first image and the second image in response to detecting the second image.
TS 15. An electronic device, comprising:
a memory and a processor;
wherein the memory is to store one or more computer instructions, wherein the one or more computer instructions are to be executed by the processor to implement the method according to any one of TS 1 to 13.
TS 16. A computer readable storage medium having stored thereon one or more computer instructions, wherein the one or more computer instructions are executed by a processor to implement the method according to any one of TS 1 to 13.
TS 17. A computer program product comprising computer executable instructions, wherein the computer executable instructions, when executed by a processor, implement the method according to any of TS 1 to 13.

Claims (10)

1. A method of intersection traffic analysis, comprising:
identifying a target vehicle from a first image of an intersection, the target vehicle being associated with a first reference line in the first image, the first reference line indicating an entry location of the intersection;
determining whether a second image is included in subsequent images of the intersection in which the target vehicle is associated with a second reference line indicating an exit location corresponding to the entry location; and
in response to detecting the second image, determining intersection traffic information for the target vehicle based on the first image and the second image.
2. The method of claim 1, wherein identifying a target vehicle comprises:
detecting at least one vehicle from the first image; and
determining the target vehicle from the at least one vehicle based on a comparison of the detection frame of the at least one vehicle to the first reference line.
3. The method of claim 1, wherein determining the intersection passage information comprises:
determining a transit time for the target vehicle to transit the intersection based on the capture times of the first and second images; and
determining the intersection traffic information based on the transit time.
4. The method of claim 1, wherein determining the intersection passage information comprises:
determining a passing trajectory of the target vehicle through the intersection based on the positions of the target vehicle in the first image and the second image; and
determining the intersection traffic information based on the passing trajectory.
5. The method of claim 1, wherein determining whether a second image is included in the subsequent images of the intersection comprises:
determining that the second image is not included in the subsequent images in response to detecting a third image from the subsequent images before detecting the second image,
wherein the target vehicle is detected in the third image and not detected in a subsequent image of the third image.
6. The method of claim 5, further comprising:
determining a tracking distance of the target vehicle based on the first image and the third image; and
determining intersection traffic information of the target vehicle based on the first and third images in response to the tracking distance being greater than a predetermined threshold.
7. The method of claim 6, wherein determining intersection passage information of the target vehicle based on the first and third images comprises:
determining a tracking duration of the target vehicle based on the capture times of the first and third images;
determining a travel speed of the target vehicle based on the tracking duration and the tracking distance; and
determining the intersection traffic information of the target vehicle based on the travel speed.
8. An electronic device, comprising:
a memory and a processor;
wherein the memory is to store one or more computer instructions, wherein the one or more computer instructions are to be executed by the processor to implement the method of any one of claims 1 to 7.
9. A computer readable storage medium having one or more computer instructions stored thereon, wherein the one or more computer instructions are executed by a processor to implement the method of any one of claims 1 to 7.
10. A computer program product comprising computer executable instructions, wherein the computer executable instructions, when executed by a processor, implement the method of any one of claims 1 to 7.
CN202110693646.2A 2021-06-22 2021-06-22 Method, apparatus, device, storage medium and program product for intersection traffic analysis Pending CN115512308A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110693646.2A CN115512308A (en) 2021-06-22 2021-06-22 Method, apparatus, device, storage medium and program product for intersection traffic analysis

Publications (1)

Publication Number Publication Date
CN115512308A 2022-12-23

Family

ID=84499040

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110693646.2A Pending CN115512308A (en) 2021-06-22 2021-06-22 Method, apparatus, device, storage medium and program product for intersection traffic analysis

Country Status (1)

Country Link
CN (1) CN115512308A

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination