GB2619098A - Apparatus and method for determining vehicle location - Google Patents

Apparatus and method for determining vehicle location

Info

Publication number
GB2619098A
Authority
GB
United Kingdom
Prior art keywords
vehicle
location
image
corner
quadrilateral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2207898.4A
Other versions
GB202207898D0 (en)
Inventor
Bullimore Sharon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AGD Systems Ltd
Original Assignee
AGD Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AGD Systems Ltd filed Critical AGD Systems Ltd
Priority to GB2207898.4A
Publication of GB202207898D0
Publication of GB2619098A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/255Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/07Controlling traffic signals
    • G08G1/08Controlling traffic signals according to detected number or speed of vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30236Traffic on road, railway or crossing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Abstract

An apparatus and method for determining if a vehicle 1 is within a quadrilateral shaped detection zone 20 includes an image processor 1002 and a camera 1001 to capture one or more 2D images of a scene 1200, where the scene includes a portion of a surface 10 that includes the detection zone, wherein the detection zone consists of two pairs of lines, each pair of lines consisting of non-neighbouring edges of the quadrilateral. The images include a first image in which the vehicle is visible. A memory storage device 1003 stores the locations of first and second vanishing points of a quadrilateral portion of the captured images that corresponds to the location of the detection zone. The location of a first corner of a basal plane of the vehicle is determined, and an algorithm run by the image processor determines if the vehicle is within the detection zone by combining the location of the first corner with the location of at least one of the two vanishing points.

Description

Apparatus and method for determining vehicle location

This invention relates to an apparatus for detecting the location of a vehicle. The invention also relates to a method for detecting the location of a vehicle.
In many traffic applications there is a requirement for information pertaining to the location of vehicle(s). The apparatus and method discussed herein seek to provide this information.
A typical application of an apparatus for detecting the location of a vehicle is in traffic control systems. Such systems usually comprise one or more inter-connected traffic lights which phase through a sequence of red-amber-green light signals to successively initiate and terminate traffic flow through a junction.
These systems often comprise an apparatus for detecting the location of a vehicle.
Such an apparatus may provide information about the location and number of vehicles present at different entrances and exits to the junction. This information may be used by the traffic control system to better regulate traffic flow across the junction.
Different types of apparatus for detecting the location of a vehicle are currently employed in these traffic control systems. Examples include induction loop, radar and video detectors. Induction loop and radar detectors have significant drawbacks such as the high cost associated with embedding coils of wire into a road surface and the limited functionality of radar to accurately locate stationary vehicles. As such, video detectors appear to be a promising alternative, capable of accurately locating vehicles through image processing techniques. These detectors have minimal installation costs. Beneficially, some of the required equipment may already be present in the traffic control system. For instance, an automatic number plate recognition (ANPR) device of a traffic control system may have an image capture device operable for use in a video detector.
However, current video detectors face significant problems. Detectors that utilise 2D image processing techniques often fail to correctly locate vehicles due to the effect of perspective on 2D images. For example, a vehicle in a left lane may partially obscure the right lane of a road when viewed by an image detection device positioned at a side of a road. The video detector may erroneously register that a vehicle is present in the right lane and signal a right filter for the junction. These errors have the effect of increasing net traffic waiting times at junctions. Video detectors utilising 3D image processing techniques, sometimes referred to as 3D object detectors, mitigate this issue by locating a vehicle in 3D space on a road. However, 3D image processing techniques also introduce problems such as the requirement for high processing power and memory storage. This often prevents 3D object detectors from being used on edge devices. These requirements also have the effect of increasing the expense and power consumption of the apparatus as well as increasing the time required to detect and locate a vehicle. This can result in low frame rates, unsuitable for real-time traffic detection.
This invention seeks to ameliorate the problems associated with conventional video detectors that employ 2D and/or 3D image processing techniques by utilising visual heuristics within a 2D image processing framework. This maintains the benefits of 2D image processing, such as high frame rates and low processing requirements, while improving the accuracy with which a vehicle may be located.
According to a first aspect the invention provides an apparatus for determining if a vehicle on a surface is within a quadrilateral shaped detection zone of the surface that consists of a first and second pair of lines, each pair of lines consisting of non-neighbouring edges of the quadrilateral, the apparatus comprising: an image capture device operable to capture one or more 2D images of a scene that includes the portion of the surface that includes the detection zone, wherein the one or more 2D images comprise a first 2D image in which at least the vehicle is visible; an image processor; a memory storage device for storing the locations of a first vanishing point and a second vanishing point of a quadrilateral portion of the captured images that corresponds to the location of the detection zone; a means for determining the location of a first corner of a basal plane of the vehicle; and an algorithm that can be run by the image processor, wherein the algorithm determines if the vehicle is within the quadrilateral detection zone by combining the location of the first corner with the location of at least one of the two vanishing points.
The apparatus may include a means for determining the location of four corners of the basal plane of the vehicle in the first 2D image. The first corner of the basal plane is defined as the furthest of the four corners of the basal plane of the vehicle from a vanishing point line when measured perpendicularly to the vanishing point line, wherein the vanishing point line is a straight line that intersects both of the two vanishing points.
The apparatus may further comprise a means for determining the location of the first vanishing point of the two vanishing points of the quadrilateral from the first pair of lines and a means for determining the location of the second vanishing point of the two vanishing points of the quadrilateral from the second pair of lines.
The one or more 2D images of the scene may further comprise a second 2D image in which the vehicle may or may not be visible. The first 2D image may be captured by the image capture device before or after the second 2D image is captured by the image capture device.
The first pair of lines and the second pair of lines may be identified within the first 2D image or the second 2D image, using 2D image processing techniques.
Alternately, the apparatus may further comprise a user interface that includes a graphical display to allow the first pair of lines and the second pair of lines, or the corners of the quadrilateral, to be manually input from a user drawing lines on the user interface displaying the scene present in the first 2D image or the second 2D image of the scene. Where the corners are input, the processing unit may identify the lines that join up the corners.
Manually entering the location of the detection zone allows a zone to be defined even when it is not physically marked on the surface and as such could not be identified by an image processing technique.
The image processor may be a 2D image processor only configured to implement 2D image processing techniques.
The apparatus may further comprise at least one computing device configured to detect at least one vehicle in the first 2D image and generate a 2D detection box through a 2D vehicle detection process.
The algorithm may determine the location of a second corner of the basal plane by extrapolating the location of the first corner directly towards the first vanishing point to form a first extrapolated line and then calculating the intersection point of the 2D detection box with the first extrapolated line.
The algorithm may also determine the location of a third corner of the basal plane by extrapolating the location of the first corner directly towards the second vanishing point to form a second extrapolated line and then calculating the intersection point of the 2D detection box with the second extrapolated line.
The means to determine the location of the first corner of a basal plane of the vehicle may involve measuring an angle between a first virtual line and a bottom edge of the 2D detection box, wherein the first virtual line extends between the first vanishing point and a midpoint of the bottom edge, and applying an equation to the angle to calculate a fractional distance of the first corner from a left end of the bottom edge along the bottom edge. The angle is measured clockwise from the bottom edge to the first virtual line. The equation may be:

p = 0 for θ ≤ (−π + α);
p = (θ + π − α) / (π/2 − α) for (−π + α) < θ < −π/2;
p = (π/2 + θ) / (π/2 − α) for −π/2 ≤ θ < −α; and
p = 1 for θ ≥ −α,

where α represents a small angle in radians, typically π/18, θ represents the angle in radians and p represents the fractional distance. As is conventional, an angle measured in the clockwise direction is negative.
The algorithm may determine the location of a fourth corner of the basal plane by: extrapolating the location of a second corner directly towards the second vanishing point to form a third extrapolated line; extrapolating the location of a third corner directly towards the first vanishing point to form a fourth extrapolated line; and locating the fourth corner of the basal plane as the intersection of the third extrapolated line and the fourth extrapolated line.
The 2D vehicle detection process may include a categorisation process to categorise vehicles into one of a plurality of classes.
The algorithm may determine if the vehicle is within the quadrilateral by comparing the locations of each of the first corner, a second corner, a third corner and a fourth corner of the basal plane with the quadrilateral.
The vehicle may be any of: a car, a pedestrian, a lorry, a motorbike, a bicycle, a bus, a tram or other type of vehicle.
According to a second aspect the invention provides a method for determining if a vehicle on a surface, such as a road surface, is within a detection zone on the surface by: capturing one or more 2D images of a scene that includes the detection zone, where at least the vehicle is visible in a first 2D image of the one or more 2D images; determining the location of at least one of two vanishing points of a quadrilateral representing the boundaries of the detection zone in the one or more 2D images; determining the location of a first corner of a basal plane of the vehicle; and determining if the vehicle is within the quadrilateral by combining the location of the first corner with the location of at least one of the two vanishing points using image processing techniques.
The image processing techniques utilised by the method may consist of only 2D image processing techniques.

There will now be described, by way of example only, an exemplar apparatus and an exemplar method that fall within the scope of the present invention with reference to the accompanying drawings, of which:

Figure 1 shows an apparatus according to a first aspect of the invention, installed at a road junction;
Figure 2a shows a first 2D image of a scene, as captured by an image capture device of the apparatus of Figure 1, including a surface, a detection zone on the surface and a vehicle;
Figure 2b shows a second 2D image of a scene, as captured by the image capture device of the apparatus of Figure 1, including a surface and a detection zone thereon;
Figure 3 shows a virtual extension of the first 2D image of Figure 2a including a vehicle and two vanishing points;
Figure 4 shows the virtual extension of Figure 3 with some of the features of Figure 3 omitted for clarity;
Figure 5 shows the first 2D image of Figure 2a and includes a subset of corners of a basal plane of the vehicle;
Figure 6 shows the first 2D image of Figure 2a and includes four corners of a basal plane of the vehicle; and
Figure 7 shows a flow chart of a method for determining if a vehicle on a surface is within a quadrilateral on the surface.
Figure 1 shows an apparatus 1000 for determining if a vehicle 1 on a surface 10 is within a detection zone 20 on the surface 10, according to a first aspect of the invention. The apparatus 1000 is shown installed at a road junction 1100, adjacent to a scene 1200 which includes a surface 10 and a detection zone 20. In this example, the scene 1200 also includes a vehicle 1. The apparatus 1000 comprises an image capture device 1001 operable to capture one or more 2D images 100, 200 of the scene 1200. The detection zone may be represented by a quadrilateral 130, 230 in the one or more 2D images 100, 200. The apparatus 1000 further comprises an image processor 1002, a memory storage device 1003 and an algorithm that can be run by the image processor 1002 to determine if the vehicle 1 is within the quadrilateral 130, 230.
Figures 2a and 2b each show a 2D image of a scene 1200, captured by the image capture device 1001 of the apparatus 1000. The image capture device 1001 is operable to capture one or more 2D images 100, 200, of the scene 1200 which includes the surface 10 and the detection zone 20.
Each of the one or more 2D images 100, 200 of the scene 1200 of Figure 2a and Figure 2b show the quadrilateral 130, 230 representing the detection zone 20. The quadrilateral consists of a first pair of lines 131, 231 and second pair of lines 132, 232, each pair of lines consisting of non-neighbouring edges of the quadrilateral 130, 230. The first pair of lines 131, 231 represent a first pair of parallel boundaries of the detection zone 20 on the surface 10. Similarly, the second pair of lines 132, 232 represent a second pair of parallel boundaries of the detection zone 20 on the surface 10. While the first pair of boundaries are parallel in real space and also the second pair of boundaries are parallel in real space, their representations in the one or more 2D images 100, 200, for example the first pair of lines 131, 231 and/or second pair of lines 132, 232, may not be parallel.
Figure 2a shows a first 2D image 100 of the one or more 2D images 100, 200, captured by the image capture device 1001 of the apparatus 1000, in which a vehicle 1 is visible. The apparatus 1000 further comprises at least one computing device configured to detect the vehicle 1 in the first 2D image 100 and generate a 2D detection box 140 through a 2D vehicle detection process. The 2D vehicle detection process may include a categorisation process to categorise vehicles into one of a plurality of classes. The vehicle shown in the first 2D image has been categorised into the class 'car'.
Figure 2b shows a second 2D image 200 of the one or more 2D images 100, 200, captured by the image capture device 1001 of the apparatus 1000, in which a vehicle 1 is not visible. The first 2D image 100 may be captured by the image capture device 1001 before or after the second 2D image 200 is captured by the image capture device 1001. For example, the second 2D image 200 may be captured during installation and calibration of the apparatus 1000 while the first 2D image 100 may be captured during an operating period of the apparatus 1000.
The first 2D image 100 and the second 2D image 200 may comprise a further one or more quadrilaterals 130', 230' for a further one or more detection zones 20' on the surface 10. The first 2D image 100 may comprise a further one or more vehicles such that the apparatus 1000 may be operable to detect if the vehicle 1, and optionally the further one or more vehicles, are in one or more detection zones 20, 20', from the first 2D image 100.
The one or more 2D images 100, 200 may comprise any number of additional 2D images similar to either the first 2D image 100 or the second 2D image 200 but captured by the image capture device 1001 at a different time to the first 2D image 100 or second 2D image 200. In this way, the apparatus 1000 may continue to determine if the or any vehicle 1 on the surface 10 is within the one or more detection zones 20, 20' over a period of time. This also advantageously allows the apparatus 1000 to accommodate an adjustment to the parameters of the image capture device 1001, such as a change in position of the image capture device 1001 relative to the detection zone(s) 20, 20'.
The first pair of lines 131, 231 and the second pair of lines 132, 232 of the quadrilateral 130, 230 may be identified from the one or more 2D images 100, 200 using 2D image processing techniques. Alternately, the apparatus 1000 may further comprise a user interface which allows the first and second pair of lines 131, 231, 132, 232 to be manually input from a user drawing lines on the user interface displaying the first 2D image 100 or the second 2D image 200 of the scene 1200.
The parameters of the image capture device 1001 are the same for capturing both the first 2D image 100 and the second 2D image 200. The quadrilaterals 130, 230, 130', 230' in the first and second 2D images 100, 200 are therefore equivalent. This feature is beneficial as it allows the quadrilateral 130, 230 to be determined from the second 2D image 200 where there is no vehicle 1 obscuring the detection zone 20.
Figure 3 shows a virtual extension of the first 2D image 100 of Figure 2a including two vanishing points 151, 152 of the quadrilateral 130. The apparatus 1000 comprises a means for determining the location of a first vanishing point 151 of the two vanishing points 151, 152 as the intersection of extensions of the first pair of lines 131. Similarly, the apparatus 1000 also comprises a means for determining the location of the second vanishing point 152 of the two vanishing points 151, 152 as the intersection of extensions of the second pair of lines 132. The location of the two vanishing points 151, 152 of the quadrilateral 130 may be stored on the memory storage device 1003. The vanishing points 151, 152 are shown in relation to the first 2D image 100 in Figure 3; however, it should be understood that the location of the two vanishing points 151, 152 in relation to the first 2D image 100 could be determined from the second 2D image 200.

Figure 3 also shows a basal plane 160 of the vehicle 1, defined by four corners 161, 162, 163, 164 including a first corner 161, second corner 162, third corner 163 and fourth corner 164. The first corner 161 of the basal plane 160 is defined as the furthest of the four corners 161, 162, 163, 164 from a vanishing point line 170 intersecting both of the two vanishing points 151, 152, measured perpendicularly to the vanishing point line 170. In the example first 2D image 100 shown in Figures 2a, 3, 4, 5 and 6, the third corner 163 is the furthest corner of the four corners 161, 162, 163, 164 from the first vanishing point 151. The first corner 161 is the second furthest of the four corners 161, 162, 163, 164 from the first vanishing point 151 and the second furthest of the four corners 161, 162, 163, 164 from the second vanishing point 152, but the furthest of the four corners 161, 162, 163, 164 from the vanishing point line 170.
Figure 4 shows the first 2D image of Figure 2a with many of the features shown in Figure 3 omitted for clarity.
A first virtual line 180 extends between the first vanishing point 151 and a midpoint 142 of a bottom edge 141 of the 2D detection box 140. The apparatus 1000 includes a means for determining the location of the first corner 161 through measurement of the angle 181 between the first virtual line 180 and the bottom edge 141 of the 2D detection box 140, wherein the angle is measured clockwise from the bottom edge 141 to the first virtual line 180. The means for determining the location of the first corner 161 is further operable to apply an equation to the angle 181 to calculate a fractional distance of the first corner 161 from a left end 143 of the bottom edge 141 along the bottom edge 141. The equation is:

p = 0 for θ ≤ (−π + α);
p = (θ + π − α) / (π/2 − α) for (−π + α) < θ < −π/2;
p = (π/2 + θ) / (π/2 − α) for −π/2 ≤ θ < −α; and
p = 1 for θ ≥ −α,

where α represents a small angle in radians, typically π/18, θ represents the angle 181 in radians, p represents the fractional distance and π represents pi. As is conventional, an angle measured in the clockwise direction is negative.
Alternatively, the location of the first corner 161 of a basal plane 160 of the vehicle 1 may be determined through other means such as through 2D or 3D image processing techniques. These 2D or 3D image processing techniques may be run on the same image processor 1002 as the algorithm which determines if the vehicle 1 is within the quadrilateral 130.
The algorithm determines the location of the second corner 162 of the basal plane 160 by extrapolating the location of the first corner 161 directly towards the first vanishing point 151 to form a first extrapolated line 191 and then calculating the intersection point of the 2D detection box 140 with the first extrapolated line 191. This is best seen in Figure 5 which shows the first 2D image 100 of Figure 2a.
Similarly, the algorithm determines the location of the third corner 163 of the basal plane 160 by extrapolating the location of the first corner 161 directly towards the second vanishing point 152 to form a second extrapolated line 192 and then calculating the intersection point of the 2D detection box 140 with the second extrapolated line 192.
Figure 6 shows the location of the fourth corner 164 of the basal plane 160 determined by the algorithm by: extrapolating the location of the second corner 162 directly towards the second vanishing point 152 to form a third extrapolated line 193; extrapolating the location of a third corner 163 directly towards the first vanishing point 151 to form a fourth extrapolated line 194; and locating the fourth corner 164 of the basal plane 160 as the intersection of the third extrapolated line 193 and the fourth extrapolated line 194.
The algorithm then determines if the vehicle 1 is within the quadrilateral 130 by comparing the locations of each of the first corner 161, second corner 162, third corner 163 and fourth corner 164 of the basal plane 160 with the quadrilateral 130.
The four corners 161, 162, 163, 164 of the basal plane 160 are used to construct a virtual representation of the basal plane 160 of the vehicle 1, and the degree of overlap of this virtual representation of the basal plane 160 of the vehicle 1 with the quadrilateral 130 is used to determine if the vehicle 1 is within the quadrilateral 130 and produce a signal in response.
A first signal is produced, indicating that a vehicle 1 is within the quadrilateral 130, if a vehicle percentage representing the fraction of virtual representation of the basal plane 160 of the vehicle 1 within the quadrilateral 130 exceeds a first threshold value. A second signal is produced, indicating that a vehicle 1 is not fully within the quadrilateral 130, if the vehicle percentage falls below a second threshold value and is greater than zero. A third signal is produced, indicating that a vehicle 1 is not within the quadrilateral 130 if the vehicle percentage is zero.
Alternately, a first signal may be produced, indicating that a vehicle 1 is within the quadrilateral 130, if a quadrilateral percentage, representing the fraction of the quadrilateral 130 within the virtual representation of the basal plane 160 of the vehicle 1, exceeds a first threshold quadrilateral value. A second signal may be produced, indicating that a vehicle 1 is not fully within the quadrilateral 130, if the quadrilateral percentage falls below a second threshold quadrilateral value and is greater than zero. A third signal may be produced, indicating that a vehicle 1 is not within the quadrilateral 130, if the quadrilateral percentage is zero.

The apparatus 1000 may further include a memory node operable to store information relating to the signal(s). Advantageously, this may allow the apparatus 1000 to inform a traffic control system to improve the flow of traffic through a junction or allow the apparatus 1000 to collate data relating to motorway or road use over time.
Data including vehicle percentages and quadrilateral percentages may be stored in the memory node and/or distributed to local or remote data processing devices.
For example, the memory node may be used to count the number of first signals to estimate the number of times a vehicle is determined to be within the detection zone 20 for traffic management purposes. In another example, the memory node may be used to count the number of third signals to estimate the number of available parking spaces in a car park or to count the number of second signals to estimate the number of vehicles 1 parked across two parking spaces in a car park.
The memory node may provide information to a traffic management system to alter the phasing of traffic lights of the traffic management system. Live (real-time) reporting of information from the memory node may allow current traffic conditions to be measured, better accounting for the current situation than estimates used to determine traffic light phasing. This may result in better regulated traffic flow and minimise a build-up of traffic. For example, instead of a first traffic light being phased periodically, build-up of traffic would be minimised if the first traffic light only turned red after a set number of vehicles were detected at a second crossflow traffic light.
The surface 10 shown in Figures 1 to 6 is a road surface, however, it should be appreciated that the surface 10 may be any other type of surface such as that of a car park.
In Figure 1, the apparatus 1000 is shown installed at a road junction 1100, mounted to a traffic light 1010. However, it should be understood that the apparatus 1000 may be installed in a variety of positions in various settings and should not be limited to this configuration. For example, the apparatus 1000 may not be installed at a road junction and may instead be installed on a motorway, mounted on a central reservation of the motorway. In another example, the apparatus 1000 may be mounted on a beam or fixed to a wall of a parking structure.
Alternately, the apparatus 1000 may be freestanding and not be mounted. A freestanding apparatus 1000 has the advantage of allowing more flexibility in installation.
Any selection or combination of: the image processor 1002; memory storage device 1003; algorithm; computing device; user interface; means for determining the location of a first corner of a basal plane of the vehicle in the first 2D image; and the means for determining the location of the two vanishing points, may be locally connected and/or contained within the same housing or alternately may be remote and connected to other elements of apparatus 1000 via wired or wireless connections such as via radio transmission.
It should be noted that the first 2D image 100 shown in Figures 2a, 3, 4, 5 and 6 and the second 2D image 200 shown in Figure 2b are examples and should not be regarded as limiting. Features of these examples may not be present or may be configured differently in other first and second 2D images 100, 200, within the scope of the claims. The vehicle 1 of Figures 2a and 3 to 6 is a car; however, the vehicle 1 may alternately be any other type of vehicle such as a lorry, a motorbike, a bicycle, a bus or a tram.
Figure 7 shows a flow chart of a method 400 for determining if a vehicle on a surface is within a detection zone on the surface. In the first step 410 of the method 400, one or more 2D images of a scene 1200 are captured. The scene 1200 includes the surface and the detection zone and may include the vehicle. The one or more 2D images include a first 2D image in which at least the vehicle is visible. Optionally, the one or more 2D images of the scene 1200 may further comprise a second 2D image in which the vehicle may or may not be visible.

An image capture device 1001 may be used to capture the one or more 2D images of the scene 1200. The parameters of the image capture device 1001 may be the same for capturing both the first 2D image and the second 2D image such that the quadrilaterals in the first and second 2D images are equivalent. This feature is beneficial as it would allow the quadrilateral to be determined from the second 2D image where there is no vehicle obscuring the detection zone.
Each of the one or more 2D images may comprise a further one or more quadrilaterals for a further one or more detection zones on the surface and/or a further one or more vehicles on the surface. In this way, the method 400 may be utilised to determine if the vehicle and optionally the further one or more vehicles on the surface, such as a road surface, is within the one or more detection zones on the surface.
In the second step 420, the location of at least one of two vanishing points of a quadrilateral representing the detection zone in the one or more 2D images is determined. The quadrilateral consists of a first pair of lines and second pair of lines, each pair of lines consisting of non-neighbouring edges of the quadrilateral. The first pair of lines represent a first pair of parallel boundaries of the detection zone on the surface. Similarly, the second pair of lines represent a second pair of parallel boundaries of the detection zone on the surface. The first pair of parallel boundaries are defined as the pair of parallel boundaries parallel to the side faces of an expected vehicle. The second pair of parallel boundaries are defined as the pair of parallel boundaries perpendicular to the side faces of an expected vehicle. For example, an expected vehicle may be assumed to travel approximately in line with the direction of a road. While the first pair of boundaries are parallel in real space and also the second pair of boundaries are parallel in real space, their representations in the one or more 2D images, for example the first pair of lines and/or second pair of lines, may not be parallel.
When the quadrilaterals in the first and second 2D images are equivalent, the location of the two vanishing points in relation to the first 2D image could be determined from the second 2D image. The first pair of lines and the second pair of lines of the quadrilateral may be identified from the one or more 2D images using 2D image processing techniques.
Alternately, the first and second pair of lines may be manually input through a user drawing lines on a user interface displaying the first 2D image or the second 2D image of the scene 1200.
The second step 420 may involve determining the location of a first vanishing point of the two vanishing points as the intersection of extensions of the first pair of lines. Similarly, the second step 420 may also involve determining the location of the second vanishing point of the two vanishing points as the intersection of extensions of the second pair of lines. The second step 420 may involve storing the location of at least one of the two vanishing points of the quadrilateral.
In the third step 430, the location of a first corner of a basal plane of the vehicle is determined. The basal plane of the vehicle may be defined by four corners including a first corner, second corner, third corner and fourth corner. The first corner of the basal plane is defined as the furthest of the four corners from a vanishing point line intersecting both of the two vanishing points, measured perpendicularly to the vanishing point line.
The third step 430 may include detection of the vehicle in the 2D image and generation of a 2D detection box through a 2D vehicle detection process run on at least one computing device. The 2D vehicle detection process may include a categorisation process to categorise vehicles into one of a plurality of classes.
The location of the first corner may be determined through measurement of an angle between a first virtual line and the bottom edge of the 2D detection box, where the first virtual line extends between one of the two vanishing points and a midpoint of the bottom edge of the 2D detection box. The angle is measured clockwise from the bottom edge 141 to the first virtual line 180.
An equation may then be applied to the angle to calculate a fractional distance of the first corner from a left end of the bottom edge along the bottom edge. The equation may be:

p = 0 for θ ≤ (−π + α);
p = (θ + π − α) / (π/2 − α) for (−π + α) < θ < −π/2;
p = (π/2 + θ) / (π/2 − α) for −π/2 ≤ θ < −α; and
p = 1 for θ ≥ −α,

where α represents a small angle in radians, typically π/18, θ represents the angle in radians, p represents the fractional distance and π represents pi. As is conventional, the angle measured in the clockwise direction is negative.
Alternatively, the location of the first corner of a basal plane of the vehicle may be determined through other means such as through 2D or 3D image processing techniques.
In the fourth step 440, the location of the first corner is combined with the location of at least one of the two vanishing points using image processing techniques to determine if the vehicle is within the quadrilateral. This may involve the use of an image processor 1002 and an algorithm that can be run by the image processor 1002.
The fourth step 440 may involve the algorithm determining the location of the second corner of the basal plane by extrapolating the location of the first corner directly towards the first vanishing point to form a first extrapolated line and then calculating the intersection point of the 2D detection box with the first extrapolated line.
Similarly, the algorithm may determine the location of the third corner of the basal plane by extrapolating the location of the first corner directly towards the second vanishing point to form a second extrapolated line and then calculate the intersection point of the 2D detection box with the second extrapolated line.
The location of the fourth corner of the basal plane may be determined by the algorithm by: extrapolating the location of the second corner directly towards the second vanishing point to form a third extrapolated line; extrapolating the location of a third corner directly towards the first vanishing point to form a fourth extrapolated line; and locating the fourth corner of the basal plane as the intersection of the third extrapolated line and the fourth extrapolated line.
In this way, the assumption that the vehicle is aligned such that its sides are substantially parallel to the sides of the detection zone is utilised to determine the location of the four corners of the basal plane of the vehicle in the first 2D image.
The algorithm may then determine if the vehicle is within the quadrilateral by comparing the locations of each of the first corner, second corner, third corner and fourth corner of the basal plane with the quadrilateral.
The fourth step 440 may also include production of a signal in response to determining if the vehicle is within the quadrilateral.
For example, the four corners of the basal plane may be used to construct a virtual representation of the basal plane of the vehicle and the degree of overlap of this virtual representation of the basal plane of the vehicle with the quadrilateral used to determine if the vehicle is within the quadrilateral and produce a signal in response.
For example, a first signal may be produced, indicating that a vehicle is within the quadrilateral, if a vehicle percentage representing the fraction of virtual representation of the basal plane of the vehicle within the quadrilateral exceeds a first threshold value. A second signal may be produced, indicating that a vehicle is not fully within the quadrilateral, if the vehicle percentage falls below a second threshold value and is greater than zero. A third signal may be produced, indicating that a vehicle is not within the quadrilateral if the vehicle percentage is zero.
Alternately, a first signal may be produced, indicating that a vehicle is within the quadrilateral, if a quadrilateral percentage, representing the fraction of the quadrilateral within the virtual representation of the basal plane of the vehicle, exceeds a first threshold quadrilateral value. A second signal may be produced, indicating that a vehicle is not fully within the quadrilateral, if the quadrilateral percentage falls below a second threshold quadrilateral value and is greater than zero. A third signal may be produced, indicating that a vehicle is not within the quadrilateral, if the quadrilateral percentage is zero.

The method 400 shown in Figure 7 may optionally include a fifth step 450 whereby a memory node operable to store information relating to the signal(s) is utilised. For example, the memory node may count the number of first signals to estimate the number of times a vehicle is determined to be within the detection zone for traffic management purposes. In another example, the memory node may be used to count the number of third signals to estimate the number of available parking spaces in a car park or to count the number of second signals to estimate the number of vehicles parked across two parking spaces in a car park.
Data including vehicle percentages and quadrilateral percentages may be stored in the memory node and/or distributed to local or remote data processing devices.
The fifth step 450 may include the memory node providing information to a traffic management system to alter the phasing of traffic lights of the traffic management system. Live reporting of information from the memory node may allow current traffic conditions to be measured, better accounting for the current situation than estimates used to determine traffic light phasing. This may result in better regulated traffic flow and minimise a build-up of traffic. For example, instead of a first traffic light being phased periodically, build-up of traffic would be minimised if the first traffic light only turned red after a set number of vehicles were detected at a second crossflow traffic light.

The method 400 may be run continuously or periodically such that additional 2D images of the one or more 2D images are captured by the image capture device 1001. The additional 2D images are similar to either the first 2D image or the second 2D image but captured by the image capture device 1001 at a different time to the first 2D image or second 2D image. In this way, the method may continue to determine if the vehicle and optionally the further one or more vehicles on the surface are within the one or more detection zones over a period of time. This also advantageously allows the method 400 to accommodate an adjustment to the parameters of the image capture device 1001, such as a change in position of the image capture device 1001 relative to the detection zone(s).
While the steps 410, 420, 430, 440, 450 of the method 400 are ordered as the first 410, second 420, third 430 step and so on, it should be understood that the steps 410, 420, 430, 440, 450 may be completed concurrently or in a different order to the exemplar order described above. For example, the location of at least one of two vanishing points of a quadrilateral may be determined before the location of a first corner of a basal plane of the vehicle. Furthermore, additional features of the steps 410, 420, 430, 440, 450 described in relation to Figure 7 may not be present or may instead occur in alternate steps 410, 420, 430, 440, 450 to the specific exemplar combinations described above, within the scope of the claims.

Looking more generally at both the apparatus 1000 and the method 400, the apparatus 1000 may be operable to complete the method 400. Any or all features described in relation to the method 400 may be directly equivalent to features described in relation to the apparatus 1000. For example, the first 2D image 100 of the apparatus 1000 may be directly equivalent to the first 2D image of the method 400.
Image processing operations may be run on a single image processor 1002. For example, the first pair of lines 131, 231 and the second pair of lines 132, 232 of the quadrilateral 130 may be identified from the one or more 2D images 100, 200 and the algorithm run by the same image processor 1002. Alternately, multiple separate image processors 1002 may be utilised for each or a subset of operations requiring image processing techniques.

The image processor 1002 may be a 2D image processor 1002 and may only be configured to implement 2D image processing techniques and not be configured to implement 3D processing techniques.

The detection zone 20 may be partially obscured in either the first or second 2D image 100, 200 and may be completely obscured in the first 2D image 100. The detection zone 20 may be rectangular or square.
The detection zone 20 may be defined by lines in real-space on the surface 10 such as a car parking space in a car park or a painted box on a road surface. Alternately, the detection zone 20 may be defined through other means, for example a periodically spaced grid of dots on the surface or the detection zone 20 may be defined by a set distance from real-space lines measured in either the first or the second 2D image 100, 200.
The quadrilateral 130 representing the detection zone 20 may or may not be visible in the one or more 2D images 100, 200.
The plurality of classes into which the categorisation process, implemented by the 2D vehicle detection process, categorises vehicles 1 may include: car, bus, tram, lorry, pedestrian, motorbike, bicycle, or any other category of vehicle.
The parameters of the image capture device 1001 may be the same for capturing both the first 2D image 100 and the second 2D image 200 such that the quadrilaterals 130, 230 in the first and second 2D images 100, 200 are equivalent.
Alternately, the parameters of the image capture device 1001 may not be the same for capturing both the first 2D image 100 and the second 2D image 200. However, the quadrilateral 130 of the first 2D image 100 may still be determined from the second 2D image 200 by applying transformations dependent on the change in parameters of the image capture device 1001 between capturing the first 2D image 100 and the second 2D image 200.

Claims (13)

  1. An apparatus for determining if a vehicle on a surface is within a quadrilateral shaped detection zone of the surface that consists of a first and second pair of lines, each pair of lines consisting of non-neighbouring edges of the quadrilateral, the apparatus comprising: an image capture device operable to capture one or more 2D images of a scene that includes the portion of the surface that includes the detection zone, wherein the one or more 2D images comprise a first 2D image in which at least the vehicle is visible; an image processor; a memory storage device for storing the locations of a first vanishing point and a second vanishing point of a quadrilateral portion of the captured images that corresponds to the location of the detection zone; a means for determining the location of a first corner of a basal plane of the vehicle; and an algorithm that can be run by the image processor, wherein the algorithm determines if the vehicle is within the quadrilateral detection zone by combining the location of the first corner with the location of at least one of the two vanishing points.
  2. An apparatus according to claim 1, further comprising a means for: determining the location of the first vanishing point of the two vanishing points of the quadrilateral from the first pair of lines; and determining the location of the second vanishing point of the two vanishing points of the quadrilateral from the second pair of lines; optionally wherein the one or more 2D images of the scene further comprise a second 2D image in which the vehicle may or may not be visible.
  3. An apparatus according to claim 2, wherein the boundaries of the detection zone are physically marked on the surface and wherein the processor is arranged to identify the first pair of lines and the second pair of lines from either the first 2D image or the second 2D image, using 2D image processing techniques.
  4. An apparatus according to claim 2, wherein the boundaries of the detection zone are virtual boundaries and the apparatus further comprises a user interface including a graphical display unit to allow the first pair of lines and the second pair of lines to be manually input by the user drawing lines or marking the corners of the detection zone on a displayed image of the first 2D image or the second 2D image of the scene on the display unit.
  5. An apparatus according to any preceding claim, wherein the image processor is a 2D image processor only configured to implement 2D image processing techniques.
  6. An apparatus according to any preceding claim, further comprising at least one computing device configured to detect at least one vehicle in the first 2D image and generate a 2D detection box through a 2D vehicle detection process; and wherein the algorithm determines the location of a second corner of the basal plane by extrapolating the location of the first corner directly towards the first vanishing point to form a first extrapolated line and then calculating the intersection point of the 2D detection box with the first extrapolated line, optionally wherein the algorithm determines the location of a third corner of the basal plane by extrapolating the location of the first corner directly towards the second vanishing point to form a second extrapolated line and then calculating the intersection point of the 2D detection box with the second extrapolated line.
  7. An apparatus according to claim 5, wherein the means to determine the location of the first corner of a basal plane of the vehicle is operable to: measure an angle between a first virtual line and a bottom edge of the 2D detection box wherein the first virtual line extends between the first vanishing point and a midpoint of the bottom edge; and apply an equation to the angle to calculate a fractional distance of the first corner from a left end of the bottom edge along the bottom edge, optionally wherein α represents a small angle in radians, θ represents the angle in radians measured clockwise from the bottom edge to the first virtual line, p represents the fractional distance and the equation is any of: p = 0 for θ ≤ (−π + α), p = (θ + π − α) / (π/2 − α) for (−π + α) < θ < −π/2, p = (π/2 + θ) / (π/2 − α) for −π/2 ≤ θ < −α, and p = 1 for θ ≥ −α, optionally wherein α = π/18.
  8. An apparatus according to any preceding claim, wherein the algorithm determines the location of a fourth corner of the basal plane by: extrapolating the location of a second corner directly towards the second vanishing point to form a third extrapolated line; extrapolating the location of a third corner directly towards the first vanishing point to form a fourth extrapolated line; and locating the fourth corner of the basal plane as the intersection of the third extrapolated line and the fourth extrapolated line.
  9. An apparatus according to claim 6, wherein the 2D vehicle detection process includes a categorisation process to categorise vehicles into one of a plurality of classes.
  10. An apparatus according to any preceding claim, wherein the algorithm determines if the vehicle is within the quadrilateral by comparing the locations of each of the first corner, a second corner, a third corner and a fourth corner of the basal plane with the quadrilateral.
  11. An apparatus according to any preceding claim, wherein the vehicle is any of: a car, a pedestrian, a lorry, a motorbike, a bicycle, a bus, a tram or other type of vehicle.
  12. A method for determining if a vehicle on a surface, such as a road surface, is within a detection zone on the surface by: capturing one or more 2D images of a scene that includes the detection zone, where at least the vehicle is visible in a first 2D image of the one or more 2D images; determining the location of at least one of two vanishing points of a quadrilateral representing the boundaries of the detection zone in the one or more 2D images; determining the location of a first corner of a basal plane of the vehicle; and determining if the vehicle is within the quadrilateral by combining the location of the first corner with the location of at least one of the two vanishing points using image processing techniques.
  13. A method according to claim 12, wherein the image processing techniques consist of only 2D image processing techniques.
GB2207898.4A 2022-05-27 2022-05-27 Apparatus and method for determining vehicle location Pending GB2619098A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2207898.4A GB2619098A (en) 2022-05-27 2022-05-27 Apparatus and method for determining vehicle location

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2207898.4A GB2619098A (en) 2022-05-27 2022-05-27 Apparatus and method for determining vehicle location

Publications (2)

Publication Number Publication Date
GB202207898D0 GB202207898D0 (en) 2022-07-13
GB2619098A true GB2619098A (en) 2023-11-29

Family

ID=82323992

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2207898.4A Pending GB2619098A (en) 2022-05-27 2022-05-27 Apparatus and method for determining vehicle location

Country Status (1)

Country Link
GB (1) GB2619098A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102177614B1 (en) * 2020-01-09 2020-11-12 렉스젠(주) Lane generation system for collecting traffic information and method thereof
US20210342606A1 (en) * 2020-04-30 2021-11-04 Boe Technology Group Co., Ltd. Parking Detection Method, System, Processing Device and Storage Medium

Also Published As

Publication number Publication date
GB202207898D0 (en) 2022-07-13

Similar Documents

Publication Publication Date Title
CN102254318B (en) Method for measuring speed through vehicle road traffic videos based on image perspective projection transformation
US8849554B2 (en) Hybrid traffic system and associated method
CN104021676B (en) Vehicle location based on vehicle dynamic video features and vehicle speed measurement method
EP3637313A1 (en) Distance estimating method and apparatus
CN110322702A (en) A kind of Vehicular intelligent speed-measuring method based on Binocular Stereo Vision System
CN102467821B (en) Road distance detection method based on video image and apparatus thereof
US9251586B2 (en) Optical overhead wire measurement
LU502288B1 (en) Method and system for detecting position relation between vehicle and lane line, and storage medium
CN101952688A (en) Method for map matching with sensor detected objects
CN101727671A (en) Single camera calibration method based on road surface collinear three points and parallel line thereof
CN113034586B (en) Road inclination angle detection method and detection system
JP2012084024A (en) Intersection traffic flow measurement device
JP2020013573A (en) Three-dimensional image reconstruction method of vehicle
CN110986887B (en) Monocular camera-based distance measurement method, storage medium and monocular camera
GB2619098A (en) Apparatus and method for determining vehicle location
CN114384505A (en) Method and device for determining radar deflection angle
CN114782555B (en) Map mapping method, apparatus, and storage medium
Lin et al. Adaptive inverse perspective mapping transformation method for ballasted railway based on differential edge detection and improved perspective mapping model
Laureshyn et al. Automated video analysis as a tool for analysing road user behaviour
CN113847902A (en) Method and device for determining a distance of a vehicle to an object
CN115015909A (en) Radar data and video data fusion method and system based on perspective transformation
CN114239995A (en) Method and system for generating full-area cruising route, electronic device and storage medium
CN111372051A (en) Multi-camera linkage blind area detection method and device and electronic equipment
US20240070916A1 (en) Vehicle and Control Method Thereof
TWI836366B (en) Automatic parking mapping system mounted on vehicle