WO2021077157A1 - Sensor and associated system and method for detecting a vehicle - Google Patents


Info

Publication number
WO2021077157A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
vehicle
spatial region
predefined
predefined spatial
Prior art date
Application number
PCT/AU2020/051127
Other languages
French (fr)
Inventor
Matthew James Adams
Oscar FEHLBERG
Original Assignee
Summit Innovations Holdings Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2019903958A external-priority patent/AU2019903958A0/en
Application filed by Summit Innovations Holdings Pty Ltd filed Critical Summit Innovations Holdings Pty Ltd
Publication of WO2021077157A1 publication Critical patent/WO2021077157A1/en

Classifications

    • G01S13/04 Systems determining presence of a target
    • G01S13/06 Systems determining position data of a target
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/56 Discriminating between fixed and moving objects or between objects moving at different speeds, for presence detection
    • G01S13/867 Combination of radar systems with cameras
    • G01S7/027 Constructional details of housings, e.g. form, type, material or ruggedness
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T2207/10024 Color image
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 Recognition of vehicle lights or traffic lights
    • G06V2201/08 Detecting or categorising vehicles

Definitions

  • the present invention relates to a sensor and associated system and method for detecting a vehicle.
  • the present invention relates to the detection of vehicles moving along a predefined path.
  • a key source of revenue for quick service restaurants (QSRs) is the drive-through, wherein customers drive their vehicles to various points of a QSR to order, pay for and collect their meals, all from the convenience of their vehicle.
  • Data regarding the flow of vehicles through a drive-through can be instrumental for a QSR to improve its drive-through performance and efficiency, thereby increasing revenue generated by the drive-through.
  • loops are not able to accurately track certain vehicle behaviour, such as “pull-ins” and “pull-outs”, which can occur in “uncontrolled” drive-through lanes whereby vehicles do not necessarily complete or follow a sequential path through a drive-through.
  • loops are vulnerable to misattributing data between vehicles, thereby decreasing the accuracy of the collected drive-through data and compromising the efficiency of a QSR.
  • While alternatives to loop systems have been proposed, these alternatives have been relatively expensive, require additional computer equipment and infrastructure to operate, and require that the entire system be replaced even though it may be that only a single inductive loop has failed and requires replacement.
  • a sensor for determining whether a vehicle is present in a predefined spatial region, the sensor comprising a housing in which is enclosed: a camera module for capturing visual data associated with the spatial region; a radar module for capturing location data associated with objects in the spatial region; and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region.
  • the sensor comprises a first circuit board on which the radar module is mounted, and a second processing circuit board arranged behind and substantially parallel to the first circuit board.
  • both circuit boards comprise a through hole arranged such that the camera module can protrude therethrough and monitor the predefined spatial region.
  • the camera module and radar module are mounted within the housing such that a plane of the field of view monitored by the camera module is substantially parallel with a plane in which the radar module is mounted.
  • the camera module and the radar module are mounted within the housing such that they are substantially co-planar with one another.
  • the housing comprises: a front cover substantially formed from plastic and having a window through which the camera module can monitor the spatial region.
  • the housing also comprises a rear portion substantially formed from metal which acts as a thermal mass to passively cool the sensor.
  • the rear portion comprises heat dissipation elements which extend inwardly into the housing and contact one or both circuit boards and/or componentry mounted thereon so as to cool the sensor.
  • the sensor is configured to output one of two possible signals for a predefined spatial region at a given time, a first signal indicating that a vehicle is present in the predefined spatial region, and a second signal indicating that a vehicle is not present in the predefined spatial region.
  • the radar module is a two-dimensional spatial radar comprising a four radar element array.
  • the sensor is a power over ethernet sensor wherein the radar module comprises a 77 GHz radar.
  • the processor in determining whether a vehicle is present in a predefined spatial region, is configured to: correlate the visual data and the location data; determine the optical flow of an object in the predefined spatial region; determine the size and/or mass of the object; determine the speed of the object; and identify visual characteristics of the object, and based on this information determine whether the detected object is a vehicle.
  • the processor uses a machine learning process to determine whether a vehicle is present in the predefined spatial region.
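The fused decision described in the preceding points can be illustrated with a minimal sketch. Everything below is an illustrative assumption (the feature names, the thresholds, and the simple rule-based fusion; the patent itself contemplates a machine learning process), not an implementation taken from the patent:

```python
# Hypothetical sketch of the on-sensor decision pipeline. Feature names,
# thresholds and the rule-based fusion are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class ObjectFeatures:
    flow_magnitude: float  # optical flow of the object (pixels per frame)
    size_m2: float         # estimated footprint derived from radar returns
    speed_mps: float       # speed of the object from the radar module
    visual_score: float    # 0..1 confidence that the object looks like a vehicle


def is_vehicle(f: ObjectFeatures) -> bool:
    """Fuse camera- and radar-derived features into a binary decision."""
    large_enough = f.size_m2 >= 3.0        # vehicles are larger than pedestrians
    plausible_speed = f.speed_mps <= 20.0  # drive-through speeds are low
    looks_like_vehicle = f.visual_score >= 0.5
    return large_enough and plausible_speed and looks_like_vehicle


car = ObjectFeatures(flow_magnitude=4.2, size_m2=7.5, speed_mps=2.0, visual_score=0.9)
person = ObjectFeatures(flow_magnitude=1.1, size_m2=0.6, speed_mps=1.4, visual_score=0.1)
```

In a real deployment the hand-written `is_vehicle` rule would be replaced by the learned classifier the patent refers to; the point of the sketch is only that the decision is made on-sensor from fused features.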
  • the sensor of any one of the preceding claims wherein the predefined spatial region comprises two or more spatial subregions.
  • the subregions are substantially adjacent to one another and define a continuous path along which a vehicle can travel.
  • the subregions are virtual subregions which map onto the predefined spatial region.
  • a system for detecting the presence of a vehicle as it moves along a predefined path and through a series of predefined spatial regions along that path comprising: a plurality of sensors configured for determining whether a vehicle is present in a predefined spatial region, each sensor comprising a camera module for capturing visual data associated with the spatial region, a radar module for capturing location data associated with objects in the spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region, wherein each sensor is mounted at a respective position along the predefined path and configured to monitor respective predefined spatial regions on said path to determine whether a vehicle is present within each respective predefined spatial region; and a central processing unit in communication with each of the sensors and configured to receive data therefrom which either confirms or denies the presence of a vehicle in said predefined spatial regions.
  • the predefined path comprises at least a portion of the path of a drive-through of a quick service restaurant.
  • the predefined path comprises a virtual path which maps onto the path of the drive-through of the quick service restaurant.
  • data corresponding to the predefined virtual path is stored in the sensor.
  • a first sensor is mounted at a first location of the quick service restaurant so as to monitor a first spatial region associated with an entrance of the drive-through.
  • a second sensor is mounted at a second location of the quick service restaurant so as to monitor a second spatial region associated with an order point of the drive-through.
  • a third sensor is mounted at a third location of the quick service restaurant so as to monitor a third spatial region associated with a payment point of the drive-through.
  • a fourth sensor is mounted at a fourth location of the quick service restaurant so as to monitor a fourth spatial region associated with a food pick-up point of the drive-through.
  • a fifth sensor is mounted at a fifth location of the quick service restaurant so as to monitor a fifth spatial region associated with a waiting area where a vehicle can park while waiting for food to be delivered from the quick service restaurant to the vehicle.
  • each sensor is configured to output to the central processing unit one of two possible signals at a given time for a given predefined spatial region, a first signal indicating that a vehicle is present in the predefined spatial region, and a second signal indicating that a vehicle is not present in the predefined spatial region.
  • each sensor comprises a sensor according to the first aspect of the invention.
  • a system for determining the presence of a vehicle as it moves along a predefined path and through a series of predefined spatial regions along that path comprising: a plurality of sensors configured for sensing activity in respective predefined spatial regions, the plurality of sensors including: at least one camera-radar sensor comprising a camera module for capturing visual data associated with a predefined spatial region, a radar module for capturing location data associated with objects in the predefined spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the predefined spatial region; and at least one inductive loop for sensing vehicles in respective predefined spatial regions; wherein each sensor is positioned along the predefined path and configured to monitor respective predefined spatial regions on said path to determine if a vehicle is present within each respective predefined spatial region; and a central processing unit in communication with each of the sensors and configured to receive data therefrom which either confirms or denies the presence of a vehicle in each of the respective predefined spatial regions.
  • each sensor is configured to output to the central processing unit a signal indicative of whether a vehicle is or is not present at a respective predefined spatial region such that the central processing unit can correlate signals from the or each camera-radar sensor and the or each inductive loop sensor to determine the presence and track the path of a vehicle along the predefined path.
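Because every source, whether a camera-radar sensor or an inductive loop, reports only a binary presence signal per region, the central processing unit can treat them uniformly. The following sketch is a hedged illustration of such correlation; class and method names are assumed, not specified by the patent:

```python
# Illustrative sketch only: the central unit tracks occupancy per predefined
# spatial region and counts arrivals on a False -> True transition, regardless
# of whether the signal came from a camera-radar sensor or an inductive loop.
class CentralUnit:
    def __init__(self, regions):
        self.occupancy = {r: False for r in regions}
        self.arrivals = {r: 0 for r in regions}

    def on_signal(self, region: str, present: bool) -> None:
        # A new arrival is a transition from "not present" to "present".
        if present and not self.occupancy[region]:
            self.arrivals[region] += 1
        self.occupancy[region] = present


unit = CentralUnit(["entrance", "order", "payment", "pickup"])
for region in ["entrance", "order", "payment", "pickup"]:
    unit.on_signal(region, True)   # vehicle enters the region
    unit.on_signal(region, False)  # vehicle leaves the region
```

Correlating the ordered sequence of these transitions across regions is what lets the central unit track a single vehicle along the predefined path.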
  • each camera-radar sensor comprises a sensor according to the first aspect of the invention.
  • a method of determining the presence of a vehicle in a predefined spatial region, the method being performed within a sensor having a camera module for capturing visual data associated with the spatial region, a radar module for capturing location data associated with objects in the spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the predefined spatial region, the processor performing the steps of: correlating the visual data and the location data; determining the optical flow of an object in the predefined spatial region; determining the size and/or mass of the object; determining the speed of the object; and identifying visual characteristics of the object, and based on this information determining whether the detected object is a vehicle.
  • Figure 1 is a front perspective view of a sensor according to an embodiment of the invention.
  • Figure 2 is a rear perspective view of the sensor of Figure 1;
  • Figure 3 is a rear perspective view of a sensor according to an embodiment of the invention with a mounting element attached;
  • Figure 4 is an exploded view of the sensor of Figure 1;
  • Figure 5 is a cross sectional rear perspective view of the sensor of Figure 1;
  • Figure 6 is a diagram of the power architecture of a sensor according to embodiments of the invention.
  • a sensor 2 according to an embodiment of the invention is shown in Figures 1 to 3.
  • the sensor 2 is configured for detecting whether one or more vehicles is present in a predefined spatial region.
  • One or more such sensors 2 can be integrated along the path of a drive-through lane at a QSR and used to detect and track, in real time, the movement of vehicles through the drive-through lane.
  • Known sensors simply capture information, such as the optical data of a scene, and upload that information to external servers for processing to determine whether an object is present in a scene and whether that object is a vehicle. This process can be relatively slow and resource intensive, particularly because of the need to transfer a significant amount of data from the sensor to the external servers for processing.
  • the present sensor 2 can both capture and process data on its own.
  • the present sensor 2 can function as a standalone device which can determine on its own whether an object detected in a spatial region or scene is a vehicle.
  • the present sensor 2 can itself determine whether a sensed object is a vehicle, and is configured to simply deliver a simple signal or payload to the server indicating whether a vehicle is present in a predefined spatial region.
  • One advantage of performing this computing at the edge is that the central server need not be significantly upgraded in order to handle the large amounts of data that it would otherwise need to process. This is particularly beneficial if a QSR wants to monitor additional spatial regions along a drive-through path for vehicles.
  • the central server may require performance upgrades in order to handle the additional inflow of data that would be sent from known sensors.
  • no upgrade is needed since the present sensor 2 does not send all the gathered information for processing. Instead, it merely sends a simple signal to the central server indicative of whether a vehicle has been detected in the spatial region or not.
  • Because the present sensors 2 are self-contained vehicle-detecting devices, they can easily be retrofitted to and cooperate with a QSR's existing traffic-monitoring system, which often involves a series of inductive loops. For example, suppose an inductive loop at a QSR has unexpectedly malfunctioned. Rather than undergo the expensive and time-consuming process of repairing or replacing the loop (which can involve closing off the associated drive-through lane entirely), the QSR can simply install the present sensor 2 to create a loop equivalent. The sensor 2 can simply be mounted to a location along the drive-through path such that it faces and monitors a spatial region which was previously monitored by the inductive loop.
  • the sensor 2 can thus begin detecting objects entering the spatial region and determining whether those objects are vehicles and then reporting to the central server - in much the same way that the loop did - whether a vehicle is in the selected spatial region.
  • one or more of the present sensors 2 can be integrated into the drive-through path of an existing QSR and be used alongside, or instead of, the existing inductive loops that the QSR currently relies on to detect traffic flow. The sensor 2 will now be described with reference to the Figures.
  • FIGs 1 and 2 show the front and rear, respectively, of the present sensor 2.
  • the sensor 2 comprises a housing in which is enclosed a camera module 4, a radar module 6, a processor and other internal componentry.
  • the housing comprises a front cover 8, substantially formed from plastic, which fits over the sensor's 2 camera and radar modules 4, 6, and a substantially metallic rear portion 10 which secures to the front cover 8 and acts as a thermal mass to help cool the sensor 2.
  • An elastomeric seal 12 (Figure 4) is fitted between the front cover 8 and rear portion 10 and seals the interior of the sensor 2 against external contaminants, such as moisture and dust.
  • the rear portion 10 comprises a series of outwardly projecting cooling fins 12 which increase the surface area from which heat can be radiated away from the sensor 2.
  • the rear portion 10 is also provided with an ethernet plug 14 via which the sensor 2 can be powered via power over ethernet.
  • a mounting arm 16 can be secured to a mounting face 18 (Figure 2) of the rear portion 10 of the housing and allows the sensor 2 to be mounted at a suitable location of a QSR.
  • the mounting arm 16 comprises two ball joints which enable easy and flexible orienting of the sensor 2 once it has been mounted to a location of the QSR, such as a ceiling, wall or support column.
  • the radar module 6 is mounted towards a front end of the sensor 2 and is configured to capture location data in front of the sensor 2.
  • the front cover 8 is made from plastic to avoid interfering with signals emitted from and retrieved by the radar module 6.
  • the radar module 6 is a 77 GHz four-element phased-array radar configured to capture location information in a two-dimensional plane.
  • the front cover 8 is also configured with a window or lens filter 20 through which the camera module 4 can monitor the scene before it.
  • the lens 20 is formed from a polycarbonate material and may be provided with an oleophobic coating.
  • the present sensor 2 can more accurately and reliably detect and track the movement of vehicles as compared with a sensor which receives only visual data or only location data. Moreover, if either of the radar module 6 or the camera module 4 malfunctions, or if the effectiveness of one of the modules is impeded (e.g. a snowstorm may compromise the quality of the visual data gathered by the camera module 4), the other of the modules 4, 6 can still function to provide information about the presence and movement of vehicles.
  • a ring of IR LEDs 22 is arranged on a ring mount 24 around the camera module 4 to assist with capturing visual information in low light.
  • the camera module 4 is arranged behind the lens filter 20, which itself is secured with fasteners 28 to the front cover 8 via a lens plate 26.
  • the lens plate 26 acts to secure the lens arrangement in Figure 4 to the front cover 8.
  • An elastomeric O-ring 30 is provided between the lens plate 26 and lens filter 20 and helps to seal the lens components and camera module 4 against external contaminants, such as moisture and dust.
  • the rear portion and front cover can be releasably secured to one another via fasteners 32.
  • the sensor 2 comprises two main circuit boards: a front circuit board 34 on which is mounted the radar module 6, and a rear circuit board 36 configured to process visual data from the camera module 4 and location data from the radar module 6 to determine whether a detected object in a predefined spatial region is a vehicle.
  • Both circuit boards 34, 36 are arranged in a stacked manner, with the processing circuit board 36 arranged behind and parallel to the front radar-supporting circuit board 34.
  • Both circuit boards 34, 36 comprise an aligned through hole 38 through which at least a portion of the camera module 4 can protrude.
  • the sensor 2 also comprises an ethernet board 40 via which the sensor 2 can be powered via power over ethernet.
  • the rear metal backing 10 of the sensor 2 comprises heat dissipating elements in the form of metallic arms 42a, 42b which extend into the interior of the housing and physically contact the circuit boards and internal components.
  • an upper metallic arm 42a projects towards and contacts a rear 44 of the camera module 4
  • a lower metallic arm 42b projects towards and contacts a rear of the processing circuit board 36.
  • the metallic arms 42a, 42b act as heatsinks through which heat in the camera module 4 and processing circuit board 36 can dissipate.
  • no active cooling components (e.g. fans) are required; the sensor 2 is cooled passively.
  • Prior to the installation of any sensors 2, this hypothetical QSR has inductive loops buried in the asphalt of its drive-through lane at each of the above four points. These inductive loops are connected with a central processing unit or server of the QSR and may trigger certain actions based on whether a vehicle is detected.
  • the loop at the order point may be associated with one of the QSR's audio pathways (e.g. microphone and speaker system) which activates to allow a customer and QSR staff member to remotely communicate with one another regarding what the customer wishes to order.
  • the central server may be programmed to activate the audio pathway to enable the customer to interact with a staff member.
  • a loop which detects a vehicle present at the collection point may trigger an alert to notify a staff member to attend to the customer with their food order.
  • the QSR can simply install the present sensor 2 at an appropriate location so that it faces the spatial region previously monitored by the inductive loop.
  • the sensor 2 could be mounted to a building structure proximate to the order point, and then be oriented so that the camera and radar modules 4, 6 are pointed towards an area on the ground of the drive-through lane where it is important to know whether there is a vehicle there or not.
  • the power over ethernet connection not only supplies power to the sensor 2, but also allows the sensor to communicate with the QSR's existing central server (which the other inductive loops are in communication with).
  • the scene being surveyed by the sensor 2 can be viewed remotely on another electronic device, such as a tablet.
  • the specific spatial region to be monitored can thus be selected, for example, by selecting that spatial region on the tablet.
  • the sensor 2 is thus configured to monitor a predefined spatial region as selected by the user.
  • the camera module 4 of the sensor 2 captures visual data associated with the predefined spatial region
  • the radar module 6 captures location data associated with objects in the predefined spatial region.
  • the processor of the sensor 2 is configured to receive data from both the camera module 4 and the radar module 6, and to process this data to determine whether an object detected in the predefined spatial region is indeed a vehicle.
  • the processor may be configured to: correlate the visual data and the location data; determine the optical flow of an object in the predefined spatial region; determine the size and/or mass of the object; determine the speed of the object; and identify visual characteristics of the object, and based on this information determine whether the detected object is a vehicle.
  • the sensor 2 can then send a simple signal to the QSR's server to indicate whether a vehicle is present in the predefined spatial region. For example, the sensor 2 may simply send a binary signal, with 0 indicating that a vehicle is not present at the spatial region, and 1 indicating that a vehicle is present at the spatial region.
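A minimal sketch of such a payload follows. The JSON-over-ethernet encoding, field names and sensor identifier are all assumptions for illustration; the patent specifies only that the signal is binary (0 or 1), not any particular wire format:

```python
# Assumed payload format: the sensor sends only an identifier, a region name
# and a 0/1 presence flag, rather than raw camera or radar data.
import json


def make_payload(sensor_id: str, region: str, vehicle_present: bool) -> bytes:
    """Encode the binary presence decision as a small message for the server."""
    return json.dumps({
        "sensor": sensor_id,
        "region": region,
        "present": 1 if vehicle_present else 0,
    }).encode("utf-8")


msg = make_payload("sensor-02", "order-point", True)
```

A message of a few dozen bytes per state change, rather than a continuous video or radar stream, is what allows the existing central server to scale to additional monitored regions without upgrades.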
  • the sensor 2 can be programmed with a machine learning process or module to improve its ability to identify vehicles over time.
  • a single sensor 2 is integrated into the drive-through path of a QSR and works in cooperation with the existing inductive loops to detect and track vehicle flow. It is of course possible for multiple sensors 2 to be installed along the drive-through path such that they effectively replace the inductive loops altogether.
  • a sensor 2 can be configured to monitor more than one predefined spatial region.
  • an appropriately mounted sensor 2 could be used to monitor two separate drive-through lanes which are next to one another.
  • a single sensor 2 could replace two or more inductive loops.
  • an appropriately mounted sensor 2 could monitor the entire stretch of road: a user could configure the sensor 2 to monitor a first predefined spatial region associated with the payment point, and a second predefined spatial region associated with the collection point. This is another example whereby a single sensor 2 can be used to perform the functionality of two or more inductive loops.
  • loops can have difficulty accurately tracking such vehicle behaviour. For example, in a drive-through path with a series of loops arranged in a sequence, if a first vehicle leaves the drive-through path midway, the loops will not be able to identify this behaviour. Instead, the loop that would have been downstream of the first vehicle is now triggered by a second vehicle directly behind the first vehicle, but this information is not conveyed to the QSR's central servers, which instead misattributes data from the first vehicle to the second vehicle.
  • loops may not identify when a vehicle has joined a drive-through path midway and may instead attribute the data of an upstream vehicle to the vehicle that has newly joined the queue. While such misattributions of data are relatively insignificant and can be somewhat corrected for when there are not many vehicles in the drive-through path (e.g. by a timing system which recognises that a vehicle has not triggered a downstream loop for an inordinate amount of time, or has not lingered at a loop for a sufficient amount of time), the error becomes less detectable and correctable during busier periods when there is a steady flow of traffic through a drive-through, in which case each vehicle downstream of the first misattribution is also attributed with incorrect vehicle data. Because loops are unable to accurately identify events such as vehicle pull-ins and pull-outs, loops can fail to track the flow of vehicles through uncontrolled drive-through lanes with 100% accuracy.
  • one or more of the present sensors 2 can be integrated with uncontrolled drive-throughs and accurately identify vehicle behaviour such as pull-ins and pull-outs, such that data from one vehicle is not misattributed to another.
  • a vehicle-tracking system which utilises one or more of the present sensors 2 can provide more accurate vehicle data than known loop systems and may even achieve 100% accurate vehicle tracking in many drive-through configurations.
  • the sensor 2 can monitor activity within a more expansive spatial region chosen by the user when deciding on the predefined spatial region to monitor.
  • when configuring the sensor 2 to monitor a spatial region associated with a payment point, the sensor 2 is not merely monitoring the location that would be monitored by a loop; the predefined spatial region can comprise several continuous and adjacent subregions which encompass the payment point as well as several metres upstream and downstream thereof, and even to one or both sides of the payment point. Because the spatial region monitored by a sensor 2 is more expansive than a loop, the sensor 2 can thus detect and track vehicles beyond fixed points (e.g. the aforementioned four points of the hypothetical controlled drive-through); the sensor 2 can track spatial regions between such defined points.
  • a sensor 2 can be mounted to the QSR to monitor a ten-metre stretch in the drive-through path, wherein the ten-metre stretch is segmented into three continuous and bounded subregions. The sensor 2 could thus detect and track the presence of a vehicle as it enters the first subregion, lingers in the first subregion, leaves the first subregion and enters the second subregion, and thus conclude that the same vehicle has moved from the first subregion to the second subregion.
  • the sensor 2 could thus conclude that the vehicle has pulled-out from the drive-through path and thus adjust its vehicle data to recognise that the next vehicle that enters the third subregion is not the vehicle which has just pulled-out of the drive-through.
  • This sensor 2 can of course be similarly integrated to function with a parallel lane drive-through configuration wherein vehicles from both the inner and outer lanes can pull-out at any given time.
  • a sensor 2 could be positioned to survey a spatial region wherein a first subregion monitors a first upstream lane, a second subregion monitors a second upstream lane, a third subregion monitors the spatial region where the vehicles merge, and a fourth subregion monitors a downstream region where the vehicles have merged into a single lane. By detecting and tracking the flow of vehicles as they move between the subregions, the sensor 2 is thus able to determine the order in which the vehicles have merged and therefore attribute to respective vehicles the correct vehicle data.
  • the sensor 2 may be configured to monitor wait bays and kerbside pickup locations, wherein if the sensor 2 detects a vehicle has entered such a bay or location, the QSR will be notified that a customer is waiting for their order in said bay or location.
  • One or more such sensors 2 could of course be similarly used to monitor and detect the presence of vehicles in multiple wait bays, wherein customers simply park at respective waiting bays and order and wait for their food.
  • the present sensors 2 may also be configured to communicate vehicle data with one another to further improve the accuracy and reliability of vehicle tracking.
  • the present sensors 2 can of course be configured to provide more than just a binary signal to a QSR's central server which confirms or denies the presence of a vehicle at a specific location.
  • the sensor 2 may also communicate to the central server certain characteristics or aspects about the vehicle, such as the vehicle's colour, license plate number, and how many people are in each vehicle.
  • This data could be stored and analysed by the QSR to further improve the QSR's ability to predict drive-through trends and thus optimise its drive-through processes.
  • the visual data of a vehicle captured by the sensor 2 can be processed and converted into a virtual visual representation of the vehicle for display on screen.
  • the virtual vehicle can be displayed, for example, overlapping with a virtual or real representation of the predefined path over which the vehicle's behaviour is tracked. This can help the QSR better visualise the flow of vehicles through its drive-through, potentially helping to spot bottlenecks and other inefficiencies which may otherwise be difficult to detect.
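The pull-in and pull-out tracking described in the points above can be illustrated with a short sketch. The function, its name and the observation format below are hypothetical illustrations only, not taken from this disclosure: the sketch assumes the sensor 2 reports, at each time step, which ordered subregion (if any) a tracked vehicle occupies, and infers a pull-out when the vehicle disappears before reaching the final subregion.

```python
# Minimal sketch of subregion-based exit classification (hypothetical names).
# Assumes ordered subregions 0..n-1 along the drive-through path and one
# observation per time step of the subregion a tracked vehicle occupies
# (None = the vehicle is no longer seen in any subregion).

def classify_exit(observations, n_subregions):
    """Return 'completed' if the vehicle reached the final subregion
    before disappearing, 'pulled_out' if it vanished midway."""
    last_seen = None
    for region in observations:
        if region is not None:
            last_seen = region
    if last_seen is None:
        return 'not_seen'
    return 'completed' if last_seen == n_subregions - 1 else 'pulled_out'

# A vehicle that moves through all three subregions has completed the path.
print(classify_exit([0, 0, 1, 2, None], 3))   # completed
# A vehicle that vanishes after the second subregion has pulled out.
print(classify_exit([0, 1, None], 3))         # pulled_out
```

A tracker built this way can also attribute merge order in parallel-lane configurations by recording the time step at which each vehicle first enters the merged subregion.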

Abstract

A sensor for determining whether a vehicle is present in a predefined spatial region, the sensor comprising a housing in which is enclosed: a camera module for capturing visual data associated with the spatial region; a radar module for capturing location data associated with objects in the spatial region; and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region.

Description

Sensor and associated system and method for detecting a vehicle
Field of the invention
The present invention relates to a sensor and associated system and method for detecting a vehicle. In particular, the present invention relates to the detection of vehicles moving along a predefined path.
Background
A key source of revenue for quick service restaurants (QSR) is the drive-through, wherein customers drive their vehicles to various points of a QSR to order, pay for and collect their meals, all from the convenience of their vehicle. Data regarding the flow of vehicles through a drive-through can be instrumental for a QSR to improve its drive-through performance and efficiency, thereby increasing revenue generated by the drive-through.
Most drive-throughs use inductive loop traffic detectors to detect the presence of vehicles moving through the drive-through. However, the installation, maintenance and repair of such loop detectors is relatively time and labour-intensive and can result in significant drive-through downtime, thereby negatively impacting a QSR's performance.
If a loop malfunctions and/or needs replacement, at least a portion of the drive-through, if not an entire drive-through lane, often needs to be closed off. This can negatively impact sales and customer experience. Additionally, loop systems are relatively expensive to replace, since replacement involves removing asphalt and concrete. This time and monetary cost can be especially problematic if the configuration of one or more drive-through lanes is to change, especially to accommodate a growing QSR. Given how competitive the QSR industry is, such additional costs and drive-through downtime can impact a QSR's profitability and viability.
Additionally, loops are not able to accurately track certain vehicle behaviour, such as "pull-ins" and "pull-outs" which can occur in "uncontrolled" drive-through lanes whereby vehicles do not necessarily complete or follow a sequential path through a drive-through. In such drive-through configurations, loops are vulnerable to misattributing data between vehicles, thereby decreasing the accuracy of the collected drive-through data and compromising the efficiency of a QSR. While alternatives to loop systems have been proposed, these alternatives have been relatively expensive, require additional computer equipment and infrastructure to operate, and require that the entire system be replaced even though it may be that only a single inductive loop has failed and requires replacement.
There is a need to address the above, and/or at least provide a useful alternative.
Summary
According to a first aspect of the present invention, there is provided a sensor for determining whether a vehicle is present in a predefined spatial region, the sensor comprising a housing in which is enclosed: a camera module for capturing visual data associated with the spatial region; a radar module for capturing location data associated with objects in the spatial region; and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region.
In an embodiment of the invention, the sensor comprises a first circuit board on which the radar module is mounted, and a second processing circuit board arranged behind and substantially parallel to the first circuit board.
In an embodiment of the invention, both circuit boards comprise a through hole arranged such that the camera module can protrude therethrough and monitor the predefined spatial region.
In an embodiment of the invention, the camera module and radar module are mounted within the housing such that a plane of the field of view monitored by the camera module is substantially parallel with a plane in which the radar module is mounted.
In an embodiment of the invention, the camera module and the radar module are mounted within the housing such that they are substantially co-planar with one another.
In an embodiment of the invention, the housing comprises: a front cover substantially formed from plastic and having a window through which the camera module can monitor the spatial region.
In an embodiment of the invention, the housing also comprises a rear portion substantially formed from metal which acts as a thermal mass to passively cool the sensor.
In an embodiment of the invention, the rear portion comprises heat dissipation elements which extend inwardly into the housing and contact one or both circuit boards and/or componentry mounted thereon so as to cool the sensor.
In an embodiment of the invention, the sensor is configured to output one of two possible signals for a predefined spatial region at a given time, a first signal indicating that a vehicle is present in the predefined spatial region, and a second signal indicating that a vehicle is not present in the predefined spatial region.
In an embodiment of the invention, the radar module is a two-dimensional spatial radar comprising a four radar element array.
In an embodiment of the invention, the sensor is a power over ethernet sensor wherein the radar module comprises a 77 GHz radar.
In an embodiment of the invention, in determining whether a vehicle is present in a predefined spatial region, the processor is configured to: correlate the visual data and the location data; determine the optical flow of an object in the predefined spatial region; determine the size and/or mass of the object; determine the speed of the object; and identify visual characteristics of the object, and based on this information determine whether the detected object is a vehicle.
In an embodiment of the invention, the processor uses a machine learning process to determine whether a vehicle is present in the predefined spatial region.
In an embodiment of the invention, the predefined spatial region comprises two or more spatial subregions.
In an embodiment of the invention, the subregions are substantially adjacent to one another and define a continuous path along which a vehicle can travel.
In an embodiment of the invention, the subregions are virtual subregions which map onto the predefined spatial region.
In an embodiment of the invention, data corresponding to the virtual subregions are stored in the sensor.
According to a second aspect of the present invention, there is provided a system for detecting the presence of a vehicle as it moves along a predefined path and through a series of predefined spatial regions along that path, comprising: a plurality of sensors configured for determining whether a vehicle is present in a predefined spatial region, each sensor comprising a camera module for capturing visual data associated with the spatial region, a radar module for capturing location data associated with objects in the spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region, wherein each sensor is mounted at a respective position along the predefined path and configured to monitor respective predefined spatial regions on said path to determine whether a vehicle is present within each respective predefined spatial region; and a central processing unit in communication with each of the sensors and configured to receive data therefrom which either confirms or denies the presence of a vehicle in said predefined spatial regions.
In an embodiment of the invention, the predefined path comprises at least a portion of the path of a drive-through of a quick service restaurant.
In an embodiment of the invention, the predefined path comprises a virtual path which maps onto the path of the drive-through of the quick service restaurant.
In an embodiment of the invention, data corresponding to the predefined virtual path is stored in the sensor.
In an embodiment of the invention, a first sensor is mounted at a first location of the quick service restaurant so as to monitor a first spatial region associated with an entrance of the drive-through.
In an embodiment of the invention, a second sensor is mounted at a second location of the quick service restaurant so as to monitor a second spatial region associated with an order point of the drive-through.
In an embodiment of the invention, a third sensor is mounted at a third location of the quick service restaurant so as to monitor a third spatial region associated with a payment point of the drive-through.
In an embodiment of the invention, a fourth sensor is mounted at a fourth location of the quick service restaurant so as to monitor a fourth spatial region associated with a food pick-up point of the drive-through.
In an embodiment of the invention, a fifth sensor is mounted at a fifth location of the quick service restaurant so as to monitor a fifth spatial region associated with a waiting area where a vehicle can park while waiting for food to be delivered from the quick service restaurant to the vehicle.
In an embodiment of the invention, each sensor is configured to output to the central processing unit one of two possible signals at a given time for a given predefined spatial region, a first signal indicating that a vehicle is present in the predefined spatial region, and a second signal indicating that a vehicle is not present in the predefined spatial region.
In an embodiment of the invention, each sensor comprises a sensor according to the first aspect of the invention.
According to a third aspect of the invention, there is provided a system for determining the presence of a vehicle as it moves along a predefined path and through a series of predefined spatial regions along that path, comprising: a plurality of sensors configured for sensing activity in respective predefined spatial regions, the plurality of sensors including: at least one camera-radar sensor, each such sensor comprising a camera module for capturing visual data associated with a predefined spatial region, a radar module for capturing location data associated with objects in the predefined spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the predefined spatial region; and at least one inductive loop for sensing vehicles in respective predefined spatial regions; wherein each sensor is positioned along the predefined path and configured to monitor respective predefined spatial regions on said path to determine if a vehicle is present within each respective predefined spatial region; and a central processing unit in communication with each of the sensors and configured to receive data therefrom which either confirms or denies the presence of a vehicle in each of the respective predefined spatial regions.
In an embodiment of the invention, each sensor is configured to output to the central processing unit a signal indicative of whether a vehicle is or is not present at a respective predefined spatial region such that the central processing unit can correlate signals from the or each camera-radar sensor and the or each inductive loop sensor to determine the presence and track the path of a vehicle along the predefined path.
In an embodiment of the invention, each camera-radar sensor comprises a sensor according to the first aspect of the invention.
According to a fourth aspect of the present invention there is provided a method of determining the presence of a vehicle in a predefined spatial region, the method being performed within a sensor having a camera module for capturing visual data associated with the spatial region, a radar module for capturing location data associated with objects in the spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the predefined spatial region, the processor performing the steps of: correlating the visual data and the location data; determining the optical flow of an object in the predefined spatial region; determining the size and/or mass of the object; determining the speed of the object; and identifying visual characteristics of the object, and based on this information determining whether the detected object is a vehicle.
Brief description of the drawings
In order that the invention may be more easily understood, an embodiment will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a front perspective view of a sensor according to an embodiment of the invention;
Figure 2 is a rear perspective view of the sensor of Figure 1; Figure 3 is a rear perspective view of a sensor according to an embodiment of the invention with a mounting element attached;
Figure 4 is an exploded view of the sensor of Figure 1;
Figure 5 is a cross sectional rear perspective view of the sensor of Figure 1; and
Figure 6 is a diagram of the power architecture of a sensor according to embodiments of the invention.
Detailed description
A sensor 2 according to an embodiment of the invention is shown in Figures 1 to 3. The sensor 2 is configured for detecting whether one or more vehicles is present in a predefined spatial region. One or more such sensors 2 can be integrated along the path of a drive-through lane at a QSR and used to detect and track, in real time, the movement of vehicles through the drive-through lane.
Known sensors simply capture information, such as the optical data of a scene, and upload that information to external servers for processing to determine whether an object is present in a scene and whether that object is a vehicle. This process can be relatively slow and resource intensive, particularly because of the need to transfer a significant amount of data from the sensor to the external servers for processing. Unlike such sensors, the present sensor 2 can both capture and process data on its own. The present sensor 2 can function as a standalone device which can determine on its own whether an object detected in a spatial region or scene is a vehicle. Rather than send relatively large amounts of data to a central server for processing, the present sensor 2 can itself determine whether a sensed object is a vehicle, and is configured to deliver a simple signal or payload to the server indicating whether a vehicle is present in a predefined spatial region. One advantage of performing this computing at the edge is that the central server need not be significantly upgraded in order to handle the large amounts of data that it would otherwise need to process. This is particularly beneficial if a QSR wants to monitor additional spatial regions along a drive-through path for vehicles. Using known sensors, the central server may require performance upgrades in order to handle the additional inflow of data that would be sent from known sensors. By using sensors 2 embodying the present invention, no upgrade is needed since the present sensor 2 does not send all the gathered information for processing. Instead, it merely sends a simple signal to the central server indicative of whether a vehicle has been detected in the spatial region or not.
Because the present sensors 2 are self-contained vehicle-detecting devices, they can easily be retrofitted to and cooperate with a QSR's existing traffic-monitoring system, which often involves a series of inductive loops. For example, suppose an inductive loop at a QSR has unexpectedly malfunctioned. Rather than undergo the expensive and time-consuming process of repairing or replacing the loop (which can involve closing off the associated drive-through lane entirely), the QSR can simply install the present sensor 2 to create a loop equivalent. The sensor 2 can simply be mounted to a location along the drive-through path such that it faces and monitors a spatial region which was previously monitored by the inductive loop. The sensor 2 can thus begin detecting objects entering the spatial region and determining whether those objects are vehicles and then reporting to the central server - in much the same way that the loop did - whether a vehicle is in the selected spatial region. As such, one or more of the present sensors 2 can be integrated into the drive-through path of an existing QSR and be used alongside, or instead of, existing inductive loops that the QSR currently relies on to detect traffic flow. The sensor 2 will now be described with reference to the Figures.
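By way of illustration only, the minimal per-region signal contemplated above might resemble the following sketch; the field names and JSON encoding are assumptions for the example, not part of this disclosure.

```python
import json

# Hypothetical payload from the present sensor 2: a simple per-region
# presence signal, rather than raw camera frames or radar returns.
payload = {"sensor_id": "sensor-2", "region": "pay-point", "vehicle_present": 1}
message = json.dumps(payload)

# The whole message is tens of bytes, versus the megabytes per second a
# sensor streaming raw video to a central server would need to transfer.
print(message)
```

Because the message carries only the detection result, an existing central server can consume it with no capacity upgrade, much as it consumes the binary output of an inductive loop.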
Figures 1 and 2 show the front and rear, respectively, of the present sensor 2. The sensor 2 comprises a housing in which is enclosed a camera module 4, a radar module 6, a processor and other internal componentry. The housing comprises a front cover 8, substantially formed from plastic, which fits over the camera and radar modules 4, 6 of the sensor 2, and a substantially metallic rear portion 10 which secures to the front cover 8 and acts as a thermal mass to help cool the sensor 2. An elastomeric seal 12 (Figure 4) is fitted between the front cover 8 and rear portion 10 and seals the interior of the sensor 2 against external contaminants, such as moisture and dust. The rear portion 10 comprises a series of outwardly projecting cooling fins 12 which increase the surface area from which heat can be radiated away from the sensor 2. The rear portion 10 is also provided with an ethernet plug 14 via which the sensor 2 can be powered via power over ethernet.
Referring to Figure 3, a mounting arm 16 can be secured to a mounting face 18 (Figure 2) of the rear portion 10 of the housing and allows the sensor 2 to be mounted at a suitable location of a QSR. In the depicted embodiment, the mounting arm 16 comprises two ball joints which enable easy and flexible orienting of the sensor 2 once it has been mounted to a location of the QSR, such as a ceiling, wall or support column.
Referring to Figures 4 and 5, the radar module 6 is mounted towards a front end of the sensor 2 and is configured to capture location data in front of the sensor 2. As such, it is advantageous that the front cover 8 is made from plastic to avoid interfering with signals emitted from and retrieved by the radar module 6. In the depicted embodiment, the radar module 6 is a 77 GHz four-element phased-array radar configured to capture location information in a two-dimensional plane. The front cover 8 is also configured with a window or lens filter 20 through which the camera module 4 can monitor the scene before it. In embodiments of the invention, the lens 20 is formed from a polycarbonate material and may be provided with an oleophobic coating.
By fusing location data from the radar module 6 with visual data from the camera module 4, the present sensor 2 can more accurately and reliably detect and track the movement of vehicles as compared with a sensor which receives only visual data or only location data. Moreover, if either of the radar module 6 or the camera module 4 malfunctions, or if the effectiveness of one of the modules is impeded (e.g. a snowstorm may compromise the quality of the visual data gathered by the camera module 4), the other of the modules 4, 6 can still function to provide information about the presence and movement of vehicles.
Referring to Figures 1 and 4, a ring of IR LEDs 22 is arranged on a ring mount 24 around the camera module 4 to assist with capturing visual information in low light. The camera module 4 is arranged behind the lens filter 20, which itself is secured with fasteners 28 to the front cover 8 via a lens plate 26. The lens plate 26 acts to secure the lens arrangement in Figure 4 to the front portion 8. An elastomeric O-ring 30 is provided between the lens plate 26 and lens filter 20 and helps to seal the lens components and camera module 4 against external contaminants, such as moisture and dust. The rear portion and front cover can be releasably secured to one another via fasteners 32.
Referring to Figures 4 and 5, the circuit boards and componentry are arranged in a functional and compact manner, resulting in a housing and sensor 2 of reduced size. The sensor 2 comprises two main circuit boards: a front circuit board 34 on which is mounted the radar module 6, and a rear circuit board 36 configured to process visual data from the camera module 4 and location data from the radar module 6 to determine whether a detected object in a predefined spatial region is a vehicle. Both circuit boards 34, 36 are arranged in a stacked manner, with the processing circuit board 36 arranged behind and parallel to the front radar-supporting circuit board 34. Both circuit boards 34, 36 comprise an aligned through hole 38 through which at least a portion of the camera module 4 can protrude. The sensor 2 also comprises an ethernet board 40 via which the sensor 2 can be powered via power over ethernet.
Referring to Figure 5, the rear metal backing 10 of the sensor 2 comprises heat dissipating elements in the form of metallic arms 42a, 42b which extend into the interior of the housing and physically contact the circuit boards and internal components. In the embodiment shown in Figure 5, an upper metallic arm 42a projects towards and contacts a rear 44 of the camera module 4, and a lower metallic arm 42b projects towards and contacts a rear of the processing circuit board 36. The metallic arms 42a, 42b act as heatsinks through which heat in the camera module 4 and processing circuit board 36 can dissipate. By utilising this form of passive cooling, no active cooling components (e.g. fans) are required, thereby reducing the size of the sensor 2 as well as the amount of power required to operate each sensor 2.
The functionality of a sensor 2 will now be described with reference to a simple "controlled" drive-through lane of a QSR. In this hypothetical drive-through lane, there are four key spatial regions where it is important for the QSR to know whether there is a vehicle or not in those regions:
1. the drive-through entrance (the beginning of the drive-through lane);
2. the order point (where customers order their meals);
3. the pay point (where customers pay for their meals); and
4. the collection point (where customers pick up their meals).
Prior to the installation of any sensors 2, this hypothetical QSR has inductive loops buried in the asphalt of its drive-through lane at each of the above four points. These inductive loops are connected with a central processing unit or server of the QSR and may trigger certain actions based on whether a vehicle is detected. For example, the loop at the order point may be associated with one of the QSR's audio pathways (e.g. microphone and speaker system) which activates to allow a customer and QSR staff member to remotely communicate with one another regarding what the customer wishes to order. In this situation, when the loop at the order point detects that a vehicle is present at the order point, upon receipt of this information the central server may be programmed to activate the audio pathway to enable the customer to interact with a staff member. Similarly, a loop which detects a vehicle present at the collection point may trigger an alert to notify a staff member to attend to the customer with their food order.
Suppose the inductive loop at the payment point of this QSR stops working. Rather than tear up the asphalt to maintain or replace the loop, the QSR can simply install the present sensor 2 at an appropriate location so that it faces the spatial region previously monitored by the inductive loop. For example, the sensor 2 could be mounted to a building structure proximate to the payment point, and then be oriented so that the camera and radar modules 4, 6 are pointed towards an area on the ground of the drive-through lane where it is important to know whether there is a vehicle there or not. The power over ethernet connection not only supplies power to the sensor 2, but also allows the sensor to communicate with the QSR's existing central server (which the other inductive loops are in communication with).
Upon first booting up the sensor 2, the scene being surveyed by the sensor 2 can be viewed remotely on another electronic device, such as a tablet. The specific spatial region to be monitored can thus be selected, for example, by selecting that spatial region on the tablet. The sensor 2 is thus configured to monitor a predefined spatial region as selected by the user.
During operation, the camera module 4 of the sensor 2 captures visual data associated with the predefined spatial region, and the radar module 6 captures location data associated with objects in the predefined spatial region. The processor of the sensor 2 is configured to receive data from both the camera module 4 and the radar module 6, and to process this data to determine whether an object detected in the predefined spatial region is indeed a vehicle.
To identify whether an object in the predefined spatial region is a vehicle, the processor may be configured to:
- correlate the visual data and the location data;
- perform a background subtraction to detect moving objects in the spatial region;
- analyse the optical flow or direction of travel of the object;
- determine the size and/or mass of the object;
- determine the speed of the object; and
- identify visual characteristics of the object, such as the windscreen or centroid of a vehicle, and based on this information determine whether the detected object is indeed a vehicle.
The sensor 2 can then send a simple signal to the QSR's server to indicate whether a vehicle is present in the predefined spatial region. For example, the sensor 2 may simply send a binary signal, with 0 indicating that a vehicle is not present at the spatial region, and 1 indicating that a vehicle is present at the spatial region.
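The decision steps above can be sketched as a simple rule-based fusion. The thresholds, function names and fusion rule below are hypothetical illustrations only, not the patent's actual implementation; the determination may equally be made by a trained machine learning process.

```python
# Hypothetical rule-based sketch of the vehicle-detection pipeline.
# The thresholds and the fusion rule are illustrative assumptions only.

def is_vehicle(size_m2, speed_ms, has_windscreen, moving_with_flow):
    """Fuse simple cues derived from the camera and radar modules into a
    binary vehicle/not-vehicle decision for one detected object."""
    large_enough = size_m2 > 3.0              # cars exceed ~3 m^2 in plan view
    plausible_speed = 0.0 <= speed_ms < 20.0  # drive-through speeds are low
    return large_enough and plausible_speed and (has_windscreen or moving_with_flow)

def region_signal(objects):
    """Binary signal for one predefined spatial region:
    1 if any detected object classifies as a vehicle, else 0."""
    return 1 if any(is_vehicle(*o) for o in objects) else 0

# A car-sized, slow, windscreen-bearing object yields signal 1.
print(region_signal([(7.5, 2.0, True, True)]))   # 1
# A pedestrian-sized object yields signal 0.
print(region_signal([(0.6, 1.4, False, True)]))  # 0
```

Only the final 0/1 value per region needs to leave the sensor, which is what keeps the load on the central server comparable to that of an inductive loop.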
The sensor 2 can be programmed with a machine learning process or module to improve its ability to identify vehicles over time.
In the above example, a single sensor 2 is integrated into the drive-through path of a QSR and works in cooperation with the existing inductive loops to detect and track vehicle flow. It is of course possible for multiple sensors 2 to be installed along the drive-through path such that they effectively replace the inductive loops altogether.
Additionally, since the scene monitored by a sensor 2 can be relatively expansive, a sensor 2 can be configured to monitor more than one predefined spatial region. For example, an appropriately mounted sensor 2 could be used to monitor two separate drive-through lanes which are next to one another. In this case, a single sensor 2 could replace two or more inductive loops. In another example, consider a relatively straight and long stretch of road on the drive-through path which comprises both the payment point and the collection point. While two or more inductive loops may have been required to monitor the flow of vehicles along this stretch of road, an appropriately mounted sensor 2 could monitor the entire stretch of road: a user could configure the sensor 2 to monitor a first predefined spatial region associated with the payment point, and a second predefined spatial region associated with the collection point. This is another example whereby a single sensor 2 can be used to perform the functionality of two or more inductive loops.
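By way of illustration, configuring a single sensor 2 to monitor two predefined spatial regions might be sketched as follows. The region names, coordinates and axis-aligned rectangle representation are assumptions for the example only, not part of this disclosure.

```python
# Hypothetical configuration sketch: one sensor 2 monitoring two predefined
# spatial regions, each an axis-aligned rectangle in the sensor's
# ground-plane coordinates (metres). Names and coordinates are illustrative.

REGIONS = {
    "pay-point":        {"x": (0.0, 4.0),  "y": (0.0, 3.0)},
    "collection-point": {"x": (6.0, 10.0), "y": (0.0, 3.0)},
}

def regions_containing(x, y):
    """Return the names of the predefined regions containing point (x, y),
    e.g. the ground-plane position of an object located by the radar module."""
    return [name for name, r in REGIONS.items()
            if r["x"][0] <= x <= r["x"][1] and r["y"][0] <= y <= r["y"][1]]

# A vehicle at (7.0, 1.0) falls within the collection-point region only.
print(regions_containing(7.0, 1.0))  # ['collection-point']
```

Each named region would then carry its own independent binary presence signal, so that the one sensor reports to the central server as if it were two separate loops.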
The above example involves a very simple "controlled" drive-through lane whereby all vehicles must queue up and progress in a predictable sequence through a drive-through path. However, many QSRs have "uncontrolled" drive-through paths whereby vehicles can "pull-in" to join a drive-through path midway (e.g. the customer has already ordered and paid for their order online and simply needs to join the drive-through path to pick up their order), or "pull-out" from the drive-through path midway (e.g. in QSRs with undefined collection points whereby a QSR staff member simply delivers orders directly to respective vehicles such that a customer can pull-out of the drive-through path at any point after they have received their order).
In such uncontrolled drive-throughs, loops can have difficulty accurately tracking such vehicle behaviour. For example, in a drive-through path with a series of loops arranged in a sequence, if a first vehicle leaves the drive-through path midway, the loops will not be able to identify this behaviour. Instead, the loop that would have been downstream of the first vehicle is now triggered by a second vehicle directly behind the first vehicle, but this information is not conveyed to the QSR's central servers, which instead misattributes data from the first vehicle to the second vehicle.
Similarly, loops may not identify when a vehicle has joined a drive-through path midway and may instead attribute the data of an upstream vehicle to the vehicle that has newly joined the queue. While such misattributions of data are relatively insignificant and can be somewhat corrected for when there are not many vehicles in the drive-through path (e.g. by a timing system which recognises that a vehicle has not triggered a downstream loop for an inordinate amount of time, or has not lingered at a loop for a sufficient amount of time), the error becomes less detectable and correctable during busier periods when there is a steady flow of traffic through a drive-through, in which case each vehicle downstream of the first misattribution is also attributed with incorrect vehicle data. Because loops are unable to accurately identify events such as vehicle pull-ins and pull-outs, loops can fail to track the flow of vehicles through uncontrolled drive-through lanes with 100% accuracy.
Advantageously, one or more of the present sensors 2 can be integrated with uncontrolled drive-throughs and accurately identify vehicle behaviour such as pull-ins and pull-outs such that data from one vehicle is not misattributed to another. As such, a vehicle-tracking system which utilises one or more of the present sensors 2 can provide more accurate vehicle data than known loop systems and may even achieve 100% accurate vehicle tracking in many drive-through configurations. Unlike loops, which can only monitor the location where they are installed, the sensor 2 can monitor activity within a more expansive spatial region chosen by the user when deciding on the predefined spatial region to monitor. For example, when configuring the sensor 2 to monitor a spatial region associated with a payment point, the sensor 2 is not merely monitoring the location that would be monitored by a loop; the predefined spatial region can comprise several continuous and adjacent subregions which encompass the payment point as well as several metres upstream and downstream thereof, and even to one or both sides of the payment point. Because the spatial region monitored by a sensor 2 is more expansive than that of a loop, the sensor 2 can thus detect and track vehicles beyond fixed points (e.g. the aforementioned four points of the hypothetical controlled drive-through); the sensor 2 can track spatial regions between such defined points. For example, consider an uncontrolled drive-through lane in which a QSR staff member delivers food orders directly to vehicles based on when the food orders are ready (which does not necessarily correspond with the order in which the vehicles are queued), thereby permitting customers to pull out of the drive-through path at any time after they have received their order.
A sensor 2 can be mounted to the QSR to monitor a ten-metre stretch in the drive-through path, wherein the ten-metre stretch is segmented into three continuous and bounded subregions. The sensor 2 could thus detect and track the presence of a vehicle as it enters the first subregion, lingers in the first subregion, leaves the first subregion and enters the second subregion, and thus conclude that the same vehicle has moved from the first subregion to the second subregion. However, if the sensor 2 then detects that the vehicle has left the second subregion but has not entered the third subregion (or, for example, has exited the third subregion along a path that does not conform with the predicted or learned optical flow of the drive-through path), the sensor 2 could thus conclude that the vehicle has pulled-out from the drive-through path and thus adjust its vehicle data to recognise that the next vehicle that enters the third subregion is not the vehicle which has just pulled-out of the drive-through. This sensor 2 can of course be similarly integrated to function with a parallel lane drive-through configuration wherein vehicles from both the inner and outer lanes can pull-out at any given time.
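The three-subregion pull-out logic of this example can be sketched as a small state tracker. The class name, event interface, and the numbering of subregions (a vehicle is expected to progress 0 → 1 → 2) are illustrative assumptions, not part of the specification.

```python
# Sketch of subregion tracking with pull-out detection: a vehicle leaving
# its subregion without entering the expected next one is flagged as a
# pull-out, so the next vehicle downstream is not misattributed its data.
class LaneTracker:
    EXPECTED_NEXT = {0: 1, 1: 2}  # expected progression through the subregions

    def __init__(self) -> None:
        self.vehicle_subregion: dict[str, int] = {}  # vehicle id -> current subregion
        self.pulled_out: list[str] = []

    def on_enter(self, vehicle_id: str, subregion: int) -> None:
        """Called when a vehicle first appears in a subregion."""
        self.vehicle_subregion[vehicle_id] = subregion

    def on_exit(self, vehicle_id: str, entered_next: bool) -> None:
        """Called when a vehicle leaves its current subregion."""
        current = self.vehicle_subregion.pop(vehicle_id)
        nxt = self.EXPECTED_NEXT.get(current)
        if nxt is None:
            return  # left the final subregion: normal exit from the monitored stretch
        if entered_next:
            self.vehicle_subregion[vehicle_id] = nxt
        else:
            # Left the lane without reaching the next subregion: a pull-out.
            self.pulled_out.append(vehicle_id)
```

A production tracker would also use the optical-flow check described above to distinguish a genuine pull-out from, say, a momentary occlusion.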
The above is only an example of how the present sensor 2 can outperform the reliability and accuracy of known loop systems. There are of course other drive-through configurations in which the present sensor 2 would outperform loop systems. For example, in drive-through configurations wherein two or more lanes merge to form one, loops are unable to detect the order in which the vehicles from the separate lanes have merged. In these configurations, a sensor 2 could be positioned to survey a spatial region wherein a first subregion monitors a first upstream lane, a second subregion monitors a second upstream lane, a third subregion monitors the spatial region where the vehicles merge, and a fourth subregion monitors a downstream region where the vehicles have merged into a single lane. By detecting and tracking the flow of vehicles as they move between the subregions, the sensor 2 is thus able to determine the order in which the vehicles have merged and therefore attribute to respective vehicles the correct vehicle data.
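Determining merge order from subregion transitions, as described in the merging-lanes example above, can be sketched as follows. The subregion labels and the event-stream interface are assumptions made for illustration.

```python
# Sketch of merge-order determination: two upstream lanes feed a merge
# zone and then a single downstream lane. Recording the order in which
# vehicles first appear downstream yields the merge order.
def merge_order(events: list[tuple[str, str]]) -> list[str]:
    """
    events: (vehicle_id, subregion) observations in time order, where
    subregion is one of 'lane1', 'lane2', 'merge', 'downstream'.
    Returns vehicle ids in the order they entered the downstream subregion.
    """
    order: list[str] = []
    for vehicle_id, subregion in events:
        if subregion == "downstream" and vehicle_id not in order:
            order.append(vehicle_id)
    return order
```

With the merge order known, the central server can attribute each order's data to the correct vehicle even though the queue order changed at the merge.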
Many modifications of the above embodiments will be apparent to those skilled in the art without departing from the scope of the present invention. For example, the sensor 2 may be configured to monitor wait bays and kerbside pickup locations, wherein if the sensor 2 detects a vehicle has entered such a bay or location, the QSR will be notified that a customer is waiting for their order in said bay or location. One or more such sensors 2 could of course be similarly used to monitor and detect the presence of vehicles in multiple wait bays, wherein customers simply park at respective wait bays, order, and wait for their food. The present sensors 2 may also be configured to communicate vehicle data with one another to further improve the accuracy and reliability of vehicle tracking. The present sensors 2 can of course be configured to provide more than just a binary signal to a QSR's central server which confirms or denies the presence of a vehicle at a specific location. For example, the sensor 2 may also communicate to the central server certain characteristics or aspects about the vehicle, such as the vehicle's colour, license plate number, and how many people are in each vehicle. This data could be stored and analysed by the QSR to further improve the QSR's ability to predict drive-through trends and thus optimise its drive-through processes. It is also envisaged that the visual data of a vehicle captured by the sensor 2 can be processed and converted into a virtual visual representation of the vehicle for display on screen. The virtual vehicle can be displayed, for example, overlapping with a virtual or real representation of the predefined path over which the vehicle's behaviour is tracked. This can help the QSR better visualise the flow of vehicles through its drive-through, potentially helping to spot bottlenecks and other inefficiencies which may otherwise be difficult to detect.
Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" and "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that that prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.

Claims

CLAIMS:
1. A sensor for determining whether a vehicle is present in a predefined spatial region, the sensor comprising a housing in which is enclosed:
a camera module for capturing visual data associated with the spatial region;
a radar module for capturing location data associated with objects in the spatial region; and
a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region.
2. The sensor of claim 1, comprising a first circuit board on which the radar module is mounted, and a second processing circuit board arranged behind and substantially parallel to the first circuit board, wherein both circuit boards comprise a through hole arranged such that the camera module can protrude therethrough and monitor the predefined spatial region, and such that a plane of the field of view monitored by the camera module is substantially parallel with a plane in which the radar module is mounted.
3. The sensor of claim 2, wherein the camera module and the radar module are mounted within the housing such that they are substantially co-planar with one another.
4. The sensor of any one of the preceding claims, wherein the housing comprises:
a front cover substantially formed from plastic and having a window through which the camera module can monitor the spatial region; and
a rear portion substantially formed from metal which acts as a thermal mass to passively cool the sensor.
5. The sensor of claim 4 as appended to claim 2 or claim 3, wherein the rear portion comprises heat dissipation elements which extend inwardly into the housing and contact one or both circuit boards and/or componentry mounted thereon so as to cool the sensor.
6. The sensor of any one of the preceding claims, being configured to output one of two possible signals for a predefined spatial region at a given time, a first signal indicating that a vehicle is present in the predefined spatial region, and a second signal indicating that a vehicle is not present in the predefined spatial region.
7. The sensor of any one of the preceding claims, wherein the radar module is a two- dimensional spatial radar comprising a four radar element array.
8. The sensor of any one of the preceding claims, being a power over ethernet sensor wherein the radar module comprises a 77 GHz radar.
9. The sensor of any one of the preceding claims, wherein in determining whether a vehicle is present in a predefined spatial region, the processor is configured to:
correlate the visual data and the location data;
determine the optical flow of an object in the predefined spatial region;
determine the size and/or mass of the object;
determine the speed of the object; and
identify visual characteristics of the object,
and based on this information determine whether the detected object is a vehicle.
10. The sensor of any one of the preceding claims, wherein the processor uses a machine learning process to determine whether a vehicle is present in the predefined spatial region.
11. The sensor of any one of the preceding claims, wherein the predefined spatial region comprises two or more spatial subregions.
12. The sensor of claim 11, wherein the subregions are substantially adjacent to one another and define a continuous path along which a vehicle can travel.
13. A system for detecting the presence of a vehicle as it moves along a predefined path and through a series of predefined spatial regions along that path, comprising:
a plurality of sensors configured for determining whether a vehicle is present in a predefined spatial region, each sensor comprising a camera module for capturing visual data associated with the spatial region, a radar module for capturing location data associated with objects in the spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the spatial region,
wherein each sensor is mounted at a respective position along the predefined path and configured to monitor respective predefined spatial regions on said path to determine whether a vehicle is present within each respective predefined spatial region; and
a central processing unit in communication with each of the sensors and configured to receive data therefrom which either confirms or denies the presence of a vehicle in said predefined spatial regions.
14. The system of claim 13, wherein the predefined path comprises at least a portion of the path of a drive-through of a quick service restaurant.
15. The system of claim 13, wherein:
a first sensor is mounted at a first location of the quick service restaurant so as to monitor a first spatial region associated with an entrance of the drive-through;
a second sensor is mounted at a second location of the quick service restaurant so as to monitor a second spatial region associated with an order point of the drive-through;
a third sensor is mounted at a third location of the quick service restaurant so as to monitor a third spatial region associated with a payment point of the drive-through;
a fourth sensor is mounted at a fourth location of the quick service restaurant so as to monitor a fourth spatial region associated with a food pick-up point of the drive-through; and
a fifth sensor is mounted at a fifth location of the quick service restaurant so as to monitor a fifth spatial region associated with a waiting area where a vehicle can park while waiting for food to be delivered from the quick service restaurant to the vehicle.
16. The system of any one of claims 13 to 15, wherein each sensor is configured to output to the central processing unit one of two possible signals at a given time for a given predefined spatial region, a first signal indicating that a vehicle is present in the predefined spatial region, and a second signal indicating that a vehicle is not present in the predefined spatial region.
17. The system of any one of claims 13 to 16, wherein each sensor comprises a sensor according to any one of claims 1 to 10.
18. A system for determining the presence of a vehicle as it moves along a predefined path and through a series of predefined spatial regions along that path, comprising:
a plurality of sensors configured for sensing activity in respective predefined spatial regions, the plurality of sensors including:
at least one sensor comprising a camera module for capturing visual data associated with a predefined spatial region, a radar module for capturing location data associated with objects in the predefined spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the predefined spatial region; and
at least one inductive loop for sensing vehicles in respective predefined spatial regions;
wherein each sensor is positioned along the predefined path and configured to monitor respective predefined spatial regions on said path to determine if a vehicle is present within each respective predefined spatial region; and
a central processing unit in communication with each of the sensors and configured to receive data therefrom which either confirms or denies the presence of a vehicle in each of the respective predefined spatial regions.
19. The system of claim 18, wherein each sensor is configured to output to the central processing unit a signal indicative of whether a vehicle is or is not present at a respective predefined spatial region such that the central processing unit can correlate signals from the or each camera-radar sensor and the or each inductive loop sensor to determine the presence and track the path of a vehicle along the predefined path.
20. The system of claim 18 or 19, wherein each camera-radar sensor is a sensor according to any one of claims 1 to 10.
21. A method of determining the presence of a vehicle in a predefined spatial region, the method being performed within a sensor having a camera module for capturing visual data associated with the spatial region, a radar module for capturing location data associated with objects in the spatial region, and a processor for receiving and processing the visual data and the location data to determine whether a vehicle is present in the predefined spatial region, the processor performing the steps of:
correlating the visual data and the location data;
determining the optical flow of an object in the predefined spatial region;
determining the size and/or mass of the object;
determining the speed of the object; and
identifying visual characteristics of the object,
and based on this information determining whether the detected object is a vehicle.
PCT/AU2020/051127 2019-10-21 2020-10-20 Sensor and associated system and method for detecting a vehicle WO2021077157A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2019903958A AU2019903958A0 (en) 2019-10-21 Sensor and associated system and method for detecting a vehicle
AU2019903958 2019-10-21

Publications (1)

Publication Number Publication Date
WO2021077157A1 true WO2021077157A1 (en) 2021-04-29

Family

ID=75619246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2020/051127 WO2021077157A1 (en) 2019-10-21 2020-10-20 Sensor and associated system and method for detecting a vehicle

Country Status (1)

Country Link
WO (1) WO2021077157A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110163904A1 (en) * 2008-10-08 2011-07-07 Delphi Technologies, Inc. Integrated radar-camera sensor
US20140118170A1 (en) * 2012-10-29 2014-05-01 Emx Industries Inc. Vehicle detector
US20140195138A1 (en) * 2010-11-15 2014-07-10 Image Sensing Systems, Inc. Roadway sensing systems
US20140218527A1 (en) * 2012-12-28 2014-08-07 Balu Subramanya Advanced parking management system
US20160125713A1 (en) * 2013-05-23 2016-05-05 Sony Corporation Surveillance apparatus having an optical camera and a radar sensor
WO2016092537A1 (en) * 2014-12-07 2016-06-16 Brightway Vision Ltd. Object detection enhancement of reflection-based imaging unit
US20170328997A1 (en) * 2016-05-13 2017-11-16 Google Inc. Systems, Methods, and Devices for Utilizing Radar with Smart Devices
US10140855B1 (en) * 2018-08-24 2018-11-27 Iteris, Inc. Enhanced traffic detection by fusing multiple sensor data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20878224; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20878224; Country of ref document: EP; Kind code of ref document: A1)