WO2019173585A2 - Determining position of vehicle based on image of tag - Google Patents

Determining position of vehicle based on image of tag

Info

Publication number
WO2019173585A2
Authority
WO
WIPO (PCT)
Prior art keywords
tag
image
determining
vehicle
size
Prior art date
Application number
PCT/US2019/021143
Other languages
French (fr)
Other versions
WO2019173585A3 (en)
Inventor
Kevin TACHENY
Original Assignee
Global Traffic Technologies, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Global Traffic Technologies, Llc filed Critical Global Traffic Technologies, Llc
Publication of WO2019173585A2 publication Critical patent/WO2019173585A2/en
Publication of WO2019173585A3 publication Critical patent/WO2019173585A3/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/07Controlling traffic signals
    • G08G1/087Override of traffic control, e.g. by signal transmitted by an emergency vehicle

Definitions

  • the disclosure generally describes methods and systems for determining a position of a vehicle based on an image of a tag.
  • GPS has limitations. For example, GPS is available when there is a line-of-sight between one or more GPS satellites and a GPS receiver on a vehicle. If the line-of-sight is obstructed, then another positioning method, such as dead reckoning, must be employed and can require additional systems and/or inputs. GPS is limited by a fixed update rate of once per second.
  • a disclosed method includes capturing an image of a tag with a camera onboard a vehicle.
  • the method further includes determining a location of the tag from data encoded in the image.
  • the method further includes comparing a size of the image of the tag to a baseline size of the tag.
  • the method further includes determining a capture position of the image of the tag.
  • the method further includes determining a position of the vehicle based on the determined location of the tag and the capture position.
  • a disclosed system for determining a position of a vehicle includes a global positioning system (GPS) subsystem, an image-based positioning subsystem, and a controller coupled to the GPS subsystem and the image-based positioning sub-system.
  • the GPS subsystem is configured and arranged to receive a GPS signal and determine GPS coordinates of the vehicle from the GPS signal.
  • the image-based positioning subsystem includes a camera onboard the vehicle.
  • the image-based positioning subsystem is configured and arranged to capture an image of a tag by the camera; determine a location of the tag from data encoded in the image, compare a size of the image of the tag to a baseline size of the tag, determine a capture position of the image of the tag, and determine a position of the vehicle based on the determined location of the tag and the capture position.
  • the controller is configured and arranged to activate the image-based positioning sub-system in response to a strength of the GPS signal being less than a threshold strength.
  • a disclosed method includes determining whether an aerial vehicle is within a threshold distance from a destination.
  • the method further includes, in response to determining that the aerial vehicle is within the threshold distance, capturing an image of a tag with a camera onboard the aerial vehicle.
  • the tag marks the destination.
  • the method further includes, in response to determining that the aerial vehicle is within the threshold distance, determining a location of the tag from data encoded in the image, comparing a size of the image of the tag to a baseline size of the tag, determining a capture position of the image of the tag, determining a position of the aerial vehicle based on the determined location of the tag and the capture position, and adjusting a trajectory of the aerial vehicle towards the destination based on the determined position of the aerial vehicle.
  • FIG. 1 illustrates effects on perceived size of a square based on vantage points
  • FIGs. 2A and 2B show images of the square of FIG. 1 produced from different vantage points
  • FIG. 3 shows an exemplary tag including a Quick Response (QR) code
  • FIG. 4 shows an exemplary tag including a QR code and a directional marker
  • FIGs. 5A-5D show images of the tag of FIG. 4 from different vantage points relative to a baseline size of the tag
  • FIG. 6 is a flowchart of a method for determining a position of a vehicle based on an image
  • FIGs. 7A-7B show an illustrative example described in conjunction with FIG. 6
  • FIG. 8 is a block diagram showing a circuit arrangement for determining a position of a vehicle based on an image
  • FIG. 9 shows exemplary fields of view associated with a traffic signal controller
  • FIG. 10 is a flowchart of a method for determining whether to issue a traffic signal priority (TSP) request based on a position of a vehicle determined from an image;
  • FIG. 11 is a flowchart of a method for determining whether to issue a TSP request based on headway that is based on a position of a vehicle determined from an image;
  • FIG. 12 is a flowchart of a method for adjusting a trajectory of an aerial vehicle based on a position of the aerial vehicle determined from an image.
  • a transit system relies on accurate position information of transit vehicles to maintain transit schedules.
  • GPS is used to provide an accurate position of a transit vehicle while on a transit route.
  • buildings, tunnels, and/or overpasses may obstruct or block the line-of-sight between the transit vehicle and a GPS satellite.
  • An obstruction can decrease the accuracy of the GPS coordinates of a transit vehicle or render GPS unavailable.
  • GPS may not update the GPS coordinates at a frequency high enough to accommodate a high rate of speed and maneuverability of an aerial drone. GPS is limited to one update per second. While GPS coordinates can be interpolated between updates, the interpolation requires inputs and other measurement devices. For example, dead reckoning is based on the speed and heading of a vehicle and is subject to cumulative errors.
  • the disclosed approaches provide automated methods and systems that enable a vehicle to determine its position despite a weak GPS signal or GPS being unavailable. The disclosed approaches are not limited to the update rate of GPS. Rather, the disclosed approaches can determine a position of a vehicle as fast as an image can be processed. Thus, the update frequency of the disclosed approaches can be faster than the update rate of a GPS system.
  • an image of a tag is captured by a camera onboard a vehicle.
  • the location of the tag is determined from data encoded in the image.
  • the size of the image of the tag is compared to a baseline size of the tag.
  • a position at which the image of the tag was captured (hereinafter referred to as a capture position) is determined.
  • the position of the vehicle is determined based on the determined location of the tag and the capture position.
  • FIG. 1 illustrates effects on perceived size of a square based on vantage points.
  • the perceived size of an object is dependent on the distance from which the object is viewed and the angle at which the object is viewed relative to the object. If an object is viewed from a vantage point far from the object, then the object will appear to be smaller than if the object is viewed at a vantage point close to the object. If an object is viewed at an angle that is not orthogonal to the object, then the object will appear to be skewed.
  • FIG. 1 shows a square 102 and two vantage points 104 and 106.
  • the square 102 is drawn as lying in the x-y plane for ease of illustration as indicated by the axes 110.
  • the x-axis represents width
  • the y-axis represents height
  • the z-axis represents depth.
  • the vantage point 104 has x and y coordinates at the center of the square and a z coordinate some distance above the x-y plane.
  • the vantage point 106 has x and y coordinates that are outside the perimeter of the square and a z coordinate that is some distance above the x-y plane.
  • comparing the solid lines from the square 102 to the vantage point 104 with the dashed lines from the square 102 to the vantage point 106 illustrates how viewing the square 102 from an off-center vantage point skews its apparent size.
  • FIGs. 2A and 2B show images produced from different vantage points.
  • FIG. 2A shows the image 108 of the square 102 as viewed from the vantage point 104. Because the vantage point 104 is centered over the square, the image 108 of the square 102 is not skewed as illustrated in FIG. 2A.
  • FIG. 2B shows the image 112 of the square 102 as viewed from the vantage point 106. Because the vantage point 106 is off-center of the square, the image 112 of the square 102 is skewed as illustrated in FIG. 2B. The sides of the square 102 in image 112 are shorter than the sides of the square in the image 108 in FIG. 2A, because the vantage point 106 is further away from the square 102 than the vantage point 104.
  • FIG. 3 shows an exemplary tag 340 including a Quick Response (QR) code.
  • the QR code encodes data associated with the location of the tag 340.
  • Examples of the encoded location information can include, but are not limited to, GPS coordinates of the tag 340, a height relative to the ground or sea level of the tag 340, an angle of the tag 340 relative to the horizon, a direction of the tag 340, rotation of the tag 340 about an axis that is orthogonal to the tag 340, and an address or intersection at which the tag 340 is located.
  • the QR code can encode characteristics of a baseline image, such as a baseline size and a baseline distance associated with the tag 340.
  • the baseline image size includes a baseline height and width of an image of the tag 340, and the baseline distance indicates the distance from the tag 340 at which an image of the tag 340 is the baseline image size. For example, at two feet from the tag 340, the baseline image size of the tag 340 is x pixels by y pixels.
  • the QR code can encode metadata such as a unique identifier of the tag 340, which can be used to look up the baseline image size and baseline distance associated with the tag 340.
  • the QR code can encode data identifying one or more corners of the baseline image size of the tag (e.g., top left corner or bottom right corner).
  • the QR code can encode information about the object to which the tag 340 is affixed. For example, the tag 340 can be at a particular transit stop or at a particular distance away from a destination.
  • the encoded data is read from the image of the tag 340.
  • the tag 340 includes a frame 354.
  • the frame 354 can be used to determine the size of the image of the tag 340 relative to a baseline image size of the tag 340. For example, the length of one or more sides of the frame 354 in the image can be compared to a baseline length(s) of the sides of the frame 354. In at least one approach, an angle between a side of the frame 354 in an image of the tag 340 and the side of the frame 354 in a baseline orientation is determined.
  • FIG. 4 shows an exemplary tag 450 including a QR code and a directional marker 452.
  • the directional marker 452 is a visual indication of a baseline orientation of the tag 450.
  • the directional marker 452 can be used in place of and/or in conjunction with data encoded in the QR code. For example, if the baseline orientation of the tag 450 is encoded in the QR code but there is error in reading the data encoded in the QR code, the directional marker 452 can be used to determine a baseline orientation of the tag 450. In at least one approach, an angle between the orientation of the directional marker in an image of the tag 450 and the orientation of the directional marker 452 in a baseline size is determined.
  • the directional marker 452 points to the right in the baseline size of the tag 450, but may point up in an image of the tag 450.
  • the arrow shape of the directional marker 452 aids in determining an orientation of an image of the tag 450 by providing a clear indication of the orientation of the tag 450.
  • the directional marker 452 is pointed in a particular direction (e.g., north) in the baseline size.
  • the direction to which the directional marker 452 is pointed can be encoded in the QR code.
  • Although the directional marker 452 is shown as an arrow indicating the bottom of the frame 454, the marker can be placed elsewhere in other implementations.
  • FIGs. 5A-5D show images of the tag 450 of FIG. 4 from different vantage points relative to a baseline image of the tag 450.
  • the baseline image of the tag 450 is represented by the frame 454.
  • the capture position of the image is to the right of the tag 450.
  • the directional marker 452 points to the right in FIG. 5A, indicating that the camera at the capture position is not rotated relative to the baseline image. Because the size of the image 502 of the tag 450 is smaller than the frame 454, the capture position of the image is some distance from the tag 450 greater than the distance associated with the baseline image.
  • the capture position of the image is to the left of the tag 450.
  • the directional marker 452 points to the right in FIG. 5B indicating that the camera is not rotated relative to the baseline size of the tag 450. Because the size of the image 504 is smaller than the frame 454, and smaller than the size of the image 502 in FIG. 5A, the capture position of the image 504 is at a distance from the tag 450 greater than the distance associated with the baseline size, and a distance from the tag 450 greater than the capture position of image 502 of FIG. 5A.
  • the capture position of the image is slightly to the right of the tag 450.
  • the directional marker 452 points to the left in FIG. 5C indicating that the camera at the capture position is rotated approximately 180 degrees relative to the baseline orientation of the tag 450. Because the size of the image 506 of the tag 450 is smaller than the frame 454, the capture position of the image 506 is at a distance further away from the tag 450 than the distance associated with the baseline image.
  • the capture position of the image is slightly to the right of the tag 450.
  • the directional marker 452 points down in FIG. 5D indicating that the camera at the capture position is rotated approximately 90 degrees relative to the baseline image of the tag 450. Because the size of the image 508 of the tag 450 is smaller than the frame 454, the capture position of the image is at a distance from the tag 450 greater than the distance associated with the baseline size.
  • FIG. 6 is a flowchart of a method for determining a position of a vehicle based on an image of a tag.
  • FIGs. 7A-7B show an illustrative example described in conjunction with FIG. 6.
  • a position system onboard a vehicle determines whether the strength of a signal from a GPS satellite is less than a threshold. As explained above, the strength of a signal from a GPS satellite can be affected by obstruction(s) in the line-of-sight between the GPS satellite and a GPS receiver onboard the vehicle. The signal may become intermittent or completely blocked.
  • the system can use GPS to determine the position of the vehicle. The system can periodically check and determine the strength of the GPS signal.
  • In response to determining that the strength of the signal is less than the threshold, at block 706, the system captures an image of a tag 450 with a camera onboard the vehicle.
  • the tag has a known and fixed position.
  • the tag 450 is illustrated affixed to a pole 756, such as a street lamp, a telephone pole, or a transit stop sign. Although the tag 450 is shown, any tag can be used.
  • the system determines the location of the tag 450 from data encoded in the image of the tag 450. In one approach, the location is determined directly from the data encoded in the image. For example, the location can be encoded in a QR code included on the tag 450.
  • a unique identifier of the tag 450 can be encoded in the image of the tag 450.
  • a unique identifier can be encoded in a QR code included on the tag 450.
  • Baseline sizes, baseline distances, and/or GPS coordinates of tags are associated with unique identifiers of the tags and can be stored in a computer database onboard a vehicle or stored centrally in a computer server. The onboard computer can look up the unique identifier read from a tag and read the baseline size, baseline distance, and/or GPS coordinates associated with the tag 450 captured in the image.
  • the system compares the size of the image of the tag 450 to the baseline image size.
  • the comparison can include determining a difference between a length of a side of the image of the tag 450 and a baseline length of the side of the baseline image.
  • the comparison can include determining an amount of skew of the image relative to the baseline image.
  • the comparison can also include determining an amount of rotation of the image of the tag 450 relative to a baseline image.
  • the system determines a capture position 750 (FIG. 7A) of the image of the tag 450.
  • the capture position 750 is the position of the camera that captured an image of the tag 450 relative to the tag 450.
  • the camera is on board a vehicle, such as a transit vehicle or an aerial drone.
  • the skew of the captured image is indicative of an angle 752 of the capture position 750 relative to the tag 450.
  • the skew is also indicative of the location of the capture position 750 relative to the tag 450. For example, the skew indicates whether the capture position 750 is to the left of, to the right of, above, or below the tag 450.
  • FIG. 7B illustrates the skew of the image of the tag 450 at the capture position 750.
  • the size of the captured image of the tag 450 relative to the size of the baseline image is indicative of a distance 754 of the capture position 750 from the tag 450.
  • the distance 754 is related to the ratio of the size of the captured image to the size of the baseline image.
  • the capture position 750 is determined from the angle 752, the relative position, and the distance 754.
  • the system determines the position of the vehicle based on the location of the tag determined at block 708 and the capture position determined at block 712.
  • the capture position 750 can be analogized to an offset from the location of the tag 450.
  • the determined position of the vehicle can be GPS coordinates offset from the GPS coordinates of the tag.
  • GPS coordinates, such as those determined before the strength of the GPS signal was less than the threshold, are supplemented with the position determined at block 714.
  • FIG. 8 is a block diagram showing a circuit arrangement for determining a position of a vehicle based on an image.
  • the on-vehicle circuitry 800 includes a processor(s) 802, memory 804, and storage 806 for program instructions 808 and location data 810, all of which are coupled by bus 820.
  • the circuitry 800 further includes a location signal receiver 812, a transmitter 814, and peripheral interface(s) 826, which are also coupled to bus 820.
  • the transmitter 814 can issue a traffic signal priority (TSP) request.
  • the location signal receiver 812 can be a GPS receiver.
  • the peripheral interface(s) 826 provide access to data and control signals from a camera 828, for example.
  • the camera 828 is onboard the vehicle.
  • the on-vehicle circuitry 800 is implemented on a Nexcom VTC 6100 in-vehicle computer.
  • the computer includes a processor, memory, peripheral interfaces, a bus, and retentive storage for program code and data.
  • the location signal receiver is a TRIMBLE® Placer Gold Series receiver
  • the transmitter is a Sierra Wireless GX-400 cellular modem.
  • the storage device 806 is configured with program instructions 808 that are executable by the processor and with location data 810. In executing the program instructions 808, the processor(s) 802 carry out the operations described in this disclosure for determining the position of the vehicle.
  • the location data include unique identifiers of tags and location information associated with a respective tag and corresponding unique identifier. Examples of location information include, but are not limited to, addresses, intersections, and GPS coordinates.
  • Traffic signals are in abundance in metropolitan areas and have long been used to regulate the flow of traffic at intersections. Generally, traffic signals have relied on timers or vehicle sensors to determine when to change traffic signal lights, thereby signaling alternating directions of traffic to stop, and others to proceed.
  • Emergency vehicles (e.g., the vehicle 908 shown in FIG. 9), such as police cars, fire trucks, and ambulances, generally have the right to cross an intersection against a traffic signal.
  • Emergency vehicles have in the past typically depended on horns, sirens and flashing lights to alert other drivers approaching the intersection that an emergency vehicle intends to cross the intersection.
  • due to hearing impairment, air conditioning, audio systems, and other distractions, the driver of a vehicle approaching an intersection often will not be aware of a warning being emitted by an approaching emergency vehicle.
  • Traffic control preemption systems assist authorized vehicles (emergency, public safety, and transit vehicles) through signalized intersections by making traffic signal priority (TSP) requests to the traffic signal controllers that control the traffic signal at the intersections.
  • the traffic signal controller may respond to a TSP request from the vehicle 908 by changing the intersection lights to green in the direction of travel of the vehicle 908.
  • Traffic control preemption systems improve the response time of public safety personnel, while reducing dangerous situations at intersections when an emergency vehicle is trying to cross on a red light.
  • the speed and schedule efficiency for transit vehicles can be improved by traffic control preemption systems as well.
  • FIG. 9 shows exemplary fields of view associated with a traffic signal controller.
  • the field of view 910 is associated with the traffic signal controller 909
  • the field of view 912 is associated with the traffic signal controller 911
  • the field of view 914 is associated with the traffic signal controller 913.
  • If the vehicle 908 transmits a TSP request when the vehicle 908 is in the field of view 910, the TSP request will be received by the traffic signal controller 909 and may be acted upon.
  • a TSP request transmitted in the field of view 910 may be received by the traffic signal controllers 911 or 913 but not acted upon.
  • GPS coordinates can be used to determine if the vehicle is within the field of view of a traffic signal controller and should be granted traffic signal priority. If GPS coordinates of the vehicle are unavailable, because a building obstructs the line-of- sight for the whole field of view, for example, then the vehicle 908 may be unable to issue a TSP request having accurate location information. If GPS coordinates are unavailable until right before the vehicle 908 reaches an intersection, because a building obstructs a portion of the field of view, then the vehicle may issue a TSP request but the traffic signal may not change in time and/or provide enough time for traffic to adjust to the preempted traffic signal.
  • the disclosed approaches determine the position (e.g., GPS coordinates) of a vehicle in the absence of GPS to provide uninterrupted availability of traffic control preemption systems.
  • FIG. 10 is a flowchart of a method for determining whether to issue a TSP request based on a position of a vehicle determined from an image.
  • a traffic control preemption system captures an image of a tag with a camera onboard the vehicle 908.
  • the tag is located on or near a traffic signal controller, such as the traffic signal controllers 909, 911, and 913 illustrated in FIG. 9.
  • a traffic control preemption system determines the location of the tag from data encoded in the image of the tag.
  • the location is determined directly from the data encoded in the image.
  • the location can be encoded in a QR code included on the tag.
  • a unique identifier of the tag can be encoded in the image of the tag.
  • a unique identifier can be encoded in a QR code included on the tag.
  • Baseline sizes and baseline distances of tags are associated with unique identifiers of the tags and can be stored in a computer database onboard a vehicle or stored centrally in a computer server. The onboard computer can look-up the unique identifier read from a tag or submit a request to a central server for the baseline size and baseline distance associated with the tag captured in the image.
  • the system compares the size of the captured image to the baseline size.
  • the comparison can include determining a difference between a length of a side of the image of the tag and a baseline length of the side of the tag.
  • the comparison can include determining an amount of skew of the image relative to the baseline size of the tag.
  • the comparison can include determining an amount of rotation of the image of the tag relative to a baseline size of the tag.
  • the system determines a capture position of the image of the tag.
  • the capture position is the position of the camera onboard the vehicle 908.
  • the skew of the image of the tag is indicative of an angle of the capture position relative to the tag.
  • the skew is also indicative of the location of the capture position relative to the tag. For example, the skew indicates whether the capture position is to the left of, to the right of, above, or below the tag.
  • the size of the image relative to the baseline size of the tag is indicative of a distance of the capture position from the tag. The distance is related to the ratio of the size of the image of the tag to the baseline size of the tag.
  • the capture position is determined from the angle, the relative position, and the distance.
  • the system determines the position of the vehicle based on the location of the tag from block 1004 and the capture position from block 1008.
  • the capture position can be analogized to an offset from the location of the tag.
  • the position of the vehicle 908 can be determined.
  • the determined position of the vehicle 908 can be GPS coordinates offset from the GPS coordinates of the tag.
  • the system determines whether the position of the vehicle 908 is within a field of view of a traffic signal controller. For example, the system determines whether the determined GPS coordinates from block 1010 are within the field of view. In response to determining that the position of the vehicle 908 is within the field of view, at block 1014, the system issues a TSP request. In response to determining that the position of the vehicle 908 is not within the field of view, at block 1016, the system captures another image of the tag with a camera onboard the vehicle 908. (A code sketch of this field-of-view check appears after this list.)
  • the disclosed approaches to determining a position of a vehicle can be used to determine whether the vehicle is on-schedule or off-schedule.
  • the determined position is transmitted from the vehicle to a server.
  • the server compares the determined position to a scheduled position.
  • a signal indicative of an adherence status is transmitted from the server to the vehicle.
  • the adherence status can be on-schedule or off-schedule.
  • a TSP request is issued from the vehicle to the traffic signal controller based on the adherence status.
  • FIG. 11 is a flowchart of a method for determining whether to issue a TSP request based on headway that is based on a position of a vehicle determined from an image;
  • headway refers to the difference in arrival or departure times of consecutive transit vehicles on the same route.
  • the headway parameter may be useful for multiple transit vehicles servicing the same route.
  • Scheduled headway refers to the scheduled difference in arrival or departure times of consecutive transit vehicles on the same route whereas the headway refers to the difference in estimated and/or actual arrival or departure times of consecutive transit vehicles on the same route.
  • the transit vehicle from which a TSP request is issued depends on whether the headway is greater than or less than desired between a leading vehicle and a trailing vehicle. If the headway is greater than the scheduled headway, the trailing vehicle issues a TSP request in order to reduce the headway between the vehicles.
  • if the headway is less than the scheduled headway, the leading vehicle issues a TSP request in order to increase the headway between the vehicles.
  • Block 1102 corresponds to the steps performed at blocks 706, 708, 710, 712, and 714 of FIG. 6.
  • the vehicle described in association with FIG. 6 can be analogous to the vehicle 908 illustrated in FIG. 9.
  • the tag is located on or near a traffic signal controller, such as the traffic signal controllers 909, 911, and 913.
  • the system determines the actual headway between the transit vehicle and another transit vehicle on the same transit route.
  • the determined position of the transit vehicle is used to determine an estimated and/or actual arrival or departure time of the transit vehicle.
  • the difference between the arrival or departure time of the transit vehicle and the arrival or departure time of the other transit vehicle is the headway.
  • the system compares the headway determined at block 1112 to the scheduled headway between the transit vehicle and the other transit vehicle.
  • the system determines whether the headway is greater than the scheduled headway.
  • the system determines whether the transit vehicle is leading the other transit vehicle.
  • in response to determining that the headway is greater than the scheduled headway, the TSP request is issued from the trailing vehicle: from the transit vehicle if the other transit vehicle is leading, or from the other transit vehicle if the transit vehicle is leading.
  • in response to determining that the headway is less than the scheduled headway, the TSP request is issued from the leading vehicle: at block 1124, the system issues a TSP request from the transit vehicle in response to determining that the transit vehicle is leading the other transit vehicle; otherwise, the system issues a TSP request from the other transit vehicle. (A code sketch of this headway logic appears after this list.)
  • FIG. 12 is a flowchart of a method for adjusting a trajectory of an aerial vehicle based on a position of the aerial vehicle determined from an image.
  • a trajectory control system of the aerial vehicle determines whether the aerial vehicle is within a threshold distance from a destination.
  • the destination for example, can be a landing zone.
  • GPS coordinates of the aerial vehicle can be used to determine if the aerial vehicle is within the threshold distance.
  • the system maintains the current trajectory of the aerial vehicle.
  • the system captures an image of a tag with a camera onboard the aerial vehicle. The tag is located at or near the destination.
  • the system determines the location of the tag from data encoded in the image of the tag.
  • the location is determined directly from the data encoded in the image.
  • the location can be encoded in a QR code included on the tag.
  • a unique identifier of the tag can be encoded in the image of the tag.
  • Baseline sizes of tags are associated with a plurality of unique identifiers for each tag. The unique identifier is looked up to determine the baseline size of the tag captured in the image. The baseline sizes can be stored in memory onboard the vehicle and/or on a server.
  • the system can be in communication with the server and transmit the unique identifier to the server.
  • the server can respond with the associated baseline size of the tag.
  • the system compares the size of the image of the tag to the baseline size of the tag.
  • the comparison can include determining a difference between a length of a side of the image of the tag and a baseline length of the side of the tag.
  • the comparison can include determining an amount of skew of the image relative to the baseline size of the tag.
  • the comparison can include determining an amount of rotation of the image of the tag relative to a baseline size of the tag.
  • the system determines a capture position of the image of the tag.
  • the capture position is the position of the camera onboard the aerial vehicle.
  • the skew of the image of the tag is indicative of an angle of the capture position relative to the tag.
  • the skew is also indicative of the location of the capture position relative to the tag. For example, the skew indicates whether the capture position is to the left of, to the right of, above, or below the tag.
  • the size of the image relative to the baseline size of the tag is indicative of a distance of the capture position away from the tag. The distance is related to the ratio of the size of the image of the tag to the baseline size of the tag.
  • the capture position is determined from the angle, the relative position, and the distance.
  • the system determines the position of the aerial vehicle based on the location of the tag determined at block 1204 and the capture position from block 1208.
  • the capture position can be analogized to an offset from the location of the tag.
  • the position of the aerial vehicle can be determined.
  • the determined position of the aerial vehicle can be GPS coordinates offset from the GPS coordinates of the tag.
  • the system determines whether the current trajectory of the aerial vehicle is headed towards the destination. In response to determining that the current trajectory of the aerial vehicle is headed towards the destination, at block 1204, the system maintains the current trajectory of the aerial vehicle. In response to determining that the current trajectory of the aerial vehicle is not headed towards the destination, at block 1218, the system adjusts the current trajectory of the aerial vehicle towards the destination based on the position of the aerial vehicle determined at block 1214. For example, the system can adjust the orientation of one or more control surfaces and/or thrusters of the aerial vehicle. At block 1220, the system captures another image of the tag with the camera on the aerial vehicle. (A code sketch of this trajectory-adjustment loop appears after this list.)
  • a block, module, device, system, unit, or controller is a circuit that carries out one or more of the disclosed or related operations/activities.
  • one or more blocks, modules, devices, systems, units, or controllers are discrete logic circuits or programmable circuits configured and arranged for implementing these operations/activities, as shown in FIGS. 6 and 10-12.
  • the programmable circuitry can be one or more computer circuits programmed to execute a set (or sets) of instructions (and/or configuration data).
  • the instructions (and/or configuration data) can be in the form of firmware or software stored in and accessible from a memory (circuit).
  • Some implementations are directed to a computer program product (e.g., nonvolatile memory device), which includes a machine or computer-readable medium having stored thereon instructions which may be executed by a computer (or other electronic device) to perform these operations/activities.
  • the embodiments are thought to be applicable to a variety of systems for controlling traffic signal phases. Other aspects and embodiments will be apparent to those skilled in the art from consideration of the specification.
  • the embodiments may be implemented as one or more processors configured to execute software, as an application specific integrated circuit (ASIC), or as a logic on a programmable logic device. It is intended that the specification and illustrated embodiments be considered as examples only, with a true scope of the invention being indicated by the following claims.
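The field-of-view check of FIG. 10 and the headway comparison of FIG. 11 described in the list above can be illustrated with the following sketch. It is illustrative only: the disclosure does not specify how a field of view is represented, so the polygonal model, the ray-casting containment test, the transmitter interface, and the function names are assumptions.

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is the (lat, lon) point inside the polygonal field of view?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the horizontal through pt
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def maybe_issue_tsp(vehicle_pos, field_of_view, transmitter):
    """Issue a TSP request (block 1014) only once the image-derived position falls
    within the traffic signal controller's field of view; otherwise another image
    of the tag is captured (block 1016)."""
    if point_in_polygon(vehicle_pos, field_of_view):
        transmitter.issue_tsp_request()                # hypothetical transmitter interface
        return True
    return False

def headway_tsp_source(headway_s, scheduled_headway_s, this_vehicle_leading):
    """FIG. 11 logic: too much headway -> the trailing vehicle requests priority to
    close the gap; too little headway -> the leading vehicle requests priority to open it."""
    if headway_s > scheduled_headway_s:
        return "other" if this_vehicle_leading else "this"
    return "this" if this_vehicle_leading else "other"

# Example with a hypothetical rectangular field of view around an intersection:
fov = [(44.9779, -93.2655), (44.9779, -93.2645), (44.9771, -93.2645), (44.9771, -93.2655)]
print(point_in_polygon((44.9775, -93.2650), fov))                    # True -> issue the TSP request
print(headway_tsp_source(600.0, 480.0, this_vehicle_leading=True))   # too much headway -> "other"
```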
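Similarly, the trajectory-adjustment loop of FIG. 12 could be reduced to a heading correction once the aerial vehicle is close to the tagged destination. The threshold distance, the heading tolerance, and the small-angle bearing approximation below are assumptions; the disclosure states only that control surfaces and/or thrusters are adjusted toward the destination.

```python
import math

APPROACH_THRESHOLD_M = 100.0   # hypothetical threshold distance; not specified in the disclosure
HEADING_TOLERANCE_DEG = 5.0    # hypothetical tolerance before a correction is made

def bearing_deg(src, dst):
    """Approximate bearing (degrees clockwise from north) from src to dst, both (lat, lon)."""
    north = dst[0] - src[0]
    east = (dst[1] - src[1]) * math.cos(math.radians(src[0]))
    return math.degrees(math.atan2(east, north)) % 360.0

def trajectory_correction(position, heading_deg, destination, distance_to_destination_m):
    """Return a new heading if the aerial vehicle should turn toward the destination,
    or None if the current trajectory should be maintained (blocks 1204 and 1216-1218)."""
    if distance_to_destination_m > APPROACH_THRESHOLD_M:
        return None                                   # too far out: keep the current trajectory
    desired = bearing_deg(position, destination)
    error = (desired - heading_deg + 180.0) % 360.0 - 180.0
    return desired if abs(error) > HEADING_TOLERANCE_DEG else None

# Image-derived position slightly south of the destination, currently heading east (90 deg):
print(trajectory_correction((44.9770, -93.2650), 90.0, (44.9778, -93.2650), 80.0))  # ~0.0 (turn north)
```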

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

Approaches are disclosed for determining a position of a vehicle. An image of a tag is captured with a camera onboard a vehicle. A location of the tag is determined from data encoded in the image. A size of the image of the tag is compared to a baseline size of the tag. A capture position of the image of the tag is determined. A position of the vehicle is determined based on the determined location of the tag and the capture position.

Description

DETERMINING POSITION OF VEHICLE BASED ON IMAGE OF TAG
FIELD OF THE INVENTION
[0001] The disclosure generally describes methods and systems for determining a position of a vehicle based on an image of a tag.
BACKGROUND
[0002]Vehicles may use a global positioning system (GPS) to determine the position of the vehicle. However, GPS has limitations. For example, GPS is available when there is a line-of-sight between one or more GPS satellites and a GPS receiver on a vehicle. If the line-of-sight is obstructed, then another positioning method, such as dead reckoning, must be employed and can require additional systems and/or inputs. GPS is limited by a fixed update rate of once per second.
SUMMARY
[0003]A disclosed method includes capturing an image of a tag with a camera onboard a vehicle. The method further includes determining a location of the tag from data encoded in the image. The method further includes comparing a size of the image of the tag to a baseline size of the tag. The method further includes determining a capture position of the image of the tag. The method further includes determining a position of the vehicle based on the determined location of the tag and the capture position.
[0004] A disclosed system for determining a position of a vehicle includes a global positioning system (GPS) subsystem, an image-based positioning subsystem, and a controller coupled to the GPS subsystem and the image-based positioning sub-system. The GPS subsystem is configured and arranged to receive a GPS signal and determine GPS coordinates of the vehicle from the GPS signal. The image-based positioning subsystem includes a camera onboard the vehicle. The image-based positioning subsystem is configured and arranged to capture an image of a tag by the camera; determine a location of the tag from data encoded in the image, compare a size of the image of the tag to a baseline size of the tag, determine a capture position of the image of the tag, and determine a position of the vehicle based on the determined location of the tag and the capture position. The controller is configured and arranged to activate the image-based positioning sub-system in response to a strength of the GPS signal being less than a threshold strength.
[0005] A disclosed method includes determining whether an aerial vehicle is within a threshold distance from a destination. The method further includes, in response to determining that the aerial vehicle is within the threshold distance, capturing an image of a tag with a camera onboard the aerial vehicle. The tag marks the destination. The method further includes, in response to determining that the aerial vehicle is within the threshold distance, determining a location of the tag from data encoded in the image, comparing a size of the image of the tag to a baseline size of the tag, determining a capture position of the image of the tag, determining a position of the aerial vehicle based on the determined location of the tag and the capture position, and adjusting a trajectory of the aerial vehicle towards the destination based on the determined position of the aerial vehicle.
[0006] Other embodiments will be recognized from consideration of the Detailed Description and Claims, which follow.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Various aspects and advantages of the disclosed embodiments will become apparent upon review of the following detailed description and upon reference to the drawings in which:
[0008] FIG. 1 illustrates effects on perceived size of a square based on vantage points;
[0009] FIGs. 2A and 2B show images of the square of FIG. 1 produced from different vantage points;
[0010] FIG. 3 shows an exemplary tag including a Quick Response (QR) code;
[0011] FIG. 4 shows an exemplary tag including a QR code and a directional marker;
[0012] FIGs. 5A-5D show images of the tag of FIG. 4 from different vantage points relative to a baseline size of the tag;
[0013] FIG. 6 is a flowchart of a method for determining a position of a vehicle based on an image;
[0014] FIGs. 7A-7B show an illustrative example described in conjunction with FIG. 6;
[0015] FIG. 8 is a block diagram showing a circuit arrangement for determining a position of a vehicle based on an image;
[0016] FIG. 9 shows exemplary fields of view associated with a traffic signal controller;
[0017] FIG. 10 is a flowchart of a method for determining whether to issue a traffic signal priority (TSP) request based on a position of a vehicle determined from an image;
[0018] FIG. 11 is a flowchart of a method for determining whether to issue a TSP request based on headway that is based on a position of a vehicle determined from an image; and
[0019] FIG. 12 is a flowchart of a method for adjusting a trajectory of an aerial vehicle based on a position of the aerial vehicle determined from an image.
DETAILED DESCRIPTION OF THE DRAWINGS
[0020] In the following description, numerous specific details are set forth to describe specific examples presented herein. It should be apparent, however, to one skilled in the art, that one or more other examples and/or variations of these examples may be practiced without all the specific details given below. In other instances, well known features have not been described in detail so as not to obscure the description of the examples herein. For ease of illustration, the same reference numerals may be used in different diagrams to refer to the same elements or additional instances of the same element.
[0021]A transit system relies on accurate position information of transit vehicles to maintain transit schedules. GPS is used to provide an accurate position of a transit vehicle while on a transit route. However, in metropolitan areas where transit systems are relied upon heavily, buildings, tunnels, and/or overpasses, for example, may obstruct or block the line-of-sight between the transit vehicle and a GPS satellite. An obstruction can decrease the accuracy of the GPS coordinates of a transit vehicle or render GPS unavailable.
[0022] Autonomous vehicles, such as self-driving vehicles or aerial drones, rely on accurate position information to adjust the trajectory of the autonomous vehicle. A disruption in a GPS signal can cause an autonomous vehicle to veer off course and possibly crash. In the case of high-speed aerial drones, GPS may not update the GPS coordinates at a frequency high enough to accommodate a high rate of speed and maneuverability of an aerial drone. GPS is limited to one update per second. While GPS coordinates can be interpolated between updates, the interpolation requires inputs and other measurement devices. For example, dead reckoning is based on the speed and heading of a vehicle and is subject to cumulative errors.
[0023] The disclosed approaches provide automated methods and systems that enable a vehicle to determine its position despite a weak GPS signal or GPS being unavailable. The disclosed approaches are not limited to the update rate of GPS. Rather, the disclosed approaches can determine a position of a vehicle as fast as an image can be processed. Thus, the update frequency of the disclosed approaches can be faster than the update rate of a GPS system.
[0024] In one approach, an image of a tag is captured by a camera onboard a vehicle. The location of the tag is determined from data encoded in the image. The size of the image of the tag is compared to a baseline size of the tag. A position at which the image of the tag was captured (hereinafter referred to as a capture position) is determined. The position of the vehicle is determined based on the determined location of the tag and the capture position.
[0025] FIG. 1 illustrates effects on perceived size of a square based on vantage points. The perceived size of an object is dependent on the distance from which the object is viewed and the angle at which the object is viewed relative to the object. If an object is viewed from a vantage point far from the object, then the object will appear to be smaller than if the object is viewed at a vantage point close to the object. If an object is viewed at an angle that is not orthogonal to the object, then the object will appear to be skewed.
[0026] FIG. 1 shows a square 102 and two vantage points 104 and 106. The square 102 is drawn as lying in the x-y plane for ease of illustration as indicated by the axes 110. The x-axis represents width, the y-axis represents height, and the z-axis represents depth. The vantage point 104 has x and y coordinates at the center of the square and a z coordinate some distance above the x-y plane. The vantage point 106 has x and y coordinates that are outside the perimeter of the square and a z coordinate that is some distance above the x-y plane. Comparing the solid lines from the square 102 to the vantage point 104 with the dashed lines from the square 102 to the vantage point 106 illustrates how viewing the square 102 from an off-center vantage point skews its apparent size.
[0027] FIGs. 2A and 2B show images produced from different vantage points. FIG. 2A shows the image 108 of the square 102 as viewed from the vantage point 104. Because the vantage point 104 is centered over the square, the image 108 of the square 102 is not skewed as illustrated in FIG. 2A.
[0028] FIG. 2B shows the image 112 of the square 102 as viewed from the vantage point 106. Because the vantage point 106 is off-center of the square, the image 112 of the square 102 is skewed as illustrated in FIG. 2B. The sides of the square 102 in image 112 are shorter than the sides of the square in the image 108 in FIG. 2A, because the vantage point 106 is further away from the square 102 than the vantage point 104.
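The distance/size relationship described above can be made concrete with a pinhole-camera approximation. This sketch is not taken from the patent; the tag size and focal length are hypothetical values.

```python
# Minimal sketch (not from the patent): under a pinhole-camera model, an object of
# real size S viewed head-on at distance d projects to roughly f * S / d pixels,
# where f is the camera focal length expressed in pixels.

def apparent_size_px(real_size_m: float, distance_m: float, focal_px: float) -> float:
    """Approximate projected size, in pixels, of an object viewed head-on."""
    return focal_px * real_size_m / distance_m

# Example with a hypothetical 0.3 m square tag and an 800 px focal length:
print(apparent_size_px(0.3, 2.0, 800.0))  # ~120 px when viewed from close by
print(apparent_size_px(0.3, 8.0, 800.0))  # ~30 px from farther away (smaller, as in FIG. 2B)
```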
[0029] FIG. 3 shows an exemplary tag 340 including a Quick Response (QR) code. In at least one implementation, the QR code encodes data associated with the location of the tag 340. Examples of the encoded location information can include, but are not limited to, GPS coordinates of the tag 340, a height relative to the ground or sea level of the tag 340, an angle of the tag 340 relative to the horizon, a direction of the tag 340, rotation of the tag 340 about an axis that is orthogonal to the tag 340, and an address or intersection at which the tag 340 is located. The QR code can encode characteristics of a baseline image, such as a baseline size and a baseline distance associated with the tag 340. The baseline image size includes a baseline height and width of an image of the tag 340, and the baseline distance indicates the distance from the tag 340 at which an image of the tag 340 is the baseline image size. For example, at two feet from the tag 340, the baseline image size of the tag 340 is x pixels by y pixels. The QR code can encode metadata such as a unique identifier of the tag 340, which can be used to look up the baseline image size and baseline distance associated with the tag 340. The QR code can encode data identifying one or more corners of the baseline image size of the tag (e.g., top left corner or bottom right corner). The QR code can encode information about the object to which the tag 340 is affixed. For example, the tag 340 can be at a particular transit stop or at a particular distance away from a destination.
[0030] The encoded data is read from the image of the tag 340. The tag 340 includes a frame 354. The frame 354 can be used to determine the size of the image of the tag 340 relative to a baseline image size of the tag 340. For example, the length of one or more sides of the frame 354 in the image can be compared to a baseline length(s) of the sides of the frame 354. In at least one approach, an angle between a side of the frame 354 in an image of the tag 340 and the side of the frame 354 in a baseline orientation is determined.
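As one possible concretization of the encoded data described for the tag 340, the QR payload could carry the location and baseline characteristics as a small structured record. The JSON encoding and field names below are assumptions for illustration; the patent does not prescribe a payload format.

```python
import json

# Hypothetical payload format; the disclosure does not prescribe an encoding.
SAMPLE_PAYLOAD = json.dumps({
    "id": "tag-0042",
    "lat": 44.9778, "lon": -93.2650,   # GPS coordinates of the tag
    "height_m": 3.5,                    # height relative to the ground
    "baseline_px": [240, 240],          # baseline image size (width, height) in pixels
    "baseline_dist_m": 0.61,            # distance at which the baseline size applies (~two feet)
    "marker_dir": "right",              # baseline direction of the directional marker
})

def parse_tag_payload(payload: str) -> dict:
    """Parse decoded QR text into the fields used by the positioning steps."""
    data = json.loads(payload)
    return {
        "id": data["id"],
        "location": (data["lat"], data["lon"]),
        "baseline_size": tuple(data["baseline_px"]),
        "baseline_distance": data["baseline_dist_m"],
        "marker_direction": data.get("marker_dir", "right"),
    }

print(parse_tag_payload(SAMPLE_PAYLOAD)["location"])
```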
[0031] FIG. 4 shows an exemplary tag 450 including a QR code and a directional marker 452. The directional marker 452 is a visual indication of a baseline orientation of the tag 450. The directional marker 452 can be used in place of and/or in conjunction with data encoded in the QR code. For example, if the baseline orientation of the tag 450 is encoded in the QR code but there is error in reading the data encoded in the QR code, the directional marker 452 can be used to determine a baseline orientation of the tag 450. In at least one approach, an angle between the orientation of the directional marker in an image of the tag 450 and the orientation of the directional marker 452 in a baseline size is determined. The directional marker 452 points to the right in the baseline size of the tag 450, but may point up in an image of the tag 450. The arrow shape of the directional marker 452 aids in determining an orientation of an image of the tag 450 by providing a clear indication of the orientation of the tag 450. In at least one approach, the directional marker 452 is pointed in a particular direction (e.g., north) in the baseline size. Alternatively, the direction to which the directional marker 452 is pointed can be encoded in the QR code. Although the directional marker 452 is shown as an arrow indicating the bottom of the frame 454, the marker can be placed elsewhere in other implementations.
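Assuming the directional marker's direction has been extracted from the image as a 2-D vector, the rotation relative to the baseline orientation is the signed angle between the observed and baseline directions. A minimal sketch:

```python
import math

def rotation_deg(observed_dir_xy, baseline_dir_xy=(1.0, 0.0)) -> float:
    """Signed angle (degrees) from the baseline marker direction (pointing right)
    to the observed marker direction."""
    ox, oy = observed_dir_xy
    bx, by = baseline_dir_xy
    ang = math.degrees(math.atan2(oy, ox) - math.atan2(by, bx))
    return (ang + 180.0) % 360.0 - 180.0   # wrap to the range [-180, 180)

# Marker observed pointing "down" in image coordinates (y grows downward):
print(rotation_deg((0.0, 1.0)))   # 90 degrees of rotation, cf. FIG. 5D
```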
[0032] FIGs. 5A-5D show images of the tag 450 of FIG. 4 from different vantage points relative to a baseline image of the tag 450. The baseline image of the tag 450 is represented by the frame 454. Based on the skew of the image 502 of the tag 450 in FIG. 5A, the capture position of the image is to the right of the tag 450. The directional marker 452 points to the right in FIG. 5A, indicating that the camera at the capture position is not rotated relative to the baseline image. Because the size of the image 502 of the tag 450 is smaller than the frame 454, the capture position of the image is some distance from the tag 450 greater than the distance associated with the baseline image.
[0033] Based on the skew of the image 504 of the tag in FIG. 5B, the capture position of the image is to the left of the tag 450. The directional marker 452 points to the right in FIG. 5B indicating that the camera is not rotated relative to the baseline size of the tag 450. Because the size of the image 504 is smaller than the frame 454, and smaller than the size of the image 502 in FIG. 5A, the capture position of the image 504 is at a distance from the tag 450 greater than the distance associated with the baseline size, and a distance from the tag 450 greater than the capture position of image 502 of FIG. 5A.
[0034] Based on the skew of the image 506 of the tag 450 in FIG. 5C, the capture position of the image is slightly to the right of the tag 450. The directional marker 452 points to the left in FIG. 5C indicating that the camera at the capture position is rotated approximately 180 degrees relative to the baseline orientation of the tag 450. Because the size of the image 506 of the tag 450 is smaller than the frame 454, the capture position of the image 506 is at a distance further away from the tag 450 than the distance associated with the baseline image.
[0035] Based on the skew of the image 508 of the tag 450 in FIG. 5D, the capture position of the image is slightly to the right of the tag 450. The directional marker 452 points down in FIG. 5D indicating that the camera at the capture position is rotated approximately 90 degrees relative to the baseline image of the tag 450. Because the size of the image 508 of the tag 450 is smaller than the frame 454, the capture position of the image is at a distance from the tag 450 greater than the distance associated with the baseline size.
[0036] Although not shown in FIGs. 5A-5D, if the size of an image of the tag 450 was larger than the frame 454, then the capture position of the image is at a distance less than the distance associated with the baseline size.
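Under the pinhole approximation sketched earlier, the ratio of the baseline size to the observed size scales the baseline distance, which covers both the smaller-than-baseline cases of FIGs. 5A-5D and the larger-than-baseline case just described. The function below is illustrative, not the patent's prescribed computation.

```python
def distance_from_size(observed_px: float, baseline_px: float, baseline_dist_m: float) -> float:
    """Estimate camera-to-tag distance from the ratio of baseline to observed size."""
    return baseline_dist_m * (baseline_px / observed_px)

# Tag appears at half its baseline size -> roughly twice the baseline distance.
print(distance_from_size(observed_px=120.0, baseline_px=240.0, baseline_dist_m=0.61))  # ~1.22 m
# Tag appears larger than baseline -> closer than the baseline distance.
print(distance_from_size(observed_px=480.0, baseline_px=240.0, baseline_dist_m=0.61))  # ~0.31 m
```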
[0037] FIG. 6 is a flowchart of a method for determining a position of a vehicle based on an image of a tag. FIGs. 7A-7B show an illustrative example described in conjunction with FIG. 6. At decision block 702, a position system onboard a vehicle determines whether the strength of a signal from a GPS satellite is less than a threshold. As explained above, the strength of a signal from a GPS satellite can be affected by obstruction(s) in the line-of-sight between the GPS satellite and a GPS receiver onboard the vehicle. The signal may become intermittent or completely blocked. In response to determining that the strength of the signal is greater than or equal to the threshold, at block 704, the system can use GPS to determine the position of the vehicle. The system can periodically check and determine the strength of the GPS signal.
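The decision at block 702 can be sketched as a simple fallback. The threshold value, its units, and the function names are hypothetical; the disclosure does not specify a particular signal-strength measure.

```python
GPS_STRENGTH_THRESHOLD = 30.0   # hypothetical value and units; not specified in the disclosure

def determine_position(gps_strength: float, gps_fix, image_based_fix):
    """Use the GPS fix while the signal is strong enough (block 704); otherwise fall
    back to capturing a tag image and deriving a position from it (blocks 706-714)."""
    if gps_strength >= GPS_STRENGTH_THRESHOLD and gps_fix is not None:
        return gps_fix
    return image_based_fix()   # callable that runs the image-based positioning steps

# Example with stub inputs: a weak signal forces the image-based path.
print(determine_position(12.0, (44.9775, -93.2650), lambda: (44.97749, -93.26501)))
```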
[0038] In response to determining that the strength of the signal is less than the threshold, at block 706, the system captures an image of a tag 450 with a camera onboard the vehicle. The tag has a known and fixed position. In the example shown in FIG. 7A, the tag 450 is illustrated affixed to a pole 756, such as a street lamp, a telephone pole, or a transit stop sign. Although the tag 450 is shown, any tag can be used.
[0039] At block 708 of FIG. 6, the system determines the location of the tag 450 from data encoded in the image of the tag 450. In one approach, the location is determined directly from the data encoded in the image. For example, the location can be encoded in a QR code included on the tag 450. In another approach, a unique identifier of the tag 450 can be encoded in the image of the tag 450. For example, a unique identifier can be encoded in a QR code included on the tag 450. Baseline sizes, baseline distances, and/or GPS coordinates of tags are associated with unique identifiers of the tags and can be stored in a computer database onboard a vehicle or stored centrally in a computer server. The onboard computer can look up the unique identifier read from a tag and read the baseline size, baseline distance, and/or GPS coordinates associated with the tag 450 captured in the image.
[0040] At block 710, the system compares the size of the image of the tag 450 to the baseline image size. The comparison can include determining a difference between a length of a side of the image of the tag 450 and a baseline length of the side of the baseline image. The comparison can include determining an amount of skew of the image relative to the baseline image. The comparison can also include determining an amount of rotation of the image of the tag 450 relative to a baseline image.
[0041] At block 712, the system determines a capture position 750 (FIG. 7A) of the image of the tag 450. The capture position 750 is the position of the camera that captured an image of the tag 450 relative to the tag 450. The camera is on board a vehicle, such as a transit vehicle or an aerial drone. The skew of the captured image is indicative of an angle 752 of the capture position 750 relative to the tag 450. The skew is also indicative of the location of the capture position 750 relative to the tag 450. For example, the skew indicates whether the capture position 750 is to the left of, to the right of, above, or below the tag 450. FIG. 7B illustrates the skew of the image of the tag 450 at the capture position 750. The size of the captured image of the tag 450 relative to the size of the baseline image is indicative of a distance 754 of the capture position 750 from the tag 450. The distance 754 is related to the ratio of the size of the captured image to the size of the baseline image. The capture position 750 is determined from the angle 752, the relative position, and the distance 754.
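The disclosure does not prescribe the geometry used to combine the angle 752, the relative position, and the distance 754. The sketch below assumes a simplified planar model in which the skew analysis has already yielded a bearing from the tag to the capture position; the east/north frame and the names are illustrative only.

    import math

    def capture_offset(distance_m, bearing_from_tag_deg):
        """Convert a distance and a bearing (measured from the tag) into a planar offset.

        The bearing is the direction from the tag to the capture position,
        measured clockwise from north; in the described system the skew and
        rotation of the imaged tag would be used to estimate this angle.
        """
        bearing = math.radians(bearing_from_tag_deg)
        east_m = distance_m * math.sin(bearing)
        north_m = distance_m * math.cos(bearing)
        return east_m, north_m

    # Example: 20 m from the tag, on a bearing of 135 degrees (south-east of the tag).
    print(capture_offset(20.0, 135.0))  # approximately (14.14, -14.14)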
[0042] At block 714, the system determines the position of the vehicle based on the location of the tag determined at block 708 and the capture position determined at block 712. The capture position 750 can be analogized to an offset from the location of the tag 450. Thus, if the location of the tag 450 and the capture position 750 are known, then the position of the vehicle can be determined. The determined position of the vehicle can be GPS coordinates offset from the GPS coordinates of the tag.
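The offset analogy in paragraph [0042] can be illustrated with a standard flat-earth approximation for converting a small metric offset into GPS coordinates. This is a sketch under that assumption, not a method required by the disclosure; the constant and names are illustrative.

    import math

    METERS_PER_DEG_LAT = 111_320.0  # approximate; adequate for offsets of tens of meters

    def offset_to_gps(tag_lat, tag_lon, east_m, north_m):
        """Apply a small east/north offset (in meters) to the tag's GPS coordinates.

        Uses the common flat-earth approximation, which is reasonable for the
        short tag-to-camera distances involved here.
        """
        dlat = north_m / METERS_PER_DEG_LAT
        dlon = east_m / (METERS_PER_DEG_LAT * math.cos(math.radians(tag_lat)))
        return tag_lat + dlat, tag_lon + dlon

    # Example: camera 14.1 m east and 14.1 m south of a tag at (44.9778, -93.2650).
    print(offset_to_gps(44.9778, -93.2650, 14.1, -14.1))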
[0043] At block 716, GPS coordinates, such as those determined before the strength of the GPS signal fell below the threshold, are supplemented with the position determined at block 714.
[0044] FIG. 8 is a block diagram showing a circuit arrangement for determining a position of a vehicle based on an image. The on-vehicle circuitry 800 includes a processor(s) 802, memory 804, and storage 806 for program instructions 808 and location data 810, all of which are coupled by bus 820. The circuitry 800 further includes a location signal receiver 812, a transmitter 814, and peripheral interface(s) 826, which are also coupled to bus 820. The transmitter 814 can issue a traffic signal priority (TSP) request. The location signal receiver 812 can be a GPS receiver. The peripheral interface(s) 826 provide access to data and control signals from a camera 828, for example. The camera 828 is onboard the vehicle.
[0045] In an example implementation, the on-vehicle circuitry 800 is implemented on a Nexcom VTC 6100 in-vehicle computer. The computer includes a processor, memory, peripheral interfaces, a bus, and retentive storage for program code and data. In one implementation, the location signal receiver is a TRIMBLE® Placer Gold Series receiver, and the transmitter is a Sierra Wireless GX-400 cellular modem. Those skilled in the art will recognize that other products may be suitably configured or circuitry custom built to provide the capabilities described herein.
[0046] The storage device 806 is configured with program instructions 808 that are executable by the processor and with location data 810. In executing the instructions, the processor 802 performs the processes and functions described herein. The location data include unique identifiers of tags and location information associated with a respective tag and corresponding unique identifier. Examples of location information include, but are not limited to, addresses, intersections, and GPS coordinates.
[0047] As explained above, buildings may obstruct the line-of-sight between a vehicle and a GPS satellite, rendering GPS unavailable. Traffic signals are in abundance in metropolitan areas and have long been used to regulate the flow of traffic at intersections. Generally, traffic signals have relied on timers or vehicle sensors to determine when to change traffic signal lights, thereby signaling alternating directions of traffic to stop, and others to proceed.
[0048] Emergency vehicles (e.g., the vehicle 908 shown in FIG. 9), such as police cars, fire trucks and ambulances, generally have the right to cross an intersection against a traffic signal. Emergency vehicles have in the past typically depended on horns, sirens and flashing lights to alert other drivers approaching the intersection that an emergency vehicle intends to cross the intersection. However, due to hearing impairment, air conditioning, audio systems and other distractions, often the driver of a vehicle approaching an intersection will not be aware of a warning being emitted by an approaching emergency vehicle.
[0049] Traffic control preemption systems assist authorized vehicles (emergency, public safety, and transit vehicles) through signalized intersections by making traffic signal priority (TSP) requests to the traffic signal controllers that control the traffic signal at the intersections. The traffic signal controller may respond to a TSP request from the vehicle 908 by changing the intersection lights to green in the direction of travel of the vehicle 908. Traffic control preemption systems improve the response time of public safety personnel, while reducing dangerous situations at intersections when an emergency vehicle is trying to cross on a red light. The speed and schedule efficiency for transit vehicles can be improved by traffic control preemption systems as well.
[0050] As used herein, “field of view” refers to an area in which a transmitted TSP request would be received by a traffic signal controller and acted upon to grant traffic signal preemption. A field of view may be defined by GPS coordinates that identify the boundaries of the field of view. FIG. 9 shows exemplary fields of view associated with traffic signal controllers. The field of view 910 is associated with the traffic signal controller 909, the field of view 912 is associated with the traffic signal controller 911, and the field of view 914 is associated with the traffic signal controller 913. For example, if the vehicle 908 transmits a TSP request when the vehicle 908 is in the field of view 910, then the TSP request will be received by the traffic signal controller 909 and may be acted upon. However, a TSP request transmitted in the field of view 910 may be received by the traffic signal controllers 911 or 913 but not acted upon.
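A field of view bounded by GPS coordinates can be tested with an ordinary point-in-polygon routine. The ray-casting sketch below is one illustrative way to make the in/out decision referred to later at block 1012; the example polygon coordinates and the name fov_910 are invented for illustration only.

    def point_in_field_of_view(lat, lon, boundary):
        """Ray-casting test for whether (lat, lon) lies inside a polygonal field of view.

        `boundary` is a list of (lat, lon) vertices describing the field of view,
        as suggested by the GPS-coordinate boundaries in paragraph [0050].
        """
        inside = False
        n = len(boundary)
        for i in range(n):
            lat1, lon1 = boundary[i]
            lat2, lon2 = boundary[(i + 1) % n]
            crosses = (lon1 > lon) != (lon2 > lon)
            if crosses and lat < (lat2 - lat1) * (lon - lon1) / (lon2 - lon1) + lat1:
                inside = not inside
        return inside

    # Example: a small rectangular field of view around an intersection.
    fov_910 = [(44.9770, -93.2660), (44.9770, -93.2640),
               (44.9786, -93.2640), (44.9786, -93.2660)]
    print(point_in_field_of_view(44.9778, -93.2650, fov_910))  # True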
[0051] GPS coordinates can be used to determine if the vehicle is within the field of view of a traffic signal controller and should be granted traffic signal priority. If GPS coordinates of the vehicle are unavailable, because a building obstructs the line-of-sight for the whole field of view, for example, then the vehicle 908 may be unable to issue a TSP request having accurate location information. If GPS coordinates are unavailable until just before the vehicle 908 reaches an intersection, because a building obstructs a portion of the field of view, then the vehicle may issue a TSP request too late for the traffic signal to change in time and/or for other traffic to adjust to the preempted traffic signal. The disclosed approaches determine the position (e.g., GPS coordinates) of a vehicle in the absence of GPS to provide uninterrupted availability of traffic control preemption systems.
[0052] FIG. 10 is a flowchart of a method for determining whether to issue a TSP request based on a position of a vehicle determined from an image. At block 1002, a traffic control preemption system captures an image of a tag with a camera onboard the vehicle 908. The tag is located on or near a traffic signal controller, such as the traffic signal controllers 909, 911, and 913 illustrated in FIG. 9.
[0053] At block 1004, a traffic control preemption system determines the location of the tag from data encoded in the image of the tag. In one approach, the location is determined directly from the data encoded in the image. For example, the location can be encoded in a QR code included on the tag. In another approach, a unique identifier of the tag can be encoded in the image of the tag. For example, a unique identifier can be encoded in a QR code included on the tag. Baseline sizes and baseline distances of tags are associated with unique identifiers of the tags and can be stored in a computer database onboard a vehicle or stored centrally in a computer server. The onboard computer can look up the unique identifier read from a tag or submit a request to a central server for the baseline size and baseline distance associated with the tag captured in the image.
[0054] At block 1006, the system compares the size of the captured image to the baseline size. The comparison can include determining a difference between a length of a side of the image of the tag and a baseline length of the side of the tag. The comparison can include determining an amount of skew of the image relative to the baseline size of the tag. The comparison can include determining an amount of rotation of the image of the tag relative to a baseline size of the tag.
[0055] At block 1008, the system determines a capture position of the image of the tag. The capture position is the position of the camera onboard the vehicle 908. The skew of the image of the tag is indicative of an angle of the capture position relative to the tag. The skew is also indicative of the location of the capture position relative to the tag. For example, the skew indicates whether the capture position is to the left of, to the right of, above, or below the tag. The size of the image relative to the baseline size of the tag is indicative of a distance of the capture position from the tag. The distance is related to the ratio of the size of the image of the tag to the baseline size of the tag. The capture position is determined from the angle, the relative position, and the distance.
[0056] At block 1010, the system determines the position of the vehicle based on the location of the tag from block 1004 and the capture position from block 1008. The capture position can be analogized to an offset from the location of the tag. Thus, if the location of the tag and the capture position are known, then the position of the vehicle 908 can be determined. The determined position of the vehicle 908 can be GPS coordinates offset from the GPS coordinates of the tag.
[0057] At block 1012, the system determines whether the position of the vehicle 908 is within a field of view of a traffic signal controller. For example, the system determines whether the determined GPS coordinates from block 1010 are within the field of view. In response to determining that the position of the vehicle 908 is within the field of view, at block 1014, the system issues a TSP request. In response to determining that the position of the vehicle 908 is not within the field of view, at block 1016, the system captures another image of the tag with a camera onboard the vehicle 908.
[0058] As explained above, transit vehicles (e.g., buses) use traffic control preemption systems. It is important that transit vehicles adhere to published schedules in order to satisfy riders' needs and ultimately to ensure the success of designated routes. If a transit vehicle arrives late to a scheduled stop or departs early, riders may be inconvenienced by having to wait for the next transit vehicle. If transit vehicles persistently fail to adhere to the published schedules, some riders may opt for alternative means of transportation. Declining ridership may affect the financial viability of certain routes.

[0059] The disclosed approaches to determining a position of a vehicle can be used to determine whether the vehicle is on-schedule or off-schedule. In one approach, the determined position is transmitted from the vehicle to a server. The server compares the determined position to a scheduled position. A signal indicative of an adherence status is transmitted from the server to the vehicle. The adherence status can be on-schedule or off-schedule. A TSP request is issued from the vehicle to the traffic signal controller based on the adherence status.
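A minimal sketch of the server-side comparison in paragraph [0059] follows. Representing positions as distance along the route and using a 150 m tolerance are assumptions made only for illustration; the disclosure states only that the server compares the determined position to a scheduled position and returns an adherence status.

    def adherence_status(determined_position_m, scheduled_position_m, tolerance_m=150.0):
        """Compare the image-derived position (as distance along the route, in meters)
        with the scheduled position and return the adherence status.

        The along-route representation and the 150 m tolerance are illustrative
        assumptions, not values taken from the disclosure.
        """
        off_by = abs(determined_position_m - scheduled_position_m)
        return "on-schedule" if off_by <= tolerance_m else "off-schedule"

    # A vehicle 450 m short of its scheduled position is off-schedule and could
    # request priority at the next signal based on that status.
    print(adherence_status(1200.0, 1650.0))  # off-schedule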
[0060] FIG. 11 is a flowchart of a method for determining whether to issue a TSP request based on headway, where the headway is based on a position of a vehicle determined from an image. As used herein, “headway” refers to the difference in arrival or departure times of consecutive transit vehicles on the same route. The headway parameter may be useful for multiple transit vehicles servicing the same route. Scheduled headway refers to the scheduled difference in arrival or departure times of consecutive transit vehicles on the same route, whereas the headway refers to the difference in estimated and/or actual arrival or departure times of consecutive transit vehicles on the same route.
[0061] The transit vehicle from which a TSP request is issued depends on whether the headway between a leading vehicle and a trailing vehicle is greater than or less than the scheduled headway. If the headway is greater than the scheduled headway, the trailing vehicle issues a TSP request in order to reduce the headway between the vehicles. If the headway is less than the scheduled headway, the leading vehicle issues a TSP request in order to increase the headway between the vehicles.
[0062] Block 1102 corresponds to the steps performed at blocks 706, 708, 710, 712, and 714 of FIG. 6. The vehicle described in association with FIG. 6 can be analogous to the vehicle 908 illustrated in FIG. 9. The tag is located on or near a traffic signal controller, such as the traffic signal controllers 909, 911, and 913.
[0063] At block 1112, the system determines the actual headway between the transit vehicle and another transit vehicle on the same transit route. The determined position of the transit vehicle is used to determine an estimated and/or actual arrival or departure time of the transit vehicle. The difference between the arrival or departure time of the transit vehicle and the arrival or departure time of the other transit vehicle is the headway.
[0064] At block 1114, the system compares the headway determined at block 1112 to the scheduled headway between the transit vehicle and the other transit vehicle. At decision block 1116, the system determines whether the headway is greater than the scheduled headway. In response to determining that the headway is greater than the scheduled headway, at decision block 1118, the system determines whether the transit vehicle is leading the other transit vehicle. In response to determining that the transit vehicle is leading the other transit vehicle, at block 1122, the system issues a TSP request from the other (trailing) transit vehicle so that the headway is reduced. In response to determining that the transit vehicle is trailing the other transit vehicle, at block 1124, the system issues a TSP request from the transit vehicle.
[0065] In response to determining, at decision block 1116, that the headway is less than the scheduled headway, at decision block 1120, the system determines whether the transit vehicle is leading the other transit vehicle. In response to determining that the transit vehicle is leading the other transit vehicle, at block 1124, the system issues a TSP request from the transit vehicle. In response to determining that the transit vehicle is trailing the other transit vehicle, at block 1122, the system issues a TSP request from the other transit vehicle.
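The headway rule of paragraph [0061], which the flow of FIG. 11 implements, can be summarized in a short sketch. The function below is illustrative; the string return values and the treatment of an exactly on-schedule headway are assumptions, not part of the disclosure.

    def vehicle_to_request_priority(headway_s, scheduled_headway_s, this_vehicle_is_leading):
        """Select which transit vehicle should issue the TSP request (FIG. 11 logic).

        If the headway exceeds the schedule, the trailing vehicle requests priority
        to close the gap; if it is below the schedule, the leading vehicle requests
        priority to open the gap. Returns None when the headway matches the schedule.
        """
        if headway_s > scheduled_headway_s:
            return "other vehicle" if this_vehicle_is_leading else "this vehicle"
        if headway_s < scheduled_headway_s:
            return "this vehicle" if this_vehicle_is_leading else "other vehicle"
        return None

    # Headway has grown to 12 minutes against an 8-minute schedule; this (trailing)
    # vehicle should request priority to catch up.
    print(vehicle_to_request_priority(720, 480, this_vehicle_is_leading=False))  # this vehicle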
[0066] FIG. 12 is a flowchart of a method for adjusting a trajectory of an aerial vehicle based on a position of the aerial vehicle determined from an image. At decision block 1202, a trajectory control system of the aerial vehicle determines whether the aerial vehicle is within a threshold distance from a destination. The destination, for example, can be a landing zone. GPS coordinates of the aerial vehicle can be used to determine if the aerial vehicle is within the threshold distance. In response to determining that the aerial vehicle is not within the threshold distance, at block 1204, the system maintains the current trajectory of the aerial vehicle. In response to determining that the aerial vehicle is within the threshold distance, at block 1206, the system captures an image of a tag with a camera onboard the aerial vehicle. The tag is located at or near the destination.
[0067] At block 1208, the system determines the location of the tag from data encoded in the image of the tag. In one approach, the location is determined directly from the data encoded in the image. For example, the location can be encoded in a QR code included on the tag. In another approach, a unique identifier of the tag can be encoded in the image of the tag. For example, a unique identifier can be encoded in a QR code included on the tag. Baseline sizes of tags are associated with the unique identifiers of the tags. The unique identifier is looked up to determine the baseline size of the tag captured in the image. The baseline sizes can be stored in memory onboard the vehicle and/or on a server. The system can be in communication with the server and transmit the unique identifier to the server. The server can respond with the associated baseline size of the tag.
[0068] At block 1210, the system compares the size of the image of the tag to the baseline size of the tag. The comparison can include determining a difference between a length of a side of the image of the tag and a baseline length of the side of the tag. The comparison can include determining an amount of skew of the image relative to the baseline size of the tag. The comparison can include determining an amount of rotation of the image of the tag relative to a baseline size of the tag.
[0069] At block 1212, the system determines a capture position of the image of the tag. The capture position is the position of the camera onboard the aerial vehicle. The skew of the image of the tag is indicative of an angle of the capture position relative to the tag. The skew is also indicative of the location of the capture position relative to the tag. For example, the skew indicates whether the capture position is to the left of, to the right of, above, or below the tag. The size of the image relative to the baseline size of the tag is indicative of a distance of the capture position away from the tag. The distance is related to the ratio of the size of the image of the tag to the baseline size of the tag. The capture position is determined from the angle, the relative position, and the distance.
[0070] At block 1214, the system determines the position of the aerial vehicle based on the location of the tag determined at block 1208 and the capture position determined at block 1212. The capture position can be analogized to an offset from the location of the tag. Thus, if the location of the tag and the capture position are known, then the position of the aerial vehicle can be determined. The determined position of the aerial vehicle can be GPS coordinates offset from the GPS coordinates of the tag.
[0071] At decision block 1216, the system determines whether the current trajectory of the aerial vehicle is headed towards the destination. In response to determining that the current trajectory of the aerial vehicle is headed towards the destination, at block 1204, the system maintains the current trajectory of the aerial vehicle. In response to determining that the current trajectory of the aerial vehicle is not headed towards the destination, at block 1218, the system adjusts the current trajectory of the aerial vehicle towards the destination based on the determined position of the aerial vehicle determined at block 1214. For example, the system can adjust the orientation of one or more control surfaces and/or thrusters of the aerial vehicle. At block 1220, the system captures another image of the tag with the camera on the aerial vehicle.
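The trajectory check and adjustment at blocks 1216-1218 can be illustrated with a simple heading-correction calculation in a local planar frame. This is a sketch under that simplification; a real aerial vehicle would feed such an error into its flight controller, and the coordinate frame and names below are illustrative assumptions.

    import math

    def heading_correction(current_heading_deg, position, destination):
        """Return the signed heading change (degrees) that points the vehicle at the tag.

        `position` and `destination` are (east_m, north_m) coordinates in a local
        planar frame; this stands in for the trajectory adjustment at block 1218.
        """
        de = destination[0] - position[0]
        dn = destination[1] - position[1]
        desired = math.degrees(math.atan2(de, dn)) % 360.0   # bearing to destination
        error = (desired - current_heading_deg + 180.0) % 360.0 - 180.0
        return error

    # Drone 30 m east and 40 m south of the destination, heading due north:
    # the correction is about -36.9 degrees, i.e., turn left toward the tag.
    print(heading_correction(0.0, (30.0, -40.0), (0.0, 0.0)))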
[0072] Various blocks, modules, devices, systems, units, controllers, or engines can be implemented to carry out one or more of the operations and activities described herein and/or shown in the figures. In these contexts, a block, module, device, system, unit, or controller is a circuit that carries out one or more of the disclosed or related operations/activities. For example, in certain of the above-discussed implementations, one or more blocks, modules, devices, systems, units, or controllers are discrete logic circuits or programmable circuits configured and arranged for implementing these operations/activities, as shown in FIGS. 6 and 10-12. The programmable circuitry can be one or more computer circuits programmed to execute a set (or sets) of instructions (and/or configuration data). The instructions (and/or configuration data) can be in the form of firmware or software stored in and accessible from a memory (circuit).
[0073] Some implementations are directed to a computer program product (e.g., nonvolatile memory device), which includes a machine or computer-readable medium having stored thereon instructions which may be executed by a computer (or other electronic device) to perform these operations/activities.
[0074] Though aspects and features may in some cases be described in individual figures, it will be appreciated that features from one figure can be combined with features of another figure even though the combination is not explicitly shown or explicitly described as a combination.
[0075] The embodiments are thought to be applicable to a variety of systems for controlling traffic signal phases. Other aspects and embodiments will be apparent to those skilled in the art from consideration of the specification. The embodiments may be implemented as one or more processors configured to execute software, as an application specific integrated circuit (ASIC), or as a logic on a programmable logic device. It is intended that the specification and illustrated embodiments be considered as examples only, with a true scope of the invention being indicated by the following claims.

Claims

What is claimed is:
1. A method, comprising:
capturing an image of a tag with a camera onboard a vehicle;
determining a location of the tag from data encoded in the image;
comparing a size of the image of the tag to a baseline size of the tag;
determining a capture position of the image of the tag; and
determining a position of the vehicle based on the determined location of the tag and the capture position.
2. The method of claim 1, wherein the comparing the size of the image includes comparing lengths of sides of the image of the tag to baseline lengths of sides of the tag.
3. The method of any of claims 1 or 2, wherein the comparing the size of the image includes determining skew of the image of the tag relative to the baseline size of the tag.
4. The method of any of claims 1, 2, or 3, wherein the tag includes a Quick Response (QR) code, and the determining the location of the tag includes determining the location of the tag from data encoded in the QR code.
5. The method of claim 4, wherein the QR code encodes global positioning system (GPS) coordinates and a baseline orientation of the tag, and the determining the location of the tag is based on the encoded GPS coordinates.
6. The method of claim 4, wherein the QR code encodes an identifier, and the method further comprises:
looking up the identifier in a database; and
reading GPS coordinates associated with the identifier in the database.
7. The method of any of claims 1, 2, 3, 4, 5, or 6, wherein the tag includes a directional marker, and the method further comprises:
determining an orientation of the directional marker of the tag in the image; and
determining an orientation of the tag from the determined orientation of the directional marker.
8. The method of any of claims 1, 2, 3, 4, 5, 6, or 7, further comprising:
determining whether the position of the vehicle is within a field of view of a traffic signal controller; and
in response to determining the position of the vehicle is within the field of view, issuing a traffic signal priority (TSP) request to the traffic signal controller.
9. A system for determining a position of a vehicle, comprising:
a global positioning system (GPS) subsystem configured and arranged to:
receive a GPS signal; and
determine GPS coordinates of the vehicle from the GPS signal;
an image-based positioning subsystem including a camera onboard the vehicle, the image-based positioning subsystem configured and arranged to:
capture an image of a tag with the camera;
determine a location of the tag from data encoded in the image;
compare a size of the image of the tag to a baseline size of the tag;
determine a capture position of the image of the tag; and
determine a position of the vehicle based on the determined location of the tag and the capture position; and
a controller coupled to the GPS subsystem and the image-based positioning subsystem, the controller configured and arranged to activate the image-based positioning subsystem in response to a strength of the GPS signal being less than a threshold strength.
10. The system of claim 9, wherein the controller is configured and arranged to supplement the GPS coordinates with the determined position in response to the strength of the GPS signal being less than a threshold strength.
11. A method, comprising:
determining whether an aerial vehicle is within a threshold distance from a destination; and
in response to determining that the aerial vehicle is within the threshold distance:
capturing an image of a tag with a camera onboard the aerial vehicle, wherein the tag marks the destination;
determining a location of the tag from data encoded in the image;
comparing a size of the image of the tag to a baseline size of the tag;
determining a capture position of the image of the tag;
determining a position of the aerial vehicle based on the determined location of the tag and the capture position; and
adjusting a trajectory of the aerial vehicle towards the destination based on the determined locations.
12. The method of claim 11, further comprising:
capturing another image of the tag by the camera onboard the aerial vehicle subsequent to adjusting the trajectory;
comparing a size of the other image of the tag to the baseline size of the tag;
determining a capture position of the other image of the tag;
determining a position of the aerial vehicle based on the determined location of the other tag and the capture position of the other image; and
adjusting the adjusted trajectory towards the destination based on the determined locations.
13. The method of any of claims 11 or 12, wherein determining whether the aerial vehicle is within the threshold distance is based on global positioning system (GPS) coordinates, and
wherein the method further comprises, in response to determining that the aerial vehicle is within the threshold distance, supplementing the GPS coordinates with the determined position, wherein the determined position has greater accuracy than the GPS coordinates.
14. The method of any of claims 11, 12, or 13, wherein the tag includes a Quick Response (QR) code, and the determining the location of the tag includes determining the location of the tag from data encoded in the QR code.
PCT/US2019/021143 2018-03-08 2019-03-07 Determining position of vehicle based on image of tag WO2019173585A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201815915897A 2018-03-08 2018-03-08
US15/915,897 2018-03-08

Publications (2)

Publication Number Publication Date
WO2019173585A2 true WO2019173585A2 (en) 2019-09-12
WO2019173585A3 WO2019173585A3 (en) 2019-12-05

Family

ID=65818708

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/021143 WO2019173585A2 (en) 2018-03-08 2019-03-07 Determining position of vehicle based on image of tag

Country Status (1)

Country Link
WO (1) WO2019173585A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021091989A1 (en) * 2019-11-05 2021-05-14 Continental Automotive Systems, Inc. System and method for precise vehicle positioning using bar codes, polygons and projective transformation

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006065563A2 (en) * 2004-12-14 2006-06-22 Sky-Trax Incorporated Method and apparatus for determining position and rotational orientation of an object
KR20100094570A (en) * 2007-12-11 2010-08-26 콸콤 인코포레이티드 Gnss method and receiver with camera aid
WO2013003504A2 (en) * 2011-06-27 2013-01-03 Stc, Inc. Signal light priority system utilizing estimated time of arrival

Also Published As

Publication number Publication date
WO2019173585A3 (en) 2019-12-05

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19712439

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19712439

Country of ref document: EP

Kind code of ref document: A2