US20150179088A1 - Traffic light detecting system and method - Google Patents
- Publication number
- US20150179088A1 (application US13/011,036, also referenced as US201113011036A)
- Authority
- US
- United States
- Prior art keywords
- traffic light
- status
- user
- user device
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/007—Teaching or communicating with blind persons using both tactile and audible presentation of the information
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
-
- G06K9/00671—
-
- G06K9/3208—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/003—Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/006—Teaching or communicating with blind persons using audible presentation of the information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72457—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72475—User interfaces specially adapted for cordless or mobile telephones specially adapted for disabled users
- H04M1/72481—User interfaces specially adapted for cordless or mobile telephones specially adapted for disabled users for visually impaired users
-
- H04M1/72569—
-
- H04M1/72572—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/026—Services making use of location information using location based information parameters using orientation information, e.g. compass
Definitions
- a method of identifying a traffic light status is provided, where the traffic light status comprises at least one of a color illumination status and a sign status of a traffic light.
- the method comprises receiving, at a user device, geolocation data associated with the traffic light and the user device.
- the user device geolocation data includes a location of the user device, and the traffic light geolocation data includes a location of the traffic light.
- the method also includes detecting an elevation and a direction of the user device; and determining the status of the traffic light with the user device based on the geolocation data associated with the traffic light and the user device, the elevation and the direction associated with the user device.
- the method also includes communicating the status to a user of the user device.
- a device for identifying a traffic light status is provided, where the traffic light status comprises at least one of a color illumination status and a sign status of a traffic light.
- the device comprises a memory for storing information, including geolocation data associated with a traffic light and the device, as well as direction and elevation information of the device.
- the device geolocation data includes a location of the device, and the traffic light geolocation data includes a location of the traffic light.
- the device also includes a processor coupled to the memory.
- the processor is configured to receive geolocation data associated with the traffic light and the device, and to detect an elevation and a direction associated with the device.
- the processor is also configured to determine the status of the traffic light with the device based on the received geolocation data, the detected elevation and the direction of the device.
- the processor is further configured to communicate the status to a user of the device.
- a method for identifying a traffic light status with a server computer comprises receiving, from a user device, geolocation data associated with the traffic light and the user device, elevation and direction data associated with the user device.
- the user device geolocation data includes a location of the user device, and the traffic light geolocation data includes a location of the traffic light.
- the method also comprises determining the status of the traffic light based on the received geolocation data, the elevation and direction data, and transmitting the status to the user device. Determining the status of the traffic light comprises generating one or more instructions to orient the user device to face the traffic light and transmitting the instructions to the user device.
- the method further comprises instructing the user device to communicate the status to the user.
- a server apparatus is employed to identify a traffic light status.
- the traffic light status comprises at least one of a color illumination status and a sign status of a traffic light.
- the apparatus comprises a memory for storing information, including geolocation data associated with a traffic light and a user device, as well as direction and elevation information of the user device.
- the user device geolocation data includes a location of the device, and the traffic light geolocation data includes a location of the traffic light.
- the apparatus also includes a processor coupled to the memory.
- the processor is configured to receive geolocation data associated with the traffic light and the user device, and to detect an elevation and a direction associated with the user device.
- the processor is also configured to determine the status of the traffic light based on the received geolocation data, the detected elevation and the direction of the user device, including generating one or more instructions to orient the user device to face the traffic light and transmitting the instructions to the user device.
- the processor is further configured to instruct the user device to communicate the status to the user.
- a system is provided that comprises memory means for storing information data.
- the information data includes geolocation data associated with a traffic light and a user device, as well as direction and elevation information of the user device.
- the user device geolocation data includes a location of the device, and the traffic light geolocation data includes a location of the traffic light.
- the system also includes means for detecting a location of a traffic light and a location of the user device, means for detecting a direction of the user device, means for detecting an elevation of the user device, and means for detecting a light signal of the traffic light.
- the system further includes means for capturing an image of the traffic light.
- the system also includes processor means for determining a status of the traffic light.
- the traffic light status comprises at least one of a color illumination status and a sign status of a traffic light.
- the processor means is also for generating one or more instructions to orient the user device to face the traffic light based on the geolocation data associated with the traffic light and the user device, the elevation and the direction of the user device.
- the system further includes means for outputting the traffic light status to a user of the user device.
- determining the status of the traffic light comprises determining, from the geolocation data of the user device, if a current location of the user device is adjacent to an intersection. In the case where the current location is adjacent to the intersection, the method further comprises prompting the user to orient the user device to face the traffic light.
- determining the status of the traffic light with the user device includes generating one or more instructions based on the geolocation data of the traffic light and the user device, the elevation and the direction of the user device. Orienting the user device to face the traffic light also includes providing the instructions to the user.
- generating the one or more instructions to orient the user device includes generating a map for an area between the user device and the traffic light; and calculating deviations from the user device to the traffic light based on the map.
- the method comprises capturing one or more images of the traffic light and a surrounding area by an image capture device of the user device. In this case, determining the status of the traffic light is performed based on the captured images.
- the method includes receiving information related to the traffic light, where the information comprises at least one of a size, a type and a timing sequence of the traffic light. In this situation, determining the status of the traffic light is performed based on the received information related to the traffic light.
- the information related to the traffic light is received from a server computer.
- the information related to the traffic light is generated based on the captured images.
- determining the status of the traffic light with the user device comprises receiving light signals from a plurality of light sources, where one of the light sources comprises the traffic light. Determining the status of the traffic light also includes filtering light signals for frequency ranges emitted by the traffic light.
- determining the status of the traffic light includes recognizing signs associated with the traffic light.
- the method includes receiving audible information related to the status of the traffic light, and determining the status of the traffic light is further based on the audible information.
- communicating the traffic light status to the user includes generating an audible output.
- communicating the traffic light status to the user includes generating a tactile output.
- the location of the traffic light and the location of the user device are detected by a geographical position device.
- the direction associated with the user device is detected by a digital compass.
- FIG. 1 is a pictorial diagram of a system in accordance with aspects of the invention.
- FIG. 2A is a pictorial diagram of a system in accordance with aspects of the invention.
- FIG. 2B is a functional diagram in accordance with aspects of the invention.
- FIG. 3 is a flowchart in accordance with aspects of the invention.
- FIG. 4 is a functional diagram in accordance with aspects of the invention.
- FIG. 5A is an exemplary diagram in accordance with aspects of the invention.
- FIG. 5B is an exemplary diagram in accordance with aspects of the invention.
- FIG. 6 is an exemplary diagram in accordance with aspects of the invention.
- a system determines the status of traffic lights through a mobile device and describes the status to the user of the device.
- the mobile device detects its geographical location, e.g., through a GPS system and determines if the user is at an intersection or is otherwise near a traffic light based on the knowledge from a map database.
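- For illustration, a minimal sketch of this proximity check might compare the current GPS fix against known intersection coordinates; the sample coordinates and the 20-meter radius below are assumptions standing in for the map database:

```python
import math

# Hypothetical intersection coordinates; a real system would query a map database.
INTERSECTIONS = [
    (37.423021, -122.083939),
    (37.424100, -122.081500),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near_intersection(lat, lon, radius_m=20.0):
    """Return the closest mapped intersection if the device is within radius_m of it."""
    best = min(INTERSECTIONS, key=lambda p: haversine_m(lat, lon, p[0], p[1]))
    return best if haversine_m(lat, lon, best[0], best[1]) <= radius_m else None

print(near_intersection(37.423050, -122.083900))  # -> (37.423021, -122.083939)
```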
- the mobile device also receives the geographical location and other information related to the traffic light at the intersection from a database storing traffic lights information.
- the direction and elevation of the device are calculated (e.g., by the mobile device).
- the device then generates one or more prompts, such as audible or tactile cues, that progressively guide the user until the device is precisely pointed at the light.
- the mobile device detects the image and color of the traffic light. The detected image and color are processed, and the status of the traffic light is determined. Then the mobile device communicates the traffic light status to the user.
- a system 100 in accordance with one aspect of the invention includes a server computer 110 , a mobile device 160 , a network 90 , a traffic light 102 and a satellite 103 .
- the mobile device 160 is connected through network 90 to the server 110 .
- the mobile device may have a built-in GPS receiver to receive geolocation data from satellite 103 .
- the traffic light 102 may have lights 102 a - 102 c , each of which is dedicated to one corresponding street lane.
- the lights emitted by the traffic light 102 may be detected by the mobile device 160 and be further processed by the device or the server or both.
- the network 90 may connect with one or more mobile devices 160 and 170 , server computers 110 and 112 and a plurality of databases 136 , 138 , 140 and 142 .
- Various types of data such as user-related information, traffic light information, location/map data, image processing programs, may be stored in these databases and downloaded to the server or the mobile device.
- Various functions, such as image processing, may be performed on the mobile device 160 or on the server 110 .
- the server computer contains a processor 120 , memory 130 and other components typically present in general purpose computers.
- the memory 130 stores information accessible by processor 120 , including instructions 132 and data 134 that may be executed or otherwise used by the processor 120 .
- the memory 130 may be of any type capable of storing information accessible by the processor, including a computer-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories.
- Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
- the instructions 132 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor.
- the instructions may be stored as computer code on the computer-readable medium.
- the terms “instructions” and “programs” may be used interchangeably herein.
- the instructions may be stored in object code format for direct processing by the processor, or in any other computer language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below.
- the data 134 may be retrieved, stored or modified by processor 120 in accordance with the instructions 132 .
- the data may be stored in computer registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files.
- the data may also be formatted in any computer-readable format.
- image data may be stored as bitmaps comprised of grids of pixels that are stored in accordance with formats that are compressed or uncompressed, lossless (e.g., BMP) or lossy (e.g., JPEG), and bitmap or vector-based (e.g., SVG), as well as computer instructions for drawing graphics.
- the data may comprise any information sufficient to identify the relevant information, such as numbers, descriptive text, proprietary codes, references to data stored in other areas of the same memory or different memories (including other network locations) or information that is used by a function to calculate the relevant data.
- the processor 120 may be any conventional processor, such as processors from Intel Corporation or Advanced Micro Devices. Alternatively, the processor may be a dedicated device such as an ASIC.
- although FIG. 2B functionally illustrates the processor and memory as being within the same block, it will be understood by those of ordinary skill in the art that the processor and memory may actually comprise multiple processors and memories that may or may not be stored within the same physical housing.
- memory may be a hard drive or other storage media located in a server farm of a data center. Accordingly, references to a processor or computer will be understood to include references to a collection of processors or computers or memories that may or may not operate in parallel.
- the server 110 may be at one node of network 90 and capable of directly and indirectly communicating with other nodes of the network.
- server 110 may comprise a web server that is capable of communicating with user devices 160 and 170 via network 90 such that server 110 uses network 90 to transmit and display information to a user, such as person 191 or 192 of FIG. 1B , on a display of client device 160 .
- Server 110 may also comprise a plurality of computers that exchange information with different nodes of a network for the purpose of receiving, processing and transmitting data to the client devices. In this instance, the user devices will typically be at different nodes of the network than any of the computers comprising server 110 .
- Network 90 may comprise various configurations and use various protocols including the Internet, World Wide Web, intranets, virtual private networks, local Ethernet networks, private networks using communication protocols proprietary to one or more companies, cellular and wireless networks (e.g., WiFi), instant messaging, HTTP and SMTP, and various combinations of the foregoing. Although only a few computers are depicted in FIGS. 1-2 , it should be appreciated that a typical system can include a large number of connected computers.
- the user devices 160 and 170 may comprise mobile devices capable of wirelessly exchanging data with a server over a network such as the Internet.
- user device 170 may be a wireless-enabled PDA or a cellular phone capable of obtaining information via the Internet.
- the user may input information using a small keyboard (in the case of a Blackberry phone), a keypad (in the case of a typical cell phone) or a touch screen (in the case of a PDA).
- Each user device may be configured with a processor 120 , memory 130 and instructions 132 .
- Each client device 160 or 170 may be a device intended for use by a person 191 - 192 , and have all of the components normally used in connection with a mobile device such as a central processing unit (CPU), memory (e.g., RAM and internal hard drives) storing data and instructions such as a web browser, an electronic display 162 (e.g., a small LCD touch-screen or any other electrical device that is operable to display information), and user input 163 (e.g., keyboard, touch-screen and/or microphone), a network interface device (e.g., transceiver and antenna), as well as all of the components used for connecting these elements to one another.
- the output components on each user device may include a speaker 168 and a tactile output 166 .
- Memory 130 in each user device may store data 134 such as computer code that, in response to the detected light, orientation and position of the device, generates a set of prompts that continuously guide the user to orient the device to the traffic light.
- Data 134 may also include an image processing library 142 that consists of image recognition routines and appropriately tuned image filters to detect traffic lights. A history of intersections and traffic lights data may be recorded in memory 130 .
- the user devices may also include one or more geographic position components to determine the geographic location and orientation of the device.
- client device 160 may include a GPS receiver 174 to determine the device's latitude, longitude and/or altitude position.
- the geographic position components may also comprise software for determining the position of the device based on other signals received at the client device 160 , such as signals received at the antenna from one or more cellular towers or WiFi access points.
- It may also include an accelerometer, gyroscope or other acceleration device 172 to determine the direction in which the device is oriented.
- the acceleration device may determine its pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto.
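- As a rough sketch, the tilt of the device relative to gravity can be estimated from a static accelerometer sample as shown below; the axis convention and sign choices are assumptions, since they differ between platforms:

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (in degrees) from a static accelerometer reading.

    Assumes a common phone axis convention (x to the right, y up the screen,
    z out of the screen); actual conventions vary by platform.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat and face up: gravity lies entirely along +z.
print(pitch_roll_from_accel(0.0, 0.0, 9.81))  # -> (0.0, 0.0)
```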
- a client device's location and orientation data as set forth herein may be provided automatically to the user, to the server, or both.
- each user device may also include other components that help to detect the position, orientation and elevation of the device.
- Such components include but are not limited to, a digital compass 176 , an inclinometer 178 and an altimeter 186 .
- Each user device may include image and/or color capture components such as a camera 184 , one or more image sensors 180 and one or more image filters 182 .
- aspects of the invention are not limited to any particular manner of transmission of information.
- information may be sent via a medium such as an optical disk or portable drive.
- the information may be transmitted in a non-electronic format and manually entered into the system.
- although some functions are indicated as taking place on a server and others on a client, various aspects of the system and method may be implemented by a single computer having a single processor.
- Server 110 may store map-related information 140 , at least a portion of which may be transmitted to a client device.
- the server may store map tiles, where each tile comprises a map image of a particular geographic area.
- a single tile may cover an entire region such as a state in relatively little detail and another tile may cover just a few streets in high detail.
- a single geographic point may be associated with multiple tiles, and a tile may be selected for transmission based on the desired level of zoom.
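- One common way to index such tiles is the Web Mercator tiling scheme sketched below; this particular scheme is an assumption for illustration rather than something specified here:

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Map a latitude/longitude to (x, y) tile indices in the Web Mercator scheme."""
    lat = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y

# The same geographic point falls into different tiles at different zoom levels.
print(latlon_to_tile(37.423021, -122.083939, 10))  # coarse, region-level tile
print(latlon_to_tile(37.423021, -122.083939, 17))  # detailed, street-level tile
```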
- the map information is not limited to any particular format.
- the images may comprise street maps, satellite images, or a combination of these, and may be stored as vectors (particularly with respect to street maps), bitmaps (particularly with respect to satellite images), or flat files.
- the various map tiles are each associated with geographical locations, such that the server 110 and/or client device are capable of selecting, retrieving, transmitting, or displaying one or more tiles in response to receiving one or more geographical locations.
- the system and method may process locations expressed in different ways, such as latitude/longitude positions, street addresses, street intersections, an x-y coordinate with respect to the edges of a map (such as a pixel position when a user clicks on a map), names of buildings and landmarks, and other information in other reference systems that is capable of identifying a geographic location (e.g., lot and block numbers on survey maps).
- a location may define a range of the foregoing. Locations may be further translated from one reference system to another.
- the user device 160 may access a geocoder to convert a location identified in accordance with one reference system (e.g., a street address such as “1600 Amphitheatre Parkway, Mountain View, Calif.”) into a location identified in accordance with another reference system (e.g., a latitude/longitude coordinate such as (37.423021°, −122.083939°)).
- FIG. 2B illustrates that information related to traffic lights may be stored in the server 110 .
- information related to a traffic light includes but is not limited to geographical location data, type and scale (e.g., size, shape and elevation) of the traffic light, pictures and other data related to the visual or positional features of the traffic light.
- User-specific or user-device specific data such as history of the intersections that the user has crossed, may also be stored in the server.
- the databases storing different types of data may reside on the same server, as shown in the configuration 240 of FIG. 2B .
- locations of traffic lights may be integrated with the map data in server 110 .
- the databases may reside on different servers distributed through the network, as illustrated in FIG. 2A .
- Data related to user information may be stored in database 136 .
- Traffic light related information may reside on database 138 .
- Location/map data may be stored in database 140 .
- Database 142 may contain calculation routines and modules for image processing.
- FIG. 3 depicts a flowchart 300 of one embodiment of the invention. It will be understood that the operations do not have to be performed in the precise order described below. Rather, various steps can be handled in a different order or simultaneously.
- a user holding a mobile device may walk along a busy urban area.
- the mobile device desirably continuously receives GPS data and detects its current GPS coordinates.
- the device may also detect positioning data such as elevation and orientation of the device.
- based on the detected current location and the knowledge of map information from a map database, the mobile device is able to correctly determine if the user has come to an intersection in block 312 .
- the map information may be retrieved from the map database by server 110 which sends the data to the mobile device. If the user's current location is at an intersection, the mobile device may prompt the user to stop and invoke the function of querying the state of the intersection in block 314 . If the current location is not an intersection, the user device desirably gives no indication or prompt, and continues to detect the present location as the user continues walking.
- the user invokes the device to query the state of the intersection.
- the device makes a request to retrieve traffic light information from a server connected to the device.
- the device receives traffic light related information such as the location, geometrical data (e.g., size, elevation, shape, etc.) and other data in block 318 .
- the mobile device may have prior knowledge of traffic light information, which comprises geolocation of traffic lights.
- the knowledge may be obtained from a server, such as server 110 or a database, such as database 138 , on a real time basis, or may be downloaded to the device. Therefore, the mobile device may automatically determine that the user is at an intersection and there is a traffic light at the intersection based on the present location, map data and the traffic light information. In this scenario, the device may prompt the user of the existence of a traffic light.
- the device may communicate the status of the intersection to the user in block 334 .
- the device may proceed to detect the traffic light in block 322 .
- the device may detect light of varying wavelengths from the surrounding area and filter the received wavelengths according to the wavelengths emitted from the traffic light.
- the device may also capture an image of the traffic light and the adjacent area and analyze the image to find the targeted traffic light.
- the device may determine that the user is not pointing the device to the traffic light. It then, in block 326 , calculates a vector of positional deviations from the device to the traffic light based on the detected orientation, elevation and position of the user device, as well as based on the scale and location data of the traffic light. The device may also incorporate the detected information of the surrounding area in the calculation.
- the vector of deviations may comprise horizontal deviation and vertical deviation.
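- A minimal sketch of such a deviation calculation is shown below; the field names for the device pose (lat, lon, height_m, heading_deg, pitch_deg) and for the light location are assumptions made for illustration:

```python
import math

def horiz_distance_m(lat1, lon1, lat2, lon2):
    """Approximate horizontal distance in meters (adequate at street scale)."""
    k = 111320.0  # meters per degree of latitude
    dx = (lon2 - lon1) * k * math.cos(math.radians((lat1 + lat2) / 2.0))
    dy = (lat2 - lat1) * k
    return math.hypot(dx, dy)

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360.0

def deviation_vector(device, light):
    """Return (horizontal_deg, vertical_deg) between where the device points and
    where the traffic light is: negative horizontal means rotate left, positive
    means rotate right; positive vertical means tilt up.
    """
    dist = horiz_distance_m(device["lat"], device["lon"], light["lat"], light["lon"])
    target_bearing = bearing_deg(device["lat"], device["lon"], light["lat"], light["lon"])
    target_pitch = math.degrees(math.atan2(light["height_m"] - device["height_m"], dist))
    d_h = (target_bearing - device["heading_deg"] + 180.0) % 360.0 - 180.0
    d_v = target_pitch - device["pitch_deg"]
    return d_h, d_v

device = {"lat": 37.42300, "lon": -122.08390, "height_m": 1.5,
          "heading_deg": 80.0, "pitch_deg": 5.0}
light = {"lat": 37.42310, "lon": -122.08380, "height_m": 6.0}
print(deviation_vector(device, light))  # negative first value: rotate the device to the left
```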
- the user standing at intersection 501 and holding device 160 may initially point the device in the direction of building 510 , and therefore the device deviates from the desired traffic light 102 c by a certain angle to the east.
- the device may prompt the user to move the device to the left by that angle.
- the device may also be configured to continuously prompt the user to move the device by a fraction of that angle and to stop moving the device once the desired position is reached. If the user's move is larger than desired and points the device in the direction of light 102 a instead of light 102 c , the device may prompt the user to move back to the east by the corresponding angle.
- the user device may be held too high by the user, and thus point to the building 515 above the traffic light 102 by a certain vertical angle.
- the device may prompt the user to move the device downward by that angle.
- the device may also be held too low and point to the lower portion of the traffic light pole, in which case the device may ask the user to move the device upward by the corresponding angle.
- the information related to the traffic light such as scale and elevation may be obtained by the user device from a server database storing such data. If the user device cannot obtain this information from such a database, it may acquire the information by capturing and processing an image of the traffic light.
- the user device may generate one or more simple instructions prompting the user to move the device to minimize the deviation in block 328 .
- the instructions may be output to the user in the form of a spoken message, such as “point up and to the left.”
- the device continues the loop of detecting the traffic light, calculating the deviation and prompting the user to adjust the direction of the device until the device is precisely pointed at the light.
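- For illustration, the deviation-to-prompt step inside this loop might look like the following sketch; the wording and the 3-degree alignment tolerance are assumptions:

```python
def prompt_from_deviation(d_h, d_v, tol_deg=3.0):
    """Turn a (horizontal, vertical) deviation into a short spoken prompt,
    or return None once the device is pointed closely enough at the light."""
    parts = []
    if d_v > tol_deg:
        parts.append("up")
    elif d_v < -tol_deg:
        parts.append("down")
    if d_h < -tol_deg:
        parts.append("to the left")
    elif d_h > tol_deg:
        parts.append("to the right")
    return "point " + " and ".join(parts) if parts else None

print(prompt_from_deviation(-10.0, 8.0))  # -> "point up and to the left"
print(prompt_from_deviation(1.0, -0.5))   # -> None (device is on target)
```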
- the device may filter and/or process the color images captured by an image capture component, such as a camera, in block 330 .
- the device determines the status of the traffic light accordingly.
- the device informs the user of the traffic light status, for example, through a speaker in spoken language: “green, pass.”
- the device may communicate to the user through non-audible methods, such as haptic output, the color of the traffic light and/or the status of the intersection.
- the device may be configured to provide the user with options to choose the type of communication.
- user device 160 may contain a position and direction detection module 408 that receives geolocation data 402 .
- the position and direction module 408 comprises components such as a GPS receiver 174 , a digital compass 176 , an altimeter 186 and an inclinometer 178 .
- server 110 may receive geolocation information from the GPS receiver embedded in the user device.
- the device may have access to latitude and longitude information. This information may be received by server 110 during connection with the user device in conformance with communication protocols.
- the device may use a system such as Google Chrome or the browser of the Android operating system, each of which may be configured with user permission to send GPS information to trusted network sites.
- Server 110 may use this information to determine a location of the device. Because the accuracy of GPS determinations may depend on the quality of the device and external factors such as environment, the device may further transmit data indicative of accuracy.
- the user device 160 may inform the server 110 that the transmitted latitude/longitude position is accurate within 50 meters, i.e., the device may be at any location within 50 meters of the transmitted position.
- the server may also assume a level of accuracy in the absence of such information.
- server 110 may extrapolate geographic locations from one or more various types of information received from the user devices.
- server 110 may receive location data from the user device indicating a potential location of the device.
- Location data may include an exact point, a set of points, a distance from a point, a range of points, or arbitrary boundaries, for example streets, cities, zip codes, counties, states or the like.
- Server 110 may also determine a range of accuracy of the location data and use this information to determine a position or area in which the user device may be located.
- Another location technology employs triangulation among multiple cell towers to estimate the device's position.
- a further location technology is IP geocoding.
- a client device's IP address on a data network may be mapped to a physical location.
- locations may be provided in any number of forms including street addresses, points of interest, or GPS coordinates of the user device.
- the detected geolocation data, elevation, tilt and orientation may be transmitted to a position and direction calculation module 412 for further processing and calculations.
- the calculation module may be a set of computer code that resides on the memory of the user device.
- a vector map may be built for the area approximately between the user's position and the traffic light's position.
- a raster grid overlay may be created where each cell in the grid overlay may be assigned an elevation value. Elevation data such as that of the user, of the traffic light pole and of the other clutter within the area are included in the map.
- the maps may help the device find a path of view from the device to the traffic light, so the device may calculate the direction and magnitude by which the user device should be moved, and further provide the instructions prompting the user to move the device to the desired position.
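- One possible sketch of such a path-of-view test over a raster elevation grid is shown below; the grid representation, cell size and function names are assumptions:

```python
def line_of_sight_clear(grid, start, end, eye_m, target_m):
    """Check whether a straight sight line from the device to the traffic light
    clears the heights stored in a raster grid.

    grid is a 2D list of ground/clutter heights in meters, start and end are
    (row, col) cells, eye_m and target_m are the heights of the device and of
    the traffic light head.
    """
    (r0, c0), (r1, c1) = start, end
    steps = max(abs(r1 - r0), abs(c1 - c0), 1)
    for i in range(1, steps):
        t = i / steps
        r = round(r0 + (r1 - r0) * t)
        c = round(c0 + (c1 - c0) * t)
        sight_height = eye_m + (target_m - eye_m) * t  # height of the sight line at this cell
        if grid[r][c] > sight_height:
            return False  # e.g., a building or tree blocks the view
    return True

# One-row grid with 1 m cells: a 4 m obstacle between the device (1.5 m) and the light (6 m).
grid = [[0, 0, 4, 0, 0]]
print(line_of_sight_clear(grid, (0, 0), (0, 4), eye_m=1.5, target_m=6.0))  # -> False
```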
- User device 160 may contain an image capture module 410 that comprises a camera 184 , one or more sensors 180 and/or filters 182 .
- This module may detect traffic light data 404 .
- the sensor 180 may comprise a single- or multi-element photodetector or monochrome/color detectors.
- the user device may also include a group of hardware bandpass filters 182 that only allow light at the desired wavelength to pass through. For example, the bandpass filters may only allow the emission frequencies of the traffic light LEDs to go through.
- the user device may further comprise an image processing module 414 .
- the image processing module 414 may include one or more image processing programs that perform the functions of image recognition and processing.
- the camera 184 may capture a photo image of the traffic light and the surrounding area.
- Various color and image processing models may be applied to process the captured image.
- color filtering may be performed by an appropriately tuned filter targeted for the red, yellow and/or green light(s). So regions of red, yellow and green light indicating an active traffic light may be identified in the image.
- color screening may also be performed by converting the captured image from the traditional RGB (Red, Green, Blue) tri-color model into another representative model, such as the HSI (Hue, Saturation, Intensity) color space, which is more related to human perception and less sensitive to illumination variation.
- the HSI space may be used to screen out the pixels not satisfying the threshold ranges.
- a binary foreground map may thus be obtained by setting those pixels falling within the desired range as foreground points with a positive value and those pixels being screened out as background points having a value of zero.
- Various screening techniques may be applied to reduce the impact of environmental variation, for example, by setting different illumination conditions for daytime and nighttime. Because traffic lights are active light sources emitting light in a particular direction, images may be purposefully toned down or otherwise made darker to enhance the contrast between a traffic light and other light sources.
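- A minimal sketch of this color screening is shown below, using OpenCV's HSV space as a stand-in for the HSI space described above; the hue and saturation thresholds are illustrative and would need day/night tuning:

```python
import cv2
import numpy as np

def traffic_light_foreground(bgr_image):
    """Binary foreground map of candidate red/yellow/green traffic-light pixels."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # OpenCV hue runs 0-179; red wraps around 0, so it needs two ranges.
    masks = [
        cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)),     # red (low hues)
        cv2.inRange(hsv, (170, 120, 120), (179, 255, 255)),  # red (high hues)
        cv2.inRange(hsv, (20, 120, 120), (35, 255, 255)),    # yellow/amber
        cv2.inRange(hsv, (45, 120, 120), (90, 255, 255)),    # green
    ]
    foreground = masks[0]
    for m in masks[1:]:
        foreground = cv2.bitwise_or(foreground, m)
    return foreground  # 255 = candidate traffic-light pixel, 0 = background

# Synthetic example: one pure green pixel on a dark background.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[1, 1] = (0, 255, 0)  # BGR green
print(traffic_light_foreground(img)[1, 1])  # -> 255
```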
- Morphology technologies such as erosion and dilation may be used to mask the foreground map and thus to remove noise and broken regions to find the shape of the traffic light.
- Edge detection and shape characterization may also be performed to further process the image to detect the traffic lights.
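- For example, the morphology and shape-characterization steps might be sketched as follows; the kernel size, minimum area and circularity threshold are assumptions:

```python
import cv2
import numpy as np

def find_light_blobs(foreground, min_area=20.0, min_circularity=0.6):
    """Clean a binary foreground map with erosion/dilation and keep the roughly
    circular blobs, which are candidate lit traffic-light lenses."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    cleaned = cv2.morphologyEx(foreground, cv2.MORPH_OPEN, kernel)   # erosion then dilation removes speckle
    cleaned = cv2.morphologyEx(cleaned, cv2.MORPH_CLOSE, kernel)     # dilation then erosion fills small holes
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = []
    for c in contours:
        area = cv2.contourArea(c)
        perimeter = cv2.arcLength(c, True)
        if area < min_area or perimeter == 0:
            continue
        circularity = 4.0 * np.pi * area / (perimeter ** 2)  # 1.0 for a perfect circle
        if circularity >= min_circularity:
            blobs.append(cv2.boundingRect(c))  # (x, y, w, h) of a candidate light
    return blobs
```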
- traffic light 102 at intersection 501 includes three traffic lights 102 a , 102 b and 102 c with each respectively dedicated to one of the three lanes 521 - 523 .
- the green light of light 102 a for lane 521 may have alternating signs of a left-turn arrow and a straight arrow, while lights 102 b and 102 c may bear no signs.
- Such references may be stored in a database, such as database 138 , or in a server, such as server 110 , containing traffic light information.
- the device may make use of the reference to the traffic light to decide which light the device should detect, and prompt the user accordingly to point the device to the light 102 c instead of the light 102 a . If no such reference is available, the device may take an image of the traffic light, and analyze the image to find the appropriate traffic light.
- various types of pedestrian traffic lights may be present at an intersection. Some pedestrian lights may only perform pattern changing but not color changing. For example, the background color of pedestrian light 610 or 615 may stay the same when they change the signs from “DON'T WALK” to “WALK”, or from a standing person to a walking person.
- the image processing module 414 may include pattern and image recognition routines to discern the signs on the traffic light. For example, classified models of traffic lights may be used by the image processing module 414 for template matching. Existing classification and knowledge of the traffic lights and signs may be obtained from a database on the network or may be built into the image processing programs.
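- A hedged sketch of such template matching is given below; the template labels and the 0.7 matching threshold are assumptions:

```python
import cv2

def classify_sign(gray_crop, templates, threshold=0.7):
    """Match a cropped grayscale pedestrian-signal image against reference templates.

    templates maps a label (e.g., "WALK", "DONT_WALK") to a grayscale template
    image no larger than the crop; returns the best label or None.
    """
    best_label, best_score = None, threshold
    for label, template in templates.items():
        result = cv2.matchTemplate(gray_crop, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```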
- the image processing routines may take into account various features related to the traffic light status, e.g., the geometrical status (vertical and horizontal orientation, size and position) of the traffic light, signs on the light, the timing sequence of the traffic light, and visual environmental cues.
- the image processing routines are also robust to visual clutter from other sources, e.g., neon light from a nearby building.
- Features other than visual characteristics, such as sound data, may also be used to determine the light status. For instance, audible speech may state “walk” or “don't walk”; or chirping tones from a sound device 620 in FIG. 6 (such as a transducer or speaker) may be incorporated into the configuration.
- neural networks may be employed for recognition and classification of the traffic lights and the related signs.
- large computations for template matching may be avoided.
- the input image may not need to be transformed into another representative space such as Hough space or Fourier domain.
- the recognition result may depend only on the correlation between the network weights and the network topology itself.
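- As a rough illustration of this idea, the forward pass of a tiny feed-forward network is sketched below; the architecture, patch size and (untrained, random) weights are purely illustrative, and a deployed classifier would be trained on labeled traffic-light images:

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyLightClassifier:
    """Minimal feed-forward network that labels a small RGB patch."""

    LABELS = ["red", "yellow", "green", "none"]

    def __init__(self, patch_pixels=16 * 16 * 3, hidden=32):
        self.w1 = rng.normal(0.0, 0.1, (patch_pixels, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.1, (hidden, len(self.LABELS)))
        self.b2 = np.zeros(len(self.LABELS))

    def predict(self, patch):
        x = patch.reshape(-1) / 255.0                # flatten and scale the 16x16x3 patch
        h = np.maximum(0.0, x @ self.w1 + self.b1)   # ReLU hidden layer
        logits = h @ self.w2 + self.b2
        return self.LABELS[int(np.argmax(logits))]   # depends only on weights and topology

clf = TinyLightClassifier()
print(clf.predict(np.zeros((16, 16, 3), dtype=np.uint8)))  # some label; the weights are untrained
```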
- the user device 160 may include a prompt generating module 416 that generates prompts based on the deviation from the user device to the traffic light.
- a speech synthesis module 418 may convert the prompts into speech utterance and output the prompts to the user through the speaker 168 .
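- For example, an off-the-shelf text-to-speech library could play the role of the speech synthesis module 418 ; the choice of pyttsx3 below is an assumption rather than something specified here:

```python
import pyttsx3  # generic offline text-to-speech engine, used only as an example

def speak_status(status_text):
    """Speak a traffic light status such as "green, pass" through the device speaker."""
    engine = pyttsx3.init()
    engine.say(status_text)
    engine.runAndWait()

speak_status("green, pass")
```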
- the device may also provide the user with options to choose the type of output, e.g., audio or tactile output.
- systems and methods in accordance with aspects of the invention may include different traffic light patterns, visual or audio characteristics, environmental features, data values, data types and configurations, and different image and sound processing techniques.
- the data values and other information may be provided and received at different times (e.g., via different servers or databases) and by different entities (e.g., some values may be pre-suggested or provided from different sources).
- every feature in a given embodiment, alternative or example may be used in any other embodiment, alternative or example herein.
- any technology for determining the location of a traffic light or a mobile device may be employed in any configuration herein.
- Each way of communicating the location of a traffic light or the status of the light may be used in any configuration herein.
- Any mobile user device may be used with any of the configurations herein.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Educational Administration (AREA)
- Business, Economics & Management (AREA)
- General Health & Medical Sciences (AREA)
- Educational Technology (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Environmental & Geological Engineering (AREA)
- Traffic Control Systems (AREA)
Description
- This application claims the benefit of the filing date of U.S. Provisional Patent Application No. 61/297,455 filed Jan. 22, 2010, the disclosure of which is hereby incorporated herein by reference.
- 1. Field of the Invention
- The present invention relates generally to traffic light detection systems and methods. More particularly, the present invention relates to identifying the state of traffic lights using a mobile device.
- 2. Description of Related Art
- Blind travelers normally use the sound of traffic to judge whether a traffic light is green when deciding to cross a street. Building devices that can read traffic lights for one who cannot see is a technical challenge. Electronic aids that can reliably announce the state of a traffic light have typically required special-purpose hardware to be installed on the traffic lights themselves, and such devices have therefore often been prohibitively expensive.
- Technologies that do not require the installation of special hardware on traffic lights have been used in custom-built vehicles, e.g., the robot cars seen in the DARPA challenge that sense their environment when negotiating city streets. However, such technologies typically rely on sensors, and such sensor-based solutions are likely to remain intractable on mainstream mobile devices. Consequently, such solutions remain out of reach of the average consumer because of their high cost.
- In one embodiment, a method of identifying a traffic light status is provided. The traffic light status comprises at least one of a color illumination status and a sign status of a traffic light. The method comprises receiving, at a user device, geolocation data associated with the traffic light and the user device. The user device geolocation data includes a location of the user device, and the traffic light geolocation data includes a location of the traffic light. The method also includes detecting an elevation and a direction of the user device; and determining the status of the traffic light with the user device based on the geolocation data associated with the traffic light and the user device, the elevation and the direction associated with the user device. The method also includes communicating the status to a user of the user device.
- In accordance with another embodiment, a device for identifying a traffic light status is provided. The traffic light status comprises at least one of a color illumination status and a sign status of a traffic light. The device comprises a memory for storing information, including geolocation data associated with a traffic light and the device, direction and elevation information of the device. The device geolocation data includes a location of the device, and the traffic light geolocation data includes a location of the traffic light. The processor is coupled to the memory. The processor is configured to receive geolocation data associated with the traffic light and the device, and to detect an elevation and a direction associated with the device. The processor is also configured to determine the status of the traffic light with the device based on the received geolocation data, the detected elevation and the direction of the device. The processor is further configured to communicate the status to a user of the device.
- In accordance with a further embodiment, a method for identifying a traffic light status with a server computer is provided. The traffic light status comprises at least one of a color illumination status and a sign status of a traffic light. The method comprises receiving, from a user device, geolocation data associated with the traffic light and the user device, elevation and direction data associated with the user device. The user device geolocation data includes a location of the user device, and the traffic light geolocation data includes a location of the traffic light. The method also comprises determining the status of the traffic light based on the received geolocation data, the elevation and direction data, and transmitting the status to the user device. Determining the status of the traffic light comprises generating one or more instructions to orient the user device to face the traffic light and transmitting the instructions to the user device. The method further comprises instructing the user device to communicate the status to the user.
- In accordance with a further embodiment, a server apparatus is employed to identify a traffic light status. The traffic light status comprises at least one of a color illumination status and a sign status of a traffic light. The apparatus comprises a memory for storing information, including geolocation data associated with a traffic light and a user device, direction and elevation information of the user device. The user device geolocation data includes a location of the device, and the traffic light geolocation data includes a location of the traffic light. The processor is coupled to the memory. The processor is configured to receive geolocation data associated with the traffic light and the user device, and to detect an elevation and a direction associated with the user device. The processor is also configured to determine the status of the traffic light with the device based on the received geolocation data, the detected elevation and the direction of the device. Determining the status of the traffic light comprises generating one or more instructions to orient the user device to face the traffic light and transmitting the instructions to the user device. The processor is further configured to instruct the user device to communicate the status to the user.
- In accordance with a further embodiment, a system is provided. The system comprises memory means for storing information data. The information data includes geolocation data associated with a traffic light and a user device, direction and elevation information of the user device. The user device geolocation data includes a location of the device, and the traffic light geolocation data includes a location of the traffic light. The system also includes means for detecting a location of a traffic light and a location of the user device, means for detecting a direction of the user device, means for detecting an elevation of the user device, and means for detecting a light signal of the traffic light. The system further includes means for capturing an image of the traffic light. The system also includes processor means for determining a status of the traffic light. The traffic light status comprises at least one of a color illumination status and a sign status of a traffic light. The processor means is also for generating one or more instructions to orient the user device to face the traffic light based on the geolocation data associated with the traffic light and the user device, the elevation and the direction of the user device. The system further includes means for outputting the traffic light status to a user of the user device.
- It is to be appreciated that, unless explicitly stated to the contrary, any feature in any embodiment, alternative or example can be used in any other embodiment, alternative or example herein and hereafter.
- In one example, determining the status of the traffic light comprises determining, from the geolocation data of the user device, if a current location of the user device is adjacent to an intersection. In the case where the current location is adjacent to the intersection, the method further comprises prompting the user to orient the user device to face the traffic light.
- In one alternative, determining the status of the traffic light with the user device includes generating one or more instructions based on the geolocation data of the traffic light and the user device, the elevation and the direction of the user device. Orienting the user device to face the traffic light also includes providing the instructions to the user.
- In another alternative, generating the one or more instructions to orient the user device includes generating a map for an area between the user device and the traffic light; and calculating deviations from the user device to the traffic light based on the map.
- In a further alternative, the method comprises capturing one or more images of the traffic light and a surrounding area by an image capture device of the user device. In this case, determining the status of the traffic light is performed based on the captured images.
- In another example, the method includes receiving information related to the traffic light, where the information comprises at least one of a size, a type and a timing sequence of the traffic light. In this situation, determining the status of the traffic light is performed based on the received information related to the traffic light.
- In one alternative, the information related to the traffic light is received from a server computer.
- In another alternative, the information related to the traffic light is generated based on the captured images.
- In yet another example, determining the status of the traffic light with the user device comprises receiving light signals from a plurality of light sources, where one of the light sources comprises the traffic light. Determining the status of the traffic light also includes filtering light signals for frequency ranges emitted by the traffic light.
- In another example, determining the status of the traffic light includes recognizing signs associated with the traffic light.
- In yet another example, the method includes receiving audible information related to the status of the traffic light, and determining the status of the traffic light is further based on the audible information.
- In one alternative, communicating the traffic light status to the user includes generating an audible output.
- In another alternative, communicating the traffic light status to the user includes generating a tactile output.
- In one example, the location of the traffic light and the location of the user device is detected by a geographical position device.
- In another example, the direction associated with the user device is detected by a digital compass.
-
FIG. 1 is a pictorial diagram of a system in accordance with aspects of the invention. -
FIG. 2A is a pictorial diagram of a system in accordance with aspects of the invention. -
FIG. 2B is a functional diagram in accordance with aspects of the invention. -
FIG. 3 is a flowchart in accordance with aspects of the invention. -
FIG. 4 is a functional diagram in accordance with aspects of the invention. -
FIG. 5A is an exemplary diagram in accordance with aspects of the invention. -
FIG. 5B is an exemplary diagram in accordance with aspects of the invention. -
FIG. 6 is an exemplary diagram in accordance with aspects of the invention. - Aspects, features and advantages of the invention will be appreciated when considered with reference to the following description of exemplary embodiments and accompanying figures. The same reference numbers in different drawings may identify the same or similar elements. Furthermore, the following description is not limiting; the scope of the invention is defined by the appended claims and equivalents.
- In accordance with aspects of the invention, a system determines the status of traffic lights through a mobile device and describes the status to the user of the device. The mobile device detects its geographical location, e.g., through a GPS system and determines if the user is at an intersection or is otherwise near a traffic light based on the knowledge from a map database. The mobile device also receives the geographical location and other information related to the traffic light at the intersection from a database storing traffic lights information. The direction and elevation of the device is calculated (e.g., by the mobile device). The device then generates one or more prompts such as in the form of audible or tactile cues that progressively guide the user until the device is precisely pointed at the light. The mobile device detects the image and color of the traffic light. The detected image and color are processed, and the status of the traffic light is determined. Then the mobile device communicates the traffic light status to the user.
- As shown in
FIG. 1 , asystem 100 in accordance with one aspect of the invention includes aserver computer 110, amobile device 160, anetwork 90, atraffic light 102 and asatellite 103. Themobile device 160 is connected throughnetwork 90 to theserver 110. The mobile device may have a built-in GPS receiver to receive geolocation data fromsatellite 103. Thetraffic light 102 may havelights 102 a-102 c, each of which is dedicated to one corresponding street lane. The lights emitted by thetraffic light 103 may be detected by themobile device 160 and be further processed by the device or the server or both. As theconfiguration 200 ofFIG. 2A shows, thenetwork 90 may connect with one or moremobile devices server computers databases mobile device 160 or on theserver 110. - As illustrated in the functional diagram of
FIG. 2B , the server computer contains aprocessor 120,memory 130 and other components typically present in general purpose computers. Thememory 130 stores information accessible byprocessor 120, includinginstructions 132 anddata 134 that may be executed or otherwise used by theprocessor 120. Thememory 130 may be of any type capable of storing information accessible by the processor, including a computer-readable medium, or other medium that stores data that may be read with the aid of an electronic device, such as a hard-drive, memory card, ROM, RAM, DVD or other optical disks, as well as other write-capable and read-only memories. Systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media. - The
instructions 132 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computer code on the computer-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computer language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in more detail below. - The
data 134 may be retrieved, stored or modified by processor 120 in accordance with the instructions 132. For instance, although the system and method is not limited by any particular data structure, the data may be stored in computer registers, in a relational database as a table having a plurality of different fields and records, XML documents or flat files. The data may also be formatted in any computer-readable format. By further way of example only, image data may be stored as bitmaps comprised of grids of pixels that are stored in accordance with formats that are compressed or uncompressed, lossless (e.g., BMP) or lossy (e.g., JPEG), and bitmap or vector-based (e.g., SVG), as well as computer instructions for drawing graphics. The data may comprise any information sufficient to identify the relevant information, such as numbers, descriptive text, proprietary codes, references to data stored in other areas of the same memory or different memories (including other network locations) or information that is used by a function to calculate the relevant data. - The
processor 120 may be any conventional processor, such as processors from Intel Corporation or Advanced Micro Devices. Alternatively, the processor may be a dedicated device such as an ASIC. Although FIG. 2B functionally illustrates the processor and memory as being within the same block, it will be understood by those of ordinary skill in the art that the processor and memory may actually comprise multiple processors and memories that may or may not be stored within the same physical housing. For example, memory may be a hard drive or other storage media located in a server farm of a data center. Accordingly, references to a processor or computer will be understood to include references to a collection of processors or computers or memories that may or may not operate in parallel. - The
server 110 may be at one node of network 90 and capable of directly and indirectly communicating with other nodes of the network. For example, server 110 may comprise a web server that is capable of communicating with user devices via network 90, such that server 110 uses network 90 to transmit and display information to a user, such as the person of FIG. 1B, on the display of client device 160. Server 110 may also comprise a plurality of computers that exchange information with different nodes of a network for the purpose of receiving, processing and transmitting data to the client devices. In this instance, the user devices will typically be at different nodes of the network than any of the computers comprising server 110. -
Network 90, and intervening nodes between server 110 and user devices, may comprise various configurations and use various protocols including the Internet, World Wide Web, intranets, virtual private networks, local Ethernet networks, private networks using communication protocols proprietary to one or more companies, cellular and wireless networks (e.g., WiFi), instant messaging, HTTP and SMTP, and various combinations of the foregoing. Although only a few computers are depicted in FIGS. 1-2, it should be appreciated that a typical system can include a large number of connected computers. - The
user devices, such as user device 170, may be a wireless-enabled PDA or a cellular phone capable of obtaining information via the Internet. The user may input information using a small keyboard (in the case of a Blackberry phone), a keypad (in the case of a typical cell phone) or a touch screen (in the case of a PDA). - Each user device may be configured with a
processor 120, memory 130 and instructions 132. Each client device may also have output components, such as a speaker 168 and a tactile output 166. -
Memory 130 in each user device may store data 134 such as computer code that, in response to the detected light, orientation and position of the device, generates a set of prompts that continuously guide the user to orient the device toward the traffic light. Data 134 may also include an image processing library 142 that consists of image recognition routines and appropriately tuned image filters to detect traffic lights. A history of intersections and traffic light data may be recorded in memory 130. - The user devices may also include one or more geographic position components to determine the geographic location and orientation of the device. For example,
client device 160 may include a GPS receiver 174 to determine the device's latitude, longitude and/or altitude position. The geographic position components may also comprise software for determining the position of the device based on other signals received at the client device 160, such as signals received at the antenna from one or more cellular towers or WiFi access points. The device may also include an accelerometer, gyroscope or other acceleration device 172 to determine the direction in which the device is oriented. By way of example only, the acceleration device may determine the device's pitch, yaw or roll (or changes thereto) relative to the direction of gravity or a plane perpendicular thereto. In that regard, it will be understood that a client device's location and orientation data, as set forth herein, may be provided automatically to the user, to the server, or both.
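As an illustration of how orientation can be derived from such an acceleration device, the standard formulas below estimate pitch and roll from a static accelerometer reading relative to gravity. The function and its sign conventions are generic examples, not part of the patent.

```python
import math

def pitch_roll_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll (degrees) from accelerometer axes (m/s^2)
    while the device is roughly static, using gravity as the reference."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A device lying flat on its back reads roughly (0, 0, 9.81) and yields ~0, ~0.
print(pitch_roll_from_accelerometer(0.0, 0.0, 9.81))
```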
- Besides the GPS receiver 174 and the accelerometer 172, each user device may also include other components that help to detect the position, orientation and elevation of the device. Such components include, but are not limited to, a digital compass 176, an inclinometer 178 and an altimeter 186. - Each user device may include image and/or color capture components such as a
camera 184, one or more image sensors 180 and one or more image filters 182. - Although certain advantages are obtained when information is transmitted or received as noted above, aspects of the invention are not limited to any particular manner of transmission of information. For example, in some aspects, information may be sent via a medium such as an optical disk or portable drive. In other aspects, the information may be transmitted in a non-electronic format and manually entered into the system. Yet further, although some functions are indicated as taking place on a server and others on a client, various aspects of the system and method may be implemented by a single computer having a single processor.
-
Server 110 may store map-related information 140, at least a portion of which may be transmitted to a client device. For example and as shown in FIG. 2A, the server may store map tiles, where each tile comprises a map image of a particular geographic area. A single tile may cover an entire region such as a state in relatively little detail and another tile may cover just a few streets in high detail. In that regard, a single geographic point may be associated with multiple tiles, and a tile may be selected for transmission based on the desired level of zoom. The map information is not limited to any particular format. For example, the images may comprise street maps, satellite images, or a combination of these, and may be stored as vectors (particularly with respect to street maps), bitmaps (particularly with respect to satellite images), or flat files. - The various map tiles are each associated with geographical locations, such that the
server 110 and/or client device are capable of selecting, retrieving, transmitting, or displaying one or more tiles in response to receiving one or more geographical locations. - The system and method may process locations expressed in different ways, such as latitude/longitude positions, street addresses, street intersections, an x-y coordinate with respect to the edges of a map (such as a pixel position when a user clicks on a map), names of buildings and landmarks, and other information in other reference systems that is capable of identifying a geographic location (e.g., lot and block numbers on survey maps). Moreover, a location may define a range of the foregoing. Locations may be further translated from one reference system to another. For example, the
user device 160 may access a geocoder to convert a location identified in accordance with one reference system (e.g., a street address such as "1600 Amphitheatre Parkway, Mountain View, Calif.") into a location identified in accordance with another reference system (e.g., a latitude/longitude coordinate such as (37.423021°, −122.083939°)). In that regard, it will be understood that locations exchanged or processed in one reference system, such as street addresses, may also be received or processed in other reference systems as well.
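One concrete example of translating a location between reference systems is converting a latitude/longitude pair into the x-y index of a map tile at a given zoom level. The Web-Mercator "slippy map" formula below is shown only as an illustration; the patent does not prescribe any particular tiling or projection scheme.

```python
import math

def latlng_to_tile(lat_deg, lng_deg, zoom):
    """Convert latitude/longitude to Web-Mercator tile indices (x, y)."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lng_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# The tile containing (37.423021, -122.083939) at zoom level 15.
print(latlng_to_tile(37.423021, -122.083939, 15))
```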
- FIG. 2B illustrates that information related to a traffic light may be stored in the server 110. Such information includes but is not limited to geographical location data, type and scale (e.g., size, shape and elevation) of the traffic light, pictures and other data related to the visual or positional features of the traffic light. User-specific or user-device-specific data, such as a history of the intersections that the user has crossed, may also be stored in the server. - In one embodiment, the databases storing different types of data may reside on the same server, as shown in the configuration 240 of
FIG. 2B. For example, locations of traffic lights may be integrated with the map data in server 110. In another embodiment, the databases may reside on different servers distributed through the network, as illustrated in FIG. 2A. Data related to user information may be stored in database 136. Traffic light related information may reside in database 138. Location/map data may be stored in database 140. Database 142 may contain calculation routines and modules for image processing. -
FIG. 3 depicts a flowchart 300 of one embodiment of the invention. It will be understood that the operations do not have to be performed in the precise order described below. Rather, various steps can be handled in a different order or simultaneously. - In one example, as shown in
block 310, a user holding a mobile device, such as device 160, may walk along a busy urban area. The mobile device desirably continuously receives GPS data and detects its current GPS coordinates. The device may also detect positioning data such as elevation and orientation of the device. - Based on the detected current location and the knowledge of map information from a map database, the mobile device is able to correctly determine if the user has come to an intersection in
block 312. The map information may be retrieved from the map database by server 110, which sends the data to the mobile device. If the user's current location is at an intersection, the mobile device may prompt the user to stop and to invoke the function of querying the state of the intersection in block 314. If the current location is not an intersection, the user device desirably gives no indication or prompt, and continues to detect the present location as the user continues walking.
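A simple way to realize the intersection check of blocks 310-312 is to compare the current GPS fix against the intersection nodes known from the map database and trigger only within some radius. The sketch below uses the haversine distance; the 15-meter threshold and the (lat, lng, id) tuple format are illustrative assumptions, not values from the patent.

```python
import math

def haversine_m(lat1, lng1, lat2, lng2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lng2 - lng1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_intersection(lat, lng, intersections, threshold_m=15.0):
    """Return (distance, id) of the closest intersection within threshold_m, or None."""
    best = None
    for node_lat, node_lng, node_id in intersections:
        d = haversine_m(lat, lng, node_lat, node_lng)
        if d <= threshold_m and (best is None or d < best[0]):
            best = (d, node_id)
    return best
```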
- In block 316, the user invokes the device to query the state of the intersection. The device makes a request to retrieve traffic light information from a server connected to the device. The device then receives traffic light related information such as the location, geometrical data (e.g., size, elevation, shape, etc.) and other data in block 318. - In an alternative, the mobile device may have prior knowledge of traffic light information, which comprises the geolocation of traffic lights. The knowledge may be obtained from a server, such as
server 110, or from a database, such as database 138, on a real-time basis, or may be downloaded to the device. Therefore, the mobile device may automatically determine that the user is at an intersection and that there is a traffic light at the intersection based on the present location, map data and the traffic light information. In this scenario, the device may notify the user of the existence of a traffic light. - If the device decides that there is no traffic light at this intersection in
block 320, it may communicate the status of the intersection to the user in block 334. - If the device decides that there is a traffic light at this intersection, it may proceed to detect the traffic light in
block 322. The device may detect light of varying wavelengths from the surrounding area and filter the received wavelengths according to the wavelengths emitted from the traffic light. The device may also capture an image of the traffic light and the adjacent area and analyze the image to find the targeted traffic light. - If the device cannot detect the traffic light in
block 324, the device may determine that the user is not pointing the device at the traffic light. It then, in block 326, calculates a vector of positional deviations from the device to the traffic light based on the detected orientation, elevation and position of the user device, as well as on the scale and location data of the traffic light. The device may also incorporate the detected information of the surrounding area in the calculation. - The vector of deviations may comprise a horizontal deviation and a vertical deviation. For example, in
scenario 500 in FIG. 5A, the user standing at intersection 501 and holding device 160 may initially point the device in the direction of building 510, so that the device deviates from the desired traffic light 102c by an angle α to the east. In this scenario, the device may prompt the user to move the device to the left by angle α. The device may also be configured to continuously prompt the user to move the device by a fraction of angle α and to stop moving once the desired position is reached. If the user's movement is larger than desired and points the device in the direction of light 102a instead of light 102c, the device may prompt the user to move back to the east by an angle β. In another scenario 505 of FIG. 5B, the user device may be held too high by the user, and thus point to the building 515 above the traffic light 102 by an angle θ. The device may prompt the user to move the device downward by angle θ. The device may also be held too low and point to the lower portion of the traffic light pole, in which case the device may ask the user to move the device upward by angle μ. - The information related to the traffic light, such as its scale and elevation, may be obtained by the user device from a server database storing such data. If the user device cannot obtain this information from such a database, it may acquire the information by capturing and processing an image of the traffic light.
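The horizontal and vertical deviations α and θ discussed above can be approximated from the device pose and the stored traffic light data. The following sketch is one plausible way to realize the calculation of block 326 using a compass bearing and an elevation angle; it is not the patent's own formula, and the sign conventions are assumptions.

```python
import math

def bearing_deg(lat1, lng1, lat2, lng2):
    """Initial compass bearing from point 1 to point 2, in degrees [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lng2 - lng1)
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def pointing_deviation(user_lat, user_lng, user_height_m,
                       light_lat, light_lng, light_height_m,
                       heading_deg, pitch_deg, distance_m):
    """Signed horizontal and vertical deviation (degrees) of the device axis
    from the traffic light, given compass heading and inclinometer pitch."""
    target_bearing = bearing_deg(user_lat, user_lng, light_lat, light_lng)
    horizontal = (target_bearing - heading_deg + 180.0) % 360.0 - 180.0   # range -180..180
    target_pitch = math.degrees(math.atan2(light_height_m - user_height_m, distance_m))
    vertical = target_pitch - pitch_deg
    return horizontal, vertical
```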
- Returning to
FIG. 3, based on the detected deviation from the traffic light, the user device may generate one or more simple instructions prompting the user to move the device to minimize the deviation in block 328. For example, the instructions may be output to the user in the form of a spoken message, such as "point up and to the left." The device continues the loop of detecting the traffic light, calculating the deviation and prompting the user to adjust the direction of the device until the device is precisely pointed at the light.
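Turning a deviation into the simple instruction of block 328 can be as small as the helper below. The sign conventions (positive means right/up) and the 3° tolerance are illustrative assumptions.

```python
def prompt_from_deviation(horizontal_deg, vertical_deg, tolerance_deg=3.0):
    """Map a signed (horizontal, vertical) deviation onto a short spoken instruction."""
    parts = []
    if vertical_deg > tolerance_deg:
        parts.append("point up")
    elif vertical_deg < -tolerance_deg:
        parts.append("point down")
    if horizontal_deg > tolerance_deg:
        parts.append("to the right")
    elif horizontal_deg < -tolerance_deg:
        parts.append("to the left")
    return " and ".join(parts) if parts else "hold steady"

print(prompt_from_deviation(-12.0, 8.0))   # -> "point up and to the left"
```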
- When the traffic light is detected, the device may filter and/or process the color images captured by an image capture component, such as a camera, in block 330. The device then determines the status of the traffic light accordingly. In block 332, the device informs the user of the traffic light status, for example, through a speaker in spoken language, e.g., "green, pass." - The device may also communicate the color of the traffic light and/or the status of the intersection to the user through non-audible methods, such as haptic output. The device may be configured to provide the user with options to choose the type of communication.
- Aspects of the invention will now be described in greater detail with regard to
FIG. 4, which illustrates a system diagram of the embodiments of the present invention. Here, user device 160 may contain a position and direction detection module 408 that receives geolocation data 402. The position and direction detection module 408 comprises components such as a GPS receiver 174, a digital compass 176, an altimeter 186 and an inclinometer 178. - The approximate location of the client's device may be found with a number of different technologies. For example,
server 110 may receive geolocation information from the GPS receiver embedded in the user device. Thus the device may have access to latitude and longitude information. This information may be received by server 110 during connection with the user device in conformance with communication protocols. For example, the device may use a system such as Google Chrome or the browser of the Android operating system, each of which may be configured with user permission to send GPS information to trusted network sites. Server 110 may use this information to determine a location of the device. Because the accuracy of GPS determinations may depend on the quality of the device and external factors such as environment, the device may further transmit data indicative of accuracy. For example, the user device 160 may inform the server 110 that the transmitted latitude/longitude position is accurate within 50 meters, i.e., the device may be at any location within 50 meters of the transmitted position. The server may also assume a level of accuracy in the absence of such information. - In another example,
server 110 may extrapolate geographic locations from one or more various types of information received from the user devices. For example, server 110 may receive location data from the user device indicating a potential location of the device. Location data may include an exact point, a set of points, a distance from a point, a range of points, or arbitrary boundaries, for example streets, cities, zip codes, counties, states or the like. Server 110 may also determine a range of accuracy of the location data and use this information to determine a position or area in which the user device may be located. - Another location technology employs triangulation among multiple cell towers to estimate the device's position. A further location technology is IP geocoding. In this technique, a client device's IP address on a data network may be mapped to a physical location. As noted before, locations may be provided in any number of forms including street addresses, points of interest, or GPS coordinates of the user device.
- The detected geolocation data, elevation, tilt and orientation may be transmitted to a position and
direction calculation module 412 for further processing and calculations. The calculation module may be a set of computer code that resides on the memory of the user device. - Various calculation techniques may be used to estimate the deviation of the user device from the traffic light. For example, a vector map may be built for the area approximately between the user's position and the traffic light's position. In another example, a raster grid overlay may be created in which each cell of the grid is assigned elevation data. Elevation data such as the elevations of the user, of the traffic light pole and of other clutter within the area may be included in the map. Such maps may help the device to find a path of view from the device to the traffic light, so that the device may calculate the direction and magnitude by which the user device should be moved, and further provide the instructions prompting the user to move the device to the desired position.
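The raster-overlay idea can be illustrated with a simple line-of-sight test: sample the sight line between the observer and the light and check whether any grid cell's elevation blocks it. This is a simplified sketch of the concept, not the patent's algorithm; the grid layout and heights are arbitrary example values.

```python
import numpy as np

def line_of_sight(grid, start, target, eye_m, light_m, samples=100):
    """True if the sight line from `start` (observer) to `target` (light pole)
    clears every sampled cell of the elevation grid. `start`/`target` are
    (row, col) indices; eye_m and light_m are heights above those cells."""
    (r0, c0), (r1, c1) = start, target
    h0 = grid[r0, c0] + eye_m
    h1 = grid[r1, c1] + light_m
    for i in range(1, samples):
        t = i / samples
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if grid[r, c] > h0 + t * (h1 - h0):   # clutter rises above the sight line
            return False
    return True

grid = np.zeros((50, 50))
grid[25, 10:40] = 12.0                        # a 12 m obstruction across the path
print(line_of_sight(grid, (25, 0), (25, 49), 1.7, 6.0))   # -> False
```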
-
User device 160 may contain an image capture module 410 that comprises a camera 184, one or more sensors 180 and/or filters 182. This module may detect traffic light data 404. The sensor 180 may comprise a single- or multi-element photo detector or monochrome/color detectors. The user device may also include a group of hardware bandpass filters 182 that only allow light at the desired wavelengths to pass through. For example, the bandpass filters may only allow the emission frequencies of the traffic light LEDs to pass through. - The user device may further comprise an
image processing module 414. The image processing module 414 may include one or more image processing programs that perform the functions of image recognition and processing. For example, the camera 184 may capture a photo image of the traffic light and the surrounding area. Various color and image processing models may be applied to process the captured image. - In one example, color filtering may be performed by an appropriately tuned filter targeted for the red, yellow and/or green light(s). Regions of red, yellow and green light indicating an active traffic light may thus be identified in the image.
- In another example, color screening may also be performed by converting the captured image from the traditional RGB (Red, Green, Blue) tri-color model into another representative model, such as the HSI (Hue, Saturation, and Intensity) color space, which is more closely related to human perception and less sensitive to illumination variation. By applying appropriate formulas or statistical models, the HSI space may be used to screen out the pixels not satisfying the threshold ranges. A binary foreground map may thus be obtained by setting those pixels falling within the desired range as foreground points with a positive value and those pixels being screened out as background points with a value of zero.
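A minimal version of this screening step, written with OpenCV's HSV color space as a stand-in for the HSI space mentioned above, is sketched below. The hue/saturation/value threshold ranges are rough illustrative numbers and would need tuning for real traffic lights and lighting conditions.

```python
import cv2

def color_foreground_map(bgr_image):
    """Binary foreground map of pixels that look like red, yellow or green lights."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # OpenCV hue runs 0-179; red wraps around the hue axis, hence two ranges.
    red1 = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
    red2 = cv2.inRange(hsv, (170, 120, 120), (179, 255, 255))
    yellow = cv2.inRange(hsv, (20, 120, 120), (35, 255, 255))
    green = cv2.inRange(hsv, (45, 80, 120), (95, 255, 255))
    return cv2.bitwise_or(cv2.bitwise_or(red1, red2),
                          cv2.bitwise_or(yellow, green))
```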
- Various screening techniques may be applied to reduce the impact of environmental variation, for example, by setting different illumination conditions for daytime and nighttime. Because traffic lights are active light sources emitting light in a particular direction, images may be purposefully toned down or otherwise made darker to enhance the contrast between a traffic light and other light sources.
- Morphology techniques, such as erosion and dilation, may be used to mask the foreground map and thus remove noise and broken regions to find the shape of the traffic light. Edge detection and shape characterization may also be performed to further process the image to detect the traffic lights.
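Continuing the illustration, the foreground map can be opened (eroded then dilated) and its contours checked for roughly circular shapes of plausible size. The kernel size, area limit and circularity bound below are assumed example values, not parameters given by the patent; the contour API shown is that of OpenCV 4.x.

```python
import cv2
import numpy as np

def find_light_blobs(foreground_map, min_area=30):
    """Return bounding boxes of roughly circular blobs in a binary foreground map."""
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    cleaned = cv2.morphologyEx(foreground_map, cv2.MORPH_OPEN, kernel)  # erosion then dilation
    contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < min_area:
            continue
        x, y, w, h = cv2.boundingRect(contour)
        circularity = 4.0 * np.pi * area / (cv2.arcLength(contour, True) ** 2 + 1e-6)
        if circularity > 0.6 and 0.5 < w / float(h) < 2.0:   # roughly round blob
            boxes.append((x, y, w, h))
    return boxes
```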
- In one scenario, there may be lights for different lanes on one traffic light pole at an intersection. For example, as shown in
FIG. 5A, traffic light 102 at intersection 501 includes three lights 102a-102c. The light dedicated to lane 521 may have alternating signs of a left-turn arrow and a straight arrow, while the other lights may bear no signs. Such references may be stored in a database, such as database 138, or in a server, such as server 110, containing traffic light information. In this situation, the device may make use of the reference to the traffic light to decide which light the device should detect, and prompt the user accordingly to point the device at the light 102c instead of the light 102a. If no such reference is available, the device may take an image of the traffic light and analyze the image to find the appropriate traffic light. - In another
scenario 600, as shown in FIG. 6, various types of pedestrian traffic lights may be present at an intersection. Some pedestrian lights may only perform pattern changing but not color changing. For example, the background color of pedestrian light 610 or 615 may stay the same when the sign changes from "DON'T WALK" to "WALK", or from a standing person to a walking person. In these situations, the image processing module 414 may include pattern and image recognition routines to discern the signs on the traffic light. For example, classified models of traffic lights may be used by the image processing module 414 for template matching. Existing classification and knowledge of the traffic lights and signs may be obtained from a database on the network or may be built into the image processing programs. - Many parameters may be taken into account to correctly identify the traffic light status, e.g., the geometrical status (vertical and horizontal orientation, size and position) of the traffic light, signs on the light, the timing sequence of the traffic light, and visual environmental cues. The image processing routines should also be robust to visual clutter from other sources, e.g., neon light from a nearby building. Features other than visual characteristics, such as sound data, may also be used to determine the light status. For instance, audible speech may state "walk" or "don't walk"; or chirping tones from a
sound device 620 in FIG. 6 (such as a transducer or speaker) may be incorporated into the configuration. - Alternatively, neural networks may be employed for recognition and classification of the traffic lights and the related signs. In this way, large computations for template matching may be avoided. For example, the input image may not need to be transformed into another representative space such as Hough space or the Fourier domain. The recognition result may depend only on the correlation between the network weights and the network topology itself.
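For the pattern-matching approach to pedestrian signs described above, normalized cross-correlation against a small template library is one common realization. The template dictionary, grayscale input and 0.7 acceptance threshold are assumptions made for this sketch.

```python
import cv2

def match_pedestrian_sign(gray_image, templates, threshold=0.7):
    """Return the label of the best-matching sign template (e.g. 'walk'), or None."""
    best_label, best_score = None, threshold
    for label, template in templates.items():
        result = cv2.matchTemplate(gray_image, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

For the neural-network alternative, a small convolutional classifier over cropped light images is one possible shape of such a recognizer. The architecture, 32x32 input size and five output classes below are arbitrary illustrative choices, not specified by the patent.

```python
import torch
import torch.nn as nn

class LightStateNet(nn.Module):
    """Tiny CNN that classifies a cropped signal image (e.g. red/yellow/green/walk/don't-walk)."""
    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

logits = LightStateNet()(torch.randn(1, 3, 32, 32))   # one 32x32 RGB crop
print(logits.argmax(dim=1))
```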
- The
user device 160 may include a prompt generating module 416 that generates prompts based on the deviation of the user device from the traffic light. A speech synthesis module 418 may convert the prompts into speech utterances and output them to the user through the speaker 168. The device may also provide the user with options to choose the type of output, e.g., audio or tactile output. - It will be further understood that the sample values, types and configurations of data described and shown in the figures are for purposes of illustration only. In that regard, systems and methods in accordance with aspects of the invention may include different traffic light patterns, visual or audio characteristics, environmental features, data values, data types and configurations, and different image and sound processing techniques. The data may be provided and received at different times (e.g., via different servers or databases) and by different entities (e.g., some values may be pre-suggested or provided from different sources).
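As an illustration of the speech output path (prompt generating module 416 feeding speech synthesis module 418), an off-the-shelf text-to-speech engine can voice the generated strings. The use of the pyttsx3 package here is purely an example; the patent does not tie the speech synthesis module to any particular library.

```python
import pyttsx3  # example TTS backend; any speech synthesis engine could be substituted

def speak(text):
    """Voice a generated prompt or status message, e.g. "green, pass"."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

speak("point up and to the left")
```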
- As these and other variations and combinations of the features discussed above can be utilized without departing from the invention as defined by the claims, the foregoing description of exemplary embodiments should be taken by way of illustration rather than by way of limitation of the invention as defined by the claims. It will also be understood that the provision of examples of the invention (as well as clauses phrased as “such as,” “e.g.”, “including” and the like) should not be interpreted as limiting the invention to the specific examples; rather, the examples are intended to illustrate only some of many possible aspects.
- Unless expressly stated to the contrary, every feature in a given embodiment, alternative or example may be used in any other embodiment, alternative or example herein. For instance, any technology for determining the location of a traffic light or a mobile device may be employed in any configuration herein. Each way of communicating the location of a traffic light or the status of the light may be used in any configuration herein. Any mobile user device may be used with any of the configurations herein.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/011,036 US9070305B1 (en) | 2010-01-22 | 2011-01-21 | Traffic light detecting system and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US29745510P | 2010-01-22 | 2010-01-22 | |
US13/011,036 US9070305B1 (en) | 2010-01-22 | 2011-01-21 | Traffic light detecting system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150179088A1 true US20150179088A1 (en) | 2015-06-25 |
US9070305B1 US9070305B1 (en) | 2015-06-30 |
Family
ID=53400652
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/011,036 Active 2033-09-20 US9070305B1 (en) | 2010-01-22 | 2011-01-21 | Traffic light detecting system and method |
Country Status (1)
Country | Link |
---|---|
US (1) | US9070305B1 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150227788A1 (en) * | 2015-04-21 | 2015-08-13 | David Douglas | Simplified real time location-dependent color-coded display ("chloropleth") system and method |
US20150237481A1 (en) * | 2012-07-31 | 2015-08-20 | Ariel-University Research And Development Company Ltd. | Navigation method and device |
US9551591B2 (en) * | 2015-03-03 | 2017-01-24 | Verizon Patent And Licensing Inc. | Driving assistance based on road infrastructure information |
EP3174328A1 (en) * | 2015-11-24 | 2017-05-31 | Advanced Digital Broadcast S.A. | System and method for radio signal coverage mapping |
WO2017153143A1 (en) * | 2016-03-11 | 2017-09-14 | Osram Gmbh | Representation of orientation information by a lighting system |
US20180307925A1 (en) * | 2017-04-20 | 2018-10-25 | GM Global Technology Operations LLC | Systems and methods for traffic signal light detection |
WO2018199941A1 (en) * | 2017-04-26 | 2018-11-01 | The Charles Stark Draper Laboratory, Inc. | Enhancing autonomous vehicle perception with off-vehicle collected data |
CN109116846A (en) * | 2018-08-29 | 2019-01-01 | 五邑大学 | A kind of automatic Pilot method, apparatus, computer equipment and storage medium |
CN109388148A (en) * | 2017-08-10 | 2019-02-26 | 松下电器(美国)知识产权公司 | Moving body, control method and control program |
US10262532B2 (en) * | 2017-06-01 | 2019-04-16 | Hyundai Motor Company | System and method for providing forward traffic light information during stop |
US20190180532A1 (en) * | 2017-12-08 | 2019-06-13 | Ford Global Technologies, Llc | Systems And Methods For Calculating Reaction Time |
CN110021176A (en) * | 2018-12-21 | 2019-07-16 | 文远知行有限公司 | Traffic lights decision-making technique, device, computer equipment and storage medium |
CN110335273A (en) * | 2019-07-15 | 2019-10-15 | 北京海益同展信息科技有限公司 | Detection method, detection device, electronic equipment and medium |
US20200012870A1 (en) * | 2017-02-10 | 2020-01-09 | Continental Automotive France | Method for detecting false positives relating to a traffic light |
US10641613B1 (en) * | 2014-03-14 | 2020-05-05 | Google Llc | Navigation using sensor fusion |
CN111488821A (en) * | 2020-04-08 | 2020-08-04 | 北京百度网讯科技有限公司 | Method and device for identifying traffic signal lamp countdown information |
CN111492366A (en) * | 2017-12-21 | 2020-08-04 | 华为技术有限公司 | Information detection method and mobile device |
CN112507956A (en) * | 2020-12-21 | 2021-03-16 | 北京百度网讯科技有限公司 | Signal lamp identification method and device, electronic equipment, road side equipment and cloud control platform |
CN112581534A (en) * | 2020-12-24 | 2021-03-30 | 济南博观智能科技有限公司 | Signal lamp repositioning method and device, electronic equipment and storage medium |
US11138444B2 (en) * | 2017-06-08 | 2021-10-05 | Zhejiang Dahua Technology Co, , Ltd. | Methods and devices for processing images of a traffic light |
CN114565897A (en) * | 2022-01-19 | 2022-05-31 | 北京深睿博联科技有限责任公司 | Traffic light intersection blind guiding method and device |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9305224B1 (en) * | 2014-09-29 | 2016-04-05 | Yuan Ze University | Method for instant recognition of traffic lights countdown image |
US11092446B2 (en) | 2016-06-14 | 2021-08-17 | Motional Ad Llc | Route planning for an autonomous vehicle |
US10309792B2 (en) | 2016-06-14 | 2019-06-04 | nuTonomy Inc. | Route planning for an autonomous vehicle |
US10126136B2 (en) | 2016-06-14 | 2018-11-13 | nuTonomy Inc. | Route planning for an autonomous vehicle |
US10829116B2 (en) | 2016-07-01 | 2020-11-10 | nuTonomy Inc. | Affecting functions of a vehicle based on function-related information about its environment |
US10681513B2 (en) | 2016-10-20 | 2020-06-09 | nuTonomy Inc. | Identifying a stopping place for an autonomous vehicle |
US10331129B2 (en) | 2016-10-20 | 2019-06-25 | nuTonomy Inc. | Identifying a stopping place for an autonomous vehicle |
US10857994B2 (en) | 2016-10-20 | 2020-12-08 | Motional Ad Llc | Identifying a stopping place for an autonomous vehicle |
US10473470B2 (en) | 2016-10-20 | 2019-11-12 | nuTonomy Inc. | Identifying a stopping place for an autonomous vehicle |
CN108229250B (en) * | 2016-12-14 | 2020-07-10 | 杭州海康威视数字技术股份有限公司 | Traffic signal lamp repositioning method and device |
EP3612424A4 (en) * | 2017-04-18 | 2020-09-16 | Nutonomy Inc. | Automatically perceiving travel signals |
US10650256B2 (en) | 2017-04-18 | 2020-05-12 | nuTonomy Inc. | Automatically perceiving travel signals |
US10643084B2 (en) * | 2017-04-18 | 2020-05-05 | nuTonomy Inc. | Automatically perceiving travel signals |
CN107644538B (en) * | 2017-11-01 | 2020-10-23 | 广州汽车集团股份有限公司 | Traffic signal lamp identification method and device |
KR102451896B1 (en) | 2017-12-18 | 2022-10-06 | 현대자동차 주식회사 | Method for controlling driving of hybrid vehicle using dynamic traffic information |
KR102505717B1 (en) | 2019-01-29 | 2023-03-02 | 모셔널 에이디 엘엘씨 | traffic light estimation |
CN112307840A (en) * | 2019-07-31 | 2021-02-02 | 浙江商汤科技开发有限公司 | Indicator light detection method, device, equipment and computer readable storage medium |
TWI822025B (en) * | 2022-05-03 | 2023-11-11 | 凃彥羽 | Traffic light control system |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5790974A (en) * | 1996-04-29 | 1998-08-04 | Sun Microsystems, Inc. | Portable calendaring device having perceptual agent managing calendar entries |
US6018697A (en) * | 1995-12-26 | 2000-01-25 | Aisin Aw Co., Ltd. | Navigation system for vehicles |
US20020055817A1 (en) * | 2000-08-18 | 2002-05-09 | Yue-Hong Chou | Real-time smart mobile device for location information processing |
US20040218910A1 (en) * | 2003-04-30 | 2004-11-04 | Chang Nelson L. | Enabling a three-dimensional simulation of a trip through a region |
US20080139245A1 (en) * | 2006-12-07 | 2008-06-12 | Samsung Electronics Co., Ltd. | Mobile terminal and schedule management method using the same |
US20080297608A1 (en) * | 2007-05-30 | 2008-12-04 | Border John N | Method for cooperative capture of images |
US20100211307A1 (en) * | 2006-01-18 | 2010-08-19 | Pieter Geelen | Method of Storing the Position of a Parked Vehicle and Navigation Device Arranged for That |
US20100253541A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Traffic infrastructure indicator on head-up display |
US20110282542A9 (en) * | 2009-04-03 | 2011-11-17 | Certusview Technologies, Llc | Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations |
US20120095646A1 (en) * | 2009-09-15 | 2012-04-19 | Ghazarian Ohanes D | Intersection vehicle collision avoidance system |
US8350758B1 (en) * | 2009-10-01 | 2013-01-08 | Lighthouse Signal Systems LLC | Systems and methods for indoor geolocation based on yield of RF signals |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6266442B1 (en) | 1998-10-23 | 2001-07-24 | Facet Technology Corp. | Method and apparatus for identifying objects depicted in a videostream |
US7060981B2 (en) | 2003-09-05 | 2006-06-13 | Facet Technology Corp. | System for automated detection of embedded objects |
US7590310B2 (en) | 2004-05-05 | 2009-09-15 | Facet Technology Corp. | Methods and apparatus for automated true object-based image analysis and retrieval |
- 2011-01-21: US application 13/011,036 filed; granted as US9070305B1 (status: active)
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6018697A (en) * | 1995-12-26 | 2000-01-25 | Aisin Aw Co., Ltd. | Navigation system for vehicles |
US5790974A (en) * | 1996-04-29 | 1998-08-04 | Sun Microsystems, Inc. | Portable calendaring device having perceptual agent managing calendar entries |
US20020055817A1 (en) * | 2000-08-18 | 2002-05-09 | Yue-Hong Chou | Real-time smart mobile device for location information processing |
US20040218910A1 (en) * | 2003-04-30 | 2004-11-04 | Chang Nelson L. | Enabling a three-dimensional simulation of a trip through a region |
US20100211307A1 (en) * | 2006-01-18 | 2010-08-19 | Pieter Geelen | Method of Storing the Position of a Parked Vehicle and Navigation Device Arranged for That |
US20080139245A1 (en) * | 2006-12-07 | 2008-06-12 | Samsung Electronics Co., Ltd. | Mobile terminal and schedule management method using the same |
US20080297608A1 (en) * | 2007-05-30 | 2008-12-04 | Border John N | Method for cooperative capture of images |
US20100253541A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Traffic infrastructure indicator on head-up display |
US20110282542A9 (en) * | 2009-04-03 | 2011-11-17 | Certusview Technologies, Llc | Methods, apparatus, and systems for acquiring and analyzing vehicle data and generating an electronic representation of vehicle operations |
US20120095646A1 (en) * | 2009-09-15 | 2012-04-19 | Ghazarian Ohanes D | Intersection vehicle collision avoidance system |
US8350758B1 (en) * | 2009-10-01 | 2013-01-08 | Lighthouse Signal Systems LLC | Systems and methods for indoor geolocation based on yield of RF signals |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150237481A1 (en) * | 2012-07-31 | 2015-08-20 | Ariel-University Research And Development Company Ltd. | Navigation method and device |
US10715963B2 (en) | 2012-07-31 | 2020-07-14 | Ariel-University Research And Development Company Ltd. | Navigation method and device |
US10477356B2 (en) * | 2012-07-31 | 2019-11-12 | Ariel-University Research And Development Company Ltd. | Navigation method and device |
US10641613B1 (en) * | 2014-03-14 | 2020-05-05 | Google Llc | Navigation using sensor fusion |
US9551591B2 (en) * | 2015-03-03 | 2017-01-24 | Verizon Patent And Licensing Inc. | Driving assistance based on road infrastructure information |
US9437013B2 (en) * | 2015-04-21 | 2016-09-06 | David Douglas | Simplified real time location-dependent color-coded display (“chloropleth”) system and method |
US20150227788A1 (en) * | 2015-04-21 | 2015-08-13 | David Douglas | Simplified real time location-dependent color-coded display ("chloropleth") system and method |
EP3174328A1 (en) * | 2015-11-24 | 2017-05-31 | Advanced Digital Broadcast S.A. | System and method for radio signal coverage mapping |
WO2017153143A1 (en) * | 2016-03-11 | 2017-09-14 | Osram Gmbh | Representation of orientation information by a lighting system |
US11126874B2 (en) * | 2017-02-10 | 2021-09-21 | Continental Automotive France | Method for detecting false positives relating to a traffic light |
US20200012870A1 (en) * | 2017-02-10 | 2020-01-09 | Continental Automotive France | Method for detecting false positives relating to a traffic light |
CN108734979A (en) * | 2017-04-20 | 2018-11-02 | 通用汽车环球科技运作有限责任公司 | Traffic lights detecting system and method |
US20180307925A1 (en) * | 2017-04-20 | 2018-10-25 | GM Global Technology Operations LLC | Systems and methods for traffic signal light detection |
US10699142B2 (en) * | 2017-04-20 | 2020-06-30 | GM Global Technology Operations LLC | Systems and methods for traffic signal light detection |
WO2018199941A1 (en) * | 2017-04-26 | 2018-11-01 | The Charles Stark Draper Laboratory, Inc. | Enhancing autonomous vehicle perception with off-vehicle collected data |
US10262532B2 (en) * | 2017-06-01 | 2019-04-16 | Hyundai Motor Company | System and method for providing forward traffic light information during stop |
US11138444B2 (en) * | 2017-06-08 | 2021-10-05 | Zhejiang Dahua Technology Co, , Ltd. | Methods and devices for processing images of a traffic light |
CN109388148A (en) * | 2017-08-10 | 2019-02-26 | 松下电器(美国)知识产权公司 | Moving body, control method and control program |
US20190180532A1 (en) * | 2017-12-08 | 2019-06-13 | Ford Global Technologies, Llc | Systems And Methods For Calculating Reaction Time |
CN111492366A (en) * | 2017-12-21 | 2020-08-04 | 华为技术有限公司 | Information detection method and mobile device |
US20200320317A1 (en) * | 2017-12-21 | 2020-10-08 | Huawei Technologies Co., Ltd. | Information detection method and mobile device |
EP3719692A4 (en) * | 2017-12-21 | 2020-12-30 | Huawei Technologies Co., Ltd. | Information detection method and mobile device |
CN109116846A (en) * | 2018-08-29 | 2019-01-01 | 五邑大学 | A kind of automatic Pilot method, apparatus, computer equipment and storage medium |
CN110021176A (en) * | 2018-12-21 | 2019-07-16 | 文远知行有限公司 | Traffic lights decision-making technique, device, computer equipment and storage medium |
CN110335273A (en) * | 2019-07-15 | 2019-10-15 | 北京海益同展信息科技有限公司 | Detection method, detection device, electronic equipment and medium |
CN111488821A (en) * | 2020-04-08 | 2020-08-04 | 北京百度网讯科技有限公司 | Method and device for identifying traffic signal lamp countdown information |
CN112507956A (en) * | 2020-12-21 | 2021-03-16 | 北京百度网讯科技有限公司 | Signal lamp identification method and device, electronic equipment, road side equipment and cloud control platform |
CN112581534A (en) * | 2020-12-24 | 2021-03-30 | 济南博观智能科技有限公司 | Signal lamp repositioning method and device, electronic equipment and storage medium |
CN114565897A (en) * | 2022-01-19 | 2022-05-31 | 北京深睿博联科技有限责任公司 | Traffic light intersection blind guiding method and device |
Also Published As
Publication number | Publication date |
---|---|
US9070305B1 (en) | 2015-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9070305B1 (en) | Traffic light detecting system and method | |
US12046006B2 (en) | LIDAR-to-camera transformation during sensor calibration for autonomous vehicles | |
US9057618B1 (en) | System and method of determining map coordinates from images | |
US9116011B2 (en) | Three dimensional routing | |
WO2017193933A1 (en) | Traffic accident pre-warning method and traffic accident pre-warning device | |
ES2366875T3 (en) | DEVICE AND PROCEDURE FOR UPDATING CARTOGRAPHIC DATA. | |
KR100884100B1 (en) | System and method for detecting vegetation canopy using airborne laser surveying | |
EP3904831A1 (en) | Visual localization using a three-dimensional model and image segmentation | |
KR101261409B1 (en) | System for recognizing road markings of image | |
EP3644013B1 (en) | Method, apparatus, and system for location correction based on feature point correspondence | |
EP2737279A1 (en) | Variable density depthmap | |
WO2020093966A1 (en) | Positioning data generation method, apparatus, and electronic device | |
EP3644229A1 (en) | Method, apparatus, and system for determining a ground control point from image data using machine learning | |
WO2020093939A1 (en) | Positioning method, device, and electronic apparatus | |
US11055862B2 (en) | Method, apparatus, and system for generating feature correspondence between image views | |
US20240272308A1 (en) | Systems and Methods of Determining an Improved User Location Using Real World Map and Sensor Data | |
JP6165422B2 (en) | Information processing system, information processing device, server, terminal device, information processing method, and program | |
US9286689B2 (en) | Method and device for detecting the gait of a pedestrian for a portable terminal | |
KR20170015754A (en) | Vehicle Location Method of Skyline | |
JP2024138342A (en) | Map representation data processing device, information processing method, and program | |
US11700504B2 (en) | Method and system for calibration of an indoor space | |
Novais et al. | Community based repository for georeferenced traffic signs | |
KR101181294B1 (en) | Positioning method of reported object image and processing system thereof | |
KR100959246B1 (en) | A method and a system for generating geographical information of city facilities using stereo images and gps coordination | |
CN106161803B (en) | A kind of scene recognition method based on indoor positioning and GPS |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RAMAN, TIRUVILWAMALAI VENKATRAMAN;CHEN, CHARLES L.;LEVANDOWSKI, ANTHONY SCOTT;SIGNING DATES FROM 20110111 TO 20110426;REEL/FRAME:026216/0289 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044334/0466 Effective date: 20170929 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |