US9123152B1 - Map reports from vehicles in the field - Google Patents

Map reports from vehicles in the field

Info

Publication number
US9123152B1
US9123152B1 US13/465,578 US201213465578A
Authority
US
United States
Prior art keywords
map
image
vehicle
laser sensor
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/465,578
Inventor
Andrew Hughes Chatham
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Waymo LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US13/465,578 priority Critical patent/US9123152B1/en
Application filed by Google LLC filed Critical Google LLC
Assigned to GOOGLE INC. reassignment GOOGLE INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHATHAM, ANDREW HUGHES
Priority to US14/813,822 priority patent/US9810540B1/en
Application granted granted Critical
Publication of US9123152B1 publication Critical patent/US9123152B1/en
Assigned to WAYMO HOLDING INC. reassignment WAYMO HOLDING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GOOGLE INC.
Assigned to WAYMO LLC reassignment WAYMO LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WAYMO HOLDING INC.
Assigned to GOOGLE LLC reassignment GOOGLE LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: GOOGLE INC.
Priority to US15/729,182 priority patent/US10520323B1/en
Assigned to GOOGLE LLC reassignment GOOGLE LLC CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECTIVE BY NULLIFICATION TO CORRECT INCORRECTLY RECORDED APPLICATION NUMBERS PREVIOUSLY RECORDED ON REEL 044142 FRAME 0357. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME. Assignors: GOOGLE INC.
Priority to US16/690,246 priority patent/US11519739B1/en
Assigned to WAYMO LLC reassignment WAYMO LLC SUBMISSION TO CORRECT AN ERROR MADE IN A PREVIOUSLY RECORDED DOCUMENT THAT ERRONEOUSLY AFFECTS THE IDENTIFIED APPLICATIONS Assignors: WAYMO LLC
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 - Map- or contour-matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 - Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13 - Receivers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 - Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/44 - Browsing; Visualisation therefor
    • G06F16/444 - Spatial browsing, e.g. 2D maps, 3D or virtual spaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/005 - Tree description, e.g. octree, quadtree
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Definitions

  • Autonomous vehicles use various computing systems to aid in the transport of passengers from one location to another. Some autonomous vehicles may require some initial input or continuous input from an operator, such as a pilot, driver, or passenger. Other systems, such as autopilot systems, may be used only when the system has been engaged, which permits the operator to switch from a manual mode (where the operator exercises a high degree of control over the movement of the vehicle) to an autonomous mode (where the vehicle essentially drives itself) to modes that lie somewhere in between.
  • the autonomous vehicle may rely on maps, particularly in autonomous mode, for navigating the vehicle.
  • the maps relied on by the autonomous vehicles may be out of date or otherwise inaccurate as compared to reality, due to construction, accidents or landscape changes.
  • a prohibited zone or blockage may be created on that map preventing entry by the autonomous vehicles in an autonomous mode.
  • the method may include receiving image data from a laser sensor.
  • the image data may be collected along a vehicle path and compiled to form a first image.
  • the method may also include identifying a map related to the vehicle path, comparing the map to the first image, and assessing validity of the map based on the comparison.
  • a signal may be transmitted to a server via a network indicating the result of the assessment.
  • the map may include an area prohibiting entry by a vehicle.
  • Another aspect of the disclosure provides a method for assessing validity of a map, the method including determining a plurality of locations of a vehicle along a path. A trajectory of the vehicle may be determined based on the locations and compared with the map, and a validity of the map may be assessed based on the comparison.
  • the method may include a determination of whether the map includes a plausible path consistent with the trajectory.
  • a signal may be transmitted to a server indicating that the map is invalid.
  • a signal may be transmitted to a server indicating that the map is valid.
  • Yet another aspect of the disclosure provides a method for assessing validity of a map, the method including receiving a compressed image describing a vehicle path of an area, reconstructing the compressed image to form a first image, identifying a map related to the area, and displaying the first image in relation to the map.
  • the compressed image is derived from image data collected by a laser sensor.
  • the system may include a processor, and may also include at least one of a laser sensor, a Global Positioning Sensor, and a camera.
  • a memory may store a map associated with a vehicle path, and may include data and instructions executable by the processor. The data and instructions, when executed by the processor, may determine a plurality of locations of a vehicle along the vehicle path, and may determine a trajectory of the vehicle based on the locations. The trajectory may be compared with the map, and a validity of the map may be assessed based on the comparison.
  • the system may include a cellular modem configured to transmit a result of the assessment to a server.
  • Yet another aspect of the disclosure provides a method for assessing validity of a map, in which a current location of a vehicle is determined.
  • the method may include determination of whether the map includes a plausible location consistent with the current location of the vehicle. Validity of the map may be assessed based on the determination.
  • FIG. 1 is a block diagram of a network that connects an autonomous vehicle and a server in accordance with aspects of the disclosure.
  • FIG. 2 is a block diagram of a configuration of the autonomous vehicle according to one aspect.
  • FIG. 3 is a schematic side view of the vehicle.
  • FIG. 4 is a block diagram of a configuration of an autonomous component of the vehicle.
  • FIG. 5 is a schematic top down view of a road and a trajectory of the vehicle along the road.
  • FIGS. 5A-C are schematic views of raw images captured by a laser sensor as the vehicle maneuvers along the road of FIG. 5 .
  • FIG. 5D is a compiled image produced by a compilation unit of the autonomous component according to one aspect.
  • FIG. 6 is a flowchart illustrating operations by the autonomous component according to one aspect.
  • FIG. 6A is a flowchart illustrating operations by the autonomous component according to another aspect.
  • FIG. 7A is a schematic view of a map stored in a map database illustrating a work-zone area under construction.
  • FIG. 7B is a schematic view of a compiled image corresponding to the same area of FIG. 7A after construction is over.
  • FIG. 7C is a schematic view of an actual trajectory after construction is over.
  • FIG. 8 is a flowchart illustrating operations by the autonomous component according to one aspect.
  • FIG. 9 is a flowchart illustrating operations by the server according to one aspect.
  • FIG. 10 is a compiled image based on real raw data generated by a laser sensor depicting an area having a subarea under construction.
  • FIG. 11 is another compiled image illustrating the same area as FIG. 10 after the construction is over.
  • Flowcharts may be used in the drawings to illustrate processes, operations or methods performed by components, devices, parts, systems, or apparatuses disclosed herein.
  • the flowcharts are mere exemplary illustrations of steps performed in individual processes, operations or methods. Steps may not be performed in the precise order as illustrated in the flowcharts. Rather, various steps may be handled simultaneously or performed in sequences different from that illustrated. Steps may also be omitted from or added to the flowcharts unless otherwise stated.
  • FIG. 1 illustrates an environment 100 in which embodiments of the present invention may be utilized, including an autonomous vehicle 110 , a server 120 , and a network 130 that facilitates direct or indirect communication between the autonomous vehicle 110 and the server 120 .
  • a user 140 may interact with the autonomous vehicle 110 via the server 120 .
  • the network 130 may be, e.g., a wireless network, such as the Global System for Mobile Communications/General Packet Radio service (GSM/GPRS), Code Division Multiple Access (CDMA), Enhanced Data Rates for Global Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), or a broadband network such as Bluetooth and Wi-Fi (the brand name for products using IEEE 802.11 standards).
  • the network 130 may be identified by a Service Set Identifier (SSID), which is the key to access the network 130 by a wireless device.
  • Each component coupled to the network 130, e.g., the vehicle 110 or the server 120, is a node in the network 130.
  • Each node on the network 130 is preferably uniquely identified by a Media Access Control address (MAC address).
  • the present invention is not limited to the network types and network components described in the illustrative embodiment of FIG. 1 , other network types and network components may also be used.
  • more than one autonomous vehicle 110 may be included in the network 130 , and each may be identified by a unique MAC address.
  • more than one server 120 may be included in the network 130 , and the servers may work independently from or collaboratively with each other.
  • a server 120 may include a processor and a memory (not shown).
  • the server 120 may store in its memory information relevant to the navigation of the vehicle 110 .
  • Such information may include maps, traffic patterns, and road conditions.
  • the server 120 may receive from the vehicle 110 information related to one or more of the following: map updates, map corrections, traffic pattern updates, and traffic pattern corrections.
  • the server 120 may store the received information in its memory.
  • the server 120 may distribute the received information to other vehicles 110 via the network 130 .
  • a vehicle 110 may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, busses, boats, airplanes, helicopters, lawnmowers, recreational vehicles, amusement park vehicles, construction vehicles, farm equipment, trams, golf carts, trains, and trolleys.
  • FIG. 2 is a block diagram illustrating hardware configurations of the vehicle 110 in accordance with one aspect of the disclosure.
  • the vehicle 110 may include one or more of the following components: a processor 150 , a memory 152 , a cellular modem 154 , user I/O devices 156 , a laser sensor 160 , a radar sensor 162 , a camera 164 , a Global Positioning System (GPS) sensor 166 , a non-autonomous component 168 , and an autonomous component 170 .
  • These components may be operatively connected with each other via physical coupling and/or electrical coupling.
  • One or more of these components may transmit or receive executable instructions, in analog or digital signals, to or from the other components.
  • one or more of these components may also transmit or receive communications to or from the server 120 via the network 130 . Details with respect to each component are given below.
  • the processor 150 may be any conventional processor, such as processors from the Intel Corporation or Advanced Micro Devices (“AMD”). Alternatively, the processor 150 may be a dedicated device such as an application-specific integrated circuit (“ASIC”). The processor 150 may refer to a collection of processors that may or may not operate in parallel. In one aspect, the vehicle 110 as a whole may have a single processor 150 to perform acts described herein. Alternatively, one or more components of the vehicle 110, e.g., the autonomous component 170, may each have their own processor executing instructions specific to each individual component.
  • the processor 150 or a collection of processors is physically mounted within the vehicle 110 . In some other aspects, the processor 150 or the collection of processors are physically located away from the vehicle 110 and communicate with the vehicle 110 via the network 130 . Alternatively, one or more processors 150 of the collection of processors are physically mounted within the vehicle 110 , while the remaining processors are remotely connected with the vehicle 110 via the network 130 .
  • the memory 152 may include a volatile memory, a non-volatile memory, or a combination thereof.
  • the volatile memory may include a RAM, such as a dynamic random access memory (DRAM) or static random access memory (SRAM), or any other forms of alterable memory that may be electrically erased and reprogrammed.
  • the non-volatile memory may include a ROM, a programmable logical array, or other forms of non-alterable memory which cannot be modified, or can be modified only slowly or with difficulty.
  • the cellular modem 154 may include a transmitter and receiver.
  • the modem 154 may receive and transmit information via the network 130 .
  • the modem 154 may connect the vehicle 110 to other nodes in the network 130 , e.g., the server 120 or other vehicles in the network 130 .
  • the modem 154 may transmit to the server 120 information such as maps, information about traffic patterns, and road conditions.
  • the modem 154 may also communicate with roadside sensors, such as traffic cameras or laser sensors stationed on the side of a road.
  • the user I/O devices 156 may facilitate communication between a user, e.g., a driver or a passenger in the vehicle 110, and the vehicle 110 itself.
  • the user I/O devices 156 may include a user input device, which may include a touch screen. The touch screen may allow the user to switch the vehicle 110 between two operation modes: an autonomous or self-driving mode, and a non-autonomous or operator-driving mode.
  • the user I/O devices 156 may also include a user output device, such as a display or a status bar.
  • the display may display information regarding the status of the vehicle 110 .
  • the status bar may indicate the current status of the vehicle 110 , e.g., the present operation mode or the present speed.
  • the laser sensor 160 , the radar sensor 162 , the camera 164 , and the Global Positioning System (GPS) sensor 166 each may observe the environment of the vehicle 110 , and provide observation data to the autonomous component 170 of the vehicle 110 for analysis.
  • the observation data may include one or more objects in the surrounding environment of the vehicle 110 , such as vehicles, traffic obstacles, traffic signs/signals, trees, and people.
  • one or more of the sensors may continuously produce observation data to reflect changes or updates in the environment.
  • the sensors may provide updated observation data to the autonomous component 170 in real-time or quasi-real time, or on demand.
  • the autonomous component 170 may vary navigation parameters, e.g., direction and/or speed of the vehicle 110 , as a response to changes in the environment.
  • the laser sensor 160 may detect any surrounding object that absorbs/reflects energy radiated from the laser sensor 160 .
  • the laser sensor 160 may refer to one or more laser sensors, e.g., 160 a and 160 b of FIG. 3 .
  • the laser sensors 160 a and 160 b may be respectively mounted on the top and the front of the vehicle 110 .
  • the laser sensor 160 a positioned at the top of the vehicle 110 may have a horizontal field of view in the range between 50 and 80 meters, and a vertical field of view of about 30°.
  • the laser sensor 160 b positioned at the front of the vehicle 110 may have a horizontal field of view of about 150 meters, and a vertical field of view of about 30°.
  • the fields of view of each laser sensor 160 a and 160 b may vary as needed.
  • the laser sensor 160 a may be physically connected to the vehicle 110 in a manner that the laser sensor 160 a may rotate 360° about a rotation axis “R”. In one aspect, to determine a distance between the vehicle 110 and a surrounding object, the laser sensor 160 a may first rotate to face a surrounding object, and record observation data while facing the surrounding object. The laser sensor 160 a may then output the observation data to the autonomous component 170 for determination of the distance.
  • the radar sensor 162 may refer to one or more radar sensors, e.g., 162 a - c of FIG. 3 .
  • the radar sensors 162 a - c may be located at various positions on the vehicle 110, e.g., the front or the back of the vehicle 110, or one or both lateral sides of the front bumper. In FIG. 3, the radar sensors 162 a - c are positioned, respectively, at the front of the vehicle 110, at the back of the vehicle 110, and at a left lateral side of the front bumper. Another radar sensor (not shown) may be positioned at a right lateral side of the front bumper.
  • one or more of the radar sensors may have a horizontal field of view of about 200 meters, and a vertical field of view of about 18°.
  • the fields of view of each radar sensor 162 a - c may vary as needed.
  • the camera 164 may refer to one or more cameras mounted on the vehicle 110 . As shown in FIG. 3 , two cameras 164 a - b may be mounted under a windshield 169 near the rear view mirror (not shown). The cameras 164 a - b may have identical or different fields of view. For instance, one camera 164 a may have a horizontal field of view of about 200 meters and a vertical field of view of about 30°, and the other camera 164 b may have a horizontal field of view of about 100 meters and a vertical field of view of about 60°. The fields of view of each camera 164 a - b may vary as needed. The cameras 164 a - b may provide image data to the autonomous component 170 for computing a distance between various objects.
  • the arrangement of the various sensors discussed above with reference to FIG. 3 is merely exemplary. The arrangement of the sensors may vary as needed.
  • the non-autonomous component 168 may include hardware typically found in a non-autonomous car.
  • the non-autonomous component 168 may include one or more of the following: engine components and parts, braking system, suspension and steering system, transmission system, wheels and tire parts, lighting and signaling system, and other devices or systems that facilitate manual operation of the vehicle 110.
  • the autonomous component 170 may operate the vehicle 110 autonomously or semi-autonomously, without user interventions.
  • the autonomous component 170 may control a set of navigation parameters of the vehicle 110 , which may relate to the speed or direction of the vehicle 110 .
  • the autonomous component 170 may translate the navigation parameters into physical actions of the vehicle 110 .
  • the autonomous component 170 may actuate systems or parts of the non-autonomous component 168 , e.g., braking system or engine, based on the navigation parameters.
  • the autonomous component 170 may also vary settings of the systems or parts of the non-autonomous component 168 based on the navigation parameters.
  • the autonomous component 170 may adjust the vehicle 110 in response to changes in the surrounding environment of the vehicle 110 . Specifically, the autonomous component 170 may receive observation data produced by various sensors discussed above. The autonomous component 170 may adjust one or more of the navigation parameters based on the observation data.
  • the autonomous component 170 may synthesize the observation data, and use the synthesized data to assess validity, expiration or accuracy of maps relied on by the vehicle 110. More details regarding this aspect are discussed next with reference to FIG. 4.
  • FIG. 4 is a block diagram illustrating an example of a configuration of the autonomous component 170 .
  • the autonomous component 170 may include one or more of the following units: a compilation unit 172 , a compression unit 174 , a localization unit 176 , a projection unit 178 , a map database 180 , a validation unit 184 , and a comparison unit 186 . Details with respect to each unit are as follows.
  • the compilation unit 172 may receive observation data captured by the laser sensor 160 , and may compile the observation data.
  • the observation data may be raw images that have not been marked upon or altered by the processor 150 .
  • FIGS. 5A-5C illustrate schematic views of raw images captured by one of the laser sensors, e.g., 160 a - b , at various points 512 - 516 along a vehicle path illustrated in FIG. 5 .
  • the road 500 includes two lanes: Lane 1 and Lane 2. Each lane may have a length of about 80 m.
  • at least one of the laser sensors e.g., 160 a - b , captures raw images of the surrounding environment, e.g., the road 500 , at various points, e.g., 512 - 526 , along the path of the vehicle 110 .
  • FIGS. 5A-C illustrate raw images 500 a - c that are taken by the laser sensor 160 a or 160 b at points 512 - 516, respectively.
  • One or more raw images may capture various details of the path taken by the vehicle 110 , including the type, color and number of lines on the road 500 .
  • the raw images may indicate one or more of the following: a solid yellow line, a broken yellow line, solid yellow double lines, two sets of solid double yellow lines spaced 2 feet or more apart, a solid white line, a broken white line, and double white lines.
  • the raw images 500 a - c each illustrate two broken white lines that are separated by a standard 12-foot lane width.
  • the compilation unit 172 may compile the raw images. For instance, the compilation unit 172 may synthesize raw images captured over a given period of time, or along a given path. The compilation unit 172 may synthesize the raw images when the vehicle 110 operates in either the autonomous mode or the non-autonomous mode. For instance, the compilation unit 172 may request or receive raw images from the laser sensor 160 when the non-autonomous mode starts, and stop requesting or receiving raw images from the laser sensor 160 when the non-autonomous mode stops.
  • the compilation unit 172 may render a raw image from a 3D view into a 2D view image.
  • the rendering process may take into consideration the configuration of the laser sensor 160, e.g., the horizontal and vertical fields of view, and/or the degree of rotation.
  • the compilation unit 172 may also assemble all the 2D view images together to form a clean Mercator projection of the path taken by the vehicle 110. During this process, some of the 2D view images may partially overlap each other.
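  • The patent does not spell out an algorithm for this rendering and assembly step. Purely as an illustration, the following Python sketch accumulates hypothetical laser returns, assumed to arrive as (x, y, intensity) points already expressed in a common map frame, into a single top-down grid image covering the vehicle path; the function name compile_scans and the 0.2 m cell size are invented for the example.
```python
import numpy as np

CELL_SIZE_M = 0.2  # assumed grid resolution; the patent does not specify one


def compile_scans(scans, x_range, y_range):
    """Accumulate per-scan laser returns into one top-down 2D image.

    scans   -- iterable of (N, 3) arrays of (x, y, intensity) points, already
               expressed in a common map frame (an assumption; the patent only
               says 3D views are rendered into 2D views and assembled).
    x_range -- (min_x, max_x) extent of the compiled image, in meters.
    y_range -- (min_y, max_y) extent of the compiled image, in meters.
    """
    width = int((x_range[1] - x_range[0]) / CELL_SIZE_M)
    height = int((y_range[1] - y_range[0]) / CELL_SIZE_M)
    total = np.zeros((height, width))            # summed intensity per cell
    hits = np.zeros((height, width), dtype=int)  # number of returns per cell

    for points in scans:
        cols = ((points[:, 0] - x_range[0]) / CELL_SIZE_M).astype(int)
        rows = ((points[:, 1] - y_range[0]) / CELL_SIZE_M).astype(int)
        ok = (cols >= 0) & (cols < width) & (rows >= 0) & (rows < height)
        np.add.at(total, (rows[ok], cols[ok]), points[ok, 2])
        np.add.at(hits, (rows[ok], cols[ok]), 1)

    # Average intensity where cells were observed; overlapping scans blend.
    return np.where(hits > 0, total / np.maximum(hits, 1), 0.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two fake scans covering a straight road segment 0-40 m long, 12 m wide.
    fake_scans = [np.column_stack([rng.uniform(0, 40, 500),
                                   rng.uniform(-6, 6, 500),
                                   rng.uniform(0, 1, 500)]) for _ in range(2)]
    compiled = compile_scans(fake_scans, x_range=(0, 40), y_range=(-6, 6))
    print(compiled.shape)  # (60, 200)
```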
  • FIG. 5D is a pictorial, schematic view of a compiled image 500 d produced by the compilation unit 172 , indicating the current state of the road 500 .
  • the solid lines are derived from compilation of the raw images, representing the current state of the road 500 .
  • the dashed lines may represent the past state of the road 500 recorded on a map that has been previously prepared.
  • discrepancies exist between the current state of the road 500 and the past state of the same road. Accordingly, the discrepancies in FIG. 5D may suggest that the previously prepared map, which records the past state of the road 500, has become invalid or expired.
  • FIGS. 10 and 11 provide examples of images compiled from real raw images captured by a laser sensor about the same area.
  • FIG. 10 represents a compiled image of the area, having a subarea 1000 under construction. As shown in FIG. 10 , the subarea 1000 is plain, and no features are illustrated therein.
  • FIG. 11 represents a compiled image of the same area, illustrating the state of the subarea 1000 after completion of the construction. As shown in FIG. 11, as a result of the construction, the subarea 1000 is now painted with white lane lines on the road. Accordingly, a comparison of FIGS. 10 and 11 reveals that any map based on the image of FIG. 10 has become invalid, outdated or expired.
  • the compression unit 174 may perform image compression to reduce irrelevance and redundancy of the compiled image output by the compilation unit 172 .
  • the compression unit 174 may reduce quality, resolution or size of the compiled image.
  • the image compression technique performed by the compression unit 174 may be lossy or lossless.
  • the compression unit 174 may use one or more of the following techniques to compress images: run-length encoding, entropy encoding, deflation, chain codes, chroma subsampling, transform coding, and fractal compression.
  • the compression unit 174 may output the compressed image to the cellular modem 154 .
  • the cellular modem 154 may send the compressed image to the server 120 via the network 130 .
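  • Run-length encoding is the first technique in the list above. The sketch below illustrates the general idea on a flat sequence of 8-bit pixel values; it is not the compression unit's actual encoder.
```python
from itertools import groupby


def rle_encode(row_major_pixels):
    """Run-length encode a flat sequence of pixel values.

    Returns a list of (value, run_length) pairs. The scheme pays off on
    images with long runs of identical values, such as the large blank
    regions of a top-down road image.
    """
    return [(value, sum(1 for _ in run)) for value, run in groupby(row_major_pixels)]


def rle_decode(runs):
    """Invert rle_encode back into the flat pixel sequence."""
    out = []
    for value, length in runs:
        out.extend([value] * length)
    return out


if __name__ == "__main__":
    # A 1x16 strip: mostly background (0) with a short lane marking (255).
    strip = [0] * 10 + [255] * 3 + [0] * 3
    runs = rle_encode(strip)
    print(runs)                      # [(0, 10), (255, 3), (0, 3)]
    assert rle_decode(runs) == strip
```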
  • FIG. 6 provides a flowchart 600 illustrating a method performed by the autonomous component 170 for processing raw images.
  • the compilation unit 172 of the autonomous component 170 receives one or more raw images from the laser sensor 160 . Thereafter, at block 604 , the compilation unit 172 generates a compiled image based on the raw images received at block 602 . Specifically, the compilation unit 172 may render a plurality of raw images taken at different locations along an actual path of the vehicle 110 into a single compiled image. The single compiled image indicates features of the actual path.
  • the compression unit 174 of the autonomous component 170 compresses the compiled image produced by the compilation unit 172 into a reduced form.
  • the reduced form may be the result of a reduction in image quality, a reduction in image resolution, or a reduction in image size.
  • the autonomous component 170 outputs the compressed image generated by the compression unit 174 to the cellular modem 154 .
  • the cellular modem 154 may then transmit the compressed image to the server 120 via the network 130 .
  • the autonomous component 170 may selectively transmit either the compiled image outputted by the compilation unit 172 or the compressed image outputted by the compression unit 174 to the server 120 . For instance, when the bandwidth is low, the autonomous component 170 may transmit the compressed image outputted by the compression unit 174 to the server 120 . Conversely, when the bandwidth is high, the autonomous component 170 may transmit the compiled image outputted by the compilation unit 172 to the server 120 .
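  • The bandwidth-based selection might be pictured as follows; the 500 kbit/s threshold and the function name are placeholders, since the patent only states that the choice depends on available bandwidth.
```python
# Hypothetical bandwidth threshold (kbit/s) below which the compressed
# image is preferred; the patent gives no concrete figure.
LOW_BANDWIDTH_KBPS = 500


def select_image_for_upload(compiled_image_bytes, compressed_image_bytes,
                            available_bandwidth_kbps):
    """Pick which representation to hand to the cellular modem.

    compiled_image_bytes   -- full-quality compiled image (larger payload)
    compressed_image_bytes -- reduced-size image from the compression unit
    """
    if available_bandwidth_kbps < LOW_BANDWIDTH_KBPS:
        return compressed_image_bytes   # scarce bandwidth: send the small payload
    return compiled_image_bytes         # ample bandwidth: send full quality


if __name__ == "__main__":
    full = b"\x00" * 10_000
    small = b"\x00" * 1_000
    assert select_image_for_upload(full, small, 200) is small
    assert select_image_for_upload(full, small, 5_000) is full
```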
  • the autonomous component 170 may temporarily or permanently store a plurality of maps of the real world in a map database 180 .
  • the map database 180 may be a relational database. Maps may be stored in one or more of the following formats: compressed or uncompressed, lossless (e.g., BMP) or lossy (e.g., JPEG), and bitmap or vector-based (e.g., SVG), as well as computer instructions for drawing graphics.
  • the maps may comprise any information sufficient to identify the relevant information, such as numbers, descriptive text, proprietary codes, references to data stored in areas of the same memory or different memories (including other network locations) or information that is used by a function to calculate the relevant data.
  • the maps may include environmental data that was obtained at a previous point in time and is expected to persist regardless of the vehicle's presence in the environment.
  • a map may include detailed map information, e.g., highly detailed maps detecting the shape and elevation of roadways, lane lines, intersections, crosswalks, speed limits, traffic signals, buildings, signs, real time traffic information, or other such features and information. These features may be persistent. For example, when the vehicle 110 approaches the location of a feature in the detailed map information, the vehicle 110 may expect to detect the feature.
  • the detailed map information may also include explicit speed limit information associated with various roadway segments. The speed limit data may be entered manually or scanned from previously taken images of a speed limit sign using, for example, optical-character recognition.
  • the detailed map information may also include two-dimensional street-level imagery, such as highly detailed image data depicting the surroundings of a vehicle from the vehicle's point-of-view.
  • the detailed map information may also include three-dimensional terrain maps incorporating one or more of the objects listed above.
  • the maps may include detailed map information such as zone information, indicating zones that are unsuitable for driving autonomously. For example, an on-ramp, off-ramp, or other complicated or high traffic areas may be identified as such zones as a driver may feel the need to continuously monitor the vehicle in case the driver must take control. Other zones may be identified as unsuitable for any driving, such as a sidewalk, river, mountain, cliff, desert, etc.
  • the detailed map information may also include one or more roadgraphs or graph networks of information such as roads, lanes, intersections, and the connections between these features.
  • Each feature may be stored as graph data and may be associated with information such as a geographic location and whether or not it is linked to other related features. For example, a stop sign may be linked to a road and an intersection, etc.
  • the associated data may include grid-based indices of a roadgraph to allow efficient lookup of certain roadgraph features.
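  • The roadgraph and its grid-based index can be pictured as a small graph of linked features plus a coarse spatial hash; every name and the 10 m cell size below are invented for illustration.
```python
from collections import defaultdict
from dataclasses import dataclass, field

GRID_CELL_M = 10.0  # assumed index resolution; not specified in the patent


@dataclass
class RoadFeature:
    """One roadgraph feature, e.g. a lane segment, intersection, or stop sign."""
    feature_id: str
    kind: str                    # "lane", "intersection", "stop_sign", ...
    location: tuple              # representative (x, y) position in meters
    linked_to: set = field(default_factory=set)  # ids of related features


class Roadgraph:
    def __init__(self):
        self.features = {}
        self.grid = defaultdict(set)  # (cell_x, cell_y) -> feature ids

    def _cell(self, location):
        return (int(location[0] // GRID_CELL_M), int(location[1] // GRID_CELL_M))

    def add(self, feature):
        self.features[feature.feature_id] = feature
        self.grid[self._cell(feature.location)].add(feature.feature_id)

    def link(self, id_a, id_b):
        # Record the relationship in both directions, e.g. stop sign <-> lane.
        self.features[id_a].linked_to.add(id_b)
        self.features[id_b].linked_to.add(id_a)

    def near(self, location):
        """Features indexed in the grid cell containing `location`."""
        return [self.features[fid] for fid in self.grid[self._cell(location)]]


if __name__ == "__main__":
    rg = Roadgraph()
    rg.add(RoadFeature("lane-1", "lane", (5.0, 2.0)))
    rg.add(RoadFeature("stop-1", "stop_sign", (8.0, 3.0)))
    rg.link("stop-1", "lane-1")
    print([f.feature_id for f in rg.near((6.0, 4.0))])  # both ids: same cell
```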
  • the map information may include zones.
  • a zone may include a place where driving may become complicated or challenging for humans and computers, such as merges, construction zones, or other obstacles.
  • a zone's rules may require an autonomous vehicle to alert the driver that the vehicle is approaching an area where it may be challenging for the vehicle to drive autonomously.
  • the vehicle may require a driver to take control of steering, acceleration, deceleration, etc.
  • FIG. 7A illustrates a map 700 a including a problematic zone 715 , which is a construction or work zone unsuitable for driving autonomously.
  • a zone's rules may require an autonomous vehicle to alert the driver, but rather than requiring the driver to take control, the vehicle may lower its speed and/or increase its following distance (between the autonomous vehicle and another vehicle).
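  • One way to picture such zone rules is as a lookup from zone type to required vehicle behavior; the rule table below is purely illustrative and not an encoding used by the patent.
```python
# Illustrative zone rule table. The patent describes the behaviors (alert the
# driver, hand over control, lower speed, increase following distance) but
# does not prescribe any concrete encoding.
ZONE_RULES = {
    "construction": {"alert_driver": True, "require_manual_control": True},
    "merge":        {"alert_driver": True, "lower_speed": True,
                     "increase_following_distance": True},
}


def actions_for_zone(zone_type):
    """Return the behavior flags for a zone, or an empty rule set if unknown."""
    return ZONE_RULES.get(zone_type, {})


if __name__ == "__main__":
    print(actions_for_zone("construction"))
    print(actions_for_zone("open_highway"))   # {} -> no special handling
```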
  • the autonomous component 170 may include a comparison unit 186 that assesses validity of a map retrieved from the map database 180 .
  • the comparison unit 186 may receive a map from the map database 180 , and may receive the compiled image from the compilation unit 172 , where both the map and the compiled image relate to the same area, or relate to the same problematic zone.
  • the comparison unit 186 may compare the compiled image to the map of the same area/zone, and determine if there is any discrepancy. If there is, the comparison unit 186 may issue an alert signal to the cellular modem 154 , indicating that the map is invalid or outdated.
  • the cellular modem 154 may transmit the alert signal to the server 120 via the network 130 .
  • 700 a may represent the map related to Area 1 retrieved from the map database 180
  • 700 b may represent the compiled image generated by the compilation unit 172 which also relates to Area 1 . Because the map 700 a illustrates a work zone 715 absent from the compiled image 700 b , the comparison unit 186 may issue an alert indicating that the map 700 a is invalid or outdated.
  • FIG. 6A provides a flowchart 650 illustrating a method performed by the autonomous component 170 according to this aspect.
  • Blocks 652 - 654 may be identical to blocks 602 - 604 discussed with reference to FIG. 6 .
  • the autonomous component 170 retrieves a map from the map database 180 corresponding to the same geographical area captured by the raw data at block 652 .
  • the comparison unit 186 compares the retrieved map to the compiled image obtained at block 654 . If the comparison unit 186 determines in block 660 that no discrepancy exists between the retrieved map and the compiled image, the comparison unit 186 may reach a conclusion that the retrieved map remains valid or up-to-date.
  • the method may terminate or, alternatively, may issue a signal to the cellular modem 154 indicating that the retrieved map remains valid (block 662 ).
  • the comparison unit 186 may issue a signal to the cellular modem 154 that the map becomes invalid, outdated, or expired (block 664 ).
  • the cellular modem 154 may subsequently pass signals to the server 120 .
  • the server 120 may then make decisions as to how to update the map.
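  • As a rough sketch of the comparison in flowchart 650 (not the patent's actual test), both the retrieved map and the compiled image can be rasterized onto the same grid and a discrepancy flagged when the fraction of disagreeing cells exceeds a threshold; the 2% threshold is an assumption.
```python
import numpy as np

DISCREPANCY_THRESHOLD = 0.02  # assumed; the patent does not quantify "discrepancy"


def map_is_valid(map_grid, compiled_grid):
    """Compare a rasterized map against the compiled laser image.

    Both inputs are boolean 2D arrays on the same grid, where True marks a
    cell occupied by a mapped or observed feature (e.g. a lane line).
    Returns True when the two agree closely enough for the map to be
    considered still valid.
    """
    if map_grid.shape != compiled_grid.shape:
        raise ValueError("map and compiled image must cover the same grid")
    disagreement = np.mean(map_grid != compiled_grid)
    return disagreement <= DISCREPANCY_THRESHOLD


def report_map_status(map_grid, compiled_grid, send_to_server):
    """Signal the result of the assessment, as in the last blocks of FIG. 6A."""
    if map_is_valid(map_grid, compiled_grid):
        send_to_server({"map_status": "valid"})
    else:
        send_to_server({"map_status": "invalid_or_outdated"})


if __name__ == "__main__":
    stored_map = np.zeros((50, 50), dtype=bool)
    stored_map[:, 10] = True          # a lane line recorded on the map
    observed = np.zeros((50, 50), dtype=bool)
    observed[:, 25] = True            # the line has moved after construction
    report_map_status(stored_map, observed, send_to_server=print)
```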
  • the localization unit 176 of the autonomous component 170 may determine the current geographic location of the vehicle 110 .
  • the localization unit 176 may derive the present location of the vehicle 110 based on information collected or derived from one or more of the following sensors: the GPS sensor 166 , the laser sensor 160 , the camera 164 , and an inertial-aided sensor (not shown).
  • the localization unit 176 may determine the location of the vehicle, including one or more of the following: an absolute geographical location, e.g., latitude, longitude, and altitude, and a relative location, e.g., location relative to other cars immediately around the vehicle 110 .
  • the mechanism used to determine the relative location may produce less signal distortion compared to the mechanism used to determine the absolute geographical location. Accordingly, the relative location may be more accurate than the absolute geographical location.
  • the location of the vehicle 110 may also indicate the position of the vehicle 110 relative to a ground level, e.g., an underground position when the vehicle is in a tunnel or a cave, or an aboveground position.
  • the localization unit 176 may run for a given time frame, for a given area of interest, or when the vehicle 110 is in either the autonomous mode or the non-autonomous mode. In one aspect, the localization unit 176 starts to determine the location of the vehicle 110 once the vehicle 110 enters a particular zone. In another aspect, the localization unit 176 starts to determine the location of the vehicle 110 once the autonomous mode is activated. The localization unit 176 may output the current location of the vehicle 110 to the projection unit 178.
  • the projection unit 178 may receive locations of the vehicle 110 periodically from the localization unit 176. Alternatively, the projection unit 178 may request locations of the vehicle 110 from the localization unit 176 on demand. The projection unit 178 may keep a record of the locations of the vehicle 110 during a given time frame, or while the vehicle 110 is in an area of interest, e.g., a problematic area where obstacles were previously recorded. Alternatively, the projection unit 178 may start recording the locations of the vehicle 110 when the vehicle 110 enters an autonomous mode. The projection unit 178 may stop recording once the vehicle 110 exits the autonomous mode. Based on the series of locations recorded, the projection unit 178 may project an actual trajectory taken by the vehicle 110. For instance, as illustrated in FIG. 7C, the projection unit 178 determines an actual trajectory 720 taken by the vehicle 110 in an area.
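  • A minimal picture of the projection unit, assuming locations arrive as timestamped (x, y) fixes: record them while a condition of interest holds and return the ordered polyline as the actual trajectory. The class and method names are placeholders, not the patent's implementation.
```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class TrajectoryProjector:
    """Records vehicle locations and projects them into an actual trajectory."""
    recording: bool = False
    _fixes: List[Tuple[float, float, float]] = field(default_factory=list)

    def start(self):
        """E.g. called when the vehicle enters autonomous mode or an area of interest."""
        self.recording = True
        self._fixes.clear()

    def stop(self):
        """E.g. called when the vehicle exits the mode or area being tracked."""
        self.recording = False

    def on_location(self, timestamp, x, y):
        """Called by the localization unit with each new fix."""
        if self.recording:
            self._fixes.append((timestamp, x, y))

    def actual_trajectory(self):
        """The ordered (x, y) polyline traced by the recorded fixes."""
        return [(x, y) for _, x, y in sorted(self._fixes)]


if __name__ == "__main__":
    proj = TrajectoryProjector()
    proj.start()
    for t, (x, y) in enumerate([(0.0, 0.0), (5.0, 0.5), (10.0, 1.5)]):
        proj.on_location(float(t), x, y)
    proj.stop()
    print(proj.actual_trajectory())
```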
  • the validation unit 184 may assess validity, currency or expiration of a map. More specifically, the validation unit 184 may receive a map from the map database 180 of a particular area. Additionally, the validation unit 184 may receive one or more of the following information related to the same area: the actual trajectory taken by the vehicle 110 from the projection unit 178 , and the current location of the vehicle 110 output by the localization unit 176 .
  • the validation unit 184 may compare the retrieved map with the actual trajectory. The validation unit 184 may determine if the map includes a plausible path that is consistent with the actual trajectory. If there is no plausible path on the map that is consistent with the actual trajectory, the validation unit 184 may conclude that the map is invalid, outdated or expired. For instance, the validation unit 184 may receive a map 700 a of FIG. 7A from the map database 180, and may receive an actual trajectory 720 of FIG. 7C from the projection unit 178, both of which relate to the same area. The map 700 a includes a problematic zone, e.g., construction or work zone 715, and represents the state of the area before completion of the construction. By contrast, the actual trajectory 720 of FIG. 7C reflects the state of the same area after the construction is over. If the map 700 a does not include a plausible path consistent with the actual trajectory 720,
  • the validation unit 184 may conclude that the map 700 a of FIG. 7A is invalid, outdated or expired.
  • the validation unit 184 may compare the retrieved map with the current location of the vehicle 110 as received from the localization unit 176 .
  • the validation unit 184 may determine if the map includes a plausible location, which may be reached by the vehicle 110 , in consistency with the current location of the vehicle 110 . If there is no plausible location on the map that is consistent with the current location of the vehicle 110 , the validation unit 184 may conclude that the map is invalid, outdated or expired.
  • the vehicle 110 may receive the map 700 a of FIG. 7A from the map database 180 , which represents the state of the area before completion of the construction.
  • the current location as received by the validation unit 184 may be the location 722 in FIG. 7C. If the map 700 a does not include a plausible location consistent with the location 722,
  • the validation unit 184 may conclude that the map 700 a of FIG. 7A is invalid, outdated or expired.
  • the validation unit 184 may assess validity of the map based on both the actual trajectory of the vehicle 110 and the current location of the vehicle 110 .
  • the validation unit 184 may determine (1) whether the map includes a plausible path consistent with the actual trajectory, and (2) whether the map includes a plausible location consistent with the current location. If the answer is “No” to at least one of the questions, the validation unit 184 may conclude that the map is invalid. However, if the answer is “Yes” to both questions, the validation unit 184 may conclude that the map remains valid.
  • the validation unit 184 may issue an alert signal to the cellular modem 154 indicating that the map is invalid.
  • the cellular modem 154 may transmit the alert signal to the server 120 via the network 130 .
  • the validation unit 184 may issue a signal to the cellular modem 154 indicating that the map remains valid.
  • the cellular modem 154 may in turn transmit the same instruction to the server 120 .
  • the validation unit 184 may output the actual trajectory received from the projection unit 178 to the cellular modem 154 .
  • the cellular modem 154 may transmit the actual trajectory to the server 120 via the network 130 for analysis.
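  • The plausible-path and plausible-location tests described above (and summarized in flowchart 800 below) can be sketched as distance checks against the map's drivable geometry; the 3 m tolerance and the point-sampled representation of the map are assumptions made for illustration only.
```python
import math

PLAUSIBLE_DISTANCE_M = 3.0  # assumed tolerance; not specified in the patent


def _near_any(point, mapped_points):
    """True if `point` lies within the tolerance of any mapped drivable point."""
    return any(math.dist(point, p) <= PLAUSIBLE_DISTANCE_M for p in mapped_points)


def has_plausible_location(current_location, drivable_points):
    """Does the map offer a location consistent with where the vehicle is?"""
    return _near_any(current_location, drivable_points)


def has_plausible_path(trajectory, drivable_points):
    """Is every point of the actual trajectory near some mapped, drivable
    geometry? If not, no plausible path on the map matches the trajectory."""
    return all(_near_any(point, drivable_points) for point in trajectory)


def assess_map(trajectory, current_location, drivable_points):
    """The map is treated as valid only if both questions are answered 'yes'."""
    if has_plausible_path(trajectory, drivable_points) and \
       has_plausible_location(current_location, drivable_points):
        return "valid"
    return "invalid_or_outdated"


if __name__ == "__main__":
    # The map knows a single straight lane along y = 0, sampled every metre.
    lane = [(float(x), 0.0) for x in range(0, 50)]
    detour = [(0.0, 0.0), (10.0, 0.0), (20.0, 8.0)]   # vehicle left the mapped lane
    print(assess_map(detour, current_location=(20.0, 8.0), drivable_points=lane))
```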
  • FIG. 8 provides a flowchart 800 illustrating a method performed by the autonomous component 170 for assessing validity of a map according to one aspect described above.
  • the localization unit 176 may determine a set of locations visited by the vehicle 110 , including the current location of the vehicle 110 .
  • the localization unit 176 may run in a degraded mode, such that the unit 176 may determine the vehicle location at a low frequency. Under such mode, the determination made about the location of the vehicle 110 may be less accurate than that under a fully-functional mode.
  • the projection unit 178 projects the actual trajectory of the vehicle 110 based on the set of locations determined by the localization unit 176 .
  • the set of locations are determined by the localization unit 176 during the period when the vehicle 110 is in a particular area. Alternatively, the set of locations may be determined during the period of time when the vehicle 110 is in a non-autonomous mode.
  • the autonomous component 170 may retrieve a map from the map database 180 corresponding to the same area in which localization has been performed.
  • the validation unit 184 compares the retrieved map to at least one of the following: the actual trajectory and the current location of the vehicle 110 .
  • the validation unit 184 may determine at least one of the following two questions: (1) whether the map includes a plausible path consistent with the actual trajectory, and (2) whether the map includes a plausible location consistent with the current location of the vehicle 110 . If the answer is “No” to at least one of the two questions, the validation unit 184 may conclude that the map is invalid, outdated or expired. Otherwise, the validation unit 184 may conclude that the map remains valid or up-to-date.
  • the validation unit 184 may issue an alert to the server 120 that the map is invalid.
  • the server 120 may act accordingly, e.g., update the identified map.
  • the validation unit 184 may then terminate or, alternatively, the validation unit 184 may issue a signal to the server 120 , at block 812 , that the map remains valid.
  • the server 120 may have a map database identical or similar to the map database 180 in the autonomous component.
  • the two map databases may be synchronized over time via the network 130.
  • FIG. 9 provides a flowchart 900 illustrating a method performed by the server 120 according to one aspect.
  • the server 120 may receive a compressed image produced by the compression unit 174 from the autonomous component 170 via the network 130 .
  • the server 120 may reconstruct the compressed image.
  • the server 120 may restore the compiled image produced by the compilation unit 172 based on the compressed image.
  • the server 120 may identify a map related to the restored image. The map may relate to the same geographical area as that of the restored image or the compressed image.
  • the server 120 may present the identified map and the restored image to the user 140 via a display.
  • the restored image may be overlaid on top of the identified map.
  • the user 140 may then make an informed decision as to whether the identified map remains valid or has become invalid or expired.
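  • On the server side, the steps in flowchart 900 amount to decompressing the report, finding the matching map, and rendering the restored image over it for review. The sketch below uses zlib and a pixel alpha blend purely as stand-ins for whatever codec and display path the server actually uses.
```python
import zlib
import numpy as np


def reconstruct_image(compressed_bytes, shape):
    """Restore the compiled image from its compressed form.

    Assumes the vehicle compressed a uint8 grayscale image with zlib, which
    is only a stand-in for the compression unit's real codec.
    """
    flat = np.frombuffer(zlib.decompress(compressed_bytes), dtype=np.uint8)
    return flat.reshape(shape)


def identify_map(map_database, area_id):
    """Look up the stored map covering the same geographical area."""
    return map_database[area_id]


def overlay_for_display(map_image, restored_image, alpha=0.5):
    """Blend the restored image on top of the identified map so a user can
    judge whether the map is still valid."""
    blended = (1 - alpha) * map_image.astype(float) + alpha * restored_image.astype(float)
    return blended.astype(np.uint8)


if __name__ == "__main__":
    stored_maps = {"area-1": np.full((4, 4), 200, dtype=np.uint8)}
    report = np.zeros((4, 4), dtype=np.uint8)      # the road now looks different
    payload = zlib.compress(report.tobytes())      # as received from the vehicle
    restored = reconstruct_image(payload, shape=(4, 4))
    print(overlay_for_display(identify_map(stored_maps, "area-1"), restored))
```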
  • aspects of the present disclosure may be advantageous in that they provide for increased accuracy of navigational data, and therefore provide for increased safety in the operation of vehicles. Moreover, because updated map information and other information related to roadways may be detected and generated using sensors on the vehicle to which the map information will be delivered, this information is highly accurate. Furthermore, this highly accurate information may be provided to other vehicles through a network to enable those vehicles to update their navigational information as well.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

Aspects of the present disclosure relate generally to systems and methods for assessing validity of a map using image data collected by a laser sensor along a vehicle path. The method may compile image data received from the laser sensor. The map subject to assessment may define an area prohibiting entry by a vehicle.

Description

BACKGROUND OF THE INVENTION
Autonomous vehicles use various computing systems to aid in the transport of passengers from one location to another. Some autonomous vehicles may require some initial input or continuous input from an operator, such as a pilot, driver, or passenger. Other systems, such as autopilot systems, may be used only when the system has been engaged, which permits the operator to switch from a manual mode (where the operator exercises a high degree of control over the movement of the vehicle) to an autonomous mode (where the vehicle essentially drives itself) to modes that lie somewhere in between.
The autonomous vehicle may rely on maps, particularly in autonomous mode, for navigating the vehicle. However, the maps relied on by the autonomous vehicles may be out of date or otherwise inaccurate as compared to reality, due to construction, accidents or landscape changes.
BRIEF SUMMARY OF THE INVENTION
When suspicion arises with respect to the accuracy of any map, a prohibited zone or blockage may be created on that map preventing entry by the autonomous vehicles in an autonomous mode.
One aspect of the disclosure provides a method for assessing validity of a map. The method may include receiving image data from a laser sensor. The image data may be collected along a vehicle path and compiled to form a first image. The method may also include identifying a map related to the vehicle path, comparing the map to the first image, and assessing validity of the map based on the comparison. In one example, a signal may be transmitted to a server via a network indicating the result of the assessment. The map may include an area prohibiting entry by a vehicle.
Another aspect of the disclosure provides a method for assessing validity of a map, the method including determining a plurality of locations of a vehicle along a path. A trajectory of the vehicle may be determined based on the locations and compared with the map, and a validity of the map may be assessed based on the comparison.
In one example, the method may include a determination of whether the map includes a plausible path consistent with the trajectory. When the map does not include a plausible path consistent with the trajectory, a signal may be transmitted to a server indicating that the map is invalid. When the map includes a plausible path consistent with the trajectory, a signal may be transmitted to a server indicating that the map is valid.
Yet another aspect of the disclosure provides a method for assessing validity of a map, the method including receiving a compressed image describing a vehicle path of an area, reconstructing the compressed image to form a first image, identifying a map related to the area, and displaying the first image in relation to the map. In one example, the compressed image is derived from image data collected by a laser sensor.
Another aspect of the disclosure provides a system for assessing validity of a map. The system may include a processor, and may also include at least one of a laser sensor, a Global Positioning Sensor, and a camera. A memory may store a map associated with a vehicle path, and may include data and instructions executable by the processor. The data and instructions, when executed by the processor, may determine a plurality of locations of a vehicle along the vehicle path, and may determine a trajectory of the vehicle based on the locations. The trajectory may be compared with the map, and a validity of the map may be assessed based on the comparison. In one example, the system may include a cellular modem configured to transmit a result of the assessment to a server.
Yet another aspect of the disclosure provides a method for assessing validity of a map, in which a current location of a vehicle is determined. The method may include determination of whether the map includes a plausible location consistent with the current location of the vehicle. Validity of the map may be assessed based on the determination.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the detailed description of the embodiments given below, serve to explain the principles of the disclosure.
FIG. 1 is a block diagram of a network that connects an autonomous vehicle and a server in accordance with aspects of the disclosure.
FIG. 2 is a block diagram of a configuration of the autonomous vehicle according to one aspect.
FIG. 3 is a schematic side view of the vehicle.
FIG. 4 is a block diagram of a configuration of an autonomous component of the vehicle.
FIG. 5 is a schematic top down view of a road and a trajectory of the vehicle along the road.
FIGS. 5A-C are schematic views of raw images captured by a laser sensor as the vehicle maneuvers along the road of FIG. 5.
FIG. 5D is a compiled image produced by a compilation unit of the autonomous component according to one aspect.
FIG. 6 is a flowchart illustrating operations by the autonomous component according to one aspect.
FIG. 6A is a flowchart illustrating operations by the autonomous component according to another aspect.
FIG. 7A is a schematic view of a map stored in a map database illustrating a work-zone area under construction.
FIG. 7B is a schematic view of a compiled image corresponding to the same area of FIG. 7A after construction is over.
FIG. 7C is a schematic view of an actual trajectory after construction is over.
FIG. 8 is a flowchart illustrating operations by the autonomous component according to one aspect.
FIG. 9 is a flowchart illustrating operations by the server according to one aspect.
FIG. 10 is a compiled image based on real raw data generated by a laser sensor depicting an area having a subarea under construction.
FIG. 11 is another compiled image illustrating the same area as FIG. 10 after the construction is over.
DETAILED DESCRIPTION
For simplicity and clarity of illustration, like reference numerals may be used in the drawings to identify identical or analogous structural elements.
Flowcharts may be used in the drawings to illustrate processes, operations or methods performed by components, devices, parts, systems, or apparatuses disclosed herein. The flowcharts are mere exemplary illustrations of steps performed in individual processes, operations or methods. Steps may not be performed in the precise order as illustrated in the flowcharts. Rather, various steps may be handled simultaneously or performed in sequences different from that illustrated. Steps may also be omitted from or added to the flowcharts unless otherwise stated.
FIG. 1 illustrates an environment 100 in which embodiments of the present invention may be utilized, including an autonomous vehicle 110, a server 120, and a network 130 that facilitates direct or indirect communication between the autonomous vehicle 110 and the server 120. A user 140 may interact with the autonomous vehicle 110 via the server 120.
The network 130 may be, e.g., a wireless network, such as the Global System for Mobile Communications/General Packet Radio service (GSM/GPRS), Code Division Multiple Access (CDMA), Enhanced Data Rates for Global Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), or a broadband network such as Bluetooth and Wi-Fi (the brand name for products using IEEE 802.11 standards). The network 130 may be identified by a Service Set Identifier (SSID), which is the key to access the network 130 by a wireless device. Each component coupled to the network 130, e.g., the vehicle 110 or the server 120, is a node in the network 130. Each node on the network 130 is preferably uniquely identified by a Media Access Control address (MAC address).
It is understood that the present invention is not limited to the network types and network components described in the illustrative embodiment of FIG. 1, other network types and network components may also be used. For instance, more than one autonomous vehicle 110 may be included in the network 130, and each may be identified by a unique MAC address. Similarly, more than one server 120 may be included in the network 130, and the servers may work independently from or collaboratively with each other.
A server 120 may include a processor and a memory (not shown). The server 120 may store in its memory information relevant to the navigation of the vehicle 110. Such information may include maps, traffic patterns, and road conditions. The server 120 may receive from the vehicle 110 information related to one or more of the following: map updates, map corrections, traffic pattern updates, and traffic pattern corrections. The server 120 may store the received information in its memory. In some aspects, the server 120 may distribute the received information to other vehicles 110 via the network 130.
A vehicle 110 may be any type of vehicle including, but not limited to, cars, trucks, motorcycles, busses, boats, airplanes, helicopters, lawnmowers, recreational vehicles, amusement park vehicles, construction vehicles, farm equipment, trams, golf carts, trains, and trolleys.
FIG. 2 is a block diagram illustrating hardware configurations of the vehicle 110 in accordance with one aspect of the disclosure. The vehicle 110 may include one or more of the following components: a processor 150, a memory 152, a cellular modem 154, user I/O devices 156, a laser sensor 160, a radar sensor 162, a camera 164, a Global Positioning System (GPS) sensor 166, a non-autonomous component 168, and an autonomous component 170. These components may be operatively connected with each other via physical coupling and/or electrical coupling. One or more of these components may transmit or receive executable instructions as analog or digital signals to or from one or more other components. In some aspects, one or more of these components may also transmit or receive communications to or from the server 120 via the network 130. Details with respect to each component are given below.
The processor 150 may be any conventional processor, such as processors from the Intel Corporation or Advanced Micro Devices (“AMD”). Alternatively, the processor 150 may be a dedicated device such as an application-specific integrated circuit (“ASIC”). The processor 150 may refer to a collection of processors that may or may not operate in parallel. In one aspect, the vehicle 110 as a whole may have a single processor 150 to perform acts described herein. Alternatively, one or more components of the vehicle 110, e.g., the autonomous component 170, may each have their own processor executing instructions specific to each individual component.
In some aspects, the processor 150 or a collection of processors is physically mounted within the vehicle 110. In some other aspects, the processor 150 or the collection of processors is physically located away from the vehicle 110 and communicates with the vehicle 110 via the network 130. Alternatively, one or more processors 150 of the collection of processors are physically mounted within the vehicle 110, while the remaining processors are remotely connected with the vehicle 110 via the network 130.
The memory 152 may include a volatile memory, a non-volatile memory, or a combination thereof. The volatile memory may include a RAM, such as a dynamic random access memory (DRAM) or static random access memory (SRAM), or any other forms of alterable memory that may be electrically erased and reprogrammed. The non-volatile memory may include a ROM, a programmable logical array, or other forms of non-alterable memory which cannot be modified, or can be modified only slowly or with difficulty.
The cellular modem 154 may include a transmitter and receiver. The modem 154 may receive and transmit information via the network 130. The modem 154 may connect the vehicle 110 to other nodes in the network 130, e.g., the server 120 or other vehicles in the network 130. The modem 154 may transmit to the server 120 information such as maps, information about traffic patterns, and road conditions. In some aspects, the modem 154 may also communicate with roadside sensors, such as traffic cameras or laser sensors stationed on the side of a road.
The user I/O devices 156 may facilitate communication between a user, e.g., a driver or a passenger in the vehicle 110, and the vehicle 110. The user I/O devices 156 may include a user input device, which may include a touch screen. The touch screen may allow the user to switch the vehicle 110 between two operation modes: an autonomous or self-driving mode, and a non-autonomous or operator-driving mode. The user I/O devices 156 may also include a user output device, such as a display or a status bar. The display may display information regarding the status of the vehicle 110. The status bar may indicate the current status of the vehicle 110, e.g., the present operation mode or the present speed.
The laser sensor 160, the radar sensor 162, the camera 164, and the Global Positioning System (GPS) sensor 166 each may observe the environment of the vehicle 110, and provide observation data to the autonomous component 170 of the vehicle 110 for analysis. The observation data may include one or more objects in the surrounding environment of the vehicle 110, such as vehicles, traffic obstacles, traffic signs/signals, trees, and people.
In some aspects, one or more of the sensors may continuously produce observation data to reflect changes or updates in the environment. The sensors may provide updated observation data to the autonomous component 170 in real-time or quasi-real time, or on demand. Based on the observation data, the autonomous component 170 may vary navigation parameters, e.g., direction and/or speed of the vehicle 110, as a response to changes in the environment.
Details of some of the sensors, including their arrangements on the vehicle 110, are discussed with respect to FIG. 3.
The laser sensor 160 may detect any surrounding object that absorbs/reflects energy radiated from the laser sensor 160. The laser sensor 160 may refer to one or more laser sensors, e.g., 160 a and 160 b of FIG. 3. In FIG. 3, the laser sensors 160 a and 160 b may be respectively mounted on the top and the front of the vehicle 110. The laser sensor 160 a positioned at the top of the vehicle 110 may have a horizontal field of view in the range between 50 and 80 meters, and a vertical field of view of about 30°. The laser sensor 160 b positioned at the front of the vehicle 110 may have a horizontal field of view of about 150 meters, and a vertical field of view of about 30°. The fields of view of each laser sensor 160 a and 160 b may vary as needed.
The laser sensor 160 a may be physically connected to the vehicle 110 such that the laser sensor 160 a may rotate 360° about a rotation axis “R”. In one aspect, to determine a distance between the vehicle 110 and a surrounding object, the laser sensor 160 a may first rotate to face the surrounding object, and record observation data while facing the surrounding object. The laser sensor 160 a may then output the observation data to the autonomous component 170 for determination of the distance.
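By way of illustration only, the following minimal Python sketch shows one conventional way a range could be derived from a single time-of-flight measurement taken while the sensor faces an object; the function name and the example figure are assumptions made for this sketch and are not taken from the disclosure.

```python
# Minimal sketch (hypothetical): converting one laser time-of-flight
# measurement into a range estimate for the bearing the sensor currently faces.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters to the reflecting object."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A 0.5-microsecond round trip corresponds to roughly 75 m.
print(round(range_from_time_of_flight(0.5e-6), 1))  # 74.9
```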
The radar sensor 162 may refer to one or more radar sensors, e.g., 162 a-c of FIG. 3. The radar sensors 162 a-c may be located at various positions on the vehicle 110, e.g., the front or the back of the vehicle 110, or one or both lateral sides of the front bumper. In FIG. 3, the radar sensors 162 a-c are positioned, respectively, at the front of the vehicle 110, at the back of the vehicle 110, and at a left lateral side of the front bumper. Another radar sensor (not shown) may be positioned at a right lateral side of the front bumper. In one example, one or more of the radar sensors, e.g., 162 a-c, may have a horizontal field of view of about 200 meters, and a vertical field of view of about 18°. In another example, one or more of the radar sensors, e.g., 162 a-c, may have a horizontal field of view of about 60 meters, and a vertical field of view of about 56°. The fields of view of each radar sensor 162 a-c may vary as needed.
The camera 164 may refer to one or more cameras mounted on the vehicle 110. As shown in FIG. 3, two cameras 164 a-b may be mounted under a windshield 169 near the rear view mirror (not shown). The cameras 164 a-b may have identical or different fields of view. For instance, one camera 164 a may have a horizontal field of view of about 200 meters and a vertical field of view of about 30°, and the other camera 164 b may have a horizontal field of view of about 100 meters and a vertical field of view of about 60°. The fields of view of each camera 164 a-b may vary as needed. The cameras 164 a-b may provide image data to the autonomous component 170 for computing a distance between various objects.
The arrangement of the various sensors discussed above with reference to FIG. 3 is merely exemplary. The arrangement of the sensors may vary as needed.
Returning to FIG. 2, the non-autonomous component 168 may include hardware typically found in a non-autonomous car. For instance, the non-autonomous component 168 may include one or more of the following: engine components and parts, braking system, suspension and steering system, transmission system, wheels and tire parts, lighting and signaling system, and other devices or systems that facilitate manual operation of the vehicle 110.
The autonomous component 170 may operate the vehicle 110 autonomously or semi-autonomously, without user intervention. The autonomous component 170 may control a set of navigation parameters of the vehicle 110, which may relate to the speed or direction of the vehicle 110. The autonomous component 170 may translate the navigation parameters into physical actions of the vehicle 110. In some aspects, the autonomous component 170 may actuate systems or parts of the non-autonomous component 168, e.g., the braking system or the engine, based on the navigation parameters. The autonomous component 170 may also vary settings of the systems or parts of the non-autonomous component 168 based on the navigation parameters.
The autonomous component 170 may adjust the vehicle 110 in response to changes in the surrounding environment of the vehicle 110. Specifically, the autonomous component 170 may receive observation data produced by various sensors discussed above. The autonomous component 170 may adjust one or more of the navigation parameters based on the observation data.
In one aspect, the autonomous component 170 may synthesize the observation data, and use the synthesized data to assess the validity, expiration or accuracy of maps relied on by the vehicle 110. More details regarding this aspect are discussed next with reference to FIG. 4.
FIG. 4 is a block diagram illustrating an example of a configuration of the autonomous component 170. In one aspect, the autonomous component 170 may include one or more of the following units: a compilation unit 172, a compression unit 174, a localization unit 176, a projection unit 178, a map database 180, a validation unit 184, and a comparison unit 186. Details with respect to each unit are as follows.
The compilation unit 172 may receive observation data captured by the laser sensor 160, and may compile the observation data. The observation data may be raw images that have not been marked upon or altered by the processor 150. FIGS. 5A-5C illustrate schematic views of raw images captured by one of the laser sensors, e.g., 160 a-b, at various points 512-516 along a vehicle path illustrated in FIG. 5.
Referring to FIG. 5, the road 500 includes two lanes: Lane 1 and Lane 2. Each lane may have a length of about 80 m. In this example, as the vehicle 110 proceeds along Lane 2, at least one of the laser sensors, e.g., 160 a-b, captures raw images of the surrounding environment, e.g., the road 500, at various points, e.g., 512-526, along the path of the vehicle 110. FIGS. 5A-C illustrate raw images 500 a-c that are taken by the laser sensor 160 a or 160 b at points 512-516, respectively.
One or more raw images may capture various details of the path taken by the vehicle 110, including the type, color and number of lines on the road 500. For instance, the raw images may indicate one or more of the following: a solid yellow line, a broken yellow line, double solid yellow lines, two sets of double solid yellow lines spaced 2 feet or more apart, a solid white line, a broken white line, and double white lines. In FIGS. 5A-C, the raw images 500 a-c each illustrate two broken white lines that are separated by a standard 12-foot lane width.
After receiving the raw images from the laser sensor 160, the compilation unit 172 may compile the raw images. For instance, the compilation unit 172 may synthesize raw images captured over a given period of time, or along a given path. The compilation unit 172 may synthesize the raw images when the vehicle 110 operates in either the autonomous mode or the non-autonomous mode. For instance, the compilation unit 172 may request or receive raw images from the laser sensor 160 when the non-autonomous mode starts, and stop requesting or receiving raw images from the laser sensor 160 when the non-autonomous mode stops.
The compilation unit 172 may render a raw image from a 3D view image to a 2D view image. The rendering process may take into consideration the configuration of the laser sensor 160, e.g., the horizontal and vertical fields of view, and/or the degree of rotation. The compilation unit 172 may also assemble all the 2D view images together to form a clean Mercator projection of the path taken by the vehicle 110. During this process, some of the 2D view images may partially overlap each other.
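For illustration only, a minimal Python sketch of assembling geo-referenced 2D views onto a common Mercator pixel grid is given below; the data layout (a sparse dictionary of pixels per view), the helper names, and the zoom-level convention are assumptions made for the example, not the implementation of the compilation unit 172.

```python
# Minimal sketch (hypothetical helpers): project each flattened 2D view onto a
# shared Web Mercator pixel grid so that overlapping views captured along the
# vehicle path can be assembled into one compiled image.
import math

def mercator_pixel(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    """Convert WGS84 latitude/longitude to global Web Mercator pixel coordinates."""
    scale = 256 * (2 ** zoom)
    x = (lon_deg + 180.0) / 360.0 * scale
    lat_rad = math.radians(lat_deg)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * scale
    return int(x), int(y)

def compile_views(views: list[dict]) -> dict[tuple[int, int], int]:
    """Assemble flattened views into one sparse compiled image.

    Each view is {"lat": ..., "lon": ..., "zoom": ..., "pixels": {(dx, dy): intensity}},
    with pixel offsets relative to the view's anchor point; later views overwrite
    earlier ones where they overlap.
    """
    compiled: dict[tuple[int, int], int] = {}
    for view in views:
        ox, oy = mercator_pixel(view["lat"], view["lon"], view["zoom"])
        for (dx, dy), intensity in view["pixels"].items():
            compiled[(ox + dx, oy + dy)] = intensity
    return compiled
```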
FIG. 5D is a pictorial, schematic view of a compiled image 500 d produced by the compilation unit 172, indicating the current state of the road 500. In FIG. 5D, the solid lines are derived from compilation of the raw images, representing the current state of the road 500. The dashed lines may represent the past state of the road 500 recorded on a map that has been previously prepared. As shown in FIG. 5D, discrepancies exist between the current state of the road 500 and the past state of the same road. Accordingly, the discrepancies in FIG. 5D may suggest that the previously prepared map, which records the past state of the road 500, has become invalid or expired.
FIGS. 10 and 11 provide examples of images compiled from real raw images captured by a laser sensor of the same area. FIG. 10 represents a compiled image of the area, having a subarea 1000 under construction. As shown in FIG. 10, the subarea 1000 is plain, and no features are illustrated therein. FIG. 11 represents a compiled image of the same area, illustrating the state of the subarea 1000 after completion of the construction. As shown in FIG. 11, as a result of the construction, the subarea 1000 is now painted with white lane lines on the road. Accordingly, a comparison of FIGS. 10 and 11 reveals that any map based on the image of FIG. 10 has become invalid, outdated or expired.
Referring back to FIG. 4, the compression unit 174 may perform image compression to reduce irrelevance and redundancy of the compiled image output by the compilation unit 172. The compression unit 174 may reduce quality, resolution or size of the compiled image. The image compression technique performed by the compression unit 174 may be lossy or lossless. For instance, the compression unit 174 may use one or more of the following techniques to compress images: run-length encoding, entropy encoding, deflation, chain codes, chroma subsampling, transform coding, and fractal compression.
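As an illustration of one of the listed techniques, the following minimal Python sketch shows lossless run-length encoding of a single image row; it is an assumption-laden example, not the compression unit 174 itself.

```python
# Minimal sketch (hypothetical): lossless run-length encoding of one row of a
# compiled image, one of the techniques the compression unit 174 might apply.
def rle_encode(row: list[int]) -> list[tuple[int, int]]:
    """Encode a row of pixel intensities as (value, run_length) pairs."""
    encoded: list[tuple[int, int]] = []
    for value in row:
        if encoded and encoded[-1][0] == value:
            encoded[-1] = (value, encoded[-1][1] + 1)
        else:
            encoded.append((value, 1))
    return encoded

def rle_decode(encoded: list[tuple[int, int]]) -> list[int]:
    """Reverse rle_encode, recovering the original row exactly."""
    return [value for value, count in encoded for _ in range(count)]

row = [0, 0, 0, 255, 255, 0, 0, 0, 0]
assert rle_decode(rle_encode(row)) == row  # lossless round trip
```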
As indicated in FIG. 4, the compression unit 174 may output the compressed image to the cellular modem 154. The cellular modem 154 may send the compressed image to the server 120 via the network 130.
FIG. 6 provides a flowchart 600 illustrating a method performed by the autonomous component 170 for processing raw images. At block 602, the compilation unit 172 of the autonomous component 170 receives one or more raw images from the laser sensor 160. Thereafter, at block 604, the compilation unit 172 generates a compiled image based on the raw images received at block 602. Specifically, the compilation unit 172 may render a plurality of raw images taken at different locations along an actual path of the vehicle 110 into a single compiled image. The single compiled image indicates features of the actual path. At block 606, the compression unit 174 of the autonomous component 170 compresses the compiled image produced by the compilation unit 172 into a reduced form. The reduced form may be the result of reduction in image quality, reduction in image resolution, or reduction in image size. At block 608, the autonomous component 170 outputs the compressed image generated by the compression unit 174 to the cellular modem 154. The cellular modem 154 may then transmit the compressed image to the server 120 via the network 130.
Depending on the bandwidth of the network 130, the autonomous component 170 may selectively transmit either the compiled image outputted by the compilation unit 172 or the compressed image outputted by the compression unit 174 to the server 120. For instance, when the bandwidth is low, the autonomous component 170 may transmit the compressed image outputted by the compression unit 174 to the server 120. Conversely, when the bandwidth is high, the autonomous component 170 may transmit the compiled image outputted by the compilation unit 172 to the server 120.
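A minimal sketch of such a selection rule, assuming a single hypothetical bandwidth threshold (the disclosure does not specify one), might look as follows.

```python
# Minimal sketch (hypothetical threshold): pick which image to transmit based
# on the currently available uplink bandwidth.
def select_image_for_upload(compiled_image: bytes,
                            compressed_image: bytes,
                            bandwidth_kbps: float,
                            threshold_kbps: float = 500.0) -> bytes:
    """Return the full compiled image on a fast link, the compressed image otherwise."""
    return compiled_image if bandwidth_kbps >= threshold_kbps else compressed_image

# Example: a slow link falls back to the compressed image.
assert select_image_for_upload(b"compiled", b"compressed", bandwidth_kbps=100.0) == b"compressed"
```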
Referring back to FIG. 4, the autonomous component 170 may temporarily or permanently store a plurality of maps of the real world in a map database 180. The map database 180 may be a relational database. Maps may be stored in one or more of the following formats: compressed or uncompressed, lossless (e.g., BMP) or lossy (e.g., JPEG), and bitmap or vector-based (e.g., SVG), as well as computer instructions for drawing graphics. The maps may comprise any information sufficient to identify the relevant information, such as numbers, descriptive text, proprietary codes, references to data stored in areas of the same memory or different memories (including other network locations) or information that is used by a function to calculate the relevant data.
The maps may include environmental data that was obtained at a previous point in time and is expected to persist regardless of the vehicle's presence in the environment. For example, a map may include detailed map information, e.g., highly detailed maps identifying the shape and elevation of roadways, lane lines, intersections, crosswalks, speed limits, traffic signals, buildings, signs, real time traffic information, or other such features and information. These features may be persistent. For example, when the vehicle 110 approaches the location of a feature in the detailed map information, the vehicle 110 may expect to detect the feature. The detailed map information may also include explicit speed limit information associated with various roadway segments. The speed limit data may be entered manually or scanned from previously taken images of a speed limit sign using, for example, optical-character recognition. The detailed map information may also include two-dimensional street-level imagery, such as highly detailed image data depicting the surroundings of a vehicle from the vehicle's point-of-view. The detailed map information may also include three-dimensional terrain maps incorporating one or more of the objects listed above.
The maps may include detailed map information such as zone information, indicating zones that are unsuitable for driving autonomously. For example, on-ramps, off-ramps, or other complicated or high-traffic areas may be identified as such zones, as a driver may feel the need to continuously monitor the vehicle in case the driver must take control. Other zones may be identified as unsuitable for any driving, such as a sidewalk, river, mountain, cliff, desert, etc.
The detailed map information may also include one or more roadgraphs or graph networks of information such as roads, lanes, intersections, and the connections between these features. Each feature may be stored as graph data and may be associated with information such as a geographic location and whether or not it is linked to other related features. For example, a stop sign may be linked to a road and an intersection, etc. In some examples, the associated data may include grid-based indices of a roadgraph to allow efficient lookup of certain roadgraph features.
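For illustration only, the structure described above might be sketched as follows in Python; the class, field names, and grid cell size are assumptions made for the example rather than the roadgraph format used in the detailed map information.

```python
# Minimal sketch (hypothetical structure): a roadgraph whose features record a
# location and links to related features, plus a coarse grid index for lookup.
from collections import defaultdict

class Roadgraph:
    def __init__(self, cell_size_m: float = 50.0):
        self.features: dict[str, dict] = {}
        self.grid: dict[tuple[int, int], list[str]] = defaultdict(list)
        self.cell_size_m = cell_size_m

    def add_feature(self, feature_id: str, x_m: float, y_m: float, kind: str,
                    linked_to: tuple[str, ...] = ()) -> None:
        self.features[feature_id] = {"xy": (x_m, y_m), "kind": kind, "links": list(linked_to)}
        cell = (int(x_m // self.cell_size_m), int(y_m // self.cell_size_m))
        self.grid[cell].append(feature_id)

    def features_near(self, x_m: float, y_m: float) -> list[str]:
        """Grid-based lookup: return feature ids in the cell containing the point."""
        cell = (int(x_m // self.cell_size_m), int(y_m // self.cell_size_m))
        return list(self.grid[cell])

graph = Roadgraph()
graph.add_feature("road_1", 0.0, 0.0, "road")
graph.add_feature("intersection_7", 40.0, 10.0, "intersection", linked_to=("road_1",))
graph.add_feature("stop_sign_3", 42.0, 12.0, "stop_sign", linked_to=("road_1", "intersection_7"))
print(graph.features_near(41.0, 11.0))  # all three features share this grid cell
```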
In some examples, the map information may include zones. A zone may include a place where driving may become complicated or challenging for humans and computers, such as merges, construction zones, or other obstacles. As described in more detail below, a zone's rules may require an autonomous vehicle to alert the driver that the vehicle is approaching an area where it may be challenging for the vehicle to drive autonomously. In one example, the vehicle may require a driver to take control of steering, acceleration, deceleration, etc. For instance, FIG. 7A illustrates a map 700 a including a problematic zone 715, which is a construction or work zone unsuitable for driving autonomously. In another example, a zone's rules may require an autonomous vehicle to alert the driver, but rather than requiring the driver to take control, the vehicle may lower its speed and/or increase its following distance (between the autonomous vehicle and another vehicle).
Referring back to FIG. 4, the autonomous component 170 may include a comparison unit 186 that assesses validity of a map retrieved from the map database 180. The comparison unit 186 may receive a map from the map database 180, and may receive the compiled image from the compilation unit 172, where both the map and the compiled image relate to the same area, or relate to the same problematic zone. The comparison unit 186 may compare the compiled image to the map of the same area/zone, and determine if there is any discrepancy. If there is, the comparison unit 186 may issue an alert signal to the cellular modem 154, indicating that the map is invalid or outdated. The cellular modem 154 may transmit the alert signal to the server 120 via the network 130. For instance, with reference to FIGS. 7A-B, 700 a may represent the map related to Area 1 retrieved from the map database 180, and 700 b may represent the compiled image generated by the compilation unit 172 which also relates to Area 1. Because the map 700 a illustrates a work zone 715 absent from the compiled image 700 b, the comparison unit 186 may issue an alert indicating that the map 700 a is invalid or outdated.
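A minimal sketch of such a discrepancy check, assuming both the map and the compiled image have been reduced to sparse sets of occupied grid cells (a representation chosen only for this example), is shown below.

```python
# Minimal sketch (hypothetical representation): compare the stored map and the
# compiled image of the same area, each reduced to a set of occupied grid cells,
# and report whether they differ by more than a tolerance.
def maps_disagree(map_cells: set[tuple[int, int]],
                  image_cells: set[tuple[int, int]],
                  max_differing_cells: int = 10) -> bool:
    """Return True when the discrepancy is large enough to flag the map as outdated."""
    return len(map_cells.symmetric_difference(image_cells)) > max_differing_cells

# Example: a work zone present in the map but absent from the new compiled image.
stored_map = {(0, 0), (0, 1), (0, 2)}
compiled_image = {(0, 0)}
print(maps_disagree(stored_map, compiled_image, max_differing_cells=1))  # True -> issue alert
```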
FIG. 6A provides a flowchart 650 illustrating a method performed by the autonomous component 170 according to this aspect. Blocks 652-654 may be identical to blocks 602-604 discussed with reference to FIG. 6. At block 656, the autonomous component 170 retrieves a map from the map database 180 corresponding to the same geographical area captured by the raw data at block 652. At block 658, the comparison unit 186 compares the retrieved map to the compiled image obtained at block 654. If the comparison unit 186 determines in block 660 that no discrepancy exists between the retrieved map and the compiled image, the comparison unit 186 may reach a conclusion that the retrieved map remains valid or up-to-date. At that point, the method may terminate or, alternatively, may issue a signal to the cellular modem 154 indicating that the retrieved map remains valid (block 662). However, if the comparison unit 186 determines that a discrepancy exists, the comparison unit 186 may issue a signal to the cellular modem 154 that the map has become invalid, outdated, or expired (block 664). The cellular modem 154 may subsequently pass signals to the server 120. The server 120 may then make decisions as to how to update the map.
Returning to FIG. 4, the localization unit 176 of the autonomous component 170 may determine the current geographic location of the vehicle 110. For instance, the localization unit 176 may derive the present location of the vehicle 110 based on information collected or derived from one or more of the following sensors: the GPS sensor 166, the laser sensor 160, the camera 164, and an inertial-aided sensor (not shown). The localization unit 176 may determine the location of the vehicle, including one or more of the following: an absolute geographical location, e.g., latitude, longitude, and altitude, and a relative location, e.g., location relative to other cars immediately around the vehicle 110. The mechanism used to determine the relative location may produce less signal distortion compared to the mechanism used to determine the absolute geographical location. Accordingly, the relative location may be more accurate than the absolute geographical location. The location of the vehicle 110 may also indicate the position of the vehicle 110 relative to a ground level, e.g., an underground position when the vehicle is in a tunnel or a cave, or an aboveground position.
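By way of a simplified illustration, the sketch below combines a last absolute fix with subsequently measured relative displacements; this dead-reckoning rule is an assumption made for the example and not the localization method of the localization unit 176.

```python
# Minimal sketch (hypothetical fusion rule): refine the last absolute GPS fix
# with the more precise relative displacements measured since that fix.
def current_position(last_gps_fix_xy: tuple[float, float],
                     relative_displacements_m: list[tuple[float, float]]) -> tuple[float, float]:
    x, y = last_gps_fix_xy
    for dx, dy in relative_displacements_m:
        x, y = x + dx, y + dy
    return x, y

print(current_position((100.0, 200.0), [(1.5, 0.0), (1.4, 0.2)]))  # (102.9, 200.2)
```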
The localization unit 176 may run for a given time frame, for a given area of interest, or when the vehicle 110 is in either the autonomous mode or the non-autonomous mode. In one aspect, the localization unit 176 starts to determine the location of the vehicle 110 once the vehicle 110 enters a particular zone. In another aspect, the localization unit 176 starts to determine the location of the vehicle 110 once the autonomous mode is activated. The localization unit 176 may output the current location of the vehicle 110 to the projection unit 178.
The projection unit 178 may receive locations of the vehicle 110 periodically from the localization unit 176. Alternatively, the projection unit 178 may request locations of the vehicle 110 from the localization unit 176 as needed. The projection unit 178 may keep a record of the locations of the vehicle 110 during a given time frame, or when the vehicle 110 is in an area of interest, e.g., a problematic area where obstacles were previously recorded. Alternatively, the projection unit 178 may start recording the locations of the vehicle 110 when the vehicle 110 enters an autonomous mode. The projection unit 178 may stop recording once the vehicle 110 exits the autonomous mode. Based on the series of locations recorded, the projection unit 178 may project an actual trajectory taken by the vehicle 110. For instance, as illustrated in FIG. 7C, the projection unit 178 determines an actual trajectory 720 taken by the vehicle 110 in an area.
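A minimal sketch of this recording behavior, with hypothetical method names not drawn from the disclosure, could look as follows.

```python
# Minimal sketch (hypothetical names): record locations reported by the
# localization unit while the autonomous mode is active, and expose the ordered
# series of points as the actual trajectory taken by the vehicle.
class TrajectoryProjector:
    def __init__(self):
        self._points: list[tuple[float, float]] = []
        self._recording = False

    def on_mode_change(self, autonomous: bool) -> None:
        """Start a fresh recording when the autonomous mode begins, stop when it ends."""
        self._recording = autonomous
        if autonomous:
            self._points.clear()

    def on_location(self, x_m: float, y_m: float) -> None:
        if self._recording:
            self._points.append((x_m, y_m))

    def actual_trajectory(self) -> list[tuple[float, float]]:
        return list(self._points)
```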
Referring back to FIG. 4, the validation unit 184 may assess validity, currency or expiration of a map. More specifically, the validation unit 184 may receive a map from the map database 180 of a particular area. Additionally, the validation unit 184 may receive one or more of the following information related to the same area: the actual trajectory taken by the vehicle 110 from the projection unit 178, and the current location of the vehicle 110 output by the localization unit 176.
In one aspect, the validation unit 184 may compare the retrieved map with the actual trajectory. The validation unit 184 may determine if the map includes a plausible path that is consistent with the actual trajectory. If there is no plausible path on the map that is consistent with the actual trajectory, the validation unit 184 may conclude that the map is invalid, outdated or expired. For instance, the validation unit 184 may receive a map 700 a of FIG. 7A from the map database 180, and may receive an actual trajectory 720 of FIG. 7C from the projection unit 178, both of which relate to the same area. The map 700 a includes a problematic zone, e.g., construction or work zone 715, and represents the state of the area before completion of the construction. By contrast, the actual trajectory 720 of FIG. 7C represents a path taken by the vehicle 110 after completion of the construction, and more particularly, after the removal of the construction or work zone 715. If the construction or work zone 715 were still in place, a path along the actual trajectory 720 would not be plausible. Accordingly, by comparing the map 700 a with the actual trajectory 720, the validation unit 184 may conclude that the map 700 a of FIG. 7A is invalid, outdated or expired.
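For illustration, the sketch below tests plausibility by checking every trajectory point against a set of drivable map cells; the grid representation and cell size are assumptions made for the example, not the validation unit 184's actual comparison.

```python
# Minimal sketch (hypothetical representation): a map reduced to drivable grid
# cells; the actual trajectory is plausible only if every visited point falls in
# a drivable cell. A trajectory crossing an area the old map still marks as a
# closed work zone therefore flags that map as outdated.
def trajectory_is_plausible(drivable_cells: set[tuple[int, int]],
                            trajectory_xy_m: list[tuple[float, float]],
                            cell_size_m: float = 5.0) -> bool:
    for x, y in trajectory_xy_m:
        cell = (int(x // cell_size_m), int(y // cell_size_m))
        if cell not in drivable_cells:
            return False
    return True

# Example: the point (12, 3) falls in cell (2, 0), which the old map excludes.
old_map_cells = {(0, 0), (1, 0), (3, 0)}
print(trajectory_is_plausible(old_map_cells, [(2.0, 3.0), (12.0, 3.0)]))  # False -> map invalid
```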
In another aspect, the validation unit 184 may compare the retrieved map with the current location of the vehicle 110 as received from the localization unit 176. The validation unit 184 may determine if the map includes a plausible location, which may be reached by the vehicle 110, in consistency with the current location of the vehicle 110. If there is no plausible location on the map that is consistent with the current location of the vehicle 110, the validation unit 184 may conclude that the map is invalid, outdated or expired. For instance, similar to the scenario described above, the vehicle 110 may receive the map 700 a of FIG. 7A from the map database 180, which represents the state of the area before completion of the construction. The current location as received by the validation unit 184 may be the location 722 in FIG. 7C, which is a location that used to be occupied by the construction or work zone 715. If the construction or work zone 715 were still in place, the vehicle 110 would not be able to arrive at the location 722. Accordingly, the validation unit 184 may conclude that the map 700 a of FIG. 7A is invalid, outdated or expired.
In another aspect, the validation unit 184 may assess validity of the map based on both the actual trajectory of the vehicle 110 and the current location of the vehicle 110. The validation unit 184 may determine (1) whether the map includes a plausible path consistent with the actual trajectory, and (2) whether the map includes a plausible location consistent with the current location. If the answer is “No” to at least one of the questions, the validation unit 184 may conclude that the map is invalid. However, if the answer is “Yes” to both questions, the validation unit 184 may conclude that the map remains valid.
Once the validation unit 184 has determined that the map is invalid, outdated or expired, the validation unit 184 may issue an alert signal to the cellular modem 154 indicating that the map is invalid. The cellular modem 154 may transmit the alert signal to the server 120 via the network 130.
If the map remains valid or up-to-date, the validation unit 184 may issue a signal to the cellular modem 154 indicating that the map remains valid. The cellular modem 154 may in turn transmit the same instruction to the server 120.
In another aspect, the validation unit 184 may output the actual trajectory received from the projection unit 178 to the cellular modem 154. The cellular modem 154 may transmit the actual trajectory to the server 120 via the network 130 for analysis.
FIG. 8 is a flowchart 800 illustrating a method performed by the autonomous component 170 for assessing validity of a map according to one aspect described above. At block 802, the localization unit 176 may determine a set of locations visited by the vehicle 110, including the current location of the vehicle 110. The localization unit 176 may run in a degraded mode, such that the unit 176 may determine the vehicle location at a low frequency. In such a mode, the determination made about the location of the vehicle 110 may be less accurate than under a fully functional mode.
At block 804, the projection unit 178 projects the actual trajectory of the vehicle 110 based on the set of locations determined by the localization unit 176. The set of locations is determined by the localization unit 176 while the vehicle 110 is in a particular area. Alternatively, the set of locations may be determined while the vehicle 110 is in a non-autonomous mode.
At block 805, the autonomous component 170 may retrieve a map from the map database 180 corresponding to the same area in which localization has been performed.
At block 808, the validation unit 184 compares the retrieved map to at least one of the following: the actual trajectory and the current location of the vehicle 110. The validation unit 184 may determine the answer to at least one of the following two questions: (1) whether the map includes a plausible path consistent with the actual trajectory, and (2) whether the map includes a plausible location consistent with the current location of the vehicle 110. If the answer is “No” to at least one of the two questions, the validation unit 184 may conclude that the map is invalid, outdated or expired. Otherwise, the validation unit 184 may conclude that the map remains valid or up-to-date.
At block 810, once the validation unit 184 has concluded that the map is invalid, the validation unit 184 may issue an alert to the server 120 that the map is invalid. The server 120 may act accordingly, e.g., update the identified map. On the other hand, if the validation unit 184 has concluded that the map remains valid, the method may then terminate or, alternatively, the validation unit 184 may issue a signal to the server 120, at block 812, that the map remains valid.
In some aspects, the server 120 may have a map database identical or similar to the map database 180 in the autonomous component. The two map databases may be synchronized over time via the network 130.
FIG. 9 provides a flowchart 900 illustrating a method performed by the server 120 according to one aspect. The server 120 may receive a compressed image produced by the compression unit 174 from the autonomous component 170 via the network 130. At block 904, the server 120 may reconstruct the compressed image. At this block, the server 120 may restore the compiled image produced by the compilation unit 172 based on the compressed image. At block 906, the server 120 may identify a map related to the restored image. The map may relate to the same geographical area as that of the restored image or the compressed image. At block 908, the server 120 may present the identified map and the restored image to the user 140 via a display. The restored image may be overlaid on top of the identified map. The user 140 may then make an informed decision as to whether the identified map remains valid or has become invalid or expired.
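By way of illustration, the sketch below produces a simple overlay of a restored image on the stored map, again assuming both have been reduced to sparse sets of feature cells for the purpose of the example.

```python
# Minimal sketch (hypothetical representation): partition the map's cells and the
# restored image's cells so a display can highlight where they agree and where
# they differ, helping the user 140 judge whether the map is outdated.
def overlay(map_cells: set[tuple[int, int]],
            restored_image_cells: set[tuple[int, int]]) -> dict[str, set[tuple[int, int]]]:
    return {
        "agree": map_cells & restored_image_cells,       # features present in both
        "map_only": map_cells - restored_image_cells,    # e.g., a removed work zone
        "image_only": restored_image_cells - map_cells,  # e.g., newly painted lane lines
    }
```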
Aspects of the present disclosure may be advantageous in that they provide for increased accuracy of navigational data, and therefore provide for increased safety in the operation of vehicles. Moreover, because updated map information and other information related to roadways may be detected and generated using sensors on the vehicle to which the map information will be delivered, this information is highly accurate. Furthermore, this highly accurate information may be provided to other vehicles through a network to enable those vehicles to update their navigational information as well.
As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter as defined by the claims, the foregoing description of exemplary implementations should be taken by way of illustration rather than by way of limitation of the subject matter as defined by the claims. It will also be understood that the provision of the examples described herein (as well as clauses phrased as “such as,” “e.g.”, “including” and the like) should not be interpreted as limiting the claimed subject matter to the specific examples; rather, the examples are intended to illustrate only some of many possible aspects.

Claims (15)

The invention claimed is:
1. A method for assessing validity of a map, comprising:
receiving image data from a laser sensor, the image data being collected along a vehicle path on a roadway;
compiling the image data to form a first image, wherein the compiling includes at least partially overlaying a first image data from the laser sensor over a second image data from the laser sensor;
identifying a map related to the vehicle path;
comparing the map to the first image; and
assessing validity of the identified map based on the comparison.
2. The method of claim 1, further comprising:
transmitting a signal to a server via a network indicating a result of the assessed validity.
3. The method of claim 1, wherein the map includes an area prohibiting entry by a vehicle.
4. The method of claim 1, further comprising: rendering the first image to a Mercator projection of an area related to the vehicle path.
5. A system for assessing validity of a map, comprising:
a laser sensor; a memory storing a map associated with a vehicle path on a roadway; and one or more processors in communication with at least one of the memory and the laser sensor, the one or more processors configured to:
compile image data detected by the laser sensor along a vehicle path into a first image, wherein compiling includes at least partially overlaying a first image data from the laser sensor over a second image data from the laser sensor,
compare the first image with the map, and
assess validity of the map based on the comparison.
6. The system of claim 5, wherein the map includes an area prohibiting entry by a vehicle.
7. The system of claim 5, further comprising a cellular modem configured to transmit a result of the assessment to a server.
8. The method of claim 1, wherein compiling image data to form a first image further comprises rendering the first image based on a plurality of images taken at a plurality of locations along the vehicle path.
9. The method of claim 1, wherein comparing the map to the first image further comprises determining whether one or more discrepancies exist between the map and the first image.
10. The method of claim 9, further comprising, if one or more discrepancies exist between the map and the first image, transmitting a signal to a remote computing device indicating that the map is outdated.
11. The method of claim 9, further comprising, if no discrepancies exist between the map and the first image, transmitting a signal to a remote computing device indicating that the map is valid.
12. The system of claim 5, wherein compiling the image data to form a first image further comprises rendering the first image based on a plurality of images taken at a plurality of locations along the vehicle path.
13. The system of claim 5, wherein comparing the map to the first image further comprises determining whether one or more discrepancies exist between the map and the first image.
14. The system of claim 13, wherein the one or more processors are further configured to transmit a signal to a remote computing device indicating that the map is outdated based on determining that one or more discrepancies exist between the map and the first image.
15. The system of claim 13, wherein the one or more processors are further configured to transmit a signal to a remote computing device indicating that the map is valid based on determining that no discrepancies exist between the map and the first image.
US13/465,578 2012-05-07 2012-05-07 Map reports from vehicles in the field Active 2033-10-23 US9123152B1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US13/465,578 US9123152B1 (en) 2012-05-07 2012-05-07 Map reports from vehicles in the field
US14/813,822 US9810540B1 (en) 2012-05-07 2015-07-30 Map reports from vehicles in the field
US15/729,182 US10520323B1 (en) 2012-05-07 2017-10-10 Map reports from vehicles in the field
US16/690,246 US11519739B1 (en) 2012-05-07 2019-11-21 Map reports from vehicles in the field

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/465,578 US9123152B1 (en) 2012-05-07 2012-05-07 Map reports from vehicles in the field

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/813,822 Division US9810540B1 (en) 2012-05-07 2015-07-30 Map reports from vehicles in the field

Publications (1)

Publication Number Publication Date
US9123152B1 true US9123152B1 (en) 2015-09-01

Family

ID=53938930

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/465,578 Active 2033-10-23 US9123152B1 (en) 2012-05-07 2012-05-07 Map reports from vehicles in the field
US14/813,822 Active US9810540B1 (en) 2012-05-07 2015-07-30 Map reports from vehicles in the field
US15/729,182 Active US10520323B1 (en) 2012-05-07 2017-10-10 Map reports from vehicles in the field
US16/690,246 Active 2033-02-19 US11519739B1 (en) 2012-05-07 2019-11-21 Map reports from vehicles in the field

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/813,822 Active US9810540B1 (en) 2012-05-07 2015-07-30 Map reports from vehicles in the field
US15/729,182 Active US10520323B1 (en) 2012-05-07 2017-10-10 Map reports from vehicles in the field
US16/690,246 Active 2033-02-19 US11519739B1 (en) 2012-05-07 2019-11-21 Map reports from vehicles in the field

Country Status (1)

Country Link
US (4) US9123152B1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160063032A1 (en) * 2014-08-29 2016-03-03 Telenav, Inc. Navigation system with content delivery mechanism and method of operation thereof
EP3165879A1 (en) * 2015-11-04 2017-05-10 Toyota Jidosha Kabushiki Kaisha Map update determination system
GB2554481A (en) * 2016-09-21 2018-04-04 Univ Oxford Innovation Ltd Autonomous route determination
US10006779B2 (en) 2016-08-08 2018-06-26 Toyota Jidosha Kabushiki Kaisha Transmission necessity determination apparatus and route planning system
US20180308191A1 (en) * 2017-04-25 2018-10-25 Lyft, Inc. Dynamic autonomous vehicle servicing and management
US20180315305A1 (en) * 2017-04-27 2018-11-01 Volvo Car Corporation Determination of a road work area characteristic set
WO2018207632A1 (en) * 2017-05-11 2018-11-15 日立オートモティブシステムズ株式会社 Vehicle control device, vehicle control method, and vehicle control system
US20190187723A1 (en) * 2017-12-15 2019-06-20 Baidu Usa Llc System for building a vehicle-to-cloud real-time traffic map for autonomous driving vehicles (advs)
CN110036425A (en) * 2016-11-18 2019-07-19 伟摩有限责任公司 Dynamic routing for automatic driving vehicle
US10363832B2 (en) * 2015-03-06 2019-07-30 Honda Motor Co., Ltd. Vehicle parking control device
WO2019169348A1 (en) * 2018-03-02 2019-09-06 DeepMap Inc. Visualization of high definition map data
US20200080852A1 (en) * 2018-09-06 2020-03-12 Uber Technologies, Inc. Identifying incorrect coordinate prediction using route information
US20200363214A1 (en) * 2019-05-17 2020-11-19 Robert Bosch Gmbh Method for using a feature-based localization map for a vehicle
US11096026B2 (en) 2019-03-13 2021-08-17 Here Global B.V. Road network change detection and local propagation of detected change
US20210291830A1 (en) * 2020-03-17 2021-09-23 Honda Motor Co., Ltd. Travel control apparatus, vehicle, travel control method, and non-transitory computer-readable storage medium
US11255680B2 (en) * 2019-03-13 2022-02-22 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11280622B2 (en) 2019-03-13 2022-03-22 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11287267B2 (en) 2019-03-13 2022-03-29 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11287266B2 (en) 2019-03-13 2022-03-29 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11402220B2 (en) 2019-03-13 2022-08-02 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11409281B2 (en) * 2017-05-31 2022-08-09 Panasonic Intellectual Property Corporation Of America Information processing method for determining difficult area in which travel of vehicle by automatic driving is difficult, information processing apparatus, system, and storage medium
US20220274625A1 (en) * 2021-02-26 2022-09-01 Zoox, Inc. Graph neural networks with vectorized object representations in autonomous vehicle systems
US11519739B1 (en) 2012-05-07 2022-12-06 Waymo Llc Map reports from vehicles in the field
US20230016335A1 (en) * 2021-07-19 2023-01-19 Embark Trucks Inc. Dynamically modifiable map
US11630465B2 (en) * 2015-02-01 2023-04-18 Lyft, Inc. Using zone rules to control autonomous vehicle operation within a zone

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109195138B (en) * 2018-09-29 2021-09-28 北京新能源汽车股份有限公司 Data uploading method, data acquisition terminal and automobile
US11030898B2 (en) 2018-12-13 2021-06-08 Here Global B.V. Methods and systems for map database update based on road sign presence


Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5032845A (en) * 1990-02-08 1991-07-16 D.G.R., Inc. Vehicle locating system with Loran-C
US6012013A (en) * 1995-03-31 2000-01-04 Trimble Navigation Limited Vehicle position reporting in user defined uni-dimensional coordinate system
US5774824A (en) * 1995-08-24 1998-06-30 The Penn State Research Foundation Map-matching navigation system
US5913379A (en) * 1996-01-26 1999-06-22 Figgie International, Inc. Articulated aerial work platform system
JP3647538B2 (en) * 1996-02-12 2005-05-11 本田技研工業株式会社 Vehicle steering device
US6249740B1 (en) * 1998-01-21 2001-06-19 Kabushikikaisha Equos Research Communications navigation system, and navigation base apparatus and vehicle navigation apparatus both used in the navigation system
KR100579767B1 (en) * 1999-01-25 2006-05-15 가부시키가이샤 젠린 Device and method for creating and using data on road map expressed by polygons
US6728392B1 (en) * 2001-01-30 2004-04-27 Navigation Technologies Corp. Shape comparison using a rotational variation metric and applications thereof
JP4004818B2 (en) * 2002-02-28 2007-11-07 松下電器産業株式会社 Position information transmission apparatus and method
EP1632925A1 (en) * 2003-06-11 2006-03-08 Matsushita Electric Industrial Co., Ltd. Digital map position information compressing method and device
US7363151B2 (en) * 2004-06-21 2008-04-22 Matsushita Electric Industrial Co., Ltd. Map error information obtaining system and map error information obtaining method
JP4724043B2 (en) * 2006-05-17 2011-07-13 トヨタ自動車株式会社 Object recognition device
JP5082295B2 (en) * 2006-05-19 2012-11-28 株式会社デンソー Map data providing device
JP4341649B2 (en) * 2006-07-12 2009-10-07 トヨタ自動車株式会社 Navigation device and position detection method
US8389100B2 (en) * 2006-08-29 2013-03-05 Mmi-Ipco, Llc Temperature responsive smart textile
JP4735480B2 (en) * 2006-09-01 2011-07-27 株式会社デンソー Vehicle position detection system
WO2008054203A1 (en) 2006-10-30 2008-05-08 Tele Atlas B.V. Method and apparatus for detecting objects from terrestrial based mobile mapping data
KR20100037487A (en) * 2008-10-01 2010-04-09 엘지전자 주식회사 Mobile vehicle navigation method and apparatus thereof
JP2010191867A (en) * 2009-02-20 2010-09-02 Panasonic Corp Image compression apparatus, image compression method and vehicle-mounted image recording apparatus
US20120316780A1 (en) * 2009-11-04 2012-12-13 Achim Huth Map corrections via human machine interface
FR2955192B1 (en) * 2010-01-12 2012-12-07 Thales Sa METHOD AND DEVICE FOR VERIFYING THE CONFORMITY OF A TRACK OF AN AIRCRAFT
US9377528B2 (en) * 2010-03-19 2016-06-28 Northeastern University Roaming mobile sensor platform for collecting geo-referenced data and creating thematic maps
TWI431246B (en) * 2010-06-30 2014-03-21 Mitac Int Corp Electronic map of the road rendering methods, computer programs and navigation devices
WO2012052057A1 (en) * 2010-10-21 2012-04-26 Tomtom Belgium N.V. Geographic switch for digital maps
US20120202525A1 (en) * 2011-02-08 2012-08-09 Nokia Corporation Method and apparatus for distributing and displaying map events
JP5589900B2 (en) * 2011-03-03 2014-09-17 株式会社豊田中央研究所 Local map generation device, global map generation device, and program
US8543320B2 (en) * 2011-05-19 2013-09-24 Microsoft Corporation Inferring a behavioral state of a vehicle
ITTO20110850A1 (en) * 2011-09-23 2013-03-24 Sisvel Technology Srl METHOD OF MANAGING A MAP OF A PERSONAL NAVIGATION DEVICE AND ITS DEVICE
US8838376B2 (en) * 2012-03-30 2014-09-16 Qualcomm Incorporated Mashup of AP location and map information for WiFi based indoor positioning
US9123152B1 (en) 2012-05-07 2015-09-01 Google Inc. Map reports from vehicles in the field
SI2698606T1 (en) * 2012-08-13 2015-04-30 Kapsch Trafficcom Ag Method for updating a digital street map
US8457880B1 (en) * 2012-11-28 2013-06-04 Cambridge Mobile Telematics Telematics using personal mobile devices
US10406981B2 (en) * 2014-03-20 2019-09-10 Magna Electronics Inc. Vehicle vision system with curvature estimation
US9384402B1 (en) * 2014-04-10 2016-07-05 Google Inc. Image and video compression for remote vehicle assistance

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080161987A1 (en) 1997-10-22 2008-07-03 Intelligent Technologies International, Inc. Autonomous Vehicle Travel Control Systems and Methods
WO2001088827A1 (en) 2000-05-15 2001-11-22 Modular Mining Systems, Inc. Permission system for control of autonomous vehicles
US7102496B1 (en) 2002-07-30 2006-09-05 Yazaki North America, Inc. Multi-sensor integration for a vehicle
US20080021628A1 (en) 2004-03-30 2008-01-24 Williams International Co., L.L.C. Hybrid electric vehicle energy management system
US20070239331A1 (en) 2005-12-24 2007-10-11 Kaplan Craig R GPS, cellular, FM speed and safety control devise
US7865277B1 (en) 2007-05-07 2011-01-04 The United States Of America As Represented By The Secretary Of The Navy Obstacle avoidance system and method
US20100030473A1 (en) * 2008-07-30 2010-02-04 Honeywell International Inc. Laser ranging process for road and obstacle detection in navigating an autonomous vehicle
EP2216225A1 (en) 2009-02-05 2010-08-11 Paccar Inc Autonomic vehicle safety system
US20110279452A1 (en) * 2010-05-13 2011-11-17 Denso Corporation Map display apparatus

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Google Cars Drive Themselves, in Traffic" [online]. [Retrieved Aug. 19, 2011] Retrieved from the internet: , 4 pages.
"Google Cars Drive Themselves, in Traffic" [online]. [Retrieved Aug. 19, 2011] Retrieved from the internet: <http://www.nytimes.com/2010/10/10/science/10google.html>, 4 pages.
Wikipedia, "Simultaneous localization and mapping", retrieved from , downloaded on Apr. 9, 2015.
Wikipedia, "Simultaneous localization and mapping", retrieved from <http://en.wikipedia.org/wiki/Simultaneous-localization-and-mapping>, downloaded on Apr. 9, 2015.

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11519739B1 (en) 2012-05-07 2022-12-06 Waymo Llc Map reports from vehicles in the field
US9959289B2 (en) * 2014-08-29 2018-05-01 Telenav, Inc. Navigation system with content delivery mechanism and method of operation thereof
US20160063032A1 (en) * 2014-08-29 2016-03-03 Telenav, Inc. Navigation system with content delivery mechanism and method of operation thereof
US11630465B2 (en) * 2015-02-01 2023-04-18 Lyft, Inc. Using zone rules to control autonomous vehicle operation within a zone
US10363832B2 (en) * 2015-03-06 2019-07-30 Honda Motor Co., Ltd. Vehicle parking control device
US10215572B2 (en) 2015-11-04 2019-02-26 Toyota Jidosha Kabushiki Kaisha Map update determination system
EP3165879A1 (en) * 2015-11-04 2017-05-10 Toyota Jidosha Kabushiki Kaisha Map update determination system
JP2017090548A (en) * 2015-11-04 2017-05-25 トヨタ自動車株式会社 Map update determination system
CN106996793A (en) * 2015-11-04 2017-08-01 丰田自动车株式会社 Map rejuvenation decision-making system
CN106996793B (en) * 2015-11-04 2020-11-03 丰田自动车株式会社 Map update determination system
DE102017112024B4 (en) 2016-08-08 2022-04-28 Toyota Jidosha Kabushiki Kaisha Transmission Necessity Determination Device
US10006779B2 (en) 2016-08-08 2018-06-26 Toyota Jidosha Kabushiki Kaisha Transmission necessity determination apparatus and route planning system
GB2554481B (en) * 2016-09-21 2020-08-12 Univ Oxford Innovation Ltd Autonomous route determination
GB2554481A (en) * 2016-09-21 2018-04-04 Univ Oxford Innovation Ltd Autonomous route determination
CN110036425A (en) * 2016-11-18 2019-07-19 伟摩有限责任公司 Dynamic routing for automatic driving vehicle
US11537133B2 (en) 2016-11-18 2022-12-27 Waymo Llc Dynamic routing for autonomous vehicles
CN110036425B (en) * 2016-11-18 2022-02-01 伟摩有限责任公司 Method and system for maneuvering a vehicle and non-transitory computer readable medium
US20180308191A1 (en) * 2017-04-25 2018-10-25 Lyft, Inc. Dynamic autonomous vehicle servicing and management
US10679312B2 (en) * 2017-04-25 2020-06-09 Lyft Inc. Dynamic autonomous vehicle servicing and management
US20180315305A1 (en) * 2017-04-27 2018-11-01 Volvo Car Corporation Determination of a road work area characteristic set
US10643463B2 (en) * 2017-04-27 2020-05-05 Volvo Car Corporation Determination of a road work area characteristic set
US11852491B2 (en) 2017-05-11 2023-12-26 Hitachi Astemo, Ltd. Vehicle control apparatus, vehicle control method, and vehicle control system
WO2018207632A1 (en) * 2017-05-11 2018-11-15 日立オートモティブシステムズ株式会社 Vehicle control device, vehicle control method, and vehicle control system
JP2018189900A (en) * 2017-05-11 2018-11-29 日立オートモティブシステムズ株式会社 Vehicle control device, vehicle control method and vehicle control system
US11409281B2 (en) * 2017-05-31 2022-08-09 Panasonic Intellectual Property Corporation Of America Information processing method for determining difficult area in which travel of vehicle by automatic driving is difficult, information processing apparatus, system, and storage medium
US20190187723A1 (en) * 2017-12-15 2019-06-20 Baidu Usa Llc System for building a vehicle-to-cloud real-time traffic map for autonomous driving vehicles (advs)
US11269352B2 (en) * 2017-12-15 2022-03-08 Baidu Usa Llc System for building a vehicle-to-cloud real-time traffic map for autonomous driving vehicles (ADVS)
WO2019169348A1 (en) * 2018-03-02 2019-09-06 DeepMap Inc. Visualization of high definition map data
CN112204343B (en) * 2018-03-02 2024-05-17 辉达公司 Visualization of high definition map data
EP3759432A4 (en) * 2018-03-02 2022-01-26 Deepmap Inc. Visualization of high definition map data
US11566903B2 (en) * 2018-03-02 2023-01-31 Nvidia Corporation Visualization of high definition map data
CN112204343A (en) * 2018-03-02 2021-01-08 迪普迈普有限公司 Visualization of high definition map data
US11365976B2 (en) 2018-03-02 2022-06-21 Nvidia Corporation Semantic label based filtering of objects in an image generated from high definition map data
US20200080852A1 (en) * 2018-09-06 2020-03-12 Uber Technologies, Inc. Identifying incorrect coordinate prediction using route information
US10955251B2 (en) * 2018-09-06 2021-03-23 Uber Technologies, Inc. Identifying incorrect coordinate prediction using route information
US11287266B2 (en) 2019-03-13 2022-03-29 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11402220B2 (en) 2019-03-13 2022-08-02 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11255680B2 (en) * 2019-03-13 2022-02-22 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11287267B2 (en) 2019-03-13 2022-03-29 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US11096026B2 (en) 2019-03-13 2021-08-17 Here Global B.V. Road network change detection and local propagation of detected change
US11280622B2 (en) 2019-03-13 2022-03-22 Here Global B.V. Maplets for maintaining and updating a self-healing high definition map
US20200363214A1 (en) * 2019-05-17 2020-11-19 Robert Bosch Gmbh Method for using a feature-based localization map for a vehicle
US20210291830A1 (en) * 2020-03-17 2021-09-23 Honda Motor Co., Ltd. Travel control apparatus, vehicle, travel control method, and non-transitory computer-readable storage medium
US11938933B2 (en) * 2020-03-17 2024-03-26 Honda Motor Co., Ltd. Travel control apparatus, vehicle, travel control method, and non-transitory computer-readable storage medium
US20220274625A1 (en) * 2021-02-26 2022-09-01 Zoox, Inc. Graph neural networks with vectorized object representations in autonomous vehicle systems
US11745740B2 (en) * 2021-07-19 2023-09-05 Embark Trucks Inc. Dynamically modifiable map
US20230016335A1 (en) * 2021-07-19 2023-01-19 Embark Trucks Inc. Dynamically modifiable map

Also Published As

Publication number Publication date
US10520323B1 (en) 2019-12-31
US11519739B1 (en) 2022-12-06
US9810540B1 (en) 2017-11-07

Similar Documents

Publication Title
US11519739B1 (en) Map reports from vehicles in the field
US11287823B2 (en) Mapping active and inactive construction zones for autonomous driving
US8954217B1 (en) Determining when to drive autonomously
CN107851125B9 (en) System and method for two-step object data processing through vehicle and server databases to generate, update and transmit accurate road characteristics databases
CN107850453B (en) System and method for matching road data objects to update an accurate road database
EP2313741B1 (en) Method for updating a geographic database for a vehicle navigation system
DE112020004133T5 (en) SYSTEMS AND PROCEDURES FOR IDENTIFICATION OF POSSIBLE COMMUNICATION BARRIERS
DE112020002175T5 (en) SYSTEMS AND METHODS FOR VEHICLE NAVIGATION
DE112020002604T5 (en) SYSTEMS AND PROCEDURES FOR VEHICLE NAVIGATION
DE112020002764T5 (en) SYSTEMS AND METHODS FOR VEHICLE NAVIGATION
DE112020006426T5 (en) SYSTEMS AND METHODS FOR VEHICLE NAVIGATION
DE112019004323T5 (en) VEHICLE SIDE DEVICE, METHOD AND STORAGE MEDIUM
DE112019004352T5 (en) CARD SYSTEM, VEHICLE SIDE DEVICE, METHOD AND STORAGE MEDIUM
DE112020003897T5 (en) SYSTEMS AND METHODS FOR MONITORING LANE CONGESTION
EP3130945A1 (en) System and method for precision vehicle positioning
DE112020002592T5 (en) SYSTEMS AND METHODS FOR VEHICLE NAVIGATION BASED ON IMAGE ANALYSIS
US10094670B1 (en) Condensing sensor data for transmission and processing
DE112020002869T5 (en) NAVIGATION SYSTEMS AND METHODS OF DETERMINING OBJECT DIMENSIONS
DE112021003811T5 (en) SYSTEMS AND METHODS FOR DYNAMIC ROAD GEOMETRY MODELING AND NAVIGATION
DE112021004128T5 (en) SYSTEMS AND METHODS FOR MAP-BASED MODELING OF THE REAL WORLD
DE112020005275T5 (en) SYSTEMS AND METHODS FOR SELECTIVE DECELERATION OF A VEHICLE
DE112020006427T5 (en) SYSTEMS AND METHODS FOR DETECTING TRAFFIC LIGHTS
DE102022128968A1 (en) IMAGE COLLECTION SYSTEMS AND METHODS FOR VEHICLE NAVIGATION
US20220242442A1 (en) Drive trajectory system and device
DE112021002014T5 (en) CONTROL LOOP FOR NAVIGATING A VEHICLE

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHATHAM, ANDREW HUGHES;REEL/FRAME:028177/0822

Effective date: 20120502

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: WAYMO HOLDING INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GOOGLE INC.;REEL/FRAME:042099/0935

Effective date: 20170321

AS Assignment

Owner name: WAYMO LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAYMO HOLDING INC.;REEL/FRAME:042108/0021

Effective date: 20170322

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044142/0357

Effective date: 20170929

AS Assignment

Owner name: WAYMO LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAYMO HOLDING INC.;REEL/FRAME:047571/0274

Effective date: 20170322

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECTIVE BY NULLIFICATION TO CORRECT INCORRECTLY RECORDED APPLICATION NUMBERS PREVIOUSLY RECORDED ON REEL 044142 FRAME 0357. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:047837/0678

Effective date: 20170929

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: WAYMO LLC, CALIFORNIA

Free format text: SUBMISSION TO CORRECT AN ERROR MADE IN A PREVIOUSLY RECORDED DOCUMENT THAT ERRONEOUSLY AFFECTS THE IDENTIFIED APPLICATIONS;ASSIGNOR:WAYMO LLC;REEL/FRAME:051093/0861

Effective date: 20191001

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8