CN116803784A - Autonomous control in dense vehicle environments - Google Patents

Autonomous control in dense vehicle environments

Info

Publication number
CN116803784A
Authority
CN
China
Prior art keywords
data
vehicle
computer
vehicles
automatic
Prior art date
Legal status
Pending
Application number
CN202310775555.2A
Other languages
Chinese (zh)
Inventor
Matt Y. Rupp
Gerald H. Engelman
Alex Maurice Miller
Timothy D. Zwicky
Levasseur Tellis
Richard Lee Stephenson
Current Assignee
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date
Filing date
Publication date
Application filed by Ford Global Technologies LLC
Publication of CN116803784A
Legal status: Pending


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06 Road conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/14 Adaptive cruise control
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0965 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages responding to signals from another vehicle, e.g. emergency vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 Predicting future conditions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0289 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling with means for avoiding collisions between vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001 Details of the control system
    • B60W2050/0043 Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00 Input parameters relating to infrastructure
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 Input parameters relating to objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 Input parameters relating to data
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W2556/65 Data transmitted between vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2756/00 Output or target parameters relating to data
    • B60W2756/10 Involving external transmission of data to or from the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The present application provides a computer in a first vehicle programmed to receive a first set of data from at least one sensor in the first vehicle and a second set of data from at least one second vehicle. The second set of data is from at least one sensor in at least one second vehicle. The computer is further programmed to use both the first set of data and the second set of data to resolve at least one characteristic of the road being traversed by the first vehicle.

Description

Autonomous control in dense vehicle environments
This application is a divisional of Chinese patent application No. 201510079208.1, filed on February 13, 2015 by Ford Global Technologies, LLC and entitled "Autonomous control in dense vehicle environments."
Background
Vehicles, particularly vehicles operated autonomously or semi-autonomously, may obtain data concerning ambient conditions through a variety of mechanisms, e.g., sensors included in the vehicle. Sensor data may provide information about environmental conditions, road edges, lanes in a road, etc., and may be used to regulate an appropriate vehicle speed, an appropriate vehicle path, and the like. However, vehicle sensor data is limited in the information that may be determined from it. For example, vehicle sensors may have only a limited ability to provide data for operating the vehicle autonomously or semi-autonomously. This may be particularly true in dense environments, i.e., areas where multiple vehicles share a road and often impair other vehicles' lines of sight, navigation capabilities, and the like.
Disclosure of Invention
According to the present application, there is provided a method performed in a computer in a first vehicle, the computer being configured to operate the vehicle in at least one of an autonomous and a semi-autonomous mode, the method comprising:
receiving a first set of data from at least one sensor in the first vehicle;
receiving a second set of data from at least one second vehicle, wherein the second set of data is from at least one sensor in the at least one second vehicle; and
using both the first set of data and the second set of data to resolve at least one characteristic of a road being traversed by the first vehicle.
According to one embodiment of the application, the method further includes using both the first set of data and the second set of data, including the resolved at least one characteristic, to determine at least one autonomous action for the first vehicle.
According to one embodiment of the application, the at least one autonomous action is at least one of braking, accelerating, changing lanes, and changing a distance to a vehicle in front.
According to one embodiment of the application, the resolved at least one characteristic comprises one or more of available driving lanes, a curvature of the road, a slope of the road, a highway entrance ramp, an obstacle, weather conditions, and road surface conditions.
According to one embodiment of the application, the resolved at least one characteristic comprises at least one target, the target comprising one of an object near the road and the at least one second vehicle.
According to one embodiment of the application, the at least one object comprises at least one third vehicle.
According to one embodiment of the application, the method further comprises constructing a virtual map of the environment surrounding the first vehicle, the virtual map including the resolved at least one characteristic.
According to one embodiment of the application, the second set of data from the at least one second vehicle comprises at least one of radar data, lidar data, and image data.
According to one embodiment of the application, some, but not all, of the second set of data is provided in a Dedicated Short Range Communications (DSRC) message.
According to the present application, there is provided a system comprising a computer in a vehicle, the computer comprising a processor and a memory, wherein the computer is programmed to:
receive a first set of data from at least one sensor in the first vehicle;
receive a second set of data from at least one second vehicle, wherein the second set of data is from at least one sensor in the at least one second vehicle; and
use both the first set of data and the second set of data to resolve at least one characteristic of a road being traversed by the first vehicle.
According to one embodiment of the application, the computer is further programmed to use both the first set of data and the second set of data, including the resolved at least one characteristic, to determine at least one autonomous action for the first vehicle.
According to one embodiment of the application, the at least one autonomous action is at least one of braking, accelerating, changing lanes, and changing a distance to a vehicle in front.
According to one embodiment of the application, the resolved at least one characteristic comprises one or more of available driving lanes, a curvature of the road, a slope of the road, a highway entrance ramp, an obstacle, weather conditions, and road surface conditions.
According to one embodiment of the application, the resolved at least one characteristic comprises at least one target, the target comprising one of an object near the road and the at least one second vehicle.
According to one embodiment of the application, the at least one object comprises at least one third vehicle.
According to one embodiment of the application, the computer is further programmed to construct a virtual map of the environment surrounding the first vehicle, the virtual map including the resolved at least one characteristic.
According to one embodiment of the application, the second set of data from the at least one second vehicle comprises at least one of radar data, lidar data, and image data.
According to one embodiment of the application, some, but not all, of the second set of data is provided in a Dedicated Short Range Communications (DSRC) message.
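The claimed method above (receive the host vehicle's sensor data, receive relayed second-vehicle data, resolve a road characteristic from both) can be sketched as follows. This is an illustrative sketch only: the function name, the dictionary fields, and the majority-vote rule for the lane count are assumptions, not part of the application.

```python
def resolve_road_characteristic(first_set, second_set):
    """Resolve a road characteristic, here the number of available
    driving lanes, using both the first vehicle's own sensor data
    (first_set) and data relayed from second vehicles (second_set)."""
    observations = first_set.get("lanes_seen", []) + [
        n for msg in second_set for n in msg.get("lanes_seen", [])
    ]
    if not observations:
        return None
    # Take the most frequently reported lane count as the resolved value.
    return max(set(observations), key=observations.count)

# The host vehicle's obstructed camera sees 3 lanes; two second vehicles
# with clearer views each report 2 (one lane is closed for construction).
host = {"lanes_seen": [3]}
remote = [{"lanes_seen": [2, 2]}, {"lanes_seen": [2]}]
print(resolve_road_characteristic(host, remote))  # prints 2
```

Any similar fusion rule (weighted voting, confidence-based selection, etc.) would fit the claim language equally well; the point is only that both data sets contribute to the resolved characteristic.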
Drawings
FIG. 1 is a block diagram of an exemplary autonomous vehicle sensing system.
FIG. 2 is a diagram of an exemplary roadway.
FIG. 3 is a diagram of an exemplary process for operating an autonomous vehicle in a dense environment.
Detailed Description
FIG. 1 is a block diagram of an exemplary autonomous vehicle system 100 in a dense environment, i.e., a roadway or the like, such as an interstate highway, a city street, etc., that includes more than one vehicle 101. A computing device 105 in a vehicle 101 generally receives collected data 115 from one or more data collectors 110 and/or from one or more second vehicles 101 via one or more messages 116. The computing device 105 further includes an autonomous driving module 106, e.g., as a set of instructions stored in a memory of, and executable by a processor of, the computing device 105. The collected data 115 may be used by the first vehicle 101 computer 105 to make determinations concerning operation of the vehicle 101, including operating the vehicle 101 in an autonomous or semi-autonomous mode.
Using data from the messages 116 may provide improved operation of the vehicle 101 compared to autonomous or semi-autonomous operation based solely on collected data 115 from the vehicle 101 data collectors 110. For example, a data collector 110, such as a camera, radar, lidar, etc., on the front of the first vehicle 101 may be limited or obstructed by a second vehicle 101 and/or by features of the road, e.g., curves, hills, or obstacles such as fallen objects, closed lanes, construction zones, etc. Accordingly, one or more messages 116 may provide information to the vehicle 101 computer 105 from one or more second vehicles 101 located so as to provide data that is unavailable from the data collectors 110 in the first vehicle 101, or that is known to be more accurate than the first vehicle's own data 115, thereby augmenting and/or replacing the collected data 115.
Thus, the system 100 may be used in dense environments, i.e., roads or the like that include a plurality of vehicles 101. In general, the system 100 may assist in predicting a path that a vehicle 101 will travel based on data 115 concerning the surrounding environment, in determining a path that the vehicle 101 should travel, and in resolving obstacles, anomalies, etc., in the environment surrounding the vehicle 101, e.g., in a road. The system 100 may thus be beneficial where conditions in the environment surrounding the vehicle 101 are not reflected in existing maps or the like that may be stored in a memory of the vehicle 101 computer 105 and/or in a so-called electronic horizon used for navigation. For example, in a road construction zone, the number of lanes available in the road, the curvature of the road, grades, etc. may not be reflected in a stored map. Further, a sensor data collector 110 of the first vehicle 101 may be blocked or obstructed, e.g., by traffic congestion, precipitation, haze, etc., and thus may be unable to obtain data 115, or at least unable to obtain accurate data 115. However, by using data 115 from one or more second vehicles 101, the first vehicle 101 may construct an electronic horizon, e.g., a virtual map reflecting phenomena present in the environment surrounding the vehicle, such as available driving lanes, actual road curvature, grades, obstacles, and the like.
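The virtual map described above can be sketched as a simple merge of the first vehicle's own detections with detections relayed by second vehicles. The data layout, the 10-metre cell size, and the rounding rule are illustrative assumptions only; the application does not prescribe a map representation.

```python
def build_virtual_map(own_detections, relayed_detections):
    """Merge the first vehicle's own detections with detections relayed
    by second vehicles into a 'virtual map': a dict keyed by road
    position (rounded to 10 m cells so near-duplicate reports merge),
    each cell listing the kinds of features reported there."""
    virtual_map = {}
    for det in own_detections + relayed_detections:
        cell = round(det["position_m"] / 10.0) * 10
        virtual_map.setdefault(cell, set()).add(det["kind"])
    return virtual_map

own = [{"position_m": 12.0, "kind": "lane_marking"}]
relayed = [{"position_m": 148.0, "kind": "obstacle"},  # hidden from the host
           {"position_m": 151.0, "kind": "obstacle"}]  # duplicate report
print(build_virtual_map(own, relayed))  # {10: {'lane_marking'}, 150: {'obstacle'}}
```

Note that the obstacle appears in the host's virtual map even though only the second vehicles could observe it, which is the benefit the passage above describes.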
Exemplary System Elements
The vehicle 101 includes a vehicle computer 105 that generally includes a processor and a memory, the memory including one or more forms of computer-readable media and storing instructions executable by the processor for performing various operations, including those disclosed herein. For example, the computer 105 generally includes, and is capable of executing, instructions to select an autonomous mode of operation, to adjust an autonomous mode of operation, to change an autonomous mode of operation of the vehicle 101, etc.
Further, the computer 105 may include more than one computing device. Operations attributed herein to the computer 105 may be carried out by one or more computing devices, e.g., controllers or the like included in the vehicle 101 for monitoring and/or controlling various vehicle components, e.g., an Engine Control Unit (ECU), a Transmission Control Unit (TCU), a Power Steering Control Unit (PSCU), etc. The computer 105 is generally configured for communication on a Controller Area Network (CAN) bus or the like. The computer 105 may also have a connection to an onboard diagnostics connector (OBD-II), or the computer 105 may be hardwired to specific driver control interfaces or subsystem ECU I/O (input/output). Via the CAN bus, OBD-II, and/or other wired or wireless mechanisms, the computer 105 may transmit messages to, and/or receive messages from, various devices in the vehicle, e.g., controllers, actuators, sensors, etc., including the data collectors 110. Alternatively or additionally, where the computer 105 actually comprises multiple devices, the CAN bus or the like may be used for communication among the devices represented as the computer 105 in this disclosure.
In addition, the computer 105 may include, or be communicatively coupled to, one or more radio frequency (RF) transceivers, receivers, and/or transmitters, and may be configured to communicate with the network 120 and/or other vehicles 101. For example, the computer 105 is generally configured to send messages 116 (described further below) to other vehicles 101 and to receive messages 116 from other vehicles 101. Various technologies, including hardware, communication protocols, etc., are known for vehicle-to-vehicle communications. For example, messages 116 as described herein could be sent and received according to Dedicated Short Range Communications (DSRC) or the like. As is known, DSRC provides relatively low-power operation over short to medium ranges in a portion of the spectrum in the 5.9 GHz band specifically allocated by the United States government.
In general, communications of the vehicle 101 computer 105 may include various wired and/or wireless networking technologies, e.g., cellular, Bluetooth, wired and/or wireless packet networks, etc. Further, the computer 105, e.g., in the module 106, generally includes instructions for receiving data, e.g., from one or more data collectors 110 and/or from a human-machine interface (HMI), such as an Interactive Voice Response (IVR) system, a Graphical User Interface (GUI) including a touchscreen, or the like. In addition, the computer 105 is generally configured to obtain data concerning one or more other vehicles 101 from one or more messages 116.
The autonomous driving module 106 generally comprises instructions stored in, and executed by, the computer 105. Using data received in the computer 105, e.g., from the data collectors 110, the server 125, etc., the module 106 may control various vehicle 101 components and/or operations without a driver operating the vehicle 101. For example, the module 106 may be used to regulate vehicle 101 speed, acceleration, deceleration, steering, a distance and/or amount of time between vehicles, a minimum lane-change gap between vehicles, a minimum time-to-arrival for a left turn across a path, a minimum time-to-arrival to cross an unsignalized intersection, etc. The module 106 may use the collected data 115 and/or data from one or more other vehicles 101 provided in one or more messages 116 in determining one or more actions to take when the vehicle 101 is in an autonomous or semi-autonomous mode.
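A decision of the kind module 106 makes from fused data can be sketched with a deliberately simple rule. The thresholds and the three-way outcome are illustrative assumptions; the application does not specify a particular control policy.

```python
def choose_action(gap_to_lead_m, desired_gap_m, adjacent_lane_clear):
    """Pick one of the actions named above from fused data: maintain
    speed if the gap to the lead vehicle is adequate; otherwise change
    lanes when the adjacent lane is clear, else brake."""
    if gap_to_lead_m >= desired_gap_m:
        return "maintain"
    return "change_lane" if adjacent_lane_clear else "brake"

# Gap has closed to 20 m against a desired 30 m, and a second vehicle's
# message 116 reports the adjacent lane occupied, so the module brakes.
print(choose_action(20.0, 30.0, adjacent_lane_clear=False))  # prints brake
```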
Data collectors 110 may include a variety of devices. For example, various controllers in the vehicle may operate as data collectors 110 to provide collected data 115 via the CAN bus, e.g., data 115 concerning vehicle speed, acceleration, etc. Further, sensors or the like, Global Positioning System (GPS) equipment, etc., could be included in the vehicle and configured as data collectors 110 to provide data directly to the computer 105, e.g., via a wired or wireless connection. Data collectors 110 could also include sensors or the like, e.g., medium-range and long-range sensors, for detecting, and possibly also obtaining information from, objects 160 and other conditions outside the vehicle 101, as further described below. For example, sensor data collectors 110 could include mechanisms such as radar, lidar, sonar, cameras or other image-capture devices that could be used to detect objects 160 and/or obtain other collected data 115 relevant to autonomous operation of the vehicle 101, e.g., to measure a distance between the vehicle 101 and other vehicles or objects, to detect other vehicles or objects, and/or to detect road conditions such as curves, potholes, dips, bumps, grade changes, etc.
A memory of the computer 105 generally stores collected data 115. Collected data 115 may include a variety of data collected in the vehicle 101 from data collectors 110, including data 115 obtained concerning one or more objects 160. Examples of collected data 115 are provided above and below, e.g., with respect to objects 160; moreover, data 115 may additionally include data calculated therefrom in the computer 105. In general, collected data 115 may include any data that may be gathered by a collection device 110 and/or computed from such data. Accordingly, collected data 115 could include a variety of data 115 related to vehicle 101 operations and/or performance, in particular data related to vehicle 101 motion. For example, in addition to data 115 concerning objects 160 as discussed below, collected data 115 could include data concerning a vehicle 101 speed, acceleration, braking, lane changes and/or lane usage (e.g., on particular roads and/or types of roads, such as interstate highways), average distances from other vehicles at respective speeds or ranges of speeds, and/or other data 115 relating to vehicle 101 operation.
A message 116 may include a variety of data concerning operations of a vehicle 101. For example, a current specification for DSRC, promulgated by the Society of Automotive Engineers (SAE), provides for including a wide variety of vehicle 101 data in a message 116, including vehicle 101 position (e.g., latitude and longitude), speed, heading, acceleration status, brake system status, transmission status, steering wheel position, etc. However, messages 116 are not limited to data elements included in the DSRC standard or any other standard. For example, a message 116 can include a wide variety of collected data 115 obtained from a vehicle 101 data collector 110, such as camera images, radar or lidar data, data from infrared sensors, etc. Accordingly, a first vehicle 101 may receive collected data 115 from a second vehicle 101, whereby the first vehicle 101 computer 105 may use the collected data 115 from the second vehicle 101 as input to the module 106 in the first vehicle 101, i.e., to determine autonomous or semi-autonomous operations of the first vehicle 101.
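The content of a message 116 as described above can be sketched as a small record type. The field names here loosely follow the DSRC data elements listed in the passage but are assumptions, not the actual SAE field names; `collected_data` stands in for the arbitrary data 115 a message may carry beyond any standard.

```python
from dataclasses import dataclass, field

@dataclass
class Message116:
    """Sketch of a message 116: standard-style state fields plus a
    free-form payload of collected data 115 (camera images, radar or
    lidar returns, etc.)."""
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    brake_active: bool = False
    transmission_state: str = "forward"
    collected_data: dict = field(default_factory=dict)

msg = Message116(latitude=42.30, longitude=-83.21, speed_mps=31.0,
                 heading_deg=90.0,
                 collected_data={"radar_ranges_m": [42.0, 57.5]})
```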
Advantageously, a message 116 may include historical data 115 from a vehicle 101. That is, in addition to reporting data 115 on a real-time or near real-time basis, a vehicle 101 may include in a message 116 data 115 relating to one or more periods of time at or before the time the message 116 is sent. For example, a message 116 could indicate vehicle 101 locations (e.g., latitude and longitude), speeds, and headings at multiple times. Such data 115 is sometimes referred to as "breadcrumb" data, because a past path, and/or a projected path, of the vehicle 101 may be determined from it. Further, such historical data is not limited to data 115 concerning the path of the vehicle 101 providing the data. For example, a vehicle 101 may provide historical data relating to a variety of phenomena that may be included in the collected data 115 and provided in a message 116, e.g., a location, speed, heading, etc. of another vehicle detected by the vehicle 101 sending the message 116, as well as road conditions, weather conditions such as precipitation, temperature, etc., objects on the road, and the like.
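One use of the historical data just described is projecting a vehicle's path forward from its reported samples. A minimal constant-velocity sketch, assuming one-dimensional positions and (time, position) tuples as the history format, neither of which the application specifies:

```python
def projected_position(breadcrumbs, dt_s):
    """Project where a vehicle will be dt_s seconds after its last
    report, assuming constant speed derived from the last two
    (time_s, position_m) samples of its historical data."""
    (t0, x0), (t1, x1) = breadcrumbs[-2], breadcrumbs[-1]
    speed = (x1 - x0) / (t1 - t0)
    return x1 + speed * dt_s

history = [(0.0, 0.0), (1.0, 15.0)]  # two reports: 15 m travelled in 1 s
print(projected_position(history, 2.0))  # prints 45.0
```

A production system would of course use full position, heading, and acceleration history; the constant-velocity model only illustrates why past samples suffice to determine a projected path.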
The network 120, as mentioned above, represents one or more mechanisms by which a vehicle computer 105 may communicate with the remote server 125 and/or a user device 150. Accordingly, the network 120 may be one or more of various wired or wireless communication mechanisms, including any desired combination of wired (e.g., cable and fiber) and/or wireless (e.g., cellular, wireless, satellite, microwave, and radio frequency) communication mechanisms and any desired network topology (or topologies when multiple communication mechanisms are utilized). Exemplary communication networks include wireless communication networks (e.g., using Bluetooth, IEEE 802.11, etc.), Local Area Networks (LANs), and/or Wide Area Networks (WANs), including the Internet, providing data communication services.
The server 125 may be one or more computer servers, each generally including at least one processor and at least one memory storing instructions executable by the processor, including instructions for performing the various steps and processes described herein. The server 125 may include or be communicatively coupled to a data store 130 for storing the collected data 115 received from the one or more vehicles 101.
The user device 150 may be any one of a variety of computing devices including a processor and a memory, as well as communication capabilities. For example, the user device 150 may be a portable computer, tablet computer, smartphone, etc. that includes capabilities for wireless communications using IEEE 802.11, Bluetooth, and/or cellular communication protocols. Further, the user device 150 may use such communication capabilities to communicate via the network 120, including with a vehicle 101 computer 105. A user device 150 could also communicate with a vehicle 101 computer 105 via other mechanisms, e.g., a network in the vehicle 101, a known protocol such as Bluetooth, etc. Accordingly, a user device 150 may be used to carry out certain operations herein ascribed to a data collector 110; e.g., voice recognition functions, cameras, Global Positioning System (GPS) functions, etc., in the user device 150 could be used to provide data 115 to the computer 105. Further, a user device 150 could be used to provide a human-machine interface (HMI) to the computer 105.
Exemplary Environment
FIG. 2 is a diagram of a road 155 being traversed by a plurality of vehicles 101a, 101b, 101c, 101d, and 101e. In general, as discussed above, the vehicles 101 may use data collectors 110 to obtain a variety of collected data 115 and may communicate that collected data 115 to other vehicles 101 via one or more messages 116.
Accordingly, each vehicle 101 may obtain data 115 that it could not otherwise obtain. For example, an object 160 may block the roadway 155. The first vehicle 101a may receive, from one or more second vehicles 101, one or more messages 116 including data 115 about the object 160, such data 115 being otherwise unavailable to the first vehicle 101a. For example, as depicted in FIG. 2, the second vehicle 101b may prevent the first vehicle 101a from obtaining data about the object 160. As another example, the roadway 155 could be curved, could change in elevation, could present traffic obstructions, lane closures, etc., preventing the first vehicle 101a from obtaining data 115 useful to identify and/or avoid objects affecting travel and/or attributes of the roadway 155.
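To illustrate, the data sharing described above can be sketched in Python; the `Message116` class and its field names are hypothetical illustrations of a message 116 payload, not part of any V2V standard:

```python
from dataclasses import dataclass, field

@dataclass
class Message116:
    """Hypothetical payload of a message 116: the sending vehicle's own
    state plus any objects 160 its data collectors 110 have detected."""
    sender_id: str
    position: tuple          # (x, y) in a shared reference frame
    speed_mps: float
    heading_deg: float
    detected_objects: list = field(default_factory=list)

def objects_known_to(own_detections, received_messages):
    """Combine a vehicle's own detections with those relayed by others,
    e.g. an object 160 occluded from the first vehicle 101a."""
    combined = list(own_detections)
    for msg in received_messages:
        for obj in msg.detected_objects:
            if obj not in combined:
                combined.append(obj)
    return combined
```

In this sketch, the first vehicle 101a would learn of the object 160 from a message 116 sent by the second vehicle 101b even while its own sensors are blocked.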
Exemplary Process Flow
FIG. 3 is a diagram of an exemplary process 300 for an autonomous vehicle 101 sensing system, the process 300 generally being carried out according to instructions in a first vehicle 101 computer 105 operating in an autonomous mode.
The process 300 begins in a block 305, in which the vehicle 101 conducts driving operations. That is, the vehicle 101 is operated partially or completely autonomously, i.e., partially or completely controlled by the autonomous module 106, which may be configured to operate the vehicle 101 according to collected data 115. For example, all vehicle 101 operations, e.g., steering, braking, speed, etc., could be controlled by the module 106 in the computer 105. It is also possible that, in the block 305, the vehicle 101 may be operated in a partially autonomous mode, sometimes referred to as a semi-autonomous mode (i.e., a partially manual mode, where some operations, e.g., braking, could be controlled manually by a driver, while other operations, e.g., steering, could be controlled by the computer 105). Likewise, the module 106 could control when the vehicle 101 changes lanes. Further, the process 300 could be commenced at some point after vehicle 101 driving operations begin, e.g., when manually initiated by a vehicle occupant through a user interface of the computer 105.
Vehicles 101 in an autonomous or semi-autonomous mode generally send and receive, or at least listen for, messages 116. Moreover, it is to be understood that vehicles 101 that are not in an autonomous or semi-autonomous mode, and/or that do not include an autonomous module 106 or some or all autonomous operation capabilities, may nonetheless include a data collector 110 that obtains collected data 115 and may provide messages 116. Specific data elements included in a message 116 may be collected data 115 of a vehicle 101 according to a known standard or protocol, e.g., DSRC, but, as also described above, a message 116 may include data 115 not provided for in any present standard or protocol. For example, in addition to data 115 relating to location, speed, etc., a vehicle 101 could provide in a message 116 sensor data 115, e.g., radar, lidar, etc. data, images, sounds, etc.
In any event, in a block 310, which follows the block 305, the computer 105 determines whether it has received one or more messages 116 from one or more second vehicles 101. If not, the process 300 returns to the block 305. Otherwise, the process 300 proceeds to a block 315.
In blocks 315 through 330, described in order below, the computer 105, e.g., in the module 106, determines whether any actions are warranted for the vehicle 101 as part of semi-autonomous or autonomous operations. In making such a determination, the computer 105 carries out autonomous or semi-autonomous operations as described above with respect to the block 305, but additionally or alternatively uses, as input for determining at least one autonomous operation, data obtained from one or more second vehicles 101 in one or more messages 116. The computer 105 may then carry out the autonomous operation thus determined. Possible autonomous operations or actions include, by way of example and without limitation, braking, accelerating, changing lanes, and changing a distance from a leading vehicle, i.e., a vehicle in front of the vehicle 101.
For example, suppose a first vehicle 101 is following a second vehicle 101 along a hilly road. Further suppose that the first vehicle 101 is operating in at least a semi-autonomous mode, e.g., adaptive cruise control is being used. That is, the first vehicle 101 uses radar or the like to detect and maintain a distance from the second vehicle 101. However, in a hilly environment, particularly if the first vehicle 101 has a stiff suspension, the radar may not effectively detect the second vehicle 101, e.g., when the first vehicle 101 is climbing a grade and the second vehicle 101 is at or over the crest of the grade. Accordingly, collected data 115 from the second vehicle 101, e.g., reporting the second vehicle's position, speed, heading, etc., may be included in a message 116 and used by the computer 105 in the first vehicle 101 to maintain a predetermined speed and/or a predetermined distance from the second vehicle 101. Further, a vehicle 101 computer 105 may receive messages 116 from a plurality of second vehicles 101 as described above. Moreover, the computer 105 may use data 115 from the plurality of second vehicles 101 in regulating one or more actions when in an autonomous or semi-autonomous mode. For example, the confidence of the autonomous module 106 may increase when each of the plurality of second vehicles 101 provides mutually consistent data 115, e.g., indicating an object 160 in the roadway 155, indicating a particular condition of the roadway 155, e.g., water, ice, etc., or indicating a desired path on the roadway 155, e.g., changing lanes to avoid an object 160 such as a construction barrier, etc.
Thus, in the block 315, the computer 105 obtains real-time collected data 115 along with real-time or near real-time data received in one or more messages 116. In addition, the computer 105 also analyzes historical data in messages 116, e.g., DSRC "breadcrumb" data or the like. That is, the computer 105 may use data in the one or more messages 116 to determine locations, speeds, and/or other profiles, e.g., acceleration, deceleration, etc., of one or more vehicles 101 at more than one point in time, i.e., over a period of time.
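A minimal sketch of accumulating such time-stamped "breadcrumb" samples and deriving a profile, here mean acceleration over the logged period, might look like this (the class and method names are hypothetical):

```python
from collections import defaultdict

class BreadcrumbLog:
    """Keep time-stamped (position, speed) samples per sending vehicle
    so a computer can derive profiles, e.g. acceleration, over time."""
    def __init__(self):
        self._log = defaultdict(list)

    def record(self, vehicle_id, t_s, position, speed_mps):
        """Store one breadcrumb sample for the given vehicle."""
        self._log[vehicle_id].append((t_s, position, speed_mps))

    def mean_acceleration(self, vehicle_id):
        """Mean acceleration (m/s^2) between first and last samples."""
        samples = self._log[vehicle_id]
        if len(samples) < 2:
            return 0.0
        (t0, _, v0), (t1, _, v1) = samples[0], samples[-1]
        return (v1 - v0) / (t1 - t0)
```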
Then, in a block 320, the computer 105 in the first vehicle 101, using historical or breadcrumb data from a plurality of second vehicles 101 in addition to real-time or near real-time data 115 collected in the first vehicle 101, may construct a so-called electronic horizon, i.e., a map of an environment surrounding the first vehicle 101, maintained locally in the vehicle 101, e.g., in the computer 105. For example, the vehicle 101 may rely on a stored map used by a global positioning system (GPS) application for navigation. Such a map, e.g., stored in the computer 105 and/or in a memory of a device associated with the vehicle 101, may include information concerning the roadway 155, e.g., a number of available lanes, directions of traffic flow, intersections, entrance ramps, exit ramps, etc., and accordingly such a stored map may be used to construct an initial electronic horizon.
However, because such an electronic horizon relies on GPS to locate the vehicle 101 relative to infrastructure, the location determination is limited by the capabilities of the GPS system in the vehicle 101. In many cases, the precision of this mechanism will not allow locating the vehicle 101 in a particular travel lane of the roadway 155, or, in some cases, distinguishing the roadway 155 from a substantially parallel exit/entrance/service/turn lane in the vicinity of the roadway 155. Moreover, information in such a stored map may not reflect actual conditions, e.g., when one or more objects 160 block some or all of the roadway 155, when a construction zone is present in the roadway 155, etc. In these cases, available travel lanes, intersections, available exit ramps and/or entrance ramps, etc. may not be as reflected in the stored map. However, by using data 115 from surrounding vehicles 101, e.g., location data, speed data, heading data, etc., the vehicle 101 computer 105 may construct a virtual map, e.g., an electronic horizon, reflecting actual conditions or attributes of the environment surrounding the vehicle 101, e.g., of the roadway 155. The virtual map so constructed may indicate available travel lanes, obstacles such as objects 160 in the roadway 155 (the objects 160 and/or second vehicles 101 sometimes being referred to as "targets"), and other conditions affecting travel on the roadway 155, e.g., weather conditions, road surface conditions such as ice, debris, etc. In general, as used herein, the "environment" surrounding the vehicle 101 may refer to a predetermined radius around the vehicle 101, e.g., 500 meters, 1000 meters, etc., and/or a predetermined distance ahead of and/or behind the vehicle 101 on the roadway 155, etc.
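Overlaying reported conditions on a stored map, so that lanes blocked by an object or construction zone are removed from the set the stored map advertises, might be sketched as follows; the report dictionary format and `blocked_lanes` key are assumptions made for illustration:

```python
def build_virtual_map(stored_lanes, vehicle_reports):
    """Start from the stored map's available lanes and mark a lane
    unavailable whenever a surrounding-vehicle report places an
    obstacle in it, yielding the 'actual conditions' lane set."""
    available = set(stored_lanes)
    for report in vehicle_reports:
        for lane in report.get("blocked_lanes", []):
            available.discard(lane)
    return sorted(available)
```

With no reports, the sketch simply reproduces the stored map, mirroring the "initial electronic horizon" described above.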
Following the block 320, in a block 325, the first vehicle 101 computer 105 estimates a first vehicle 101 path based on data 115 collected in the first vehicle 101. That is, using information concerning the speed, location, acceleration/deceleration, etc. of the first vehicle 101, the computer 105 may project a likely path of the first vehicle 101 with respect to the electronic horizon.
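In its simplest kinematic form, such a path projection could be sketched as below. This deliberately assumes constant heading and constant acceleration, which a production system would not; it only illustrates the idea of projecting points ahead of the vehicle:

```python
import math

def project_path(x, y, speed_mps, heading_rad, accel, horizon_s, dt=0.5):
    """Project a likely path as (x, y) points over a time horizon,
    under a constant-heading, constant-acceleration assumption."""
    points, t = [], dt
    while t <= horizon_s:
        # Distance travelled by time t under constant acceleration.
        d = speed_mps * t + 0.5 * accel * t * t
        points.append((x + d * math.cos(heading_rad),
                       y + d * math.sin(heading_rad)))
        t += dt
    return points
```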
Following the block 325, in a block 330, the first vehicle 101 computer 105 analyzes the aggregated electronic horizon, e.g., the map that has been generated by fusing data 115 from the first vehicle 101 and one or more second vehicles 101 as described above with respect to the block 320. That is, statistical and/or data analysis techniques may be used to estimate likely environment and target locations (e.g., according to an X, Y, Z coordinate system), velocity/acceleration vectors (e.g., again according to an X, Y, Z coordinate system), upcoming or connecting road/vehicle paths, likely paths, etc.
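One elementary statistical technique for fusing target location estimates from several vehicles is inverse-variance weighting; the patent does not name a specific method, so this sketch is only one plausible instance, assuming each estimate arrives as an (x, y, variance) triple:

```python
def fuse_estimates(estimates):
    """Fuse (x, y, variance) target-position estimates from several
    vehicles via inverse-variance weighting: lower-variance (more
    certain) reports pull the fused estimate toward themselves."""
    weight_sum = sum(1.0 / var for _, _, var in estimates)
    x = sum(px / var for px, _, var in estimates) / weight_sum
    y = sum(py / var for _, py, var in estimates) / weight_sum
    return x, y
```

With equal variances this reduces to a simple average of the reported positions.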
Then, following the block 330, in a block 335, the computer 105 provides the predicted vehicle 101 path, possibly along with one or more target locations and/or paths, and environment data, e.g., lane availability, road friction, etc., to the module 106 and/or any autonomous and/or semi-autonomous operations of the vehicle 101.
In a block 340, which follows the block 335, the computer 105 determines whether the process 300 should continue. For example, the process 300 may end if autonomous driving operations end and a driver resumes manual control, if the vehicle 101 is powered off, etc. In any case, if the process 300 should not continue, the process 300 ends following the block 340. Otherwise, the process 300 continues in the block 305.
Conclusion
Computing devices such as those discussed herein generally each include instructions executable by one or more computing devices such as those identified above, and for carrying out blocks or steps of the processes described above. For example, the process blocks discussed above may be embodied as computer-executable instructions.
Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Perl, HTML, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media. A file in a computing device is generally a collection of data stored on a computer-readable medium, such as a storage medium, a random access memory, etc.
A computer-readable medium includes any medium that participates in providing data (e.g., instructions) that may be read by a computer. Such a medium may take many forms, including, but not limited to, non-volatile media, volatile media, etc. Non-volatile media include, for example, optical or magnetic disks and other persistent memory. Volatile media include dynamic random access memory (DRAM), which typically constitutes a main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, a DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
In the drawings, like reference numerals designate like elements. Further, some or all of these elements may be changed. With respect to the media, processes, systems, methods, etc. described herein, it should be understood that although the steps, etc. of such processes have been described as occurring according to some ordered sequence, such processes may also be performed using steps that are performed in an order other than the order described herein. It is further understood that certain steps may be performed concurrently, other steps may be added, or certain steps described herein may be omitted. In other words, the description of the processes herein is provided for the purpose of explaining certain embodiments, and should in no way be construed as limiting the claimed application.
Accordingly, it is to be understood that the above description is intended to be illustrative, and not restrictive. Many embodiments and applications other than the examples provided will be apparent to those skilled in the art upon reading the above description. The scope of the application should be determined, not with reference to the above description, but instead should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that further developments will occur in the arts discussed herein, and that the disclosed systems and methods will be incorporated into such further embodiments. In summary, it is to be understood that the application is capable of modification and variation and is limited only by the following claims.
All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meaning as understood by those skilled in the art unless an explicit indication to the contrary is made herein. In particular, the use of singular articles such as "a," "the," "said," etc. should be understood to describe one or more of the elements specified unless the claims explicitly recite the contrary.

Claims (16)

1. A method performed in a computer in a first vehicle, the computer being configured to operate the first vehicle in an automatic mode, the method comprising:
receiving a first set of data from at least one sensor in a first vehicle;
receiving a plurality of sets of data, the plurality of sets of data including a second set of data and a third set of data received from different second vehicles, wherein the second set of data and the third set of data are each from at least one sensor of a respective second vehicle; and
analyzing the first, second, and third sets of data and predicting a path of the first vehicle based on the analysis;
wherein the second set of data and the third set of data further comprise automatic operation data for each of the second vehicles; and
when the automatic operation data provided by each of the second vehicles are mutually consistent, the first vehicle performs at least one automatic operation using the automatic operation data to execute at least one of the first vehicle paths.
2. The method of claim 1, further comprising:
performing a determination that the first vehicle is operating automatically, and using the automatic operation data of each of the second vehicles as input to the first vehicle based on the determination.
3. The method of claim 1, further comprising:
at least one automatic activity for the first vehicle is determined based on the predicted path.
4. A method according to claim 3, wherein the at least one automatic activity is at least one of braking, accelerating, changing lanes, and changing distance from a vehicle in front.
5. The method of claim 1, further comprising analyzing historical data in the first set of data and in at least the second and third sets of data using statistical data analysis techniques, and constructing a virtual map of an environment surrounding the first vehicle based on the analysis; wherein the virtual map is constructed as an electronic horizon reflecting actual conditions or features of the environment surrounding the first vehicle.
6. The method of claim 5, the virtual map comprising at least one feature including one or more available travel lanes, a curvature of a road, a slope of a road, a highway entrance ramp, an obstacle, a weather condition, and a road surface condition.
7. The method of claim 5, wherein the historical data includes location data, speed data, and direction of travel data of the first vehicle or the second vehicle over one or more past time periods.
8. The method of claim 5, wherein the at least one feature comprises at least one target including one of an object proximate to a roadway and each of the second vehicles.
9. A system comprising a computer in a first vehicle, the computer being configured to operate the first vehicle in an automatic mode and including a processor and a memory, wherein the computer is programmed to:
receive a first set of data from at least one sensor in the first vehicle;
receive a plurality of sets of data, the plurality of sets of data including at least a second set of data and a third set of data received from different second vehicles, wherein the second set of data and the third set of data are each from at least one sensor in each second vehicle; and
analyze the first, second, and third sets of data and predict a path of the first vehicle based on the analysis;
wherein the second set of data and the third set of data further comprise automatic operation data for each of the second vehicles; and
wherein, when the automatic operation data provided by each of the second vehicles are mutually consistent, the first vehicle performs at least one automatic operation using the automatic operation data to execute at least one of the first vehicle paths.
10. The system of claim 9, wherein the computer performs a determination that the first vehicle is operating automatically and receives the automatic operation data for each of the second vehicles as input to the first vehicle based on the determination.
11. The system of claim 9, wherein the computer is further programmed to determine at least one automatic activity for the first vehicle based on the virtual map.
12. The system of claim 11, wherein the at least one automatic activity is at least one of braking, accelerating, changing lanes, and changing distance from a vehicle in front.
13. The system of claim 9, wherein the computer analyzes historical data in the first set of data and in at least the second and third sets of data using statistical data analysis techniques, and constructs a virtual map of an environment surrounding the first vehicle based on the analysis; wherein the virtual map is constructed as an electronic horizon reflecting actual conditions or features of the environment surrounding the first vehicle.
14. The system of claim 13, the virtual map comprising at least one feature including one or more available travel lanes, a curvature of a road, a slope of a road, a highway entrance ramp, an obstacle, a weather condition, and a road surface condition.
15. The system of claim 13, wherein the historical data includes location data, speed data, and direction of travel data of the first vehicle or the second vehicle over one or more past time periods.
16. The system of claim 13, wherein the at least one feature comprises at least one target including one of an object proximate to a roadway and each of the second vehicles.
CN202310775555.2A 2014-02-14 2015-02-13 Autonomous control in dense vehicle environments Pending CN116803784A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/180,788 US9079587B1 (en) 2014-02-14 2014-02-14 Autonomous control in a dense vehicle environment
US14/180,788 2014-02-14
CN201510079208.1A CN104843001A (en) 2014-02-14 2015-02-13 Autonomous control in a dense vehicle environment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201510079208.1A Division CN104843001A (en) 2014-02-14 2015-02-13 Autonomous control in a dense vehicle environment

Publications (1)

Publication Number Publication Date
CN116803784A true CN116803784A (en) 2023-09-26

Family

ID=52781433

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201510079208.1A Pending CN104843001A (en) 2014-02-14 2015-02-13 Autonomous control in a dense vehicle environment
CN202310775555.2A Pending CN116803784A (en) 2014-02-14 2015-02-13 Autonomous control in dense vehicle environments

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201510079208.1A Pending CN104843001A (en) 2014-02-14 2015-02-13 Autonomous control in a dense vehicle environment

Country Status (6)

Country Link
US (1) US9079587B1 (en)
CN (2) CN104843001A (en)
DE (1) DE102015202367A1 (en)
GB (1) GB2524384A (en)
MX (1) MX342567B (en)
RU (1) RU2015104551A (en)

Families Citing this family (118)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10157422B2 (en) 2007-05-10 2018-12-18 Allstate Insurance Company Road segment safety rating
US9932033B2 (en) 2007-05-10 2018-04-03 Allstate Insurance Company Route risk mitigation
US10096038B2 (en) 2007-05-10 2018-10-09 Allstate Insurance Company Road segment safety rating system
US8606512B1 (en) 2007-05-10 2013-12-10 Allstate Insurance Company Route risk mitigation
US10520952B1 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US10520581B2 (en) * 2011-07-06 2019-12-31 Peloton Technology, Inc. Sensor fusion for autonomous or partially autonomous vehicle control
US8744666B2 (en) 2011-07-06 2014-06-03 Peloton Technology, Inc. Systems and methods for semi-autonomous vehicular convoys
US11334092B2 (en) 2011-07-06 2022-05-17 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US9645579B2 (en) 2011-07-06 2017-05-09 Peloton Technology, Inc. Vehicle platooning systems and methods
US20170242443A1 (en) * 2015-11-02 2017-08-24 Peloton Technology, Inc. Gap measurement for vehicle convoying
US20180210463A1 (en) 2013-03-15 2018-07-26 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US11294396B2 (en) 2013-03-15 2022-04-05 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US9563199B1 (en) * 2013-11-27 2017-02-07 Google Inc. Assisted perception for autonomous vehicles
US9390451B1 (en) 2014-01-24 2016-07-12 Allstate Insurance Company Insurance system related to a vehicle-to-vehicle communication system
US9355423B1 (en) 2014-01-24 2016-05-31 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
US10096067B1 (en) 2014-01-24 2018-10-09 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
US10783587B1 (en) * 2014-02-19 2020-09-22 Allstate Insurance Company Determining a driver score based on the driver's response to autonomous features of a vehicle
US10803525B1 (en) 2014-02-19 2020-10-13 Allstate Insurance Company Determining a property of an insurance policy based on the autonomous features of a vehicle
US10796369B1 (en) 2014-02-19 2020-10-06 Allstate Insurance Company Determining a property of an insurance policy based on the level of autonomy of a vehicle
US9940676B1 (en) 2014-02-19 2018-04-10 Allstate Insurance Company Insurance system for analysis of autonomous driving
US10783586B1 (en) * 2014-02-19 2020-09-22 Allstate Insurance Company Determining a property of an insurance policy based on the density of vehicles
US9720411B2 (en) * 2014-02-25 2017-08-01 Ford Global Technologies, Llc Autonomous driving sensing system and method
US11669090B2 (en) 2014-05-20 2023-06-06 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US9754325B1 (en) 2014-05-20 2017-09-05 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US9972054B1 (en) 2014-05-20 2018-05-15 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US10373259B1 (en) 2014-05-20 2019-08-06 State Farm Mutual Automobile Insurance Company Fully autonomous vehicle insurance pricing
US10599155B1 (en) 2014-05-20 2020-03-24 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation feature monitoring and evaluation of effectiveness
US10832327B1 (en) 2014-07-21 2020-11-10 State Farm Mutual Automobile Insurance Company Methods of providing insurance savings based upon telematics and driving behavior identification
DE102014014120A1 (en) * 2014-09-24 2015-04-02 Daimler Ag Function release of a highly automated driving function
US10241509B1 (en) 2014-11-13 2019-03-26 State Farm Mutual Automobile Insurance Company Autonomous vehicle control assessment and selection
US9656805B1 (en) 2014-12-12 2017-05-23 Amazon Technologies, Inc. Mobile base utilizing transportation units for receiving items
US9928474B1 (en) 2014-12-12 2018-03-27 Amazon Technologies, Inc. Mobile base utilizing transportation units for delivering items
US9809305B2 (en) 2015-03-02 2017-11-07 Amazon Technologies, Inc. Landing of unmanned aerial vehicles on transportation vehicles for transport
US9555736B2 (en) 2015-04-03 2017-01-31 Magna Electronics Inc. Vehicle headlamp control using sensing and communication systems
US9639537B2 (en) 2015-06-19 2017-05-02 International Business Machines Corporation Geographic space management
US10019446B2 (en) 2015-06-19 2018-07-10 International Business Machines Corporation Geographic space management
US9497590B1 (en) 2015-06-19 2016-11-15 International Business Machines Corporation Management of moving objects
US20180211546A1 (en) 2015-08-26 2018-07-26 Peloton Technology, Inc. Devices, systems, and methods for authorization of vehicle platooning
US20210258486A1 (en) 2015-08-28 2021-08-19 State Farm Mutual Automobile Insurance Company Electric vehicle battery conservation
EP3353615A4 (en) * 2015-09-15 2019-04-10 Peloton Technology Inc. Vehicle identification and location using senor fusion and inter-vehicle communication
US9865163B2 (en) * 2015-12-16 2018-01-09 International Business Machines Corporation Management of mobile objects
US9805598B2 (en) 2015-12-16 2017-10-31 International Business Machines Corporation Management of mobile objects
US20170174221A1 (en) * 2015-12-18 2017-06-22 Robert Lawson Vaughn Managing autonomous vehicles
US10776636B2 (en) * 2015-12-29 2020-09-15 Faraday&Future Inc. Stereo camera-based detection of objects proximate to a vehicle
US9921581B2 (en) * 2016-01-04 2018-03-20 Ford Global Technologies, Llc Autonomous vehicle emergency operating mode
US11441916B1 (en) 2016-01-22 2022-09-13 State Farm Mutual Automobile Insurance Company Autonomous vehicle trip routing
US10395332B1 (en) 2016-01-22 2019-08-27 State Farm Mutual Automobile Insurance Company Coordinated autonomous vehicle automatic area scanning
US10134278B1 (en) 2016-01-22 2018-11-20 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US11242051B1 (en) 2016-01-22 2022-02-08 State Farm Mutual Automobile Insurance Company Autonomous vehicle action communications
US11719545B2 (en) 2016-01-22 2023-08-08 Hyundai Motor Company Autonomous vehicle component damage and salvage assessment
US10324463B1 (en) 2016-01-22 2019-06-18 State Farm Mutual Automobile Insurance Company Autonomous vehicle operation adjustment based upon route
US10802477B1 (en) 2016-01-22 2020-10-13 State Farm Mutual Automobile Insurance Company Virtual testing of autonomous environment control system
US9940834B1 (en) 2016-01-22 2018-04-10 State Farm Mutual Automobile Insurance Company Autonomous vehicle application
US10269075B2 (en) 2016-02-02 2019-04-23 Allstate Insurance Company Subjective route risk mapping and mitigation
US9996080B2 (en) * 2016-02-26 2018-06-12 Ford Global Technologies, Llc Collision avoidance using auditory data
US10239529B2 (en) * 2016-03-01 2019-03-26 Ford Global Technologies, Llc Autonomous vehicle operation based on interactive model predictive control
US10444763B2 (en) * 2016-03-21 2019-10-15 Ford Global Technologies, Llc Systems, methods, and devices for fusion of predicted path attributes and drive history
US10553122B1 (en) 2016-03-22 2020-02-04 Amazon Technologies, Inc. Unmanned aerial vehicle data collection for routing
CN105774729B (en) * 2016-03-25 2018-05-11 奇瑞汽车股份有限公司 V2V active safety systems
WO2017180394A1 (en) * 2016-04-12 2017-10-19 Pcms Holdings, Inc. Method and system for online performance monitoring of the perception system of road vehicles
US10121367B2 (en) * 2016-04-29 2018-11-06 Ford Global Technologies, Llc Vehicle lane map estimation
JP7005526B2 (en) 2016-05-31 2022-01-21 ぺロトン テクノロジー インコーポレイテッド State machine of platooning controller
CN109415065B (en) * 2016-07-15 2022-11-01 哈曼国际工业有限公司 Apparatus and method for virtualizing driving environment, and vehicle
US10216188B2 (en) 2016-07-25 2019-02-26 Amazon Technologies, Inc. Autonomous ground vehicles based at delivery locations
US10921810B2 (en) 2016-08-02 2021-02-16 Pcms Holdings, Inc. System and method for optimizing autonomous vehicle capabilities in route planning
US10139244B2 (en) 2016-08-17 2018-11-27 Veoneer Us Inc. ADAS horizon and vision supplemental V2X
JP6690056B2 (en) 2016-08-22 2020-04-28 ぺロトン テクノロジー インコーポレイテッド Control system architecture for motor vehicle
US10369998B2 (en) 2016-08-22 2019-08-06 Peloton Technology, Inc. Dynamic gap control for automated driving
JP6597520B2 (en) 2016-08-26 2019-10-30 トヨタ自動車株式会社 Information processing device
DE102016216520A1 (en) * 2016-09-01 2018-03-01 Robert Bosch Gmbh Method and system for operating a vehicle
US10248120B1 (en) 2016-09-16 2019-04-02 Amazon Technologies, Inc. Navigable path networks for autonomous vehicles
US10303171B1 (en) 2016-09-29 2019-05-28 Amazon Technologies, Inc. Autonomous ground vehicles providing ordered items in pickup areas
US10222798B1 (en) 2016-09-29 2019-03-05 Amazon Technologies, Inc. Autonomous ground vehicles congregating in meeting areas
US10241516B1 (en) 2016-09-29 2019-03-26 Amazon Technologies, Inc. Autonomous ground vehicles deployed from facilities
US10245993B1 (en) 2016-09-29 2019-04-02 Amazon Technologies, Inc. Modular autonomous ground vehicles
US10152058B2 (en) 2016-10-24 2018-12-11 Ford Global Technologies, Llc Vehicle virtual map
US10233021B1 (en) 2016-11-02 2019-03-19 Amazon Technologies, Inc. Autonomous vehicles for delivery and safety
US10514690B1 (en) 2016-11-15 2019-12-24 Amazon Technologies, Inc. Cooperative autonomous aerial and ground vehicles for item delivery
WO2018098161A1 (en) * 2016-11-22 2018-05-31 Dispatch, Inc. Methods for autonomously navigating across uncontrolled and controlled intersections
DE102016224109A1 (en) * 2016-12-05 2018-06-07 Robert Bosch Gmbh A method and apparatus for providing a signal to operate at least two vehicles along a first trajectory
US11263579B1 (en) 2016-12-05 2022-03-01 Amazon Technologies, Inc. Autonomous vehicle networks
EP3552071B1 (en) 2016-12-08 2020-10-21 PCMS Holdings, Inc. System and method for routing and reorganization of a vehicle platoon in a smart city
JP6809890B2 (en) * 2016-12-15 2021-01-06 Hitachi Automotive Systems, Ltd. Vehicle control device
US10310500B1 (en) 2016-12-23 2019-06-04 Amazon Technologies, Inc. Automated access to secure facilities using autonomous vehicles
US10308430B1 (en) 2016-12-23 2019-06-04 Amazon Technologies, Inc. Distribution and retrieval of inventory and materials using autonomous vehicles
US10310499B1 (en) 2016-12-23 2019-06-04 Amazon Technologies, Inc. Distributed production of items from locally sourced materials using autonomous vehicles
US10459441B2 (en) * 2016-12-30 2019-10-29 Baidu Usa Llc Method and system for operating autonomous driving vehicles based on motion plans
US10108191B2 (en) * 2017-01-06 2018-10-23 Ford Global Technologies, Llc Driver interactive system for semi-autonomous modes of a vehicle
US10232849B2 (en) * 2017-01-23 2019-03-19 Ford Global Technologies, Llc Collision mitigation and avoidance
US10782704B2 (en) 2017-01-30 2020-09-22 Toyota Motor Engineering & Manufacturing North America, Inc. Determination of roadway features
CN108536114A (en) * 2017-03-01 2018-09-14 Beijing Tusimple Future Technology Co., Ltd. Vehicle control device
US11364922B2 (en) * 2017-03-07 2022-06-21 Mitsubishi Electric Corporation Driving assistance device, driving assistance method, and computer readable medium
US20180286246A1 (en) * 2017-03-31 2018-10-04 Intel Corporation Sensor-derived road hazard detection and reporting
DE102017208163A1 (en) * 2017-05-15 2018-11-15 Robert Bosch Gmbh Method and device for operating an automated vehicle
GB2562522B (en) * 2017-05-18 2020-04-22 Jaguar Land Rover Ltd Systems and methods for controlling vehicle manoeuvers
US11222299B1 (en) 2017-08-31 2022-01-11 Amazon Technologies, Inc. Indoor deliveries by autonomous vehicles
US10551506B2 (en) * 2017-12-20 2020-02-04 Cubic Corporation Onboard device and controller for vehicle-to-vehicle detection
JP7034721B2 (en) * 2018-01-10 2022-03-14 Alpine Electronics, Inc. Control device and control method for unmanned transport aircraft
US10890920B2 (en) * 2018-02-15 2021-01-12 Aptiv Technologies Limited Vehicle map-data gathering system and method
DE102018202712A1 (en) 2018-02-22 2019-08-22 Volkswagen Aktiengesellschaft Swarm-based trajectories for motor vehicles
US10884418B2 (en) * 2018-04-25 2021-01-05 Aptiv Technologies Limited Vehicle route planning based on instances of other vehicles stopping automated operation
US10723362B2 (en) 2018-06-05 2020-07-28 Denso International America, Inc. Driver assistance system operating based on autonomous statuses of host and local vehicles while in a multi-level autonomous environment
US10899323B2 (en) 2018-07-08 2021-01-26 Peloton Technology, Inc. Devices, systems, and methods for vehicle braking
US10762791B2 (en) 2018-10-29 2020-09-01 Peloton Technology, Inc. Systems and methods for managing communications between vehicles
US11392130B1 (en) 2018-12-12 2022-07-19 Amazon Technologies, Inc. Selecting delivery modes and delivery areas using autonomous ground vehicles
US11052914B2 (en) 2019-03-14 2021-07-06 GM Global Technology Operations LLC Automated driving systems and control logic using maneuver criticality for vehicle routing and mode adaptation
CN109835339B (en) * 2019-03-21 2020-11-03 Beijing Jingwei Hirain Technologies Co., Ltd. Lane change decision method and device
US11427196B2 (en) 2019-04-15 2022-08-30 Peloton Technology, Inc. Systems and methods for managing tractor-trailers
CN110111566B (en) * 2019-04-19 2021-07-06 Tencent Technology (Shenzhen) Co., Ltd. Trajectory prediction method, apparatus and storage medium
US11300677B2 (en) 2019-07-08 2022-04-12 GM Global Technology Operations LLC Automated driving systems and control logic for host vehicle velocity estimation using wide aperture radar
US11392122B2 (en) 2019-07-29 2022-07-19 Waymo Llc Method for performing a vehicle assist operation
US11474530B1 (en) 2019-08-15 2022-10-18 Amazon Technologies, Inc. Semantic navigation of autonomous ground vehicles
WO2021061810A1 (en) 2019-09-26 2021-04-01 Amazon Technologies, Inc. Autonomous home security devices
US10796562B1 (en) 2019-09-26 2020-10-06 Amazon Technologies, Inc. Autonomous home security devices
US20210124360A1 (en) * 2019-10-23 2021-04-29 GM Global Technology Operations LLC System and process for closest in path vehicle following
US11584377B2 (en) * 2019-11-21 2023-02-21 Gm Cruise Holdings Llc Lidar based detection of road surface features
WO2022075493A1 (en) * 2020-10-06 2022-04-14 LG Electronics Inc. Method for performing reinforcement learning by V2X communication device in autonomous driving system
CN114475617B (en) * 2022-04-15 2022-07-29 China Automotive Innovation Co., Ltd. Road condition identification method, device, equipment and storage medium

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7629899B2 (en) 1997-10-22 2009-12-08 Intelligent Technologies International, Inc. Vehicular communication arrangement and method
WO1998035331A1 (en) 1997-02-06 1998-08-13 Mannesmann Ag Transmission of localized traffic information
US8965677B2 (en) * 1998-10-22 2015-02-24 Intelligent Technologies International, Inc. Intra-vehicle information conveyance system and method
DE10244205A1 (en) * 2002-09-23 2004-03-25 Robert Bosch Gmbh Vehicle collision prevention method for preventing collision between motor vehicles uses sensors to detect a vehicle's surroundings and its amount of movement
US8947531B2 (en) * 2006-06-19 2015-02-03 Oshkosh Corporation Vehicle diagnostics based on information communicated between vehicles
JP4821498B2 (en) 2006-08-14 2011-11-24 Toyota Motor Corporation Operation management system and platooning device
WO2008043795A1 (en) * 2006-10-13 2008-04-17 Continental Teves Ag & Co. Ohg Method and apparatus for identifying concealed objects in road traffic
US8532862B2 (en) * 2006-11-29 2013-09-10 Ryan A. Neff Driverless vehicle
JP5195927B2 (en) * 2009-01-19 2013-05-15 Toyota Motor Corporation Vehicle control device
WO2011085430A1 (en) * 2010-01-15 2011-07-21 Leica Geosystems Ag A system and method of data sharing
DE102010007240A1 (en) * 2010-02-09 2011-08-11 Daimler AG, 70327 Method for determining a track course of a route
CN101823486A (en) * 2010-04-30 2010-09-08 Chery Automobile Co., Ltd. Automatic driving system
US8509982B2 (en) * 2010-10-05 2013-08-13 Google Inc. Zone driving
US9282144B2 (en) * 2011-01-14 2016-03-08 Bae Systems Plc Unmanned vehicle selective data transfer system and method thereof
DE102011007132A1 (en) * 2011-04-11 2012-10-11 Robert Bosch Gmbh Method for energy-saving operation control of motor car, involves outputting signal for adaption of travel course in particular velocity of car such that threshold value of spacing is not exceeded
US8880272B1 (en) * 2012-03-16 2014-11-04 Google Inc. Approach for estimating the geometry of roads and lanes by using vehicle trajectories
CN102616235B (en) * 2012-04-09 2016-01-20 Beihang University Cooperative collision avoidance device and method based on vehicle-to-vehicle communication
US20130289824A1 (en) * 2012-04-30 2013-10-31 GM Global Technology Operations LLC Vehicle turn assist system and method
JP5641073B2 (en) * 2012-05-18 2014-12-17 Denso Corporation Wireless communication device and wireless positioning system
CN102831768B (en) * 2012-08-15 2014-10-15 Dalian University of Technology Hybrid power bus driving condition forecasting method based on internet of vehicles
US8880273B1 (en) * 2013-01-16 2014-11-04 Google Inc. System and method for determining position and distance of objects using road fiducials
US8849494B1 (en) * 2013-03-15 2014-09-30 Google Inc. Data selection by an autonomous vehicle for trajectory modification
CN103259851A (en) 2013-04-19 2013-08-21 Maite Automobile Service Co., Ltd. Method for managing vehicle tracks based on service center and vehicle-mounted intelligent terminal device thereof

Also Published As

Publication number Publication date
DE102015202367A1 (en) 2015-08-20
US9079587B1 (en) 2015-07-14
RU2015104551A3 (en) 2018-07-06
RU2015104551A (en) 2016-08-27
CN104843001A (en) 2015-08-19
MX342567B (en) 2016-10-05
GB2524384A (en) 2015-09-23
MX2015001842A (en) 2015-08-13
GB201502281D0 (en) 2015-04-01

Similar Documents

Publication Publication Date Title
CN116803784A (en) Autonomous control in dense vehicle environments
US10963462B2 (en) Enhancing autonomous vehicle perception with off-vehicle collected data
CN104908741B (en) Autonomous driving sensing system and method
CN110377025B (en) Sensor aggregation frame for an autonomous vehicle
US10810872B2 (en) Use sub-system of autonomous driving vehicles (ADV) for police car patrol
CN111123933B (en) Vehicle track planning method and device, intelligent driving area controller and intelligent vehicle
CN106891893B (en) Vehicle mode determination
JP2019145077A (en) System for building vehicle-to-cloud real-time traffic map for autonomous driving vehicle (adv)
JP7258233B2 (en) backward horizon state estimator
US11456890B2 (en) Open and safe monitoring system for autonomous driving platform
JP2019095210A (en) Vehicle controller, method for controlling vehicle, and program
EP3757711B1 (en) A vehicle-platoons implementation under autonomous driving system designed for single vehicle
WO2018199941A1 (en) Enhancing autonomous vehicle perception with off-vehicle collected data
EP3905121A1 (en) Vehicle system with a safety mechanism and method of operation thereof
US11662745B2 (en) Time determination of an inertial navigation system in autonomous driving systems
US20200118424A1 (en) Map information system
US20230256994A1 (en) Assessing relative autonomous vehicle performance via evaluation of other road users
JP2018173800A (en) Automatic travel control device
JP6997006B2 (en) In-vehicle devices, servers, information systems
JP6933069B2 (en) Pathfinding device
KR20220129661A (en) Time delay compensation of inertial navigation system
CN113771845A (en) Method, device, vehicle and storage medium for predicting vehicle track
CN111326002A (en) Prediction method, device and system for environment perception of automatic driving automobile
CN115407344B (en) Grid map creation method, device, vehicle and readable storage medium
US20240126254A1 (en) Path selection for remote vehicle assistance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination