CN116778734A - Intelligent vehicle-mounted unit for serving cooperative automatic driving of vehicle and road - Google Patents


Info

Publication number
CN116778734A
CN116778734A (application CN202211626476.7A)
Authority
CN
China
Prior art keywords
viu
vehicle
module
information
cads
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211626476.7A
Other languages
Chinese (zh)
Inventor
冉斌
赵克刚
李汉初
张明恒
李深
程阳
陈志军
石皓天
蒋俊峰
吴任飞
张万铭
徐畅
脱祥亮
梁志豪
董硕煊
叶杰
Current Assignee
Nanjing Pilot Transportation Technology Co ltd
Original Assignee
Nanjing Pilot Transportation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Pilot Transportation Technology Co ltd filed Critical Nanjing Pilot Transportation Technology Co ltd
Priority to CN202211626476.7A
Publication of CN116778734A
Legal status: Pending

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G08G 1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096708 Systems involving transmission of highway information where the received information might be used to generate an automatic action on the vehicle control
    • G08G 1/096725 Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G 1/096766 Systems involving transmission of highway information where the system is characterised by the origin of the information transmission
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/38 Services specially adapted for collecting sensor information
    • H04W 4/40 Services specially adapted for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/48 Services specially adapted for vehicles for in-vehicle communication


Abstract

This patent provides technology related to automated driving, particularly but not exclusively an intelligent vehicle-mounted unit (VIU) configured to provide vehicle operation and control services to a connected and automated vehicle (CAV). More particularly, the VIU is configured to interface with a cooperative automated driving system (CADS), to manage and/or control information interaction between the CAV and the CADS, and to manage and/or control the lateral and longitudinal movement of the CAV, including car following, lane changing, and route guidance.

Description

Intelligent vehicle-mounted unit for serving cooperative automatic driving of vehicle and road
Technical Field
This patent provides techniques related to automated driving, particularly but not exclusively intelligent vehicle-mounted units (VIUs) that provide vehicle operation and control for connected and automated vehicles (CAVs), and more particularly VIUs configured to interface with a cooperative automated driving system (CADS), to manage and/or control information interaction between CAVs and the CADS, and to manage and/or control the lateral and longitudinal movement of CAVs, including car following, lane changing, and route guidance.
Background
Connected and automated vehicles (CAVs) capable of autonomous driving under specific conditions are under development. However, the deployment of CAVs is limited by the high costs (e.g., capital and/or energy costs) and technical complexity of the large number of sensors and computing devices installed on CAVs, and by the insufficient functional capability of CAVs to address, for example, long-tail complex driving scenarios.
Disclosure of Invention
Recently, techniques have been developed to address these issues. For example, a cooperative automated driving system (CADS) and/or components thereof are described in, for example, U.S. Pat. App. Ser. No. 63/149,804, which is incorporated herein by reference. In some embodiments, the technology described herein relates to a CADS comprising 1) a collaborative management subsystem; 2) a road subsystem; 3) a vehicle subsystem; 4) a communication subsystem; 5) a user subsystem; and/or 6) a support subsystem. In particular, the technology provided herein relates to intelligent vehicle-mounted units (VIUs) that provide interfaces to the CADS. In some embodiments, the vehicle subsystem of the CADS includes the vehicle adapter and/or VIU described in this patent.
The VIU technology provided by this patent reduces the cost of a CAV and improves its functions and capabilities by providing interfaces to the CADS and the associated connected automated highway (CAH) and intelligent road infrastructure system (IRIS). In particular, the VIU is configured to be installed in a vehicle and to provide an interface with the CADS to improve the automated driving functionality and intelligence level of the vehicle. By providing CADS services to the vehicle, the VIU significantly reduces the cost and burden of a conventional vehicle control system (CVCS) and provides automated driving functionality for vehicles at the various levels of intelligence or automation defined by SAE.
In some embodiments, the VIU is configured to manage the automated driving functions of a vehicle (e.g., a CAV). In some embodiments, the VIU provides an interface configured for information interaction between the vehicle and the CADS, between the vehicle and a CADS subsystem, and/or between the vehicle and roadside infrastructure (e.g., the IRIS). In some embodiments, the VIU provides an interface between the vehicle and an IRIS that is an intelligent roadside toolbox (IRT) (see, e.g., U.S. Pat. App. Ser. No. 17/192,529, which is incorporated herein by reference). In some embodiments, the VIU provides an interface between the vehicle and an IRIS subsystem, such as an intelligent roadside unit (RIU). In some embodiments, the VIU provides an interface between the vehicle and the user and/or between the vehicle and the support subsystem. In some embodiments, the VIU is configured to manage the sensing, prediction, planning, and/or control functions of the vehicle. In some embodiments, the VIU is configured to manage the sensing, prediction, planning, and/or control functions of a plurality of vehicles, including vehicles with different levels of intelligence, different brands and/or manufacturers, different models and corresponding model years, and/or different vehicle types.
In some embodiments, the VIU provides an interface configured for information interaction between the vehicle and an automated driving system (ADS) and/or components thereof, such as described in U.S. Pat. App. Pub. Nos. 20190096238; 20190340921; 20190244521; 20200005633; 20200168081; and 20200021961; U.S. Pat. App. Ser. Nos. 16/996,684; 63/004,551; and 63/004,564; and U.S. Pat. Nos. 10,380,886 and 10,692,365, each of which is incorporated herein by reference. In some embodiments, ADS technology provides systems, system components, methods, and related functions that overcome the limitations of current CAV technology. In particular, some embodiments of ADS technology include roadside infrastructure configured to provide roadside sensing, roadside prediction, roadside planning and/or decision making, and/or roadside control of CAVs. These ADS technologies (e.g., systems, system components, methods, and related functions) provide automated driving functionality, for example, by supporting CAVs in performing automated driving tasks on a roadway.
In some embodiments, the VIU technology improves upon and/or extends specific ADS technologies, such as the connected and automated vehicle highway (CAVH) technology described in, for example, U.S. Pat. App. Pub. Nos. 20190096238; 20190340921; 20190244521; 20200005633; 20200168081; and 20200021961; U.S. Pat. App. Ser. Nos. 16/996,684; 63/004,551; and 63/004,564; and U.S. Pat. Nos. 10,380,886 and 10,692,365, each of which is incorporated herein by reference. In particular, in some embodiments, the VIU provides an interface configured for information interaction between the vehicle and the CAVH system and/or components of the CAVH system. Thus, the VIU techniques described in this patent improve upon the CAVH techniques (e.g., CAVH systems, components of CAVH systems, CAVH methods, and related CAVH functions) by enhancing the CAVH subsystem design and adding further subsystems and algorithms to the CAVH technology.
In some embodiments, the VIU provides an interface configured for information interaction between the vehicle and a distributed driving system (DDS) and/or related technologies described in, for example, U.S. Pat. App. Pub. No. 20210065547 and/or U.S. Pat. App. Ser. No. 62/894,703. In particular, in some embodiments, the VIU technologies provided herein provide an interface configured for information interaction between a vehicle and a DDS comprising an intelligent roadside toolbox (IRT) that provides modular access to CAVH and IRIS technologies (e.g., services) according to the automated driving needs of a particular vehicle.
In addition, embodiments of the VIU technology improve upon and/or extend the ADS and CAVH technologies and related technologies described in, for example, U.S. Pat. App. Ser. No. 16/505,034, which is incorporated herein by reference. Accordingly, in some embodiments, the VIU technology provided herein relates to a vehicle-control on-board unit (OBU) configured to provide data interaction with a vehicle-road cooperative traffic system. In some embodiments, the vehicle-control OBU is configured to exchange data with the CAVH and/or the IRIS.
In some embodiments, the technology provides a conventional vehicle control system (CVCS). The CVCS is a vehicle control and execution system that performs automated driving functions as the "brain" of an automated vehicle. In some embodiments, the CVCS is provided for vehicles of different levels of intelligence and provides a variety of technologies for safely operating the vehicle in manual and/or automated driving modes. These technologies include sensing (e.g., camera, radar, lidar), monitoring, global positioning (e.g., using global navigation satellite system (GNSS) broadcasts), computing, artificial intelligence, and wireless and wired communication (e.g., in-vehicle mobile internet, inter-vehicle communication networks, in-vehicle communication networks).
In some embodiments, the vehicle containing the CVCS has an intelligence or automation level V as defined by SAE. In some embodiments, the CVCS is provided by an automotive manufacturer, an original equipment manufacturer (OEM), or a technology company. In some embodiments, the CVCS provides technology for independent driving of the vehicle at intelligence level V. However, in some embodiments, when the vehicle cannot drive, or cannot drive adequately, at intelligence level V due to its own system problems, technical limitations, or challenges of the driving environment, a CADS with intelligence level S and an associated IRIS with intelligence level I provide autonomous driving functions and capabilities to the vehicle through the CVCS and/or VIU interface, so that the vehicle can adequately perform the autonomous driving tasks required by the driving environment. Thus, the technology allows the vehicle to drive automatically at intelligence level V or higher and satisfies the user's demand for automated driving.
Typically, the intelligence level S is equal to or greater than the intelligence level V. The VIU is configured to be installed in a vehicle and to provide an interface to the CADS and/or IRIS that supplies automated driving functionality to the vehicle and, in some embodiments, raises the intelligence level of the vehicle. Techniques for enhancing vehicle intelligence level V by distributing driving intelligence between a vehicle and infrastructure are described in, for example, U.S. Pat. App. Ser. No. 16/406,621, which is incorporated herein by reference. Specifically, U.S. Pat. App. Ser. No. 16/406,621 describes and/or defines five intelligence levels S (S1-S5) for CADS and/or CAVH and five intelligence levels I (I1-I5) for IRIS. In some embodiments, the technology provided by this patent relates to CADS and CAVH system intelligence and system intelligence levels, and to systems and methods for assigning, arranging, and/or distributing driving intelligence and functionality to CADS and CAVH systems along two dimensions: vehicle intelligence V and infrastructure intelligence I. Thus, the technology provides that a vehicle including a VIU (e.g., a VIU installed in the vehicle) performs automated driving functions at intelligence level S with the aid of the CADS (e.g., CAVH) and IRIS systems.
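The two-dimensional distribution of driving intelligence described above (vehicle intelligence V, infrastructure intelligence I, system intelligence S ≥ V) can be sketched as a small rule. The mapping below is a hypothetical illustration, not a formula given in the patent:

```python
def system_intelligence(v: int, i: int) -> int:
    """Hypothetical combination rule: CADS system intelligence S is at
    least the vehicle's own level V (per the description above) and is
    raised by infrastructure intelligence I, capped at level 5."""
    if not (0 <= v <= 5 and 0 <= i <= 5):
        raise ValueError("intelligence levels must be in 0..5")
    if i == 0:
        return v  # no infrastructure support: S equals V
    return min(5, max(v, v + i - 1))
```

Any monotone rule with S ≥ V and S ≤ 5 would serve equally well here; the point is only that S is a function of the (V, I) pair.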
As described herein, the technology provides VIUs designed to support automated driving functions, for example by supplementing, enhancing, backing up, upgrading, and/or replacing the vehicle's automated driving capabilities (e.g., as further described herein). Specifically, in some embodiments, the VIU provides supplementation, enhancement, and backup for the vehicle sensing, decision, and control functions of the CVCS. In some embodiments, the VIU increases the level of intelligence or automation defined by SAE. In some embodiments, the VIU functions partially or fully replace the CVCS functions to provide automated driving, for example in an emergency.
In some embodiments, the VIU supplements the vehicle sensing, decision, and/or control functions of the CVCS so that the sensing, decision, and/or control tasks of the vehicle can be completed.
In some embodiments, the VIU improves the automated driving functions (e.g., perception, prediction, planning, and control) that the CVCS provides to the vehicle.
In some embodiments, the VIU provides backup support for the functions of the CVCS, for example in the event of a failure of the CVCS and/or of CVCS components.
In some embodiments, the VIU promotes the vehicle's level of intelligence (e.g., as defined by SAE) from a lower level to a higher level.
In some embodiments, the VIU partially or fully replaces the CVCS, providing the vehicle with partially or fully automated driving.
Thus, in some embodiments, the technology provides a VIU that includes one or more of: an in-vehicle sensor access and information processing module, a communication module, an information conversion module, a perception fusion module, a collaborative decision module, a high-precision map and positioning module, an intelligent control instruction/assistance module, a redundancy verification module, a human-machine interaction module, and/or a support module. In some embodiments, the VIU is configured to be installed in a vehicle and to provide some or all of the vehicle's automated driving functionality.
In some embodiments, the in-vehicle sensor access and information processing module receives information collected by the in-vehicle sensors, processes that information, and/or replaces the information processing functions of the conventional vehicle control system (CVCS). In some embodiments, the in-vehicle sensor access and information processing module takes over the information processing functions of the CVCS when the CVCS fails, i.e., is not operating and/or is not operating properly.
In some embodiments, the communication module manages information interaction between the in-vehicle system and external systems, manages information interaction between the VIU and the CVCS, and manages communication among the VIU subsystems and/or VIU modules. In some embodiments, the information conversion module manages information interaction between the in-vehicle system and external systems. In some embodiments, the information conversion module includes a codebook and a communication protocol. In some embodiments, the information conversion module manages communication between entities having different data format standards and/or communication protocols. In some embodiments, the information conversion module manages communication among a vehicle equipped with a VIU, one or more other vehicles, the intelligent road infrastructure system (IRIS), and/or the collaborative management (CM) subsystem of the cooperative automated driving system (CADS).
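The codebook-based conversion described for this module can be sketched as a round-trip encoder; the message types, code words, and JSON payload format below are illustrative assumptions, not the patent's actual protocol:

```python
import json

# Hypothetical codebook shared by the VIU, IRIS, and the CADS
# collaborative management subsystem: message type -> 1-byte code word.
CODEBOOK = {"perception": 0x01, "decision": 0x02, "control": 0x03}
REVERSE_CODEBOOK = {code: name for name, code in CODEBOOK.items()}

def encode(msg_type: str, payload: dict) -> bytes:
    """Convert an internal message to a wire frame: type code + JSON body."""
    return bytes([CODEBOOK[msg_type]]) + json.dumps(payload).encode("utf-8")

def decode(frame: bytes) -> tuple:
    """Convert a wire frame back to (message type, payload)."""
    return REVERSE_CODEBOOK[frame[0]], json.loads(frame[1:].decode("utf-8"))
```

A round trip such as `decode(encode("perception", {"speed_mps": 12.5}))` recovers the original message, which is the property the conversion between entities with different data formats relies on.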
In some embodiments, the perception fusion module fuses perception information provided by the vehicle subsystem with perception information provided by external systems to produce fused perception information. In some embodiments, the perception fusion module outputs the fused perception information and/or self-awareness and environmental perception information to the collaborative decision module.
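A minimal sketch of fusing on-board and external (e.g., roadside) perception tracks follows; the track structure and the confidence-weighted averaging scheme are assumptions for illustration, not the patent's method:

```python
from dataclasses import dataclass

@dataclass
class Track:
    obj_id: str       # shared identifier for a perceived object
    x: float          # position estimate
    y: float
    confidence: float # detector confidence in [0, 1]

def fuse(vehicle: list, roadside: list) -> list:
    """Confidence-weighted fusion of vehicle and external perception
    tracks sharing an object ID; unmatched tracks pass through."""
    by_id = {t.obj_id: t for t in vehicle}
    fused = []
    for r in roadside:
        v = by_id.pop(r.obj_id, None)
        if v is None:
            fused.append(r)  # seen only by the roadside sensor
        else:
            w = v.confidence + r.confidence
            fused.append(Track(
                r.obj_id,
                (v.x * v.confidence + r.x * r.confidence) / w,
                (v.y * v.confidence + r.y * r.confidence) / w,
                max(v.confidence, r.confidence),
            ))
    fused.extend(by_id.values())  # seen only by the vehicle
    return fused
```

The pass-through of unmatched tracks is what lets external perception extend the vehicle's field of view beyond its own sensors.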
In some embodiments, the collaborative decision module receives the fused perception information and uses it to make decisions, plan paths, identify safety risks, and/or generate vehicle control instructions.
In some embodiments, the high-precision map and positioning module receives high-precision maps provided to the VIU by the CADS. In some embodiments, the high-precision map and positioning module provides positioning using high-precision maps, satellite navigation and satellite networks, Internet of Things devices, and/or geographic tags.
In some embodiments, the intelligent control instruction/assistance module coordinates the vehicle control output generated by the CVCS and the vehicle control output generated by the VIU to produce an integrated control instruction for controlling the vehicle. In some embodiments, the vehicle control output generated by the VIU (e.g., overall vehicle control instructions and/or integrated vehicle control instructions) is produced by the VIU decision module. In some embodiments, the intelligent control instruction/assistance module coordinates vehicle control outputs that are control instructions provided by the in-vehicle system and/or control instructions provided by external systems. In some embodiments, the redundancy verification module verifies control instructions to improve and/or maximize vehicle safety.
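The coordination of CVCS- and VIU-generated control outputs into one integrated instruction might look like the following sketch; the dictionary command format and the fixed blending weight are illustrative assumptions:

```python
def integrate_control(cvcs_cmd, viu_cmd, viu_weight=0.5):
    """Fuse CVCS and VIU control outputs (e.g., {"accel": ..., "steer": ...})
    into one integrated instruction. If the CVCS output is missing
    (e.g., CVCS failure), the VIU command is used alone."""
    if cvcs_cmd is None:
        return dict(viu_cmd)
    return {k: (1 - viu_weight) * cvcs_cmd[k] + viu_weight * viu_cmd[k]
            for k in viu_cmd}
```

A real arbiter would bound the result by actuator limits and vary the weight with each source's health, which is where the redundancy verification module described above would plug in.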
In some embodiments, the human-machine interaction module receives input from the driver and outputs information describing the vehicle's external environment and operating conditions. In some embodiments, the inputs include destination information, driving requirements, and/or control instructions. In some embodiments, the human-machine interaction module prompts the driver to take control of the vehicle.
In some embodiments, the support module provides power to the VIU subsystem and/or module and maintains system security.
In some embodiments, the VIU includes a combination of multiple modules, and the combination provides some or all of the automated driving functionality depending on the CVCS and the requirements of the driving mission.
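The idea that a combination of modules is activated according to the CVCS's capabilities and the driving mission reduces to a set difference; the module names follow the description above, while the always-on set and the selection policy are illustrative assumptions:

```python
# Always-on modules are an assumption: communication and support are
# treated here as required regardless of the mission.
ALWAYS_ON = {"communication", "support"}

def select_modules(cvcs_capabilities: set, mission_needs: set) -> set:
    """Activate the VIU modules that the driving mission requires but
    the CVCS does not already provide, plus the always-on modules."""
    return (mission_needs - cvcs_capabilities) | ALWAYS_ON
```

For example, a CVCS that already handles perception would leave only the decision-related modules (plus communication and support) active in the VIU.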
In some embodiments, the VIU is installed in a vehicle and is configured as a subsystem of the CADS. In some embodiments, the VIU implements and performs CADS functions for the vehicle. In some embodiments, the vehicle performs automated driving tasks at intelligence level 1, 2, 3, 4, and/or 5.
In some embodiments, the information conversion module manages information interaction between the CADS and the vehicle.
In some embodiments, the CADS receives and processes perception data describing the vehicle and its driving environment, and provides driving control instructions for the vehicle.
In some embodiments, a VIU installed in a vehicle of intelligence level 1, 2, 3, 4, or 5 is capable of cooperating with an IRIS of intelligence level 1, 2, 3, 4, or 5 to provide a CADS of intelligence level 1, 2, 3, 4, or 5.
In some embodiments, the VIU is configured to assist multiple vehicles with different levels of intelligence, different brands and/or manufacturers, different models and corresponding model years, and/or different platforms in performing cooperative automated driving tasks.
In some embodiments, the communication module provides both wired and wireless communication. In some embodiments, the communication module provides information sharing and information interaction among the vehicle containing the VIU, the collaborative management system of the CADS, the IRIS or an IRIS subsystem, and other vehicles. In some embodiments, the IRIS subsystem is an intelligent roadside unit (RIU) or an intelligent roadside toolbox (IRT). In some embodiments, the communication module communicates using 4G, 5G, 6G, or 7G cellular networks; dedicated short-range communication (DSRC) technology; and/or C-V2X technology. In some embodiments, the communication module interacts with the CADS, the IRIS or an IRIS subsystem, and/or the collaborative management systems of other vehicles through the information conversion module.
In some embodiments, the VIU communicates with the CADS, the IRIS or an IRIS subsystem, and/or the collaborative management systems of other vehicles to provide communication for automated driving tasks. In some embodiments, the information conversion module provides an information encoding function that encodes automated driving task data and information using a codebook. In some embodiments, the information conversion module provides an information interaction function that communicates driving demands, driving information, driving environment information, and/or real-time automated driving status to the collaborative management system of the CADS, and receives data and information from other modules for use in the VIU's perception data fusion and collaborative decisions.
In some embodiments, the perception fusion module receives perception data and information from the vehicle and external systems, performs data fusion on the perception data and information, and provides a perception function. In some embodiments, the perception data and information from the vehicle and external systems include high-definition (HD) map information, traffic information, driving information from surrounding vehicles, route planning information, and/or driving decision instructions. In some embodiments, the perception fusion module obtains resources from external systems to provide enhanced perception functionality for the vehicle. In some embodiments, the enhanced perception functionality supports longitudinal and/or lateral trajectory planning and control of an intelligence level 1 vehicle. In some embodiments, the perception fusion module sends information to and/or obtains resources from the CADS to provide supplemental perception functionality to the vehicle. In some embodiments, the perception fusion module sends information to the collaborative decision module of the VIU. In some embodiments, supplemental perception functionality is provided to an intelligence level 2 vehicle. In some embodiments, the perception fusion module assists an intelligence level 3 vehicle in taking over driving decisions from the human driver. In some embodiments, the perception fusion module obtains resources from the CADS system and provides additional perception and real-time monitoring of the driver. In some embodiments, the perception fusion module sends information to the collaborative decision module of the VIU, provides perception results to the VIU, and determines whether the VIU uses the perception results to take over driving decisions from the human driver.
In some embodiments, the perception fusion module supports operation of an intelligence level 4 vehicle in long-tail scenarios by obtaining resources from the CADS system and providing perception information to address long-tail operational design domain (ODD) risks. In some embodiments, the perception fusion module supports operation of an intelligence level 5 vehicle by providing improved dynamic HD maps, a greater range of environmental perception, route planning information, driving decisions, and improved perception. In some embodiments, the VIU reduces development time and cost for intelligence level 5 vehicles.
In some embodiments, the collaborative decision module cooperates with the CADS to generate fusion results and collaborative decision instructions. In some embodiments, the CADS provides external perception, decision-making, and vehicle control information and functions. In some embodiments, the collaborative decision module generates longitudinal and/or lateral supporting vehicle control decisions to provide partially automated driving to an intelligence level 1 vehicle. In some embodiments, the collaborative decision module provides trajectory planning decisions and detailed driving decisions, and sends driver takeover decision information for an intelligence level 2 vehicle. In some embodiments, the collaborative decision module cooperates with external systems to generate decisions for an intelligence level 3 vehicle. In some embodiments, such decisions take over from the driver's decisions. In some embodiments, the VIU responds to a request from the vehicle CVCS to take over from the human driver and generates vehicle control instructions. In some embodiments, the CVCS uses the perception fusion results to decide to request a takeover from the human driver. In some embodiments, when the VIU determines that it cannot take over the human driver's decisions, it prompts the driver to assume control of the vehicle; the VIU monitors the state and/or driving behavior of the human driver, responds to emergencies, and/or provides vehicle control to assist the human driver. In some embodiments, the collaborative decision module cooperates with external systems to generate decisions that address long-tail scenarios for an intelligence level 4 vehicle. In some embodiments, the collaborative decision module receives resources from the CADS to increase the safety of driving decisions. In some embodiments, the collaborative decision module receives resources from the CADS to reduce long-tail risk and extend the ODD.
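The level-3 takeover flow described above (the CVCS requests a takeover, and the VIU either assumes control or prompts the human driver) reduces to a small decision function; the state names are illustrative assumptions:

```python
def takeover_decision(cvcs_requests_takeover: bool, viu_can_take_over: bool) -> str:
    """Sketch of the collaborative takeover flow for an intelligence
    level 3 vehicle, per the description above."""
    if not cvcs_requests_takeover:
        return "cvcs_drives"          # normal operation, no takeover requested
    if viu_can_take_over:
        return "viu_takes_over"       # VIU generates vehicle control instructions
    return "prompt_human_driver"      # VIU prompts, monitors, and assists the driver
```

In the third branch the VIU continues to monitor driver state and respond to emergencies, as the description notes, rather than simply relinquishing control.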
In some embodiments, the collaborative decision module improves predictive decision-making and trajectory planning based on the perception results of an intelligence level 5 vehicle.
In some embodiments, the intelligent control instruction/assistance module is configured to fuse VIU decision instructions and CVCS decision instructions. In some embodiments, the VIU is configured to extend some or all of the CADS automated driving functionality to a vehicle equipped with the VIU by executing CADS system instructions. In some embodiments, the VIU is configured to provide road and traffic information to the vehicle in which it is installed. In some embodiments, the VIU is configured to provide positioning and navigation requirements to the system map of the CADS while sending origin and destination information to the CADS. In some embodiments, the VIU is configured to send perception information to, and share it with, the CADS when the vehicle in which the VIU is installed is connected to the CADS. In some embodiments, the perception information is shared by the CADS and users of the CADS. In some embodiments, users of the CADS include cloud platforms, IRIS subsystems, roadside infrastructure, communication devices, and vehicles equipped with a VIU and connected to the CADS.
In some embodiments, the VIU is configured to supplement, enhance, back up, upgrade, and/or replace the automated driving functionality of the vehicle's CVCS. In some embodiments, the VIU cooperates with the vehicle's CVCS to supplement, enhance, back up, upgrade, and/or replace the automated driving functionality of the CVCS. In some embodiments, the VIU is configured to supplement, enhance, back up, upgrade, and/or replace the automated driving functionality of a vehicle of intelligence level 1, 2, 3, 4, or 5 driving on a road of intelligence level 0, 1, 2, 3, 4, or 5. In some embodiments, the VIU is configured to supplement the automated driving functionality of the CVCS to provide automated driving in long-tail scenarios including accidents, incidents, construction and/or work zones, extreme and/or inclement weather, dangerous roads, unclear road markings, signs, and/or geometric designs, and/or dense pedestrian and/or bicycle traffic.
In some embodiments, the perception fusion module and collaborative decision module of the VIU supplement the automated driving functionality of the CVCS with perception information, decisions, and vehicle control instructions provided by the CADS, CADS subsystems, IRIS, RIU, IRT, and/or roadside infrastructure. In some embodiments, the VIU is configured to perform a method for enhancing the sensing, prediction, planning, and control functions of the CVCS, the method comprising: fusing data and information with the perception fusion module of the VIU to enhance the perception and prediction capability of the CVCS; enhancing the planning capability of the CVCS through cooperation between the collaborative decision module of the VIU and the CADS; and fusing instructions from the VIU and the CVCS with the intelligent control instruction/assistance module of the VIU to generate an integrated control instruction, thereby enhancing the control capability of the CVCS.
In some embodiments, the redundancy verification module eliminates and/or minimizes errors and resolves conflicts in information processing and transmission. In some embodiments, the redundancy verification module eliminates and/or minimizes errors in, resolves conflicts in, and/or verifies perception information, decisions, and control instructions provided by the in-vehicle system and external systems; driving decisions generated by the CVCS; and/or driving decisions generated by the VIU. In some embodiments, the VIU is configured to cooperate with the vehicle CVCS to provide automated driving of the vehicle, wherein the CVCS generates driving decisions and control instructions, the VIU generates driving decisions and control instructions, and the VIU fuses the driving decisions and/or control instructions from the CVCS and the VIU. In some embodiments, the VIU also provides a redundant on-board unit for the vehicle to provide stable automated driving functionality. In some embodiments, the CVCS generates driving decisions and control instructions in response to unexpected traffic conditions.
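A minimal sketch of the redundancy check on candidate control values from the CVCS, the VIU, and external systems follows; the tolerance threshold and median-based conflict resolution are assumptions, not the patent's verification method:

```python
def verify_redundant(candidates, tolerance=0.5):
    """Return (consistent, resolved_value): the candidate control values
    are consistent if they agree within `tolerance`; conflicts are
    resolved by taking the median, which discards a single faulty
    outlier among three or more redundant sources."""
    ordered = sorted(candidates)
    consistent = (ordered[-1] - ordered[0]) <= tolerance
    return consistent, ordered[len(ordered) // 2]
```

With three redundant sources this behaves like simple majority voting on continuous values: one disagreeing source is flagged and outvoted rather than propagated to the actuators.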
In some embodiments, the VIU cooperates with the CADS or its subsystems, IRIS, RIU, IRT, and/or road side infrastructure to generate driving decisions and control instructions.
In some embodiments, where a module in the CVCS fails or malfunctions, a corresponding module in the VIU system replaces the failed module in the CVCS. In some embodiments, the VIU is configured to increase the level of vehicle intelligence by enhancing the autonomous driving function of the vehicle using a sensory fusion module and a collaborative decision-making module in the VIU.
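A hypothetical sketch of the module-level replacement described above: each function is routed to its CVCS module and falls back to the corresponding VIU module when a fault is reported. The registry design below is an illustrative assumption, not the patent's implementation:

```python
class ModuleFailover:
    """Route each driving function to its CVCS module, falling back to the
    corresponding VIU module when the CVCS module reports a fault."""

    def __init__(self):
        self.primary = {}    # function name -> CVCS callable
        self.backup = {}     # function name -> VIU callable
        self.failed = set()  # functions whose CVCS module has failed

    def register(self, name, cvcs_impl, viu_impl):
        self.primary[name] = cvcs_impl
        self.backup[name] = viu_impl

    def mark_failed(self, name):
        self.failed.add(name)

    def call(self, name, *args):
        impl = self.backup[name] if name in self.failed else self.primary[name]
        return impl(*args)

fo = ModuleFailover()
fo.register("plan", lambda task: ("cvcs", task), lambda task: ("viu", task))
before = fo.call("plan", "lane-keep")   # handled by the CVCS module
fo.mark_failed("plan")                  # CVCS planning module fails
after = fo.call("plan", "lane-keep")    # VIU counterpart takes over
```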
In some embodiments, the VIU promotes the level of intelligence of the vehicle having level 1 of intelligence to level 2, 3, 4, or 5 of intelligence.
In some embodiments, the VIU promotes the level of intelligence of the vehicle having level 2 of intelligence to level 3, 4, or 5 of intelligence.
In some embodiments, the VIU promotes the level of intelligence of the vehicle having level 3 of intelligence to level 4 or 5 of intelligence.
In some embodiments, the VIU promotes the level of intelligence of the vehicle having level 4 of intelligence to level 5 of intelligence.
In some embodiments, the VIU increases the safety level of a vehicle with a level of intelligence and/or reduces the cost of the vehicle.
In some embodiments, the VIU is configured to take over some or all of the autopilot tasks of the CVCS when the CVCS fails or malfunctions. In some embodiments, the vehicle information access and processing module generates and transmits perception information to the perception fusion module. In some embodiments, the communication module and the information conversion module receive external information and send the external information to the perception fusion module. In some embodiments, the perception fusion module generates and sends the perception fusion result to the collaborative decision module. In some embodiments, the collaborative decision module generates and sends decision instructions to the intelligent control instruction/auxiliary module to generate integrated control instructions for vehicle driving tasks.
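The data flow described above (vehicle information access, external information reception, perception fusion, collaborative decision, and integrated control instruction generation) can be sketched as a chain of stages. All message fields, units, and the minimum-speed decision rule below are illustrative assumptions:

```python
def access_vehicle_info(raw):
    """Vehicle information access and processing: normalize sensor readings."""
    return {"speed": raw["speed_kmh"] / 3.6, "gap": raw["gap_m"]}

def receive_external(msg):
    """Communication + information conversion: decode an external message."""
    return {"advisory_speed": msg["advisory_kmh"] / 3.6}

def perception_fusion(onboard, external):
    """Merge on-board and external perception into one fusion result."""
    return {**onboard, **external}

def collaborative_decision(fused):
    """Decide a target speed (m/s) from the fused perception."""
    return min(fused["speed"], fused["advisory_speed"])

def intelligent_control(target, current):
    """Emit an integrated control instruction: signed speed change (m/s)."""
    return target - current

onboard = access_vehicle_info({"speed_kmh": 72.0, "gap_m": 40.0})
external = receive_external({"advisory_kmh": 54.0})
fused = perception_fusion(onboard, external)
target = collaborative_decision(fused)
delta = intelligent_control(target, onboard["speed"])
```

The chain yields a target of 15 m/s and a deceleration instruction of 5 m/s, mirroring the module sequence in the paragraph above.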
In some embodiments, the VIU is configured to form parallel, sequential, and cross architectural relationships with the CVCS for information processing. In some embodiments, information processing includes perception fusion, intelligent decision making, and integrated vehicle control. In some embodiments, the parallel, sequential, and cross architectural relationships formed with the CVCS for information processing consist of integrated and/or fused functional modules of the VIU and CVCS. In some embodiments, the VIU and CVCS share information, data, and/or resources to provide the VIU supplementing, enhancing, backing up, boosting, and replacing functions.
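The parallel and sequential architectural relationships can be sketched abstractly: sequential processing pipes one stage's output into the next, while parallel processing runs both stages on the same input and fuses their outputs (a cross architecture would mix the two). The stand-in stages and the `sum` fusion below are purely illustrative:

```python
def sequential(stages, x):
    """Sequential architecture: one stage (e.g., VIU) feeds the next (e.g., CVCS)."""
    for stage in stages:
        x = stage(x)
    return x

def parallel(stages, x, combine):
    """Parallel architecture: the stages process the same input independently,
    then a fusion step combines their outputs."""
    return combine([stage(x) for stage in stages])

viu_stage = lambda v: v + 1   # stand-in for VIU processing
cvcs_stage = lambda v: v * 2  # stand-in for CVCS processing

seq = sequential([viu_stage, cvcs_stage], 3)             # (3 + 1) * 2
par = parallel([viu_stage, cvcs_stage], 3, combine=sum)  # (3 + 1) + (3 * 2)
```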
Based on the embodiments encompassed by this patent, one of ordinary skill in the relevant art will be able to derive and practice other embodiments.
Drawings
These and other features, aspects, and advantages of the present technology will become better understood with regard to the following drawings.
FIG. 1 illustrates a schematic diagram of an overview of a functional architecture of a coordinated autopilot system (CADS) including a CAV, where the CAV includes a VIU (112). 101: driver; 102: vehicle-mounted system; 103: external system; 104: perception layer; 105: decision control layer; 106: execution layer; 107: information interaction and decision; 108: man-machine interaction; 109: self-cognition and environmental perception; 110: comprehensive control instruction; 111: conventional vehicle control system; 112: vehicle-mounted intelligent unit; 113: other vehicles; 114: road system; 115: collaborative management system; 116: support system; 117: communication; 118: perception and fusion; 119: collaborative decision-making; 120: intelligent control.
Fig. 2A shows a schematic diagram of physical and/or functional subsystems and modules in a vehicle intelligent unit. 201: vehicle intelligent unit; 202: interaction subsystem; 203: perception subsystem; 204: decision subsystem; 205: supplement subsystem; 206: man-machine interaction module; 207: information conversion module; 208: communication module; 209: vehicle-mounted sensor access and information processing module; 210: perception fusion module; 211: high-precision map and positioning module; 212: intelligent control instruction/auxiliary module; 213: collaborative decision-making module; 214: support module; 215: redundancy verification module.
Fig. 2B illustrates a schematic diagram of exemplary logical connections between and exemplary data flows through subsystems and/or modules of an embodiment of a VIU.
Fig. 2C illustrates a schematic diagram of an embodiment of a VIU including an exemplary sub-combination of the subsystems and/or modules illustrated in fig. 2A.
Fig. 2D illustrates a schematic diagram of an embodiment of a VIU including an exemplary sub-combination of the subsystems and/or modules illustrated in fig. 2A.
Fig. 2E illustrates a schematic diagram of an embodiment of a VIU including an exemplary sub-combination of the subsystems and/or modules illustrated in fig. 2A.
FIG. 3 shows a schematic of the diversity of CAV manufacturers, brands, series, years, and platforms; the diversity of operational design domains; and the diversity of vehicle intelligence levels that use and/or interact with embodiments of the VIU technology described herein. 301: CAV; 302: intelligence level; 303: various operational design domains; 304: various manufacturers; 305: various brands; 306: various series; 307: various years; 308: various platforms; 309: V1; 310: V2; 311: V3; 312: V4; 313: V5.
Fig. 4 shows a schematic diagram of information processing using an information conversion module. 401: information conversion module; 402: decoding; 403: codebook; 404: encoding; 405: collaboration management system; 406: information interaction (e.g., information exchange); 407: other modules; 408: CAV.
FIG. 5 illustrates a flow chart of an embodiment of a perception and fusion method for vehicles with different levels of intelligence, including evaluating vehicle-centric perception and tasks performed by the collaborative management system of the CADS, a fusion module in the CVCS, and/or the perception fusion module of the VIU.
Fig. 6 illustrates a flow chart of an embodiment of a collaborative decision method, including tasks performed by a collaborative decision module in a CVCS and a collaborative decision module of a VIU.
FIG. 7 illustrates a schematic diagram of an embodiment of a redundancy verification technique. 701: VIU; 702: perception layer; 703: decision layer; 704: control layer; 705: infrastructure; 706: other systems; 707: redundancy verification module; 708: collaborative decision-making module; 709: perception fusion module; 710: high-precision map positioning and identification module; 711: vehicle-mounted sensor access and information processing module; 712: communication module; 713: information conversion module; 714: intelligent control instruction/auxiliary module; 715: data flow: sensing information; 716: data flow: information/decision/control instructions; 717: data flow: comprehensive control instruction; 718: data flow: control decisions/instructions; 719: data flow: sensing results from the vehicle-mounted sensor access and information processing module; 720: data flow: positioning request sent to the cloud (external system) through the high-precision map and positioning module; 721: data flow: high-definition positioning information obtained from the cloud (external system) through the high-precision map and positioning module; 722: data flow: sensing information from on-board and external systems; 723: data flow: perception fusion result; 724: data flow: fusion results and decisions from external systems; 725: data flow: decision instruction of the VIU; 726: data flow: control instructions from on-board and external systems; 727: data flow: comprehensive control instruction; 728: Conventional Vehicle Control System (CVCS).
Fig. 8 illustrates a flow chart of the intelligent control command/assistance module functions for combining decision and control commands from the VIU and other systems.
Fig. 9 illustrates a flow chart of an autopilot function provided to a vehicle including a VIU connected to CADS.
Fig. 10 is a schematic diagram depicting information uploaded by a VIU (1001) and shared with users of the CADS system (e.g., cloud platform 1002, roadside infrastructure 1003, communication device 1004, and other vehicles equipped with a VIU and connected to the CADS system 1005).
Fig. 11 illustrates a flow chart of a method for dividing tasks and/or providing collaboration between a CVCS and a VIU.
Fig. 12 illustrates a flow chart of a method of providing automated driving to a vehicle by modules and sub-modules of a VIU when a CVCS fails.
Fig. 13 illustrates a flow chart of a method of the VIU taking over CVCS functions to provide autopilot tasks to the vehicle when the CVCS fails.
Fig. 14 illustrates a flow chart of an exemplary method for the VIU supplement function.
Fig. 15 illustrates a flow chart of an exemplary method for VIU enhancement.
Fig. 16 illustrates a flow chart of an exemplary method of the VIU boost function.
Fig. 17A illustrates a flow chart of an embodiment of a collaboration method for autopilot, wherein a VIU and a CVCS cooperate to provide information processing using a sequential information processing architecture.
Fig. 17B illustrates a flow chart of an embodiment of a collaboration method for autopilot, wherein a VIU and a CVCS cooperate to provide information processing using an information processing architecture that combines parallel, sequential, and cross information processing.
It should be understood that the drawings are not necessarily drawn to scale and that the objects in the drawings are not necessarily drawn to scale relative to one another. The accompanying drawings are included to provide a clear illustration and understanding of the various embodiments of the devices, systems, and methods disclosed herein. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. Furthermore, it should be understood that the drawings are not intended to limit the scope of the application in any way.
Detailed Description
Provided herein are technologies relating to autopilot, particularly, but not limited to, technologies relating to a Vehicle Intelligent Unit (VIU) configured to assist a Connected Automated Vehicle (CAV) in overall vehicle operation and control, and more particularly, to a VIU configured to interface with a coordinated autopilot system (CADS) and assist the CAV in managing and/or controlling its lateral and longitudinal movements, including vehicle following, lane changing, route guidance, and related information. The VIU is designed to be installed on a vehicle and is capable of performing some or all of the autopilot functions and/or enhancing the vehicle's level of autopilot intelligence. In some embodiments, the VIU provides interfaces that allow the vehicle to access and/or interact with one or more coordinated autopilot systems (CADS) and their components, including, for example, passengers and drivers, connected automated vehicles (CAVs), connected automated highways (CAHs), communication systems, collaborative management subsystems, and/or support subsystems. In some embodiments, the VIU works with a Conventional Vehicle Control System (CVCS) of the vehicle to perform some or all of the autopilot tasks. In some embodiments, the VIU provides supplementing, enhancing, backup, boosting, and replacing functions for the vehicle's autopilot tasks, allowing the vehicle to perform the autopilot tasks.
In the detailed description of the various embodiments, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, it will be understood by those skilled in the art that the various embodiments may be practiced with or without these specific details. In other instances, structures and devices are shown in block diagram form. Moreover, those skilled in the art will readily appreciate that the specific sequences of presentation and execution of the methods are illustrative, and that it is contemplated that the sequences may be varied and still remain within the spirit and scope of the various embodiments disclosed herein.
All documents and similar materials cited in this application, including but not limited to patents, patent applications, articles, books, treatises, and internet web pages, are expressly incorporated by reference in their entirety for any purpose. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the various embodiments described herein belong. When the definitions of terms in the incorporated references appear to be different from those provided in the present teachings, the definitions provided in the present teachings control. The section headings used herein are for organizational purposes only and are not to be construed as limiting the subject matter described in any way.
Definition of the definition
To facilitate an understanding of the present technology, some terms and phrases are defined below. Additional definitions are set forth throughout the detailed description.
Throughout the specification and claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise. The phrase "in one embodiment" as used herein does not necessarily refer to the same embodiment, although it may. Furthermore, the phrase "in another embodiment" as used herein does not necessarily refer to a different embodiment, although it may. Accordingly, as described below, various embodiments of the present invention may be readily combined without departing from the scope or spirit of the present invention.
Furthermore, as used herein, the term "or" is an inclusive "or" operator and is equivalent to the term "and/or" unless the context clearly dictates otherwise. The term "based on" is not exclusive and allows for being based on other factors not described, unless the context clearly dictates otherwise. Furthermore, throughout the specification, the meaning of "a", "an", and "the" includes plural references. The meaning of "in" includes "in" and "on".
As used herein, the terms "about," "approximately," "substantially," and "significantly" are understood by those of ordinary skill in the art and will vary to some extent depending on the context in which they are used. If uses of these terms are not clear to one of ordinary skill in the art given the context, "about" and "approximately" mean plus or minus less than or equal to 10% of the particular term, and "substantially" and "significantly" mean plus or minus greater than 10% of the particular term.
As used herein, the disclosure of a range includes all values disclosed and further divided ranges across the range, including the endpoints and sub-ranges given for the range.
As used herein, the suffix "-free" refers to an embodiment of the technology that omits the feature named by the base word to which "-free" is appended. That is, the term "X-free" as used herein means "without X," where X is a feature of the technology omitted in the "X-free" technology. For example, a "calcium-free" composition does not include calcium, a "mixing-free" process does not include a mixing step, etc.
Although the terms "first," "second," "third," etc. may be used herein to describe various steps, elements, components, regions, layers and/or sections, these steps, elements, components, regions, layers and/or sections should not be limited by these terms unless otherwise indicated. These terms are used to distinguish one step, element, component, region, layer or section from another step, element, component, region, layer or section. Terms such as "first," "second," and other numerical terms used herein do not imply a sequence or order unless clearly indicated by the context. Thus, a first step, element, composition, component, region, layer or section discussed herein could be termed a second step, element, composition, component, region, layer or section without departing from the teachings.
As used herein, the terms "present" or "absent" (or "presence" or "absence") are used in a relative sense to describe the number or level of a particular entity (e.g., component, action, element). For example, when an entity is referred to as "present," the level or number of the entity is above a predetermined threshold; conversely, when an entity is referred to as "absent," the level or number of the entity is below a predetermined threshold. The predetermined threshold may be a detectability threshold associated with a particular test for detecting the entity, or any other threshold. An entity that is "detected" is "present"; an entity that is "not detected" is "absent".
As used herein, "increase" or "decrease" refers to a detectable (e.g., measured) positive or negative change in the variable value relative to a previously measured variable value, relative to a predetermined value, and/or relative to a standard control value, respectively. The increase is a positive change with respect to the previously measured variable value, the predetermined value and/or the value of the standard control, preferably at least 10%, more preferably 50%, still more preferably 2-fold, even more preferably at least 5-fold, and most preferably at least 10-fold. Similarly, the decrease is a negative change, preferably at least 10%, more preferably 50%, still more preferably at least 80%, and most preferably at least 90% of the previously measured variable value, predetermined value and/or standard control value. Other terms indicating quantitative changes or differences, such as "more" or "less", are used herein in the same manner as described above.
As used herein, the term "number" shall refer to one or an integer greater than one (e.g., a plurality).
As used herein, a "system" refers to a plurality of real and/or abstract components that operate together for a common purpose. In some embodiments, a "system" is an integrated collection of hardware and/or software components. In some embodiments, each component of the system interacts with and/or is associated with one or more other components. In some embodiments, the system refers to a combination of components and software for controlling and directing the method. For example, a "system" or "subsystem" may include one or more or any combination of the following: a mechanical device, hardware, a hardware component, a circuit, a logic design, a logic component, software, a software module, a software component or software module, a software process, a software instruction, a software routine that performs a system or subsystem function, a software object, a software function, a software class, a software program, a file containing software, etc. Thus, the methods and apparatus of the embodiments, or certain aspects or portions thereof, may take the form of program code (e.g., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, flash memory, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the embodiments. In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (e.g., volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the embodiments, e.g., through the use of Application Programming Interfaces (APIs), reusable controls, and the like.
Such programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
As used herein, the term "long tail" scenario, event, environment, etc., refers to a scenario, event, environment, etc., that occurs at a low frequency and/or that is predicted to occur with a low probability. Exemplary long tail scenarios, events, and/or environments include, but are not limited to, vehicle accidents; special events (e.g., sporting events, emergency evacuations, etc.); construction zones and/or work zones; extreme and/or adverse weather (e.g., snowstorms, road icing, heavy rain, etc.); dangerous roads (e.g., animal crossings, bumpy roads, gravel, rugged edges, uneven expansion joints, slippery surfaces, standing water, debris, uphill and downhill grades, sharp turns, absence of guardrails, narrow roads, narrow bridges, etc.); road signs, markings, and/or geometric design; and high pedestrian and/or bicycle density.
As used herein, the term "automated driving system" (abbreviated "ADS") refers to a system that performs driving tasks for a vehicle (e.g., lateral and longitudinal control of the vehicle) to allow the vehicle to drive with reduced and/or no human control of the driving tasks.
As used herein, the term "coordinated autopilot system" (abbreviated as "CADS") refers to an autopilot system that performs driving tasks (e.g., lateral and/or longitudinal control) for a vehicle in full or partial coordination with roadway infrastructure (e.g., IRIS). Thus, CADS allows the vehicle to drive with reduced and/or no manual control of the driving task. See U.S. Pat. App. Ser. No. 63/149,804 for details, which is incorporated herein by reference.
As used herein, "integrated control instruction" refers to a vehicle control instruction generated by a cooperative process between a vehicle and the CADS and/or CADS components. In some embodiments, the term "integrated control instruction" refers to a vehicle control instruction generated by a cooperative process between the VIU and the CVCS, in which the VIU-generated control instruction and the CVCS-generated control instruction are fused and/or validated to generate an integrated control instruction for providing control of the vehicle, e.g., by an actuator.
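As an illustrative sketch of the fuse-and-validate step in this definition, the VIU-generated and CVCS-generated commands might be combined (here by simple averaging, an assumption) and then validated against an actuator limit before being issued:

```python
def fuse_and_validate(viu_cmd: float, cvcs_cmd: float, limit: float) -> float:
    """Fuse a VIU command and a CVCS command (m/s^2), then clamp the result
    to the actuator limit so the integrated instruction is always executable."""
    fused = (viu_cmd + cvcs_cmd) / 2.0
    return max(-limit, min(limit, fused))

# Hypothetical acceleration commands; the fused value (1.5) exceeds the
# actuator limit, so validation clamps it to 1.2 m/s^2.
accel = fuse_and_validate(viu_cmd=1.0, cvcs_cmd=2.0, limit=1.2)
```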
As used herein, the term "operational design domain" (abbreviated "ODD") refers to the operating conditions under which a particular automated driving system and/or feature thereof is specifically designed to function, including, but not limited to, features and/or limitations related to environmental, geographic, and/or temporal factors, and/or related to the presence or absence of certain traffic or road features. In some embodiments, the ODD is defined by SAE international standard J3016, "classification and definitions of terms related to road motor vehicle driving automation systems" (J3016_201606), which is incorporated herein by reference.
As used herein, the term "intelligent networked traffic system" ("CAVH system") refers to an integrated system (e.g., an ADS or CADS) that provides full vehicle operation and control for Connected and Automated Vehicles (CAVs), and more specifically, controls CAVs by sending them detailed, time-sensitive control instructions, including vehicle following, lane changing, route guidance, and related information. The CAVH system includes sensing, communication, and control components that are connected by segments and nodes that manage the overall transport system. The CAVH system includes four control levels: a vehicle; roadside units (RSUs), which in some embodiments are similar or identical to Roadside Intelligent Units (RIUs); Traffic Control Units (TCUs); and Traffic Control Centers (TCCs). See U.S. Pat. Nos. 10,380,886; 10,867,512; and/or 10,692,365 for details.
As used herein, the term "intelligent roadway infrastructure system" ("IRIS") refers to a system that facilitates vehicle operation and control of a CAVH system. See U.S. Pat. Nos. 10,867,512 and/or 10,692,365, each of which is incorporated herein by reference. In some embodiments, IRIS provides transport management and operation for Connected and Automated Vehicles (CAVs) as well as individual vehicle control. For example, in some embodiments, IRIS provides a system for controlling CAVs by sending customized, detailed, and time-sensitive control instructions and traffic information to individual vehicles for automated vehicle driving (e.g., vehicle following, lane changing, route guidance, and/or other related information).
As used herein, an "intelligent roadside kit" ("IRT") refers to a system for vehicle operation and control of a Distributed Driving System (DDS), which is one of the IRIS technologies. In some embodiments, the IRT provides modular access to CAVH and IRIS technologies (e.g., services) based on the autopilot needs of a particular vehicle. See U.S. Pat. App. Ser. Nos. 17/192,529 and 16/996,684, each of which is incorporated herein by reference. In some embodiments, the IRT provides the vehicle with individually customized information and real-time control instructions for the vehicle to perform driving tasks (e.g., vehicle following, lane changing, and/or route guidance).
As used herein, the term "GPS" refers to a Global Navigation Satellite System (GNSS) that provides geographic location and time information to a receiver. Examples of global navigation satellite systems include, but are not limited to, the Global Positioning System developed by the United States, the Differential Global Positioning System (DGPS), the BeiDou Navigation Satellite System (BDS), the GLONASS global navigation satellite system, the European Union Galileo positioning system, the Indian navigation system, and the Quasi-Zenith Satellite System of Japan.
As used herein, the term "vehicle" refers to any type of powered transportation device, including, but not limited to, an automobile, truck, bus, motorcycle, or boat. The vehicle may be controlled by an operator, or it may be unmanned and operated remotely or otherwise autonomously, e.g., using control devices other than a steering wheel, gear shift, brake pedal, and accelerator pedal.
As used herein, the term "autonomous vehicle" (abbreviated "AV") refers to an automated vehicle operating in an automated mode, e.g., at any level of automation (e.g., as defined by SAE international standard J3016, "classification and definitions of terms related to road motor vehicle driving automation systems," published 2014 (J3016_201401), revised 2016 (J3016_201609), and revised 2018 (J3016_201806), each of which is incorporated herein by reference).
As used herein, the term "road system" refers to roads and road infrastructures, such as intelligent road infrastructures (e.g., IRIS, IRT, RIU/RSU), road signs, road markings, traffic control devices (e.g., traffic signal controllers); and/or a traditional traffic operation center.
As used herein, the term "external system" refers to a CADS and/or any component or portion of a CADS (e.g., a CAVH system) that is separate from and/or external to the particular vehicle (e.g., a vehicle containing a VIU) being referenced. Some example external systems include, for example, other vehicles, road systems, CMS, support systems (e.g., power and security systems), clouds, edge computing devices, maps, and/or positioning devices (e.g., reference marks, DGPS base stations).
As used herein, the term "actuator" refers to a vehicle component that moves or controls a vehicle mechanical component in response to an electrical or logical (e.g., digital) signal. For example, an actuator may receive a control signal and generate mechanical movement of a vehicle component that results in vehicle acceleration, vehicle deceleration, vehicle braking, and/or vehicle turning and/or steering. An actuator may act on a switch to activate and/or deactivate electrical and/or electronic components. For example, an actuator may receive a control signal and activate a switch to blink the turn signal.
As used herein, "allocate," "allocating," and similar terms refer to allocating resources, including assigning, scheduling, providing, managing, distributing, controlling, and/or coordinating resources.
As used herein, the term "resource" refers to computing capability (e.g., computing power, computing cycles, etc.); memory and/or data storage capacity; sensing capability; communication capability (e.g., bandwidth, signal strength, signal fidelity, etc.); and/or power.
As used herein, the term "service" refers to a process, a function to perform a process, and/or a component or module configured to provide a function to perform a process.
As used herein, the term "adapter" refers to an interface that connects two components, systems, subsystems, modules, etc. In some embodiments, an adapter provides communication between two components, systems, subsystems, modules (e.g., for exchanging data, instructions, and/or information between the two components, systems, subsystems, modules). In some embodiments, an adapter provides a translation service for converting a first data format output by a first component, system, subsystem, or module to a second data format output for use by a second component, system, subsystem, or module. In some embodiments, an "adapter" defines the type of request that can be issued; the type of response that can be made to the request; how to make requests and respond to requests; data formats for requests, request responses, and data exchanges; and/or other conventions defining the interaction of two components, systems, subsystems, modules, etc.
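A minimal sketch of an adapter providing the translation service described in this definition; the message fields and the km/h-to-m/s unit conversion are hypothetical examples, not interfaces defined by this disclosure:

```python
class Adapter:
    """Interface between two modules: translates the first module's output
    format into the format the second module consumes."""

    def __init__(self, translate):
        self.translate = translate  # conversion function between formats

    def forward(self, message, consumer):
        """Translate a message, then hand it to the consuming module."""
        return consumer(self.translate(message))

# Hypothetical formats: module A emits speed in km/h, module B expects m/s.
kmh_to_ms = Adapter(lambda msg: {"speed_ms": msg["speed_kmh"] / 3.6})
result = kmh_to_ms.forward({"speed_kmh": 90.0},
                           lambda m: round(m["speed_ms"], 1))
```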
As used herein, the term "connected vehicle" or "CV" refers to a connected vehicle, e.g., configured for any communication level (e.g., V2V, V2I and/or I2V).
As used herein, the term "connected autonomous vehicle" or "CAV" refers to an autonomous vehicle capable of communicating with other vehicles (e.g., through V2V communication), roadside Intelligent Units (RIUs), traffic control signals, and/or other infrastructure (e.g., ADS or components thereof) or devices. That is, the term "connected autonomous vehicle" or "CAV" refers to a connected autonomous vehicle having any level of automation (e.g., as defined by SAE international standard J3016 (2014)) and communications (e.g., V2V, V2I and/or I2V).
As used herein, the term "data fusion" refers to integrating multiple data sources to provide more consistent, accurate, and useful information (e.g., fusion data) than any single data source of the multiple data sources.
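A standard illustration of why fused data can be more accurate than any single source is inverse-variance weighting, under which the fused estimate's variance is no larger than that of the best individual source. The measurements below are illustrative values, not data from this disclosure:

```python
def inverse_variance_fusion(estimates):
    """Fuse independent (value, variance) measurements of the same quantity.

    Each source is weighted by the reciprocal of its variance; the fused
    variance is the reciprocal of the summed weights, so it never exceeds
    the smallest input variance.
    """
    weight_sum = sum(1.0 / var for _, var in estimates)
    value = sum(v / var for v, var in estimates) / weight_sum
    return value, 1.0 / weight_sum

# Two equally uncertain range measurements of the same obstacle (m, m^2).
value, var = inverse_variance_fusion([(10.0, 4.0), (12.0, 4.0)])
```

With two equal-variance sources the fused value is their mean and the fused variance is halved, which is the sense in which fusion yields "more consistent, accurate" information.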
As used herein, the term "configured" refers to a component, module, system, subsystem, etc. (e.g., hardware and/or software) that is constructed and/or programmed to perform the indicated function.
As used herein, the terms "determine," "calculate," and variations thereof are used interchangeably with any type of method, process, mathematical operation, or technique.
As used herein, the term "sensing" refers to a function and/or ability of a sensor (e.g., a sensing device provided on a vehicle or road infrastructure) to detect and measure the state of a vehicle and/or the driving environment, e.g., to provide "sensing data". For example, vehicle sensors detect and measure the state of the vehicle (e.g., position, speed, acceleration, deceleration, and angular movement) and the driving environment (e.g., objects around and near the vehicle, such as vehicles, pedestrians, bicycles, obstacles, road signs, and markings); vehicle sensors may be mounted at different locations on the vehicle.
As used herein, the term "perception" refers to the use of sensors and/or sensing data to continuously scan and monitor the driving environment, similar to how humans use vision and other senses to collect information and integrate it into a dynamic understanding of their environment. In some embodiments, "perception" includes technologies such as computer vision and/or artificial intelligence that help a vehicle sense and perceive its environment. In some embodiments, a perception module provides the perception functionality. In some embodiments, the perception module is installed on a vehicle or on roadside infrastructure (e.g., RSU, RIU).
As used herein, the term "reliability" refers to a measure (e.g., a statistical measure) of system performance that is free of faults and/or errors. In some embodiments, reliability is a measure of the length of time and/or the number of functional cycles that a system performs without failure and/or error.
As used herein, the term "support", when used in reference to providing support to and/or supporting an ADS, CADS, CAVH system, CAV, and/or one or more other components of a vehicle, refers to, for example: information and/or data exchange between components and/or levels of the ADS, CADS, CAV, and/or vehicle; transmitting and/or receiving instructions between components and/or levels of the ADS, CADS, CAVH system, CAV, and/or vehicle; and/or other interactions between components and/or levels of the ADS, CADS, CAVH system, CAV, and/or vehicle that provide functions such as information exchange, data transfer, messaging, and/or alerting.
As used herein, the term "CADS component" or "CADS components" individually and/or collectively refers to one or more components of a CADS and/or CAVH system, e.g., a VIU, RIU, TCC, TCU, TCC/TCU, TOC, CAV, support subsystem, and/or cloud component.
As used herein, the term "roadside intelligent unit" (abbreviated as "RIU") may refer to a RIU, multiple RIUs, and/or a network of RIUs.
As used herein, the term "critical point" refers to a portion or region of a roadway that is identified as suitable for embodiments of the function distribution techniques provided herein. In some embodiments, critical points are classified as "static critical points", and in some embodiments, critical points are classified as "dynamic critical points". As used herein, a "static critical point" is a point (e.g., area or location) of a road that is identified as a critical point based on road and/or traffic conditions that are typically constant, that change very slowly (e.g., on a time scale exceeding one day, one week, or one month), or that change only through planned infrastructure reconstruction. As used herein, a "dynamic critical point" is a point (e.g., region or location) of a road that is identified as a critical point based on road conditions that change (e.g., predictably or unpredictably) over time (e.g., on a time scale of one hour, day, week, or month). Critical points based on historical collision data, traffic signs, traffic signals, traffic capacity, and road geometry are typically static critical points. Critical points based on traffic oscillation, real-time traffic management, or real-time traffic events are typically dynamic critical points.
In some embodiments, critical points are identified using historical collision data (e.g., the top 20% (e.g., top 15-25% (e.g., top 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, or 25%)) most frequent collision locations in a road system are identified as critical points); traffic signs (e.g., locations where certain traffic signs (e.g., accident-prone area signs) are present are identified as critical points); traffic capacity (e.g., the top 20% (e.g., top 15-25% (e.g., top 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, or 25%)) highest-capacity regions are identified as critical points); road geometry (e.g., roads with critical geometries (e.g., curves, blind spots, hills, intersections (e.g., signalized intersections, stop-sign intersections), roundabouts) are identified as critical points); traffic oscillation (e.g., points with significant traffic oscillation are identified as critical points); real-time traffic management (e.g., points affected by real-time traffic management are identified as critical points); and/or real-time traffic events (e.g., points affected by events such as collisions, work zones, or traffic congestion are identified as critical points).
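The historical-collision criterion above (top ~20% of collision locations flag static critical points) can be sketched as follows; the segment names and counts are illustrative, not from the source:

```python
def static_critical_points(crash_counts, top_fraction=0.20):
    """Rank road locations by historical collision count and return the
    top fraction (default 20%, per the embodiment above) as static
    critical points."""
    ranked = sorted(crash_counts, key=crash_counts.get, reverse=True)
    k = max(1, round(len(ranked) * top_fraction))
    return set(ranked[:k])

# Hypothetical per-segment collision counts for a small road system
counts = {"seg-A": 42, "seg-B": 7, "seg-C": 31, "seg-D": 3, "seg-E": 12}
points = static_critical_points(counts)
```

The same ranking-and-threshold pattern applies to the traffic-capacity criterion by substituting capacity measurements for collision counts.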
As used herein, "microscopic", "mesoscopic", and "macroscopic" refer to relative scales in time and space. In some embodiments, the scales include, but are not limited to: a microscopic level associated with individual vehicles (e.g., longitudinal movement (car following, acceleration and deceleration, stopping and standing) and lateral movement (lane keeping, lane changing)); a mesoscopic level associated with road corridors and/or road segments (e.g., special event advance notice, event prediction, merging and diverging, platoon splitting and integration, variable speed limit prediction and reaction, segment travel time prediction, and/or segment traffic flow prediction); and a macroscopic level associated with an entire road network (e.g., potential congestion prediction, potential event prediction, network traffic demand prediction, network status prediction, and network travel time prediction). In some embodiments, the microscopic time scale ranges from 1 millisecond to 10 milliseconds and is associated with tasks such as vehicle control command computation.
As used herein, the automation and/or intelligence levels of a vehicle (V), infrastructure (I), and system are described in terms of an "intelligence level" and/or "automation level". In some embodiments, the vehicle intelligence and/or automation level is one of: V0: no automation functions; V1: basic functions to assist the driver in controlling the vehicle; V2: functions to help the driver control the vehicle for simple tasks and to provide basic sensing functions; V3: functions that sense the environment in detail and in real time and complete relatively complex driving tasks; V4: functions that allow the vehicle to drive independently under limited conditions, sometimes with a human driver as backup; V5: functions that allow the vehicle to drive independently under all conditions without a human driver backup. As used herein, a vehicle with intelligence level 1.5 (V1.5) refers to a vehicle with capabilities between vehicle intelligence levels 1 and 2; e.g., a V1.5 vehicle has minimal or no automated driving capability of its own but includes the capabilities and/or functions (e.g., hardware and/or software) needed for a CAVH system to control it (e.g., a vehicle with "enhanced driver assistance" or "driver assistance" functions).
In some embodiments, the infrastructure intelligence and/or automation level is one of: I0: no function; I1: information collection and traffic management, wherein the infrastructure provides raw sensing functions for aggregate traffic data collection and basic planning and decision making to support simple traffic management at low spatio-temporal resolution; I2: I2X and vehicle guidance for driving assistance, wherein, in addition to the functions provided at I1, the infrastructure implements limited sensing functions for road surface condition detection and vehicle kinematics detection (e.g., lateral and/or longitudinal position, speed, and/or acceleration of part of the traffic) on a time scale of seconds or minutes; the infrastructure also provides traffic information and vehicle control suggestions and instructions to vehicles via I2X communications; I3: dedicated-lane automation, wherein the infrastructure provides individual vehicles with information describing the dynamics of surrounding vehicles and other objects on a millisecond time scale and supports fully automated driving on CAVH-compatible vehicle dedicated lanes; the infrastructure has limited traffic behavior prediction capability; I4: scenario-specific automation, wherein the infrastructure provides detailed driving instructions that enable vehicles to drive fully automatically in specific scenarios and/or areas, e.g., locations within predefined geofenced areas where traffic is mixed (e.g., including automated and non-automated vehicles); necessary vehicle-based automation capabilities, such as emergency braking, are provided as a backup system in the event of infrastructure failure; and I5: full infrastructure automation, wherein the infrastructure provides full control and management of individual vehicles in all cases and optimizes the entire road network in which the infrastructure is deployed; vehicle automation functions are not required as a backup; full active safety functions are available.
In some embodiments, the system intelligence and/or automation level is one of: S0: no function; S1: the system provides simple functions for individual vehicles, such as cruise control and passive safety functions; the system detects vehicle speed, position, and distance; S2: the system comprises individual intelligence for detecting vehicle running state, vehicle acceleration, and/or traffic signs and signals; individual vehicles make decisions based on their own information and have partial automated driving functions that provide complex functions such as assisted adaptive cruise control, lane keeping, lane changing, and automatic parking; S3: the system integrates information from groups of vehicles, has ad hoc intelligence and prediction capability, has the intelligence to make decisions for groups of vehicles, can complete complex conditional automated driving tasks such as cooperative cruise control and vehicle scheduling, and allows vehicles to negotiate intersections, merges, and diverges; S4: the system optimizes integrated driving behavior within part of the network; the system detects and communicates detailed information within part of the network, makes decisions based on vehicle and traffic information within the network, handles complex advanced automated driving tasks such as navigating signalized traffic corridors, and provides optimal trajectories for vehicles within a small transportation network; S5: vehicle automation and system traffic automation, wherein the system optimally manages the entire transportation network; the system detects and communicates detailed information within the transportation network and makes decisions based on all available information in the network; the system handles fully automated driving tasks, including individual vehicle tasks and transportation tasks, and coordinates all vehicles to manage traffic.
In some embodiments, the system intelligence level depends on the vehicle and infrastructure intelligence levels, e.g., as in the following equation (S = system automation; V = vehicle intelligence; I = infrastructure intelligence):
S=f(V,I)
in some embodiments, vehicle intelligence is provided by and/or associated with the CAV subsystem, and infrastructure intelligence is provided by and/or associated with the CAH subsystem. Those skilled in the art may refer to SAE International standard J3016, "Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles" (published 2014 (J3016_201401), revised 2016 (J3016_201609), and revised 2018 (J3016_201806)), which provides additional understanding of terms used in the art and herein.
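The equation S = f(V, I) above leaves the mapping f unspecified. As a purely illustrative instantiation (an assumption, not a rule given in the source), one could model the system automation level as bounded by the stronger of the two subsystems:

```python
def system_automation(v_level, i_level):
    """Illustrative placeholder for S = f(V, I): assumes the system can
    reach the automation level of its stronger subsystem, capped at S5.
    The actual mapping f is not specified in the source."""
    if not (0 <= v_level <= 5 and 0 <= i_level <= 5):
        raise ValueError("intelligence levels range from 0 to 5")
    return min(5, max(v_level, i_level))
```

Under this sketch, a V1 vehicle on I3 infrastructure would yield S3, consistent with the idea that infrastructure intelligence can compensate for limited vehicle intelligence.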
Description of the invention
Provided herein are technologies related to automated driving, particularly but not exclusively an intelligent vehicle-mounted unit (Vehicle Intelligent Unit, VIU) that provides vehicle operation and control for connected automated vehicles (Connected Automated Vehicles, CAV), and more particularly a VIU configured to interface with a collaborative automated driving system (Collaborative Automated Driving System, CADS); to manage and/or control information exchange between the CAV and the CADS; and to manage and/or control lateral and longitudinal movement of the CAV, including car following, lane changing, and route guidance. While certain illustrative embodiments are described herein, it should be understood that these embodiments are presented by way of example and not by way of limitation.
Intelligent vehicle-mounted unit
The VIU is an on-board subsystem, installed in a vehicle, that connects the vehicle to the CADS. The VIU provides CADS functionality to the vehicle to assist or enable the vehicle to perform automated driving tasks, providing information exchange, data processing, and control command generation between the CADS and the vehicle. In some embodiments, the VIU is configured to supplement, enhance, back up, upgrade, and/or replace the automated driving functions of the vehicle's Conventional Vehicle Control System (CVCS).
In some embodiments, the CVCS is the vehicle control and actuation system, the "brain" with which an autonomous vehicle performs autonomous functions. The CVCS is equipped on vehicles of different intelligence levels and provides the technologies that enable the vehicle to operate safely in manual and/or automatic driving modes. These technologies include sensing (e.g., camera, radar, lidar), monitoring, global positioning, computing, artificial intelligence, and wireless and wired communications (e.g., in-vehicle mobile internet, vehicle-to-vehicle communication networks, in-vehicle communication networks).
In some embodiments, the VIU includes an interaction subsystem (e.g., including a human-machine interaction module, an information conversion module, a communication module, and/or an on-board sensor access and information processing module); a perception subsystem (e.g., including a perception fusion module and/or a high-precision map and positioning module); a decision subsystem (e.g., including an intelligent control instruction/assistance module and/or a collaborative decision module); and/or a supplemental subsystem (e.g., including a support module and/or a redundancy verification module).
In some embodiments, for example, as shown in fig. 1, the technology relates to collaborative autopilot systems (CADS) and CAVs including VIUs. The VIU manages human-machine interaction 108 between driver 101 and in-vehicle system 102, and the VIU manages communication 117 between in-vehicle system 102 and external system 103. The information is processed by the in-vehicle system 102, e.g., by the perception and perception layer 104, the decision and control layer 105, and the execution layer 106. The self-cognition and environmental awareness 109 is transferred from the awareness and awareness layer 104 to the decision layer 105. The decision and control layer 105 communicates with the external system 103 via information and decision interactions 107, the external system 103 comprising other vehicles 113, road systems 114, collaborative management systems 115 and/or support systems 116. Within the decision layer 105, an intelligent on-board unit (VIU) 112 and a Conventional Vehicle Control System (CVCS) 111 interact and cooperate to provide a perceptual fusion 118, a collaborative decision 119, and an intelligent vehicle control 120. The integrated vehicle control instructions 110 generated from the decision layer 105 are then sent to the execution layer 106.
In some embodiments, for example, as in fig. 2A-2E, the VIU 201 is made up of subsystems and modules (fig. 2A) that may be combined into the VIU embodiments shown in fig. 2B-2E.
In some embodiments, for example, as shown in fig. 2A, VIU 201 includes subsystems and modules. The different subsystems are classified as: an interaction subsystem 202, a perception subsystem 203, a decision subsystem 204, and a supplementation subsystem 205. The interaction subsystem 202 comprises a man-machine interaction module 206, an information conversion module 207, a communication module 208 and an on-board sensor access and information processing module 209, the perception subsystem 203 comprises a perception fusion module 210 and a high-precision map and positioning module 211, and the decision subsystem 204 comprises an intelligent control instruction/assistance module 212 and a collaborative decision module 213. The supplemental subsystem 205 is comprised of a support module 214 and a redundancy verification module 215. The VIU includes various combinations and/or configurations of subsystems and modules to provide corresponding functions related to the level of intelligence of the vehicle in which the VIU is installed, and specific driving tasks performed by the vehicle.
In some embodiments, for example, as shown in fig. 2B, the VIU includes data flows between subsystems and/or modules of the VIU 201. The interaction subsystem 202 in the VIU 201 gathers perception and other information related to the autopilot mission, which is sent to the perception subsystem 203. The perception subsystem 203 obtains internal and external environmental information by fusing and integrating information from the interaction subsystem 202. Based on the results of the perception subsystem 203, the decision subsystem 204 generates driving decisions for vehicle control (e.g., executing vehicle control instructions), and the supplemental subsystem 205 provides resources such as power for each module and subsystem and maintains system safety.
In various embodiments, one or more modules and functions in the VIU are flexibly combined and/or configured to accomplish some or all of the autopilot functions in accordance with the CVCS and driving mission requirements. In some embodiments, for example, as shown in fig. 2C, 2D, and 2E, the VIU may be comprised of subsystems and modules for performing autopilot functions. In some embodiments, for example, as shown in fig. 2C, the VIU is comprised of an information conversion module 207, a communication module 208, and a perceptual fusion module 210, which are supported by a support module 214. Thus, the VIU is configured to provide a sensory fusion function for the vehicle. In some embodiments, for example, as shown in fig. 2D, the VIU is comprised of an information conversion module 207, a communication module 208, and a high-precision map and location module 211, which are supported by a support module 214. Accordingly, the VIU is configured to provide high-precision map and positioning functions for the vehicle. In some embodiments, for example, as shown in fig. 2E, the VIU is comprised of an information conversion module 207, a communication module 208, a perceptual fusion module 210, a high-precision map and location module 211, and a collaborative decision module 213, which are supported by a support module 214. Accordingly, the VIU is configured to provide a vehicle with a sensory fusion function, a high-precision map and position location function, and a collaborative decision function.
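The flexible combination of modules described above can be sketched as set composition; the module names below are illustrative stand-ins for the modules of Figs. 2C-2E, not identifiers from the source:

```python
# Base modules shared by the configurations of Figs. 2C-2E: information
# conversion, communication, and the support module that backs them.
BASE = {"information_conversion", "communication", "support"}

def compose_viu(*feature_modules):
    """Build a VIU module set: the shared base modules plus the feature
    modules selected for the target autopilot function."""
    return BASE | set(feature_modules)

# Fig. 2C: perception fusion only; Fig. 2E: fusion + HD map + decision
fig_2c = compose_viu("perception_fusion")
fig_2e = compose_viu("perception_fusion", "hd_map_and_positioning",
                     "collaborative_decision")
```

Each configuration is a superset of the base set, mirroring how every VIU variant in Figs. 2C-2E retains the conversion, communication, and support modules.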
In some embodiments, for example, as shown in fig. 3, the technique provides various VIU configurations for different types of CAVs 301. For example, CAVs 301 are provided by various manufacturers 304, various brands 305 of a vehicle manufacturer, various series 306 of a vehicle brand, various model years 307 of a vehicle product series, and different platforms 308 of a vehicle manufacturer. In addition, CAVs 301 span a range of intelligence levels 302, including the SAE-defined intelligence levels V1 309, V2 310, V3 311, V4 312, and V5 313. The VIU may also be applied across various operational design domains (ODD) 303.
In some embodiments, such as shown in fig. 4, the technique provides for information processing, such as by an information conversion module. The information conversion module 401 of CAV 408 communicates with the collaborative management system 405 of the CADS through information interaction (e.g., information exchange) 406. In CAV 408, the information conversion module 401 performs decoding 402 and encoding 404 using a codebook 403. Other modules 407 of CAV 408 are also included in the information exchange process.
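The encode/decode flow of the information conversion module 401 can be sketched as below; the codebook entries, message names, and frame layout are hypothetical, since the source does not define a message format:

```python
class InformationConverter:
    """Hypothetical sketch of the information conversion module: a shared
    codebook maps message types to compact wire codes for the CADS link
    (encoding, 404) and back to message types on receipt (decoding, 402)."""

    def __init__(self, codebook):
        self.encode_map = dict(codebook)                 # message type -> code
        self.decode_map = {c: m for m, c in codebook.items()}

    def encode(self, message_type, payload):
        """Wrap an outgoing message in a frame for the CADS."""
        return {"code": self.encode_map[message_type], "payload": payload}

    def decode(self, frame):
        """Recover the message type and payload from an incoming frame."""
        return self.decode_map[frame["code"]], frame["payload"]

# Illustrative codebook shared between the CAV and the CADS
codebook = {"LANE_CHANGE_ADVICE": 0x01, "SPEED_LIMIT": 0x02}
conv = InformationConverter(codebook)
frame = conv.encode("SPEED_LIMIT", {"mps": 27.8})
```

Decoding the frame on the receiving side recovers the original message type and payload, so both ends interoperate as long as they share the codebook 403.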
In some embodiments, for example, as shown in fig. 5, the technique provides a perception fusion approach for vehicles with different levels of intelligence. Vehicle-centric and non-vehicle-centric approaches involve the collaborative management system of the CADS, the fusion module in the CVCS, and/or the fusion module of the VIU. Information from the CADS collaborative management system is processed in the perception fusion module of the VIU and used to calibrate the vehicle-centric perception generated by the CVCS. The perception fusion module in the vehicle VIU (e.g., at intelligence level 1, 2, 3, 4, or 5) provides the driver with supplemental perception information and varying degrees of monitoring substitution. In addition, the perception fusion module sends information to the planning module, which is responsible for planning and control of the longitudinal and lateral trajectories. In some embodiments, the driver or the VIU may take over and send information to the collaborative decision module to reduce long-tail operational design domain (ODD) risk and to provide a more accurate dynamic HD map, a wider range of environmental awareness, route planning information, and driving decisions.
In some embodiments, for example, as shown in fig. 6, the technique provides a collaborative decision method involving the collaborative decision module in the CVCS and the collaborative decision module of the VIU. The collaborative decision module of the VIU generates driving decisions based on the perception fusion results. Collaborative decision modules in the VIU (e.g., on level 1, 2, 3, 4, or 5 vehicles) improve automated driving by providing longitudinal and lateral control, assuming control of driving tasks, performing trajectory planning and detailed driving decisions, replacing the driver in extreme conditions, reducing long-tail operational design domain (ODD) risk, and generating more predictive decisions.
In some embodiments, for example, as shown in fig. 7, the technique provides a redundancy verification method. The redundancy verification module 707 collects information (e.g., perception and perception results, decision information, and other information) from the collaborative decision module 708, the perception fusion module 709, the high precision map and location module 710, and the in-vehicle sensor access and information processing module 711. Perception information 715 is sent from the in-vehicle perception and perception layer 702 to the VIU 701 (e.g., via communication module 712). The VIU's in-vehicle sensor access and information processing module 711 receives the perception information 715 from the perception and perception layer 702.
Information/decision/control instructions 716 are transmitted from external systems (e.g., infrastructure 705 and other systems 706) to VIU 701 via communication module 712 and information conversion module 713. The perceptual fusion module 709 and the high precision map and localization module 710 then process this information. The redundancy verification module 707 interacts with the perception fusion module 709 to verify and confirm the perception information 722 and fused perception results 723 from the on-board and external systems. In addition, the redundancy verification module 707 interacts with the high-precision map and positioning module 710 and verifies and validates the positioning request 720 to the cloud (external system) and the HD positioning information 721 from the cloud (external system).
In the decision layer 703, decision results and control decisions/instructions 718 are generated by the CVCS 728 and sent to the redundancy verification module 707 for verification and validation. Furthermore, in decision layer 703, collaborative decision module 708 uses the fused perceptual results and instructions to generate decision instructions 725 for VIU 701. The redundant validation module 707 interacts with the collaborative decision module 708 and shares fused perceptual results and decisions from the external system 724. The redundancy verification module 707 then verifies and validates the decision instruction 725 made by the VIU 701. In addition, the decision results and control decisions/instructions 718 are shared with the CVCS 728 and external systems.
Control commands from the on-board and external systems are generated in the intelligent control command/assistance module 714 based on decisions made by the collaborative decision module 708. The redundancy verification module 707 interacts with the intelligent control instructions/assistance module 714 and verifies and validates the integrated control instructions 727 from the on-board and external systems 726.
The output of the intelligent control command/assist module 714 is utilized to generate the integrated control command 717 and to send the integrated control command to the control layer 704 or to be shared with the control layer 704. The VIU 701 then backs up the execution results of the actuators in the control layer 704.
In some embodiments, for example, as shown in FIG. 8, the technique provides an intelligent control instruction unit that operates to combine decision and control instructions from the VIU and other systems. The process comprises three steps: (1) acquiring decision and/or control instruction information from the subsystems; (2) generating an integrated control instruction; and (3) sending the result to the vehicle actuators and selectively reporting it to the collaborative management system of the CADS.
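The three steps above can be sketched as follows; the weighted-blend policy in step 2 and the command field names are illustrative assumptions, since the source does not specify how candidate instructions are combined:

```python
def integrate_control(viu_cmd, cvcs_cmd, viu_weight=0.6):
    """Hypothetical three-step flow of the intelligent control instruction
    unit: (1) collect candidate commands from the VIU and CVCS, (2) blend
    them into one integrated command, (3) return it for the vehicle
    actuators plus an optional report for the CADS collaborative
    management system."""
    # Step 1: acquire decision/control instruction information.
    candidates = {"viu": viu_cmd, "cvcs": cvcs_cmd}
    # Step 2: generate the integrated instruction (weighted average here).
    integrated = {
        key: viu_weight * viu_cmd[key] + (1 - viu_weight) * cvcs_cmd[key]
        for key in viu_cmd
    }
    # Step 3: package for the actuators; report selectively to the CADS.
    report = {"inputs": candidates, "output": integrated}
    return integrated, report

cmd, report = integrate_control(
    {"accel_mps2": 1.0, "steer_deg": 2.0},
    {"accel_mps2": 0.5, "steer_deg": 0.0},
)
```

Any other arbitration policy (e.g., prioritizing the VIU outright, or vetoing unsafe commands) would slot into step 2 without changing the surrounding flow.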
In some embodiments, for example, as shown in fig. 9, the technique provides for activating and/or providing the autopilot functionality of the vehicle when the vehicle is equipped with a VIU and the VIU is connected to the CADS. After connection to the CADS system, four functions are activated. First, the vehicle becomes an actuator of CADS commands, meaning that the VIU provides CADS autopilot functionality and driving intelligence for the vehicle. The CADS coordinates with IRIS or IRIS subsystems (e.g., IRT or RIU); for example, in some embodiments, IRIS coordinates with roadside infrastructure (e.g., traffic lights) to support automated driving tasks based on vehicle driving information and the actual control instructions sent by the VIU. Second, the vehicle, as part of road traffic, collects road condition information. The vehicle cooperates with IRIS to provide information required by the CADS system, e.g., assembled from multiple partial data sets describing road conditions, each provided by a single vehicle in the traffic stream. Third, the VIU reports origin and destination information to the system map and submits positioning and navigation requests to the CADS system map; the CADS collaborative management system retrieves road information for the vehicle's specific route according to the map information. Fourth, the vehicle and other vehicles connected to the CADS share sensing information and interact with each other.
In some embodiments, for example, as shown in fig. 10, information sent by VIU 1001 is shared with other users in the CADS. These users include cloud platform 1002, roadside infrastructure 1003, communication devices 1004, and other vehicles equipped with VIUs and connected to CADS 1005.
In some embodiments, for example, as shown in fig. 11, the technique provides a method of dividing tasks and collaboration between the CVCS and the VIU. During automatic driving, the CVCS uses information perceived by the vehicle describing the surrounding environment to generate driving decisions and vehicle control instructions, for example in response to unexpected traffic conditions. The VIU uses the driving environment information provided by the vehicle and CADS to generate driving decisions and vehicle control instructions in cooperation with the CADS, IRIS, or IRIS subsystems (e.g., IRT or RIU). In some embodiments, CADS environment information describes a larger area or a wider range than information provided by the vehicle. The VIU may integrate the CVCS and the VIU generated instructions. When the CVCS fails, the VIU provides a redundant on-board unit to provide and/or maximize stability of the autopilot function.
In some embodiments, for example, as shown in fig. 12, modules and sub-modules are provided in the VIU to assist the vehicle in performing autopilot tasks in the event of a CVCS failure. The VIU identifies the failed module or sub-module in the CVCS and activates the corresponding and/or appropriate VIU module or sub-module to assist the vehicle system in performing the autopilot task. If the sensing and information processing module of the CVCS fails, the VIU perception fusion module and/or the on-board sensor access and information processing module is activated to replace it. If the positioning and navigation functions of the CVCS fail, the high-precision map and positioning module is activated to provide positioning and navigation services for the vehicle. If the decision module of the CVCS fails, the collaborative decision module is activated in its place. If the human-machine interaction function of the CVCS fails, the human-machine interaction module of the VIU is activated to exchange information with the driver. If the CVCS power support module fails, the VIU support module is activated to help the CVCS meet power requirements and perform power management.
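The failure cases above amount to a lookup from a failed CVCS function to the VIU module(s) that back it up; the identifier strings below are illustrative names for the modules described in this section:

```python
# Hypothetical failover table: failed CVCS function -> VIU backup module(s),
# following the failure cases described in the text above.
FAILOVER = {
    "sensing_and_information_processing": [
        "perception_fusion", "onboard_sensor_access_and_processing"],
    "positioning_and_navigation": ["hd_map_and_positioning"],
    "decision": ["collaborative_decision"],
    "human_machine_interaction": ["human_machine_interaction"],
    "power_support": ["support"],
}

def activate_backup(failed_cvcs_function):
    """Return the VIU modules to activate for a detected CVCS failure;
    an unknown failure yields no backups."""
    return FAILOVER.get(failed_cvcs_function, [])

backups = activate_backup("positioning_and_navigation")
```

A table-driven design like this keeps the failure-handling policy declarative, so adding a new failure case only extends the table.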
In some embodiments, for example, as shown in fig. 13, the technique provides a method for the VIU system to replace the CVCS and independently perform the autopilot tasks when the CVCS fails. The on-board sensor access and information processing module generates sensing information and sends it to the perception fusion module. The communication module and the information conversion module receive information from external systems and send it to the perception fusion module. The perception fusion module fuses the sensing information from the on-board sensor access and information processing module with the information from the external systems, and sends the fused information to the collaborative decision module. The collaborative decision module generates driving decision instructions and sends them to the intelligent control instruction/assistance module, which generates the integrated vehicle control instructions for the vehicle's autopilot tasks.
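The takeover data flow just described (on-board sensing plus external information, fused, then turned into a decision and a control command) can be sketched end to end; the fusion rule, decision threshold, and command values are illustrative placeholders, not values from the source:

```python
def viu_standalone_pipeline(onboard_readings, external_messages):
    """Sketch of the CVCS-failure takeover flow: on-board sensing and
    external (CADS) information are fused, a driving decision is derived,
    and an integrated control command is produced."""
    # Perception fusion: keep the more conservative obstacle distance.
    fused = {"obstacle_ahead_m": min(
        onboard_readings["obstacle_ahead_m"],
        external_messages.get("obstacle_ahead_m", float("inf")))}
    # Collaborative decision: brake if an obstacle is close, else cruise.
    decision = "brake" if fused["obstacle_ahead_m"] < 30.0 else "cruise"
    # Intelligent control instruction: map the decision to an actuator command.
    command = {"brake": {"accel_mps2": -3.0},
               "cruise": {"accel_mps2": 0.0}}[decision]
    return fused, decision, command

# On-board sensing sees a distant obstacle; the CADS reports a closer one.
fused, decision, command = viu_standalone_pipeline(
    {"obstacle_ahead_m": 55.0}, {"obstacle_ahead_m": 25.0})
```

Note how the external CADS report overrides the more optimistic on-board reading, which is exactly the value the external information adds in this flow.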
In some embodiments, for example, as shown in fig. 14, the technique provides a method for the VIU system supplement function. For example, in some embodiments, when a vehicle encounters a long-tail condition (e.g., an accident, an event, a construction and/or work zone, extreme and/or adverse weather, dangerous roads, sign/marking and/or geometric design ambiguity, high concentrations of pedestrians/bicycles, or a combination thereof), the perception fusion module and the collaborative decision module of the VIU system supplement the autopilot function of the vehicle with sensing information, decisions, and/or vehicle control instructions provided by IRIS or IRIS subsystems (e.g., IRT or RIU) and the CADS. Thus, the technique provides a solution to long-tail operational design domain (ODD) risk.
In some embodiments, for example, as shown in fig. 15, the technique provides a method for VIU system enhancement functions. For example, in some embodiments, the VIU system uses different modules to enhance the autopilot functionality of the vehicle. In some embodiments, the awareness fusion module fuses information from the on-board system, the CADS' collaborative management system, and the external system to enhance awareness and predictive capabilities of the vehicle. In some embodiments, the collaborative decision-making module cooperates with external systems in the CADS to enhance vehicle planning capabilities. In some embodiments, the intelligent control unit fuses the drive commands from the VIU and the drive commands from the on-board CVCS system to generate integrated control commands that enhance vehicle control capabilities.
In some embodiments, for example, as shown in FIG. 16, the technique provides a method for the VIU system lift function. For example, in some embodiments, the VIU increases the level of the autonomous vehicle from a lower level of intelligence to a higher level of intelligence by enhancing the ability of the vehicle to operate automatically. In some embodiments, the level of intelligence of a vehicle having level 1 of intelligence may be increased to level 2, 3, 4, or 5 of intelligence. In some embodiments, the level of intelligence of a vehicle having level 2 of intelligence may be increased to level 3, 4, or 5 of intelligence. In some embodiments, the level of intelligence of a vehicle having level 3 of intelligence may be increased to level 4 or 5 of intelligence. In some embodiments, the level of intelligence of a vehicle having level 4 of intelligence may be increased to level 5 of intelligence. For example, in some embodiments, the technique promotes a vehicle with level 4 intelligence to level 5 intelligence by helping to address long tail Operational Design Domain (ODD) risks. In some embodiments, the VIU system lift function improves and/or maximizes the safety of vehicles with level 5 intelligence. In some embodiments, the VIU system lift function reduces and/or minimizes the cost of a vehicle with level 5 intelligence.
In some embodiments, for example, as shown in fig. 17A and 17B, the technique provides collaboration between the VIU and the CVCS for information processing in autopilot.
In some embodiments, for example, as shown in fig. 17A, the technique provides a sequential information processing architecture for the CVCS and the VIU. The CVCS senses the environment and transmits the sensed data to the VIU, which fuses the sensing and perception information. The VIU provides the fused sensing and perception information to the CVCS and, in some embodiments, sends the fused sensing and perception data to other subsystems of the CADS. The CVCS uses the sensing and perception data received from the VIU to make driving decisions for vehicle control. The driving decision generated by the CVCS is sent to the VIU, and the VIU (e.g., the collaborative decision module of the VIU) makes a final driving decision after receiving the driving decision result of the CVCS and other information from the CADS. The driving decisions generated by the VIU (e.g., by the collaborative decision module of the VIU) are communicated to the CVCS. The CVCS uses the driving decisions to generate initial vehicle control commands and sends the vehicle control commands to the VIU intelligent control instruction/assistance module. After the initial vehicle control instruction is verified by the VIU redundancy verification module, the VIU generates a comprehensive control instruction for controlling the vehicle.
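The sequential hand-offs above can be traced as a chain of function calls. This is an illustrative sketch under assumed names and thresholds, not the disclosed implementation.

```python
# Sketch of the sequential CVCS -> VIU architecture of fig. 17A: each step
# passes its output to the next. All names and values are hypothetical.
def cvcs_sense():
    return {"speed_kph": 60, "obstacle_dist_m": 25.0}

def viu_fuse(sensed, cads_info):
    """VIU fuses CVCS sensing with information received from the CADS."""
    return {**sensed, **cads_info}

def cvcs_decide(fused):
    return "slow" if fused["obstacle_dist_m"] < 30.0 else "keep"

def viu_final_decision(cvcs_decision, cads_advice):
    """VIU collaborative decision module weighs the CVCS result against CADS advice."""
    return cads_advice if cads_advice == "stop" else cvcs_decision

def cvcs_initial_command(decision):
    return {"brake": 0.5} if decision in ("slow", "stop") else {"brake": 0.0}

def viu_verify_and_integrate(cmd):
    """Redundancy verification step: clamp to a safe range before issuing."""
    return {"brake": min(max(cmd["brake"], 0.0), 1.0)}

fused = viu_fuse(cvcs_sense(), {"weather": "rain"})
final = viu_final_decision(cvcs_decide(fused), cads_advice="keep")
integrated = viu_verify_and_integrate(cvcs_initial_command(final))
```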
In some embodiments, for example, as shown in fig. 17B, the technique provides an information processing architecture that combines the CVCS and the VIU in parallel, sequential, and interleaved approaches. First, the CVCS senses the environment, the on-board systems, and surrounding vehicles to provide sensing data. Second, the CVCS transmits the sensing data to the VIU for perception fusion, which provides the fused perception information. The fused perception information is used by both the VIU and the CVCS. The VIU makes a decision using the fused perception information and positioning information, and outputs the decision result to the VIU control module. The CVCS makes a decision using the fused perception information and positioning information, and outputs the decision result to the CVCS control module. The VIU control module processes the VIU decision result and generates a VIU control instruction. The CVCS control module processes the CVCS decision result and generates a CVCS control instruction. The CVCS control module shares the CVCS control instructions with the VIU. The VIU control module shares the VIU control instructions with the CVCS. The VIU (e.g., the redundancy verification module of the VIU) verifies the CVCS control instructions and/or the CVCS verifies the VIU control instructions. The verified control instructions are executed by the vehicle. The VIU backs up the decision results from the CVCS decision module and the control instructions from the CVCS control module.
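The parallel branch of this architecture can be sketched as two decision pipelines over the same fused perception, with cross-verification and a VIU-side backup. The "execute the more conservative command" rule and all names here are illustrative assumptions; fig. 17B does not prescribe a specific arbitration rule.

```python
# Sketch of the parallel/interleaved CVCS + VIU processing of fig. 17B.
# Both branches decide on the same fused perception, exchange commands,
# cross-verify, and the VIU backs up CVCS outputs. Hypothetical values.
def decide(fused, conservative):
    margin = 30.0 if conservative else 20.0   # VIU branch keeps a larger margin
    return "brake" if fused["obstacle_dist_m"] < margin else "cruise"

def to_command(decision):
    return {"brake": 0.6 if decision == "brake" else 0.0}

def cross_verify(cmd_a, cmd_b):
    """Execute the more conservative (higher-braking) of the two shared commands."""
    return cmd_a if cmd_a["brake"] >= cmd_b["brake"] else cmd_b

fused = {"obstacle_dist_m": 25.0}
viu_cmd = to_command(decide(fused, conservative=True))    # VIU decision branch
cvcs_cmd = to_command(decide(fused, conservative=False))  # CVCS decision branch
executed = cross_verify(viu_cmd, cvcs_cmd)
backup = {"cvcs_decision": decide(fused, False), "cvcs_command": cvcs_cmd}  # VIU backup
```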
Accordingly, as described herein, the present technology relates to a VIU that provides a vehicle with full-automatic driving functionality, comprising a plurality of modules: a vehicle-mounted sensor access and information processing module, a communication module, an information conversion module, a sensing and perception fusion module, a high-precision map and positioning module, a collaborative decision module, an intelligent control instruction/assistance module, a redundancy verification module, a human-machine interaction module, and/or a support module. In some embodiments, many of these modules of the VIU are flexibly configured to accomplish some or all of the autopilot functions according to the CVCS and driving task requirements. In some embodiments, the VIU is configured (e.g., the VIU includes a plurality of modules) to provide full-automatic driving to the vehicle.
In some embodiments, the in-vehicle sensor access and information processing module is configured to receive information collected by the in-vehicle sensors and process the information. In some embodiments, the in-vehicle sensor access and information processing module replaces the information processing functions of the CVCS.
In some embodiments, the communication module is configured to exchange information, decisions, and integrated control instructions between the on-board system of the vehicle and external systems (e.g., the infrastructure and components of the CADS external to the vehicle). The communication module provides communication functions and network support for the operation of the VIU modules. In some embodiments, the communication module provides wired and/or wireless communication for information sharing and exchange between a VIU (e.g., a vehicle including a VIU), the collaborative management system of the CADS, an Intelligent Road Infrastructure System (IRIS), and other vehicles. In some embodiments, the communication technologies include one or more of 4G, 5G, 6G, 7G, Dedicated Short Range Communication (DSRC), IEEE 802.11p, and/or cellular V2X (C-V2X) technologies.
In some embodiments, the information conversion module is configured to provide information encoding, information decoding, information translation, and/or information conversion for information exchange between the on-board subsystem (e.g., the VIU and/or the vehicle subsystem) and the external system (e.g., the CADS subsystem). In some embodiments, the information conversion module includes a built-in coding dictionary and a communication protocol. In some embodiments, the information conversion module includes, for example, the techniques described in U.S. Pat. App. Ser. No. 63/137,243, which is incorporated herein by reference. In some embodiments, the CADS provides automated driving for a plurality of heterogeneous Connected Automated Vehicles (CAVs), and an Intelligent Information Conversion System (IICS) provides a codebook configured to convert and/or translate information (e.g., by encoding and decoding) between the formats and standards used by different vehicles. In some embodiments, the CADS provides sensing, decision, and/or control instructions for a particular vehicle, and the IICS converts the sensing, decision, and/or control instructions into a format usable by the particular vehicle. In some embodiments, the particular vehicle provides perception information and/or other data to the CADS, and the IICS converts the perception information and/or other data into a format usable by the CADS. Thus, in some embodiments, the VIU of the CAV receives sensing, decision, and/or control instructions through the IICS to perform driving tasks, and the CADS receives perception information and/or other data from the VIU through the IICS to provide automated driving of the vehicle. In some embodiments, the CADS sends sensing, decision, and/or control instructions to the CAV's VIU over the IICS for autopilot, and the VIU sends perception information and/or other data to the CADS over the IICS.
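The coding-dictionary idea behind the IICS can be illustrated with a toy codebook: messages are encoded into a shared wire format and decoded into a vehicle-specific dialect. Every codebook entry and dialect name below is invented for illustration; the actual codebook is defined by the referenced application, not here.

```python
# Minimal sketch of a coding dictionary for the information conversion module:
# a shared codebook encodes instructions into a common wire format, and a
# per-vehicle dialect maps them back. All entries are hypothetical.
CODEBOOK = {
    "brake_hard": 0x01,
    "keep_lane": 0x02,
    "change_lane_left": 0x03,
}
DECODE = {v: k for k, v in CODEBOOK.items()}

def encode(instruction: str) -> int:
    """Encode a CADS instruction into the shared wire format."""
    return CODEBOOK[instruction]

def decode(code: int, vehicle_dialect: dict) -> str:
    """Decode a wire code into a vehicle-specific command string."""
    return vehicle_dialect[DECODE[code]]

# A hypothetical vehicle whose on-board system uses its own command names:
dialect = {"brake_hard": "BRK_FULL", "keep_lane": "LKA_HOLD", "change_lane_left": "LCX_L"}
wire = encode("brake_hard")
vehicle_cmd = decode(wire, dialect)
```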
The IICS translates information (e.g., by encoding and decoding) between the formats and standards used by all types of road vehicles and provides or facilitates intelligent allocation of functions, resources, and/or services for collaborative autonomous driving; improves the service level of the system; and/or provides a higher level of information, intelligence, and coordination capabilities.
In some embodiments, the information conversion module is configured to convert information (e.g., convert information formats and/or standards) between different roads (e.g., different road infrastructure logical and/or physical components), different vehicles, different communication protocols, different environments, and/or different communication conditions. Thus, in some embodiments, the IICS module receives, compiles, converts, and/or transmits information for exchange between subsystems of the CADS, such as on-board systems, other vehicles, IRIS or its subsystems (e.g., IRT or RIU), and other autopilot systems. In some embodiments, the information conversion module provides an information encoding function that encodes information and data for the autopilot tasks of a CAV using an encoding dictionary. In some embodiments, the information exchange includes uploading (e.g., by the CAV) the driving demand, driving information, and/or vehicle environment information of the CAV to the collaborative management system of the CADS. In some embodiments, the interaction subsystem receives information from other CADS subsystems and transfers the information to the VIU subsystem for the fusion of perception data and the collaborative decisions of the VIU. In some embodiments, the interaction subsystem uploads real-time performance data of autonomous driving to the collaborative management system of the CADS.
In some embodiments, the perception fusion module fuses the sensing and perception information of the on-board subsystem and the external system. In some embodiments, the perception fusion module outputs the fused result of vehicle and environment perception to the collaborative decision module of the VIU.
In some embodiments, the collaborative decision-making module is configured to provide decisions, path planning, and safety identification, and/or to output control instructions, based on the fused perception information.
In some embodiments, the high-precision map and positioning module is configured to load a high-precision map provided by the vehicle or the CADS and to perform positioning using the high-precision map, satellite navigation and networking, Internet of Things (IoT) devices, and/or geographic tags.
In some embodiments, the intelligent control command/assistance module is configured to coordinate the control commands provided by the vehicle CVCS and the VIU decision module to generate integrated control commands for the vehicle actuators.
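One way such coordination could look is a simple arbitration rule over the two command sources. The disclosure does not specify an arbitration rule; the "safer brake, averaged steering" policy and field names below are assumptions for illustration only.

```python
# Hedged sketch of the intelligent control command/assistance module
# coordinating CVCS and VIU control commands into one integrated command.
def integrate(cvcs_cmd: dict, viu_cmd: dict) -> dict:
    """Take the safer (larger) braking value and average the steering inputs."""
    return {
        "brake": max(cvcs_cmd["brake"], viu_cmd["brake"]),
        "steer_deg": (cvcs_cmd["steer_deg"] + viu_cmd["steer_deg"]) / 2,
    }

integrated = integrate({"brake": 0.2, "steer_deg": 4.0}, {"brake": 0.5, "steer_deg": 2.0})
```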
In some embodiments, the redundancy verification module is configured to verify and/or validate control instructions from the vehicle CVCS and external systems to improve the safety performance of the vehicle.
In some embodiments, the human-machine interaction module is configured to receive driver input, e.g., destination information, driving requests, and control instructions. In some embodiments, the human-machine interaction module displays external information and operating status to the driver. When driver intervention is required, the VIU prompts the driver to take control of the vehicle.
In some embodiments, the support module is configured to manage the power supply of each module. In some embodiments, the support module manages the power supply to each module to ensure that each module receives sufficient power to function. In some embodiments, the support module is configured to maintain the security of the communication network and the VIU system.
In some embodiments, the functions and modules in the VIU are flexibly configured to accomplish some or all of the autopilot functions depending on the CVCS and drive task requirements.
For example, in some embodiments, the VIU is installed on a vehicle with a level of intelligence defined by SAE of 1 to 5 (e.g., L1, L2, L3, L4, or L5, or V1, V2, V3, V4, or V5). The VIU enables the vehicle to work with IRIS at intelligent levels 1 to 5 (e.g., I1, I2, I3, I4, or I5) to provide support to CADS at intelligent levels 1 to 5 (e.g., S1, S2, S3, S4, or S5). In some embodiments, the VIU is configured to include different design configurations to provide the functional subsystems of different CADS.
In some embodiments, the VIU provides autopilot functionality and/or services for the CAV to perform various automated tasks in multiple collaborative autopilot systems and/or scenarios. For example, embodiments of VIU technology provide automated driving for different types of CAVs operating at different levels of automation, operating in various collaborative automation driving states, and/or performing various driving tasks for automated driving. In some embodiments, the VIU provides autopilot functionality and/or services for CAVs of various manufacturers, various brands, various families, various model years, and various platforms. In some embodiments, the VIU provides autopilot functionality and/or services for CAVs with a level of intelligence of 1 to 5 (e.g., L1, L2, L3, L4, or L5, or V1, V2, V3, V4, or V5), e.g., as defined by SAE. In some embodiments, the VIU provides autopilot functionality and/or services for collaborative autonomous driving of CAVs with authentication links to IRIS. In some embodiments, the VIU provides autopilot functionality and/or services for a variety of collaborative autonomous driving mission requirements, including autopilot in various driving scenarios.
In some embodiments, the perception fusion module receives and fuses information from the on-board subsystem and the external system. In particular, in some embodiments, the perception fusion module receives vehicle and/or environment information perceived by the vehicle and vehicle and/or environment information provided by the CADS collaborative management system, and fuses such information to provide fused vehicle and/or environment information and a data environment describing the vehicle and environment states. Thus, in some embodiments, the VIU perception fusion module fuses information from the on-board subsystem and the external system. Similarly, in some embodiments, a fusion module of the CVCS fuses information from the on-board subsystem and the external system.
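As a stand-in for the fusion step just described, a confidence-weighted average of an on-board and an externally provided estimate of the same quantity is sketched below. The weighting rule and the confidence values are illustrative assumptions; the disclosure does not prescribe a fusion algorithm.

```python
# Illustrative confidence-weighted fusion of an on-board and a CADS-provided
# estimate of the same quantity (e.g., distance to an obstacle).
def fuse_estimates(onboard_val, onboard_conf, external_val, external_conf):
    """Weighted average: higher-confidence sources contribute more."""
    total = onboard_conf + external_conf
    return (onboard_val * onboard_conf + external_val * external_conf) / total

fused_dist = fuse_estimates(onboard_val=24.0, onboard_conf=0.6,
                            external_val=28.0, external_conf=0.4)
```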
In some embodiments, the on-board sensing and perception of the vehicle is provided by the vehicle CVCS. In some embodiments, the on-board sensing and perception of the vehicle is provided by the sensing and perception module of the VIU, for example, when the sensing and perception functions of the CVCS are insufficient, inoperative, or malfunctioning. In some embodiments, the VIU perception fusion module receives sensing and perception information from external systems to calibrate the vehicle sensing and perception provided by the VIU and/or the CVCS. In some embodiments, when the sensing and perception functions of the vehicle CVCS fail, the VIU provides autopilot by using the sensing and perception information provided by (e.g., and received from) the CADS to supplement and/or correct the perception information, e.g., to maintain normal autopilot operation, maximize safety, and minimize and/or eliminate accidents.
In some embodiments, the collaborative decision process is completed by a VIU collaborative decision module. In some embodiments, the collaborative decision-making process is performed by a VIU collaborative decision-making module and/or by a decision-making module of the CVCS. Similarly, in some embodiments, the vehicle decision process for autonomous driving is performed by the vehicle CVCS and/or VIU. In some embodiments, the VIU provides a decision of the vehicle when the decision function of the vehicle CVCS is insufficient, inoperative, or malfunctioning. Thus, when the decision function of the vehicle CVCS fails, the VIU provides the redundant decision function to the vehicle by cooperating with the CADS. Thus, the collaborative decision module of the VIU cooperates with the CADS to provide driving decisions (e.g., based on perceived fusion results) to provide autopilot, e.g., maintain normal autopilot operation, maximize safety, and minimize and/or eliminate accidents. The VIU provides collaborative decision-making functionality based on the autonomous driving needs and the level of intelligence of the vehicle in which the VIU is installed.
In some embodiments, the intelligent control command/assistance module receives and uses the fused vehicle and/or environment information and data describing the vehicle and environment conditions. In some embodiments, the intelligent control command/assistance module receives driving decisions (e.g., collaborative driving decisions). The fused vehicle and/or environment information and data describing the vehicle and environment conditions are used to evaluate the feasibility and priority of the driving decision instructions. In some embodiments, the fused vehicle and/or environment information and data describing the vehicle and environment states are used to evaluate the feasibility and priority of driving decision instructions to prevent abnormal performance of automatic driving functions due to conflicts between the decisions of the CVCS and the decisions of the VIU. In some embodiments, the fused vehicle and/or environment information, the data describing the state of the vehicle and environment, and/or the driving decisions (e.g., collaborative driving decisions) are used to generate integrated vehicle control commands. In some embodiments, the intelligent control command/assistance module sends vehicle control commands to the vehicle actuators to control steering wheel rotation, powertrain acceleration or deceleration, brake system braking, and/or other vehicle systems (e.g., vehicle lights, etc.). In some embodiments, the fused vehicle and/or environment information, the data describing the state of the vehicle and environment, and/or the driving decisions (e.g., collaborative driving decisions) are used to improve vehicle drivability and/or to improve and/or maximize the safety and reliability of CAV autopilot.
In some embodiments, vehicle control instructions are reported to a collaborative management system of the CADS via a communication module, for example, to coordinate collaboration between the vehicle and the IRIS and/or to coordinate collaboration between the vehicle and subsystems (e.g., IRT, TCC, TCU, TOC and/or RIU) of the IRIS.
In some embodiments, the VIU provides correction or assistance to the CVCS sensing and perception functions. For example, when the sensing information processing module of the CVCS is insufficient, inoperative, or malfunctioning, the VIU perception fusion module uses information from the external system to generate fused perception results to correct and/or assist the vehicle perception information provided by the CVCS sensing information processing module. Further, in some embodiments, the in-vehicle sensors are connected to the in-vehicle sensor access and information processing module of the VIU to replace the CVCS sensing functionality provided by the CVCS sensing information processing module for vehicle sensing and environmental perception. When the positioning and navigation functions of the CVCS are insufficient, inoperative, or malfunctioning, the VIU high-precision map and positioning module provides positioning and navigation services and/or provides the vehicle with proper positioning and navigation information. In some embodiments, the VIU high-precision map and positioning module replaces the positioning and navigation functions of the CVCS when the positioning and navigation functions of the CVCS are insufficient, inoperative, or malfunctioning. When the decision module of the CVCS is insufficient, nonfunctional, or fails, the collaborative decision module of the VIU replaces the CVCS decision module to generate a driving decision according to the sensing information of the vehicle. When the human-machine interaction function of the CVCS is insufficient, nonfunctional, or fails, the human-machine interaction interface of the VIU exchanges information with the driver. The VIU support system helps the CVCS to recover sufficient power when the power support module of the CVCS is insufficient, inoperative, or fails.
In some embodiments, when a vehicle equipped with a VIU is connected to the CADS, the autopilot function is activated and/or provided to the vehicle. First, in some embodiments, the VIU provides a bridge configured to connect the CADS with the vehicle, and the VIU provides functionality for the vehicle to receive, implement, and execute driving instructions provided by the CADS' collaborative management system. The CADS' collaborative management system coordinates with IRIS, IRIS subsystems (e.g., IRT or RIU), and/or roadside infrastructure (e.g., traffic lights), and provides collaborative autopilot for the CAV using the vehicle driving information and control instructions sent to the CADS by the VIU. Second, in some embodiments, the vehicle (e.g., as part of a road traffic flow) collects perception information including road condition information (e.g., partial road condition information), wherein the perception information collected by the vehicle is transmitted to an IRIS or IRIS subsystem (e.g., IRT or RIU) to provide the road condition information to the CADS. In some embodiments, a first portion of the traffic information sent by a first vehicle to the CADS is used by the CADS to supplement a second portion of the traffic information sent by a second vehicle to the CADS. In some embodiments, multiple partial road information data sets are provided to and/or received by the CADS from multiple vehicles, and the CADS fuses the partial road information data sets to provide a complete road information data set. Third, in some embodiments, the VIU reports origin and destination information to the system map of the CADS and submits positioning and navigation requirements to the system map of the CADS. The CADS collaborative management system retrieves the road information of the specific route traveled by the vehicle according to the map information.
Fourth, in some embodiments, the awareness information collected by the on-board subsystems of the vehicle is communicated to the collaborative management system of the CADS, and the CADS shares the awareness information with other vehicles.
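The fusion of partial road-information data sets described in the steps above can be sketched as a union of per-segment reports. Segment identifiers and the "fresher report overwrites earlier" rule are illustrative assumptions, not part of the disclosure.

```python
# Sketch of the CADS fusing partial road-condition data sets from multiple
# vehicles into one complete data set. Hypothetical segment IDs and values.
def fuse_road_info(partial_sets: list) -> dict:
    """Union of per-segment reports; later (fresher) reports overwrite earlier ones."""
    complete = {}
    for report in partial_sets:
        complete.update(report)
    return complete

veh_a = {"seg_1": "clear", "seg_2": "congested"}   # first vehicle's partial view
veh_b = {"seg_2": "clearing", "seg_3": "icy"}      # second, fresher partial view
complete = fuse_road_info([veh_a, veh_b])
```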
In some embodiments, information sent by the VIU and/or the user to the CADS is shared by the CADS with one or more other users of the CADS, including cloud platforms, roadside infrastructure, communication devices, and other vehicles equipped with the VIU and connected to the CADS. The VIU and/or the user transmits information to the CADS via the communication network. In some embodiments, the CADS integrates and/or fuses information and data received by the CADS and sends the integrated and/or fused data to the user and/or the VIU over a communication network.
In some embodiments, the VIU system cooperates with the CVCS to provide autopilot for the CAV. For example, in some embodiments, the CVCS generates driving decisions and vehicle control instructions based on vehicle-perceived information, and the VIU generates driving decisions and vehicle control instructions based on environment information. The VIU intelligently integrates the vehicle control instructions generated by the CVCS and by the VIU. In some embodiments, the VIU provides a backup on-board unit to provide reliable autopilot functionality when the CVCS is insufficient, inoperative, or malfunctioning.
In some embodiments, the VIU supplements the autopilot function to provide autopilot in long tail scenarios. The corresponding autopilot functionality of the CVCS is supplemented when the vehicle encounters a long tail condition (e.g., a road accident, a special activity, a construction and/or work area, extreme and/or adverse weather, dangerous roads, unclear signs, signals, or geometric designs, high concentrations of pedestrians and/or bicycles, or a combination of any of the above conditions). For example, the perception fusion module fuses information from the on-board systems of the vehicle, other vehicles, IRIS or IRIS subsystems (e.g., IRT or RIU), and/or other systems, and sends the fused information to the collaborative decision module of the VIU. The collaborative decision module of the VIU generates a comprehensive decision instruction to complete the automatic driving task in the long tail scenario. In some embodiments, the VIU requests and/or retrieves resources from the CADS to generate the comprehensive decision instruction. Thus, long tail Operational Design Domain (ODD) risks can be effectively resolved using the VIU system.
In some embodiments, the VIU system enhances vehicle functions (e.g., sensing functions, prediction functions, planning functions, and vehicle control functions). Specifically, in some embodiments, the perception fusion module of the VIU fuses information from the on-board systems, external systems, collaborative management systems of the CADS, and/or IRIS subsystems (e.g., IRT or RIU) of the vehicle to enhance the perception and prediction capabilities of the vehicle including the VIU. In some embodiments, the collaborative decision-making module of the VIU cooperates with external systems of the CADS to provide enhanced planning functionality. In some embodiments, the intelligent control command/assist module combines the commands generated by the VIU and the commands generated by the CVCS to generate the integrated control command to provide enhanced vehicle control.
In some embodiments, the redundancy verification module of the VIU verifies that the transmission of information, driving decisions, and integrated control commands between the on-board system and the external system is error-free and/or error-corrected, and resolves contradictions between information types to improve and/or maximize the stability, reliability, and safety of the autopilot system.
In some embodiments, the redundancy verification module of the VIU verifies that the transmission of the driving decisions generated by the VIU is error-free and/or error-corrected and resolves the contradiction between information types to improve and/or maximize the stability, reliability, and safety of the automated driving system.
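One conventional way to verify that a transmission is error-free, as the redundancy verification module is described as doing, is to checksum each frame. CRC32 is an assumption here; the disclosure does not name a checksum or error-detection algorithm, and all names below are illustrative.

```python
# Hypothetical sketch of redundancy verification: append a checksum when an
# instruction is transmitted, verify it on receipt, and reject corruption.
import json
import zlib

def pack(instruction: dict) -> bytes:
    """Serialize an instruction and append a CRC32 checksum."""
    payload = json.dumps(instruction, sort_keys=True).encode()
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def unpack(frame: bytes) -> dict:
    """Verify the checksum and recover the instruction, or raise on corruption."""
    payload, checksum = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(payload) != checksum:
        raise ValueError("transmission error detected")
    return json.loads(payload)

frame = pack({"brake": 0.4})
ok = unpack(frame)                                  # verifies cleanly
corrupted = frame[:-1] + bytes([frame[-1] ^ 0xFF])  # flip a checksum byte
try:
    unpack(corrupted)
    detected = False
except ValueError:
    detected = True
```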
In some embodiments, the VIU system increases the level of intelligence of the vehicle by enhancing the automatic driving functions of the vehicle (e.g., by using the VIU perception fusion module, collaborative decision module, and/or intelligent control instruction/assistance module). In particular, embodiments provide that the level of intelligence of a vehicle at level 1 may be increased to level 2, 3, or 4 by the VIU perception fusion module and collaborative decision-making module providing additional perception functionality and making driving decisions for longitudinal and lateral control of the vehicle. Embodiments provide that the level of intelligence of a vehicle at level 2 may be increased to level 3, 4, or 5 by the VIU perception fusion module and collaborative decision-making module providing supplemental perception information for trajectory planning and detailed driving decisions. Embodiments provide that a vehicle of intelligence level 3 may be advanced to intelligence level 4 or 5 by the VIU perception fusion module providing additional perception and real-time monitoring of the driver, with the VIU collaborative decision module generating driving decisions in cooperation with other vehicles and IRIS or IRIS subsystems (e.g., IRTs or RIUs) in the CADS. Embodiments provide that a vehicle of intelligence level 4 may be elevated to intelligence level 5 by the VIU perception fusion module providing a greater range of environmental perception (e.g., providing perception beyond surrounding objects (e.g., buildings)) and thus more rational, safer driving decisions. Embodiments further provide that, by using additional perception information from the VIU and more predictive driving decisions, the safety level of a vehicle at intelligence level 5 may be enhanced and the cost of a vehicle at intelligence level 5 may be reduced.
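The level-uplift claims above reduce to a simple mapping: a vehicle at level L, assisted by the VIU, may reach any level up to the maximum the VIU's added capabilities support. The helper below is a trivial illustration of that mapping; the capability model is hypothetical.

```python
# Illustrative mapping of the intelligence-level uplift described above.
def uplifted_levels(current_level: int, viu_max_support: int = 5) -> list:
    """Levels reachable from current_level with VIU assistance (per the text)."""
    return list(range(current_level + 1, viu_max_support + 1))

reach_from_1 = uplifted_levels(1)   # a level 1 vehicle may reach levels 2-5
reach_from_4 = uplifted_levels(4)   # a level 4 vehicle may reach level 5
```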
In some embodiments, the VIU takes over the autopilot tasks when the CVCS fails (e.g., when the CVCS provides insufficient functionality, the CVCS is nonfunctional, and/or the CVCS fails). In the event of a failure of the CVCS, the vehicle sensor access and information processing module of the VIU accesses, collects, integrates, and/or generates vehicle-perceived information and sends the information to the VIU perception fusion module. In addition, the VIU communication module and information conversion module receive information from the external system and send the information from the external system to the perception fusion module. The perception fusion module fuses the information perceived by the vehicle with the information from the external system, provides fused perception information, and sends the fused perception information to the collaborative decision-making module. The collaborative decision module generates a driving decision instruction and sends the driving decision instruction to the intelligent control instruction/assistance module. The intelligent control instruction/assistance module generates overall vehicle control commands for the vehicle to perform the driving tasks.
In some embodiments, for example, for a CAV of level 1 intelligence (e.g., including CAVs equipped with partial driver assistance systems), the VIU cooperates with the CADS to provide supplemental, enhanced, and/or backup functionality for the vehicle. For example, the VIU receives perception information from the CADS system and/or the IRIS or IRIS subsystem (e.g., IRT or RIU) and provides supplemental driving assistance to the vehicle. The VIU also receives traffic control information (e.g., traffic signal timing control) from the CADS system and/or IRIS subsystems (e.g., IRT or RIU) and improves the planning and decision functions of the vehicle.
In some embodiments, for example, for a CAV of level 2 intelligence, the VIU cooperates with the CADS to provide further supplemental functionality to the vehicle. For example, the VIU receives more detailed awareness information and traffic information from CADS and/or IRIS subsystems (e.g., RIU) and provides improved functions of driver assistance for the vehicle.
In some embodiments, for example, for a CAV with an intelligent level of 3, the VIU cooperates with the CADS to provide further enhancements to the vehicle. For example, the VIU receives optimized awareness information and/or decision and control instructions from the CADS system and/or IRIS subsystems (e.g., RIU) to provide and/or improve planning and control tasks of the vehicle.
In some embodiments, for example, for a CAV with an intelligent level of 4, the VIU cooperates with the CADS and/or IRIS subsystems (e.g., RIU) to provide driving decisions and/or vehicle control instructions in long tail situations and/or to provide information describing specific areas or road segments to better enable automatic driving service coverage, e.g., to provide additional sensory information for vision-obscuring areas of high-rise buildings, etc.
In some embodiments, for example, for a CAV with an intelligent level of 5, the VIU cooperates with the CADS and/or IRIS subsystems (e.g., RIU) to provide further information supplements and enhancements, such as more accurate traffic predictions and more intelligent path selection.
In some embodiments, multiple functional modules of the VIU and the CVCS are fused to form parallel, sequential, and cross-architecture relationships in information processing (including perception fusion, intelligent decision-making, and integrated control). In some embodiments, after sensing, the CVCS sends its perception information to the VIU for fusion, and the fused information is used by other VIU modules. In some embodiments, the VIU and CVCS make decisions based on the fused perception and positioning information and output the decision results to the next module to improve decision capability. In some embodiments, the VIU and CVCS exchange control decisions, and the logic used to generate those decisions, with each other to improve decision capability.
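The three VIU/CVCS coupling patterns described above can be sketched schematically; every function name here is an illustrative stand-in, not an interface from the patent:

```python
def sequential(cvcs_sense, viu_fuse, frame):
    # Sequential: the CVCS senses first, then hands its result to VIU fusion.
    return viu_fuse([cvcs_sense(frame)])

def parallel(cvcs_sense, viu_sense, merge, frame):
    # Parallel: both systems process the same frame; results are merged downstream.
    return merge(cvcs_sense(frame), viu_sense(frame))

def cross(cvcs_decide, viu_decide, reconcile, state):
    # Cross: each side decides, then the decisions (and their logic) are exchanged.
    return reconcile(cvcs_decide(state), viu_decide(state))

# Toy usage with set-valued "perception results":
sense_a = lambda f: {f, "lane"}
sense_b = lambda f: {f, "pedestrian"}
merged = parallel(sense_a, sense_b, set.union, "car")
# merged == {"car", "lane", "pedestrian"}
```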
Automatic driving system
In some embodiments, the technology (e.g., the VIU) provides improvements to vehicle operation and control systems (e.g., CAVH systems and the techniques described herein). In some embodiments, the CAVH comprises one or more Roadside Intelligent Unit (RIU) networks; a Traffic Control Unit (TCU); a Traffic Control Center (TCC); a TCU/TCC network; a Vehicle Intelligent Unit (VIU) (e.g., a vehicle containing a VIU); and/or a Traffic Operation Center (TOC). In some embodiments, the system includes a variety of sensors and computing devices on the CAV and infrastructure (e.g., roadside infrastructure) and is configured to integrate sensing, prediction, planning, and control for the autonomous CAV.
In some embodiments, the technology relates to an ADS provided as a Connected and Automated Vehicle Highway (CAVH) system, e.g., including one or more components of an Intelligent Road Infrastructure System (IRIS) (see, e.g., U.S. Pat. Nos. 10,867,512 and 10,380,886, each of which is incorporated herein by reference). In some embodiments, the ADS is provided as or supports a Distributed Drive System (DDS), an intelligent roadside kit (IRT), and/or a device distribution system (DAS) (see U.S. Pat. App. Ser. Nos. 16/996,684; 63/004,551; and 63/004,564, each of which is incorporated herein by reference). In some embodiments, the term "roadside intelligent unit" and its abbreviation "RIU" are used to refer to the components named "roadside unit" and its abbreviation "RSU", respectively, as described in, for example, the patents for CAVH technology, U.S. Pat. Nos. 10,867,512 and 10,380,886, each of which is incorporated herein by reference.
The technology described herein relates to and improves upon VIU technology. In some embodiments, the term "vehicle intelligent unit" and its abbreviation "VIU" are used to refer to the components named "on-board unit" and its abbreviation "OBU", respectively, as described in, for example, the patents for CAVH technology, U.S. Pat. Nos. 10,867,512 and 10,380,886, each of which is incorporated herein by reference. In some embodiments, the term "vehicle intelligent unit" and its abbreviation "VIU" are used to refer to the components named "on-board intelligent unit" and its abbreviation "OIU", respectively, as described in U.S. Pat. App. No. 63/042620, incorporated herein by reference.
In some embodiments, the technology provides a system (e.g., a vehicle operation and control system including a RIU and/or RIU network; a TCU/TCC network; a vehicle including a vehicle intelligent unit; a TOC; and/or a cloud-based platform configured to provide information and computing services (see, e.g., U.S. Pat. App. Ser. No. 16/454,268, incorporated herein by reference)) configured to provide awareness functions, transport behavior prediction and management functions, planning and decision functions, and/or vehicle control functions.
In some embodiments, the RIU network includes a RIU subsystem. In some embodiments, the RIU subsystem includes a perception module configured to perceive characteristics of the driving environment; a communication module configured to communicate with the vehicle, the TCU, and the cloud; a data processing module for processing, fusing and calculating data from the sensing and/or communication module; an interface module for communicating between the data processing module and the communication module; and an adaptive power module configured to provide power and regulate power according to conditions of the local power grid. In some embodiments, the adaptive power supply module is configured to provide backup redundancy. In some embodiments, the communication module communicates using a wired or wireless medium.
In some embodiments, the sensing module includes a radar-based sensor. In some embodiments, the perception module includes a vision-based sensor. In some embodiments, the perception module includes a radar-based sensor and a vision-based sensor, wherein the vision-based sensor and the radar-based sensor are configured to perceive the driving environment and the vehicle attribute data. In some embodiments, the radar-based sensor is a lidar, a microwave radar, an ultrasonic radar, or a millimeter wave radar. In some embodiments, the vision-based sensor is a camera, an infrared camera, or a thermal camera. In some embodiments, the camera is a color camera.
In some embodiments, the perception module comprises a Global Navigation Satellite System (GNSS). In some embodiments, the perception module includes an inertial navigation system. In some embodiments, the perception module includes a satellite-based navigation system and an inertial navigation system, and the perception module and/or the inertial navigation system are configured to provide vehicle position data. In some embodiments, the global navigation satellite system is, for example, the Global Positioning System (GPS) developed in the United States, the Differential Global Positioning System (DGPS), the BeiDou Navigation Satellite System (BDS), the Russian GLONASS global navigation satellite system, the EU Galileo positioning system, the Indian navigation system (NavIC), or the Japanese Quasi-Zenith Satellite System (QZSS).
In some embodiments, the sensing module includes a vehicle identification device. In some embodiments, the vehicle identification device includes RFID, Bluetooth, Wi-Fi (IEEE 802.11), cellular network radio (e.g., 4G, 5G, 6G, or 7G cellular network radio), Dedicated Short-Range Communications (DSRC), or a C-V2X communication system.
In some embodiments, the RIU subsystem is deployed at a fixed location near a roadway that includes an automated lane and (optionally) a manual lane. In some embodiments, the RIU subsystem is deployed in a fixed location near the road infrastructure. In some embodiments, the RIU subsystem is deployed near a highway roadside, highway entrance ramp, highway exit ramp, overpass, intersection, bridge, tunnel, toll station, or on a strategically located drone. In some embodiments, the RIU subsystem is deployed on a mobile component. In some embodiments, the RIU subsystem is deployed on unmanned aerial vehicles (UAVs) and at strategic locations such as traffic congestion sites, traffic accident sites, highway construction sites, and/or extreme weather sites. In some embodiments, the RIU subsystem is positioned according to road geometry, traffic volume, vehicle types using the road, road size, and/or regional geographic location. In some embodiments, the RIU subsystem is mounted on a rack (e.g., an overhead component, such as one on which a highway sign or signal is mounted). In some embodiments, the RIU subsystem is mounted using a single-cantilever or double-cantilever mount.
In some embodiments, the TCC network is configured to provide traffic operation optimization, data processing, and archiving. In some embodiments, the TCC network includes a human operator interface. In some embodiments, the TCC network is a macroscopic TCC, regional TCC, or corridor TCC based on the geographic area covered by the TCC network. See, e.g., U.S. Pat. Nos. 10,380,886; 10,867,512; 10,692,365; and U.S. Pat. App. Pub. Nos. 20200005633 and 20200021961, each of which is incorporated herein by reference.
In some embodiments, the TCU network is configured to provide real-time vehicle control and data processing. In some embodiments, real-time vehicle control and data processing is automated based on pre-installed algorithms. In some embodiments, the TCU network includes a segment TCU or point TCU based on the geographic area covered by the TCU network. In some embodiments, the system includes a point TCU that is physically combined or integrated with the RIU. In some embodiments, the system includes a segment TCU that is physically combined or integrated with the RIU. See, e.g., U.S. Pat. Nos. 10,380,886; 10,867,512; 10,692,365; and U.S. Pat. App. Pub. Nos. 20200005633 and 20200021961, each of which is incorporated herein by reference.
In some embodiments, the TCC network includes a macroscopic TCC configured to process information from the regional TCCs and provide control targets to the regional TCCs; a regional TCC configured to process information from the corridor TCCs and provide control targets to the corridor TCCs; and a corridor TCC configured to process macroscopic TCC and segment TCU information and provide control targets to the segment TCUs. See, e.g., U.S. Pat. Nos. 10,380,886; 10,867,512; 10,692,365; and U.S. Pat. App. Pub. Nos. 20200005633 and 20200021961, each of which is incorporated herein by reference.
In some embodiments, the TCU network includes a segment TCU configured to process information from the corridor TCC and/or point TCUs and provide control targets to the point TCUs; and a point TCU configured to process information from the segment TCU and the RIU and provide vehicle-based control instructions (e.g., detailed and time-sensitive control instructions for a single vehicle) to the RIU. See, e.g., U.S. Pat. Nos. 10,380,886; 10,867,512; 10,692,365; and U.S. Pat. App. Pub. Nos. 20200005633 and 20200021961, each of which is incorporated herein by reference.
In some embodiments, the RIU network provides customized traffic information and control instructions (e.g., detailed and time-sensitive control instructions for individual vehicles) to the vehicles and receives vehicle-provided information.
In some embodiments, the TCC network includes one or more TCCs including a connection and data exchange module configured to provide data connection and exchange between TCCs. In some embodiments, the connection and data exchange module includes software components that provide data correction, data format conversion, firewall, encryption and decryption methods. In some embodiments, the TCC network includes one or more TCCs, including transmission and network modules configured to provide a communication method for data exchange between TCCs. In some embodiments, the transport and network module includes software components that provide access functionality and data conversion between different transport networks within the cloud platform. In some embodiments, the TCC network includes one or more TCCs that include a service management module configured to provide data storage, data searching, data analysis, information security, privacy protection, and network management functions. In some embodiments, the TCC network includes one or more TCCs including an application module configured to provide management and control of the TCC network. In some embodiments, the application module is configured to manage coordinated control of vehicles and roads, system monitoring, emergency services, and human-to-device interactions.
In some embodiments, the TCU network includes one or more TCUs including sensors and control modules configured to provide sensing and control functions of the RIU. In some embodiments, the sensor and control module is configured to provide sensing and control functions of radar, camera, RFID, and/or V2I (vehicle to infrastructure) devices. In some embodiments, the sensors and control modules include DSRC, GPS, 4G, 5G, 6G, 7G, and/or wireless (e.g., IEEE 802.11) radios. In some embodiments, the TCU network includes one or more TCUs including transport and network modules configured to provide communication network functions for data exchange between the automated vehicle and the RIU. In some embodiments, the TCU network includes one or more TCUs that include a service management module configured to provide data storage, data searching, data analysis, information security, privacy protection, and network management. In some embodiments, the TCU network includes one or more TCUs including application modules configured to provide management and control methods for the RIUs. In some embodiments, the management and control methods of the RIU include local collaborative control of vehicles and roads, system monitoring, and emergency services. In some embodiments, the TCC network includes one or more TCCs, the TCC network further includes an application module, and the service management module provides data analysis for the application module. In some embodiments, the TCU network includes one or more TCUs that further include application modules, and the service management module provides data analysis for the application modules.
In some embodiments, the TOC includes an interactive interface. In some embodiments, the interactive interface provides control and data exchange with the TCC network. In some embodiments, the interactive interface includes an information sharing interface and a vehicle control interface. In some embodiments, the information sharing interface includes an interface to share and obtain traffic data; an interface to share and obtain traffic events; an interface to share and obtain passenger demand patterns from shared mobility systems; an interface to dynamically adjust prices based on indications from the vehicle operation and control system; and/or an interface that allows a particular organization (e.g., a vehicle management office or police) to delete, alter, and/or share information. In some embodiments, the vehicle control interface includes an interface that allows the vehicle operation and control system to assume control of the vehicle; an interface that allows the vehicle to form a platoon with other vehicles; and/or an interface that allows a particular institution (e.g., a vehicle management office or police) to control the vehicle. In some embodiments, the traffic data includes vehicle density, vehicle speed, and/or vehicle trajectory. In some embodiments, the traffic data is provided by the vehicle operation and control system and/or other shared mobility systems. In some embodiments, traffic events include extreme conditions, major and/or minor accidents, and/or natural disasters. In some embodiments, the interface allows the vehicle operation and control system to control the vehicle in the event of traffic events, extreme weather, or road faults when the vehicle operation and control system and/or other shared mobility systems raise an alarm. In some embodiments, the interface allows vehicles to form platoons with other vehicles while traveling in the same dedicated automated-vehicle lane.
In some embodiments, the VIU includes a communication module configured to communicate with the RIU. In some embodiments, a VIU includes a communication module configured to communicate with another VIU. In some embodiments, the VIU includes a data acquisition module configured to collect data from external vehicle sensors and internal vehicle sensors and to monitor the vehicle status and the driver status. In some embodiments, the VIU includes a vehicle control module configured to execute control instructions for driving tasks. In some embodiments, the driving tasks include car following and/or lane changing. In some embodiments, the control instructions are received from the RIU. In some embodiments, the VIU is configured to control the vehicle using data received from the RIU. In some embodiments, the data received from the RIU includes vehicle control instructions (e.g., detailed and time-sensitive control instructions for a single vehicle); travel route and traffic information; and/or service information. In some embodiments, the vehicle control instructions include a longitudinal acceleration rate, a lateral acceleration rate, and/or a vehicle direction. In some embodiments, the travel route and traffic information includes traffic conditions, event locations, intersection locations, entrance locations, and/or exit locations. In some embodiments, the service information includes locations of fuel stations and/or locations of points of interest. In some embodiments, the VIU is configured to send data to the RIU. In some embodiments, the data sent to the RIU includes driver input data; driver status data; and/or vehicle condition data. In some embodiments, the driver input data includes travel origin, travel destination, desired travel time, and/or service requests. In some embodiments, the driver status data includes driver behavior, fatigue level, and/or driver distraction.
In some embodiments, the vehicle condition data includes a vehicle ID, a vehicle type, and/or data collected by the data acquisition module.
In some embodiments, the VIU is configured to collect data including vehicle engine status; vehicle speed; surrounding objects detected by the vehicle; and/or driver conditions. In some embodiments, the VIU is configured to assume control of the vehicle. In some embodiments, the VIU is configured to assume control of the vehicle upon failure of the automatic driving system. In some embodiments, the VIU is configured to assume control of the vehicle when the vehicle condition and/or traffic condition prevents the automatic driving system from driving the vehicle. In some embodiments, the vehicle condition and/or traffic condition is an adverse weather condition, a traffic accident, a system malfunction, and/or a communication malfunction.
Example
When driving an intelligent networked vehicle (CAV) equipped with a CVCS and a vehicle-mounted intelligent unit (VIU), the driver inputs destination information and selects an automatic driving function (e.g., automatic following, adaptive cruise control, automatic lane changing, or fully automatic driving mode) using the human-machine interaction module of the VIU (e.g., by speaking voice commands into a microphone, pressing function selection keys, or using a touch screen).
The following companies and vehicle platforms give examples of CVCSs that provide various levels of intelligence or automation:
Intelligence level 4: Google Waymo, GM Cruise, Argo AI, Amazon Zoox
Intelligence level 3: Audi A8 Traffic Jam Pilot, Honda Legend Traffic Jam Pilot, Cadillac CT5 Super Cruise, BMW Traffic Jam Assistant
Intelligence level 2: Tesla Autopilot for Model 3 or Tesla FSD
The VIU establishes a connection with a roadside server of an intelligent connected highway (CAH) equipped with IRIS, IRT, or RIU infrastructure or devices (e.g., roadside lidar, video cameras, radar, edge computing devices, roadside perception result generating devices, roadside communication devices, intelligent roadside devices with traffic control and operation functions, and/or intelligent traffic lights) through a communication network (e.g., a 4G, 5G, 6G, or 7G cellular network; dedicated short-range communications (DSRC); or C-V2X), and logs into the cloud system. The support module of the VIU maximizes and/or ensures the security and reliability of the communication network and user privacy and ensures a reliable and stable power supply for each module of the VIU. The communication module of the VIU uses 4G, 5G, 6G, and/or 7G cellular networks; dedicated short-range communications (DSRC); and/or C-V2X technology to communicate and/or interact with other vehicles equipped with a VIU, the IRIS, IRIS subsystems (e.g., IRT or RIU), and/or cloud systems.
The information conversion module of the VIU performs protocol conversion on information from different vehicle-road environments, encodes the information obtained by the communication module according to a built-in encoding dictionary (e.g., NRZ encoding, Manchester encoding, Miller encoding, matrix encoding, and the like), and sends the information to the relevant modules for processing.
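The encoding-dictionary step can be illustrated with one of the line codes listed above. Below is a hedged sketch of Manchester encoding under the IEEE 802.3 convention (0 encoded as a high-low pair, 1 as a low-high pair); the function names are illustrative, not from the patent:

```python
def manchester_encode(bits):
    """IEEE 802.3 convention: 0 -> (1, 0), 1 -> (0, 1)."""
    return [x for b in bits for x in ((0, 1) if b else (1, 0))]

def manchester_decode(symbols):
    """Invert the encoding; reject symbol pairs with no mid-bit transition."""
    out = []
    for i in range(0, len(symbols), 2):
        pair = (symbols[i], symbols[i + 1])
        if pair == (0, 1):
            out.append(1)
        elif pair == (1, 0):
            out.append(0)
        else:
            raise ValueError("invalid Manchester symbol pair: %r" % (pair,))
    return out

data = [1, 0, 1, 1]
assert manchester_decode(manchester_encode(data)) == data  # round-trips
```

Note that the opposite (G. E. Thomas) convention swaps the two pairs; a real codebook would pin down one convention per protocol entry.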
The driver selects an automatic driving function and inputs destination information through a man-machine interaction module. The high-precision map and location module of the VIU reads the high-precision map and location navigation information (e.g., a map containing information about vehicles, buildings, pedestrians, real-time geographic coordinates of the vehicles, geographic coordinates of the departure and destination, and navigation routes, etc.) from the cloud system by using a communication network (e.g., 4G, 5G, 6G, 7G cellular network; dedicated short-range communication technology DSRC or C-V2X, etc.), and sends the high-precision map and location navigation information to the perception fusion module.
The vehicle-mounted sensor access and information processing module of the VIU acquires data, including point cloud data collected by a lidar, distance data collected by a millimeter-wave radar or microwave radar, image data collected by the vehicle-mounted camera, and/or electrical signals from other vehicle sensors received via the CAN bus or Ethernet; after processing, the data and information are transmitted to the perception fusion module.
The perception fusion module of the VIU receives vehicle perception results (e.g., vehicle speed; acceleration and/or deceleration; vehicle location; presence of other vehicles, pedestrians, or obstacles in the vicinity of the vehicle; lane recognition; color recognition of traffic lights on the route; etc.) generated by the on-board control system over the CAN bus and receives environmental perception results from the CADS, IRIS, or IRIS subsystems (e.g., IRT or RIU), including, for example, weather conditions, road congestion data, the location and speed of other vehicles on the road, the presence of other vehicles to avoid by changing lanes or turning, the presence of pedestrians on the road, the status of traffic lights (e.g., color), the presence of high buildings blocking the view on one or both sides of the road, the presence and visibility of road signs (e.g., dirt, rain, or snow covering road signs), the presence of obstacles on the road, the presence of emergencies on the road, etc.
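A toy late-fusion sketch of the merging step above: combine vehicle-perceived and roadside-perceived object lists, deduplicating reports whose positions fall within a gating distance and keeping the higher-confidence one. The object schema, the gating distance, and the function name are all assumptions for illustration, not details from the patent:

```python
import math

def fuse_objects(vehicle_objs, roadside_objs, gate=2.0):
    """Merge two detection lists, deduplicating within `gate` meters."""
    fused = list(vehicle_objs)
    for r in roadside_objs:
        dup = next((v for v in fused
                    if math.dist(v["pos"], r["pos"]) < gate), None)
        if dup is None:
            fused.append(r)                   # roadside fills a vehicle blind spot
        elif r["conf"] > dup["conf"]:
            fused[fused.index(dup)] = r       # keep the more confident report
    return fused

vehicle = [{"pos": (0.0, 0.0), "conf": 0.9, "src": "vehicle"}]
roadside = [
    {"pos": (0.5, 0.0), "conf": 0.6, "src": "roadside"},   # same physical object
    {"pos": (50.0, 0.0), "conf": 0.8, "src": "roadside"},  # occluded from the vehicle
]
fused = fuse_objects(vehicle, roadside)
# fused has 2 objects; the nearby duplicate keeps the vehicle's higher-confidence report
```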
The perception fusion module of the VIU obtains high-precision map information from the high-precision map and positioning module. The high-precision map and positioning module provides information relevant to identifying the current driving scenario (e.g., a congested city, a busy intersection, or a long and/or busy highway); status information of the road outside the vehicle's perception range and/or outside the driver's line of sight; the influence of weather on traffic; the position of the vehicle on the road; real-time variation of speed; the navigation route; the countdown time of the traffic lights in the vehicle's forward direction; changes in the position and speed of other vehicles in the target lane when the vehicle changes lanes; predictions of dangerous situations; and the like.
The VIU uses the communication module to send the perception fusion results (e.g., including driving data of the vehicle and perceived information about the surrounding road) to the cloud platform of the CADS, which performs path planning, makes driving decisions, and uses the communication module to return the driving decision results to the VIU.
The collaborative decision module of the VIU receives the fusion results from the perceptual fusion module, determines drivable regions based on image recognition and other techniques, and plans routes. Based on the perceived information received from the external system, the collaborative decision-making module is also able to identify and predict changes in the driving environment and generate driving decisions (e.g., decisions and plans of driving routes and vehicle control commands for adjusting parameters of the engine and controlling actuators, such as accelerators, brakes, steering signals, etc.) that are sent to the intelligent control command/assistance module.
The intelligent control command/assistance module of the VIU collects decision instructions generated by the CVCS on-board control system through the CAN bus, combines them with the decision instructions generated by the VIU collaborative decision module and the fusion results to generate integrated control instructions, and sends the integrated control instructions through the CAN bus to the Electronic Control Unit (ECU) of each actuator to control the engine, brakes, steering actuators, signal lights, and other actuators, thereby realizing the automatic driving function. Meanwhile, the integrated control instructions are sent to the communication module and on to the CADS, IRIS, or IRIS subsystems (e.g., IRT or RIU) and the cloud platform via the communication network.
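One way the combination step could work is a per-channel arbitration between the CVCS command and the VIU command. The channels, the "more conservative value wins" policy, and the function name are illustrative assumptions, not the patent's specified method:

```python
def integrate(cvcs_cmd, viu_cmd):
    """Combine two command sets into one integrated control instruction."""
    return {
        "accel": min(cvcs_cmd["accel"], viu_cmd["accel"]),  # gentler acceleration wins
        "brake": max(cvcs_cmd["brake"], viu_cmd["brake"]),  # harder braking wins
        "steer": viu_cmd["steer"],  # defer to the VIU's fused-perception steering
    }

cmd = integrate(
    {"accel": 1.0, "brake": 0.0, "steer": 0.10},  # from the CVCS via the CAN bus
    {"accel": 0.5, "brake": 0.2, "steer": 0.05},  # from the VIU collaborative decision module
)
# cmd == {"accel": 0.5, "brake": 0.2, "steer": 0.05}
```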
The redundancy verification module of the VIU provides information interaction for other modules, for example, by reading and identifying redundancy check codes in the interaction data, and ensures accuracy of data transmission.
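The redundancy-check idea above can be sketched as follows; `zlib.crc32` stands in for whatever redundancy check code an implementation would choose, and the `pack`/`unpack` names and JSON framing are illustrative assumptions:

```python
import json
import zlib

def pack(payload: dict) -> bytes:
    """Serialize a message and append a 4-byte CRC-32 redundancy check code."""
    body = json.dumps(payload, sort_keys=True).encode()
    return body + zlib.crc32(body).to_bytes(4, "big")

def unpack(frame: bytes) -> dict:
    """Verify the redundancy check code before accepting the message."""
    body, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    if zlib.crc32(body) != crc:
        raise ValueError("redundancy check failed: data corrupted in transit")
    return json.loads(body)

msg = {"accel": 0.5, "brake": 0.0}
assert unpack(pack(msg)) == msg  # clean round trip
```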
The human-machine interaction module of the VIU uses image and voice generation techniques to display the fusion results, the VIU operating state, the vehicle driving state, and other information to the driver; it uses voice recognition techniques to collect the driver's voice commands and facial and motion recognition techniques to determine the driver's state.
All publications and patents mentioned in the above specification are herein incorporated by reference. Various modifications and variations of the described compositions, methods, and technical uses can be made by those skilled in the art without departing from the scope and spirit of the described technology. While the technology has been described in connection with specific exemplary embodiments, it should be understood that the invention as claimed should not be unduly limited to such specific embodiments. Indeed, various modifications of the described modes for carrying out the invention which are obvious to those skilled in the art are intended to be covered by the claims.

Claims (92)

1. An intelligent on-board unit (VIU) comprising one or more of:
a) A vehicle-mounted sensor access and information processing module;
b) A communication module;
c) An information conversion module;
d) A perception fusion module;
e) A collaborative decision-making module;
f) A high-precision map and positioning module;
g) An intelligent control command/auxiliary module;
h) A redundancy verification module;
i) A man-machine interaction module; and/or
j) A support module.
2. The VIU of claim 1 wherein: the VIU is installed in a vehicle and provides some or all of the autopilot functionality for the vehicle.
3. The VIU of claim 1 wherein the in-vehicle sensor access and information processing module is characterized by: information collected by the in-vehicle sensors is received, information collected by the in-vehicle sensors is processed, and/or information processing functions of a Conventional Vehicle Control System (CVCS) are replaced.
4. The VIU of claim 3 wherein the in-vehicle sensor access and information processing module is characterized by: the in-vehicle sensor access and information processing module replaces the information processing function of the CVCS when the CVCS cannot function and/or fails.
5. The communication module of claim 1, wherein: the communication module provides information interaction between the on-board system and external systems, information interaction between the VIU and the CVCS, and communication between VIU subsystems and/or VIU modules.
6. The information conversion module of the VIU of claim 1, wherein: the information conversion module manages information exchange between the vehicle-mounted system and external systems.
7. The information conversion module of the VIU of claim 1, wherein: including codebooks and communication protocols.
8. The information conversion module of the VIU of claim 1, wherein: managing communications between entities having different data format standards and/or communication protocols.
9. The information conversion module of the VIU of claim 1, wherein: the information conversion module manages communications between one or more vehicles equipped with a VIU, an Intelligent Road Infrastructure System (IRIS), and/or a collaborative management subsystem of a collaborative automatic driving system (CADS).
10. The perception fusion module of the VIU of claim 1, wherein: the perception information provided by the vehicle subsystem and the perception information provided by the external system are fused to provide fused perception information.
11. The perception fusion module of the VIU of claim 1, wherein: the perception fusion module outputs the fused perception information and/or self-cognition and environment perception information to the collaborative decision module.
12. The collaborative decision module of claim 1, wherein: the collaborative decision module receives the fused perception information and uses it for decision-making, path planning, safety identification, and/or generating vehicle control instructions.
13. The high-precision map and positioning module of claim 1, wherein: the high-precision map data provided by the CADS is provided to the VIU.
14. The high-precision map and positioning module of claim 1, wherein: positioning is provided using high-precision maps, satellite navigation and satellite networks, Internet of Things devices, and/or geographic tags.
15. The intelligent control command/assist module of the VIU of claim 1 wherein: the vehicle control output generated by the CVCS and the vehicle control output generated by the VIU are coordinated to generate a comprehensive control instruction for controlling the vehicle.
16. The vehicle control output of the VIU of claim 15 wherein: the vehicle control generated by the VIU is output from the decision module of the VIU.
17. The redundancy verification module of claim 1, wherein: the control instructions are validated to improve and/or maximize the safety of the vehicle.
18. The control instructions of claim 17, wherein: the control instructions include control instructions provided by the on-board system and/or control instructions provided by an external system.
19. The human-machine interaction module of claim 1, wherein: inputs from the driver are received, and information describing the environment external to the vehicle and the operating conditions of the vehicle is output.
20. The input to the VIU of claim 19 wherein: the input includes destination information, driving requirements, and/or control instructions.
21. The human-machine interaction module of claim 19, wherein: the driver is prompted to assume control of the vehicle.
22. The support module of the VIU of claim 1 wherein: power is provided to the VIU subsystem and/or modules and system security is maintained.
23. The VIU of claim 1 wherein: a combination of modules is included and provides some or all of the autopilot functionality depending on the CVCS and driving mission requirements.
24. The VIU of claim 1 wherein: the VIU is installed in a vehicle and is configured as a subsystem of the CADS.
25. The VIU of claim 24 wherein: the VIU implements and executes CADS functionality for the vehicle.
26. The VIU of claim 24 wherein: the vehicle performs an autonomous driving task with a level of intelligence of 1, 2, 3, 4 and/or 5.
27. The information conversion module of claim 24, wherein: the information conversion module manages information interaction between the CADS and the vehicle.
28. The VIU of claim 24 wherein: the CADS receives and processes sensory data describing the vehicle and the driving environment of the vehicle, and the CADS provides vehicle control instructions for the vehicle.
29. The VIU of claim 1 wherein: vehicles of level 1, 2, 3, 4, or 5 are enabled to cooperate with IRIS of level 1, 2, 3, 4, or 5 to provide CADS of level 1, 2, 3, 4, or 5.
30. The VIU of claim 1 wherein: collaborative automated driving tasks are facilitated for a plurality of vehicles, including vehicles with different intelligence levels, different brands and/or manufacturers, different vehicle models and corresponding model years, and/or different platforms.
31. The communication module of claim 1, wherein: both wired and wireless communication are provided.
32. The communication module of claim 1, wherein: information sharing and information interaction are provided among the VIU-equipped vehicle, the collaborative management system of the CADS, the IRIS or an IRIS subsystem, and other vehicles.
33. The VIU of claim 32 wherein: the IRIS subsystem is a roadside intelligent unit (RIU) or an intelligent roadside toolkit (IRT).
34. The VIU communication module of claim 31 wherein: communication uses 4G, 5G, 6G, or 7G cellular networks; Dedicated Short-Range Communications (DSRC); and/or C-V2X technology.
35. The communication module of claim 1, wherein: information is exchanged with the collaborative management system of the CADS, the IRIS or an IRIS subsystem, and/or other vehicles through the information conversion module.
36. The VIU of claim 35 wherein: the IRIS subsystem is an RIU or IRT.
37. The VIU of claim 35 wherein: the VIU communicates with the CADS' collaborative management system, IRIS or IRIS subsystem, and/or other vehicles to provide communication for automated driving tasks.
38. The information conversion module of claim 35, wherein: an information encoding function is provided to encode automated driving task data and information using a codebook.
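For illustration only, the codebook-based encoding of claim 38 might be sketched as follows; the message types, byte codes, and framing below are hypothetical assumptions, not values defined by the claims:

```python
# Minimal sketch of codebook-based message encoding for automated
# driving task data. The codebook entries are illustrative examples.
CODEBOOK = {
    "DRIVING_REQUIREMENT": 0x01,
    "VEHICLE_STATUS": 0x02,
    "ENVIRONMENT_INFO": 0x03,
}
DECODE = {code: name for name, code in CODEBOOK.items()}

def encode_message(msg_type: str, payload: bytes) -> bytes:
    """Prefix the payload with its one-byte codebook code."""
    return bytes([CODEBOOK[msg_type]]) + payload

def decode_message(frame: bytes) -> tuple[str, bytes]:
    """Split a frame back into (message type, payload)."""
    return DECODE[frame[0]], frame[1:]
```

A shared codebook of this kind keeps over-the-air frames compact, which matters for the bandwidth-constrained V2X links named in claim 34.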
39. The information conversion module of claim 35, wherein: an information exchange function is provided to transmit driving requirements, driving information, vehicle environment information, and/or the real-time status of automated driving to the collaborative management system of the CADS; and data and information are received from other modules of the VIU for use in perception data fusion and collaborative decision-making.
40. The perception fusion module of the VIU of claim 1, wherein: perception data and information are received from the vehicle and external systems; the perception data and information are fused; and a perception function is provided.
41. The VIU of claim 40 wherein: the perception data and information from the vehicle and external systems include:
a) High Definition (HD) map information;
b) Traffic information;
c) Driving information from surrounding vehicles;
d) Route planning information; and/or
e) Driving decision instructions.
42. The perception fusion module of the VIU of claim 1, wherein: resources are obtained from external systems to provide enhanced perception functions to the vehicle.
43. The enhanced perception function of the VIU of claim 42, wherein: longitudinal and/or lateral trajectory planning and control is supported for vehicles with intelligence level 1.
44. The perception fusion module of the VIU of claim 1, wherein: information is sent to, and/or resources are obtained from, the CADS to provide supplemental perception functions to the vehicle.
45. The perception fusion module of claim 44, wherein: the information is sent to the collaborative decision-making module of the VIU.
46. The supplemental perception function of claim 44, wherein: it is provided to a vehicle with intelligence level 2.
47. The perception fusion module of the VIU of claim 1, wherein: operation of a vehicle with intelligence level 3 is facilitated by taking over driving decisions in place of the human driver.
48. The perception fusion module of claim 47, wherein: resources are obtained from the CADS and additional real-time perception and monitoring of the driver is performed.
49. The perception fusion module of claim 47, wherein: the information is transmitted to the collaborative decision-making module of the VIU; a perception result is provided to the VIU; and whether the VIU should take over driving decisions in place of the human driver is determined from the perception result.
50. The perception fusion module of the VIU of claim 1, wherein: the operational risk of a vehicle with intelligence level 4 in long-tail scenarios is addressed by obtaining resources from the CADS and providing perception information.
51. The perception fusion module of the VIU of claim 1, wherein: operation of a vehicle with intelligence level 5 is supported by providing improved dynamic high-definition maps, a greater range of environmental perception, route planning information, driving decisions, and improved perception.
52. The VIU of claim 51 wherein: the VIU reduces the research and development time and cost of vehicles with intelligence level 5.
53. The collaborative decision-making module of claim 1, wherein: it cooperates with the CADS to generate a fusion result and collaborative decision instructions.
54. The VIU of claim 53 wherein: the CADS provides external awareness, decision making, and vehicle control information and functions.
55. The collaborative decision-making module of claim 1, wherein: for a vehicle with intelligence level 1, decisions are generated to support longitudinal and/or lateral vehicle control, providing the vehicle with partial automated driving functions.
56. The collaborative decision-making module of claim 1, wherein: for a vehicle with intelligence level 2, trajectory planning decisions and detailed driving decisions are provided; and information from the vehicle is used to send a decision to take over from the driver.
57. The collaborative decision-making module of claim 1, wherein: for a vehicle with intelligence level 3, it cooperates with external systems to generate driving decisions for the vehicle.
58. The driving decisions of claim 57, wherein: the decisions take over and replace the driving decisions of the human driver.
59. The VIU of claim 58 wherein: the VIU takes over from the human driver in response to a request from the Conventional Vehicle Control System (CVCS) of the vehicle and generates vehicle control instructions.
60. The VIU of claim 59 wherein: the CVCS uses the perception fusion result to make a request to take over from the human driver.
61. The VIU of claim 57 wherein: the VIU determines that it cannot take over driving decisions from the human driver, and prompts the driver to assume control of the vehicle, monitors the status and/or driving of the human driver, responds to emergencies, and/or provides vehicle control to assist the human driver in controlling the vehicle.
62. The collaborative decision-making module of claim 1, wherein: decisions are generated in cooperation with external systems to address the long-tail-scenario driving problems of vehicles with intelligence level 4.
63. The collaborative decision-making module of claim 62, wherein: resources are received from the CADS to increase the safety of driving decisions.
64. The collaborative decision-making module of claim 63, wherein: resources are received from the CADS to reduce long-tail risk and extend operational design domains (ODDs).
65. The collaborative decision-making module of claim 1, wherein: for vehicles with intelligence level 5, predictive decisions and trajectory planning are enhanced based on the perception results.
66. The intelligent control command/assist module of the VIU of claim 1 wherein: the VIU decision instructions and the CVCS decision instructions are fused.
67. The VIU of claim 1 wherein: part or all of the CADS automated driving functions are extended to a vehicle equipped with a VIU by executing CADS system instructions.
68. The VIU of claim 1 wherein: road and traffic information is provided to the vehicle equipped with the VIU.
69. The VIU of claim 1 wherein: when the VIU sends origin and destination information to the CADS, positioning and navigation requests are provided to the system map of the CADS.
70. The VIU of claim 1 wherein: when a vehicle equipped with a VIU is connected to a CADS, information is transmitted to and shared with the CADS.
71. The shared information of the VIU of claim 70, wherein: the information is shared by the CADS and users of the CADS.
72. The VIU of claim 71 wherein: the users of the CADS include cloud platforms, IRIS subsystems, road side infrastructures, communication devices, or vehicles equipped with VIUs and connected to the CADS.
73. The VIU of claim 1 wherein: the automated driving functions of the CVCS of the vehicle are supplemented, enhanced, backed up, boosted, and/or replaced.
74. The VIU of claim 1 wherein: the VIU cooperates with the vehicle CVCS to supplement, enhance, back up, boost, and/or replace the automated driving functions of the vehicle CVCS.
75. The VIU of claim 73 wherein: the automated driving functions of vehicles with level 0, 1, 2, 3, 4, or 5 intelligence traveling on roads with level 1, 2, 3, 4, or 5 intelligence are supplemented, enhanced, backed up, boosted, and/or replaced.
76. The VIU of claim 1 wherein: the automated driving functions of the CVCS are supplemented to provide automated driving services for vehicles in long-tail scenarios, including accidents; incidents; construction and/or work zones; extreme and/or adverse weather; hazardous roads; unclear road markings, signs, and/or geometric designs; and/or scenarios with high concentrations of pedestrians and/or bicycles.
77. The VIU of claim 76 wherein: the perception fusion module and collaborative decision-making module of the VIU supplement the automated driving functions of the CVCS with perception information, decisions, and vehicle control instructions provided by the CADS, CADS subsystems, the IRIS, RIUs, IRTs, and/or roadside infrastructure.
78. The VIU of claim 1 wherein: a method of performing perception, prediction, planning, and control functions to enhance the CVCS comprises:
a) the perception fusion module of the VIU fuses perception data and information to enhance the perception and prediction capabilities of the CVCS;
b) the collaborative decision-making module of the VIU cooperates with the CADS to enhance the planning capability of the CVCS;
c) the intelligent control command/assist module of the VIU fuses commands from the VIU and the CVCS to generate comprehensive control commands, enhancing the vehicle control capability of the CVCS.
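Step (c) of claim 78 can be sketched as follows; the weighted-average rule and field names are illustrative assumptions, since the claims do not specify a fusion formula:

```python
# Hedged sketch of fusing VIU and CVCS control commands into one
# comprehensive control instruction. A fixed-weight blend of the
# numeric fields shared by both commands is one simple possibility.
def fuse_commands(viu_cmd: dict, cvcs_cmd: dict, viu_weight: float = 0.5) -> dict:
    """Blend numeric control fields (e.g. steering, throttle) from both sources."""
    w = viu_weight
    return {
        key: w * viu_cmd[key] + (1.0 - w) * cvcs_cmd[key]
        for key in viu_cmd.keys() & cvcs_cmd.keys()
    }
```

In practice the weight could itself be set by the redundant validation module, shifting authority toward whichever source is currently trusted.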
79. The redundant validation module of claim 1, wherein: a backup function for the CVCS is provided to eliminate and/or minimize errors and resolve contradictions in information processing and transmission.
80. The redundant validation module of claim 1, wherein: errors are eliminated and/or minimized, contradictions are resolved, and/or the following are verified:
a) perception information, decision information, and control instructions provided by the on-board system and external systems;
b) driving decisions generated by the CVCS; and/or
c) driving decisions generated by the VIU.
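The cross-checking idea behind the redundant validation module might be sketched as follows; the tolerance and the fall-back-to-VIU tie-breaking rule are illustrative assumptions, not behavior specified by the claims:

```python
# Sketch of redundant validation: compare driving decisions produced
# independently by the CVCS and the VIU, and flag contradictions.
def validate_decisions(cvcs_decision: float, viu_decision: float,
                       tolerance: float = 0.1) -> tuple[float, bool]:
    """Return (validated value, consistent?).

    When the two sources agree within the tolerance, their average is
    used; otherwise this sketch falls back to the VIU decision and
    reports the contradiction for the safety system to handle.
    """
    if abs(cvcs_decision - viu_decision) <= tolerance:
        return (cvcs_decision + viu_decision) / 2.0, True
    return viu_decision, False
```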
81. The VIU of claim 1 wherein: the VIU cooperates with the vehicle CVCS to provide the automated driving functions of the vehicle, including:
a) the CVCS generates driving decisions and control instructions;
b) the VIU generates driving decisions and control instructions; and
c) the VIU fuses the driving decisions and/or control instructions from the CVCS and the VIU.
82. The VIU of claim 81 wherein: the VIU provides a redundant on-board unit for the vehicle to provide stable automated driving functions for the vehicle.
83. The VIU of claim 81 wherein: the CVCS generates driving decisions and control instructions in response to unexpected traffic conditions.
84. The VIU of claim 81 wherein: the VIU cooperates with the CADS or its subsystems, the IRIS, RIUs, IRTs, and/or roadside infrastructure to generate driving decisions and control instructions.
85. The VIU of claim 1 wherein: when a module in the CVCS fails or malfunctions, the corresponding module in the VIU system replaces the function of the failed module in the CVCS.
86. The VIU of claim 1 wherein: the intelligence level of the vehicle is raised by upgrading the automated driving functions of the vehicle using the perception fusion module and collaborative decision-making module of the VIU.
87. The VIU of claim 86 wherein:
a) the intelligence level of a vehicle with intelligence level 1 is raised to level 2, 3, 4, or 5;
b) the intelligence level of a vehicle with intelligence level 2 is raised to level 3, 4, or 5;
c) the intelligence level of a vehicle with intelligence level 3 is raised to level 4 or 5;
d) the intelligence level of a vehicle with intelligence level 4 is raised to level 5; and/or
e) the safety level of a vehicle with intelligence level 5 is increased and/or the cost of the vehicle is reduced.
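The upgrade rules of claim 87 reduce to a compact mapping, sketched here for illustration only (the function name and error handling are assumptions):

```python
# Compact encoding of claim 87: a VIU can raise a vehicle's
# intelligence level toward level 5; level-5 vehicles instead gain
# safety and/or cost benefits, so they have no upgrade targets.
def upgrade_targets(level: int) -> list[int]:
    """Possible target intelligence levels for a vehicle at `level`."""
    if not 1 <= level <= 5:
        raise ValueError("intelligence level must be 1-5")
    return list(range(level + 1, 6))  # empty list for level 5
```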
88. The VIU of claim 1 wherein: when the CVCS fails or malfunctions, the VIU replaces part or all of the automated driving tasks of the CVCS, wherein:
a) the on-board sensor access and information processing module generates perception information and sends it to the perception fusion module;
b) the communication module and the information conversion module receive external information and send it to the perception fusion module;
c) the perception fusion module generates a perception fusion result and sends it to the collaborative decision-making module;
d) the collaborative decision-making module generates decision instructions and sends them to the intelligent control command/assist module to generate comprehensive control instructions for the driving task of the vehicle.
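The failover data flow of claim 88 (sensor access → perception fusion → collaborative decision → comprehensive control instruction) can be stubbed end-to-end as follows; all function and field names are assumptions, and each stage is trivially simplified:

```python
# Sketch of the claim-88 failover pipeline, with each module stubbed.
def viu_failover_pipeline(raw_sensors: dict, external_info: dict) -> dict:
    # a) on-board sensor access and information processing
    perception = {"obstacles": raw_sensors.get("obstacles", [])}
    # b) communication / information conversion supply external data
    perception["traffic"] = external_info.get("traffic")
    # c) perception fusion produces a fused result
    fused = {"clear": len(perception["obstacles"]) == 0, **perception}
    # d) collaborative decision -> intelligent control command/assist
    decision = "proceed" if fused["clear"] else "brake"
    return {"decision": decision, "fused": fused}
```

The point of the sketch is the staged hand-off: each module's output is the next module's input, so the VIU chain can stand in for the CVCS when the latter fails.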
89. The VIU of claim 1 wherein: parallel, serial, and cross architectural relationships are formed with the CVCS for information processing.
90. The information processing of the VIU of claim 89, wherein: it includes perception fusion information, intelligent decision information, and vehicle control information.
91. The VIU of claim 89 wherein: the parallel, serial, and cross architectural relationships formed with the CVCS for information processing include integrating and/or fusing the functional modules of the VIU and the CVCS.
92. The VIU of claim 89 wherein: the VIU and the CVCS share information, data, and/or resources to provide the VIU's supplementing, enhancing, backing-up, boosting, and replacing functions.
CN202211626476.7A 2022-12-16 2022-12-16 Intelligent vehicle-mounted unit for serving cooperative automatic driving of vehicle and road Pending CN116778734A (en)

Publications (1)

Publication Number Publication Date
CN116778734A true CN116778734A (en) 2023-09-19

Family

ID=87995164


Similar Documents

Publication Publication Date Title
US20230056023A1 (en) Vehicle-road driving intelligence allocation
Kuru et al. A framework for the synergistic integration of fully autonomous ground vehicles with smart city
US20220114885A1 (en) Coordinated control for automated driving on connected automated highways
US20210163021A1 (en) Redundancy in autonomous vehicles
US20220332337A1 (en) Vehicle intelligent unit
WO2020164021A1 (en) Driving control method and apparatus, device, medium, and system
US20210394797A1 (en) Function allocation for automated driving systems
CN113496602B (en) Intelligent roadside tool box
US20220219731A1 (en) Intelligent information conversion for automatic driving
US20220270476A1 (en) Collaborative automated driving system
CN110606070B (en) Intelligent driving vehicle and braking method thereof, vehicle-mounted equipment and storage medium
US20210314752A1 (en) Device allocation system
CN110568847B (en) Intelligent control system and method for vehicle, vehicle-mounted equipment and storage medium
US20220111858A1 (en) Function allocation for automated driving systems
US20230377461A1 (en) Distributed driving systems and methods for automated vehicles
CN110562269A (en) Method for processing fault of intelligent driving vehicle, vehicle-mounted equipment and storage medium
US20220171400A1 (en) Systematic intelligent system
US20220032934A1 (en) Method, apparatus, device and system for controlling driving
CN114407915A (en) Method and device for processing operation design domain ODD and storage medium
Park et al. Glossary of connected and automated vehicle terms
US20220281484A1 (en) Mobile intelligent road infrastructure system
CN110908367A (en) SCSTSV-based intelligent networking automobile computing platform
CN116778734A (en) Intelligent vehicle-mounted unit for serving cooperative automatic driving of vehicle and road
CN113272195A (en) Control system and control method for intelligent networked vehicle
CN117087695A (en) Collaborative autopilot system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination