WO2020057105A1 - Method, apparatus, medium, and system for controlling autonomous driving of a vehicle

Method, apparatus, medium, and system for controlling autonomous driving of a vehicle

Info

Publication number
WO2020057105A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
environment
perception
driving
perception result
Prior art date
Application number
PCT/CN2019/081607
Other languages
English (en)
French (fr)
Inventor
陶吉
夏添
胡星
Original Assignee
Baidu Online Network Technology (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology (Beijing) Co., Ltd.
Priority to US 17/042,747 (published as US20210024095A1)
Publication of WO2020057105A1


Classifications

    • G05D 1/021 - Control of position or course in two dimensions, specially adapted to land vehicles
    • G05D 1/02 - Control of position or course in two dimensions
    • G05B 19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers, using digital processors
    • B60W 60/0011 - Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • B60W 60/0027 - Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • G08G 1/017 - Detecting movement of traffic to be counted or controlled: identifying vehicles
    • G08G 1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G08G 1/163 - Anti-collision systems: decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G 1/164 - Anti-collision systems: centralised systems, e.g. external to vehicles
    • G08G 1/166 - Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60W 2554/4041 - Input parameters relating to dynamic objects: position
    • B60W 2554/4046 - Input parameters relating to dynamic objects: behavior, e.g. aggressive or erratic
    • B60W 2556/55 - External transmission of data to or from the vehicle using telemetry

Definitions

  • Embodiments of the present disclosure mainly relate to the field of off-vehicle interaction, and more particularly, to a method, an apparatus, a device, a computer-readable storage medium, and a vehicle-road collaboration system for controlling the automatic driving of a vehicle.
  • Autonomous driving is also known as driverless driving.
  • The basis of autonomous driving technology is perception of the environment around the vehicle, that is, identifying the specific conditions of the surrounding environment. Only once the environment has been perceived can the driving behavior that the vehicle can perform in the current environment be determined, and the vehicle then be controlled to carry out that behavior.
  • the vehicle itself is required to be able to sense the surrounding environment, so the vehicle needs to be equipped with various sensor devices, such as lidar.
  • Such sensor devices have high manufacturing and maintenance costs, and cannot be reused when the vehicle is replaced.
  • the high requirements for the vehicle's own perception ability make it impossible to easily and inexpensively upgrade non-autonomous vehicles or vehicles with weaker autonomous driving capabilities to vehicles with stronger autonomous driving capabilities.
  • a scheme for controlling automatic driving of a vehicle is provided.
  • A method for controlling automatic driving of a vehicle is provided. The method includes: obtaining an environment perception result related to the environment around the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and indicating related information of a plurality of objects in the environment; determining an outside-vehicle perception result by excluding the self-vehicle perception result corresponding to the vehicle from the environment perception result; and controlling the driving behavior of the vehicle based at least on the outside-vehicle perception result.
  • An apparatus for controlling automatic driving of a vehicle is provided. The apparatus includes: a communication module configured to obtain an environment perception result related to the environment around the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and indicating related information of a plurality of objects in the environment; an information processing module configured to determine an outside-vehicle perception result by excluding the self-vehicle perception result corresponding to the vehicle from the environment perception result; and a driving control module configured to control the driving behavior of the vehicle based at least on the outside-vehicle perception result.
  • A device is provided, including one or more processors and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to the first aspect of the present disclosure.
  • a computer-readable storage medium having stored thereon a computer program that, when executed by a processor, implements a method according to the first aspect of the present disclosure.
  • A vehicle-road coordination system is provided, including: a vehicle-side control device including the apparatus according to the second aspect; at least one sensor arranged in the environment and independent of the vehicle, configured to collect perception information related to the environment; and a roadside assist device configured to process the perception information to determine an environment perception result related to the environment.
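  • As a concrete illustration of this division of labor, the minimal Python sketch below traces the claimed flow for one control cycle: the roadside result covers all objects including the ego vehicle, the ego entry is excluded, and the remainder drives a toy planning rule. All names and thresholds are hypothetical; the disclosure does not prescribe an implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PerceivedObject:
    object_id: str    # identifier assigned during roadside fusion
    position: tuple   # (x, y) in a shared global frame, meters
    speed: float      # m/s

def control_step(ego_id: str,
                 env_perception: List[PerceivedObject],
                 ego_position: tuple) -> str:
    """One vehicle-side control cycle.

    The environment perception result arrives from the roadside device and
    covers all objects, including the ego vehicle itself, so the ego entry
    is excluded first (yielding the outside-vehicle perception result).
    """
    outside = [o for o in env_perception if o.object_id != ego_id]
    # Toy planning rule: brake if any remaining object (obstacle) is
    # within 10 m of the ego vehicle, otherwise keep cruising.
    for obstacle in outside:
        dx = obstacle.position[0] - ego_position[0]
        dy = obstacle.position[1] - ego_position[1]
        if (dx * dx + dy * dy) ** 0.5 < 10.0:
            return "brake"
    return "cruise"

# Example: the roadside result contains the ego vehicle plus one pedestrian.
result = [PerceivedObject("veh-1", (0.0, 0.0), 8.0),
          PerceivedObject("ped-9", (6.0, 2.0), 1.2)]
print(control_step("veh-1", result, (0.0, 0.0)))  # -> "brake"
```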
  • FIG. 1 illustrates a schematic diagram of an example environment in which various embodiments of the present disclosure can be implemented
  • FIG. 2 illustrates a block diagram of a vehicle-road collaboration system according to some embodiments of the present disclosure
  • FIG. 3 illustrates a schematic diagram of an example static map according to some embodiments of the present disclosure
  • FIG. 4 is a flowchart of a process for controlling automatic driving of a vehicle according to some embodiments of the present disclosure
  • FIG. 5 is a flowchart of a process for assisting in controlling autonomous driving of a vehicle according to some embodiments of the present disclosure.
  • FIG. 6 illustrates a block diagram of a computing device capable of implementing various embodiments of the present disclosure.
  • The accuracy and cost of a sensor are usually directly proportional. If sensor cost is cut to save money, sensing performance inevitably drops, or more low-performance sensors must cooperate with each other to minimize sensing blind zones. Moreover, once an on-board sensor is damaged in use, repairing the individual vehicle or device brings additional cost.
  • the sensors installed on each vehicle are usually adapted to the design and manufacture of the vehicle itself, and therefore may not be reused as the vehicle is scrapped.
  • the high requirements for the vehicle's own perception ability make it impossible to easily and inexpensively upgrade non-autonomous vehicles or vehicles with weaker autonomous driving capabilities to vehicles with stronger autonomous driving capabilities. Upgrades to vehicle self-driving capabilities can usually only be achieved by replacing the vehicle.
  • A scheme of automatic driving control assisted by perception outside the vehicle is proposed.
  • the environment-related sensing information is collected by sensors arranged in the environment around the vehicle and independent of the vehicle.
  • An environment perception result is determined based on such perception information.
  • the self-vehicle perception result corresponding to the vehicle is excluded from such environment perception results, and the outside perception result of the vehicle is obtained for controlling the driving behavior of the vehicle.
  • Performing the perception of the environment by sensors outside the vehicle can reduce the requirements for the vehicle's own perception capabilities, enabling non-autonomous vehicles or vehicles with weaker self-driving capabilities to simply and cost-effectively improve self-driving capabilities.
  • the outside sensor can also be used to assist the automatic driving control of multiple vehicles in the environment, which improves the utilization rate of the sensor.
  • FIG. 1 illustrates a schematic diagram of an example traffic environment 100 in which various embodiments of the present disclosure can be implemented.
  • Some typical objects are shown schematically in this example environment 100, including road 102, traffic indication facility 103, plants 107 on both sides of the road, and pedestrians 109 that may appear.
  • these illustrated facilities and objects are merely examples, and according to actual conditions, the presence of objects that may appear in different traffic environments will vary. The scope of the present disclosure is not limited in this regard.
  • one or more vehicles 110-1, 110-2 are driving on road 102.
  • vehicle 110 may be any type of vehicle that can carry people and / or objects and is moved by a power system such as an engine, including but not limited to cars, trucks, buses, electric vehicles, motorcycles, motor homes, trains, and the like.
  • One or more vehicles 110 in the environment 100 may be vehicles with a certain degree of self-driving capability; such vehicles are also referred to as driverless vehicles.
  • One or more other vehicles 110 in the environment 100 may be vehicles that do not have automatic driving capability.
  • the environment 100 also has one or more sensors 105-1 to 105-6 (collectively referred to as sensors 105).
  • the sensor 105 is independent of the vehicle 110 and is used to monitor the condition of the environment 100 to obtain the perceptual information related to the environment 100.
  • the sensors 105 may be arranged near the road 102 and may include one or more types of sensors.
  • the sensors 105 may be arranged on both sides of the road 102 at a certain interval for monitoring a specific area of the environment 100.
  • Various types of sensors can be arranged in each area.
  • a movable sensor 105 may be provided, such as a movable sensing site or the like.
  • the sensing information collected by the sensors 105 arranged corresponding to the road 102 may also be referred to as road-side sensing information.
  • Roadside perception information may be used to facilitate driving control of the vehicle 110.
  • the roadside and the vehicle side can perform the control of the vehicle in cooperation.
  • FIG. 2 shows a schematic diagram of such a vehicle-road cooperation system 200.
  • the vehicle-road coordination system 200 will be discussed below with reference to FIG. 1.
  • the vehicle-road cooperation system 200 includes a sensor 105, a roadside assist device 210 for assisting the automatic driving of the vehicle 110, and a vehicle-side control device 220 for controlling the automatic driving of the vehicle 110.
  • the roadside assist device 210 is also sometimes referred to herein as a device for assisting autonomous driving of a vehicle.
  • the roadside assistance device 210 is used in combination with the environment 100 to assist in controlling the automatic driving of a vehicle appearing in the environment 100.
  • The roadside assist device 210 can be installed at any position, as long as it can communicate with the sensors 105 and the vehicle-side control device 220. Since both are deployed on the roadside, the sensors 105 and the roadside assist device 210 can also constitute a roadside assist subsystem.
  • The vehicle-side control device 220 is sometimes also referred to herein as a device that controls the automatic driving of the vehicle 110.
  • the vehicle-side control device 220 is used in association with a corresponding vehicle 110, for example, is integrated into the vehicle 110 to control the automatic driving of the vehicle 110.
  • One or more vehicles 110 in the environment 100 may be respectively equipped with a vehicle-side control device 220.
  • a vehicle-side control device 220 may be integrated on the vehicle 110-1, and a vehicle-side control device 220 may be similarly integrated on the vehicle 110-2.
  • corresponding functions of the vehicle-side control device 220 are described for one vehicle 110.
  • the roadside assistance device 210 includes a communication module 212 and an information processing module 214.
  • the communication module 212 may support wired / wireless communication with the sensor 105, and is configured to acquire the collected perceptual information related to the environment 100 from the sensor 105.
  • the communication module 212 may also support communication with the vehicle-side control device 220, and such communication is usually wireless communication.
  • the communication of the communication module 212 with the sensor 105 and the vehicle-side control device 220 may be based on any communication protocol, and the implementation of the present disclosure is not limited in this regard.
  • the sensors 105 arranged in the environment 100 may include various types of sensors.
  • The sensors 105 may include, but are not limited to: image sensors (e.g., cameras), lidars, millimeter-wave radars, infrared sensors, positioning sensors, light sensors, pressure sensors, temperature sensors, humidity sensors, wind speed sensors, wind direction sensors, air quality sensors, and so on.
  • Image sensors can collect image information related to the environment 100; lidars and millimeter-wave radars can collect laser point cloud data related to the environment 100; infrared sensors can use infrared light to detect environmental conditions in the environment 100; positioning sensors can collect position information of objects related to the environment 100; light sensors can collect measurements indicating light intensity in the environment 100; pressure, temperature, and humidity sensors can collect measurements indicating pressure, temperature, and humidity in the environment 100; wind speed and wind direction sensors can collect metrics indicating wind speed and wind direction in the environment 100; and air quality sensors can collect air-quality-related indicators in the environment 100, such as oxygen concentration, carbon dioxide concentration, dust concentration, and pollutant concentration. It should be understood that only a few examples of sensors 105 are listed above; according to actual needs, other different types of sensors may be present. In some embodiments, different sensors may be integrated at a certain location or distributed over an area of the environment 100, each monitoring a specific type of roadside perception information.
  • The perception information collected by the sensors 105 is collectively processed by the roadside assist device 210 (specifically, by the information processing module 214 in the roadside assist device 210).
  • the information processing module 214 of the roadside assistance device 210 processes the perception information obtained from the sensor 105 to determine an environmental perception result related to the environment 100.
  • The environment perception result may indicate an overall understanding of the condition of the environment 100, and may specifically indicate related information of a plurality of objects in the environment, including the vehicle 110. Such related information includes each object's size, position (for example, its precise position in the Earth coordinate system), speed, direction of movement, distance from a specific viewpoint, and the like.
  • the information processing module 214 may fuse different types of sensing information from different sensors 105 to determine an environmental sensing result.
  • the information processing module 214 may use various different information fusion technologies to determine the environment perception result.
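  • The disclosure leaves the choice of fusion technique open. As one illustration in that spirit, the sketch below greedily clusters detections from different sensors by spatial proximity and averages attributes within each cluster; the Detection structure, the 2 m radius, and the averaging rule are assumptions for the example, not the patent's method.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    sensor_id: str
    position: tuple   # (x, y) meters, already transformed to a common frame
    speed: float      # m/s

def fuse(detections: List[Detection], radius: float = 2.0) -> List[dict]:
    """Greedy proximity clustering: detections that fall within `radius`
    of an existing fused object are treated as the same physical object
    seen by different sensors, and their attributes are averaged."""
    fused: List[dict] = []
    for det in detections:
        for obj in fused:
            ox, oy = obj["position"]
            dist = ((det.position[0] - ox) ** 2 + (det.position[1] - oy) ** 2) ** 0.5
            if dist <= radius:
                n = obj["hits"]
                obj["position"] = ((ox * n + det.position[0]) / (n + 1),
                                   (oy * n + det.position[1]) / (n + 1))
                obj["speed"] = (obj["speed"] * n + det.speed) / (n + 1)
                obj["hits"] = n + 1
                break
        else:  # no existing fused object was close enough: start a new one
            fused.append({"position": det.position, "speed": det.speed, "hits": 1})
    return fused
```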
  • the communication module 212 in the roadside assistance device 210 is configured to transmit the environment perception result obtained after processing by the information processing module 214 to the vehicle-side control device 220.
  • The vehicle-side control device 220 controls the driving behavior of the corresponding vehicle 110 (for example, the vehicle in which the vehicle-side control device 220 is installed) based on the environment perception result obtained from the roadside assist device 210.
  • the vehicle-side control device 220 includes a communication module 222, an information processing module 224, and a driving control module 226.
  • The communication module 222 is configured to be communicatively coupled with the communication module 212 in the roadside assist device 210, in particular to receive the environment perception result from the communication module 212.
  • the information processing module 224 is configured to perform processing on the environmental perception result to make the environmental perception result suitable for the automatic driving control of the vehicle 110.
  • the driving control module 226 is configured to control the driving behavior of the vehicle 110 based on a processing result of the information processing module 224.
  • vehicle-side control device 220 performs automatic driving control of the vehicle 110.
  • the communication module 222 of the vehicle-side control device 220 may obtain an environment perception result related to the environment 100 around the vehicle 110 from the roadside assistance device 210.
  • Such an environment perception result is based on the perception information collected by one or more sensors 105 arranged in the environment 100 and independent of the vehicle 110, and indicates related information of a plurality of objects in the environment 100, such as each object's size, position (e.g., precise location in the Earth coordinate system), speed, direction of movement, distance from a particular viewpoint, and more.
  • The vehicle-side control device 220 may also obtain environment perception results, as a supplement, from sensors integrated on other vehicles in the environment 100. Some vehicles in the environment 100 may carry sensors with strong sensing capabilities (such as lidar) or with general sensing capabilities (such as cameras), and the perception information collected by these sensors can also assist the automatic driving control of other vehicles. For a certain vehicle (for example, vehicle 110-1), the associated vehicle-side control device 220 may obtain, from sensors on other vehicles (for example, vehicle 110-2), either raw perception information or perception results obtained after processing that information.
  • a sensor equipped on a vehicle detects the surrounding environment from the perspective of the vehicle itself, so the obtained perceptual information does not include information related to the vehicle itself.
  • In contrast, sensors outside the vehicle (such as roadside sensors or sensors on other vehicles) indiscriminately monitor the vehicle and other objects, so the collected perception information contains information about all objects in the entire environment.
  • The information processing module 224 excludes the self-vehicle perception result corresponding to the vehicle 110 from the environment perception result to determine the outside-vehicle perception result.
  • the self-vehicle perception result may refer to information related to the vehicle 110 itself in the environment perception result, such as the size, position, speed, direction, distance from a specific viewpoint, and the like of the vehicle 110.
  • Out-of-vehicle perception results include related information of objects other than the vehicle 110.
  • the vehicle 110 needs to treat all objects other than the own vehicle as obstacles, so as to reasonably plan the driving path and avoid collision with the obstacles.
  • the outside-vehicle perception result is more suitable for the automatic driving control of the vehicle 110.
  • the vehicle 110 may be equipped with a tag portion for identifying the vehicle 110.
  • a tag portion may be one or more of the following: a license plate of the vehicle 110, a two-dimensional code affixed to the outside of the vehicle 110, a non-visible light label affixed to the outside of the vehicle 110, and a radio frequency tag mounted on the vehicle 110.
  • a two-dimensional code specific to the vehicle 110 may be pasted outside the vehicle 110 as a label portion of the vehicle.
  • the license plate and / or two-dimensional code of the vehicle 110 can be identified from image information collected by the image sensor.
  • a non-visible light tag such as an infrared or ultraviolet reflective tag, may be attached to the vehicle 110 to identify the vehicle 110.
  • A radio frequency tag installed on the vehicle 110 may also be used to identify the vehicle 110. The radio frequency tag may transmit a signal, and a radio frequency reader may read the transmitted signal to identify the vehicle 110.
  • the information processing module 224 can identify identification information related to the label portion of the vehicle 110 from the environment perception result.
  • The identification information may be, for example, license plate or two-dimensional code image information of the vehicle 110, or indication information of the specific signals of the non-visible light tag or the radio frequency tag, and the like.
  • The information processing module 224 identifies the corresponding identification information by matching the identifier indicated by the self-vehicle's tag portion against the environment perception result. Then, based on the identification information, the information processing module 224 determines the self-vehicle perception result corresponding to the vehicle 110 within the environment perception result.
  • the roadside assistance device 210 combines related information of each object. Therefore, through the identification information of the vehicle 110, other information related to the vehicle 110 in the environment perception result can be determined, such as the position, size, and the like of the vehicle 110.
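  • A minimal sketch of this tag-matching step, assuming (purely for illustration) that the fusion stage attaches whatever tag identifiers it decoded (license plate string, two-dimensional code payload, radio frequency tag ID) to each perceived object under a hypothetical tag_ids field:

```python
from typing import List

def exclude_self_by_tag(env_perception: List[dict], own_tag_ids: set) -> List[dict]:
    """Drop the perceived object whose decoded tag matches the ego vehicle's
    own tag portion (license plate / QR code / RF tag); everything else forms
    the outside-vehicle perception result."""
    outside = []
    for obj in env_perception:
        # 'tag_ids' is the hypothetical field holding whatever identifiers
        # the roadside sensors decoded for this object.
        if own_tag_ids & set(obj.get("tag_ids", [])):
            continue  # this entry is the self-vehicle perception result
        outside.append(obj)
    return outside

# e.g. the vehicle knows its own plate and RF tag identifier:
ego_tags = {"plate:A12345", "rfid:0x5ACE"}
```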
  • the self-vehicle perception result in the environment perception result may be identified based on the location of the vehicle 110.
  • the environmental perception results may include the positions of multiple objects.
  • The information processing module 224 may use various positioning technologies to determine the position of the vehicle 110, then match that position against the positions of the multiple objects in the environment perception result, and identify, from the multiple objects, the object that matches the vehicle 110. In this way, the information processing module 224 can determine which object in the environment perception result is the vehicle 110 itself, exclude the perception result corresponding to that object from the environment perception result, and obtain the outside-vehicle perception result.
  • The position of the vehicle 110 may be the precise position of the vehicle 110 (for example, with accuracy similar to that of the object positions included in the environment perception result) or a rough position of the vehicle 110 (for example, sub-meter positioning).
  • the location of the vehicle 110 may be determined by a positioning device, such as a Global Positioning System (GPS) antenna, a position sensor, and the like, that the vehicle 110 itself has.
  • the vehicle 110 may also perform positioning based on other positioning technologies, such as a base station in communication with the communication module 222 and / or a roadside assist device 210 arranged in the environment 100, or any other technology.
  • the information processing module 224 may delete or ignore the self-vehicle sensing result corresponding to the vehicle 110 among the environment sensing results, and only consider other environmental sensing results (that is, the outside sensing result).
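  • The position-matching variant can be sketched as a gated nearest-neighbour test against the ego position; the 3 m gate is an illustrative assumption, since the disclosure only requires that the positions match:

```python
import math
from typing import List

def exclude_self_by_position(env_perception: List[dict],
                             ego_position: tuple,
                             gate: float = 3.0) -> List[dict]:
    """Treat the perceived object nearest the ego position (within a gating
    distance) as the self-vehicle and return all other objects as the
    outside-vehicle perception result."""
    best_i, best_d = None, gate
    for i, obj in enumerate(env_perception):
        d = math.dist(obj["position"], ego_position)
        if d < best_d:
            best_i, best_d = i, d
    return [o for i, o in enumerate(env_perception) if i != best_i]
```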
  • the result of sensing outside the vehicle is used by the driving control module 226 in the vehicle-side control device 220 to control the driving behavior of the vehicle 110.
  • The driving control module 226 may use various automatic driving strategies to control the driving behavior of the vehicle 110 on the basis of the known outside-vehicle perception result.
  • the driving behavior of the vehicle 110 may include a driving path, a driving direction, a driving speed, and the like of the vehicle 110.
  • the driving control module 226 may generate specific operation commands for the driving behavior of the vehicle 110, such as operation commands for the driving system and steering system of the car, so that the vehicle 110 drives according to such operation commands.
  • the operation command may be, for example, any command related to the driving of the vehicle 110 such as acceleration, deceleration, left steering, right steering, parking, whistle, turning on or off the lights, and the like.
  • the driving control module 226 may determine the behavior prediction of one or more objects (ie, obstacles) in the environment 100 based on the perception results outside the vehicle. Behavior prediction includes one or more aspects of an object's expected motion trajectory, expected motion speed, and expected motion direction. Object behavior prediction is also useful for the vehicle's automatic driving control, because the vehicle's automatic driving control often needs to judge how the objects around the vehicle will move in order to take corresponding driving behaviors to respond.
  • The driving control module 226 may perform behavior prediction based on a pre-trained prediction model. Such a prediction model may be, for example, a general behavior prediction model, or may include different prediction models for different types of objects.
  • the driving control module 226 may determine the driving behavior of the vehicle 110 based on the behavior prediction of the object.
  • In some embodiments, the driving of the vehicle 110 is controlled based on the position of the vehicle 110 in addition to the outside-vehicle perception result.
  • the vehicle 110 may be equipped with sensors capable of performing precise positioning.
  • the precise position of the vehicle 110 can also be determined from the environmental perception results, which can also reduce the requirements for the precise positioning hardware of the vehicle 110 and improve the positioning accuracy and stability.
  • the environmental perception results include a higher accuracy position of the vehicle 110.
  • the precise position used in the automatic driving control of the vehicle 110 can be determined from the environmental perception results.
  • the vehicle-side control device 220 may include a vehicle positioning module (not shown).
  • the vehicle positioning module may be configured to identify the vehicle 110 from the environment perception result by means of position matching.
  • the vehicle positioning module may first determine a rough position of the vehicle 110, for example, by using a GPS antenna of the vehicle 110 or by using an auxiliary device such as a base station.
  • The vehicle positioning module determines an object matching the vehicle 110 from the environment perception result based on the rough position of the vehicle 110, and determines the position of that object in the environment perception result as the fine position of the vehicle 110 (that is, a position with higher accuracy).
  • the precise position of the vehicle 110 can be obtained for controlling the driving behavior of the vehicle 110 without requiring the vehicle 110 or the vehicle-side control device 220 to have an accurate on-board positioning device.
  • the self-vehicle sensing result corresponding to the vehicle 110 may also be identified by a tag portion provided with the vehicle 110. Therefore, the precise positioning of the vehicle 110 can be obtained from the identified self-vehicle sensing result. This can enable the vehicle 110 to achieve accurate positioning even without an on-board positioning device.
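  • The same matching step can double as a positioning aid: once the ego vehicle's entry in the environment perception result is identified from a rough (e.g., GPS) fix, its high-accuracy fused position is adopted as the ego position. A sketch under the same assumed object structure:

```python
import math
from typing import List, Optional

def refine_position(env_perception: List[dict],
                    rough_position: tuple,
                    gate: float = 5.0) -> Optional[tuple]:
    """Return the high-accuracy position of the perceived object that best
    matches a rough (e.g., GPS) ego fix, or None if nothing is close enough."""
    candidates = [o for o in env_perception
                  if math.dist(o["position"], rough_position) < gate]
    if not candidates:
        return None
    best = min(candidates, key=lambda o: math.dist(o["position"], rough_position))
    return best["position"]  # fine position taken from the roadside perception
```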
  • the vehicle-side control device 220 may obtain other assisted driving information for assisting the automatic driving of the vehicle 110 in addition to obtaining the environment perception result from the roadside assist device 210.
  • The communication module 222 of the vehicle-side control device 220 may obtain behavior predictions of one or more objects in the environment 100 from the roadside assist device 210 (e.g., from its communication module 212). Behavior prediction includes one or more aspects of an object's expected motion trajectory, expected motion speed, and expected motion direction.
  • The communication module 222 of the vehicle-side control device 220 may obtain an automatic driving recommendation for the vehicle 110 from the roadside assist device 210 (e.g., from its communication module 212), including one or more of a driving route recommendation of the vehicle 110, a driving direction recommendation, and specific operation instructions recommended to control the driving behavior of the vehicle.
  • the driving control module 226 of the vehicle-side control device 220 may control the driving behavior of the vehicle 110 based on behavior predictions about objects and / or automatic driving recommendations obtained from the roadside assist device 210.
  • The driving control module 226 may refer to or adjust the behavior prediction and/or automatic driving recommendation obtained from the roadside assist device 210 in order to determine the actual driving behavior of the vehicle 110.
  • the roadside assistance device 210 performs behavior prediction and automatic driving recommendation, which can further reduce the requirements for the automatic driving capability of the vehicle 110 itself or the vehicle side control device 220, and reduce the processing and control complexity of the vehicle side.
  • The vehicle-side control device 220 may determine the driving behavior of the vehicle 110 based on a simple automatic driving control strategy, using the behavior predictions and/or automatic driving recommendations obtained from the roadside assist device 210 in combination with the actual outside-vehicle perception results.
  • the vehicle-side control device 220 obtains an environment perception result from the road-side assistance device 210 and may also obtain behavior predictions and / or automatic driving recommendations of objects to control the driving behavior of the vehicle 110.
  • The sensors 105 and the roadside assist device 210 assume the function of sensing the surrounding environment of the vehicle 110, and in addition can provide driving assistance information such as behavior predictions and/or automatic driving recommendations.
  • the environment perception results obtained by the roadside assistance device 210 and the sensor 105 and other assisted driving information can be provided to a plurality of vehicles 110 in the environment 100, thereby achieving centralized environment perception and information processing.
  • the vehicle 110 itself may not be required to have strong environmental awareness capabilities, self-localization capabilities, behavior prediction capabilities, and / or autonomous driving planning capabilities to enable autonomous driving.
  • the improvement of the automatic driving capability of the vehicle 110 can be achieved by integrating the vehicle-side control device 220.
  • the function of the vehicle-side control device 220 may be integrated into the vehicle 110 by upgrading a software system of the vehicle 110 and by adding a communication function or by using a communication function that the vehicle 110 itself has.
  • The roadside assistance device 210 provides behavior prediction capabilities and/or automatic driving recommendations, and can also ensure that the automatic driving process of the vehicle 110 continues in the event that the hardware and/or software of the vehicle 110 fails to perform behavior prediction and driving planning.
  • When a roadside assist device 210 and sensors 105 are deployed on a certain road section of the road system, a vehicle 110 traveling on that road section only needs to integrate the vehicle-side control device 220 to obtain a more powerful automatic driving capability.
  • Both vehicles 110 that do not have self-driving capabilities (for example, vehicles classified as L0 or L1 in the autonomous driving classification) and vehicles 110 that have weaker self-driving capabilities (for example, L2 vehicles) can benefit in this way.
  • the above embodiment mainly describes the specific implementation of the vehicle-side control device 220 in the cooperative control system 200 of FIG. 2.
  • some embodiments of the roadside assistance device 210 in the cooperative control system 200 will continue to be described.
  • the roadside assisting device 210 acquires the sensing information of the sensor 105 and determines an environmental sensing result by processing the sensing information. The roadside assistance device 210 then provides the environment perception results to the vehicle-side control device 220 for assisting in controlling the driving behavior of the vehicle 110.
  • The roadside assistance device 210 may determine, from the environment perception results, an outside-vehicle perception result corresponding to one or more vehicles 110, and provide that perception result to the vehicle-side control device 220. That is, the perception results that the roadside assistance device 210 provides to each vehicle 110 are per-vehicle outside-vehicle perception results that can be used directly for driving control of those vehicles. Specifically, the information processing module 214 of the roadside assistance device 210 excludes the self-vehicle perception result corresponding to a certain vehicle 110 from the environment perception results, thereby determining the outside-vehicle perception result of that vehicle 110. The roadside assistance device 210 then provides the determined outside-vehicle perception result to the vehicle-side control device associated with the vehicle for assisting in controlling the driving behavior of the vehicle.
  • the manner in which the information processing module 214 identifies the outside sensing result of a certain vehicle 110 is similar to the manner adopted by the vehicle-side control device 220.
  • the information processing module 214 may also identify the vehicle 110 based on a tag portion equipped with the vehicle 110, such as one or more of a license plate, a two-dimensional code, a non-visible light tag, and a radio frequency tag of the vehicle 110.
  • the information processing module 214 identifies the identification information related to the tag portion equipped with the vehicle 110 from the environment perception result, and then determines the vehicle perception result corresponding to the vehicle 110 in the environment perception result based on the identification information.
  • the information processing module 214 may exclude the self-vehicle perception result from the environment perception result to obtain the outside perception result for providing to the vehicle-side control device 220.
  • the information processing module 214 may also determine the environment perception result by means of a static high-precision map associated with the environment 100.
  • the static high-precision map includes information about static objects of the environment 100.
  • the static high-precision map may be generated from information related to the environment 100 collected in advance by the sensors 105 arranged in the environment 100.
  • the static high-precision map includes only relevant information of objects in the environment 100 that protrude above the ground and remain stationary for a relatively long time.
  • FIG. 3 illustrates an example of a static high-precision map 300 associated with the environment 100 of FIG. 1.
  • the static high-precision map 300 includes only stationary objects, such as poles with sensors 105, traffic indicating facilities 103, and plants 107 on both sides of the road. These objects remain stationary for a period of time. Objects such as the vehicle 110 and the pedestrian 109 sometimes appear in the environment 100, sometimes disappear from the environment 100, or move in the environment 100. Therefore, such objects are called dynamic objects.
  • the static high-precision map 300 shown in FIG. 3 is only given for the purpose of illustration. Generally, in addition to schematically showing an object or giving an image of an object, a high-precision map is also labeled with other information about the object, such as precise position, speed, direction, and so on.
  • The static high-precision map may be a three-dimensional static high-precision map, which includes related information of objects in three-dimensional space.
  • A static high-precision map, such as the static high-precision map 300, may be generated based on information related to the environment 100 collected in advance by a high-precision map collection vehicle.
  • the static high-precision map associated with the environment 100 may also be updated periodically or triggered by a corresponding event.
  • the update period of the static high-precision map can be set to a relatively long period of time.
  • the update of the static high-precision map may be based on the perception information collected by the sensors 105 arranged in the environment 100 and monitoring the environment 100 in real time.
  • The information processing module 214 may use the real-time perception information provided by the sensors 105 to update the static high-precision map, obtaining a real-time high-precision map associated with the environment 100 as the environment perception result.
  • The perception information from the sensors 105 can be fused with the static high-precision map, with the dynamic objects in the perception information, and their related information, merged into the static high-precision map.
  • Static high-precision maps can correct or delete objects that may be incorrectly detected in real-time perception information, improving the accuracy of environment perception results. For example, due to errors in the real-time perception information, an object in the environment 100 may be detected as having a certain speed; by consulting the static high-precision map, it can be determined that the object is actually a static object, thereby avoiding incorrectly labeling the object's speed and affecting the automatic driving control of the vehicle 110.
  • a static high-precision map helps to label the precise locations of objects in the environment 100, and such precise locations can form part of the environment perception results.
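  • One reading of this correction step: a real-time detection that coincides with a mapped static object should not carry a velocity. A sketch, assuming the static map is available as a list of static-object positions (an illustrative simplification):

```python
import math
from typing import List

def correct_with_static_map(realtime_objects: List[dict],
                            static_positions: List[tuple],
                            gate: float = 1.0) -> List[dict]:
    """Suppress spurious motion: any detection that coincides with a known
    static object gets its speed zeroed and is flagged as static."""
    for obj in realtime_objects:
        if any(math.dist(obj["position"], p) < gate for p in static_positions):
            obj["speed"] = 0.0       # a mapped static object cannot be moving
            obj["is_static"] = True
    return realtime_objects
```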
  • the information processing module 214 may use image sensing information in the sensing result collected by the sensor 105.
  • the information processing module 214 identifies objects in the environment from the image sensing information, and the identified objects include static objects in the environment and other objects (such as dynamic objects newly entered into the environment 100). This can be achieved through image processing techniques for object recognition.
  • the information processing module 214 determines the positions of other objects from the positions of the static objects indicated by the static high-precision map based on the identified relative position relationship between the static objects and other objects.
  • the image sensing information collected by the image sensor usually cannot indicate the geographic location of the object, such as the specific position in the earth coordinate system, but the image sensing information can reflect the relative position relationship between different objects. Based on such a relative position relationship, the precise positions of other objects can be determined from the positions of the static objects indicated by the known static high-precision map.
  • the absolute geographical position of other objects in the environment 100 may also be determined by referring to the conversion relationship of the static objects from the image perception information to the static high-precision map. Object positioning using static high-precision maps can quickly and accurately obtain high-precision positions, reducing the computational overhead required for accurate positioning.
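  • A minimal two-dimensional version of this idea: a static object visible in the image serves as an anchor whose precise map position is known, and another object is placed by its pixel offset from that anchor, scaled to meters. The flat-ground, uniform-scale assumption is purely for illustration; a real system would use a calibrated camera model:

```python
def absolute_position(anchor_map_pos: tuple,   # static object's map position (m)
                      anchor_pixel: tuple,     # its pixel location in the image
                      target_pixel: tuple,     # pixel location of the object to place
                      meters_per_pixel: float) -> tuple:
    """Place an object on the map from its image offset to a known static
    anchor, assuming an approximately top-down view with uniform scale."""
    dx = (target_pixel[0] - anchor_pixel[0]) * meters_per_pixel
    dy = (target_pixel[1] - anchor_pixel[1]) * meters_per_pixel
    return (anchor_map_pos[0] + dx, anchor_map_pos[1] + dy)

# e.g. a traffic sign mapped at (312.4, 88.9) and seen at pixel (640, 360):
print(absolute_position((312.4, 88.9), (640, 360), (700, 300), 0.05))
```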
  • The roadside assistance device 210 may also process the environment perception result to obtain other assisted driving information for one or more vehicles in the environment 100, such as behavior predictions of objects in the environment 100 and/or automatic driving recommendations for a particular vehicle 110. How the roadside assist device 210 determines the behavior prediction of an object and the automatic driving recommendation for a vehicle is discussed in detail below.
  • the roadside assistance device 210 further includes a behavior prediction module (not shown) configured to determine behavior prediction of one or more objects in the environment 100 based on the environment perception results. The determined predicted behavior is provided to the vehicle-side control device 220 via the communication module 212 for further assisting in controlling the driving behavior of the corresponding vehicle 110.
  • the behavior prediction of an object includes one or more aspects of an object's expected motion trajectory, expected motion speed, and expected motion direction. Object behavior prediction is also useful for the vehicle's automatic driving control, because the vehicle's automatic driving control often needs to judge how the objects around the vehicle will move in order to take corresponding driving behaviors to respond.
  • the behavior prediction module of the roadside assistance device 210 may utilize a prediction model specific to the location or area where the sensor 105 is located to determine the behavior prediction of the object. Unlike the universal prediction model for all objects or different types of objects used on the vehicle side, the prediction model local to the sensor 105 can be trained based on the behavior of the object appearing in the area where the sensor 105 is located. The training data used to train the prediction model may be previously recorded behavior of one or more objects at the area where the sensor 105 is located.
  • Objects appearing in different geographic areas may exhibit behavioral patterns specific to those areas. For example, if the sensor 105 is arranged near a tourist attraction, pedestrians walking in this area may be less directional, in a wandering pattern; if the sensor 105 is arranged near office space such as an office building, pedestrians walking in this area may be more purposeful, for example heading toward one or more specific buildings. Therefore, by training a region-specific prediction model, the behavior of an object in a specific region can be predicted more accurately.
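  • As a toy stand-in for such a region-specific model, the sketch below fits a per-region prior from displacement vectors recorded at one sensor site and predicts the mean local motion, so a wandering tourist area and a purposeful office area yield different predictions; a deployed system would use a learned trajectory model instead:

```python
from typing import Tuple

class RegionMotionPrior:
    """Per-region prior over pedestrian displacement, fitted from motion
    recorded at one roadside sensor site."""
    def __init__(self):
        self.sum_dx = self.sum_dy = 0.0
        self.n = 0

    def observe(self, displacement: Tuple[float, float]) -> None:
        self.sum_dx += displacement[0]
        self.sum_dy += displacement[1]
        self.n += 1

    def predict(self, position: Tuple[float, float]) -> Tuple[float, float]:
        if self.n == 0:
            return position  # no recorded behavior: expect no motion
        return (position[0] + self.sum_dx / self.n,
                position[1] + self.sum_dy / self.n)

office_area = RegionMotionPrior()
for d in [(1.2, 0.1), (1.1, -0.1), (1.3, 0.0)]:  # purposeful, one direction
    office_area.observe(d)
print(office_area.predict((0.0, 0.0)))  # ~ (1.2, 0.0): heads toward the buildings
```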
  • the roadside assistance device 210 further includes a driving recommendation module (not shown) configured to determine an automatic driving recommendation for one or more vehicles 110 based on an environmental perception result.
  • the automatic driving recommendation may include a driving route recommendation of the vehicle 110, a driving direction recommendation of the vehicle 110, or may even include a specific operation instruction recommendation for controlling the driving behavior of the vehicle 110.
  • the automatic driving recommendation determined by the driving recommendation module is provided to the vehicle-side control device 220 via the communication module 212 for further assisting in controlling the driving behavior of the corresponding vehicle 110.
  • the driving recommendation module of the roadside assistance device 210 utilizes a recommendation model specific to the area in which the sensor 105 is located to determine an automatic driving recommendation.
  • The recommendation model is trained based on the driving behaviors performed by vehicles in the area where the sensor 105 is located.
  • the data used to train the recommendation model may be previously recorded driving behaviors taken by one or more vehicles in the area where the sensor 105 is located.
  • vehicles may exhibit specific driving behavior patterns related to that area. For example, vehicles may perform deceleration operations in advance at intersections with heavy traffic. At some intersections, more vehicles may turn left.
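  • In the same spirit, a region-specific recommendation can be as simple as suggesting the manoeuvre most frequently taken by past vehicles at that site. The sketch below is illustrative only; the disclosure does not fix a model form:

```python
from collections import Counter

class RegionDrivingPrior:
    """Recommends the driving behavior most commonly observed at one
    roadside sensor site, e.g. early deceleration before a busy junction."""
    def __init__(self):
        self.counts = Counter()

    def observe(self, behavior: str) -> None:
        self.counts[behavior] += 1

    def recommend(self) -> str:
        return self.counts.most_common(1)[0][0] if self.counts else "cruise"

site = RegionDrivingPrior()
for b in ["decelerate", "decelerate", "turn_left", "decelerate"]:
    site.observe(b)
print(site.recommend())  # -> "decelerate"
```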
  • The roadside assistance device 210 may also provide other driving assistance information to the vehicle-side control device 220, such as traffic conditions and accident conditions in the environment 100 monitored by the sensors 105, which help the vehicle-side control device 220 control the driving behavior of the vehicle 110 more accurately and reasonably.
  • The roadside assistance device 210 and the sensors 105 jointly provide the vehicle-side control device 220 with an environment perception result, and may also provide behavior predictions of objects and/or automatic driving recommendations, to assist in controlling the driving behavior of the vehicle 110.
  • The roadside assistance device 210 realizes functions such as determining environment perception results, predicting object behavior, and/or generating automatic driving recommendations for vehicles. In some embodiments, one, some, or all of these functions may be performed by other devices with stronger computing capabilities, such as the cloud, edge computing sites, roadside base stations, or servers.
  • the roadside assistance device 210 may provide the sensing information of the sensor 105 to a corresponding processing device, obtain a processing result, and provide the corresponding processing result to the vehicle-side control device 220.
  • FIG. 4 shows a flowchart of a method 400 for controlling automatic driving of a vehicle according to an embodiment of the present disclosure.
  • the method 400 may be implemented by the vehicle-side control device 220 of FIG. 2.
  • the vehicle-side control device 220 obtains an environmental perception result related to the environment around the vehicle.
  • the environment perception result is based on the perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and the environment perception result indicates related information of a plurality of objects in the environment.
  • the vehicle-side control device 220 determines the vehicle's out-of-vehicle perception result by excluding the own-vehicle perception result corresponding to the vehicle from the environmental perception results.
  • the vehicle-side control device 220 controls the driving behavior of the vehicle based at least on the outside perception results.
  • In some embodiments, controlling the driving behavior of the vehicle further includes: obtaining a behavior prediction of at least one of the plurality of objects, the behavior prediction including at least one of the following: an expected motion trajectory of the at least one object, an expected motion speed of the at least one object, and an expected movement direction of the at least one object; and controlling the driving behavior of the vehicle also based on the behavior prediction of the at least one object.
  • In some embodiments, controlling the driving behavior of the vehicle further includes: obtaining an automatic driving recommendation for the vehicle, the automatic driving recommendation including at least one of the following: a driving path recommendation of the vehicle, a driving direction recommendation of the vehicle, and an operation instruction recommendation for controlling the driving behavior of the vehicle; and controlling the driving behavior of the vehicle also based on the automatic driving recommendation for the vehicle.
  • In some embodiments, determining the outside-vehicle perception result of the vehicle includes: identifying, from the environment perception result, identification information related to a tag portion equipped on the vehicle; determining, based on the identification information, the self-vehicle perception result corresponding to the vehicle in the environment perception result; and excluding the self-vehicle perception result from the environment perception result to obtain the outside-vehicle perception result.
  • the tag portion of the vehicle includes at least one of the following: a license plate of the vehicle, a QR code affixed to the outside of the vehicle, a non-visible light label affixed to the outside of the vehicle, and a radio frequency (RF) tag mounted on the vehicle.
  • In some embodiments, the environment perception result includes the positions of the plurality of objects, and determining the outside-vehicle perception result of the vehicle includes: determining the position of the vehicle; identifying, from the plurality of objects, an object that matches the vehicle by matching the position of the vehicle with the positions of the plurality of objects; and excluding the perception result corresponding to the matching object from the environment perception result to obtain the outside-vehicle perception result.
  • In some embodiments, the method 400 further includes: determining a rough position of the vehicle in the environment; determining, based on the rough position, an object corresponding to the vehicle from the plurality of objects in the environment perception result; and determining the position information of the object corresponding to the vehicle as the fine position of the vehicle in the environment.
  • controlling the driving behavior of the vehicle further includes controlling the driving behavior of the vehicle based on the fine position of the vehicle.
  • the at least one sensor includes at least one of: a sensor arranged near a road on which the vehicle is traveling, and a sensor integrated on other vehicles in the environment.
  • FIG. 5 illustrates a flowchart of a method 500 for assisting in controlling automatic driving of a vehicle according to an embodiment of the present disclosure.
  • the method 500 may be implemented by the roadside auxiliary device 210 of FIG. 2.
  • the roadside auxiliary device 210 acquires environment-related perception information collected by at least one sensor. The at least one sensor is arranged in the environment and is independent of the vehicle.
  • the roadside auxiliary device 210 determines the environment perception result related to the environment by processing the acquired perception information; the environment perception result indicates related information of multiple objects in the environment, the multiple objects including the vehicle.
  • the roadside auxiliary device 210 provides the environment perception result to the vehicle-side control device associated with the vehicle for assisting in controlling the driving behavior of the vehicle.
  • the method 500 further comprises: determining, based on the environment perception result, a behavior prediction of at least one of the plurality of objects, the behavior prediction including at least one of the following: an expected motion trajectory of the at least one object, an expected motion speed of the at least one object, and an expected motion direction of the at least one object; and providing the determined behavior prediction to the on-board control system for further assisting in controlling the driving behavior of the vehicle.
  • determining the behavior prediction includes determining a behavior prediction using a prediction model specific to a region where the at least one sensor is located, the prediction model being trained based on the behavior of another object appearing at the region.
  • the method 500 further includes: determining, based on the environment perception result, an automatic driving recommendation for the vehicle, the automatic driving recommendation including at least one of the following: a driving path recommendation for the vehicle, a driving direction recommendation for the vehicle, and an operation instruction recommendation for controlling the driving behavior of the vehicle; and providing the determined automatic driving recommendation to the on-board control system for further assisting in controlling the driving behavior of the vehicle.
  • determining an automatic driving recommendation includes determining an automatic driving recommendation using a recommendation model specific to an area where the at least one sensor is located, the recommendation model being trained based on driving behavior performed by another vehicle at the area.
  • determining the environment perception result includes: obtaining a static high-precision map associated with the environment, the static map indicating at least the position of a static object in the environment; and determining the environment perception result based on the perception information and the static high-precision map.
  • determining the environment perception result based on the perception information and the static high-precision map includes: updating the static high-precision map with the perception information to obtain a real-time high-precision map associated with the environment as the environment perception result.
  • the perception information includes image perception information, and determining the environment perception result based on the perception information and the static high-precision map includes: identifying static objects and other objects in the environment from the image perception information; and determining the positions of the other objects from the positions of the static objects indicated by the static high-precision map, based on the relative position relationship between the static objects and the other objects in the image perception information.
  • providing the environment perception result to the vehicle-side control device includes: determining the out-of-vehicle perception result of the vehicle by excluding the own-vehicle perception result corresponding to the vehicle from the environment perception result; and sending the out-of-vehicle perception result to the vehicle-side control device.
  • determining the out-of-vehicle perception result of the vehicle includes: identifying, from the environment perception result, identification information related to a tag portion with which the vehicle is equipped; determining, based on the identification information, the own-vehicle perception result corresponding to the vehicle in the environment perception result; and excluding the own-vehicle perception result from the environment perception result to obtain the out-of-vehicle perception result.
  • the tag portion with which the vehicle is equipped includes at least one of the following: a license plate of the vehicle, a QR code affixed to the outside of the vehicle, a non-visible light tag affixed to the outside of the vehicle, and a radio frequency tag installed on the vehicle.
  • the at least one sensor includes at least one of: a sensor arranged near a road on which the vehicle is traveling, and a sensor integrated on other vehicles in the environment.
  • FIG. 6 shows a schematic block diagram of an example device 600 that can be used to implement embodiments of the present disclosure.
  • the device 600 may be used to implement the roadside auxiliary device 210 or the vehicle-side control device 220 of FIG. 2.
  • the device 600 includes a computing unit 601, which can perform various appropriate actions and processes based on computer program instructions stored in a read-only memory (ROM) 602 or computer program instructions loaded from a storage unit 608 into a random access memory (RAM) 603.
  • the computing unit 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604.
  • An input/output (I/O) interface 605 is also connected to the bus 604.
  • multiple components in the device 600 are connected to the I/O interface 605, including: an input unit 606, such as a keyboard, a mouse, etc.; an output unit 607, such as various types of displays, speakers, etc.; a storage unit 608, such as a magnetic disk, an optical disk, etc.; and a communication unit 609, such as a network card, a modem, a wireless communication transceiver, and the like.
  • the communication unit 609 allows the device 600 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
  • the computing unit 601 may be various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, microcontroller, etc.
  • the computing unit 601 may perform various methods and processes described above, such as the process 400 or the process 500.
  • process 400 or process 500 may be implemented as a computer software program that is tangibly embodied on a machine-readable medium, such as storage unit 608.
  • part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609.
  • when the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the process 400 or the process 500 described above may be performed.
  • the computing unit 601 may be configured to perform the process 400 or the process 500 in any other suitable manner (for example, by means of firmware).
  • exemplary types of hardware logic components include: Field Programmable Gate Arrays (FPGA), Application-Specific Integrated Circuits (ASIC), Application-Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and so on.
  • Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that, when executed by the processor or controller, the program code causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on the machine, partly on the machine, partly on the machine and partly on a remote machine as a stand-alone software package, or entirely on a remote machine or server.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • a machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • more specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

According to embodiments of the present disclosure, a method, device, medium, and system for controlling autonomous driving of a vehicle are provided. A method for controlling autonomous driving of a vehicle includes obtaining an environment perception result related to the environment surrounding the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and indicating related information of multiple objects in the environment; determining an out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, an own-vehicle perception result corresponding to the vehicle; and controlling the driving behavior of the vehicle based at least on the out-of-vehicle perception result. Performing environment perception with off-vehicle sensors lowers the requirements on the perception capability of the vehicle itself, so that non-autonomous vehicles or vehicles with weak autonomous driving capability can upgrade their autonomous driving capability simply and at low cost.

Description

Method, Device, Medium, and System for Controlling Autonomous Driving of a Vehicle
Technical Field
Embodiments of the present disclosure relate generally to the field of vehicle-exterior interaction, and more specifically to a method, apparatus, device, computer-readable storage medium, and vehicle-road cooperation system for controlling autonomous driving of a vehicle.
Background
In recent years, technologies related to autonomous driving (also known as driverless driving) have gradually come to the fore, and the autonomous driving capability of vehicles is increasingly anticipated. The foundation of autonomous driving technology is perception of the environment around a vehicle, that is, recognizing the specific conditions of the surroundings. Only on the basis of perceiving the environment can the driving behaviors a vehicle may perform in the current environment be determined, and the vehicle then be controlled to carry out the corresponding driving behavior. In the current autonomous driving field, the vehicle itself is required to perceive its surroundings, so it must be equipped with various sensor devices, such as lidar. Such sensor devices are costly to purchase and maintain, and cannot be reused as vehicles are replaced. Moreover, the high requirements on a vehicle's own perception capability make it difficult to upgrade, easily and at low cost, a non-autonomous vehicle or a vehicle with weak autonomous driving capability into one with strong autonomous driving capability.
Summary
According to embodiments of the present disclosure, a solution for controlling autonomous driving of a vehicle is provided.
In a first aspect of the present disclosure, a method for controlling autonomous driving of a vehicle is provided. The method includes obtaining an environment perception result related to the environment surrounding the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and indicating related information of multiple objects in the environment; determining an out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, an own-vehicle perception result corresponding to the vehicle; and controlling the driving behavior of the vehicle based at least on the out-of-vehicle perception result.
In a second aspect of the present disclosure, an apparatus for controlling autonomous driving of a vehicle is provided. The apparatus includes a communication module configured to obtain an environment perception result related to the environment surrounding the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and indicating related information of multiple objects in the environment; an information processing module configured to determine an out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, an own-vehicle perception result corresponding to the vehicle; and a driving control module configured to control the driving behavior of the vehicle based at least on the out-of-vehicle perception result.
In a third aspect of the present disclosure, a device is provided, including one or more processors, and a storage apparatus for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to the first aspect of the present disclosure.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, and the program, when executed by a processor, implements the method according to the first aspect of the present disclosure.
In a fifth aspect of the present disclosure, a vehicle-road cooperation system is provided. The system includes a vehicle-side control apparatus, including the apparatus according to the second aspect; at least one sensor arranged in the environment and independent of the vehicle, configured to collect perception information related to the environment; and a roadside auxiliary apparatus configured to process the perception information to determine an environment perception result related to the environment.
It should be understood that the content described in this Summary is not intended to identify key or important features of embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become easily understood from the following description.
Brief Description of the Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent with reference to the following detailed description taken in conjunction with the accompanying drawings. In the drawings, identical or similar reference numerals denote identical or similar elements, in which:
FIG. 1 shows a schematic diagram of an example environment in which multiple embodiments of the present disclosure can be implemented;
FIG. 2 shows a block diagram of a vehicle-road cooperation system according to some embodiments of the present disclosure;
FIG. 3 shows a schematic diagram of an example static map according to some embodiments of the present disclosure;
FIG. 4 shows a flowchart of a process for controlling autonomous driving of a vehicle according to some embodiments of the present disclosure;
FIG. 5 shows a flowchart of a process for assisting in controlling autonomous driving of a vehicle according to some embodiments of the present disclosure; and
FIG. 6 shows a block diagram of a computing device capable of implementing multiple embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be construed as limited to the embodiments set forth here; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for exemplary purposes only and are not intended to limit the protection scope of the present disclosure.
In the description of embodiments of the present disclosure, the term "include" and its variants should be understood as open-ended inclusion, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first", "second", and so on may refer to different or identical objects. Other explicit and implicit definitions may also be included below.
As mentioned above, perception of the environment around a vehicle is important for supporting its autonomous driving capability. In traditional autonomous driving technology, the vehicle itself is required to be equipped with high-cost sensors to obtain perception capability. This not only increases economic cost but also hinders upgrading the autonomous driving capability of existing vehicles.
Generally, the precision and the cost of a sensor tend to be proportional. Lowering sensor cost to save money inevitably degrades perception performance, or requires more low-performance sensors to cooperate with one another to reduce perception blind zones as much as possible. During use, once an on-board sensor is damaged, repairing the individual vehicle or component brings additional cost. Furthermore, the sensors installed on each vehicle are usually adapted to the design and manufacture of that vehicle and therefore may not be reusable when the vehicle is scrapped. On the other hand, the high requirements on a vehicle's own perception capability make it impossible to easily and inexpensively upgrade a non-autonomous vehicle, or a vehicle with weak autonomous driving capability, into a vehicle with strong autonomous driving capability; an upgrade of autonomous driving capability can usually be obtained only by replacing the vehicle.
According to embodiments of the present disclosure, a solution for autonomous driving control with off-vehicle assisted perception is proposed. In this solution, perception information related to the environment is collected by sensors that are arranged in the environment around the vehicle and independent of the vehicle. An environment perception result is determined based on such perception information. An own-vehicle perception result corresponding to the vehicle is excluded from the environment perception result to obtain an out-of-vehicle perception result of the vehicle for use in controlling the driving behavior of the vehicle. Performing environment perception with off-vehicle sensors lowers the requirements on the perception capability of the vehicle itself, so that non-autonomous vehicles or vehicles with weak autonomous driving capability can upgrade their autonomous driving capability simply and at low cost. The off-vehicle sensors can also be used to assist the autonomous driving control of multiple vehicles in the environment, which improves sensor utilization.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Example Environment and System
FIG. 1 shows a schematic diagram of an example traffic environment 100 in which multiple embodiments of the present disclosure can be implemented. Some typical objects are schematically shown in this example environment 100, including a road 102, a traffic indication facility 103, plants 107 on both sides of the road, and a pedestrian 109 that may appear. It should be understood that these illustrated facilities and objects are only examples; the objects that may appear in different traffic environments will vary according to the actual situation. The scope of the present disclosure is not limited in this respect.
In the example of FIG. 1, one or more vehicles 110-1, 110-2 are traveling on the road 102. For ease of description, the multiple vehicles 110-1, 110-2 are collectively referred to as vehicles 110. A vehicle 110 may be any type of vehicle that can carry people and/or things and move via a power system such as an engine, including but not limited to a car, truck, bus, electric vehicle, motorcycle, recreational vehicle, train, and so on. One or more vehicles 110 in the environment 100 may be vehicles with a certain autonomous driving capability; such vehicles are also called driverless vehicles. Of course, one or more other vehicles 110 in the environment 100 may also be vehicles without autonomous driving capability.
One or more sensors 105-1 to 105-6 (collectively referred to as sensors 105) are also arranged in the environment 100. The sensors 105 are independent of the vehicles 110 and are used to monitor the conditions of the environment 100 to obtain perception information related to the environment 100. To monitor the environment 100 in all directions, the sensors 105 may be arranged near the road 102 and may include one or more types of sensors. For example, the sensors 105 may be arranged on both sides of the road 102 at certain intervals for monitoring specific areas of the environment 100, and multiple types of sensors may be arranged in each area. In some examples, in addition to fixing the sensors 105 at specific locations, movable sensors 105 may also be provided, such as movable perception stations.
The perception information collected by the sensors 105 arranged in correspondence with the road 102 may also be referred to as roadside perception information. Roadside perception information can be used to facilitate driving control of the vehicles 110. To realize autonomous driving control of a vehicle 110 using roadside perception information, the roadside and the vehicle side may cooperate to perform vehicle control. FIG. 2 shows a schematic diagram of such a vehicle-road cooperation system 200. For ease of description, the vehicle-road cooperation system 200 is discussed below in connection with FIG. 1.
The vehicle-road cooperation system 200 includes the sensors 105, a roadside auxiliary apparatus 210 for assisting the autonomous driving of the vehicles 110, and a vehicle-side control apparatus 220 for controlling the autonomous driving of a vehicle 110. The roadside auxiliary apparatus 210 is sometimes also referred to herein as an apparatus for assisting the autonomous driving of a vehicle. The roadside auxiliary apparatus 210 is used in combination with the environment 100 so as to assist in controlling the autonomous driving of vehicles appearing in the environment 100. The roadside auxiliary apparatus 210 may be installed at any location, as long as it can communicate with the sensors 105 and the vehicle-side control apparatus 220. Since both are deployed at the roadside, the sensors 105 and the roadside auxiliary apparatus 210 may also constitute a roadside auxiliary subsystem.
The vehicle-side control apparatus 220 is sometimes also referred to herein as an apparatus for controlling the autonomous driving of the vehicle 110. The vehicle-side control apparatus 220 is used in association with one corresponding vehicle 110, for example integrated into the vehicle 110, to control the autonomous driving of that vehicle 110. One or more vehicles 110 in the environment 100 may each be equipped with a vehicle-side control apparatus 220. For example, one vehicle-side control apparatus 220 may be integrated on the vehicle 110-1, and one may similarly be integrated on the vehicle 110-2. In the following, the corresponding functions of the vehicle-side control apparatus 220 are described with respect to one vehicle 110.
The roadside auxiliary apparatus 210 includes a communication module 212 and an information processing module 214. The communication module 212 can support wired/wireless communication with the sensors 105 for obtaining, from the sensors 105, the collected perception information related to the environment 100. The communication module 212 can also support communication with the vehicle-side control apparatus 220; such communication is usually wireless. The communication between the communication module 212 and the sensors 105 and the vehicle-side control apparatus 220 may be based on any communication protocol, and implementations of the present disclosure are not limited in this respect.
As mentioned above, to monitor the environment 100 in all directions, the sensors 105 arranged in the environment 100 may include multiple types of sensors. Examples of the sensors 105 may include, but are not limited to: image sensors (e.g., cameras), lidars, millimeter-wave radars, infrared sensors, positioning sensors, illumination sensors, pressure sensors, temperature sensors, humidity sensors, wind speed sensors, wind direction sensors, air quality sensors, and so on. Image sensors may collect image information related to the environment 100; lidars and millimeter-wave radars may collect laser point cloud data related to the environment 100; infrared sensors may use infrared rays to detect environmental conditions in the environment 100; positioning sensors may collect position information of objects in the environment 100; illumination sensors may collect measurements indicating light intensity in the environment 100; pressure, temperature, and humidity sensors may respectively collect measurements indicating pressure, temperature, and humidity in the environment 100; wind speed and wind direction sensors may respectively collect measurements indicating wind speed and wind direction in the environment 100; and air quality sensors may collect air-quality-related indicators in the environment 100, such as oxygen concentration, carbon dioxide concentration, dust concentration, and pollutant concentration in the air. It should be understood that only some examples of the sensors 105 are listed above; other different types of sensors may also exist according to actual needs. In some embodiments, different sensors may be integrated at a certain location or may be distributed over an area of the environment 100 for monitoring specific types of roadside perception information.
Since the perception information directly collected by the sensors 105 is large in data volume and diverse in type, directly transmitting it to the vehicle-side control apparatus 220 would not only cause large communication transmission overhead and excessive occupation of communication resources, but the same perception information might also need to be processed separately at different vehicles, reducing overall system performance. In implementations of the present disclosure, the perception information collected by the sensors 105 is processed centrally by the roadside auxiliary apparatus 210 (specifically, by the information processing module 214 in the roadside auxiliary apparatus 210).
The information processing module 214 of the roadside auxiliary apparatus 210 processes the perception information obtained from the sensors 105 to determine an environment perception result related to the environment 100. The environment perception result can indicate the overall condition of the environment 100, and specifically can indicate related information of the multiple objects in the environment, including the vehicles 110. Such related information includes each object's size, position (e.g., a precise position in the earth coordinate system), speed, direction of motion, distance from a specific viewpoint, and so on. The information processing module 214 may fuse different types of perception information from different sensors 105 to determine the environment perception result, and may employ various information fusion techniques to do so.
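The disclosure leaves the concrete fusion technique open. Purely as an illustrative sketch (the names ObjectInfo and fuse_reports are hypothetical and not part of this disclosure), the environment perception result can be modeled as a collection of per-object records assembled from the reports of multiple sensors 105:

    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    @dataclass
    class ObjectInfo:
        """Related information of one perceived object in the environment."""
        object_id: str
        size: Tuple[float, float, float]   # length, width, height in meters
        position: Tuple[float, float]      # precise position in a world frame
        speed: float                       # meters per second
        heading: float                     # radians, world frame
        tag_id: str = ""                   # identifier read from a tag portion, if any

    def fuse_reports(reports: List[List[ObjectInfo]]) -> List[ObjectInfo]:
        """Toy stand-in for information fusion: keep the most recent record per
        object identifier. A real system would associate detections across
        sensors, resolve conflicts, and filter noise."""
        fused: Dict[str, ObjectInfo] = {}
        for report in reports:             # reports assumed ordered oldest-to-newest
            for obj in report:
                fused[obj.object_id] = obj
        return list(fused.values())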
To ensure safe driving of the vehicles 110, the related information of each object provided by the environment perception result should be of relatively high precision. The specific processing, by the roadside auxiliary apparatus 210, of the perception information collected by the sensors 105 will be described in detail below. The communication module 212 in the roadside auxiliary apparatus 210 is configured to transmit the environment perception result obtained after processing by the information processing module 214 to the vehicle-side control apparatus 220.
The vehicle-side control apparatus 220 controls the driving behavior of the corresponding vehicle 110 (for example, the vehicle on which the vehicle-side control apparatus 220 is installed) based on the environment perception result obtained from the roadside auxiliary apparatus 210. The vehicle-side control apparatus 220 includes a communication module 222, an information processing module 224, and a driving control module 226. The communication module 222 is configured to be communicatively coupled with the roadside auxiliary apparatus 210, in particular with the communication module 212 therein, to receive the environment perception result from the communication module 212. The information processing module 224 is configured to process the environment perception result so that it is suitable for autonomous driving control of the vehicle 110. The driving control module 226 is configured to control the driving behavior of the vehicle 110 based on the processing result of the information processing module 224.
Vehicle-Side Driving Control
How the vehicle-side control apparatus 220 performs autonomous driving control of the vehicle 110 will first be described in detail below.
The communication module 222 of the vehicle-side control apparatus 220 may obtain, from the roadside auxiliary apparatus 210, the environment perception result related to the environment 100 around the vehicle 110. Such an environment perception result is based on perception information collected by one or more sensors 105 arranged in the environment 100 and independent of the vehicle 110, and indicates related information of multiple objects in the environment 100, such as the objects' sizes, positions (e.g., precise positions in the earth coordinate system), speeds, directions of motion, distances from a specific viewpoint, and so on.
In some embodiments, in addition to obtaining the environment perception result from the roadside auxiliary apparatus 210, the vehicle-side control apparatus 220 may also obtain, as a supplement, environment perception results from sensors integrated on other vehicles in the environment 100. Some vehicles in the environment 100 may have sensors with strong perception capability (such as lidar) or with ordinary perception capability (such as cameras). The perception information collected by these sensors can also help assist the autonomous driving control of other vehicles. For a certain vehicle (e.g., the vehicle 110-1), the vehicle-side control apparatus 220 associated with the vehicle 110-1 may obtain, from sensors on another vehicle (e.g., the vehicle 110-2), raw perception information or a perception result obtained by processing the raw perception information.
Generally, sensors mounted on a vehicle detect the surrounding environment from the perspective of the vehicle itself, so the obtained perception information does not contain information about the vehicle itself. However, since sensors outside the vehicle (such as roadside sensors or sensors on other vehicles) observe the whole environment from their own perspective rather than the vehicle's, they monitor the vehicle and other objects indiscriminately, and the collected information therefore contains perception information about all objects in the entire environment.
According to embodiments of the present disclosure, the information processing module 224 excludes the own-vehicle perception result corresponding to the vehicle 110 from the environment perception result to determine the out-of-vehicle perception result of the vehicle 110. The own-vehicle perception result may refer to the information in the environment perception result that relates to the vehicle 110 itself, such as the vehicle's size, position, speed, direction, and distance from a specific viewpoint. The out-of-vehicle perception result includes related information of objects other than the vehicle 110. While driving, the vehicle 110 needs to treat all objects other than itself as obstacles so as to reasonably plan its travel path and avoid collisions with obstacles. In embodiments of the present disclosure, identifying and excluding the own-vehicle perception result from the environment perception result makes the out-of-vehicle perception result better suited for autonomous driving control of the vehicle 110.
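As a minimal sketch of this exclusion step, reusing the hypothetical ObjectInfo records from the fusion sketch above, the out-of-vehicle perception result is simply the environment perception result with the ego record dropped:

    def out_of_vehicle_result(env_result, ego_object_id):
        """Everything except the own-vehicle record; the remainder can be
        treated as obstacles for path planning."""
        return [obj for obj in env_result if obj.object_id != ego_object_id]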
To determine the out-of-vehicle perception result of the vehicle 110 from the comprehensive environment perception result, in some embodiments the vehicle 110 may be equipped with a tag portion for identifying the vehicle 110. Such a tag portion may be one or more of the following: a license plate of the vehicle 110, a QR code affixed to the outside of the vehicle 110, a non-visible light tag affixed to the outside of the vehicle 110, and a radio frequency tag installed on the vehicle 110.
Motor vehicles traveling on roads are usually fitted with license plates that uniquely identify them. In some cases, for vehicles without license plates, or considering that a license plate is easily occluded, a QR code specific to the vehicle 110 may also be affixed to the outside of the vehicle 110 as its tag portion. The license plate and/or QR code of the vehicle 110 can be recognized from image information collected by image sensors. In some examples, so as not to affect the vehicle's appearance, a non-visible light tag, such as an infrared- or ultraviolet-reflective tag, may also be affixed to the vehicle 110 to identify it; such a non-visible light tag can be recognized by a non-visible light sensor. Alternatively or additionally, a radio frequency tag installed on the vehicle 110 may also be used to identify the vehicle 110: the radio frequency tag emits a signal, and the vehicle 110 is identified by reading the emitted signal with a radio frequency reader.
Through the tag portion of the vehicle 110, the information processing module 224 can identify, from the environment perception result, identification information related to the tag portion of the vehicle 110. Such identification information may be, for example, image information of the vehicle's license plate or QR code, or indication information of the specific signals of the non-visible light tag and the radio frequency tag. The information processing module 224 matches the identifier indicated by the own vehicle's tag portion against the environment perception result to recognize the corresponding identification information therein. The information processing module 224 then determines, based on the identification information, the own-vehicle perception result corresponding to the vehicle 110 in the environment perception result. Usually, the roadside auxiliary apparatus 210 combines the related information of each object together, so through the identification information of the vehicle 110, other information related to the vehicle 110 in the environment perception result, such as its position and size, can be determined.
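A sketch of the tag-based lookup, again using the hypothetical ObjectInfo model and assuming the roadside processing has already attached the decoded tag identifier (license plate string, QR payload, RF tag ID, and so on) to each record:

    def find_ego_by_tag(env_result, ego_tag_id):
        """Return the perceived object whose decoded tag identifier matches the
        identifier registered for the ego vehicle, or None if not found."""
        for obj in env_result:
            if obj.tag_id and obj.tag_id == ego_tag_id:
                return obj
        return None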
In some embodiments, besides using the tag portion with which the vehicle is equipped to identify the vehicle 110 itself, the own-vehicle perception result in the environment perception result may also be identified based on the position of the vehicle 110. As mentioned above, the environment perception result may include the positions of multiple objects. The information processing module 224 may use various positioning techniques to determine the position of the vehicle 110, and then match the position of the vehicle 110 against the positions of the multiple objects in the environment perception result to identify, among the multiple objects, the object that matches the vehicle 110. In this way, the information processing module 224 can recognize which object in the environment perception result is the vehicle 110 itself, and can accordingly exclude the perception result corresponding to the matching object from the environment perception result to obtain the out-of-vehicle perception result.
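A sketch of the position-matching variant; the 2.0 m gate is an invented threshold, not a value from this disclosure:

    import math

    def find_ego_by_position(env_result, ego_position, max_distance=2.0):
        """Pick the perceived object nearest to the vehicle's own position
        estimate, provided it falls within a gating distance in meters."""
        best, best_d = None, float("inf")
        for obj in env_result:
            d = math.dist(obj.position, ego_position)
            if d < best_d:
                best, best_d = obj, d
        return best if best_d <= max_distance else None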
When determining the out-of-vehicle perception result based on position matching, the position of the vehicle 110 may be a precise position (e.g., of precision similar to that of the object positions included in the environment perception result) or a coarse position (e.g., sub-meter positioning). When objects in the environment 100 are relatively far apart from one another, the coarse position of the vehicle 110 can still accurately match the object at the overlapping location in the environment perception result. In some embodiments, the position of the vehicle 110 may be determined by positioning equipment of the vehicle 110 itself, such as a Global Positioning System (GPS) antenna or a position sensor. The vehicle 110 may also be positioned based on other positioning technologies, such as base stations communicating with the communication module 222 and/or the roadside auxiliary apparatus 210 arranged in the environment 100, or any other technique.
After identifying the own-vehicle perception result of the vehicle 110, the information processing module 224 may delete or ignore the own-vehicle perception result in the environment perception result and consider only the remaining environment perception result (i.e., the out-of-vehicle perception result). The out-of-vehicle perception result is used by the driving control module 226 in the vehicle-side control apparatus 220 to control the driving behavior of the vehicle 110. On the basis of the known out-of-vehicle perception result, the driving control module 226 may use various autonomous driving strategies to control the driving behavior of the vehicle 110, which may include the vehicle's travel path, travel direction, travel speed, and so on. The driving control module 226 may generate specific operation commands for the driving behavior of the vehicle 110, such as commands for the vehicle's drive train, steering system, and so on, so that the vehicle 110 drives according to such commands. The operation commands may be, for example, accelerate, decelerate, steer left, steer right, stop, honk, turn lights on or off, or any other command related to driving the vehicle 110.
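The set of operation commands lends itself to an enumeration. The following is one illustrative encoding, with a deliberately simplistic toy rule standing in for the autonomous driving strategy, which this disclosure leaves open:

    from enum import Enum, auto

    class OperationCommand(Enum):
        ACCELERATE = auto()
        DECELERATE = auto()
        STEER_LEFT = auto()
        STEER_RIGHT = auto()
        STOP = auto()
        HONK = auto()
        LIGHTS_ON = auto()
        LIGHTS_OFF = auto()

    def toy_policy(distance_to_nearest_obstacle_m: float) -> OperationCommand:
        """Not this disclosure's strategy: stop when an obstacle is very close
        ahead, slow down when it is close, otherwise keep accelerating."""
        if distance_to_nearest_obstacle_m < 5.0:
            return OperationCommand.STOP
        if distance_to_nearest_obstacle_m < 20.0:
            return OperationCommand.DECELERATE
        return OperationCommand.ACCELERATE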
In some embodiments, when controlling the driving behavior of the vehicle 110, the driving control module 226 may determine behavior predictions of one or more objects (i.e., obstacles) in the environment 100 based on the out-of-vehicle perception result. A behavior prediction includes one or more aspects such as an object's expected motion trajectory, expected motion speed, and expected motion direction. Behavior predictions of objects are also useful for autonomous driving control, because such control often needs to judge how the objects around the vehicle are about to move so that corresponding driving behavior can be adopted in response. In some embodiments, the driving control module 226 may perform behavior prediction based on a pre-trained prediction model. Such a prediction model may be, for example, a general behavior prediction model, or may include different prediction models for different types of objects. The driving control module 226 may determine the driving behavior of the vehicle 110 based on the behavior predictions of the objects.
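The prediction model itself is unspecified here; a constant-velocity extrapolation is the simplest possible stand-in and illustrates the shape of the output (a trajectory of expected positions). It reuses the hypothetical ObjectInfo fields from the earlier sketch:

    import math

    def predict_trajectory(obj, horizon_s=3.0, dt=0.5):
        """Constant-velocity stand-in for a trained prediction model: extrapolate
        the object's position along its current heading. Returns (x, y) waypoints."""
        x, y = obj.position
        vx = obj.speed * math.cos(obj.heading)
        vy = obj.speed * math.sin(obj.heading)
        steps = int(horizon_s / dt)
        return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]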
In some embodiments, when controlling the driving behavior of the vehicle, in addition to the out-of-vehicle perception result, the information processing module 224 also controls the driving of the vehicle based on the position of the vehicle 110. Generally, for accurate and safe autonomous driving control, it is desirable to know the precise position of the vehicle 110. In one embodiment, the vehicle 110 may be equipped with a sensor capable of precise positioning. In another embodiment, the precise position of the vehicle 110 may also be determined from the environment perception result, which can lower the requirements on precise positioning hardware of the vehicle 110 and improve positioning precision and stability.
As discussed above, the environment perception result includes a relatively high-precision position of the vehicle 110. The precise position used in the autonomous driving control of the vehicle 110 can be determined from the environment perception result. In such embodiments, the vehicle-side control apparatus 220 may include a vehicle positioning module (not shown), which may be configured to identify the vehicle 110 from the environment perception result by means of position matching.
Specifically, the vehicle positioning module may first determine a coarse position of the vehicle 110, for example through the GPS antenna of the vehicle 110 or with the help of auxiliary equipment such as base stations. Based on the coarse position of the vehicle 110, the vehicle positioning module determines, from the environment perception result, the object matching the vehicle 110, and determines the position of that matching object in the environment perception result as the fine position of the vehicle 110 (i.e., a position of higher precision). In this way, the precise position of the vehicle 110 can be obtained for controlling its driving behavior without requiring the vehicle 110 or the vehicle-side control apparatus 220 to have a precise on-board positioning device.
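A sketch of this coarse-to-fine refinement, built on the find_ego_by_position helper from the earlier position-matching sketch (the 5 m gate reflects the coarse, e.g. sub-meter, positioning and is an assumption):

    def refine_position(env_result, coarse_position, gate_m=5.0):
        """Match the coarse (e.g. GPS) position against the perceived objects and
        adopt the matched object's high-precision position as the fine position.
        Falls back to the coarse position when nothing matches."""
        match = find_ego_by_position(env_result, coarse_position, max_distance=gate_m)
        return match.position if match is not None else coarse_position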
In some other embodiments, as discussed above, the own-vehicle perception result corresponding to the vehicle 110 may also be identified through the tag portion with which the vehicle 110 is equipped. Therefore, precise positioning of the vehicle 110 can be obtained from the identified own-vehicle perception result. This enables the vehicle 110 to achieve accurate positioning even without an on-board positioning device.
In some embodiments of the present disclosure, in addition to obtaining the environment perception result from the roadside auxiliary apparatus 210, the vehicle-side control apparatus 220 may also obtain other auxiliary driving information for assisting the autonomous driving of the vehicle 110. In one embodiment, the communication module 222 of the vehicle-side control apparatus 220 may obtain, from the roadside auxiliary apparatus 210 (e.g., from the communication module 212), behavior predictions of one or more objects in the environment 100, each including one or more aspects such as an expected motion trajectory, expected motion speed, and expected motion direction of the object. In another embodiment, the communication module 222 may obtain, from the roadside auxiliary apparatus 210 (e.g., from the communication module 212), an autonomous driving recommendation for the vehicle 110, including one or more of a travel path recommendation for the vehicle 110, a travel direction recommendation, and a recommendation of specific operation instructions for controlling the driving behavior of the vehicle.
In addition to the out-of-vehicle perception result, the driving control module 226 of the vehicle-side control apparatus 220 may also control the driving behavior of the vehicle 110 based on the behavior predictions of objects and/or the autonomous driving recommendation obtained from the roadside auxiliary apparatus 210. When controlling the driving behavior of the vehicle 110, the driving control module 226 may refer to or adjust the behavior predictions and/or the autonomous driving recommendation obtained from the roadside auxiliary apparatus 210 in order to determine the actual driving behavior of the vehicle 110.
Having the roadside auxiliary apparatus 210 perform behavior prediction and autonomous driving recommendation can further lower the requirements on the autonomous driving capability of the vehicle 110 itself or of the vehicle-side control apparatus 220, reducing processing and control complexity on the vehicle side. For example, the vehicle-side control apparatus 220 may, based on a simple autonomous driving control strategy, determine the driving behavior of the vehicle 110 by combining the behavior predictions and/or autonomous driving recommendation obtained from the roadside auxiliary apparatus 210 with the actual out-of-vehicle perception result.
The above has described the vehicle-side control apparatus 220 obtaining the environment perception result, and possibly also behavior predictions of objects and/or autonomous driving recommendations, from the roadside auxiliary apparatus 210 to control the driving behavior of the vehicle 110. In the above embodiments, the sensors 105 and the roadside auxiliary apparatus 210 undertake the function of perceiving the environment around the vehicle 110, and can additionally provide auxiliary driving information such as behavior predictions and/or autonomous driving recommendations. The environment perception result and other auxiliary driving information obtained by means of the roadside auxiliary apparatus 210 and the sensors 105 can be provided to multiple vehicles 110 in the environment 100, realizing centralized environment perception and information processing.
Under such an implementation, autonomous driving can be achieved without requiring the vehicle 110 itself to possess strong environment perception, self-positioning, behavior prediction, and/or autonomous driving planning capabilities. The improvement of the autonomous driving capability of the vehicle 110 can be achieved by integrating the vehicle-side control apparatus 220. For example, the functions of the vehicle-side control apparatus 220 can be integrated into the vehicle 110 by upgrading the vehicle's software system and adding communication functionality, or by means of the communication functionality the vehicle 110 already has. Moreover, with the roadside auxiliary apparatus 210 providing behavior prediction capability and/or autonomous driving recommendations, the autonomous driving process of the vehicle 110 can be kept going even when the vehicle's hardware and/or software fails and cannot perform behavior prediction and driving planning.
In a specific example, if the roadside auxiliary apparatus 210 and the sensors 105 are deployed on a certain segment of a road system, a vehicle 110 traveling onto that segment can obtain stronger autonomous driving capability merely by integrating the vehicle-side control apparatus 220. In some cases, a vehicle 110 without autonomous driving capability (e.g., a vehicle classified at level L0 or L1 of the autonomous driving scale) or with weak driving capability (e.g., an L2-level vehicle) can, by means of the environment perception result, obtain stronger autonomous driving capability (e.g., similar to an L3- or L4-level autonomous vehicle).
Roadside-Assisted Driving Control
The above embodiments mainly describe the specific implementation of the vehicle-side control apparatus 220 in the cooperative control system 200 of FIG. 2. Some embodiments of the roadside auxiliary apparatus 210 in the cooperative control system 200 will be described below.
According to embodiments of the present disclosure, the roadside auxiliary apparatus 210 obtains the perception information of the sensors 105 and determines the environment perception result by processing the perception information. The roadside auxiliary apparatus 210 then provides the environment perception result to the vehicle-side control apparatus 220 for assisting in controlling the driving behavior of the vehicle 110.
In some embodiments, to further reduce the processing complexity of the vehicle-side control apparatus 220, the roadside auxiliary apparatus 210 may determine, from the environment perception result, out-of-vehicle perception results corresponding to one or more vehicles 110 and provide the out-of-vehicle perception results to the vehicle-side control apparatuses 220. That is, the perception results that the roadside auxiliary apparatus 210 provides to the individual vehicles 110 may be different out-of-vehicle perception results that are specific to each vehicle and can be directly used for the driving control of those vehicles. Specifically, the information processing module 214 of the roadside auxiliary apparatus 210 excludes the own-vehicle perception result corresponding to a certain vehicle 110 from the environment perception result, thereby determining the out-of-vehicle perception result of that vehicle 110. The roadside auxiliary apparatus 210 then provides the determined out-of-vehicle perception result to the vehicle-side control apparatus associated with that vehicle for assisting in controlling its driving behavior.
The manner in which the information processing module 214 identifies the out-of-vehicle perception result of a certain vehicle 110 is similar to that adopted by the vehicle-side control apparatus 220. For example, the information processing module 214 may also identify the vehicle 110 based on the tag portion with which it is equipped, such as one or more of the vehicle's license plate, QR code, non-visible light tag, and radio frequency tag. Specifically, the information processing module 214 identifies, from the environment perception result, identification information related to the tag portion with which the vehicle 110 is equipped, and then determines, based on the identification information, the own-vehicle perception result corresponding to the vehicle 110 in the environment perception result. The information processing module 214 can exclude the own-vehicle perception result from the environment perception result to obtain the out-of-vehicle perception result for provision to the vehicle-side control apparatus 220.
In some embodiments, to determine the environment perception result from the perception information obtained by the sensors 105 more quickly and accurately, the information processing module 214 may also determine the environment perception result with the aid of a static high-precision map associated with the environment 100. The static high-precision map includes related information of the static objects of the environment 100. It may be generated from information related to the environment 100 previously collected by the sensors 105 arranged in the environment 100, and it includes only related information of objects in the environment 100 that protrude above the ground and remain stationary for a relatively long time.
FIG. 3 shows an example of a static high-precision map 300 associated with the environment 100 of FIG. 1. Compared with the environment 100, the static high-precision map 300 includes only stationary objects, such as the poles on which the sensors 105 are arranged, the traffic indication facility 103, and the plants 107 on both sides of the road. These objects remain stationary for a period of time. Objects such as the vehicles 110 and the pedestrian 109 sometimes appear in the environment 100, sometimes disappear from it, or move within it; such objects are therefore called dynamic objects.
It should be understood that the static high-precision map 300 shown in FIG. 3 is given for illustration purposes only. Usually, in addition to schematically showing objects or giving images of them, a high-precision map also annotates other information about the objects, such as precise position, speed, direction, and so on. In some implementations, the static high-precision map includes a three-dimensional static high-precision map, which includes related information of objects in three-dimensional space.
Initially, a static high-precision map, such as the static high-precision map 300, may be generated from information related to the environment 100 collected by a high-precision map collection vehicle. The static high-precision map associated with the environment 100 may also be updated periodically or as triggered by corresponding events. The update period of the static high-precision map can be set to a relatively long time. The update of the static high-precision map may be based on perception information collected by the sensors 105, which are arranged in the environment 100 and monitor it in real time.
When the static high-precision map is used to determine the environment perception result, for the purpose of autonomous driving the environment perception result should reflect the real-time conditions of the environment 100. Therefore, the information processing module 214 may update the static high-precision map with the real-time perception results provided by the sensors 105 to obtain a real-time high-precision map associated with the environment 100 as the environment perception result. When updating the static high-precision map, the perception information from the sensors 105 and the static high-precision map can be fused, with the dynamic objects in the perception information, together with their related information, incorporated into the static high-precision map.
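A sketch of the map-update step under the same hypothetical data model, with both the static map and the live map held as dictionaries keyed by object identifier:

    def build_live_map(static_map, dynamic_objects):
        """Overlay freshly perceived dynamic objects onto the static
        high-precision map to obtain a real-time map. static_map maps
        object_id -> ObjectInfo for long-lived objects; the input is not mutated."""
        live = dict(static_map)
        for obj in dynamic_objects:
            live[obj.object_id] = obj
        return live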
When determining the environment perception result, the use of the static high-precision map can correct or delete objects that may have been erroneously detected in the real-time perception information, improving the accuracy of the environment perception result. For example, if, due to errors in the real-time perception information, a certain object in the environment 100 is detected as having a certain speed, it can be determined by reference to the static high-precision map that the object is actually a static object, thereby avoiding erroneously labeling the object's speed and thus affecting the autonomous driving control of the vehicle 110.
In some embodiments, the static high-precision map helps annotate the precise positions of objects in the environment 100, and such precise positions can form part of the environment perception result. Specifically, the information processing module 214 may use the image perception information in the perception results collected by the sensors 105. The information processing module 214 identifies objects in the environment from the image perception information; the identified objects include static objects in the environment as well as other objects (e.g., dynamic objects newly entering the environment 100). This can be achieved by image processing techniques for object recognition.
Then, based on the relative positional relationship between the identified static objects and the other objects, the information processing module 214 determines the positions of the other objects from the positions of the static objects indicated by the static high-precision map. Image perception information collected by an image sensor usually cannot indicate the geographic positions of the objects in it, for example their specific positions in the earth coordinate system, but it can reflect the relative positional relationships between different objects. Based on such relative positional relationships, the precise positions of the other objects can be determined from the known positions of the static objects indicated by the static high-precision map. In making this determination, the transformation relationship of the static objects from the image perception information to the static high-precision map can also be referred to, to determine the absolute geographic positions of the other objects in the environment 100. Object positioning using the static high-precision map can obtain high-precision positions quickly and accurately, reducing the computational overhead required for precise positioning.
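As a deliberately simplified illustration of anchoring on a static object, the following assumes a top-down camera view with a known scale, so that an image-space offset converts directly to a world-space offset; a real system would instead apply the full projective transformation mentioned above:

    def locate_by_anchor(anchor_world_xy, anchor_image_xy, target_image_xy,
                         meters_per_pixel):
        """Given a static object whose world position is known from the static
        high-precision map, translate the image-space offset between it and
        another object into world coordinates for that object."""
        dx = (target_image_xy[0] - anchor_image_xy[0]) * meters_per_pixel
        dy = (target_image_xy[1] - anchor_image_xy[1]) * meters_per_pixel
        return (anchor_world_xy[0] + dx, anchor_world_xy[1] + dy)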
As mentioned in the discussion of the vehicle-side control apparatus 220 above, in addition to providing the environment perception result or the out-of-vehicle perception results, the roadside auxiliary apparatus 210 may also process the environment perception result to obtain other auxiliary driving information for one or more vehicles in the environment 100, such as behavior predictions of objects in the environment 100 and/or autonomous driving recommendations for a specific vehicle 110. How behavior predictions of objects and autonomous driving recommendations for vehicles are determined in the roadside auxiliary apparatus 210 is discussed in detail below.
In some embodiments, the roadside auxiliary apparatus 210 further includes a behavior prediction module (not shown) configured to determine behavior predictions of one or more objects in the environment 100 based on the environment perception result. The determined behavior predictions are provided to the vehicle-side control apparatus 220 via the communication module 212 for further assisting in controlling the driving behavior of the corresponding vehicle 110. The behavior prediction of an object includes one or more aspects such as the object's expected motion trajectory, expected motion speed, and expected motion direction. Behavior predictions of objects are also useful for the autonomous driving control of vehicles, because such control often needs to judge how the objects around the vehicle are about to move so that corresponding driving behavior can be adopted in response.
In some embodiments, the behavior prediction module of the roadside auxiliary apparatus 210 may determine the behavior predictions of objects using a prediction model specific to the location or region where the sensors 105 are situated. Unlike the general prediction models used on the vehicle side for all objects or for different types of objects, the local prediction model of the sensors 105 can be trained based on the behavior of objects that have appeared in the region where the sensors 105 are located. The training data for the prediction model may be previously recorded behaviors of one or more objects at the region where the sensors 105 are situated.
Objects appearing in different geographic regions may exhibit specific behavior patterns associated with those regions. For example, if the sensors 105 are arranged near a tourist attraction, pedestrians in this region may walk with weak directionality, in a pattern resembling wandering strolls. If the sensors 105 are arranged near office premises such as office buildings, pedestrians in this region may walk more purposefully, for example heading toward one or more specific buildings. Therefore, by training a prediction model specific to a region, the behavior that objects will exhibit at that particular region can be predicted more accurately.
In some embodiments, the roadside auxiliary apparatus 210 further includes a driving recommendation module (not shown) configured to determine autonomous driving recommendations for one or more vehicles 110 based on the environment perception result. An autonomous driving recommendation may include a travel path recommendation for the vehicle 110, a travel direction recommendation for the vehicle 110, or even a recommendation of specific operation instructions for controlling the driving behavior of the vehicle 110. The autonomous driving recommendation determined by the driving recommendation module is provided to the vehicle-side control apparatus 220 via the communication module 212 for further assisting in controlling the driving behavior of the corresponding vehicle 110.
In some embodiments, the driving recommendation module of the roadside auxiliary apparatus 210 determines the autonomous driving recommendation using a recommendation model specific to the region where the sensors 105 are situated. The recommendation model is trained based on driving behavior performed by vehicles in the region where the sensors 105 are located; the training data may be previously recorded driving behavior adopted by one or more vehicles at that region. In different geographic regions, vehicles may exhibit specific driving behavior patterns associated with the region. For example, at a crowded intersection, vehicles may decelerate in advance; at certain intersections, more vehicles may turn left. By training a recommendation model specific to a region, vehicle driving behavior suitable for execution at that particular region can be provided more accurately.
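One way to organize such region-specific models, covering both the prediction models described earlier and the recommendation models described here, is a registry keyed by the region of the sensors 105. The registry below is an illustrative sketch, with models represented as plain callables:

    class RegionModelRegistry:
        """Holds one prediction model and one recommendation model per sensor
        region, each assumed to be trained only on behavior recorded in that
        region; falls back to a 'default' model when a region has none."""

        def __init__(self):
            self._prediction = {}
            self._recommendation = {}

        def register(self, region_id, prediction_model=None, recommendation_model=None):
            if prediction_model is not None:
                self._prediction[region_id] = prediction_model
            if recommendation_model is not None:
                self._recommendation[region_id] = recommendation_model

        def predict(self, region_id, perceived_object):
            model = self._prediction.get(region_id, self._prediction.get("default"))
            return model(perceived_object) if model else None

        def recommend(self, region_id, vehicle_state):
            model = self._recommendation.get(region_id, self._recommendation.get("default"))
            return model(vehicle_state) if model else None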
In some embodiments, the roadside auxiliary apparatus 210 may also provide the vehicle-side control apparatus 220 with other auxiliary driving information, such as traffic conditions and accident conditions in the environment 100 monitored by the sensors 105, all of which help the vehicle-side control apparatus 220 control the driving behavior of the vehicle 110 more accurately and reasonably.
According to embodiments of the present disclosure, the roadside auxiliary apparatus 210 and the sensors 105 jointly provide the vehicle-side control apparatus 220 with the environment perception result, and possibly also behavior predictions of objects and/or autonomous driving recommendations, for assisting in controlling the driving behavior of the vehicle 110. The environment perception result and other auxiliary driving information obtained by means of the roadside auxiliary apparatus 210 and the sensors 105 can be provided to multiple vehicles 110 in the environment 100, realizing centralized environment perception and information processing.
Under such an implementation, autonomous driving can be achieved without requiring the vehicle 110 itself to possess strong environment perception, self-positioning, behavior prediction, and/or autonomous driving planning capabilities. The improvement of the autonomous driving capability of the vehicle 110 can be achieved by integrating the vehicle-side control apparatus 220. For example, the functions of the vehicle-side control apparatus 220 can be integrated into the vehicle 110 by upgrading the vehicle's software system and adding communication functionality, or by means of the communication functionality the vehicle 110 already has. Moreover, with the roadside auxiliary apparatus 210 providing behavior prediction capability and/or autonomous driving recommendations, the autonomous driving process of the vehicle 110 can be kept going even when the vehicle's hardware and/or software fails and cannot perform behavior prediction and driving planning.
The above has described functions such as determining the environment perception result, behavior predictions of objects, and/or autonomous driving recommendations for vehicles being implemented by the roadside auxiliary apparatus 210. In some embodiments, one, some, or all of these functions may be executed by other devices with stronger computing capability, such as in the cloud, at edge computing sites, or at roadside base stations or servers. The roadside auxiliary apparatus 210 may provide the perception information of the sensors 105 to the corresponding processing device, obtain the processing results, and provide the corresponding processing results to the vehicle-side control apparatus 220.
Vehicle-Side Example Process
FIG. 4 shows a flowchart of a method 400 for controlling autonomous driving of a vehicle according to an embodiment of the present disclosure. The method 400 may be implemented by the vehicle-side control apparatus 220 of FIG. 2. At block 410, the vehicle-side control apparatus 220 obtains an environment perception result related to the environment around the vehicle. The environment perception result is based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and indicates related information of multiple objects in the environment. At block 420, the vehicle-side control apparatus 220 determines the out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, the own-vehicle perception result corresponding to the vehicle. At block 430, the vehicle-side control apparatus 220 controls the driving behavior of the vehicle based at least on the out-of-vehicle perception result.
In some embodiments, controlling the driving behavior of the vehicle further includes: obtaining a behavior prediction of at least one of the multiple objects, the behavior prediction including at least one of the following: an expected motion trajectory of the at least one object, an expected motion speed of the at least one object, and an expected motion direction of the at least one object; and controlling the driving behavior of the vehicle further based on the behavior prediction of the at least one object.
In some embodiments, controlling the driving behavior of the vehicle further includes: obtaining an autonomous driving recommendation for the vehicle, the autonomous driving recommendation including at least one of the following: a travel path recommendation for the vehicle, a travel direction recommendation for the vehicle, and an operation instruction recommendation for controlling the driving behavior of the vehicle; and controlling the driving behavior of the vehicle further based on the autonomous driving recommendation for the vehicle.
In some embodiments, determining the out-of-vehicle perception result of the vehicle includes: identifying, from the environment perception result, identification information related to a tag portion with which the vehicle is equipped; determining, based on the identification information, the own-vehicle perception result corresponding to the vehicle in the environment perception result; and excluding the own-vehicle perception result from the environment perception result to obtain the out-of-vehicle perception result.
In some embodiments, the tag portion with which the vehicle is equipped includes at least one of the following: a license plate of the vehicle, a QR code affixed to the outside of the vehicle, a non-visible light tag affixed to the outside of the vehicle, and a radio frequency tag installed on the vehicle.
In some embodiments, the environment perception result includes the positions of the multiple objects, and determining the out-of-vehicle perception result of the vehicle includes: determining the position of the vehicle; identifying, from the multiple objects, an object matching the vehicle by matching the position of the vehicle with the positions of the multiple objects; and excluding the perception result corresponding to the object matching the vehicle from the environment perception result to obtain the out-of-vehicle perception result.
In some embodiments, the method 400 further includes: determining a coarse position of the vehicle in the environment; determining, based on the coarse position, the object corresponding to the vehicle among the multiple objects from the environment perception result; and determining the position information, included in the environment perception result, of the object corresponding to the vehicle as the fine position of the vehicle in the environment.
In some embodiments, controlling the driving behavior of the vehicle further includes: controlling the driving behavior of the vehicle further based on the fine position of the vehicle.
In some embodiments, the at least one sensor includes at least one of the following: a sensor arranged near the road on which the vehicle is traveling, and a sensor integrated on another vehicle in the environment.
Roadside Example Process
FIG. 5 shows a flowchart of a method 500 for assisting in controlling autonomous driving of a vehicle according to an embodiment of the present disclosure. The method 500 may be implemented by the roadside auxiliary apparatus 210 of FIG. 2. At block 510, the roadside auxiliary apparatus 210 obtains perception information related to the environment collected by at least one sensor; the at least one sensor is arranged in the environment and is independent of the vehicle. At block 520, the roadside auxiliary apparatus 210 determines an environment perception result related to the environment by processing the obtained perception information, the environment perception result indicating related information of multiple objects in the environment, the multiple objects including the vehicle. At block 530, the roadside auxiliary apparatus 210 provides the environment perception result to the vehicle-side control apparatus associated with the vehicle for assisting in controlling the driving behavior of the vehicle.
In some embodiments, the method 500 further includes: determining, based on the environment perception result, a behavior prediction of at least one of the multiple objects, the behavior prediction including at least one of the following: an expected motion trajectory of the at least one object, an expected motion speed of the at least one object, and an expected motion direction of the at least one object; and providing the determined behavior prediction to the on-board control system for further assisting in controlling the driving behavior of the vehicle.
In some embodiments, determining the behavior prediction includes: determining the behavior prediction using a prediction model specific to the region where the at least one sensor is located, the prediction model being trained based on the behavior of another object appearing at the region.
In some embodiments, the method 500 further includes: determining, based on the environment perception result, an autonomous driving recommendation for the vehicle, the autonomous driving recommendation including at least one of the following: a travel path recommendation for the vehicle, a travel direction recommendation for the vehicle, and an operation instruction recommendation for controlling the driving behavior of the vehicle; and providing the determined autonomous driving recommendation to the on-board control system for further assisting in controlling the driving behavior of the vehicle.
In some embodiments, determining the autonomous driving recommendation includes: determining the autonomous driving recommendation using a recommendation model specific to the region where the at least one sensor is located, the recommendation model being trained based on driving behavior performed by another vehicle at the region.
In some embodiments, determining the environment perception result includes: obtaining a static high-precision map associated with the environment, the static map indicating at least the positions of static objects in the environment; and determining the environment perception result based on the perception information and the static high-precision map.
In some embodiments, determining the environment perception result based on the perception information and the static high-precision map includes: updating the static high-precision map with the perception information to obtain a real-time high-precision map associated with the environment as the environment perception result.
In some embodiments, the perception information includes image perception information, and determining the environment perception result based on the perception information and the static high-precision map includes: identifying static objects and other objects in the environment from the image perception information; and determining the positions of the other objects from the positions of the static objects indicated by the static high-precision map, based on the relative positional relationship between the static objects and the other objects in the image perception information.
In some embodiments, providing the environment perception result to the vehicle-side control apparatus includes: determining the out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, the own-vehicle perception result corresponding to the vehicle; and sending the out-of-vehicle perception result to the vehicle-side control apparatus.
In some embodiments, determining the out-of-vehicle perception result of the vehicle includes: identifying, from the environment perception result, identification information related to a tag portion with which the vehicle is equipped; determining, based on the identification information, the own-vehicle perception result corresponding to the vehicle in the environment perception result; and excluding the own-vehicle perception result from the environment perception result to obtain the out-of-vehicle perception result.
In some embodiments, the tag portion with which the vehicle is equipped includes at least one of the following: a license plate of the vehicle, a QR code affixed to the outside of the vehicle, a non-visible light tag affixed to the outside of the vehicle, and a radio frequency tag installed on the vehicle.
In some embodiments, the at least one sensor includes at least one of the following: a sensor arranged near the road on which the vehicle is traveling, and a sensor integrated on another vehicle in the environment.
Example Device Implementation
FIG. 6 shows a schematic block diagram of an example device 600 that can be used to implement embodiments of the present disclosure. The device 600 may be used to implement the roadside auxiliary apparatus 210 or the vehicle-side control apparatus 220 of FIG. 2. As shown, the device 600 includes a computing unit 601, which can perform various appropriate actions and processes based on computer program instructions stored in a read-only memory (ROM) 602 or computer program instructions loaded from a storage unit 608 into a random access memory (RAM) 603. Various programs and data required for the operation of the device 600 can also be stored in the RAM 603. The computing unit 601, the ROM 602, and the RAM 603 are connected to one another through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Multiple components in the device 600 are connected to the I/O interface 605, including: an input unit 606, such as a keyboard or a mouse; an output unit 607, such as various types of displays and speakers; a storage unit 608, such as a magnetic disk or an optical disc; and a communication unit 609, such as a network card, a modem, or a wireless communication transceiver. The communication unit 609 allows the device 600 to exchange information/data with other devices over a computer network such as the Internet and/or various telecommunication networks.
The computing unit 601 may be any of various general-purpose and/or special-purpose processing components with processing and computing capability. Some examples of the computing unit 601 include, but are not limited to, central processing units (CPU), graphics processing units (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, digital signal processors (DSP), and any appropriate processors, controllers, microcontrollers, and so on. The computing unit 601 can perform the various methods and processes described above, such as the process 400 or the process 500. For example, in some embodiments, the process 400 or the process 500 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the process 400 or the process 500 described above can be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the process 400 or the process 500 in any other appropriate manner (for example, by means of firmware).
The functions described herein above may be performed at least in part by one or more hardware logic components. For example, and without limitation, exemplary types of hardware logic components that can be used include Field Programmable Gate Arrays (FPGA), Application-Specific Integrated Circuits (ASIC), Application-Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and so on.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on a machine, partly on a machine and partly on a remote machine as a stand-alone software package, or entirely on a remote machine or server.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Furthermore, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, although several specific implementation details are contained in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological logical acts, it should be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are merely example forms of implementing the claims.

Claims (22)

  1. A method for controlling autonomous driving of a vehicle, comprising:
    obtaining an environment perception result related to an environment surrounding the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and the environment perception result indicating related information of a plurality of objects in the environment;
    determining an out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, an own-vehicle perception result corresponding to the vehicle; and
    controlling a driving behavior of the vehicle based at least on the out-of-vehicle perception result.
  2. The method according to claim 1, wherein controlling the driving behavior of the vehicle further comprises:
    obtaining a behavior prediction of at least one of the plurality of objects, the behavior prediction comprising at least one of the following: an expected motion trajectory of the at least one object, an expected motion speed of the at least one object, and an expected motion direction of the at least one object; and
    controlling the driving behavior of the vehicle further based on the behavior prediction of the at least one object.
  3. The method according to claim 1, wherein controlling the driving behavior of the vehicle further comprises:
    obtaining an autonomous driving recommendation for the vehicle, the autonomous driving recommendation comprising at least one of the following: a travel path recommendation for the vehicle, a travel direction recommendation for the vehicle, and an operation instruction recommendation for controlling the driving behavior of the vehicle; and
    controlling the driving behavior of the vehicle further based on the autonomous driving recommendation for the vehicle.
  4. The method according to claim 1, wherein determining the out-of-vehicle perception result of the vehicle comprises:
    identifying, from the environment perception result, identification information related to a tag portion with which the vehicle is equipped;
    determining, based on the identification information, the own-vehicle perception result corresponding to the vehicle in the environment perception result; and
    excluding the own-vehicle perception result from the environment perception result to obtain the out-of-vehicle perception result.
  5. The method according to claim 4, wherein the tag portion with which the vehicle is equipped comprises at least one of the following: a license plate of the vehicle, a QR code affixed to the outside of the vehicle, a non-visible light tag affixed to the outside of the vehicle, and a radio frequency tag installed on the vehicle.
  6. The method according to claim 1, wherein the environment perception result comprises positions of the plurality of objects, and determining the out-of-vehicle perception result of the vehicle comprises:
    determining a position of the vehicle;
    identifying, from the plurality of objects, an object matching the vehicle by matching the position of the vehicle with the positions of the plurality of objects; and
    excluding, from the environment perception result, a perception result corresponding to the object matching the vehicle to obtain the out-of-vehicle perception result.
  7. The method according to claim 1, further comprising:
    determining a coarse position of the vehicle in the environment;
    determining, based on the coarse position, an object corresponding to the vehicle among the plurality of objects from the environment perception result; and
    determining position information, included in the environment perception result, of the object corresponding to the vehicle as a fine position of the vehicle in the environment.
  8. The method according to claim 7, wherein controlling the driving behavior of the vehicle further comprises:
    controlling the driving behavior of the vehicle further based on the fine position of the vehicle.
  9. The method according to any one of claims 1 to 8, wherein the at least one sensor comprises at least one of the following:
    a sensor arranged near a road on which the vehicle is traveling, and
    a sensor integrated on another vehicle in the environment.
  10. An apparatus for controlling autonomous driving of a vehicle, comprising:
    a communication module configured to obtain an environment perception result related to an environment surrounding the vehicle, the environment perception result being based on perception information collected by at least one sensor arranged in the environment and independent of the vehicle, and the environment perception result indicating related information of a plurality of objects in the environment;
    an information processing module configured to determine an out-of-vehicle perception result of the vehicle by excluding, from the environment perception result, an own-vehicle perception result corresponding to the vehicle; and
    a driving control module configured to control a driving behavior of the vehicle based at least on the out-of-vehicle perception result.
  11. The apparatus according to claim 10, wherein the driving control module is further configured to:
    obtain a behavior prediction of at least one of the plurality of objects, the behavior prediction comprising at least one of the following: an expected motion trajectory of the at least one object, an expected motion speed of the at least one object, and an expected motion direction of the at least one object; and
    control the driving behavior of the vehicle further based on the behavior prediction of the at least one object.
  12. The apparatus according to claim 10, wherein the driving control module is further configured to:
    obtain an autonomous driving recommendation for the vehicle, the autonomous driving recommendation comprising at least one of the following: a travel path recommendation for the vehicle, a travel direction recommendation for the vehicle, and an operation instruction recommendation for controlling the driving behavior of the vehicle; and
    control the driving behavior of the vehicle further based on the autonomous driving recommendation for the vehicle.
  13. The apparatus according to claim 10, wherein the information processing module is configured to:
    identify, from the environment perception result, identification information related to a tag portion with which the vehicle is equipped;
    determine, based on the identification information, the own-vehicle perception result corresponding to the vehicle in the environment perception result; and
    exclude the own-vehicle perception result from the environment perception result to obtain the out-of-vehicle perception result.
  14. The apparatus according to claim 13, wherein the tag portion with which the vehicle is equipped comprises at least one of the following: a license plate of the vehicle, a QR code affixed to the outside of the vehicle, a non-visible light tag affixed to the outside of the vehicle, and a radio frequency tag installed on the vehicle.
  15. The apparatus according to claim 10, wherein the environment perception result comprises positions of the plurality of objects, and the information processing module is configured to:
    determine a position of the vehicle;
    identify, from the plurality of objects, an object matching the vehicle by matching the position of the vehicle with the positions of the plurality of objects; and
    exclude, from the environment perception result, a perception result corresponding to the object matching the vehicle to obtain the out-of-vehicle perception result.
  16. The apparatus according to claim 10, further comprising a vehicle positioning module configured to:
    determine a coarse position of the vehicle in the environment;
    determine, based on the coarse position, an object corresponding to the vehicle among the plurality of objects from the environment perception result; and
    determine position information, included in the environment perception result, of the object corresponding to the vehicle as a fine position of the vehicle in the environment.
  17. The apparatus according to claim 16, wherein the driving control module is further configured to:
    control the driving behavior of the vehicle further based on the fine position of the vehicle.
  18. The apparatus according to claim 10, wherein the at least one sensor comprises at least one of the following:
    a sensor arranged near a road on which the vehicle is traveling, and
    a sensor integrated on another vehicle in the environment.
  19. The apparatus according to any one of claims 10 to 18, wherein the apparatus is integrated into the vehicle.
  20. A device, comprising:
    one or more processors; and
    a storage apparatus for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1 to 9.
  21. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 9.
  22. A vehicle-road cooperation system, comprising:
    a vehicle-side control apparatus, comprising the apparatus according to any one of claims 10 to 19;
    at least one sensor arranged in an environment and independent of a vehicle, configured to collect perception information related to the environment; and
    a roadside auxiliary apparatus configured to process the perception information to determine an environment perception result related to the environment.
PCT/CN2019/081607 2018-09-19 2019-04-04 Method, device, medium, and system for controlling autonomous driving of a vehicle WO2020057105A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/042,747 US20210024095A1 (en) 2018-09-19 2019-04-04 Method and device for controlling autonomous driving of vehicle, medium, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811120306.5 2018-09-19
CN201811120306.5A CN110928286B (zh) 2018-09-19 Method, device, medium, and system for controlling autonomous driving of a vehicle

Publications (1)

Publication Number Publication Date
WO2020057105A1 true WO2020057105A1 (zh) 2020-03-26

Family

ID=69856370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/081607 WO2020057105A1 (zh) 2018-09-19 2019-04-04 Method, device, medium, and system for controlling autonomous driving of a vehicle

Country Status (3)

Country Link
US (1) US20210024095A1 (zh)
CN (1) CN110928286B (zh)
WO (1) WO2020057105A1 (zh)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11537134B1 (en) * 2017-05-25 2022-12-27 Apple Inc. Generating environmental input encoding for training neural networks
US11574538B2 (en) * 2019-08-16 2023-02-07 GM Global Technology Operations LLC Method and apparatus for perception-sharing between vehicles
DE102019213612A1 * 2019-09-06 2021-03-11 Robert Bosch Gmbh Method and device for operating an automated vehicle
CN111879305B (zh) * 2020-06-16 2022-03-18 华中科技大学 Multimodal perception and positioning model and system for high-risk production environments
CN111896010A (zh) * 2020-07-30 2020-11-06 北京百度网讯科技有限公司 Vehicle positioning method and apparatus, vehicle, and storage medium
JP2022104397A (ja) * 2020-12-28 2022-07-08 株式会社Subaru Vehicle driving control system and vehicle control device
CN112926476B (zh) * 2021-03-08 2024-06-18 京东鲲鹏(江苏)科技有限公司 Vehicle identification method and apparatus, and storage medium
CN113781819A (zh) * 2021-06-01 2021-12-10 深圳致成科技有限公司 Vehicle-road cooperative vehicle positioning system and method for simultaneous positioning of multiple vehicles
CN114326469B (zh) * 2021-11-26 2023-12-08 江苏徐工工程机械研究院有限公司 Intelligent auxiliary operation safety control method and system for an unmanned mine
CN114248806A (zh) * 2022-01-13 2022-03-29 云控智行科技有限公司 Driving control method and apparatus for an unmanned vehicle, and electronic device
US20230322255A1 (en) * 2022-04-11 2023-10-12 Ford Global Technologies, Llc Multiple source mec assistance strategy for autonomous vehicles

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8061648B2 (en) * 2008-02-26 2011-11-22 Lachenmeier Timothy T System for tactical balloon launch and payload return
CN107807633A (zh) * 2017-09-27 2018-03-16 北京图森未来科技有限公司 Roadside device, on-board device, and autonomous driving perception method and system
CN108010360A (zh) * 2017-12-27 2018-05-08 中电海康集团有限公司 Autonomous driving environment perception system based on vehicle-road cooperation
CN108417087A (zh) * 2018-02-27 2018-08-17 浙江吉利汽车研究院有限公司 Vehicle safe passage system and method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10029682B2 (en) * 2016-01-22 2018-07-24 Toyota Motor Engineering & Manufacturing North America, Inc. Surrounding vehicle classification and path prediction
CN105844964A (zh) * 2016-05-05 2016-08-10 深圳市元征科技股份有限公司 Vehicle safe-driving early-warning method and device
US10268200B2 (en) * 2016-12-21 2019-04-23 Baidu Usa Llc Method and system to predict one or more trajectories of a vehicle based on context surrounding the vehicle
CN106926779B (zh) * 2017-03-09 2019-10-29 吉利汽车研究院(宁波)有限公司 Vehicle lane-change assistance system
CN107272683A (zh) * 2017-06-19 2017-10-20 中国科学院自动化研究所 Parallel intelligent vehicle control system based on the ACP method
WO2019006743A1 (zh) * 2017-07-07 2019-01-10 驭势科技(北京)有限公司 Method and device for controlling vehicle travel
CN107886043B (zh) * 2017-07-20 2022-04-01 吉林大学 Vision-based forward vehicle and pedestrian anti-collision early-warning system and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190049993A1 (en) * 2018-09-26 2019-02-14 Intel Corporation Computer-assisted or autonomous driving assisted by roadway navigation broadcast
US11009890B2 (en) * 2018-09-26 2021-05-18 Intel Corporation Computer-assisted or autonomous driving assisted by roadway navigation broadcast

Also Published As

Publication number Publication date
CN110928286B (zh) 2023-12-26
US20210024095A1 (en) 2021-01-28
CN110928286A (zh) 2020-03-27

Similar Documents

Publication Publication Date Title
CN110928284B (zh) Method, device, medium, and system for assisting in controlling autonomous driving of a vehicle
WO2020057105A1 (zh) Method, device, medium, and system for controlling autonomous driving of a vehicle
CN110103953B (zh) Method, device, medium, and system for assisting driving control of a vehicle
CN108732589B (zh) Automatic collection of training data for object recognition using 3D lidar and localization
US10691131B2 (en) Dynamic routing for autonomous vehicles
US11056005B2 (en) Traffic light detection and lane state recognition for autonomous vehicles
US9575490B2 (en) Mapping active and inactive construction zones for autonomous driving
US20190243364A1 (en) Autonomous vehicle integrated user alert and environmental labeling
WO2021217420A1 (zh) Lane line tracking method and apparatus
US9196164B1 (en) Pedestrian notifications
US11110932B2 (en) Methods and systems for predicting object action
JP6910452B2 (ja) Method for localizing a more highly automated vehicle, for example a highly automated vehicle (HAF), with a digital localization map
CN113264039A (zh) Vehicle driving method and apparatus based on roadside perception device, and vehicle-road cooperation system
US11496707B1 (en) Fleet dashcam system for event-based scenario generation
CN110333725B (zh) Method, system, device, and storage medium for pedestrian avoidance in autonomous driving
CN118176406A (zh) Optimized route planning application for providing services to autonomous vehicles
US11841704B2 (en) Behavior prediction for railway agents for autonomous driving system
WO2021010083A1 (ja) Information processing device, information processing method, and information processing program
US20230027357A1 (en) Vehicle control system, vehicle control method, and storage medium
WO2021261167A1 (ja) Information processing system, information processing device, and information processing method
CN110648547A (zh) Transportation infrastructure communication and control
JP2020101960A (ja) Information processing device, information processing method, and program
US11932242B1 (en) Fleet dashcam system for autonomous vehicle operation
Li ROS-Based Sensor Fusion and Motion Planning for Autonomous Vehicles: Application to Automated Parking System
CN214504216U (zh) Autonomous delivery vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19861999

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/07/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19861999

Country of ref document: EP

Kind code of ref document: A1