CN111696373B - Motorcade cooperative sensing method, motorcade cooperative control method and motorcade cooperative control system - Google Patents
- Publication number
- CN111696373B (application CN201910198291.2A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- fleet
- motorcade
- information
- obstacle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 57
- 230000004927 fusion Effects 0.000 claims abstract description 76
- 238000004891 communication Methods 0.000 claims abstract description 17
- 230000008447 perception Effects 0.000 claims abstract description 13
- 238000004422 calculation algorithm Methods 0.000 claims description 36
- 238000001914 filtration Methods 0.000 claims description 14
- 238000005516 engineering process Methods 0.000 claims description 13
- 238000004364 calculation method Methods 0.000 claims description 8
- 238000012163 sequencing technique Methods 0.000 claims description 8
- 230000001133 acceleration Effects 0.000 claims description 7
- 230000008569 process Effects 0.000 claims description 7
- 230000002452 interceptive effect Effects 0.000 claims description 5
- 239000002245 particle Substances 0.000 claims description 5
- 230000000977 initiatory effect Effects 0.000 claims 1
- 238000010295 mobile communication Methods 0.000 claims 1
- 230000006855 networking Effects 0.000 claims 1
- 230000008859 change Effects 0.000 description 13
- 230000006870 function Effects 0.000 description 6
- 230000006399 behavior Effects 0.000 description 5
- 230000015572 biosynthetic process Effects 0.000 description 3
- 238000010276 construction Methods 0.000 description 3
- 238000013459 approach Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 2
- 238000004590 computer program Methods 0.000 description 2
- 238000013135 deep learning Methods 0.000 description 2
- 230000007547 defect Effects 0.000 description 2
- 238000013461 design Methods 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000007781 pre-processing Methods 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 206010039203 Road traffic accident Diseases 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000001276 controlling effect Effects 0.000 description 1
- 230000004069 differentiation Effects 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 239000000446 fuel Substances 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000001105 regulatory effect Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/22—Platooning, i.e. convoy of communicating vehicles
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Traffic Control Systems (AREA)
Abstract
Embodiments of the present application provide a fleet cooperative sensing method, a fleet cooperative control method, and a fleet cooperative control system. The fleet cooperative sensing method comprises the following steps: a single vehicle fusion processor fuses the information collected by each sensor on its corresponding vehicle to obtain obstacle information around that vehicle; a communication device sends the obstacle information around the corresponding vehicle to a multi-vehicle fusion processor; and the multi-vehicle fusion processor fuses the obstacle information around each vehicle in the fleet to obtain the obstacle information around the fleet. By fusing the single-vehicle sensing information of all vehicles in the fleet a second time, the method obtains obstacle information for the fleet as a whole, which facilitates overall cooperative control decisions, improves the precision of fleet cooperative control, overcomes the sensing blind zones and short sensing range of single-vehicle sensing, lowers the configuration requirements on each vehicle in the fleet, and improves the flexibility of fleet arrangement.
Description
Technical Field
The embodiments of the present application relate to the technical field of automatic driving, and in particular to a fleet cooperative sensing method, a fleet cooperative control method, and a fleet cooperative control system.
Background
This section is intended to provide a background or context to the embodiments of the application that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
Fleet cooperative driving refers to a formation in which a plurality of vehicles trail one another at extremely small inter-vehicle distances, supported by automatic driving technology and low-latency communication technology. Within the fleet, the spacing is far below the safe driving distance in the usual sense: only 20 meters or even less. At such a small spacing, the airflow split by the pilot vehicle is picked up directly by the vehicle immediately behind it before a low-pressure vortex region can form, effectively reducing the total air resistance of the whole fleet while driving. Taking the Mercedes-Benz autonomous truck platoon as an example, the manufacturer indicates that the reduced driving resistance in the queued state can save approximately 10% of fuel consumption.
The fleet cooperative driving technology requires real-time information interaction between vehicles in a fleet, so that all vehicles in the fleet can maintain consistency of behaviors, such as accelerating and decelerating simultaneously to maintain a following distance, changing lanes simultaneously to maintain formation of the fleet, and the like.
Disclosure of Invention
In existing fleet cooperative driving solutions, fleet perception is mostly single-vehicle perception: each vehicle makes decisions using the sensing results of its own on-board sensors, and the positional relationship between the vehicle and the other vehicles is maintained by single-vehicle intelligence alone. Under this solution, to ensure cooperative operation, each vehicle in the fleet must carry a complete L4-level automatic driving system, resulting in high configuration cost and poor flexibility in fleet arrangement.
In addition, because the queue requires all of its vehicles to execute decisions synchronously, such as accelerating, braking, or merging at the same time, the decision-maker must know the environment around the entire fleet. However, owing to the limitations of sensor technology, single-vehicle sensing often suffers from blind zones, short sensing range, and similar problems, which hinders decisions about the overall driving behavior of the fleet.
In view of the above, the present application provides a fleet cooperative sensing method, a fleet cooperative control method, and a fleet cooperative control system that overcome, or at least partially address, the above problems.
In a first aspect of embodiments of the present application, a fleet cooperative sensing method is provided, including:
the single vehicle fusion processor fuses information collected by each sensor on the corresponding vehicle to obtain obstacle information around the corresponding vehicle; wherein the single vehicle fusion processor corresponds to a vehicle in the fleet;
the communication device sends the obstacle information around the corresponding vehicle to the multi-vehicle fusion processor;
and the multi-vehicle fusion processor fuses the obstacle information around each vehicle in the fleet to obtain the obstacle information around the fleet.
In a second aspect of embodiments of the present application, a fleet cooperative sensing system is provided, comprising: a single vehicle fusion processor, a communication device, and a multi-vehicle fusion processor, configured to execute the above fleet cooperative sensing method.
In a third aspect of embodiments of the present application, there is provided a fleet cooperative control method, including:
obtaining obstacle information around the fleet by using the above fleet cooperative sensing method; and
the fleet decision system formulates a travel path and/or driving strategy for each vehicle in the fleet according to the obstacle information around the fleet, so that each vehicle travels according to the travel path and/or driving strategy.
In a fourth aspect of embodiments of the present application, there is provided a fleet cooperative control system, comprising:
the fleet cooperative sensing system as set forth above; and
a fleet decision system configured to formulate a travel path and/or driving strategy for each vehicle in the fleet according to the obstacle information around the fleet obtained by the fleet cooperative sensing system, so that each vehicle travels according to the travel path and/or driving strategy.
By means of the above technical solution, the fleet cooperative sensing method provided by the embodiments of the present application first fuses the information collected by each sensor on a single vehicle in the fleet to obtain the obstacle information around that vehicle, and then fuses the obstacle information around each vehicle to obtain the obstacle information around the whole fleet.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present application is further described in detail by the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
fig. 1 schematically illustrates an application scenario of an embodiment of the present application;
fig. 2 schematically illustrates the flow of a fleet cooperative sensing method provided in an embodiment of the present application;
FIG. 3 schematically illustrates a fleet cooperative sensing system provided by an embodiment of the present application;
FIG. 4 schematically illustrates a fleet cooperative sensing system according to an embodiment of the present application;
FIG. 5 schematically illustrates a fleet cooperative sensing system according to another embodiment of the present application;
FIG. 6 schematically illustrates a fleet cooperative sensing system according to yet another embodiment of the present application;
FIG. 7 schematically shows a flow of a fleet cooperative control method provided by an embodiment of the present application;
fig. 8 schematically illustrates a fleet cooperative control system provided by an embodiment of the present application;
FIG. 9 schematically illustrates another application scenario of an embodiment of the present application;
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
The principles and spirit of the present application will be described with reference to a number of exemplary embodiments. It is understood that these embodiments are given only to enable those skilled in the art to better understand and to implement the present application, and do not limit the scope of the present application in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As will be appreciated by one skilled in the art, embodiments of the present application may be embodied as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
For convenience of understanding, technical terms related to the present application are explained as follows:
the 'fleet cooperative driving' refers to formation state of a plurality of vehicles in a form of extremely small vehicle distance trailing based on support of automatic driving technology and low-delay communication technology.
As referred to herein, a "fleet" is a plurality of vehicles traveling in the fleet cooperative driving mode.
A "pilot vehicle" is the vehicle traveling at the front of a fleet operating in the fleet cooperative driving mode, playing a piloting role.
A "vehicle" herein is an automatic driving automobile realized based on automatic driving technology.
An automatic driving automobile is an intelligent automobile that senses the road environment through an on-board sensing system, automatically plans a driving route, and controls the vehicle to reach a predetermined target. It uses on-board sensors to perceive the vehicle's surroundings and controls the vehicle's steering and speed according to the road, vehicle position, and obstacle information obtained by sensing, so that the vehicle can travel safely and reliably on the road. Specifically, the vehicle may be one that provides a passenger function (e.g., family car or bus), a cargo function (e.g., ordinary truck, van, box truck, tank truck, flatbed truck, container truck, dump truck, or special-structure truck), or a special rescue function (e.g., fire truck or ambulance), realized using automatic driving technology.
The term "and/or" in this application merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
Moreover, any number of elements in the drawings are by way of example and not by way of limitation, and any nomenclature is used solely for differentiation and not by way of limitation.
The principles and spirit of the present application are explained in detail below with reference to several representative embodiments of the present application.
Fig. 1 shows one application scenario of an embodiment of the present application. As shown in fig. 1, each vehicle in the fleet trails the vehicle ahead at a small distance based on automatic driving technology. While driving, the vehicles exchange information with one another so that all vehicles in the fleet keep their behaviors consistent, such as accelerating, decelerating, changing lanes, braking, or merging simultaneously.
According to existing fleet cooperative driving solutions, to ensure behavioral consistency among the vehicles in a fleet, each vehicle must carry a complete L4-level automatic driving system, which results in high configuration cost and poor flexibility in fleet arrangement. In addition, most existing solutions rely on single-vehicle intelligence to maintain the positional relationship between a vehicle and the other vehicles; limited by sensor technology, single-vehicle sensing often suffers from blind zones, short sensing range, and similar problems, which hinders decisions about the overall driving behavior of the fleet.
It should be noted that the above application scenarios are only presented to facilitate understanding of the spirit and principles of the present application, and the embodiments of the present application are not limited in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
To overcome the defects of existing fleet cooperative driving solutions, an embodiment of the present application provides a fleet cooperative sensing system. As shown in fig. 3, the system includes a single vehicle fusion processor 31, a communication device 32, and a multi-vehicle fusion processor 33, and is used to execute a fleet cooperative sensing method which, as shown in fig. 2, includes the following steps:
step S100, the single vehicle fusion processor 31 fuses information collected by each sensor on the corresponding vehicle to obtain obstacle information around the corresponding vehicle; wherein the single vehicle fusion processor 31 corresponds to a vehicle in the platoon.
In step S200, the communication device 32 transmits the obstacle information around the corresponding vehicle to the multi-vehicle fusion processor 33.
In step S300, the multi-vehicle fusion processor 33 fuses the obstacle information around each vehicle in the fleet to obtain the obstacle information around the fleet.
Sensors used in the automotive field to sense the environment surrounding a vehicle generally include, but are not limited to: cameras, laser radars, millimeter wave radars, ultrasonic radars, inertial navigation devices, satellite positioning devices, and the like.
Obstacle information generally includes, but is not limited to: position, velocity, size, category, acceleration, number of frames tracked.
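To make the data involved concrete, here is a minimal Python sketch of one such obstacle record; the field names, types, and units are illustrative assumptions for this document, not something the patent specifies.

```python
from dataclasses import dataclass, field

@dataclass
class Obstacle:
    """One tracked obstacle carrying the information items listed above.
    All field names and units are illustrative assumptions."""
    position: tuple        # (x, y) in the reporting frame, meters
    velocity: tuple        # (vx, vy), m/s
    size: tuple            # (length, width), meters
    category: str          # e.g. "car", "truck", "pedestrian"
    acceleration: tuple    # (ax, ay), m/s^2
    frames_tracked: int    # consecutive frames this track has existed
    confidence: dict = field(default_factory=dict)  # per-item confidence, used in fusion
```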
In step S100, the single vehicle fusion processor 31 fuses the information collected by each sensor on each vehicle in the fleet to obtain the obstacle information around the single vehicle. In some embodiments, step S100 may be implemented as the following steps S101 to S103:
Step S101, the single vehicle fusion processor 31 acquires the information collected by each sensor on the corresponding vehicle, and processes that information to obtain the obstacle information sensed by each sensor.
This step pre-processes the information collected by each sensor. If the sensor is a camera, the collected information is an image: the image output by the camera can be recognized with a deep-learning-trained model to obtain the obstacle's position, size, category, number of tracked frames, and similar items, and a target tracking algorithm (such as a Kalman filter) can then be applied to obtain the obstacle's speed, acceleration, and similar items. If the sensor is a lidar, the collected information is point cloud data and reflection intensity: these can be cluster-analyzed with a deep-learning-trained classifier or based on geometric feature relationships to obtain the obstacle's position, size, category, number of tracked frames, and similar items, after which a target tracking algorithm (such as a Kalman filter) again yields the obstacle's speed, acceleration, and similar items. The obstacle information produced by this preprocessing can be passed to the subsequent steps in the form of an obstacle list or an occupancy grid map.
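To make the tracking step concrete, the following is a minimal sketch of a constant-velocity Kalman filter, the kind of target tracking algorithm named above, which estimates an obstacle's velocity from position-only detections; the 2-D state layout and the noise parameters are illustrative assumptions.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal constant-velocity Kalman filter estimating (x, y, vx, vy)
    from position-only detections; all parameters are illustrative."""

    def __init__(self, x0, y0, dt=0.1):
        self.x = np.array([x0, y0, 0.0, 0.0])           # state: position + velocity
        self.P = np.eye(4) * 10.0                       # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)  # constant-velocity motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)  # only position is measured
        self.Q = np.eye(4) * 0.01                       # process noise
        self.R = np.eye(2) * 0.5                        # measurement noise

    def step(self, z):
        # Predict forward one frame.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the detector's position measurement z = (x, y).
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x  # x[2:4] is the velocity estimate for the obstacle
```

Feeding each frame's detected position into step() yields smoothed position and velocity estimates, which is how speed (and, by differencing, acceleration) can be derived from per-frame detections.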
In step S103, the single vehicle fusion processor 31 fuses the obstacle information sensed by the sensors to obtain the obstacle information around the corresponding vehicle.
This step merges the obstacle information sensed by the sensors on the single vehicle. In some embodiments, it may be implemented as the following steps S105 to S109:
Step S105, the single vehicle fusion processor 31 counts all first-type obstacles sensed by the sensors according to the obstacle information sensed by each sensor, and determines the confidence of each item of information of each obstacle among the first-type obstacles.
In some embodiments, the confidence of each item of information of an obstacle among the first-type obstacles may be determined according to the type of sensor that sensed the item; for example, a specific confidence may be preset for each type of sensor, and the confidence of an item of information may directly adopt the confidence of the sensor that sensed it.
In other embodiments, the confidence may instead be determined according to the accuracy of the sensing sensor over different ranges. For example, a camera is more accurate at a sensing distance of 0-10 m than at 10 m or more; accordingly, an item sensed for an obstacle located 0-10 m from the vehicle receives a higher confidence than one sensed for an obstacle located 10 m or more away.
In still other embodiments, the confidence of each item of information may be taken from the random-process covariance matrix computed by the target tracking algorithm used to preprocess the information collected by the sensor.
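As an illustration of the first two options, a small sketch follows; the sensor names, base confidences, and the 10 m breakpoint are assumptions chosen for the example, not values from the patent.

```python
# Illustrative per-sensor base confidences; the names and numbers are assumptions.
BASE_CONFIDENCE = {"camera": 0.8, "lidar": 0.9, "millimeter_wave_radar": 0.7}

def item_confidence(sensor_type: str, distance_m: float) -> float:
    """Confidence of one obstacle-information item, combining a preset
    per-sensor confidence with a distance-dependent accuracy factor."""
    base = BASE_CONFIDENCE.get(sensor_type, 0.5)       # unknown sensors get a default
    range_factor = 1.0 if distance_m <= 10.0 else 0.6  # nearer obstacles trusted more
    return base * range_factor
```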
In step S107, the single vehicle fusion processor 31 performs a weighted average calculation on each item of information of each obstacle among the first-type obstacles using a target tracking algorithm (for example, a Kalman filter algorithm, an interactive multi-model Kalman filter algorithm, a Bayesian filter algorithm, or a particle filter algorithm) to obtain the final information of each obstacle among the first-type obstacles, where the weights used in the weighted average calculation are the confidences of the corresponding items of information.
The target tracking algorithm adopted in this step may be a Kalman filter algorithm, an interactive multi-model Kalman filter algorithm, a Bayesian filter algorithm, or a particle filter algorithm; the specific choice is not limited and may be made according to the actual situation.
In step S109, the single vehicle fusion processor 31 determines the final information of each obstacle of all the first-type obstacles as the obstacle information around the corresponding vehicle.
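A minimal sketch of the confidence-weighted averaging of step S107 is given below; a full implementation would embed the weighting inside one of the filters named above, so the plain weighted mean here is a deliberate simplification.

```python
def fuse_items(values, confidences):
    """Confidence-weighted average of the same information item
    (e.g. position x) reported by several sensors for one obstacle."""
    total = sum(confidences)
    if total == 0:
        raise ValueError("at least one item must have nonzero confidence")
    return sum(v * c for v, c in zip(values, confidences)) / total

# Camera reports the obstacle at x = 12.1 m (confidence 0.8), lidar at
# x = 12.4 m (confidence 0.9): the fused value leans toward the lidar.
fused_x = fuse_items([12.1, 12.4], [0.8, 0.9])  # ~12.26
```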
Considering that vehicles move quickly, communication between the single vehicle fusion processor 31 and the multi-vehicle fusion processor 33 must meet low-latency and high-reliability requirements; the communication device 32 in step S200 may therefore communicate using wireless communication technologies such as V2X, 5G, and 4G.
In step S300, the multi-vehicle fusion processor 33 fuses the obstacle information around each single vehicle. In some embodiments, step S300 may be implemented as the following steps S301 to S305:
in step S301, the multi-vehicle fusion processor 33 counts all the second obstacles around the corresponding vehicle according to the obstacle information around the corresponding vehicle, and determines the confidence of each item of information of each obstacle in the second obstacles.
In some embodiments, the confidence of each item of information of an obstacle among the second-type obstacles may be taken from the random-process covariance matrix computed when the target tracking algorithm calculated the final information of each of the first-type obstacles.
Step S303, the multi-vehicle fusion processor 33 performs a weighted average calculation on each item of information of each obstacle among the second-type obstacles using a target tracking algorithm (for example, a Kalman filter algorithm, an interactive multi-model Kalman filter algorithm, a Bayesian filter algorithm, or a particle filter algorithm) to obtain the final information of each obstacle among the second-type obstacles, where the weights used in the weighted average calculation are the confidences of the corresponding items of information.
The target tracking algorithm adopted in this step may be a Kalman filter algorithm, an interactive multi-model Kalman filter algorithm, a Bayesian filter algorithm, or a particle filter algorithm; the specific choice is not limited and may be made according to the actual situation.
In step S305, the multi-vehicle fusion processor 33 determines the final information of each obstacle in all the second-type obstacles as the information of the obstacles around the vehicle fleet.
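One practical detail implied by this second fusion stage is that obstacles reported by different vehicles must be expressed in a common coordinate frame before they can be matched and averaged. The patent does not spell this out, so the following transform into a shared fleet frame is an assumption sketched for illustration.

```python
import math

def to_fleet_frame(obstacle_xy, vehicle_xy, vehicle_heading_rad):
    """Rotate and translate an obstacle position from a reporting
    vehicle's local frame into the shared fleet frame. The convention
    (x forward, y left, heading measured in the fleet frame) is assumed."""
    ox, oy = obstacle_xy
    vx, vy = vehicle_xy
    c, s = math.cos(vehicle_heading_rad), math.sin(vehicle_heading_rad)
    return (vx + c * ox - s * oy,
            vy + s * ox + c * oy)
```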
According to the fleet cooperative sensing method provided by the embodiments of the present application, the information collected by each sensor on a single vehicle in the fleet is fused to obtain the obstacle information around that vehicle, and the obstacle information around each vehicle is then fused to obtain the obstacle information around the whole fleet. By fusing the single-vehicle sensing information of all vehicles in the fleet a second time, the method obtains obstacle information for the fleet as a whole, which facilitates overall cooperative control decisions, improves the precision of fleet cooperative control, overcomes the sensing blind zones and short sensing range of single-vehicle sensing, lowers the configuration requirements on each vehicle in the fleet, and improves the flexibility of fleet arrangement.
In some embodiments, the fleet cooperative sensing method provided in the embodiments of the present application may further include:
and step S400, the public information acquisition unit acquires road condition information and/or weather information of a road on which the motorcade runs.
Road condition information of the traveled road includes, but is not limited to: traffic accident information (e.g., accident location, severity, rescue progress, whether passage is prohibited), construction information (e.g., construction sections, construction period, whether passage is prohibited), road congestion conditions, traffic control information (e.g., restricted sections and restricted times), and speed limit information.
Weather information for the road being traveled includes, but is not limited to: rainfall conditions (e.g., rainfall range, rainfall amount, duration), snowfall conditions (e.g., snowfall range, snowfall amount, duration), wind level, fog/haze conditions (e.g., impact range, duration, pollution level), visibility, and the like.
The "road on which the fleet travels" referred to in this application includes both the road the fleet is currently driving on and roads it is about to drive on (e.g., on a navigation or planned path).
According to the fleet cooperative sensing method, the fleet cooperative control decision made on the basis of the obstacle information around the fleet can be adjusted using the road condition and/or weather information of the road on which the fleet travels, further improving the precision of fleet cooperative control and remedying the shortcomings of single-vehicle intelligent sensing.
In some embodiments, the single vehicle fusion processor 31 is in a one-to-one correspondence with each vehicle in the fleet. In such embodiments, the single vehicle fusion processor 31 may be deployed on the corresponding vehicle as shown in fig. 4, or on the cloud server as shown in fig. 5.
In other embodiments, the single vehicle fusion processor 31 corresponds to at least one vehicle in the fleet of vehicles. In such embodiments, the single vehicle fusion processor 31 may be distributed on each respective vehicle, or may be deployed on the cloud server as shown in fig. 6.
In some embodiments, as shown in fig. 4-6, the multi-vehicle fusion processor 33 is deployed on a cloud server.
In other embodiments, a multi-vehicle fusion processor is deployed on each vehicle in the fleet, and the fleet cooperative sensing method provided by the embodiments of the present application further includes: selecting one of the multi-vehicle fusion processors 33 deployed on the vehicles in the fleet and starting it, so that the non-selected multi-vehicle fusion processors 33 stand by.
By deploying a multi-vehicle fusion processor 33 on each vehicle in the fleet, such embodiments allow the multi-vehicle fusion processor 33 on another vehicle to take over the corresponding task when a vehicle needs to leave the fleet or when the multi-vehicle fusion processor 33 on a vehicle fails. In specific implementations, the multi-vehicle fusion processors 33 that are not started remain in a standby state.
In specific implementations, selecting and starting one of the multi-vehicle fusion processors 33 deployed on the vehicles in the fleet may generally take one of the following modes:
in the mode 1, the current pilot vehicle of a fleet is determined, and a multi-vehicle fusion processor 33 deployed on the pilot vehicle is started to work; (ii) a This is typically used when changing the pilot.
Mode 2, determining the vehicle with the highest current configuration in the fleet, and starting the multi-vehicle fusion processor 33 deployed on that vehicle; this mode is typically employed when a vehicle newly joins the fleet or a vehicle leaves the fleet.
Mode 3, ordering all vehicles in the fleet in advance, and judging whether the foremost vehicle in the ordering is still in the fleet and working normally; if it is, starting the multi-vehicle fusion processor 33 deployed on that vehicle; if it is not in the fleet or is working abnormally, deleting its number from the ordering and repeating the judgment on the new foremost vehicle. This mode is typically employed when vehicles drive out of the fleet; a sketch of this fallback selection follows below.
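The following is a minimal Python sketch of the mode-3 fallback; the fleet registry keyed by vehicle ID, the in_fleet/healthy flags, and the start() handle on each processor are illustrative assumptions.

```python
def select_fusion_host(ordering, fleet):
    """Walk a pre-agreed ordering of vehicle IDs and start the
    multi-vehicle fusion processor on the first vehicle that is still
    in the fleet and working normally, pruning IDs that fail the check.
    `fleet` maps vehicle ID -> a record with in_fleet/healthy flags."""
    while ordering:
        candidate = ordering[0]
        v = fleet.get(candidate)
        if v is not None and v.in_fleet and v.healthy:
            v.fusion_processor.start()  # non-selected processors stay on standby
            return candidate
        ordering.pop(0)                 # delete its number from the ordering
    raise RuntimeError("no healthy vehicle available to host multi-vehicle fusion")
```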
Based on the same inventive concept, an embodiment of the present application further provides a fleet cooperative control system. As shown in fig. 8, the system includes a fleet cooperative sensing system 81 and a fleet decision system 82, where the fleet cooperative sensing system 81 includes the single vehicle fusion processor 31, the communication device 32, and the multi-vehicle fusion processor 33 shown in fig. 3.
The fleet cooperative control system shown in fig. 8 is used to perform a fleet cooperative control method which, as shown in fig. 7, comprises:
Step S500, the single vehicle fusion processor 31 fuses information collected by each sensor on the corresponding vehicle to obtain obstacle information around the corresponding vehicle; wherein the single vehicle fusion processor 31 corresponds to a vehicle in the fleet.
In step S600, the communication device 32 transmits the obstacle information around the corresponding vehicle to the multi-vehicle fusion processor 33.
In step S700, the multi-vehicle fusion processor 33 fuses the obstacle information around each vehicle in the fleet to obtain the obstacle information around the fleet.
Step S800, the fleet decision system 82 formulates a driving path and/or a driving strategy of each vehicle in the fleet according to the information of the obstacles around the fleet, so that each vehicle drives according to the driving path and/or the driving strategy.
The detailed implementation of steps S500 to S700 can refer to the description of steps S100 to S300, and will not be described herein again.
Step S800 applies the obstacle information around the fleet to plan driving paths and formulate driving strategies (e.g., acceleration, deceleration, lane change, merging, braking) for fleet cooperative driving.
In the application scenario shown in fig. 9, the fleet consists of four vehicles numbered V1, V2, V3, and V4. At time T, all four are driving in lane E1, and the fleet intends to change to lane E2. According to the obstacle information around the fleet at time T, vehicles V1 and V2 may change directly from E1 to E2 using the conventional lane change strategy (e.g., the usual steering wheel angle for a lane change), whereas vehicles V3 and V4 would collide with the obstacles M and N present around the fleet if they did the same. V3 and V4 therefore cannot change lanes directly under the conventional strategy, but must continue some distance along lane E1 before changing from E1 to E2. The dashed lines in fig. 9 are the routes of vehicles V3 and V4 under the fleet's lane change strategy.
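A minimal sketch of the per-vehicle feasibility check this scenario implies follows, reusing the Obstacle record sketched earlier; the sampled-path representation and the safety margin are assumptions for illustration.

```python
import math

def can_change_lane_now(lane_change_path, obstacles, safety_margin_m=2.0):
    """A vehicle may execute the conventional lane change only if no
    fused obstacle lies within the safety margin of its lane-change path.
    `lane_change_path` is a list of sampled (x, y) points; `obstacles`
    is the fused fleet-level obstacle list."""
    for px, py in lane_change_path:
        for obs in obstacles:
            ox, oy = obs.position
            if math.hypot(px - ox, py - oy) < safety_margin_m:
                return False  # e.g. V3/V4 conflicting with obstacles M, N
    return True
```

Under this check, V1 and V2 pass and change lanes immediately, while V3 and V4 fail because of M and N and must re-plan a delayed lane change.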
In some embodiments, the fleet cooperative control method provided in the embodiments of the present application further includes:
step S900, the fleet decision system 82 adjusts the driving path and/or driving strategy of each vehicle in the fleet according to the road condition information and/or weather information of the road on which the fleet travels.
For example, in the application scenario shown in fig. 9, an initial lane change strategy and travel path are established for each vehicle according to the obstacle information around the fleet at time T. If a message is then received that the next intersection connected to lane E2 is about to come under traffic control, the initial lane change strategy and travel path need to be adjusted, for example, by continuing to travel along lane E1.
The above embodiments describe the objectives, technical solutions, and advantages of the present application in further detail. It should be understood that these embodiments are merely illustrative of the present application and are not intended to limit its scope of protection; any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present application shall be included in its scope of protection.
Those of skill in the art will further appreciate that the various illustrative logical blocks, elements, and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate the interchangeability of hardware and software, various illustrative components, elements, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design requirements of the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The various illustrative logical blocks, or units, or devices described in this application may be implemented or operated by a general purpose processor, a digital signal processor, an Application Specific Integrated Circuit (ASIC), a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
The steps of a method or algorithm described in the embodiments herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. For example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, which may be located in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination of the three. If implemented in software, the functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media that facilitate transfer of a computer program from one place to another. Storage media may be any available media that can be accessed by a general-purpose or special-purpose computer. For example, such computer-readable media can include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store program code in the form of instructions or data structures and that can be read by a general-purpose or special-purpose computer or processor. Additionally, any connection is properly termed a computer-readable medium; thus, if the software is transmitted from a website, server, or other remote source via a coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), or wirelessly (e.g., infrared, radio, or microwave), those media are included. Disk and disc, as used herein, include compact disc, laser disc, optical disc, DVD, floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above may also be included within computer-readable media.
Claims (18)
1. A fleet cooperative sensing method, characterized by comprising the following steps:
the single vehicle fusion processor fuses information collected by each sensor on the corresponding vehicle to obtain obstacle information around the corresponding vehicle; wherein the single vehicle fusion processor corresponds to a vehicle in the fleet;
the communication device sends the obstacle information around the corresponding vehicle to the multi-vehicle fusion processor;
the multi-vehicle fusion processor fuses the obstacle information around each vehicle in the fleet to obtain the obstacle information around the fleet, which comprises the following steps:
the multi-vehicle fusion processor counts all second-type obstacles around the corresponding vehicles according to the obstacle information around each corresponding vehicle, and determines the confidence of each item of information of each obstacle among the second-type obstacles;
the multi-vehicle fusion processor performs a weighted average calculation on each item of information of each obstacle among the second-type obstacles using a target tracking algorithm to obtain the final information of each obstacle among the second-type obstacles, wherein the weights used in the weighted average calculation are the confidences of the corresponding items of information; and
the multi-vehicle fusion processor determines the final information of each of the second-type obstacles as the obstacle information around the fleet.
2. The fleet cooperative sensing method according to claim 1, wherein the single vehicle fusion processor fusing the information collected by the sensors of the corresponding vehicle to obtain the obstacle information around the corresponding vehicle comprises:
the single vehicle fusion processor acquires the information collected by each sensor on the corresponding vehicle, processes the information collected by each sensor to obtain the obstacle information sensed by each sensor, and fuses the obstacle information sensed by the sensors to obtain the obstacle information around the corresponding vehicle.
3. The fleet cooperative sensing method according to claim 2, wherein the single vehicle fusion processor fusing the obstacle information sensed by the sensors to obtain the obstacle information around the corresponding vehicle comprises:
the single vehicle fusion processor counts all first-type obstacles sensed by the sensors according to the obstacle information sensed by each sensor, and determines the confidence of each item of information of each obstacle among the first-type obstacles;
the single vehicle fusion processor performs a weighted average calculation on each item of information of each obstacle among the first-type obstacles using a target tracking algorithm to obtain the final information of each obstacle among the first-type obstacles, wherein the weights used in the weighted average calculation are the confidences of the corresponding items of information; and
the single vehicle fusion processor determines the final information of each of the first-type obstacles as the obstacle information around the corresponding vehicle.
4. The fleet cooperative sensing method according to any one of claims 1 to 3, wherein the obstacle information comprises one or more of the following items for an obstacle: position, velocity, size, category, acceleration, and number of frames tracked.
5. The fleet cooperative sensing method according to any one of claims 1 to 3, wherein the sensors comprise one or more of the following:
camera, laser radar, millimeter wave radar, ultrasonic radar, inertial navigation equipment, satellite positioning equipment.
6. The fleet cooperative sensing method of claim 1, further comprising:
the public information acquisition unit acquires road condition information and/or weather information of the road on which the fleet travels.
7. The fleet cooperative sensing method of claim 3, wherein the target tracking algorithm is: a Kalman filter algorithm, an interactive multi-model Kalman filter algorithm, a Bayesian filter algorithm, or a particle filter algorithm.
8. The fleet cooperative sensing method of claim 1, wherein:
the single vehicle fusion processors correspond one-to-one with the vehicles in the fleet; or,
the single vehicle fusion processor corresponds to at least one vehicle in the fleet.
9. The fleet cooperative sensing method of claim 1, wherein:
the single vehicle fusion processor is deployed on the corresponding vehicle; or,
the single vehicle fusion processor is deployed on a cloud server.
10. The fleet cooperative sensing method according to claim 1, wherein
the multi-vehicle fusion processor is deployed on the cloud server.
11. The fleet cooperative sensing method of claim 1, wherein one of the multi-vehicle fusion processors is deployed on each vehicle in the fleet, the method further comprising:
selecting one of the multi-vehicle fusion processors deployed on the vehicles in the fleet and starting it, so that the non-selected multi-vehicle fusion processors stand by.
12. The fleet cooperative sensing method of claim 11, wherein selecting and starting one of the multi-vehicle fusion processors deployed on the vehicles in the fleet comprises:
determining a current pilot vehicle of a fleet, and starting a multi-vehicle fusion processor deployed on the pilot vehicle to work; or,
determining the vehicle with the highest current configuration in the fleet, and starting a multi-vehicle fusion processor deployed on the vehicle with the highest configuration to work; or,
ordering all vehicles in the fleet in advance, and judging whether the foremost vehicle in the ordering is still in the fleet and working normally; if so, starting the multi-vehicle fusion processor deployed on that vehicle; if the vehicle is not in the fleet or is working abnormally, deleting its number from the ordering and continuing to judge whether the new foremost vehicle in the ordering is still in the fleet and working normally.
13. The fleet cooperative sensing method according to claim 1, wherein the communication device communicates based on the vehicle networking technology V2X and/or the fifth-generation mobile communication technology 5G.
14. A fleet cooperative sensing system, characterized by comprising: a single vehicle fusion processor, a communication device, and a multi-vehicle fusion processor, configured to execute the fleet cooperative sensing method according to any one of claims 1 to 13.
15. A fleet cooperative control method, comprising:
obtaining obstacle information around a fleet by using the fleet cooperative sensing method according to any one of claims 1 to 13; and
the fleet decision system formulates a travel path and/or driving strategy for each vehicle in the fleet according to the obstacle information around the fleet, so that each vehicle travels according to the travel path and/or driving strategy.
16. The fleet cooperative control method of claim 15, further comprising:
the fleet decision system adjusts the travel path and/or driving strategy of each vehicle in the fleet according to road condition information and/or weather information of the road on which the fleet travels.
17. A fleet cooperative control system, comprising:
the fleet cooperative sensing system of claim 14; and
a fleet decision system configured to formulate a travel path and/or driving strategy for each vehicle in the fleet according to the obstacle information around the fleet obtained by the fleet cooperative sensing system, so that each vehicle travels according to the travel path and/or driving strategy.
18. The fleet cooperative control system of claim 17, wherein the fleet decision system is further configured to adjust the travel path and/or driving strategy of each vehicle in the fleet according to road condition information and/or weather information of the road on which the fleet travels.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910198291.2A CN111696373B (en) | 2019-03-15 | 2019-03-15 | Motorcade cooperative sensing method, motorcade cooperative control method and motorcade cooperative control system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111696373A CN111696373A (en) | 2020-09-22 |
CN111696373B true CN111696373B (en) | 2022-05-24 |
Family
ID=72475340
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910198291.2A Active CN111696373B (en) | 2019-03-15 | 2019-03-15 | Motorcade cooperative sensing method, motorcade cooperative control method and motorcade cooperative control system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111696373B (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112270841B (en) * | 2020-10-23 | 2021-09-10 | 清华大学 | Information credible identification method based on multi-vehicle motion characteristics under vehicle-road cooperative environment |
CN112462381B (en) * | 2020-11-19 | 2024-06-04 | 浙江吉利控股集团有限公司 | Multi-laser radar fusion method based on vehicle-road cooperation |
CN114594755A (en) * | 2020-11-30 | 2022-06-07 | 湖北三环智能科技有限公司 | Intelligent transport vehicle safety driving system |
CN112597869A (en) * | 2020-12-17 | 2021-04-02 | 东风商用车有限公司 | Obstacle information pushing method, device and equipment and readable storage medium |
CN112629883B (en) * | 2020-12-28 | 2022-11-11 | 东南大学 | Test evaluation method for intelligent vehicle queue driving performance |
CN113012434A (en) * | 2021-03-18 | 2021-06-22 | 大众问问(北京)信息科技有限公司 | Vehicle running control method and device and electronic equipment |
CN113587951A (en) * | 2021-09-30 | 2021-11-02 | 国汽智控(北京)科技有限公司 | Path planning method, device, system, server, storage medium and product |
CN113935614B (en) * | 2021-10-11 | 2023-10-13 | 华中科技大学 | Fleet control method and device, electronic equipment and storage medium |
WO2023060410A1 (en) * | 2021-10-11 | 2023-04-20 | 深圳技术大学 | Motorcade regulation and control method and apparatus, electronic device, and storage medium |
CN113963561B (en) * | 2021-11-15 | 2023-05-05 | 东莞理工学院 | Autonomous driving vehicle group control method and communication system |
CN114386481A (en) * | 2021-12-14 | 2022-04-22 | 京东鲲鹏(江苏)科技有限公司 | Vehicle perception information fusion method, device, equipment and storage medium |
CN115083152A (en) * | 2022-06-09 | 2022-09-20 | 北京主线科技有限公司 | Vehicle formation sensing system, method, device, equipment and medium |
CN116170779B (en) * | 2023-04-18 | 2023-07-25 | 西安深信科创信息技术有限公司 | Collaborative awareness data transmission method, device and system |
CN117215316B (en) * | 2023-11-08 | 2024-02-13 | 四川大学 | Method and system for driving environment perception based on cooperative control and deep learning |
CN117631676B (en) * | 2024-01-25 | 2024-04-09 | 上海伯镭智能科技有限公司 | Method and device for automatically guiding unmanned vehicle in mining area to advance |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103956045A (en) * | 2014-05-13 | 2014-07-30 | 中国人民解放军军事交通学院 | Method for achieving collaborative driving of vehicle fleet by means of semi-physical simulation technology |
CN107195176A (en) * | 2017-07-07 | 2017-09-22 | 北京汽车集团有限公司 | Control method and device for fleet |
CN109062221A (en) * | 2018-09-03 | 2018-12-21 | 成都市新筑路桥机械股份有限公司 | A kind of intelligently marshalling Vehicular system and its control method |
CN109318898A (en) * | 2018-11-06 | 2019-02-12 | 山东派蒙机电技术有限公司 | A kind of intelligent driving automotive fleet cooperative control system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040252050A1 (en) * | 2003-06-16 | 2004-12-16 | Tengler Steven C. | Vehicle fleet navigation system |
US9606544B2 (en) * | 2014-10-31 | 2017-03-28 | Clearpath Robotics, Inc. | System, computing device and method for unmanned vehicle fleet control |
- 2019-03-15: application CN201910198291.2A filed; granted as CN111696373B (status: active)
Also Published As
Publication number | Publication date |
---|---|
CN111696373A (en) | 2020-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111696373B (en) | Motorcade cooperative sensing method, motorcade cooperative control method and motorcade cooperative control system | |
US11550331B1 (en) | Detecting street parked vehicles | |
CN108510795B (en) | Collaborative vehicle navigation | |
US20190347931A1 (en) | Systems and methods for driving intelligence allocation between vehicles and highways | |
WO2021128028A1 (en) | Method and device for controlling autonomous vehicle | |
US11720102B2 (en) | Parking behaviors for autonomous vehicles | |
CN110667578A (en) | Lateral decision making system and lateral decision making determination method for automatic driving vehicle | |
US11679780B2 (en) | Methods and systems for monitoring vehicle motion with driver safety alerts | |
CN112292719A (en) | Adapting the trajectory of an ego-vehicle to a moving foreign object | |
EP3934936B1 (en) | Signaling for turns for autonomous vehicles | |
CN114475648A (en) | Autonomous vehicle control based on behavior of ambient contributing factors and limited environmental observations | |
CN113895456A (en) | Intersection driving method and device for automatic driving vehicle, vehicle and medium | |
CN115593429A (en) | Response of autonomous vehicle to emergency vehicle | |
Ozguner et al. | The OSU Demo'97 Vehicle | |
US11541887B2 (en) | Enabling reverse motion of a preceding vehicle at bunched traffic sites | |
CN116767272A (en) | Adaptation of driving behavior of autonomous vehicles | |
GB2579346A (en) | Vehicle control system and method | |
US11460846B2 (en) | Unmarked crosswalks for autonomous vehicles | |
US20240351519A1 (en) | Vehicle cut-in threat detection and mitigation between a leader and follower platoon | |
US20240208504A1 (en) | Method for behavior planning of a vehicle | |
CN116631223A (en) | Method and apparatus for assisting a mobile robot in mixing with a vehicle | |
CN116193401A (en) | Method, system and equipment for collaborative UWB test management of civil aviation airport road | |
CN114312822A (en) | Automatic driving control method, automatic driving control system, medium, and vehicle | |
CN113819920A (en) | Automatic driving non-autonomous navigation method for congested road section | |
CN116572980A (en) | Auxiliary driving system, vehicle and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||